Science.gov

Sample records for additional quantitative analysis

  1. The quantitative surface analysis of an antioxidant additive in a lubricant oil matrix by desorption electrospray ionization mass spectrometry

    PubMed Central

    Da Costa, Caitlyn; Reynolds, James C; Whitmarsh, Samuel; Lynch, Tom; Creaser, Colin S

    2013-01-01

    RATIONALE: Chemical additives are incorporated into commercial lubricant oils to modify the physical and chemical properties of the lubricant. The quantitative analysis of additives in oil-based lubricants deposited on a surface without extraction of the sample from the surface presents a challenge. The potential of desorption electrospray ionization mass spectrometry (DESI-MS) for the quantitative surface analysis of an oil additive in a complex oil lubricant matrix without sample extraction has been evaluated. METHODS: The quantitative surface analysis of the antioxidant additive octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix was carried out by DESI-MS in the presence of 2-(pentyloxy)ethyl 3-(3,5-di-tert-butyl-4-hydroxyphenyl)propionate as an internal standard. A quadrupole/time-of-flight mass spectrometer fitted with an in-house modified ion source enabling non-proximal DESI-MS was used for the analyses. RESULTS: An eight-point calibration curve ranging from 1 to 80 µg/spot of octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix and in the presence of the internal standard was used to determine the quantitative response of the DESI-MS method. The sensitivity and repeatability of the technique were assessed by conducting replicate analyses at each concentration. The limit of detection was determined to be 11 ng/mm² additive on spot with relative standard deviations in the range 3–14%. CONCLUSIONS: The application of DESI-MS to the direct, quantitative surface analysis of a commercial lubricant additive in a native oil lubricant matrix is demonstrated. © 2013 The Authors. Rapid Communications in Mass Spectrometry published by John Wiley & Sons, Ltd. PMID:24097398
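
    The record above hinges on an internal-standard calibration curve and a limit of detection. The sketch below, in Python with entirely made-up numbers, shows one conventional way to fit such a curve from analyte/internal-standard intensity ratios and to estimate an LOD from the calibration residuals; the 3.3·σ/slope rule and all values are illustrative assumptions, not the paper's data or method.

        import numpy as np

        # Hypothetical numbers only: spot amounts (µg) and analyte/internal-standard
        # intensity ratios measured for each calibration spot.
        amount = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0, 60.0, 80.0])  # µg/spot
        ratio = np.array([0.021, 0.043, 0.110, 0.220, 0.450, 0.880, 1.310, 1.740])

        # Least-squares calibration line: ratio = slope*amount + intercept
        slope, intercept = np.polyfit(amount, ratio, 1)

        # Back-calculate an unknown spot from its measured intensity ratio
        unknown_ratio = 0.30
        print(f"estimated amount: {(unknown_ratio - intercept) / slope:.1f} µg/spot")

        # One common limit-of-detection estimate: 3.3 * (residual sd) / slope
        residuals = ratio - (slope * amount + intercept)
        lod = 3.3 * residuals.std(ddof=2) / slope
        print(f"LOD estimate: {lod:.2f} µg/spot")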

  2. Quantitative mass spectrometric analysis of dipeptides in protein hydrolysate by a TNBS derivatization-aided standard addition method.

    PubMed

    Hanh, Vu Thi; Kobayashi, Yutaro; Maebuchi, Motohiro; Nakamori, Toshihiro; Tanaka, Mitsuru; Matsui, Toshiro

    2016-01-01

    The aim of this study was to establish, through a standard addition method, a convenient quantification assay for dipeptides (GY, YG, SY, YS, and IY) in soybean hydrolysate using 2,4,6-trinitrobenzene sulfonate (TNBS) derivatization-aided LC-TOF-MS. Soybean hydrolysate samples (25.0 mg mL⁻¹) spiked with target standards were subjected to TNBS derivatization. Under the optimal LC-MS conditions, five target dipeptides derivatized with TNBS were successfully detected. Examination of the standard addition curves, with a correlation coefficient of r² > 0.979, provided a reliable quantification of the target dipeptides, GY, YG, SY, YS, and IY, in soybean hydrolysate to be 424 ± 20, 184 ± 9, 2188 ± 199, 327 ± 16, and 2211 ± 133 μg g⁻¹ of hydrolysate, respectively. The proposed LC-MS assay is a reliable and convenient assay method, with no interference from matrix effects in hydrolysate, and with no requirement for the use of an isotope labeled internal standard. PMID:26212980
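
    Standard addition, the quantification strategy used in this record, extrapolates the endogenous level from responses measured after spiking known amounts of standard into the sample itself. A minimal Python sketch with invented numbers for one hypothetical dipeptide follows; the endogenous amount is read off as the magnitude of the x-intercept of the fitted line.

        import numpy as np

        # Invented standard-addition data for one dipeptide: standard added to the
        # hydrolysate (µg per g of hydrolysate) and the resulting LC-MS peak area.
        added = np.array([0.0, 100.0, 200.0, 400.0, 800.0])
        area = np.array([1520.0, 1890.0, 2250.0, 2990.0, 4430.0])

        # Fit area = slope*added + intercept; extrapolating to zero response gives a
        # negative x-intercept whose magnitude is the endogenous concentration.
        slope, intercept = np.polyfit(added, area, 1)
        endogenous = intercept / slope
        print(f"endogenous dipeptide ≈ {endogenous:.0f} µg/g of hydrolysate")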

  3. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing the environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life; in addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  4. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  5. Effect of preservative addition on sensory and dynamic profile of Lucanian dry-sausages as assessed by quantitative descriptive analysis and temporal dominance of sensations.

    PubMed

    Braghieri, Ada; Piazzolla, Nicoletta; Galgano, Fernanda; Condelli, Nicola; De Rosa, Giuseppe; Napolitano, Fabio

    2016-12-01

    Quantitative descriptive analysis (QDA) was combined with temporal dominance of sensations (TDS) to assess the sensory properties of Lucanian dry-sausages manufactured either with added nitrate, nitrite, and l-ascorbic acid (NS) or without preservatives (NNS). Both QDA and TDS differentiated the two groups of sausages. NNS products were perceived with higher intensity of hardness (P<0.05) and tended to be perceived with higher intensities of flavor (P<0.10), pepper (P<0.20), and oiliness (P<0.20), while being lower in chewiness (P<0.20). TDS showed that hardness was the first dominant attribute in all the sausages; thereafter, in NNS products flavor remained dominant until the end of tasting, whereas in NS products oiliness prevailed. In conclusion, TDS showed that the perception of some textural parameters, such as oiliness, during mastication was more dominant in NS products, whereas with conventional QDA this attribute appeared higher in sausages manufactured without preservatives. Therefore, TDS provided additional information for the description and differentiation of Lucanian sausages. PMID:27486959

  6. Detection of multivessel disease in patients with sustained myocardial infarction by thallium 201 myocardial scintigraphy: No additional value of quantitative analysis

    SciTech Connect

    Niemeyer, M.G.; Pauwels, E.K.; van der Wall, E.E.; Cramer, M.J.; Verzijlbergen, J.F.; Zwinderman, A.H.; Ascoop, C.A. )

    1989-01-01

    This study was performed to determine the value of visual and quantitative thallium-201 scintigraphy for the detection of multivessel disease in 67 patients with a sustained transmural myocardial infarction. The viability of the myocardial regions corresponding to pathologic Q-waves was also evaluated. Of the 67 patients, 51 (76%) had multivessel coronary artery disease. The sensitivity of the exercise test was 53%, of thallium scintigraphy 69% when interpreted visually, and 67% when analysed quantitatively; the specificity of these methods was 69%, 56%, and 50%, respectively. Sixty-two infarct-related flow regions were detected by visual analysis of the thallium scans: total redistribution was observed in 11/62 (18%) of regions, partial redistribution in 26/62 (42%), and no redistribution in 25/62 (40%). The infarct-related areas with total redistribution on the thallium scintigrams were more likely to be associated with normal or hypokinetic wall motion (7/11: 64%) than the areas with a persistent defect (7/25: 28%) (P = 0.05), which were more often associated with akinetic or dyskinetic wall motion. Based on our results, it is concluded that (1) both visual and quantitative analysis of thallium exercise scintigraphy have limited value for predicting the presence or absence of multivessel coronary artery disease in patients with sustained myocardial infarction, and (2) exercise-induced thallium redistribution may occur within the infarct zone, suggesting the presence of viable but jeopardized myocardium in presumed fibrotic myocardial areas.

  7. Quantitative analysis of endogenous compounds.

    PubMed

    Thakare, Rhishikesh; Chhonker, Yashpal S; Gautam, Nagsen; Alamoudi, Jawaher Abdullah; Alnouti, Yazen

    2016-09-01

    Accurate quantitative analysis of endogenous analytes is essential for several clinical and non-clinical applications. LC-MS/MS is the technique of choice for quantitative analyses. Absolute quantification by LC/MS requires preparing standard curves in the same matrix as the study samples so that the matrix effect and the extraction efficiency for analytes are the same in both the standard and study samples. However, by definition, analyte-free biological matrices do not exist for endogenous compounds. To address the lack of blank matrices for the quantification of endogenous compounds by LC-MS/MS, four approaches are used: the standard addition, the background subtraction, the surrogate matrix, and the surrogate analyte methods. This review article presents an overview of these approaches, cites and summarizes their applications, and compares their advantages and disadvantages. In addition, we discuss in detail the validation requirements and compatibility with FDA guidelines to ensure method reliability in quantifying endogenous compounds. The standard addition, background subtraction, and surrogate analyte approaches allow the use of the same matrix for the calibration curve as the one to be analyzed in the test samples. However, in the surrogate matrix approach, various matrices such as artificial, stripped, and neat matrices are used as surrogate matrices for the actual matrix of study samples. For the surrogate analyte approach, it is required to demonstrate similarity in matrix effect and recovery between surrogate and authentic endogenous analytes. Similarly, for the surrogate matrix approach, it is required to demonstrate similar matrix effect and extraction recovery in both the surrogate and original matrices. All these methods represent indirect approaches to quantifying endogenous compounds, and regardless of which approach is followed, it has to be shown that none of the validation criteria have been compromised due to the indirect analyses. PMID
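
    Among the four approaches summarized above, the surrogate matrix method requires showing that the surrogate behaves like the authentic matrix. The short Python sketch below, with invented data, compares calibration slopes in the two matrices as a simple parallelism check; the numbers and the idea of comparing slopes are illustrative assumptions, not a prescription from the review.

        import numpy as np

        # Invented calibration data: the same spiked concentrations measured in a
        # surrogate (analyte-stripped) matrix and in the authentic matrix, the latter
        # expressed as the response increase over the unspiked background.
        conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
        resp_surrogate = np.array([52.0, 101.0, 255.0, 498.0, 1010.0])
        resp_authentic = np.array([49.0, 97.0, 246.0, 487.0, 985.0])

        slope_s, _ = np.polyfit(conc, resp_surrogate, 1)
        slope_a, _ = np.polyfit(conc, resp_authentic, 1)
        bias = 100.0 * (slope_s - slope_a) / slope_a
        # A small slope difference suggests comparable matrix effect and recovery.
        print(f"slope difference between matrices: {bias:+.1f}%")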

  8. Quantitative Analysis of Glaciated Landscapes

    NASA Astrophysics Data System (ADS)

    Huerta, A. D.

    2005-12-01

    The evolution of glaciated mountains is at the heart of the debate over Late Cenozoic linkages between climate and tectonics. Traditionally, the development of high summit elevations is attributed to tectonic processes. However, much of the high elevation of the Transantarctic Mountains can be attributed solely to uplift in response to glacial erosion (Stern et al., 2005). The Transantarctic Mountains (TAM) provide an unparalleled opportunity to study glacial erosion. The mountain range has experienced glacial conditions since Oligocene time. In the higher and drier regions of the TAM there is only a thin veneer of ice and snow draping the topography. In these regions landforms that were shaped during earlier climatic conditions are preserved. In fact, both glacial and fluvial landforms dating as far back as 18 Ma are preserved locally. In addition, the TAM are ideal for studying glacial erosion since the range has experienced minimal tectonic uplift since late Oligocene time, thus isolating the erosion signal from any tectonic signal. With the advent of digital data sets and GIS methodologies, quantitative analysis can identify key aspects of glaciated landscape morphology, and thus develop powerful analytical techniques for objective study of glaciation. Inspection of USGS topographic maps of the TAM reveals that mountain tops display an extreme range of glacial modification. For example, in the Mt. Rabot region (83°-84° S), mountain peaks are strongly affected by glaciation; cirque development is advanced, with cirque diameters on the order of several kilometers, and cirque confluence has resulted in the formation of "knife-edge" arêtes up to 10 km long. In contrast, in the Mt. Murchison area (73°-74° S) cirque development is youthful, and there is minimal development of arêtes. Preliminary work indicates that analysis of DEMs and contour lines can be used to distinguish degree of glaciation. In particular, slope, curvature, and power spectrum analysis

  9. Quantitative intracerebral brain hemorrhage analysis

    NASA Astrophysics Data System (ADS)

    Loncaric, Sven; Dhawan, Atam P.; Cosic, Dubravko; Kovacevic, Domagoj; Broderick, Joseph; Brott, Thomas

    1999-05-01

    In this paper a system for 3-D quantitative analysis of human spontaneous intracerebral brain hemorrhage (ICH) is described. The purpose of the developed system is to perform quantitative 3-D measurements of the parameters of the ICH region from computed tomography (CT) images. The parameter measured in this phase of the system development is the volume of the hemorrhage region. The goal of the project is to measure parameters for a large number of patients having ICH and to correlate the measured parameters with patient morbidity and mortality.
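
    The single parameter measured in this phase, hemorrhage volume, reduces to counting voxels in a segmented CT mask and scaling by the voxel size. A minimal Python sketch follows; the function name, voxel spacing, and synthetic mask are our own assumptions, not the described system.

        import numpy as np

        def hemorrhage_volume_ml(mask, spacing_mm=(5.0, 0.5, 0.5)):
            """mask: boolean (slice, row, col) segmentation; spacing: voxel size in mm."""
            voxel_mm3 = float(np.prod(spacing_mm))
            return mask.sum() * voxel_mm3 / 1000.0  # mm^3 -> mL

        # Synthetic example: a block-shaped "hemorrhage" spanning 4 slices
        mask = np.zeros((10, 64, 64), dtype=bool)
        mask[3:7, 20:40, 20:40] = True
        print(f"ICH volume ≈ {hemorrhage_volume_ml(mask):.1f} mL")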

  10. Software for quantitative trait analysis

    PubMed Central

    2005-01-01

    This paper provides a brief overview of software currently available for the genetic analysis of quantitative traits in humans. Programs that implement variance components, Markov Chain Monte Carlo (MCMC), Haseman-Elston (H-E) and penetrance model-based linkage analyses are discussed, as are programs for measured genotype association analyses and quantitative trait transmission disequilibrium tests. The software compared includes LINKAGE, FASTLINK, PAP, SOLAR, SEGPATH, ACT, Mx, MERLIN, GENEHUNTER, Loki, Mendel, SAGE, QTDT and FBAT. Where possible, the paper provides URLs for acquiring these programs through the internet, details of the platforms for which the software is available and the types of analyses performed. PMID:16197737

  11. Image analysis and quantitative morphology.

    PubMed

    Mandarim-de-Lacerda, Carlos Alberto; Fernandes-Santos, Caroline; Aguila, Marcia Barbosa

    2010-01-01

    Quantitative studies are increasingly found in the literature, particularly in the fields of development/evolution, pathology, and neurosciences. Image digitalization converts tissue images into a numeric form by dividing them into very small regions termed picture elements or pixels. Image analysis allows automatic morphometry of digitalized images, and stereology aims to understand the structural inner three-dimensional arrangement based on the analysis of slices showing two-dimensional information. To quantify morphological structures in an unbiased and reproducible manner, appropriate isotropic and uniform random sampling of sections, and updated stereological tools are needed. Through the correct use of stereology, a quantitative study can be performed with little effort; efficiency in stereology means as little counting as possible (little work), low cost (section preparation), but still good accuracy. This short text provides a background guide for non-expert morphologists. PMID:19960334

  12. Bioimaging for quantitative phenotype analysis.

    PubMed

    Chen, Weiyang; Xia, Xian; Huang, Yi; Chen, Xingwei; Han, Jing-Dong J

    2016-06-01

    With the development of bio-imaging techniques, an increasing number of studies apply these techniques to generate a myriad of image data. Its applications range from quantification of cellular, tissue, organismal and behavioral phenotypes of model organisms, to human facial phenotypes. The bio-imaging approaches to automatically detect, quantify, and profile phenotypic changes related to specific biological questions open new doors to studying phenotype-genotype associations and to precisely evaluating molecular changes associated with quantitative phenotypes. Here, we review major applications of bioimage-based quantitative phenotype analysis. Specifically, we describe the biological questions and experimental needs addressable by these analyses, computational techniques and tools that are available in these contexts, and the new perspectives on phenotype-genotype association uncovered by such analyses. PMID:26850283

  13. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

    The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques to this aim. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a non-trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. With the first technique it is possible to produce scaffolds with random, non-regular, rounded pore geometry. The AM technique, instead, is able to produce scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted and the resulting model can be used for the validation of the applied imaging and image analysis protocols. Here, an SR μ-CT image analysis approach is reported that effectively and accurately reveals the differences in the pore- and throat-size distributions, as well as the connectivity, of both AM and SCPL scaffolds.

  14. Optimization of quantitative infrared analysis

    NASA Astrophysics Data System (ADS)

    Duerst, Richard W.; Breneman, W. E.; Dittmar, Rebecca M.; Drugge, Richard E.; Gagnon, Jim E.; Pranis, Robert A.; Spicer, Colleen K.; Stebbings, William L.; Westberg, J. W.; Duerst, Marilyn D.

    1994-01-01

    A number of industrial processes, especially quality assurance procedures, accept information on relative quantities of components in mixtures, whenever absolute values for the quantitative analysis are unavailable. These relative quantities may be determined from infrared intensity ratios even though known standards are unavailable. Repeatability [vs. precision] in quantitative analysis is a critical parameter for meaningful results. In any given analysis, multiple runs provide "answers" with a certain standard deviation. Obviously, the lower the standard deviation, the better the precision. In attempting to minimize the standard deviation and thus improve precision, we need to delineate which contributing factors we have control over (such as sample preparation techniques, data analysis methodology) and which factors we have little control over (environmental and instrument noise, for example). For a given set of conditions, the best instrumental precision achievable on an IR instrument should be determinable. Traditionally, the term "signal-to-noise" (S/N) has been used for a single spectrum, realizing that S/N improves with an increase in number of scans coadded for generation of that single spectrum. However, the S/N ratio does not directly reflect the precision achievable for an absorbing band. We prefer to use the phrase "maximum achievable instrument precision" (MAIP), which is equivalent to the minimum relative standard deviation for a given peak (either height or area) in spectra. For a specific analysis, the analyst should have in mind the desired precision. Only if the desired precision is less than the MAIP will the analysis be feasible. Once the MAIP is established, other experimental procedures may be modified to improve the analytical precision, if it is below that which is expected (the MAIP).
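
    The MAIP discussed above is, in the authors' terms, the minimum relative standard deviation attainable for a peak height or area. The sketch below simply computes the percent RSD of replicate peak areas (made-up values), which is the quantity one would compare against the MAIP for a given band.

        import numpy as np

        # Made-up replicate peak areas for one absorbance band under fixed conditions
        peak_areas = np.array([12.41, 12.37, 12.45, 12.39, 12.43, 12.40])

        mean = peak_areas.mean()
        rsd_percent = 100.0 * peak_areas.std(ddof=1) / mean
        print(f"mean area {mean:.2f}, relative standard deviation {rsd_percent:.2f}%")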

  15. Quantitative Proteomic Analysis of the Human Nucleolus.

    PubMed

    Bensaddek, Dalila; Nicolas, Armel; Lamond, Angus I

    2016-01-01

    Recent years have witnessed spectacular progress in the field of mass spectrometry (MS)-based quantitative proteomics, including advances in instrumentation, chromatography, sample preparation methods, and experimental design for multidimensional analyses. It is now possible not only to identify most of the protein components of a cell proteome in a single experiment, but also to describe additional proteome dimensions, such as protein turnover rates, posttranslational modifications, and subcellular localization. Furthermore, by comparing the proteome at different time points, it is possible to create a "time-lapse" view of proteome dynamics. By combining high-throughput quantitative proteomics with detailed subcellular fractionation protocols and data analysis techniques it is also now possible to characterize in detail the proteomes of specific subcellular organelles, providing important insights into cell regulatory mechanisms and physiological responses. In this chapter we present a reliable workflow and protocol for MS-based analysis and quantitation of the proteome of nucleoli isolated from human cells. The protocol presented is based on a SILAC analysis of human MCF10A-Src-ER cells with analysis performed on a Q-Exactive Plus Orbitrap MS instrument (Thermo Fisher Scientific). The subsequent chapter describes how to process the resulting raw MS files from this experiment using MaxQuant software and data analysis procedures to evaluate the nucleolar proteome using customized R scripts. PMID:27576725

  16. Automated quantitative analysis for pneumoconiosis

    NASA Astrophysics Data System (ADS)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

    Automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and the number density of small rounded opacities. Furthermore, the size and shape of the opacities are classified by measuring the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response in order to eliminate undesired parts such as the images of blood vessels and ribs in the chest x-ray. Fuzzy contrast enhancement is also introduced in this method for easy and exact detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.

  17. A Quantitative Fitness Analysis Workflow

    PubMed Central

    Lydall, D.A.

    2012-01-01

    Quantitative Fitness Analysis (QFA) is an experimental and computational workflow for comparing fitnesses of microbial cultures grown in parallel(1,2,3,4). QFA can be applied to focused observations of single cultures but is most useful for genome-wide genetic interaction or drug screens investigating up to thousands of independent cultures. The central experimental method is the inoculation of independent, dilute liquid microbial cultures onto solid agar plates which are incubated and regularly photographed. Photographs from each time-point are analyzed, producing quantitative cell density estimates, which are used to construct growth curves, allowing quantitative fitness measures to be derived. Culture fitnesses can be compared to quantify and rank genetic interaction strengths or drug sensitivities. The effect on culture fitness of any treatments added into substrate agar (e.g. small molecules, antibiotics or nutrients) or applied to plates externally (e.g. UV irradiation, temperature) can be quantified by QFA. The QFA workflow produces growth rate estimates analogous to those obtained by spectrophotometric measurement of parallel liquid cultures in 96-well or 200-well plate readers. Importantly, QFA has significantly higher throughput compared with such methods. QFA cultures grow on a solid agar surface and are therefore well aerated during growth without the need for stirring or shaking. QFA throughput is not as high as that of some Synthetic Genetic Array (SGA) screening methods(5,6). However, since QFA cultures are heavily diluted before being inoculated onto agar, QFA can capture more complete growth curves, including exponential and saturation phases(3). For example, growth curve observations allow culture doubling times to be estimated directly with high precision, as discussed previously(1). Here we present a specific QFA protocol applied to thousands of S. cerevisiae cultures which are automatically handled by robots during inoculation, incubation and imaging
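
    The growth curves and doubling times mentioned in this workflow come from fitting a growth model to the cell-density estimates extracted from timed photographs. Below is a minimal, generic sketch (not the published QFA software): a logistic model is fitted with SciPy to invented density readings for a single culture, and the doubling time is taken as ln 2 divided by the fitted rate.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, K, r, g0):
            # Solution of logistic growth: carrying capacity K, rate r, inoculum g0
            return K * g0 * np.exp(r * t) / (K + g0 * (np.exp(r * t) - 1.0))

        t = np.array([0.0, 6.0, 12.0, 18.0, 24.0, 30.0, 36.0, 48.0])           # hours
        density = np.array([0.01, 0.02, 0.05, 0.12, 0.28, 0.55, 0.80, 0.95])   # invented

        (K, r, g0), _ = curve_fit(logistic, t, density, p0=(1.0, 0.2, 0.01))
        print(f"K={K:.2f}, r={r:.3f}/h, doubling time={np.log(2.0) / r:.1f} h")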

  18. A quantitative fitness analysis workflow.

    PubMed

    Banks, A P; Lawless, C; Lydall, D A

    2012-01-01

    Quantitative Fitness Analysis (QFA) is an experimental and computational workflow for comparing fitnesses of microbial cultures grown in parallel(1,2,3,4). QFA can be applied to focused observations of single cultures but is most useful for genome-wide genetic interaction or drug screens investigating up to thousands of independent cultures. The central experimental method is the inoculation of independent, dilute liquid microbial cultures onto solid agar plates which are incubated and regularly photographed. Photographs from each time-point are analyzed, producing quantitative cell density estimates, which are used to construct growth curves, allowing quantitative fitness measures to be derived. Culture fitnesses can be compared to quantify and rank genetic interaction strengths or drug sensitivities. The effect on culture fitness of any treatments added into substrate agar (e.g. small molecules, antibiotics or nutrients) or applied to plates externally (e.g. UV irradiation, temperature) can be quantified by QFA. The QFA workflow produces growth rate estimates analogous to those obtained by spectrophotometric measurement of parallel liquid cultures in 96-well or 200-well plate readers. Importantly, QFA has significantly higher throughput compared with such methods. QFA cultures grow on a solid agar surface and are therefore well aerated during growth without the need for stirring or shaking. QFA throughput is not as high as that of some Synthetic Genetic Array (SGA) screening methods(5,6). However, since QFA cultures are heavily diluted before being inoculated onto agar, QFA can capture more complete growth curves, including exponential and saturation phases(3). For example, growth curve observations allow culture doubling times to be estimated directly with high precision, as discussed previously(1). Here we present a specific QFA protocol applied to thousands of S. cerevisiae cultures which are automatically handled by robots during inoculation, incubation and

  19. Quantitative analysis of sandstone porosity

    SciTech Connect

    Ferrell, R.E. Jr.; Carpenter, P.K.

    1988-01-01

    A quantitative analysis of changes in porosity associated with sandstone diagenesis was accomplished with digital back-scattered electron image analysis techniques. The volume percent (vol. %) of macroporosity, quartz, clay minerals, feldspar, and other constituents combined with stereological parameters, such as the size and shape of the analyzed features, permitted the determination of cement volumes, the ratio of primary to secondary porosity, and the relative abundance of detrital and authigenic clay minerals. The analyses were produced with a JEOL 733 Superprobe and a TRACOR/NORTHERN 5700 Image Analyzer System. The results provided a numerical evaluation of sedimentological facies controls and diagenetic effects on the permeabilities of potential reservoirs. In a typical application, subtle differences in the diagenetic development of porosity were detected in Wilcox sandstones from central Louisiana. Mechanical compaction of these shoreface sandstones has reduced the porosity to approximately 20%. In most samples with permeabilities greater than 10 md, the measured ratio of macroporosity to microporosity associated with pore-filling kaolinite was 3:1. In other sandstones with lower permeabilities, the measured ratio was higher, but the volume of pore-filling clay was essentially the same. An analysis of the frequency distribution of pore diameters and shapes revealed that the latter samples contained 2-3 vol% of grain-dissolution or moldic porosity. Fluid entry to these large pores was restricted and the clays produced from the grain dissolution products reduced the observed permeability. The image analysis technique provided valuable data for the distinction of productive and nonproductive intervals in this reservoir.

  20. Quantitative analysis of retinal OCT.

    PubMed

    Sonka, Milan; Abràmoff, Michael D

    2016-10-01

    Clinical acceptance of 3-D OCT retinal imaging brought rapid development of quantitative 3-D analysis of retinal layers, vasculature, retinal lesions as well as facilitated new research in retinal diseases. One of the cornerstones of many such analyses is segmentation and thickness quantification of retinal layers and the choroid, with an inherently 3-D simultaneous multi-layer LOGISMOS (Layered Optimal Graph Image Segmentation for Multiple Objects and Surfaces) segmentation approach being extremely well suited for the task. Once retinal layers are segmented, regional thickness, brightness, or texture-based indices of individual layers can be easily determined and thus contribute to our understanding of retinal or optic nerve head (ONH) disease processes and can be employed for determination of disease status, treatment responses, visual function, etc. Out of many applications, examples provided in this paper focus on image-guided therapy and outcome prediction in age-related macular degeneration and on assessing visual function from retinal layer structure in glaucoma. PMID:27503080

  1. Quantitative Analysis of Face Symmetry.

    PubMed

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to report quantitatively the degree of human face symmetry for reported images taken from the Internet. From the original image of a certain person that appears in the center of each triplet, 2 symmetric combinations were constructed that are based on the left part of the image and its mirror image (left-left) and on the right part of the image and its mirror image (right-right). By applying computer software that can determine the length, surface area, and perimeter of any geometric shape, the following measurements were obtained for each triplet: face perimeter and area; distance between the pupils; mouth length; its perimeter and area; nose length and face length, usually below the ears; as well as the area and perimeter of the pupils. Then, for each of the above measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the combinations right-right and left-left, was calculated. C appears on the right-hand side below each image. A high value of C indicates low symmetry, and as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the 2 pupils. The major conclusion arrived at here is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait. PMID:26080172
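
    The article's procedure builds left-left and right-right mirror composites and compares measurements between them. The Python sketch below reproduces only the compositing step and an illustrative asymmetry index on a random array; the index formula and the use of mean brightness as the compared measurement are our assumptions, not the article's exact definition of C.

        import numpy as np

        def composites(img):
            # Build left-left and right-right mirror composites of an image array
            h, w = img.shape[:2]
            left, right = img[:, : w // 2], img[:, w - w // 2 :]
            left_left = np.concatenate([left, left[:, ::-1]], axis=1)
            right_right = np.concatenate([right[:, ::-1], right], axis=1)
            return left_left, right_right

        def asymmetry_index(m_ll, m_rr):
            # Relative difference between composites; smaller means more symmetric
            return abs(m_ll - m_rr) / ((m_ll + m_rr) / 2.0)

        img = np.random.default_rng(2).uniform(0.0, 255.0, size=(64, 64))
        ll, rr = composites(img)
        print(f"illustrative C ≈ {asymmetry_index(ll.mean(), rr.mean()):.3f}")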

  2. Quantitative analysis of digital microscope images.

    PubMed

    Wolf, David E; Samarasekera, Champika; Swedlow, Jason R

    2013-01-01

    This chapter discusses quantitative analysis of digital microscope images and presents several exercises to provide examples to explain the concept. This chapter also presents the basic concepts in quantitative analysis for imaging, but these concepts rest on a well-established foundation of signal theory and quantitative data analysis. This chapter presents several examples for understanding the imaging process as a transformation from sample to image and the limits and considerations of quantitative analysis. This chapter introduces the concept of digitally correcting the images and also focuses on some of the more critical types of data transformation and some of the frequently encountered issues in quantization. Image processing represents a form of data processing. There are many examples of data processing such as fitting the data to a theoretical curve. In all these cases, it is critical that care is taken during all steps of transformation, processing, and quantization. PMID:23931513

  3. Quantitative analysis of qualitative images

    NASA Astrophysics Data System (ADS)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  4. Qualitative and Quantitative Analysis: Interpretation of Electropherograms

    NASA Astrophysics Data System (ADS)

    Szumski, Michał; Buszewski, Bogusław

    In this chapter the basic information on qualitative and quantitative analysis in CE is provided. Migration time and spectral data are described as the most important parameters used for identification of compounds. The parameters that negatively influence qualitative analysis are briefly mentioned. In the quantitative analysis section the external standard and internal standard calibration methods are described. Variables influencing peak height and peak area in capillary electrophoresis are briefly summarized. Also, a discussion on electrodispersion and its influence on the observed peak shape is provided.

  5. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with necessary files for the LabVIEW Run-time engine. Operating systems or monitors under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
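
    The descriptors listed above (average brightness, standard deviation, skewness, kurtosis, median, thresholding) are straightforward to reproduce outside LabVIEW. A Python sketch with our own function name follows, operating on an 8-bit greyscale array with optional thresholds.

        import numpy as np
        from scipy import stats

        def histogram_stats(grey, lower=0, upper=255):
            # Keep only pixels inside the threshold window (e.g. to drop background)
            px = grey[(grey >= lower) & (grey <= upper)].astype(float)
            counts = np.bincount(px.astype(int), minlength=256)
            return {
                "mean": px.mean(),
                "std": px.std(ddof=1),
                "min": int(px.min()),
                "max": int(px.max()),
                "mode": int(counts.argmax()),
                "skewness": stats.skew(px),
                "kurtosis": stats.kurtosis(px),
                "median": float(np.median(px)),
            }

        grey = np.random.default_rng(0).integers(0, 256, size=(128, 128))
        print(histogram_stats(grey, lower=10))   # e.g. ignore a dark background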

  6. Mobile app-based quantitative scanometric analysis.

    PubMed

    Wong, Jessica X H; Liu, Frank S F; Yu, Hua-Zhong

    2014-12-16

    The feasibility of using smartphones and other mobile devices as the detection platform for quantitative scanometric assays is demonstrated. The different scanning modes (color, grayscale, black/white) and grayscale converting protocols (average, weighted average/luminosity, and software specific) have been compared in determining the optical darkness ratio (ODR) values, a conventional quantitation measure for scanometric assays. A mobile app was developed to image and analyze scanometric assays, as demonstrated by paper-printed tests and a biotin-streptavidin assay on a plastic substrate. Primarily for ODR analysis, the app has been shown to perform as well as a traditional desktop scanner, indicating that smartphones (and other mobile devices) promise to be a practical platform for accurate, quantitative chemical analysis and medical diagnostics. PMID:25420202
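
    The quantity reported by such scanometric assays, the optical darkness ratio (ODR), is computed from the greyscale values of a spot and its background after an RGB-to-greyscale conversion. The sketch below uses a common luminosity-style weighting and an assumed form of the ratio, (background − spot)/background; the exact weights and formula used in the paper are not reproduced here.

        import numpy as np

        def to_grey_luminosity(rgb):
            # Weighted-average (luminosity-style) RGB -> greyscale conversion
            return rgb[..., :3] @ np.array([0.299, 0.587, 0.114])

        def optical_darkness_ratio(grey, spot_mask, background_mask):
            spot = grey[spot_mask].mean()
            background = grey[background_mask].mean()
            return (background - spot) / background   # assumed definition

        # Synthetic scan: a darker square spot on a lighter background
        rgb = np.full((50, 50, 3), 200.0)
        rgb[20:30, 20:30] = 120.0
        spot = np.zeros((50, 50), dtype=bool)
        spot[20:30, 20:30] = True
        grey = to_grey_luminosity(rgb)
        print(f"ODR ≈ {optical_darkness_ratio(grey, spot, ~spot):.2f}")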

  7. Genetic interactions contribute less than additive effects to quantitative trait variation in yeast

    PubMed Central

    Bloom, Joshua S.; Kotenko, Iulia; Sadhu, Meru J.; Treusch, Sebastian; Albert, Frank W.; Kruglyak, Leonid

    2015-01-01

    Genetic mapping studies of quantitative traits typically focus on detecting loci that contribute additively to trait variation. Genetic interactions are often proposed as a contributing factor to trait variation, but the relative contribution of interactions to trait variation is a subject of debate. Here we use a very large cross between two yeast strains to accurately estimate the fraction of phenotypic variance due to pairwise QTL–QTL interactions for 20 quantitative traits. We find that this fraction is 9% on average, substantially less than the contribution of additive QTL (43%). Statistically significant QTL–QTL pairs typically have small individual effect sizes, but collectively explain 40% of the pairwise interaction variance. We show that pairwise interaction variance is largely explained by pairs of loci at least one of which has a significant additive effect. These results refine our understanding of the genetic architecture of quantitative traits and help guide future mapping studies. PMID:26537231

  8. Quantitative WDS analysis using electron probe microanalyzer

    SciTech Connect

    Ul-Hamid, Anwar . E-mail: anwar@kfupm.edu.sa; Tawancy, Hani M.; Mohammed, Abdul-Rashid I.; Al-Jaroudi, Said S.; Abbas, Nureddin M.

    2006-04-15

    In this paper, the procedure for conducting quantitative elemental analysis by the ZAF correction method using wavelength dispersive X-ray spectroscopy (WDS) in an electron probe microanalyzer (EPMA) is elaborated. Analysis of a thermal barrier coating (TBC) system formed on a Ni-based single crystal superalloy is presented as an example to illustrate the analysis of samples consisting of a large number of major and minor elements. The analysis was performed using known standards and measured peak-to-background intensity ratios. The procedure for using a separate set of acquisition conditions for major and minor element analysis is explained and its importance is stressed.

  9. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  10. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  11. Quantitative Proteomics Analysis of Leukemia Cells.

    PubMed

    Halbach, Sebastian; Dengjel, Jörn; Brummer, Tilman

    2016-01-01

    Chronic myeloid leukemia (CML) is driven by the oncogenic fusion kinase Bcr-Abl, which organizes its own signaling network with various proteins. These proteins, their interactions, and their role in relevant signaling pathways can be analyzed by quantitative mass spectrometry (MS) approaches in various model systems, e.g., in cell culture models. In this chapter, we describe in detail immunoprecipitations and quantitative proteomics analysis using stable isotope labeling by amino acids in cell culture (SILAC) of components of the Bcr-Abl signaling pathway in the human CML cell line K562. PMID:27581145

  12. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate effects new therapies might have. A combination of complicated geometry and image variability, and the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here was developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable and accurate while allowing the pathologist to control the measurement process.
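
    Once the lumen and elastic laminae are segmented, metrics such as area and perimeter follow directly from the labeled regions. A short sketch using scikit-image region properties on a synthetic mask is given below; the library choice and the circular test shape are our own, not the authors' pipeline.

        import numpy as np
        from skimage import measure

        # Synthetic binary mask standing in for a segmented vessel lumen
        yy, xx = np.mgrid[0:200, 0:200]
        lumen = (yy - 100) ** 2 + (xx - 100) ** 2 < 60 ** 2

        props = measure.regionprops(measure.label(lumen.astype(int)))[0]
        print(f"area {props.area} px^2, perimeter {props.perimeter:.1f} px")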

  13. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  14. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  15. Quantitative image analysis of celiac disease

    PubMed Central

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  16. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
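
    The framework combines severity, likelihood, and modeling difficulty to decide where quantitative analysis is worth the effort. A toy Python sketch of that prioritization idea follows; the scoring scheme and the example scenarios are illustrative and not taken from the paper.

        # Toy hazard scenarios scored 1-5 (higher = more severe / more likely / harder to model)
        scenarios = [
            {"name": "wake encounter on parallel approach", "severity": 4, "likelihood": 3, "difficulty": 2},
            {"name": "runway incursion", "severity": 5, "likelihood": 2, "difficulty": 4},
            {"name": "missed-approach conflict", "severity": 3, "likelihood": 4, "difficulty": 3},
        ]

        def priority(s):
            # Rank by risk (severity x likelihood), breaking ties toward easier modeling
            return (s["severity"] * s["likelihood"], -s["difficulty"])

        for s in sorted(scenarios, key=priority, reverse=True):
            print(s["name"], priority(s))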

  17. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features from both, while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse graining technique to speed the registration of 2D histology sections to high resolution 3D μCT datasets. Once registered, histomorphometric qualitative and quantitative bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). This technique demonstrated the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D printed titanium lattice implants. PMID:27153828

  18. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo)

    PubMed Central

    Li, Yi; Kim, Jong-Joo

    2015-01-01

    The efficiency of genome-wide association analysis (GWAS) depends on power of detection for quantitative trait loci (QTL) and precision for QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle breed Hanwoo: a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA) and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). Also the genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model, such that the top five windows, each of which comprised 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods had high concordance, found at 64.1 to 64.9 Mb on Bos taurus autosome (BTA) 7 for WWT, at 24.3 to 25.4 Mb on BTA14 for CWT, at 0.5 to 1.5 Mb on BTA6 for BFT, and at 26.3 to 33.4 Mb on BTA29 for BFT. Several candidate genes (i.e. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions. PMID:26104396

  19. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo).

    PubMed

    Li, Yi; Kim, Jong-Joo

    2015-07-01

    The efficiency of genome-wide association analysis (GWAS) depends on power of detection for quantitative trait loci (QTL) and precision for QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle breed Hanwoo: a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA) and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). Also the genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model, such that the top five windows, each of which comprised 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods had high concordance, found at 64.1 to 64.9 Mb on Bos taurus autosome (BTA) 7 for WWT, at 24.3 to 25.4 Mb on BTA14 for CWT, at 0.5 to 1.5 Mb on BTA6 for BFT, and at 26.3 to 33.4 Mb on BTA29 for BFT. Several candidate genes (i.e. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions. PMID:26104396

  20. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  1. Quantitative Bias Analysis in Regulatory Settings.

    PubMed

    Lash, Timothy L; Fox, Matthew P; Cooney, Darryl; Lu, Yun; Forshee, Richard A

    2016-07-01

    Nonrandomized studies are essential in the postmarket activities of the US Food and Drug Administration, which, however, must often act on the basis of imperfect data. Systematic errors can lead to inaccurate inferences, so it is critical to develop analytic methods that quantify uncertainty and bias and ensure that these methods are implemented when needed. "Quantitative bias analysis" is an overarching term for methods that estimate quantitatively the direction, magnitude, and uncertainty associated with systematic errors influencing measures of associations. The Food and Drug Administration sponsored a collaborative project to develop tools to better quantify the uncertainties associated with postmarket surveillance studies used in regulatory decision making. We have described the rationale, progress, and future directions of this project. PMID:27196652

  2. Accuracy in Quantitative 3D Image Analysis

    PubMed Central

    Bassel, George W.

    2015-01-01

    Quantitative 3D imaging is becoming an increasingly popular and powerful approach to investigate plant growth and development. With the increased use of 3D image analysis, standards to ensure the accuracy and reproducibility of these data are required. This commentary highlights how image acquisition and postprocessing can introduce artifacts into 3D image data and proposes steps to increase both the accuracy and reproducibility of these analyses. It is intended to aid researchers entering the field of 3D image processing of plant cells and tissues and to help general readers in understanding and evaluating such data. PMID:25804539

  3. Quantitative architectural analysis of bronchial intraepithelial neoplasia

    NASA Astrophysics Data System (ADS)

    Guillaud, Martial; MacAulay, Calum E.; Le Riche, Jean C.; Dawe, Chris; Korbelik, Jagoda; Lam, Stephen

    2000-04-01

    Considerable variation exists among pathologists in the interpretation of intraepithelial neoplasia, making it difficult to determine the natural history of these lesions and to establish management guidelines for chemoprevention. The aim of the study was to evaluate architectural features of pre-neoplastic progression in lung cancer and to search for a correlation between an architectural index and conventional pathology. Quantitative architectural analysis was performed on a series of normal lung biopsies and carcinoma in situ (CIS). Centers of gravity of the nuclei within a pre-defined region of interest were used as seeds to generate a Voronoi diagram. About 30 features derived from the Voronoi diagram, its dual the Delaunay tessellation, and the minimum spanning tree were extracted. A discriminant analysis was performed to separate the two groups. The architectural index was calculated for each of the bronchial biopsies interpreted as hyperplasia, metaplasia, or mild, moderate or severe dysplasia by conventional histopathology criteria. As a group, lesions classified as CIS by conventional histopathology criteria could be distinguished from dysplasia using the architectural index. Metaplasia was distinct from hyperplasia, and hyperplasia from normal. There was overlap between severe and moderate dysplasia, but mild dysplasia could be distinguished from moderate dysplasia. Bronchial intraepithelial neoplastic lesions can therefore be graded objectively by architectural features. Combining architectural features with nuclear morphometric features may improve the quantitation of the changes occurring during the intraepithelial neoplastic process.
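
    To make the graph-based measurements concrete, the following sketch computes a few illustrative architectural features (Delaunay edge statistics and minimum-spanning-tree edge lengths) from synthetic nuclear centroids using SciPy; the centroids and the three example features are placeholders, not the ~30 features or the data used in the study.

```python
# Sketch: graph-based architectural features from nuclear centroids (synthetic data).
import numpy as np
from scipy.spatial import Delaunay
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(0)
centroids = rng.uniform(0, 100, size=(80, 2))   # stand-in nuclear centers (µm)

# Delaunay tessellation: collect the unique edge lengths.
tri = Delaunay(centroids)
edges = set()
for simplex in tri.simplices:
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        edges.add((a, b))
edge_len = np.array([np.linalg.norm(centroids[a] - centroids[b]) for a, b in edges])

# Minimum spanning tree over the complete inter-nuclear distance graph.
dist = squareform(pdist(centroids))
mst = minimum_spanning_tree(dist)
mst_len = mst.data                               # lengths of the N-1 MST edges

features = {
    "delaunay_mean_edge": edge_len.mean(),
    "delaunay_edge_cv": edge_len.std() / edge_len.mean(),
    "mst_mean_edge": mst_len.mean(),
}
print(features)
```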

  4. Quantitative interactome analysis reveals a chemoresistant edgotype

    PubMed Central

    Chavez, Juan D.; Schweppe, Devin K.; Eng, Jimmy K.; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E.

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived, protein interaction network comprising 1,391 crosslinked peptides allows for ‘edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  5. Quantitative interactome analysis reveals a chemoresistant edgotype.

    PubMed

    Chavez, Juan D; Schweppe, Devin K; Eng, Jimmy K; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived, protein interaction network comprising 1,391 crosslinked peptides allows for 'edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  6. Quantitative analysis of NMR spectra with chemometrics

    NASA Astrophysics Data System (ADS)

    Winning, H.; Larsen, F. H.; Bro, R.; Engelsen, S. B.

    2008-01-01

    The number of applications of chemometrics to series of NMR spectra is rapidly increasing due to an emerging interest in quantitative NMR spectroscopy, e.g. in the pharmaceutical and food industries. This paper analyses the advantages and limitations of applying the two most common chemometric procedures, Principal Component Analysis (PCA) and Multivariate Curve Resolution (MCR), to a designed set of 231 1H 400 MHz spectra of simple alcohol mixtures (propanol, butanol and pentanol). The study clearly demonstrates that the major advantage of chemometrics is the visualisation of larger data structures, which adds a new exploratory dimension to NMR research. While robustness and powerful data visualisation and exploration are the main qualities of the PCA method, the study demonstrates that the bilinear MCR method is an even more powerful method for resolving pure-component NMR spectra from mixtures when certain conditions are met.
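
    As a rough illustration of the PCA step described above, the sketch below applies principal component analysis to simulated 1H spectra of three-component alcohol mixtures; the peak positions, mixture design and noise level are invented for the example.

```python
# Minimal sketch: PCA of simulated 1H NMR mixture spectra.
import numpy as np
from sklearn.decomposition import PCA

ppm = np.linspace(0, 5, 2000)
def peak(center, width=0.02):
    return 1.0 / (1.0 + ((ppm - center) / width) ** 2)      # Lorentzian line

# "Pure" spectra for three alcohols (arbitrary peak positions).
pure = np.vstack([peak(0.9) + peak(3.6), peak(1.4) + peak(3.4), peak(1.9) + peak(3.2)])

rng = np.random.default_rng(1)
conc = rng.dirichlet(np.ones(3), size=231)                   # 231 mixture designs
spectra = conc @ pure + rng.normal(0, 1e-3, (231, ppm.size))

pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)                          # basis for score plots
print(pca.explained_variance_ratio_)
```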

  7. Additional EIPC Study Analysis. Final Report

    SciTech Connect

    Hadley, Stanton W; Gotham, Douglas J.; Luciani, Ralph L.

    2014-12-01

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC), among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that created eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission-focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phases 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 14 topics was developed for further analysis. This paper brings together the earlier interim reports on the first 13 topics plus one additional topic into a single final report.

  8. Materials characterization through quantitative digital image analysis

    SciTech Connect

    J. Philliber; B. Antoun; B. Somerday; N. Yang

    2000-07-01

    A digital image analysis system has been developed to allow advanced quantitative measurement of microstructural features. This capability is maintained as part of the microscopy facility at Sandia, Livermore. The system records images digitally, eliminating the use of film. Images obtained from other sources may also be imported into the system. Subsequent digital image processing enhances image appearance through contrast and brightness adjustments. The system measures a variety of user-defined microstructural features, including area fraction, particle size and spatial distributions, and grain sizes and orientations of elongated particles. These measurements are made in a semi-automatic mode through the use of macro programs and a computer-controlled translation stage. A routine has been developed to create large montages of 50+ separate images. Individual image frames are matched to the nearest pixel to create seamless montages. Results from three different studies are presented to illustrate the capabilities of the system.
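
    A minimal sketch of the kind of measurement such a system performs (area fraction and particle sizes from a thresholded micrograph) is given below using scikit-image on a synthetic image; it is not the facility's actual software.

```python
# Sketch: area fraction and particle sizes from a synthetic micrograph.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

rng = np.random.default_rng(2)
img = rng.normal(100, 10, (512, 512))                        # noisy matrix background
yy, xx = np.mgrid[0:512, 0:512]
for cx, cy, r in rng.uniform(20, 490, (40, 3)) * [1, 1, 0.03]:
    img[(xx - cx) ** 2 + (yy - cy) ** 2 < r ** 2] += 80      # bright "particles"

mask = img > threshold_otsu(img)                             # global threshold
labels = label(mask)
areas = np.array([p.area for p in regionprops(labels)])

print("area fraction:", mask.mean())
print("particle count:", labels.max(), "mean area (px):", areas.mean())
```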

  9. Quantitative proteomic analysis of single pancreatic islets

    PubMed Central

    Waanders, Leonie F.; Chwalek, Karolina; Monetti, Mara; Kumar, Chanchal; Lammert, Eckhard; Mann, Matthias

    2009-01-01

    Technological developments make mass spectrometry (MS)-based proteomics a central pillar of biochemical research. MS has been very successful in cell culture systems, where sample amounts are not limiting. To extend its capabilities to extremely small, physiologically distinct cell types isolated from tissue, we developed a high sensitivity chromatographic system that measures nanogram protein mixtures for 8 h with very high resolution. This technology is based on splitting gradient effluents into a capture capillary and provides an inherent technical replicate. In a single analysis, this allowed us to characterize kidney glomeruli isolated by laser capture microdissection to a depth of more than 2,400 proteins. From pooled pancreatic islets of Langerhans, another type of “miniorgan,” we obtained an in-depth proteome of 6,873 proteins, many of them involved in diabetes. We quantitatively compared the proteome of single islets, containing 2,000–4,000 cells, treated with high or low glucose levels, and covered most of the characteristic functions of beta cells. Our ultrasensitive analysis recapitulated known hyperglycemic changes but we also find components up-regulated such as the mitochondrial stress regulator Park7. Direct proteomic analysis of functionally distinct cellular structures opens up perspectives in physiology and pathology. PMID:19846766

  10. Towards a Quantitative OCT Image Analysis

    PubMed Central

    Garcia Garrido, Marina; Beck, Susanne C.; Mühlfriedel, Regine; Julien, Sylvie; Schraermeyer, Ulrich; Seeliger, Mathias W.

    2014-01-01

    Background Optical coherence tomography (OCT) is an invaluable diagnostic tool for the detection and follow-up of retinal pathology in patients and experimental disease models. However, as morphological structures and layering in health as well as their alterations in disease are complex, segmentation procedures have not yet reached a satisfactory level of performance. Therefore, raw images and qualitative data are commonly used in clinical and scientific reports. Here, we assess the value of OCT reflectivity profiles as a basis for a quantitative characterization of the retinal status in a cross-species comparative study. Methods Spectral-Domain Optical Coherence Tomography (OCT), confocal Scanning-Laser Ophthalmoscopy (SLO), and Fluorescein Angiography (FA) were performed in mice (Mus musculus), gerbils (Gerbillus perpallidus), and cynomolgus monkeys (Macaca fascicularis) using the Heidelberg Engineering Spectralis system, and additional SLOs and FAs were obtained with the HRA I (same manufacturer). Reflectivity profiles were extracted from 8-bit greyscale OCT images using the ImageJ software package (http://rsb.info.nih.gov/ij/). Results Reflectivity profiles obtained from OCT scans of all three animal species correlated well with ex vivo histomorphometric data. Each of the retinal layers showed a typical pattern that varied in relative size and degree of reflectivity across species. In general, plexiform layers showed a higher level of reflectivity than nuclear layers. A comparison of reflectivity profiles from specialized retinal regions (e.g. visual streak in gerbils, fovea in non-human primates) with respective regions of human retina revealed multiple similarities. In a model of Retinitis Pigmentosa (RP), the value of reflectivity profiles for the follow-up of therapeutic interventions was demonstrated. Conclusions OCT reflectivity profiles provide a detailed, quantitative description of retinal layers and structures including specialized retinal regions
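
    The sketch below illustrates the general idea of a reflectivity profile, averaging an 8-bit greyscale B-scan laterally over a region of interest; the synthetic image, layer depths and ROI are assumptions, and the original workflow used ImageJ rather than Python.

```python
# Sketch: depth reflectivity profile from a synthetic 8-bit OCT B-scan.
import numpy as np

depth, width = 496, 512
bscan = np.zeros((depth, width), dtype=np.uint8)
for top, bottom, intensity in [(60, 80, 180), (120, 200, 90), (200, 230, 200)]:
    bscan[top:bottom, :] = intensity                 # fake retinal "layers"
bscan = np.clip(bscan + np.random.default_rng(3).normal(0, 8, bscan.shape), 0, 255)

# Average laterally over a region of interest to obtain the reflectivity profile.
roi = bscan[:, 200:300]
profile = roi.mean(axis=1)
for z in range(0, depth, 50):
    print(f"depth {z:3d} px  mean reflectivity {profile[z]:6.1f}")
```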

  11. Quantitative Motion Analysis in Two and Three Dimensions.

    PubMed

    Wessels, Deborah J; Lusche, Daniel F; Kuhl, Spencer; Scherer, Amanda; Voss, Edward; Soll, David R

    2016-01-01

    This chapter describes 2D quantitative methods for motion analysis as well as 3D motion analysis and reconstruction methods. Emphasis is placed on the analysis of dynamic cell shape changes that occur through extension and retraction of force generating structures such as pseudopodia and lamellipodia. Quantitative analysis of these structures is an underutilized tool in the field of cell migration. Our intent, therefore, is to present methods that we developed in an effort to elucidate mechanisms of basic cell motility, directed cell motion during chemotaxis, and metastasis. We hope to demonstrate how application of these methods can more clearly define alterations in motility that arise due to specific mutations or disease and hence, suggest mechanisms or pathways involved in normal cell crawling and treatment strategies in the case of disease. In addition, we present a 4D tumorigenesis model for high-resolution analysis of cancer cells from cell lines and human cancer tissue in a 3D matrix. Use of this model led to the discovery of the coalescence of cancer cell aggregates and unique cell behaviors not seen in normal cells or normal tissue. Graphic illustrations to visually display and quantify cell shape are presented along with algorithms and formulae for calculating select 2D and 3D motion analysis parameters. PMID:26498790

  12. Biomechanical cell analysis using quantitative phase imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wax, Adam; Park, Han Sang; Eldridge, William J.

    2016-03-01

    Quantitative phase imaging provides nanometer-scale sensitivity and has previously been used to study spectral and temporal characteristics of individual cells in vitro, especially red blood cells. Here we extend this work to study the mechanical responses of individual cells to external stimuli. Cell stiffness may be characterized by analyzing the inherent thermal fluctuations of cells, but applying external stimuli yields additional information. The time-dependent response of cells to external shear stress is examined with high-speed quantitative phase imaging and found to exhibit characteristics that relate to their stiffness. However, analysis beyond the cellular scale also reveals internal organization of the cell and its modulation due to pathologic processes such as carcinogenesis. Further studies with microfluidic platforms point the way toward using this approach in high-throughput assays.

  13. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed a quantitative analysis of cortical microtubule orientation in leaf pavement cells of Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules remained oriented parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484

  14. Multiple quantitative trait analysis using bayesian networks.

    PubMed

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness. PMID:25236454
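
    For readers unfamiliar with the GBLUP baseline mentioned above, the following sketch fits a single-trait GBLUP on simulated genotypes via the genomic relationship matrix; the simulation parameters and the variance ratio lambda are arbitrary, and the example does not reproduce the Bayesian-network models of the paper.

```python
# Illustrative single-trait GBLUP on simulated SNP genotypes.
import numpy as np

rng = np.random.default_rng(4)
n, m = 200, 1000
M = rng.binomial(2, 0.3, (n, m)).astype(float)          # SNP genotypes (0/1/2)
p = M.mean(axis=0) / 2
Z = M - 2 * p                                           # centred genotypes
G = Z @ Z.T / (2 * (p * (1 - p)).sum())                 # genomic relationship matrix

beta = rng.normal(0, 0.05, m)
y = Z @ beta + rng.normal(0, 1.0, n)                    # phenotype = additive + noise

# Mixed-model (ridge-type) solution:  u_hat = G (G + lambda I)^-1 (y - mean)
lam = 1.0                                               # sigma_e^2 / sigma_g^2, assumed
u_hat = G @ np.linalg.solve(G + lam * np.eye(n), y - y.mean())
print("correlation(true genetic value, GBLUP):",
      np.corrcoef(Z @ beta, u_hat)[0, 1].round(3))
```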

  15. Modular Skeletal Evolution in Sticklebacks Is Controlled by Additive and Clustered Quantitative Trait Loci

    PubMed Central

    Miller, Craig T.; Glazer, Andrew M.; Summers, Brian R.; Blackman, Benjamin K.; Norman, Andrew R.; Shapiro, Michael D.; Cole, Bonnie L.; Peichel, Catherine L.; Schluter, Dolph; Kingsley, David M.

    2014-01-01

    Understanding the genetic architecture of evolutionary change remains a long-standing goal in biology. In vertebrates, skeletal evolution has contributed greatly to adaptation in body form and function in response to changing ecological variables like diet and predation. Here we use genome-wide linkage mapping in threespine stickleback fish to investigate the genetic architecture of evolved changes in many armor and trophic traits. We identify >100 quantitative trait loci (QTL) controlling the pattern of serially repeating skeletal elements, including gill rakers, teeth, branchial bones, jaws, median fin spines, and vertebrae. We use this large collection of QTL to address long-standing questions about the anatomical specificity, genetic dominance, and genomic clustering of loci controlling skeletal differences in evolving populations. We find that most QTL (76%) that influence serially repeating skeletal elements have anatomically regional effects. In addition, most QTL (71%) have at least partially additive effects, regardless of whether the QTL controls evolved loss or gain of skeletal elements. Finally, many QTL with high LOD scores cluster on chromosomes 4, 20, and 21. These results identify a modular system that can control highly specific aspects of skeletal form. Because of the general additivity and genomic clustering of major QTL, concerted changes in both protective armor and trophic traits may occur when sticklebacks inherit either marine or freshwater alleles at linked or possible “supergene” regions of the stickleback genome. Further study of these regions will help identify the molecular basis of both modular and coordinated changes in the vertebrate skeleton. PMID:24652999

  16. Modular skeletal evolution in sticklebacks is controlled by additive and clustered quantitative trait Loci.

    PubMed

    Miller, Craig T; Glazer, Andrew M; Summers, Brian R; Blackman, Benjamin K; Norman, Andrew R; Shapiro, Michael D; Cole, Bonnie L; Peichel, Catherine L; Schluter, Dolph; Kingsley, David M

    2014-05-01

    Understanding the genetic architecture of evolutionary change remains a long-standing goal in biology. In vertebrates, skeletal evolution has contributed greatly to adaptation in body form and function in response to changing ecological variables like diet and predation. Here we use genome-wide linkage mapping in threespine stickleback fish to investigate the genetic architecture of evolved changes in many armor and trophic traits. We identify >100 quantitative trait loci (QTL) controlling the pattern of serially repeating skeletal elements, including gill rakers, teeth, branchial bones, jaws, median fin spines, and vertebrae. We use this large collection of QTL to address long-standing questions about the anatomical specificity, genetic dominance, and genomic clustering of loci controlling skeletal differences in evolving populations. We find that most QTL (76%) that influence serially repeating skeletal elements have anatomically regional effects. In addition, most QTL (71%) have at least partially additive effects, regardless of whether the QTL controls evolved loss or gain of skeletal elements. Finally, many QTL with high LOD scores cluster on chromosomes 4, 20, and 21. These results identify a modular system that can control highly specific aspects of skeletal form. Because of the general additivity and genomic clustering of major QTL, concerted changes in both protective armor and trophic traits may occur when sticklebacks inherit either marine or freshwater alleles at linked or possible "supergene" regions of the stickleback genome. Further study of these regions will help identify the molecular basis of both modular and coordinated changes in the vertebrate skeleton. PMID:24652999

  17. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  18. Error Propagation Analysis for Quantitative Intracellular Metabolomics

    PubMed Central

    Tillack, Jana; Paczia, Nicole; Nöh, Katharina; Wiechert, Wolfgang; Noack, Stephan

    2012-01-01

    Model-based analyses have become an integral part of modern metabolic engineering and systems biology in order to gain knowledge about complex and not directly observable cellular processes. For quantitative analyses, not only experimental data, but also measurement errors, play a crucial role. The total measurement error of any analytical protocol is the result of an accumulation of single errors introduced by several processing steps. Here, we present a framework for the quantification of intracellular metabolites, including error propagation during metabolome sample processing. Focusing on one specific protocol, we comprehensively investigate all currently known and accessible factors that ultimately impact the accuracy of intracellular metabolite concentration data. All intermediate steps are modeled, and their uncertainty with respect to the final concentration data is rigorously quantified. Finally, on the basis of a comprehensive metabolome dataset of Corynebacterium glutamicum, an integrated error propagation analysis for all parts of the model is conducted, and the most critical steps for intracellular metabolite quantification are detected. PMID:24957773
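
    As a simple illustration of propagating measurement errors through a quantification chain, the sketch below runs a Monte Carlo simulation over an invented intracellular-concentration calculation; the processing steps and error magnitudes are placeholders, not those of the protocol investigated in the paper.

```python
# Monte Carlo sketch of error propagation through a metabolite quantification chain.
import numpy as np

rng = np.random.default_rng(5)
N = 100_000

peak_area = rng.normal(1.00, 0.03, N)     # analyte peak area (3 % CV)
is_area   = rng.normal(0.95, 0.03, N)     # internal-standard peak area
is_conc   = rng.normal(50.0, 1.0, N)      # µM internal standard added
dilution  = rng.normal(10.0, 0.2, N)      # dilution factor
biovolume = rng.normal(2.0, 0.1, N)       # µL cell volume per sample

conc = peak_area / is_area * is_conc * dilution / biovolume   # µM intracellular

print(f"mean = {conc.mean():.1f} µM, sd = {conc.std():.1f} µM "
      f"({100 * conc.std() / conc.mean():.1f} % total CV)")
```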

  19. Quantitative methods for ecological network analysis.

    PubMed

    Ulanowicz, Robert E

    2004-12-01

    The analysis of networks of ecological trophic transfers is a useful complement to simulation modeling in the quest for understanding whole-ecosystem dynamics. Trophic networks can be studied in quantitative and systematic fashion at several levels. Indirect relationships between any two individual taxa in an ecosystem, which often differ in either nature or magnitude from their direct influences, can be assayed using techniques from linear algebra. The same mathematics can also be employed to ascertain where along the trophic continuum any individual taxon is operating, or to map the web of connections into a virtual linear chain that summarizes trophodynamic performance by the system. Backtracking algorithms with pruning have been written which identify pathways for the recycle of materials and energy within the system. The pattern of such cycling often reveals modes of control or types of functions exhibited by various groups of taxa. The performance of the system as a whole at processing material and energy can be quantified using information theory. In particular, the complexity of process interactions can be parsed into separate terms that distinguish organized, efficient performance from the capacity for further development and recovery from disturbance. Finally, the sensitivities of the information-theoretic system indices appear to identify the dynamical bottlenecks in ecosystem functioning. PMID:15556474
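
    One of the linear-algebra operations mentioned above, locating taxa along the trophic continuum, can be illustrated with a small made-up diet matrix: the effective trophic position solves TL = 1 + P·TL.

```python
# Sketch: effective trophic positions from a hypothetical diet-fraction matrix.
import numpy as np

taxa = ["phytoplankton", "zooplankton", "small fish", "piscivore"]
# P[i, j] = fraction of taxon i's intake that comes from taxon j.
P = np.array([[0.0, 0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0, 0.0],
              [0.2, 0.8, 0.0, 0.0],
              [0.0, 0.1, 0.9, 0.0]])

TL = np.linalg.solve(np.eye(4) - P, np.ones(4))          # TL = (I - P)^-1 * 1
for name, tl in zip(taxa, TL):
    print(f"{name:14s} trophic position {tl:.2f}")
```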

  20. Acid Rain Analysis by Standard Addition Titration.

    ERIC Educational Resources Information Center

    Ophardt, Charles E.

    1985-01-01

    The standard addition titration is a precise and rapid method for the determination of the acidity in rain or snow samples. The method requires use of a standard buret, a pH meter, and Gran's plot to determine the equivalence point. Experimental procedures used and typical results obtained are presented. (JN)
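
    A small sketch of the Gran-plot treatment referred to above is given below; the titration volumes and concentrations are simulated, and a real rain sample would of course use measured pH readings.

```python
# Sketch: Gran-plot location of the equivalence point in a strong acid/base titration.
import numpy as np

V0 = 50.0                 # mL of rain sample
Ca = 1.0e-4               # M strong acid (unknown in practice)
Cb = 1.0e-3               # M NaOH titrant
Vb = np.linspace(0.5, 4.0, 8)                       # mL base added (pre-equivalence)
H = (Ca * V0 - Cb * Vb) / (V0 + Vb)                 # [H+] before equivalence
pH = -np.log10(H)

# Gran function: F = (V0 + Vb) * 10^(-pH) is linear in Vb and reaches zero at Ve.
F = (V0 + Vb) * 10 ** (-pH)
slope, intercept = np.polyfit(Vb, F, 1)
Ve = -intercept / slope
print(f"equivalence volume = {Ve:.2f} mL, acidity = {Cb * Ve / V0 * 1e6:.0f} µeq/L")
```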

  1. [Accounting for Expected Linkage in Biometric Analysis of Quantitative Traits].

    PubMed

    Mikhailov, M E

    2015-08-01

    The problem of accounting, in genetic estimation, for the linkage expected from a random disposition of loci was solved for the additive-dominance model. The Comstock-Robinson estimators for the sum of squares of dominance effects, the sum of squares of additive effects, and the average degree of dominance were modified. Wright's estimator for the number of loci controlling the variation of a quantitative trait was also modified and its range of application extended. Formulas that eliminate the effect of linkage, on average, were derived for these estimators. The unbiased estimators were applied to the analysis of maize data. Our results showed that the most likely cause of heterosis is dominance rather than overdominance and that the main part of the heterotic effect is provided by dozens of genes. PMID:26601496

  2. Quantitative transcriptome analysis using RNA-seq.

    PubMed

    Külahoglu, Canan; Bräutigam, Andrea

    2014-01-01

    RNA-seq has emerged as the technology of choice to quantify gene expression. It is a convenient and accurate tool for quantifying diurnal changes in gene expression, gene discovery, differential use of promoters, and splice variants for all genes expressed in a single tissue. Thus, RNA-seq experiments provide sequence information and absolute expression values for transcripts, in addition to the relative quantification available with microarrays or qRT-PCR. The depth of information provided by sequencing requires careful assessment of RNA intactness and DNA contamination. Although RNA-seq is comparatively recent, a standard analysis framework has emerged around the packages Bowtie2, TopHat, and Cufflinks. With the rising popularity of RNA-seq, the tools have become manageable for researchers without much bioinformatic knowledge or programming skills. Here, we present a workflow for an RNA-seq experiment from experimental planning to biological data extraction. PMID:24792045
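
    As an example of the "absolute expression values" mentioned above, the sketch below converts raw read counts to TPM; the counts and gene lengths are invented, and this is only one common normalisation, not the Cufflinks output itself.

```python
# Sketch: transcripts per million (TPM) from raw read counts.
import numpy as np

counts = np.array([[500, 1200,   10],      # sample 1: counts per gene
                   [450, 1500,   30]])     # sample 2
lengths_kb = np.array([2.0, 4.0, 0.8])     # effective gene length in kilobases

rpk = counts / lengths_kb                  # reads per kilobase
tpm = rpk / rpk.sum(axis=1, keepdims=True) * 1e6
print(np.round(tpm, 1))
```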

  3. Some Epistemological Considerations Concerning Quantitative Analysis

    ERIC Educational Resources Information Center

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that question the…

  4. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density are developed. Accurate measurements of heading distribution, using a rotating polarization radar to enhance the wingbeat frequency method of identification, are also presented.

  5. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  6. Quantitative analysis of comparative genomic hybridization

    SciTech Connect

    Manoir, S. du; Bentz, M.; Joos, S. |

    1995-01-01

    Comparative genomic hybridization (CGH) is a new molecular cytogenetic method for the detection of chromosomal imbalances. Following cohybridization of DNA prepared from a sample to be studied and control DNA to normal metaphase spreads, probes are detected via different fluorochromes. The ratio of the test and control fluorescence intensities along a chromosome reflects the relative copy number of segments of a chromosome in the test genome. Quantitative evaluation of CGH experiments is required for the determination of low copy changes, e.g., monosomy or trisomy, and for the definition of the breakpoints involved in unbalanced rearrangements. In this study, a program for quantitation of CGH preparations is presented. This program is based on the extraction of the fluorescence ratio profile along each chromosome, followed by averaging of individual profiles from several metaphase spreads. Objective parameters critical for quantitative evaluations were tested, and the criteria for selection of suitable CGH preparations are described. The granularity of the chromosome painting and the regional inhomogeneity of fluorescence intensities in metaphase spreads proved to be crucial parameters. The coefficient of variation of the ratio value for chromosomes in balanced state (CVBS) provides a general quality criterion for CGH experiments. Different cutoff levels (thresholds) of average fluorescence ratio values were compared for their specificity and sensitivity with regard to the detection of chromosomal imbalances. 27 refs., 15 figs., 1 tab.
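
    The core quantitative step described, averaging test/control fluorescence ratio profiles over several metaphase spreads and applying ratio thresholds, can be sketched as follows; the profile length, noise level and the 1.25/0.75 cut-offs are assumptions for illustration.

```python
# Sketch: averaged CGH fluorescence ratio profile with simple gain/loss calls.
import numpy as np

rng = np.random.default_rng(6)
n_metaphases, n_bins = 12, 300
true_ratio = np.ones(n_bins)
true_ratio[180:240] = 1.5                          # simulated single-copy gain

profiles = true_ratio + rng.normal(0, 0.15, (n_metaphases, n_bins))
mean_profile = profiles.mean(axis=0)               # average over metaphase spreads

gain = mean_profile > 1.25
loss = mean_profile < 0.75
print("gained bins:", np.flatnonzero(gain).min(), "-", np.flatnonzero(gain).max())
print("lost bins:", np.flatnonzero(loss).size)
```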

  7. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low-order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  8. Sensitive LC MS quantitative analysis of carbohydrates by Cs+ attachment.

    PubMed

    Rogatsky, Eduard; Jayatillake, Harsha; Goswami, Gayotri; Tomuta, Vlad; Stein, Daniel

    2005-11-01

    The development of a sensitive assay for the quantitative analysis of carbohydrates from human plasma using LC/MS/MS is described in this paper. After sample preparation, carbohydrates were cationized by Cs(+) after their separation by normal phase liquid chromatography on an amino based column. Cesium is capable of forming a quasi-molecular ion [M + Cs](+) with neutral carbohydrate molecules in the positive ion mode of electrospray ionization mass spectrometry. The mass spectrometer was operated in multiple reaction monitoring mode, and transitions [M + 133] --> 133 were monitored (M, carbohydrate molecular weight). The new method is robust, highly sensitive, rapid, and does not require postcolumn addition or derivatization. It is useful in clinical research for measurement of carbohydrate molecules by isotope dilution assay. PMID:16182559

  9. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  10. Structural and quantitative analysis of Equisetum alkaloids.

    PubMed

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several past studies addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. An HPLC-ESI-MS/MS method combined with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened by the HPLC-ESI-MS/MS method after development of the simple sample work-up, and in most cases the full set of eight alkaloids was detected in all parts of the plant. A high variability of alkaloid content and distribution was found depending on plant organ, plant origin and season, ranging from 88 to 597 mg/kg dry weight. However, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids was achieved. PMID:25823584

  11. Quantitative multi-image analysis for biomedical Raman spectroscopic imaging.

    PubMed

    Hedegaard, Martin A B; Bergholt, Mads S; Stevens, Molly M

    2016-05-01

    Imaging by Raman spectroscopy enables unparalleled label-free insights into cell and tissue composition at the molecular level. With established approaches limited to single-image analysis, there are currently no general guidelines or consensus on how to quantify biochemical components across multiple Raman images. Here, we describe a broadly applicable methodology for the combination of multiple Raman images into a single image for analysis. This is achieved by removing image-specific background interference, unfolding the series of Raman images into a single dataset, and normalising each Raman spectrum to render the Raman images comparable. Multivariate image analysis is finally applied to derive the contributing 'pure' biochemical spectra for relative quantification. We present our methodology using four independently measured Raman images of control cells and four images of cells treated with strontium ions from substituted bioactive glass. We show that the relative biochemical distribution per area of the cells can be quantified. In addition, using k-means clustering, we are able to discriminate between the two cell types over multiple Raman images. This study presents a streamlined quantitative multi-image analysis tool for improving cell/tissue characterisation and opens new avenues in biomedical Raman spectroscopic imaging. PMID:26833935
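
    A compact sketch of the described workflow (unfold several Raman images into one matrix, normalise each spectrum, cluster with k-means) is shown below on simulated spectra; real data would additionally require the image-specific background removal step.

```python
# Sketch: multi-image unfolding, normalisation and k-means clustering of Raman spectra.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
n_images, h, w, n_wavenumbers = 4, 20, 20, 300

def fake_image(shift):
    base = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 150 - shift) / 10) ** 2)
    return base + rng.normal(0, 0.02, (h * w, n_wavenumbers))

# Unfold each (h, w, wavenumber) image into (pixels, wavenumber) and stack them.
stack = np.vstack([fake_image(s) for s in (0, 0, 40, 40)])   # two "cell types"

# Normalise every spectrum to unit total intensity so the images are comparable.
stack /= stack.sum(axis=1, keepdims=True)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(stack)
print("pixels per cluster:", np.bincount(labels))
```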

  12. Joint association analysis of bivariate quantitative and qualitative traits.

    PubMed

    Yuan, Mengdie; Diao, Guoqing

    2011-01-01

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs with not too small MAF. PMID:22373162

  13. Quantitative data analysis of ESAR data

    NASA Astrophysics Data System (ADS)

    Phruksahiran, N.; Chandra, M.

    2013-07-01

    Synthetic aperture radar (SAR) data processing uses the backscattered electromagnetic wave to map the radar reflectivity of the ground surface. The polarization properties exploited in radar remote sensing have been used successfully in many applications, especially in target decomposition. This paper presents a case study of experiments performed on ESAR L-band fully polarized data sets from the German Aerospace Center (DLR) to demonstrate the potential of coherent target decomposition and the possibility of using weather radar measurement parameters, such as differential reflectivity and the linear depolarization ratio, to obtain quantitative information about the ground surface. The raw ESAR data were processed with a SAR simulator developed in MATLAB using the Range-Doppler algorithm.

  14. Qualitative and quantitative analysis of endocytic recycling.

    PubMed

    Reineke, James B; Xie, Shuwei; Naslavsky, Naava; Caplan, Steve

    2015-01-01

    Endocytosis, which encompasses the internalization and sorting of plasma membrane (PM) lipids and proteins to distinct membrane-bound intracellular compartments, is a highly regulated and fundamental cellular process by which eukaryotic cells dynamically regulate their PM composition. Indeed, endocytosis is implicated in crucial cellular processes that include proliferation, migration, and cell division as well as maintenance of tissue homeostasis such as apical-basal polarity. Once PM constituents have been taken up into the cell, either via clathrin-dependent endocytosis (CDE) or clathrin-independent endocytosis (CIE), they typically have two fates: degradation through the late-endosomal/lysosomal pathway or returning to the PM via endocytic recycling pathways. In this review, we will detail experimental procedures that allow for both qualitative and quantitative assessment of endocytic recycling of transmembrane proteins internalized by CDE and CIE, using the HeLa cervical cancer cell line as a model system. PMID:26360033

  15. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119

  16. Validation and Estimation of Additive Genetic Variation Associated with DNA Tests for Quantitative Beef Cattle Traits

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The U.S. National Beef Cattle Evaluation Consortium (NBCEC) has been involved in the validation of commercial DNA tests for quantitative beef quality traits since their first appearance on the U.S. market in the early 2000s. The NBCEC Advisory Council initially requested that the NBCEC set up a syst...

  17. Quantitative infrared analysis of hydrogen fluoride

    SciTech Connect

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF₆. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered a non-ideal gas for many years; D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm⁻¹ as a function of pressure for 100% HF; (2) absorbance at 3877 cm⁻¹ as a function of increasing partial pressure of HF, with the total pressure maintained at 300 mm HgA with nitrogen; and (3) absorbance at 3877 cm⁻¹ for constant partial pressure of HF, with the total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm⁻¹ can therefore be used to analyze HF quantitatively via infrared methods.

  18. Quantitative multi-modal NDT data analysis

    SciTech Connect

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, it is often resorted to multi-modal testing, where complementary and overlapping information from different NDT techniques are combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  19. Quantitative Medical Image Analysis for Clinical Development of Therapeutics

    NASA Astrophysics Data System (ADS)

    Analoui, Mostafa

    There has been significant progress in development of therapeutics for prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as of quality of life, globally. However, due to complexity of addressing a number of medical needs and financial burden of development of new class of therapeutics, there is a need for better tools for decision making and validation of efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed either as adjunct to current clinical endpoints or as surrogates. Imaging biomarkers are among rapidly increasing biomarkers, being examined to expedite effective and rational drug development. Clinical imaging often involves a complex set of multi-modality data sets that require rapid and objective analysis, independent of reviewer's bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with technical review of such methods. These examples include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is discussed. In conclusion, this chapter highlights key challenges and future directions in this area.

  20. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single spectra or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java and should therefore run on any platform providing Java Runtime Environment version 1.6 or newer; currently, however, it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. PMID:21705250

  1. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single spectra or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java and should therefore run on any platform providing Java Runtime Environment version 1.6 or newer; currently, however, it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
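
    In the spirit of the batch-integration task ImatraNMR addresses, though not its actual Java code, the following Python sketch integrates fixed ppm regions across several 1D spectra and writes a CSV table; the file names, ppm axis and integration regions are hypothetical.

```python
# Sketch: batch integration of fixed ppm regions across many 1D spectra, CSV output.
import csv
import numpy as np

ppm = np.linspace(10, 0, 4096)                       # descending ppm axis
dppm = abs(ppm[1] - ppm[0])
regions = {"lactate_CH3": (1.40, 1.25), "glucose_H1": (5.30, 5.15)}

def integrate(spectrum, high, low):
    mask = (ppm <= high) & (ppm >= low)
    return float(spectrum[mask].sum() * dppm)        # rectangle-rule integral

rng = np.random.default_rng(8)
spectra = {f"sample_{i:02d}": rng.random(ppm.size) for i in range(1, 6)}  # placeholders

with open("integrals.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["sample"] + list(regions))
    for name, spec in spectra.items():
        writer.writerow([name] + [integrate(spec, *r) for r in regions.values()])
```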

  2. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    PubMed Central

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H; Jacobsen, Christina; Vainer, Ben

    2016-01-01

    Background: The opportunity for automated histological analysis offered by whole slide scanners implies an ever-increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means of quantitating cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework. PMID:27141321

  3. The quantitative failure of human reliability analysis

    SciTech Connect

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. Actually, the author attacks historic and current HRA as having failed in informing policy makers who make decisions based on risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given subjective and empirical probabilities.

  4. A Quantitative Analysis of Countries' Research Strengths

    ERIC Educational Resources Information Center

    Saxena, Anurag; Brazer, S. David; Gupta, B. M.

    2009-01-01

    This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding.…

  5. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with the feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses of different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that, compared with standard principal component analysis (PCA), produces components with sparse loadings; these are used in conjunction with Hotelling T2 statistical analysis to compare, qualify, and detect faults in the tested systems.
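
    As an illustration of combining PCA scores with Hotelling T2 for fault detection on a table of image-quality metrics, the sketch below uses simulated metric values and an approximate chi-squared control limit; it uses ordinary PCA rather than the modified SPCA developed in the study.

```python
# Sketch: PCA / Hotelling T-squared check on a (simulated) table of image-quality metrics.
import numpy as np
from scipy.stats import chi2
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
baseline = rng.normal(0, 1, (100, 88))            # 100 reference scans x 88 metrics

pca = PCA(n_components=5).fit(baseline)
def t2(x):
    scores = pca.transform(np.atleast_2d(x))
    return (scores ** 2 / pca.explained_variance_).sum(axis=1)

# Simulate a drifted scanner: a large shift along the leading metric direction.
faulty = baseline[-1] + 15 * pca.components_[0]

limit = chi2.ppf(0.99, df=5)                      # approximate 99 % control limit
print("typical baseline T2:", np.median(t2(baseline)).round(1))
print("faulty scan T2:", t2(faulty)[0].round(1), " limit:", limit.round(1))
```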

  6. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  7. Influence of corrosion layers on quantitative analysis

    NASA Astrophysics Data System (ADS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Röhrich, J.; Strub, E.

    2005-09-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed.

  8. Quantitative analysis of airway abnormalities in CT

    NASA Astrophysics Data System (ADS)

    Petersen, Jens; Lo, Pechin; Nielsen, Mads; Edula, Goutham; Ashraf, Haseem; Dirksen, Asger; de Bruijne, Marleen

    2010-03-01

    A coupled surface graph cut algorithm for airway wall segmentation from Computed Tomography (CT) images is presented. Using cost functions that highlight both inner and outer wall borders, the method combines the search for both borders into one graph cut. The proposed method is evaluated on 173 manually segmented images extracted from 15 different subjects and shown to give accurate results, with 37% fewer errors than the Full Width at Half Maximum (FWHM) algorithm and 62% fewer than a similar graph cut method without coupled surfaces. Common measures of airway wall thickness such as the Interior Area (IA) and Wall Area percentage (WA%) were measured by the proposed method on a total of 723 CT scans from a lung cancer screening study. These measures were significantly different for participants with Chronic Obstructive Pulmonary Disease (COPD) compared to asymptomatic participants. Furthermore, reproducibility was good as confirmed by repeat scans, and the measures correlated well with the outcomes of pulmonary function tests, demonstrating the use of the algorithm as a COPD diagnostic tool. Additionally, a new measure of airway wall thickness is proposed, Normalized Wall Intensity Sum (NWIS). NWIS is shown to correlate better with lung function test values and to be more reproducible than the previous measures IA, WA% and airway wall thickness at a lumen perimeter of 10 mm (PI10).

  9. Quantitative analysis of the economically recoverable resource

    SciTech Connect

    Pulle, C.V.; Seskus, A.P.

    1981-05-01

    The objective of this study is to obtain estimates of the economically recoverable gas in the Appalachian Basin. The estimates were obtained in terms of a probability distribution, which quantifies the inherent uncertainty associated with estimates where geologic and production uncertainties prevail. It is established that well productivity on a county and regional basis is lognormally distributed, and the total recoverable gas is Normally distributed. The expected (mean), total economically recoverable gas is 20.2 trillion cubic feet (TCF) with a standard deviation of 1.6 TCF, conditional on the use of shooting technology on 160-acre well-spacing. From properties of the Normal distribution, it is seen that a 95 percent probability exists for the total recoverable gas to lie between 17.06 and 23.34 TCF. The estimates are sensitive to well spacings and the technology applied to a particular geologic environment. It is observed that with smaller well spacings - for example, at 80 acres - the estimate is substantially increased, and that advanced technology, such as foam fracturing, has the potential of significantly increasing gas recovery. However, the threshold and optimum conditions governing advanced exploitation technology, based on well spacing and other parameters, were not analyzed in this study. Their technological impact on gas recovery is mentioned in the text where relevant; and on the basis of a rough projection an additional 10 TCF could be expected with the use of foam fracturing on wells with initial open flows lower than 300 MCFD. From the exploration point of view, the lognormal distribution of well productivity suggests that even in smaller areas, such as a county basis, intense exploration might be appropriate. This is evident from the small tail probabilities of the lognormal distribution, which represent the small number of wells with relatively very high productivity.
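
    The quoted 95 percent interval follows directly from the properties of the Normal distribution with the stated mean and standard deviation. A minimal check, assuming only those two published figures:

        from scipy import stats

        mean_tcf = 20.2   # expected total economically recoverable gas, TCF
        sd_tcf = 1.6      # standard deviation, TCF

        # Central 95% interval of a Normal distribution: mean +/- 1.96 * sd
        low, high = stats.norm.interval(0.95, loc=mean_tcf, scale=sd_tcf)
        print(f"95% interval: {low:.2f} - {high:.2f} TCF")   # ~17.06 - 23.34 TCF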

  10. Quantitative surface spectroscopic analysis of multicomponent polymers

    NASA Astrophysics Data System (ADS)

    Zhuang, Hengzhong

    Angle-dependent electron spectroscopy for chemical analysis (ESCA) has been successfully used to examine the surface compositional gradient of a multicomponent polymer. However, photoelectron intensities detected at each take-off angle of ESCA measurements are convoluted signals. The convoluted nature of the signal distorts depth profiles for samples having compositional gradients. To recover the true concentration profiles for the samples, a deconvolution program has been described in Chapter 2. The compositional profiles of two classes of important multicomponent polymers, i.e., poly(dimethylsiloxane urethane) (PU-DMS) segmented copolymers and fluorinated poly(amide urethane) block copolymers, are achieved using this program. The effects of the polymer molecular structure and the processing variation on its surface compositional profile have been studied. Besides surface composition, it is desirable to know whether the distribution of segment or block lengths at the surface is different than in the bulk, because this aspect of surface structure may lead to properties different than that predicted simply by knowledge of the surface composition and the bulk structure. In Chapter 3, we pioneered the direct determination of the distribution of polydimethylsiloxane (PDMS) segment lengths at the surface of PU-DMS using time-of-flight secondary ion mass spectrometry (ToF-SIMS). Exciting preliminary results are provided: for the thick film of PU-DMS with nominal MW of PDMS = 1000, the distribution of the PDMS segment lengths at the surface is nearly identical to that in the bulk, whereas in the case of the thick films of PU-DMS with nominal MW of PDMS = 2400, only those PDMS segments with MW of ca. 1000 preferentially segregated at the surface. As a potential minimal fouling coating or biocompatible cardiovascular materials, PU-DMS copolymers eventually come into contact with water once in use. Could such an environmental change (from air to aqueous) induce any undesirable

  11. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods, used in pharmaceutical analysis, consists of several components. The analysis of the most important sources of variability of quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These data did not exceed 35%, which is appropriate for traditional plate-count methods. PMID:26456251

  12. [Qualitative and quantitative gamma-hydroxybutyrate analysis].

    PubMed

    Petek, Maja Jelena; Vrdoljak, Ana Lucić

    2006-12-01

    Gamma-hydroxybutyrate (GHB) is a naturally occurring compound present in the brain and peripheral tissues of mammals. It is a minor metabolite and precursor of gamma-aminobutyric acid (GABA). Just as GABA, GHB is believed to play a role in neurotransmission. GHB was first synthesized in vitro in 1960, when it revealed depressive and hypnotic effects on the central nervous system. In the 1960s it was used as an anaesthetic and later as an alternative to anabolic steroids, in order to enhance muscle growth. However, after it was shown that it caused strong physical dependence and severe side effects, GHB was banned. For the last fifteen years, GHB has been abused for its intoxicating effects such as euphoria, reduced inhibitions and sedation. Illicitly, it is available as a white powder or a clear liquid. Paradoxically, GHB can easily be manufactured from its precursor gamma-butyrolactone (GBL), which has not yet been banned. Because of many car accidents and criminal acts in which it is involved, GHB has become an important object of forensic laboratory analysis. This paper describes gas and liquid chromatography, infrared spectroscopy, microscopy, colourimetry and nuclear magnetic resonance as methods for detection and quantification of GHB in urine and illicit products. PMID:17265679

  13. Quantitative analysis of in vivo cell proliferation.

    PubMed

    Cameron, Heather A

    2006-11-01

    Injection and immunohistochemical detection of 5-bromo-2'-deoxyuridine (BrdU) has become the standard method for studying the birth and survival of neurons, glia, and other cell types in the nervous system. BrdU, a thymidine analog, becomes stably incorporated into DNA during the S-phase of the cell cycle. Because DNA containing BrdU can be specifically recognized by antibodies, this method allows dividing cells to be marked at any given time and then identified at time points from a few minutes to several years later. BrdU immunohistochemistry is suitable for cell counting to examine the regulation of cell proliferation and cell fate. It can be combined with labeling by other antibodies, allowing confocal analysis of cell phenotype or expression of other proteins. The potential for nonspecific labeling and toxicity is discussed. Although BrdU immunohistochemistry has almost completely replaced tritiated thymidine autoradiography for labeling dividing cells, this method and situations in which it is still useful are also described. PMID:18428635

  14. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of 'Absorbance-Wavenumber-Retention time' data is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to verify that a chromatographic peak is 'pure'; the analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. Under these conditions, we have developed a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and caffeine concentration are discussed at two steps of the data treatment.

  15. Gender Differences in Learning Styles: A Narrative Review and Quantitative Meta-Analysis.

    ERIC Educational Resources Information Center

    Severiens, Sabine E.; Ten Dam, Geert T. N.

    1994-01-01

    Research since 1980 on gender and learning styles of students over age 18 is reviewed for commonalities in theory and research methodology. In addition, a quantitative meta-analysis was undertaken on two measures of learning style and study behavior to determine the direction and magnitude of gender differences in various samples. (Author/MSE)

  16. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.

  17. Separation and quantitative analysis of alkyl sulfate ethoxymers by HPLC.

    PubMed

    Morvan, Julien; Hubert-Roux, Marie; Agasse, Valérie; Cardinael, Pascal; Barbot, Florence; Decock, Gautier; Bouillon, Jean-Philippe

    2008-01-01

    Separation of alkyl sulfate ethoxymers is investigated on various high-performance liquid chromatography (HPLC) stationary phases: Acclaim C18 Surfactant, Surfactant C8, and Hypercarb. For a fixed alkyl chain length, ethoxymers are eluted in the order of increasing number of ethoxylated units on Acclaim C18 Surfactant, whereas a reversed elution order is observed on Surfactant C8 and Hypercarb. Moreover, on an Acclaim C18 Surfactant column, non-ethoxylated compounds are eluted within their ethoxymer distribution, and the use of a sodium acetate additive in the mobile phase leads to co-elution of ethoxymers. HPLC stationary phases dedicated to surfactant analysis are evaluated by means of the Tanaka test. Surfactant C8 presents high silanol activity, whereas Acclaim C18 Surfactant shows high steric selectivity. For alkyl sulfates, the linearity of the calibration curve and the limits of detection and quantitation are evaluated. The amount of sodium laureth sulfate raw material found in a commercial body product is in agreement with the specification of the manufacturer. PMID:19007494

  18. Quantitative phase imaging applied to laser damage detection and analysis.

    PubMed

    Douti, Dam-Bé L; Chrayteh, Mhamad; Aknoun, Sherazade; Doualle, Thomas; Hecquet, Christophe; Monneret, Serge; Gallais, Laurent

    2015-10-01

    We investigate phase imaging as a measurement method for laser damage detection and analysis of laser-induced modification of optical materials. Experiments have been conducted with a wavefront sensor based on lateral shearing interferometry associated with a high-magnification optical microscope. The system has been used for the in-line observation of optical thin films and bulk samples, laser irradiated in two different conditions: 500 fs pulses at 343 and 1030 nm, and millisecond to second irradiation with a CO2 laser at 10.6 μm. We investigate the measurement of the laser-induced damage threshold of optical materials by detection of phase changes and show that the technique achieves high sensitivity, with optical path difference measurements below 1 nm. Additionally, the quantitative information on the refractive index or surface modification of the samples under test that is provided by the system has been compared to classical metrology instruments used for laser damage or laser ablation characterization (an atomic force microscope, a differential interference contrast microscope, and an optical surface profiler). An accurate in-line measurement of the morphology of laser-ablated sites, from a few nanometers to hundreds of microns in depth, is shown. PMID:26479612

  19. Quantitative analysis of flagellar proteins in Drosophila sperm tails.

    PubMed

    Mendes Maia, Teresa; Paul-Gilloteaux, Perrine; Basto, Renata

    2015-01-01

    The cilium has a well-defined structure, which can still accommodate some morphological and molecular composition diversity to suit the functional requirements of different cell types. The sperm flagellum of the fruit fly Drosophila melanogaster appears to be a good model to study the genetic regulation of axoneme assembly and motility, due to the wealth of genetic tools publicly available for this organism. In addition, the fruit fly's sperm flagellum displays quite a long axoneme (∼1.8 mm), which may facilitate both histological and biochemical analyses. Here, we present a protocol for imaging and quantitatively analyzing proteins that associate with differentiating and mature sperm flagella in the fly. We will use as an example the quantification of tubulin polyglycylation in wild-type testes and in Bug22 mutant testes, which present defects in the deposition of this posttranslational modification. During sperm biogenesis, flagella appear tightly bundled, which makes it more challenging to get accurate measurements of protein levels from immunostained specimens. The method we present is based on a novel semiautomated macro installed in the image-processing software ImageJ. It allows fluorescence levels to be measured in closely associated sperm tails through an exact distinction between positive and background signals, and provides background-corrected pixel intensity values that can be used directly for data analysis. PMID:25837396

  20. Quantitative transverse flow measurement using OCT speckle decorrelation analysis

    PubMed Central

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Mathews, Scott A.; Kang, Jin U.

    2014-01-01

    We propose an inter-Ascan speckle decorrelation based method that can quantitatively assess blood flow normal to the direction of the OCT imaging beam. To validate this method, we performed a systematic study using both phantom and in vivo animal models. Results show that our speckle analysis method can accurately extract transverse flow speed with high spatial and temporal resolution. PMID:23455305
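
    Speckle decorrelation between successive A-scans is the quantity this record builds on: faster transverse flow decorrelates the speckle pattern more strongly. The following sketch is a generic, simplified decorrelation estimate on synthetic data, not the authors' calibrated algorithm; the Rayleigh-distributed intensities merely stand in for OCT speckle.

        import numpy as np

        def decorrelation(a_scans):
            """Mean inter-A-scan decorrelation (1 - Pearson correlation) along a B-scan.

            a_scans: 2-D array, shape (n_ascans, n_depth), OCT intensity.
            """
            d = []
            for i in range(len(a_scans) - 1):
                r = np.corrcoef(a_scans[i], a_scans[i + 1])[0, 1]
                d.append(1.0 - r)
            return float(np.mean(d))

        # Synthetic example: uncorrelated speckle (fast transverse flow) vs. identical A-scans (no flow)
        rng = np.random.default_rng(1)
        fast = rng.rayleigh(size=(100, 256))
        static = np.tile(rng.rayleigh(size=256), (100, 1))
        print(decorrelation(fast), decorrelation(static))   # ~1.0 vs. 0.0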

  1. Early Child Grammars: Qualitative and Quantitative Analysis of Morphosyntactic Production

    ERIC Educational Resources Information Center

    Legendre, Geraldine

    2006-01-01

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is…

  2. Quantitating the subtleties of microglial morphology with fractal analysis

    PubMed Central

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
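
    Box counting is the core operation named in this review. Below is a minimal sketch of a box-counting dimension estimate on a binary image; the filled square is a toy input chosen only because its expected dimension is 2, and the box sizes are arbitrary assumptions.

        import numpy as np

        def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
            """Estimate the box-counting (fractal) dimension of a binary 2-D image."""
            counts = []
            for s in sizes:
                # count boxes of side s that contain at least one foreground pixel
                h = img.shape[0] // s * s
                w = img.shape[1] // s * s
                blocks = img[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(blocks.any(axis=(1, 3)).sum())
            # slope of log(count) vs. log(1/size) gives the dimension
            slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return slope

        # A filled square should give a dimension close to 2
        square = np.zeros((128, 128), dtype=bool)
        square[32:96, 32:96] = True
        print(box_counting_dimension(square))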

  3. Development of a quantitative autoradiography image analysis system

    SciTech Connect

    Hoffman, T.J.; Volkert, W.A.; Holmes, R.A.

    1986-03-01

    A low cost image analysis system suitable for quantitative autoradiography (QAR) analysis has been developed. Autoradiographs can be digitized using a conventional Newvicon television camera interfaced to an IBM-XT microcomputer. Software routines for image digitization and capture permit the acquisition of thresholded or windowed images with graphic overlays that can be stored on storage devices. Image analysis software performs all background and non-linearity corrections prior to display as black/white or pseudocolor images. The relationship of pixel intensity to a standard radionuclide concentration allows the production of quantitative maps of tissue radiotracer concentrations. An easily modified subroutine is provided for adaptation to use appropriate operational equations when parameters such as regional cerebral blood flow or regional cerebral glucose metabolism are under investigation. This system could provide smaller research laboratories with the capability of QAR analysis at relatively low cost.
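
    The record describes relating pixel intensity to a standard radionuclide concentration to produce quantitative tissue maps. The sketch below is a generic illustration of that calibration step; the standard intensities, activities and the simple piecewise-linear mapping are hypothetical stand-ins for the system's correction routines and operational equations.

        import numpy as np

        # Hypothetical calibration standards co-exposed with the tissue sections:
        # mean pixel intensity (after background/non-linearity correction) vs. known activity
        std_intensity = np.array([12.0, 35.0, 60.0, 92.0, 130.0, 180.0])
        std_activity = np.array([0.0, 50.0, 100.0, 200.0, 400.0, 800.0])   # e.g. nCi/g

        # Apply the calibration pixel-by-pixel to obtain a quantitative activity map
        image = np.random.default_rng(0).uniform(12.0, 180.0, size=(256, 256))
        activity_map = np.interp(image, std_intensity, std_activity)
        print(activity_map.mean())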

  4. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented for a group of subjects with significant coronary artery stenosis and for a group of controls, obtained with a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  5. Quantitative numerical analysis of transient IR-experiments on buildings

    NASA Astrophysics Data System (ADS)

    Maierhofer, Ch.; Wiggenhauser, H.; Brink, A.; Röllig, M.

    2004-12-01

    Impulse-thermography has been established as a fast and reliable tool in many areas of non-destructive testing. In recent years several investigations have been done to apply active thermography to civil engineering. For quantitative investigations in this area of application, finite difference calculations have been performed for systematic studies on the influence of environmental conditions, heating power and time, defect depth and size and thermal properties of the bulk material (concrete). The comparison of simulated and experimental data enables the quantitative analysis of defects.

  6. Scanning tunneling microscopy on rough surfaces-quantitative image analysis

    NASA Astrophysics Data System (ADS)

    Reiss, G.; Brückl, H.; Vancea, J.; Lecheler, R.; Hastreiter, E.

    1991-07-01

    In this communication, the application of scanning tunneling microscopy (STM) for a quantitative evaluation of roughnesses and mean island sizes of polycrystalline thin films is discussed. Provided strong conditions concerning the resolution are satisfied, the results are in good agreement with standard techniques as, for example, transmission electron microscopy. Owing to its high resolution, STM can supply a better characterization of surfaces than established methods, especially concerning the roughness. Microscopic interpretations of surface dependent physical properties thus can be considerably improved by a quantitative analysis of STM images.

  7. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  8. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  9. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  10. Quantitative proteomics analysis of adsorbed plasma proteins classifies nanoparticles with different surface properties and size

    SciTech Connect

    Zhang, Haizhen; Burnum, Kristin E.; Luna, Maria L.; Petritis, Brianne O.; Kim, Jong Seo; Qian, Weijun; Moore, Ronald J.; Heredia-Langner, Alejandro; Webb-Robertson, Bobbie-Jo M.; Thrall, Brian D.; Camp, David G.; Smith, Richard D.; Pounds, Joel G.; Liu, Tao

    2011-12-01

    In biofluids (e.g., blood plasma) nanoparticles are readily embedded in layers of proteins that can affect their biological activity and biocompatibility. Herein, we report a study on the interactions between human plasma proteins and nanoparticles with a controlled systematic variation of properties, using stable isotope labeling and liquid chromatography-mass spectrometry (LC-MS) based quantitative proteomics. A novel protocol has been developed to simplify the isolation of nanoparticle-bound proteins and improve the reproducibility. Plasma proteins associated with polystyrene nanoparticles with three different surface chemistries and two sizes, as well as four different exposure times (for a total of 24 different samples), were identified and quantified by LC-MS analysis. Quantitative comparison of relative protein abundances was achieved by spiking an 18O-labeled 'universal reference' into each individually processed unlabeled sample as an internal standard, enabling simultaneous application of both label-free and isotopic labeling quantitation across the sample set. Clustering analysis of the quantitative proteomics data resulted in a distinctive pattern that classifies the nanoparticles based on their surface properties and size. In addition, data from the temporal study indicated that the stable protein 'corona' that was isolated for the quantitative analysis appeared to be formed in less than 5 minutes. The comprehensive results obtained herein using quantitative proteomics have potential implications towards predicting nanoparticle biocompatibility.
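
    The classification reported above comes from clustering quantitative abundance profiles. The following is a generic hierarchical-clustering sketch on synthetic log-ratio data; the replicate structure, noise level and distance metric are assumptions and do not reproduce the study's actual analysis.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        # Synthetic log2 abundance ratios (samples x proteins): three hypothetical
        # nanoparticle types, each measured in duplicate, stand in for corona data
        rng = np.random.default_rng(0)
        base = rng.normal(size=(3, 50))
        profiles = np.repeat(base, 2, axis=0) + 0.1 * rng.normal(size=(6, 50))

        # Average-linkage hierarchical clustering on correlation distance
        Z = linkage(pdist(profiles, metric="correlation"), method="average")
        labels = fcluster(Z, t=3, criterion="maxclust")
        print(labels)   # replicates of the same particle type should share a label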

  11. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. PMID:24889823

  12. Quantitative comparison of analysis methods for spectroscopic optical coherence tomography: reply to comment

    PubMed Central

    Bosschaart, Nienke; van Leeuwen, Ton G.; Aalders, Maurice C.G.; Faber, Dirk J.

    2014-01-01

    We reply to the comment by Kraszewski et al on “Quantitative comparison of analysis methods for spectroscopic optical coherence tomography.” We present additional simulations evaluating the proposed window function. We conclude that our simulations show good qualitative agreement with the results of Kraszewski, in support of their conclusion that SOCT optimization should include window shape, next to choice of window size and analysis algorithm. PMID:25401016

  13. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  14. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on some selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. PMID:26556680

  15. An Analysis of Critical Factors for Quantitative Immunoblotting

    PubMed Central

    Janes, Kevin A.

    2015-01-01

    Immunoblotting (also known as Western blotting) combined with digital image analysis can be a reliable method for analyzing the abundance of proteins and protein modifications, but not every immunoblot-analysis combination produces an accurate result. Here, I illustrate how sample preparation, protocol implementation, detection scheme, and normalization approach profoundly affect the quantitative performance of immunoblotting. This study implemented diagnostic experiments that assess an immunoblot-analysis workflow for accuracy and precision. The results showed that ignoring such diagnostics can lead to pseudoquantitative immunoblot data that dramatically overestimate or underestimate true differences in protein abundance. PMID:25852189

  16. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    PubMed Central

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the image quantitative analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was utilized to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were distinct. Four texture parameters extracted by GLCM clearly reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the image quantitative analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies references for other similar studies. PMID:24744684
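
    The texture step of this pipeline rests on GLCM descriptors. A minimal sketch of that step with scikit-image follows; the BEMD preprocessing is omitted, the input image is synthetic, and older scikit-image versions spell the functions greycomatrix/greycoprops.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        # Hypothetical 8-bit grayscale image (e.g., an IMF component of a microtubule image)
        rng = np.random.default_rng(0)
        img = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)

        # Co-occurrence matrix for a one-pixel offset at four orientations
        glcm = graycomatrix(img, distances=[1],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=256, symmetric=True, normed=True)

        # Four common texture descriptors derived from the GLCM
        for prop in ("contrast", "correlation", "energy", "homogeneity"):
            print(prop, graycoprops(glcm, prop).mean())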

  17. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in the quantitative analysis associated with each sampling strategy to be assessed, and allow a proper number of replicates to be defined for future quantitative analyses. PMID:27358910

  18. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107

  19. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    PubMed Central

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350

  20. Single-Molecule Sensors: Challenges and Opportunities for Quantitative Analysis.

    PubMed

    Gooding, J Justin; Gaus, Katharina

    2016-09-12

    Measurement science has been converging to smaller and smaller samples, such that it is now possible to detect single molecules. This Review focuses on the next generation of analytical tools that combine single-molecule detection with the ability to measure many single molecules simultaneously and/or process larger and more complex samples. Such single-molecule sensors constitute a new type of quantitative analytical tool, as they perform analysis by molecular counting and thus potentially capture the heterogeneity of the sample. This Review outlines the advantages and potential of these new, quantitative single-molecule sensors, the measurement challenges in making single-molecule devices suitable for analysis, the inspiration biology provides for overcoming these challenges, and some of the solutions currently being explored. PMID:27444661

  1. Quantitative NMR Analysis of Partially Substituted Biodiesel Glycerols

    SciTech Connect

    Nagy, M.; Alleman, T. L.; Dyer, T.; Ragauskas, A. J.

    2009-01-01

    Phosphitylation of hydroxyl groups in biodiesel samples with 2-chloro-4,4,5,5-tetramethyl-1,3,2-dioxaphospholane followed by 31P-NMR analysis provides a rapid quantitative analytical technique for the determination of substitution patterns on partially esterified glycerols. The unique 31P-NMR chemical shift data was established with a series of mono- and di-substituted fatty acid esters of glycerol and then utilized to characterize an industrial sample of partially processed biodiesel.

  2. Quantitative analysis of chaotic synchronization by means of coherence

    NASA Astrophysics Data System (ADS)

    Shabunin, A.; Astakhov, V.; Kurths, J.

    2005-07-01

    We use an index of chaotic synchronization based on the averaged coherence function for the quantitative analysis of the process of the complete synchronization loss in unidirectionally coupled oscillators and maps. We demonstrate that this value manifests different stages of the synchronization breaking. It is invariant to time delay and insensitive to small noise and distortions, which can influence the accessible signals at measurements. Peculiarities of the synchronization destruction in maps and oscillators are investigated.
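
    The synchronization index discussed above is built from an averaged coherence function. A simplified proxy using scipy.signal.coherence on synthetic signals is sketched below; the coupling model, noise level and averaging choices are assumptions, not the authors' exact definition.

        import numpy as np
        from scipy.signal import coherence

        rng = np.random.default_rng(0)
        s = np.cumsum(rng.standard_normal(10000))        # common broadband signal
        x = s + 0.1 * rng.standard_normal(s.size)
        y = s + 0.1 * rng.standard_normal(s.size)        # strongly "synchronized" with x
        z = np.cumsum(rng.standard_normal(s.size))       # independent signal

        for label, sig in (("coupled", y), ("independent", z)):
            f, cxy = coherence(x, sig, fs=1.0, nperseg=512)
            # Averaged coherence: close to 1 for synchronized signals, much lower otherwise
            print(label, cxy.mean())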

  3. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    NASA Astrophysics Data System (ADS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.
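
    Reading the normal-mode frequencies off a spectrum of the recorded acceleration is the essence of such an experiment. The sketch below uses synthetic data standing in for an exported accelerometer log; the sampling rate and mode frequencies are assumed values, not those of the paper.

        import numpy as np

        fs = 100.0                          # assumed sample rate of the phone accelerometer (Hz)
        t = np.arange(0, 30, 1 / fs)
        # Synthetic coupled oscillation: superposition of a symmetric (1.2 Hz) and
        # an asymmetric (1.6 Hz) normal mode, standing in for the sensor signal
        a = np.cos(2 * np.pi * 1.2 * t) + 0.8 * np.cos(2 * np.pi * 1.6 * t)

        spec = np.abs(np.fft.rfft(a * np.hanning(a.size)))
        freqs = np.fft.rfftfreq(a.size, d=1 / fs)
        peaks = freqs[np.argsort(spec)[-2:]]     # two strongest spectral lines
        print(sorted(peaks))                     # ~[1.2, 1.6] Hz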

  4. Optimal Multicomponent Analysis Using the Generalized Standard Addition Method.

    ERIC Educational Resources Information Center

    Raymond, Margaret; And Others

    1983-01-01

    Describes an experiment on the simultaneous determination of chromium and magnesium by spectophotometry modified to include the Generalized Standard Addition Method computer program, a multivariate calibration method that provides optimal multicomponent analysis in the presence of interference and matrix effects. Provides instructions for…
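
    In the Generalized Standard Addition Method, known increments of each analyte are added and the resulting response changes are regressed against the added concentrations to estimate the sensitivity matrix, from which the initial concentrations follow. The sketch below uses a synthetic two-analyte, two-wavelength system; all sensitivities, concentrations and addition levels are hypothetical.

        import numpy as np

        # Hypothetical sensitivities: rows = analytes, columns = wavelengths
        K_true = np.array([[0.90, 0.20],
                           [0.15, 0.70]])
        c0 = np.array([1.0, 2.0])              # unknown initial concentrations
        r0 = c0 @ K_true                       # initial absorbance readings

        # Standard additions: known concentration increments and the measured response changes
        dC = np.array([[0.5, 0.0],
                       [0.0, 0.5],
                       [0.5, 0.5],
                       [1.0, 0.5]])
        dR = dC @ K_true + 0.001 * np.random.default_rng(0).standard_normal((4, 2))

        # GSAM: estimate the sensitivity matrix from the additions, then invert it
        K_est, *_ = np.linalg.lstsq(dC, dR, rcond=None)
        c0_est = r0 @ np.linalg.inv(K_est)
        print(c0_est)                          # ~[1.0, 2.0]

    With more additions than analytes the least-squares estimate of the sensitivity matrix is overdetermined, which is what lets the method correct for interferences and matrix effects.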

  5. Quantitation of glycerophosphorylcholine by flow injection analysis using immobilized enzymes.

    PubMed

    Mancini, A; Del Rosso, F; Roberti, R; Caligiana, P; Vecchini, A; Binaglia, L

    1996-09-20

    A method for quantitating glycerophosphorylcholine by flow injection analysis is reported in the present paper. Glycerophosphorylcholine phosphodiesterase and choline oxidase, immobilized on controlled porosity glass beads, are packed in a small reactor inserted in a flow injection manifold. When samples containing glycerophosphorylcholine are injected, glycerophosphorylcholine is hydrolyzed into choline and sn-glycerol-3-phosphate. The free choline produced in this reaction is oxidized to betaine and hydrogen peroxide. Hydrogen peroxide is detected amperometrically. Quantitation of glycerophosphorylcholine in samples containing choline and phosphorylcholine is obtained by inserting ahead of the reactor a small column packed with a mixed bed ion exchange resin. The time needed for each determination does not exceed one minute. The present method, applied to quantitate glycerophosphorylcholine in samples of seminal plasma, gave results comparable with those obtained using the standard enzymatic-spectrophotometric procedure. An alternative procedure, making use of co-immobilized glycerophosphorylcholine phosphodiesterase and glycerol-3-phosphate oxidase for quantitating glycerophosphorylcholine, glycerophosphorylethanolamine and glycerophosphorylserine, is also described. PMID:8905629

  6. Quantitative multivariate analysis of dynamic multicellular morphogenic trajectories.

    PubMed

    White, Douglas E; Sylvester, Jonathan B; Levario, Thomas J; Lu, Hang; Streelman, J Todd; McDevitt, Todd C; Kemp, Melissa L

    2015-07-01

    Interrogating fundamental cell biology principles that govern tissue morphogenesis is critical to better understanding of developmental biology and engineering novel multicellular systems. Recently, functional micro-tissues derived from pluripotent embryonic stem cell (ESC) aggregates have provided novel platforms for experimental investigation; however elucidating the factors directing emergent spatial phenotypic patterns remains a significant challenge. Computational modelling techniques offer a unique complementary approach to probe mechanisms regulating morphogenic processes and provide a wealth of spatio-temporal data, but quantitative analysis of simulations and comparison to experimental data is extremely difficult. Quantitative descriptions of spatial phenomena across multiple systems and scales would enable unprecedented comparisons of computational simulations with experimental systems, thereby leveraging the inherent power of computational methods to interrogate the mechanisms governing emergent properties of multicellular biology. To address these challenges, we developed a portable pattern recognition pipeline consisting of: the conversion of cellular images into networks, extraction of novel features via network analysis, and generation of morphogenic trajectories. This novel methodology enabled the quantitative description of morphogenic pattern trajectories that could be compared across diverse systems: computational modelling of multicellular structures, differentiation of stem cell aggregates, and gastrulation of cichlid fish. Moreover, this method identified novel spatio-temporal features associated with different stages of embryo gastrulation, and elucidated a complex paracrine mechanism capable of explaining spatiotemporal pattern kinetic differences in ESC aggregates of different sizes. PMID:26095427
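
    The pipeline above converts cellular images into networks and extracts features from them. A generic sketch of that conversion using a Delaunay triangulation of hypothetical cell centroids and a few standard network measures is shown below; it is not the authors' specific feature set or trajectory construction.

        import numpy as np
        import networkx as nx
        from scipy.spatial import Delaunay

        # Hypothetical cell centroids (e.g., nuclei detected in one image of an aggregate)
        rng = np.random.default_rng(0)
        points = rng.uniform(0, 100, size=(200, 2))

        # Build a cell-neighbour graph from the Delaunay triangulation of the centroids
        tri = Delaunay(points)
        G = nx.Graph()
        for simplex in tri.simplices:
            for i in range(3):
                for j in range(i + 1, 3):
                    G.add_edge(int(simplex[i]), int(simplex[j]))

        # Simple network features that could feed a pattern-recognition / trajectory analysis
        features = {
            "mean_degree": np.mean([d for _, d in G.degree()]),
            "clustering": nx.average_clustering(G),
            "diameter": nx.diameter(G),
        }
        print(features)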

  7. Quantitative analysis of surface electromyography: Biomarkers for convulsive seizures.

    PubMed

    Beniczky, Sándor; Conradsen, Isa; Pressler, Ronit; Wolf, Peter

    2016-08-01

    In electroencephalographic (EEG) practice, muscle activity during seizures is often considered an irritating artefact. This article discusses how surface electromyography (EMG) can turn it into a valuable tool of epileptology. Muscles are in direct synaptic contact with motor neurons. Therefore, EMG signals provide direct information about the electric activity in the motor cortex. Qualitative analysis of EMG has traditionally been a part of long-term video-EEG recordings. Recent developments in the quantitative analysis of EMG signals have yielded valuable information on the pathomechanisms of convulsive seizures, demonstrating that the muscle activity differs from that of maximal voluntary contraction and from that of convulsive psychogenic non-epileptic seizures. Furthermore, the tonic phase of the generalised tonic-clonic seizures (GTCS) proved to have different quantitative features than tonic seizures. The high temporal resolution of EMG allowed detailed characterisation of the temporal dynamics of the GTCS, suggesting that the same inhibitory mechanisms that try to prevent the build-up of the seizure activity contribute to ending the seizure. These findings have clinical implications: the quantitative EMG features provided the pathophysiologic substrate for developing neurophysiologic biomarkers that accurately identify GTCS. This proved to be efficient both for seizure detection and for objective, automated distinction between convulsive and non-convulsive epileptic seizures. PMID:27212115

  8. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  9. Label-Free Technologies for Quantitative Multiparameter Biological Analysis

    PubMed Central

    Qavi, Abraham J.; Washburn, Adam L.; Byeon, Ji-Yeon; Bailey, Ryan C.

    2009-01-01

    In the post-genomic era, information is king and information-rich technologies are critically important drivers in both fundamental biology and medicine. It is now known that single-parameter measurements provide only limited detail and that quantitation of multiple biomolecular signatures can more fully illuminate complex biological function. Label-free technologies have recently attracted significant interest for sensitive and quantitative multiparameter analysis of biological systems. There are several different classes of label-free sensors that are currently being developed both in academia and in industry. In this critical review, we highlight, compare, and contrast some of the more promising approaches. We will describe the fundamental principles of these different methodologies and discuss advantages and disadvantages that might potentially help one in selecting the appropriate technology for a given bioanalytical application. PMID:19221722

  10. Microcomputer-based digital image analysis system for quantitative autoradiography

    SciTech Connect

    Hoffman, T.J.; Volkert, W.A.; Holmes, R.A.

    1988-01-01

    A computerized image processing system utilizing an IBM-XT personal microcomputer with the capability of performing quantitative cerebral autoradiography is described. All of the system components are standard computer and optical hardware that can be easily assembled. The system has 512 horizontal by 512 vertical axis resolution with 8 bits per pixel (256 gray levels). Unlike other dedicated image processing systems, the IBM-XT permits the assembly of an efficient, low-cost image analysis system without sacrificing other capabilities of the IBM personal computer. The application of this system in both qualitative and quantitative autoradiography has been the principal factor in developing a new radiopharmaceutical to measure regional cerebral blood flow.

  11. Quantitative analysis of astrogliosis in drug-dependent humans.

    PubMed

    Weber, Marco; Scherf, Nico; Kahl, Thomas; Braumann, Ulf-Dietrich; Scheibe, Patrick; Kuska, Jens-Peer; Bayer, Ronny; Büttner, Andreas; Franke, Heike

    2013-03-15

    Drug addiction is a chronic, relapsing disease caused by neurochemical and molecular changes in the brain. In this human autopsy study, qualitative and quantitative changes of glial fibrillary acidic protein (GFAP)-positive astrocytes in the hippocampus of 26 lethally intoxicated drug addicts and 35 matched controls are described. The morphological characterization of these cells reflected alterations representative of astrogliosis. However, neither quantification of GFAP-positive cells nor the Western blot analysis indicated statistically significant differences between drug fatalities and controls. By semi-quantitative scoring, a significant shift towards higher numbers of activated astrocytes in the drug group was detected. To assess morphological changes quantitatively, graph-based representations of astrocyte morphology were obtained from single cell images captured by confocal laser scanning microscopy. Their underlying structures were used to quantify changes in astroglial fibers in an automated fashion. This morphometric analysis yielded significant differences between the investigated groups for four different measures of fiber characteristics (Euclidean distance, graph distance, number of graph elements, fiber skeleton distance), indicating that, e.g., astrocytes in drug addicts on average exhibit significant elongation of fiber structures as well as a two-fold increase in GFAP-positive fibers as compared with those in controls. In conclusion, the present data show characteristic differences in morphology of hippocampal astrocytes in drug addicts versus controls and further support the involvement of astrocytes in the human pathophysiology of drug addiction. The automated quantification of astrocyte morphologies provides a novel, testable way to assess the fiber structures in a quantitative manner as opposed to standard, qualitative descriptions. PMID:23337617

  12. Computerized rapid high resolution quantitative analysis of plasma lipoproteins based upon single vertical spin centrifugation.

    PubMed

    Cone, J T; Segrest, J P; Chung, B H; Ragland, J B; Sabesin, S M; Glasscock, A

    1982-08-01

    A method has been developed for rapidly quantitating the cholesterol concentration of normal and certain variant lipoproteins in a large number of patients (over 240 in one week). The method employs a microcomputer interfaced to the vertical autoprofiler (VAP) described earlier (Chung et al. 1981. J. Lipid Res. 22: 1003-1014). Software developed to accomplish rapid on-line analysis of the VAP signal uses peak shapes and positions derived from prior VAP analysis of isolated authentic lipoproteins HDL, LDL, and VLDL to quantitate these species in a VAP profile. Variant lipoproteins VHDL (a species with density greater than that of HDL(3)), MDL (a species, most likely Lp(a), with density intermediate between that of HDL and LDL), and IDL are subsequently quantitated by a method combining difference calculations with curve shapes. The procedure has been validated qualitatively by negative stain electron microscopy, gradient gel electrophoresis, strip electrophoresis, chemical analysis of the lipids, radioimmunoassay of the apolipoproteins, and measurement of the density of the peak centers. It has been validated quantitatively by comparison with Lipid Research Clinic methodology for HDL-, LDL-, and VLDL-cholesterol, and for MDL- and IDL-cholesterol by comparison of the amounts of MDL or IDL predicted to be present by the method with that known to be present following standard addition to whole plasma. These validations show that the method is a rapid and accurate technique of lipoprotein analysis suitable for the routine screening of patients for abnormal amounts of normal or variant lipoproteins, as well as for use as a research tool for quantitation of changes in cholesterol content of six or seven different plasma lipoprotein fractions.-Cone, J. T., J. P. Segrest, B. H. Chung, J. B. Ragland, S. M. Sabesin, and A. Glasscock. Computerized rapid high resolution quantitative analysis of plasma lipoproteins based upon single vertical spin centrifugation. PMID:7130860
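
    Quantitation from peak shapes and positions in an absorbance profile can be illustrated with a generic sum-of-Gaussians fit. The profile, peak positions and three-peak model below are synthetic stand-ins and do not reproduce the VAP signal model or its difference-calculation steps for variant lipoproteins.

        import numpy as np
        from scipy.optimize import curve_fit

        def three_peaks(x, a1, m1, s1, a2, m2, s2, a3, m3, s3):
            """Sum of three Gaussian peaks (stand-ins for HDL, LDL and VLDL signals)."""
            g = lambda a, m, s: a * np.exp(-0.5 * ((x - m) / s) ** 2)
            return g(a1, m1, s1) + g(a2, m2, s2) + g(a3, m3, s3)

        x = np.linspace(0, 10, 400)                  # tube position / profile coordinate
        true = (1.0, 2.0, 0.4, 2.5, 5.0, 0.6, 0.8, 8.0, 0.5)
        y = three_peaks(x, *true) + 0.01 * np.random.default_rng(0).standard_normal(x.size)

        p0 = (1, 2, 0.5, 2, 5, 0.5, 1, 8, 0.5)       # rough initial guesses for the fit
        popt, _ = curve_fit(three_peaks, x, y, p0=p0)

        # Area under each fitted peak is proportional to that fraction's cholesterol content
        areas = [popt[i] * popt[i + 2] * np.sqrt(2 * np.pi) for i in (0, 3, 6)]
        print(areas)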

  13. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.

  14. Bayesian Shrinkage Analysis of Quantitative Trait Loci for Dynamic Traits

    PubMed Central

    Yang, Runqing; Xu, Shizhong

    2007-01-01

    Many quantitative traits are measured repeatedly during the life of an organism. Such traits are called dynamic traits. The pattern of the changes of a dynamic trait is called the growth trajectory. Studying the growth trajectory may enhance our understanding of its genetic architecture. Recently, we developed an interval-mapping procedure to map QTL for dynamic traits under the maximum-likelihood framework. We fit the growth trajectory with Legendre polynomials. The method was intended to map one QTL at a time, and the entire QTL analysis involved scanning the genome by fitting multiple single-QTL models. In this study, we propose a Bayesian shrinkage analysis for estimating and mapping multiple QTL in a single model. The method is a combination of the shrinkage mapping for individual quantitative traits and the Legendre polynomial analysis for dynamic traits. The multiple-QTL model is implemented in two ways: (1) a fixed-interval approach where a QTL is placed in each marker interval and (2) a moving-interval approach where the position of a QTL can be searched in a range that covers many marker intervals. A simulation study shows that the Bayesian shrinkage method generates much better signals for QTL than the interval-mapping approach. We propose several alternative methods to present the results of the Bayesian shrinkage analysis. In particular, we found that the Wald test-statistic profile can serve as a mechanism to test the significance of a putative QTL. PMID:17435239
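
    A minimal sketch of the trajectory-smoothing idea, assuming simulated phenotypes: the growth curve of one individual is represented by a low-order Legendre polynomial, and it is coefficients of this kind that a multiple-QTL model would then relate to genotypes.

```python
# Hedged sketch: fit a Legendre-polynomial growth trajectory to repeated measurements.
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 10)                              # measurement ages rescaled to [0, 1]
y = 5 + 20 * t - 8 * t**2 + rng.normal(0, 0.5, t.size)     # simulated dynamic trait

trajectory = legendre.Legendre.fit(t, y, deg=3)            # low-order Legendre description
print(trajectory.convert().coef)                           # coefficients a QTL model would use
print(trajectory(0.5))                                     # fitted trait value mid-trajectory
```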

  15. Analysis and Evaluation of Supersonic Underwing Heat Addition

    NASA Technical Reports Server (NTRS)

    Luidens, Roger W.; Flaherty, Richard J.

    1959-01-01

    The linearized theory for heat addition under a wing has been developed to optimize wing geometry, heat addition, and angle of attack. The optimum wing has all of the thickness on the underside of the airfoil, with the maximum-thickness point well downstream, has a moderate thickness ratio, and operates at an optimum angle of attack. The heat addition is confined between the fore Mach waves from under the trailing surface of the wing. By linearized theory, a wing at optimum angle of attack may have a range efficiency about twice that of a wing at zero angle of attack. More rigorous calculations using the method of characteristics for particular flow models were made for heating under a flat-plate wing and for several wings with thickness, both with heat additions concentrated near the wing. The more rigorous calculations yield, in practical cases, efficiencies about half those estimated by linear theory. An analysis indicates that distributing the heat addition between the fore waves from under the trailing portion of the wing is a way of improving the performance, and further calculations appear desirable. A comparison of the conventional ramjet-plus-wing configuration with underwing heat addition, when the heat addition is concentrated near the wing, shows the ramjet to be superior on a range basis up to a Mach number of about 8. The heat distribution under the wing and the assumed ramjet and airframe performance may have a marked effect on this conclusion. Underwing heat addition can be useful in providing high-altitude maneuver capability at high flight Mach numbers for an airplane powered by conventional ramjets during cruise.

  16. Binary Imaging Analysis for Comprehensive Quantitative Assessment of Peripheral Nerve

    PubMed Central

    Hunter, Daniel A.; Moradzadeh, Arash; Whitlock, Elizabeth L.; Brenner, Michael J.; Myckatyn, Terence M.; Wei, Cindy H.; Tung, Thomas H.H.; Mackinnon, Susan E.

    2007-01-01

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques. PMID:17675163
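
    A stripped-down version of the thresholding-and-labeling step is sketched below on a synthetic image; the Otsu threshold, the synthetic fibers, and the pixel-based units are illustrative assumptions, not the published multi-bitplane pipeline.

```python
# Hedged sketch: count "myelinated fibers" and measure their diameters in a binary image.
import numpy as np
from skimage import draw, filters, measure

rng = np.random.default_rng(0)
img = np.zeros((256, 256))
for _ in range(40):                                    # synthetic bright fibers on a dark field
    rr, cc = draw.disk((rng.integers(10, 246), rng.integers(10, 246)),
                       radius=int(rng.integers(3, 7)), shape=img.shape)
    img[rr, cc] = 1.0
img += rng.normal(0, 0.05, img.shape)

binary = img > filters.threshold_otsu(img)             # one "bitplane" of the analysis
labels = measure.label(binary)
props = measure.regionprops(labels)

diameters = [p.equivalent_diameter for p in props]     # in pixels; calibrate to micrometres separately
print(len(props), float(np.mean(diameters)), len(props) / binary.size)
```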

  17. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 micrometers to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.

  18. A quantitative analysis of IRAS maps of molecular clouds

    NASA Astrophysics Data System (ADS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-11-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 micrometers to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
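
    The central idea of treating maps as points in a pseudometric space can be illustrated with a deliberately simplified distance. The sketch below compares two synthetic column-density maps through the L1 distance between their normalized density distributions; the real 'output functions' in the paper are richer, so this is only a toy stand-in.

```python
# Hedged sketch: a toy pseudometric between two column-density maps.
import numpy as np

def density_distribution(column_density, edges):
    hist, _ = np.histogram(column_density.ravel(), bins=edges, density=True)
    return hist

def pseudometric(map_a, map_b, bins=50):
    lo = min(map_a.min(), map_b.min())
    hi = max(map_a.max(), map_b.max())
    edges = np.linspace(lo, hi, bins + 1)
    pa = density_distribution(map_a, edges)
    pb = density_distribution(map_b, edges)
    # d(A, B) can be zero for A != B, hence a pseudometric rather than a metric.
    return np.trapz(np.abs(pa - pb), edges[:-1])

rng = np.random.default_rng(1)
cloud_a = rng.lognormal(0.0, 0.5, (128, 128))   # stand-ins for column-density maps
cloud_b = rng.lognormal(0.2, 0.6, (128, 128))
print(pseudometric(cloud_a, cloud_b))
```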

  19. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options of landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of the public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrences and magnitude of impacts; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that the introduction of a public-private partnership is not the most feasible option, contrary to the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long-term operating contract may follow. PMID:27354014

  20. Facegram - Objective quantitative analysis in facial reconstructive surgery.

    PubMed

    Gerós, Ana; Horta, Ricardo; Aguiar, Paulo

    2016-06-01

    Evaluation of effectiveness in reconstructive plastic surgery has become an increasingly important asset in comparing and choosing the most suitable medical procedure to handle facial disfigurement. Unfortunately, traditional methods to assess the results of surgical interventions are mostly qualitative and lack information about movement dynamics. Along with this, the few existing methodologies tailored to objectively quantify surgery results are not practical in the medical field due to constraints in terms of cost, complexity and poor suitability to the clinical environment. These limitations create an urgent need for a new system to quantify facial movement and allow easy interpretation by medical experts. With this in mind, we present here a novel method capable of quantitatively and objectively assessing complex facial movements, using a set of morphological, static and dynamic measurements. For this purpose, RGB-D cameras are used to acquire both color and depth images, and a modified block matching algorithm, combining depth and color information, was developed to track the position of anatomical landmarks of interest. The algorithms are integrated into a user-friendly graphical interface and the analysis outcomes are organized into an innovative medical tool, named facegram. This system was developed in close collaboration with plastic surgeons and the methods were validated using control subjects and patients with facial paralysis. The system was shown to provide useful and detailed quantitative information (static and dynamic), making it an appropriate solution for objective quantitative characterization of facial movement in a clinical environment. PMID:26994664
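
    The landmark-tracking step can be pictured as a block-matching search that scores candidate displacements with a combined colour and depth cost. The sketch below is a heavily simplified stand-in for the modified algorithm mentioned above; the window sizes, the depth weighting, and the synthetic frames are assumptions.

```python
# Hedged sketch: block matching of a landmark patch using colour + depth SSD.
import numpy as np

def track_landmark(color0, depth0, color1, depth1, pos, half=8, search=12, w_depth=0.5):
    """Return the landmark position in the next frame (landmark assumed away from borders)."""
    r, c = pos
    ref_c = color0[r - half:r + half, c - half:c + half]
    ref_d = depth0[r - half:r + half, c - half:c + half]
    best_cost, best_pos = np.inf, pos
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            cand_c = color1[rr - half:rr + half, cc - half:cc + half]
            cand_d = depth1[rr - half:rr + half, cc - half:cc + half]
            cost = np.sum((cand_c - ref_c) ** 2) + w_depth * np.sum((cand_d - ref_d) ** 2)
            if cost < best_cost:
                best_cost, best_pos = cost, (rr, cc)
    return best_pos

rng = np.random.default_rng(0)
color0, depth0 = rng.random((120, 120)), rng.random((120, 120))
color1, depth1 = np.roll(color0, (3, -2), (0, 1)), np.roll(depth0, (3, -2), (0, 1))
print(track_landmark(color0, depth0, color1, depth1, pos=(60, 60)))   # -> (63, 58)
```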

  1. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
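
    The construction of a pseudo-predator signature can be sketched in a few lines: bootstrap the prey signatures within each prey type, average them, and mix the averages according to a chosen diet. The sketch below omits calibration coefficients and the objective sample-size algorithm itself, and all data are simulated.

```python
# Hedged sketch: build one pseudo-predator fatty acid signature by bootstrap mixing.
import numpy as np

rng = np.random.default_rng(1)

def pseudo_predator(prey_sigs, diet, n_boot):
    """prey_sigs: prey -> (n_animals, n_fatty_acids); diet: prey -> proportion; n_boot: prey -> bootstrap size."""
    parts = []
    for prey, proportion in diet.items():
        sigs = prey_sigs[prey]
        idx = rng.integers(0, sigs.shape[0], size=n_boot[prey])   # bootstrap sample of prey animals
        parts.append(proportion * sigs[idx].mean(axis=0))
    signature = np.sum(parts, axis=0)
    return signature / signature.sum()                            # signatures are compositional

prey_sigs = {"cod": rng.dirichlet(np.ones(10), size=30),
             "herring": rng.dirichlet(np.ones(10), size=25)}
print(pseudo_predator(prey_sigs, diet={"cod": 0.7, "herring": 0.3},
                      n_boot={"cod": 15, "herring": 15}))
```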

  2. Lipid biomarker analysis for the quantitative analysis of airborne microorganisms

    SciTech Connect

    Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R.

    1997-08-01

    There is an ever-increasing concern regarding the presence of airborne microbial contaminants within indoor air environments. Exposure to such biocontaminants can give rise to a large number of different health effects including infectious diseases, allergenic responses and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa and dust mites. Mycotoxins, endotoxins, pollens and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability that assays both culturable and non-culturable biomass, including endotoxin, is critical in defining risks from indoor air biocontamination. Traditionally, methods employed for the monitoring of microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts account for only 0.1-10% of the total community detectable by direct counting. The classic viable microbiologic approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization and microbial collection. Higher collection efficiency results in greater cell damage, while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged, resulting in non-culturability; the lipid biomarker assays described herein, however, do not rely on cell culture. Lipids are components that are universally distributed throughout cells, providing a means of assessment independent of culturability.

  3. Quantitative Northern Blot Analysis of Mammalian rRNA Processing.

    PubMed

    Wang, Minshi; Pestov, Dimitri G

    2016-01-01

    Assembly of eukaryotic ribosomes is an elaborate biosynthetic process that begins in the nucleolus and requires hundreds of cellular factors. Analysis of rRNA processing has been instrumental for studying the mechanisms of ribosome biogenesis and effects of stress conditions on the molecular milieu of the nucleolus. Here, we describe the quantitative analysis of the steady-state levels of rRNA precursors, applicable to studies in mammalian cells and other organisms. We include protocols for gel electrophoresis and northern blotting of rRNA precursors using procedures optimized for the large size of these RNAs. We also describe the ratio analysis of multiple precursors, a technique that facilitates the accurate assessment of changes in the efficiency of individual pre-rRNA processing steps. PMID:27576717

  4. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it has been widely recognized that aberrant glycosylation in a glycoprotein may be involved in the progression of a certain disease, the development of an efficient analysis tool for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Mass Spectrom. Rev. 34:148–165, 2015. PMID:24889823

  5. Quantitative Analysis Of Cristobalite In The Presence Of Quartz

    NASA Astrophysics Data System (ADS)

    Totten, Gary A.

    1985-12-01

    The detection and quantitation of cristobalite in quartz is necessary to calculate threshold limit values (TLV) for free crystalline silica (FCS) as proposed by the American Conference of Governmental Industrial Hygienists (ACGIH). The cristobalite standard used in this study was made by heating diatomaceous earth to the transition temperature for cristobalite. The potassium bromide (KBr) pellet method was used for the analysis. Potassium cyanide (KCN) was used as an internal standard. Samples ranged from 5% to 30% cristobalite in quartz. Precision for this method is within 2%.
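
    The quantitation itself is a standard internal-standard calibration: the cristobalite/KCN band ratio is regressed against known cristobalite fractions and the fit is inverted for an unknown. The sketch below uses made-up numbers purely to show the arithmetic.

```python
# Hedged sketch: internal-standard calibration curve and inverse prediction.
import numpy as np

known_pct = np.array([5, 10, 15, 20, 25, 30])                  # % cristobalite in quartz standards
band_ratio = np.array([0.11, 0.22, 0.34, 0.44, 0.57, 0.66])    # cristobalite/KCN band ratio (invented)

slope, intercept = np.polyfit(known_pct, band_ratio, 1)

unknown_ratio = 0.40
estimated_pct = (unknown_ratio - intercept) / slope
print(f"cristobalite ~ {estimated_pct:.1f} %")
```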

  6. Quantitative proteomic analysis of drug-induced changes in mycobacteria.

    PubMed

    Hughes, Minerva A; Silva, Jeffrey C; Geromanos, Scott J; Townsend, Craig A

    2006-01-01

    A new approach for qualitative and quantitative proteomic analysis using capillary liquid chromatography and mass spectrometry to study the protein expression response in mycobacteria following isoniazid treatment is discussed. In keeping with known effects on the fatty acid synthase II pathway, proteins encoded by the kas operon (AcpM, KasA, KasB, Accd6) were significantly overexpressed, as were those involved in iron metabolism and cell division suggesting a complex interplay of metabolic events leading to cell death. PMID:16396495

  7. ANALYSIS OF MPC ACCESS REQUIREMENTS FOR ADDITION OF FILLER MATERIALS

    SciTech Connect

    W. Wallin

    1996-09-03

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) in response to a request received via a QAP-3-12 Design Input Data Request (Ref. 5.1) from WAST Design (formerly MRSMPC Design). The request is to provide: Specific MPC access requirements for the addition of filler materials at the MGDS (i.e., location and size of access required). The objective of this analysis is to provide a response to the foregoing request. The purpose of this analysis is to provide a documented record of the basis for the response. The response is stated in Section 8 herein. The response is based upon requirements from an MGDS perspective.

  8. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits

    PubMed Central

    Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-01-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction toward fully uncovering the genetic structure of multiple phenotypes. PMID:27104857

  9. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction toward fully uncovering the genetic structure of multiple phenotypes. PMID:27104857

  10. Spectral Envelopes and Additive + Residual Analysis/Synthesis

    NASA Astrophysics Data System (ADS)

    Rodet, Xavier; Schwarz, Diemo

    The subject of this chapter is the estimation, representation, modification, and use of spectral envelopes in the context of sinusoidal-additive-plus-residual analysis/synthesis. A spectral envelope is an amplitude-vs-frequency function, which may be obtained from the envelope of a short-time spectrum (Rodet et al., 1987; Schwarz, 1998). [Precise definitions of such an envelope and short-time spectrum (STS) are given in Section 2.] The additive-plus-residual analysis/synthesis method is based on a representation of signals in terms of a sum of time-varying sinusoids and of a non-sinusoidal residual signal [e.g., see Serra (1989), Laroche et al. (1993), McAulay and Quatieri (1995), and Ding and Qian (1997)]. Many musical sound signals may be described as a combination of a nearly periodic waveform and colored noise. The nearly periodic part of the signal can be viewed as a sum of sinusoidal components, called partials, with time-varying frequency and amplitude. Such sinusoidal components are easily observed on a spectral analysis display (Fig. 5.1) as obtained, for instance, from a discrete Fourier transform.
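
    The additive part of the model can be made concrete with a few lines of synthesis: each partial is a sinusoid whose amplitude and frequency vary over time, and the residual (not shown) would be added on top. The decay rates, vibrato, and 1/k amplitude envelope below are arbitrary choices for the example.

```python
# Hedged sketch: additive synthesis of a sum of time-varying sinusoidal partials.
import numpy as np

sr = 44100
t = np.arange(0, 1.0, 1 / sr)
f0 = 220 * (1 + 0.01 * np.sin(2 * np.pi * 5 * t))        # slowly varying fundamental (vibrato)

signal = np.zeros_like(t)
for k in range(1, 6):                                     # five partials
    amplitude = np.exp(-3 * t) / k                        # decaying amplitude, 1/k spectral envelope
    phase = 2 * np.pi * np.cumsum(k * f0) / sr            # integrate instantaneous frequency
    signal += amplitude * np.sin(phase)
# A non-sinusoidal residual (e.g. filtered noise) would be added to complete the model.
```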

  11. Multivariate calibration applied to the quantitative analysis of infrared spectra

    NASA Astrophysics Data System (ADS)

    Haaland, David M.

    1992-03-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the noninvasive determination of glucose levels in diabetics is an ultimate goal of this research.
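
    A minimal PLS calibration of the kind described can be sketched with scikit-learn; the simulated spectra, the single absorption band, and the choice of three latent variables are illustrative assumptions rather than the original glass-film calibration.

```python
# Hedged sketch: PLS calibration of simulated spectra against known concentrations.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_channels = 40, 200
concentration = rng.uniform(0, 10, n_samples)
pure_band = np.exp(-0.5 * ((np.arange(n_channels) - 80) / 10) ** 2)
spectra = np.outer(concentration, pure_band) + rng.normal(0, 0.02, (n_samples, n_channels))

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, spectra, concentration, cv=5).ravel()
rmse = float(np.sqrt(np.mean((predicted - concentration) ** 2)))
print(f"cross-validated RMSE: {rmse:.3f}")
```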

  12. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    SciTech Connect

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  13. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGESBeta

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  14. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    PubMed Central

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-01-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  15. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progression of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: infrared detection (IR), video-oculography (VOG), scleral eye coils, and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movements. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.

  16. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.

  17. Segmentation and quantitative analysis of individual cells in developmental tissues.

    PubMed

    Nandy, Kaustav; Kim, Jusub; McCullough, Dean P; McAuliffe, Matthew; Meaburn, Karen J; Yamaguchi, Terry P; Gudla, Prabhakar R; Lockett, Stephen J

    2014-01-01

    Image analysis is vital for extracting quantitative information from biological images and is used extensively, including investigations in developmental biology. The technique commences with the segmentation (delineation) of objects of interest from 2D images or 3D image stacks and is usually followed by the measurement and classification of the segmented objects. This chapter focuses on the segmentation task and here we explain the use of ImageJ, MIPAV (Medical Image Processing, Analysis, and Visualization), and VisSeg, three freely available software packages for this purpose. ImageJ and MIPAV are extremely versatile and can be used in diverse applications. VisSeg is a specialized tool for performing highly accurate and reliable 2D and 3D segmentation of objects such as cells and cell nuclei in images and stacks. PMID:24318825

  18. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  19. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  20. A method for quantitative wet chemical analysis of urinary calculi.

    PubMed

    Larsson, L; Sörbo, B; Tiselius, H G; Ohman, S

    1984-06-27

    We describe a simple method for quantitative chemical analysis of urinary calculi requiring no specialized equipment. Pulverized calculi are dried over silica gel at room temperature and dissolved in nitric acid, which was the only effective agent for complete dissolution. Calcium, magnesium, ammonium, and phosphate are then determined by conventional methods. Oxalate is determined by a method based on the quenching action of oxalate on the fluorescence of a zirconium-flavonol complex. Uric acid, when treated with nitric acid, is stoichiometrically converted to alloxan, which is determined fluorimetrically with 1,2-phenylenediamine. Similarly, cystine is oxidized by nitric acid to sulfate, which is determined turbidimetrically as barium sulfate. Protein is determined spectrophotometrically as xanthoprotein. The total mass recovery of authentic calculi was 92.2 +/- 6.7 (SD) per cent. The method permits analysis of calculi as small as 1.0 mg. Internal quality control is performed with specially designed control samples. PMID:6086179

  1. [Quantitative analysis of transformer oil dissolved gases using FTIR].

    PubMed

    Zhao, An-xin; Tang, Xiao-jun; Wang, Er-zhen; Zhang, Zhong-hua; Liu, Jun-hua

    2013-09-01

    On-line monitoring of transformer dissolved gases by chromatography suffers from the need for a carrier gas and regular calibration, and from limited safety; it was therefore attempted to establish a dissolved gas analysis (DGA) system based on Fourier transform infrared spectroscopy (FTIR). Taking into account the small amounts of the characteristic gases, the many components present, the detection-limit and safety requirements, and the difficulty for the degasser to eliminate interfering gases, a quantitative analysis model was established based on sparse partial least squares, piecewise section correction, and a feature-variable extraction algorithm using improved TR regularization. For the characteristic gases CH4, C2H6, C2H6, and CO2, the results show that FTIR meets DGA requirements with a spectral wavenumber resolution of 1 cm(-1) and an optical path of 10 cm. PMID:24369641

  2. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  3. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    NASA Astrophysics Data System (ADS)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high or low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia, we discuss our results within the western Mediterranean setting, aiming to contribute to the understanding of its tectonic context. With our results, we suggest that the main reason for the sparse and scarce seismicity of the area, in contrast with the adjacent parts of the Nubia-Eurasia boundary, is its extended

  4. The Analysis of Quantitative Traits for Simple Genetic Models from Parental, F1 and Backcross Data

    PubMed Central

    Elston, R. C.; Stewart, John

    1973-01-01

    The following models are considered for the genetic determination of quantitative traits: segregation at one locus, at two linked loci, at any number of equal and additive unlinked loci, and at one major locus and an indefinite number of equal and additive loci. In each case an appropriate likelihood is given for data on parental, F1 and backcross individuals, assuming that the environmental variation is normally distributed. Methods of testing and comparing the various models are presented, and methods are suggested for the simultaneous analysis of two or more traits. PMID:4711900
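
    For the simplest of these models, the likelihood has a closed form: in a backcross, a single segregating locus makes the phenotypes a 50:50 mixture of two normal components with a common environmental variance. The sketch below evaluates that log-likelihood on simulated data; it is a toy version of the framework, not the authors' full set of models.

```python
# Hedged sketch: single-locus backcross log-likelihood as a two-component normal mixture.
import numpy as np
from scipy.stats import norm

def backcross_loglik(y, mu_aa, mu_ab, sigma):
    mixture = 0.5 * norm.pdf(y, mu_aa, sigma) + 0.5 * norm.pdf(y, mu_ab, sigma)
    return float(np.sum(np.log(mixture)))

rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(10, 1, 50), rng.normal(12, 1, 50)])   # simulated backcross phenotypes
print(backcross_loglik(y, 10.0, 12.0, 1.0))
```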

  5. Qualitative and quantitative analysis of volatile constituents from latrines.

    PubMed

    Lin, Jianming; Aoll, Jackline; Niclass, Yvan; Velazco, Maria Inés; Wünsche, Laurent; Pika, Jana; Starkenmann, Christian

    2013-07-16

    More than 2.5 billion people defecate in the open. The increased commitment of private and public organizations to improving this situation is driving the research and development of new technologies for toilets and latrines. Although key technical aspects are considered by researchers when designing new technologies for developing countries, the basic aspect of offending malodors from human waste is often neglected. With the objective of contributing to technical solutions that are acceptable to global consumers, we investigated the chemical composition of latrine malodors sampled in Africa and India. Field latrines in four countries were evaluated olfactively and the odors qualitatively and quantitatively characterized with three analytical techniques. Sulfur compounds including H2S, methyl mercaptan, and dimethyl mono-, di-, and trisulfide are important in sewage-like odors of pit latrines under anaerobic conditions. Under aerobic conditions, in Nairobi for example, paracresol and indole reached concentrations of 89 and 65 μg/g, respectively, which, along with short chain fatty acids such as butyric acid (13 mg/g), explained the strong rancid, manure and farm yard odor. This work represents the first qualitative and quantitative study of volatile compounds sampled from seven pit latrines in a variety of geographic, technical, and economic contexts in addition to three single stools from India and a pit latrine model system. PMID:23829328

  6. Quantitative analysis of Babesia ovis infection in sheep and ticks.

    PubMed

    Erster, Oran; Roth, Asael; Wollkomirsky, Ricardo; Leibovich, Benjamin; Savitzky, Igor; Zamir, Shmuel; Molad, Thea; Shkap, Varda

    2016-05-15

    A quantitative PCR, based on the gene encoding Babesia ovis Surface Protein D (BoSPD), was developed and applied to investigate the presence of Babesia ovis (B. ovis) in its principal vector, the tick Rhipicephalus bursa (R. bursa), and in the ovine host. Quantification of B. ovis in experimentally-infected lambs showed a sharp increase in parasitemia at 10-11 days in blood-inoculated and adult tick-infested lambs, and at 24 days in a larvae-infested lamb. A gradual decrease of parasitemia was observed in the following months, with parasites detectable 6-12 months post-infection. Examination of the parasite load in adult R. bursa during the post-molting period using the quantitative PCR assay revealed a low parasite load during days 2-7 post-molting, followed by a sharp increase until day 11, which corresponded to the completion of the pre-feeding period. The assay was then used to detect B. ovis in naturally-infected sheep and ticks. Examination of samples from 8 sheep and 2 goats from infected flocks detected B. ovis in both goats and in 7 out of the 8 sheep. Additionally, B. ovis was detected in 9 tick pools (5 ticks in each pool) and two individual ticks removed from sheep in infected flocks. PMID:27084469

  7. Quantitative image analysis in sonograms of the thyroid gland

    NASA Astrophysics Data System (ADS)

    Catherine, Skouroliakou; Maria, Lyra; Aristides, Antoniou; Lambros, Vlahos

    2006-12-01

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, the current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms and therefore depends on the physician's experience. Computerized texture analysis is widely employed in sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features that are calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each one of these matrices are: contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the images' characteristics. The relatively large number of components arises mainly from correlation at very small or very large separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
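
    The four named features are standard grey-level co-occurrence (Haralick) measures and can be reproduced for a single separation vector with scikit-image; the random array below stands in for an 8-bit crop of the delineated lobe, which is an assumption made for the example.

```python
# Hedged sketch: co-occurrence (Haralick) texture features for one separation vector.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
roi = (rng.random((128, 128)) * 255).astype(np.uint8)      # placeholder for the delineated lobe

glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256, symmetric=True, normed=True)
features = {name: float(graycoprops(glcm, name)[0, 0])
            for name in ("contrast", "correlation", "energy", "homogeneity")}
print(features)
```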

  8. Quantitative analysis of CT scans of ceramic candle filters

    SciTech Connect

    Ferer, M.V.; Smith, D.H.

    1996-12-31

    Candle filters are being developed to remove coal ash and other fine particles (<15 μm) from hot (ca. 1000 K) gas streams. In the present work, a color scanner was used to digitize hard-copy CT X-ray images of cylindrical SiC filters, and linear regressions converted the scanned (color) data to a filter density for each pixel. These data, with the aid of the density of SiC, gave a filter porosity for each pixel. Radial averages, density-density correlation functions, and other statistical analyses were performed on the density data. The CT images also detected the presence and depth of cracks that developed during usage of the filters. The quantitative data promise to be a very useful addition to the color images.

  9. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which is obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristic and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). PMID:25913743

  10. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai, often called the Detroit of India, hosts an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire, analyzed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals that there is a high degree of relationship between the variables studied. The RA models constructed establish the strong relationship between the dependent variable and a host of independent variables. The models proposed here reveal the approximate relationship in a closer form. The KWT shows that there is no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output. This supports the conclusion that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows that there is no significant difference between industrial units with respect to costs such as production, infrastructure, technology and marketing, or to net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This CDA model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity improvement.

  11. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-05-01

    Chennai, often called the Detroit of India, hosts an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire, analyzed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals that there is a high degree of relationship between the variables studied. The RA models constructed establish the strong relationship between the dependent variable and a host of independent variables. The models proposed here reveal the approximate relationship in a closer form. The KWT shows that there is no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output. This supports the conclusion that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows that there is no significant difference between industrial units with respect to costs such as production, infrastructure, technology and marketing, or to net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This CDA model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity improvement.

  12. Key Parameters Affecting Quantitative Analysis of STEM-EDS Spectrum Images

    SciTech Connect

    Brewer, Luke; Parish, Chad M

    2010-06-01

    In this article, we use simulated and experimental data to explore how three operator-controllable parameters - (1) signal level, (2) detector resolution, and (3) number of factors chosen for analysis - affect quantitative analyses of scanning transmission electron microscopy-energy dispersive X-ray spectroscopy spectrum images processed by principal component analysis (PCA). We find that improvements in both signal level and detector resolution improve the precision of quantitative analyses, but that signal level is the most important. We also find that if the rank of the PCA solution is not chosen properly, it may be possible to improperly fit the underlying data and degrade the accuracy of results. Additionally, precision is degraded in the case when too many factors are included in the model.
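
    A minimal sketch of the rank-selection issue discussed above, using an ordinary PCA on a synthetic spectrum image; the peak positions, image size and count rates are invented for illustration, and the published work uses dedicated multivariate-analysis tooling rather than this generic decomposition.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for a STEM-EDS spectrum image: 64 x 64 pixels, 1024
# energy channels, two spectral components plus Poisson counting noise.
rng = np.random.default_rng(1)
channels = np.arange(1024)
phase_a = np.exp(-0.5 * ((channels - 300) / 15) ** 2)   # peak near channel 300
phase_b = np.exp(-0.5 * ((channels - 620) / 20) ** 2)   # peak near channel 620
weights = rng.random((64 * 64, 2))
counts = rng.poisson(weights @ np.vstack([phase_a, phase_b]) * 50).astype(float)

# Rank choice is the operator decision highlighted in the record: too few
# factors discards real signal, too many factors fits noise.
n_factors = 2
pca = PCA(n_components=n_factors)
scores = pca.fit_transform(counts)
reconstructed = pca.inverse_transform(scores)

print("explained variance ratios:", pca.explained_variance_ratio_)
print(f"rms residual: {np.sqrt(((counts - reconstructed) ** 2).mean()):.2f} counts")
```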

  13. Quantitative wake analysis of a freely swimming fish using 3D synthetic aperture PIV

    NASA Astrophysics Data System (ADS)

    Mendelson, Leah; Techet, Alexandra H.

    2015-07-01

    Synthetic aperture PIV (SAPIV) is used to quantitatively analyze the wake behind a giant danio (Danio aequipinnatus) swimming freely in a seeded quiescent tank. The experiment is designed with minimal constraints on animal behavior to ensure that natural swimming occurs. The fish exhibits forward swimming and turning behaviors at speeds between 0.9 and 1.5 body lengths/second. Results show clearly isolated and linked vortex rings in the wake structure, as well as the thrust jet coming off of a visual hull reconstruction of the fish body. As a benchmark for quantitative analysis of volumetric PIV data, the vortex circulation and impulse are computed using methods consistent with those applied to planar PIV data. Volumetric momentum analysis frameworks are discussed for linked and asymmetric vortex structures, laying a foundation for further volumetric studies of swimming hydrodynamics with SAPIV. Additionally, a novel weighted refocusing method is presented as an improvement to SAPIV reconstruction.
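
    The circulation calculation mentioned above can be illustrated on a planar slice of a known vortex; the sketch below integrates vorticity over a synthetic Lamb-Oseen field and checks the estimate against the prescribed circulation. The field, grid and core size are assumptions for illustration, not data from the experiment.

```python
import numpy as np

# Synthetic planar slice through a vortex core: a Lamb-Oseen vortex with a
# known circulation, used to check the discrete estimate.
gamma_true = 1.0e-3        # circulation, m^2/s
core_radius = 5.0e-3       # m
x = y = np.linspace(-0.02, 0.02, 101)
dx = x[1] - x[0]
X, Y = np.meshgrid(x, y)
r = np.sqrt(X**2 + Y**2) + 1e-12
u_theta = gamma_true / (2 * np.pi * r) * (1 - np.exp(-(r / core_radius) ** 2))
u = -u_theta * Y / r
v = u_theta * X / r

# Circulation as the area integral of vorticity (equal, by Stokes' theorem,
# to the line integral of velocity around the boundary of the region).
vorticity = np.gradient(v, dx, axis=1) - np.gradient(u, dx, axis=0)
gamma_est = vorticity.sum() * dx * dx
print(f"estimated circulation {gamma_est:.3e} m^2/s vs prescribed {gamma_true:.3e}")
```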

  14. Quantitative analysis of the polarization characteristics of atherosclerotic plaques

    NASA Astrophysics Data System (ADS)

    Gubarkova, Ekaterina V.; Kirillin, Michail Y.; Dudenkova, Varvara V.; Kiseleva, Elena B.; Moiseev, Alexander A.; Gelikonov, Grigory V.; Timofeeva, Lidia B.; Fiks, Ilya I.; Feldchtein, Felix I.; Gladkova, Natalia D.

    2016-04-01

    In this study we demonstrate the capability of cross-polarization optical coherence tomography (CP OCT) to assess the condition of collagen and elastin fibers in atherosclerotic plaques based on the ratio of the OCT signal levels in the cross- and co-polarizations. We consider the depolarization factor (DF) and the effective birefringence (Δn) as quantitative characteristics of CP OCT images. We found that calculating both DF and Δn in the region of interest (the fibrous cap) yields a statistically significant difference between stable and unstable plaques (0.46+/-0.21 vs 0.09+/-0.04 for DF; (4.7+/-1.0)×10^-4 vs (2.5+/-0.7)×10^-4 for Δn; p<0.05). In parallel with CP OCT we used nonlinear microscopy for analysis of thin cross-sections of atherosclerotic plaque, revealing different average isotropy indices of collagen and elastin fibers for stable and unstable plaques (0.30 +/- 0.10 vs 0.70 +/- 0.08; p<0.001). The proposed approach for quantitative assessment of CP OCT images allows cross-scattering and birefringence characterization of stable and unstable atherosclerotic plaques.
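
    As a hedged sketch of the two image metrics named in this record, the code below forms a depolarization-style ratio of cross- to co-polarization signal in a region of interest and recovers an effective birefringence from the depth slope of phase retardation. The signal model, wavelength and retardation ramp are invented; the actual CP OCT processing in the study may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical linear-scale OCT signal levels in the co- and cross-polarization
# channels inside a fibrous-cap region of interest.
co_channel = rng.gamma(shape=4.0, scale=25.0, size=(40, 120))
cross_channel = 0.4 * co_channel + rng.gamma(shape=2.0, scale=5.0, size=(40, 120))

# Depolarization-factor-style metric: ratio of the mean cross-polarization
# backscatter to the mean co-polarization backscatter in the region of interest.
df = cross_channel.mean() / co_channel.mean()

# Effective birefringence from the depth rate of phase retardation for a
# double-pass geometry: delta_n = (lambda / (4 * pi)) * d(retardation)/dz.
wavelength = 1.3e-6                      # m, assumed centre wavelength
depth = np.linspace(0, 0.5e-3, 100)      # m
retardation = 4 * np.pi * 4.5e-4 * depth / wavelength   # synthetic linear ramp
slope = np.polyfit(depth, retardation, 1)[0]
delta_n = slope * wavelength / (4 * np.pi)
print(f"DF = {df:.2f}, effective birefringence = {delta_n:.1e}")
```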

  15. Bayesian robust analysis for genetic architecture of quantitative traits

    PubMed Central

    Yang, Runqing; Wang, Xin; Li, Jian; Deng, Hongwen

    2009-01-01

    Motivation: In most quantitative trait locus (QTL) mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may affect the accuracy of QTL detection and lead to detection of spurious QTLs. To improve the robustness of QTL mapping methods, we replaced the normal distribution for residuals in multiple interacting QTL models with the normal/independent distributions that are a class of symmetric and long-tailed distributions and are able to accommodate residual outliers. Subsequently, we developed a Bayesian robust analysis strategy for dissecting genetic architecture of quantitative traits and for mapping genome-wide interacting QTLs in line crosses. Results: Through computer simulations, we showed that our strategy had a similar power for QTL detection compared with traditional methods assuming normal-distributed traits, but had a substantially increased power for non-normal phenotypes. When this strategy was applied to a group of traits associated with physical/chemical characteristics and quality in rice, more main and epistatic QTLs were detected than traditional Bayesian model analyses under the normal assumption. Contact: runqingyang@sjtu.edu.cn; dengh@umkc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18974168

  16. Quantitative analysis of gene function in the Drosophila embryo.

    PubMed Central

    Tracey, W D; Ning, X; Klingler, M; Kramer, S G; Gergen, J P

    2000-01-01

    The specific functions of gene products frequently depend on the developmental context in which they are expressed. Thus, studies on gene function will benefit from systems that allow for manipulation of gene expression within model systems where the developmental context is well defined. Here we describe a system that allows for genetically controlled overexpression of any gene of interest under normal physiological conditions in the early Drosophila embryo. This regulated expression is achieved through the use of Drosophila lines that express a maternal mRNA for the yeast transcription factor GAL4. Embryos derived from females that express GAL4 maternally activate GAL4-dependent UAS transgenes at uniform levels throughout the embryo during the blastoderm stage of embryogenesis. The expression levels can be quantitatively manipulated through the use of lines that have different levels of maternal GAL4 activity. Specific phenotypes are produced by expression of a number of different developmental regulators with this system, including genes that normally do not function during Drosophila embryogenesis. Analysis of the response to overexpression of runt provides evidence that this pair-rule segmentation gene has a direct role in repressing transcription of the segment-polarity gene engrailed. The maternal GAL4 system will have applications both for the measurement of gene activity in reverse genetic experiments as well as for the identification of genetic factors that have quantitative effects on gene function in vivo. PMID:10628987

  17. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  18. Quantitative PCR analysis of laryngeal muscle fiber types

    PubMed Central

    Van Daele, Douglas J.

    2013-01-01

    Voice and swallowing dysfunction as a result of recurrent laryngeal nerve paralysis can be improved with vocal fold injections or laryngeal framework surgery. However, denervation atrophy can cause late-term clinical failure. A major determinant of skeletal muscle physiology is myosin heavy chain (MyHC) expression, and previous protein analyses have shown changes in laryngeal muscle fiber MyHC isoforms with denervation. RNA analyses in this setting have not been performed, and understanding RNA levels will allow interventions better designed to reverse processes such as denervation in the future. Total RNA was extracted from bilateral thyroarytenoid (TA), posterior cricoarytenoid (PCA), and cricothyroid (CT) muscles in rats. Primers were designed using published MyHC isoform sequences. SYBR Green real-time reverse transcription-polymerase chain reaction (SYBR-RT-PCR) was used for quantification. The electropherogram showed a clear separation of total RNA into the 28S and 18S subunits. Melting curves showed single peaks for all MyHC primer types. All MyHC isoforms were identified in all muscles with varying degrees of expression. Quantitative PCR is a sensitive method to detect MyHC isoforms in laryngeal muscle. Isoform expression determined by mRNA analysis was similar to previous analyses but showed some important differences. This technique can be used to quantitatively assess the response to interventions targeted at maintaining muscle bulk after denervation. PMID:20430402
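
    Relative quantification from real-time PCR data is commonly reported with the 2^-ΔΔCt method; the sketch below applies it to hypothetical Ct values, assuming a reference gene and roughly 100% amplification efficiency. It illustrates the calculation only and is not the study's analysis pipeline.

```python
# Hypothetical Ct values for one MyHC isoform and a reference gene in two
# muscles; real numbers would come from the SYBR Green runs.
ct = {
    "target_TA": 21.3,   # MyHC isoform, thyroarytenoid sample
    "ref_TA":    17.9,   # reference gene, same sample
    "target_CT": 23.8,   # MyHC isoform, cricothyroid sample
    "ref_CT":    18.1,   # reference gene, same sample
}

delta_ct_ta = ct["target_TA"] - ct["ref_TA"]
delta_ct_ct = ct["target_CT"] - ct["ref_CT"]
fold_change = 2 ** -(delta_ct_ta - delta_ct_ct)   # TA expression relative to CT
print(f"isoform level in TA is {fold_change:.1f}x the CT level (hypothetical data)")
```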

  19. Quantitative analysis of incipient mineral loss in hard tissues

    NASA Astrophysics Data System (ADS)

    Matvienko, Anna; Mandelis, Andreas; Hellen, Adam; Jeon, Raymond; Abrams, Stephen; Amaechi, Bennett

    2009-02-01

    A coupled diffuse-photon-density-wave and thermal-wave theoretical model was developed to describe the biothermophotonic phenomena in multi-layered hard tissue structures. Photothermal radiometry was applied as a safe, non-destructive, and highly sensitive tool for the detection of early tooth enamel demineralization to test the theory. An extracted human tooth was treated sequentially with an artificial demineralization gel to simulate controlled mineral loss in the enamel. The experimental setup included a semiconductor laser (659 nm, 120 mW) as the source of the photothermal signal. Modulated laser light generated infrared blackbody radiation from teeth upon absorption and nonradiative energy conversion. The infrared flux emitted by the treated region of the tooth surface and sub-surface was monitored with an infrared detector, both before and after treatment. Frequency scans with a laser beam size of 3 mm were performed in order to guarantee one-dimensionality of the photothermal field. TMR images showed clear differences between sound and demineralized enamel; however, this technique is destructive. Dental radiographs did not indicate any changes. The photothermal signal showed a clear change even after 1 min of gel treatment. As a result of the fittings, thermal and optical properties of sound and demineralized enamel were obtained, which allowed for quantitative differentiation of healthy and non-healthy regions. In conclusion, the developed model was shown to be a promising tool for non-invasive quantitative analysis of early demineralization of hard tissues.

  20. EDXRF quantitative analysis of chromophore chemical elements in corundum samples.

    PubMed

    Bonizzoni, L; Galli, A; Spinolo, G; Palanza, V

    2009-12-01

    Corundum is a crystalline form of aluminum oxide (Al(2)O(3)) and is one of the rock-forming minerals. When aluminum oxide is pure, the mineral is colorless, but the presence of trace amounts of other elements such as iron, titanium, and chromium in the crystal lattice gives the typical colors (including blue, red, violet, pink, green, yellow, orange, gray, white, colorless, and black) of gemstone varieties. The starting point for our work is the quantitative evaluation of the concentration of chromophore chemical elements with a precision as good as possible, to match the data obtained by different techniques such as optical absorption and photoluminescence. The aim is to give an interpretation of the absorption bands present in the NIR and visible ranges which do not involve intervalence charge transfer transitions (Fe(2+) --> Fe(3+) and Fe(2+) --> Ti(4+)), commonly considered responsible for the important features of the blue sapphire absorption spectra. We therefore developed a method to evaluate as accurately as possible the autoabsorption effects and the secondary excitation effects, which are frequently sources of relevant errors in quantitative EDXRF analysis. PMID:19821113

  1. Analysis of generalized interictal discharges using quantitative EEG.

    PubMed

    da Silva Braga, Aline Marques; Fujisao, Elaine Keiko; Betting, Luiz Eduardo

    2014-12-01

    Experimental evidence from animal models of the absence seizures suggests a focal source for the initiation of generalized spike-and-wave (GSW) discharges. Furthermore, clinical studies indicate that patients diagnosed with idiopathic generalized epilepsy (IGE) exhibit focal electroencephalographic abnormalities, which involve the thalamo-cortical circuitry. This circuitry is a key network that has been implicated in the initiation of generalized discharges, and may contribute to the pathophysiology of GSW discharges. Quantitative electroencephalogram (qEEG) analysis may be able to detect abnormalities associated with the initiation of GSW discharges. The objective of this study was to determine whether interictal GSW discharges exhibit focal characteristics using qEEG analysis. In this study, 75 EEG recordings from 64 patients were analyzed. All EEG recordings analyzed contained at least one GSW discharge. EEG recordings were obtained by a 22-channel recorder with electrodes positioned according to the international 10-20 system of electrode placement. EEG activity was recorded for 20 min including photic stimulation and hyperventilation. The EEG recordings were visually inspected, and the first unequivocally confirmed generalized spike was marked for each discharge. Three methods of source imaging analysis were applied: dipole source imaging (DSI), classical LORETA analysis recursively applied (CLARA), and equivalent dipole of independent components with cluster analysis. A total of 753 GSW discharges were identified and spatiotemporally analyzed. Source evaluation analysis using all three techniques revealed that the frontal lobe was the principal source of GSW discharges (70%), followed by the parietal and occipital lobes (14%), and the basal ganglia (12%). The main anatomical sources of GSW discharges were the anterior cingulate cortex (36%) and the medial frontal gyrus (23%). Source analysis did not reveal a common focal source of GSW discharges. However

  2. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
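
    A hedged reconstruction of the kind of frequency-domain criterion described above, written for unit sample spacing with assumed notation (F for the spectrum of the sampled function, R for the interpolant's frequency response): the interpolation error separates into a passband-distortion term and an aliasing term contributed by the spectral replicas.

\[
\overline{\epsilon^{2}} \;=\; \int_{-\infty}^{\infty} \lvert F(\nu)\rvert^{2}\, E(\nu)\, d\nu ,
\qquad
E(\nu) \;=\; \bigl\lvert 1 - R(\nu) \bigr\rvert^{2} \;+\; \sum_{k \neq 0} \bigl\lvert R(\nu + k) \bigr\rvert^{2} .
\]

    The first term of E(ν) penalizes distortion of frequencies the samples can represent; the second penalizes leakage of the replicated spectra that an ideal reconstruction filter would suppress. This is how the contribution of the interpolation method is separated from the contribution of the sampled data.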

  3. Sources of Technical Variability in Quantitative LC-MS Proteomics: Human Brain Tissue Sample Analysis.

    SciTech Connect

    Piehowski, Paul D.; Petyuk, Vladislav A.; Orton, Daniel J.; Xie, Fang; Moore, Ronald J.; Ramirez Restrepo, Manuel; Engel, Anzhelika; Lieberman, Andrew P.; Albin, Roger L.; Camp, David G.; Smith, Richard D.; Myers, Amanda J.

    2013-05-03

    To design a robust quantitative proteomics study, an understanding of both the inherent heterogeneity of the biological samples being studied and the technical variability of the proteomics methods and platform is needed. Additionally, accurately identifying the technical steps associated with the largest variability would provide valuable information for the improvement and design of future processing pipelines. We present an experimental strategy that allows for a detailed examination of the variability of the quantitative LC-MS proteomics measurements. By replicating analyses at different stages of processing, various technical components can be estimated and their individual contributions to technical variability can be dissected. This design can be easily adapted to other quantitative proteomics pipelines. Herein, we applied this methodology to our label-free workflow for the processing of human brain tissue. For this application, the pipeline was divided into four critical components: Tissue dissection and homogenization (extraction), protein denaturation followed by trypsin digestion and SPE clean-up (digestion), short-term run-to-run instrumental response fluctuation (instrumental variance), and long-term drift of the quantitative response of the LC-MS/MS platform over the 2 week period of continuous analysis (instrumental stability). From this analysis, we found the following contributions to variability: extraction (72%) >> instrumental variance (16%) > instrumental stability (8.4%) > digestion (3.1%). Furthermore, the stability of the platform and its suitability for discovery proteomics studies are demonstrated.
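
    The staged-replication idea can be illustrated with a toy variance decomposition; the sketch below simulates log-intensities with extraction-, digestion- and injection-level effects and reads off the extraction share of the total variance. The effect sizes, group counts and column names are assumptions, and a proper nested mixed model would be the rigorous version of this calculation.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)

# Hypothetical log-intensities for one peptide, replicated at the stages of
# the pipeline described above; the effect sizes roughly mimic the reported
# ordering (extraction >> instrument > digestion) but are invented.
rows = []
for extraction in range(4):
    e = rng.normal(0, 0.60)                # extraction-level effect
    for digestion in range(2):
        d = rng.normal(0, 0.12)            # digestion-level effect
        for injection in range(3):
            i = rng.normal(0, 0.25)        # run-to-run instrumental effect
            rows.append({"extraction": extraction, "digestion": digestion,
                         "log_intensity": 20.0 + e + d + i})
df = pd.DataFrame(rows)

# Crude decomposition from nested group means; a mixed model (for example
# statsmodels MixedLM) would be the rigorous way to estimate the components.
total_var = df["log_intensity"].var()
between_extraction = df.groupby("extraction")["log_intensity"].mean().var()
print(f"extraction share of total variance ~ {between_extraction / total_var:.0%}")
```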

  4. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy

    SciTech Connect

    Singh, Vivek K.; Singh, Vinita; Rai, Awadhesh K.; Thakur, Surya N.; Rai, Pradeep K.; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  5. Quantitative analysis of forest island pattern in selected Ohio landscapes

    SciTech Connect

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  6. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1974-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma-rays, while for X-rays an internal standard can be used. A USGS standard rock is analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample doped with Cs or Y onto a thin formvar film or by extracting the sample (with or without an internal standard) onto ion exchange resin which is pressed into a pellet.

  7. Quantitative image analysis of WE43-T6 cracking behavior

    NASA Astrophysics Data System (ADS)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.
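
    A minimal sketch of the particle-population measurement described above, using generic thresholding and labelling on a synthetic image in place of the study's metallographic data; the image, particle sizes and the Otsu threshold choice are assumptions for illustration.

```python
import numpy as np
from skimage import filters, measure

# Synthetic stand-in for a metallographic image in which bright blobs play the
# role of grain-boundary intermetallic particles.
rng = np.random.default_rng(4)
image = rng.normal(0.2, 0.05, size=(512, 512))
rr, cc = np.ogrid[:512, :512]
for _ in range(40):
    r, c = rng.integers(20, 492, size=2)
    image[(rr - r) ** 2 + (cc - c) ** 2 < rng.integers(9, 80)] += 0.6

# Threshold and label the particle population, then report count, area
# fraction and size distribution: the kinds of quantities a crack-initiation
# model would consume.
binary = image > filters.threshold_otsu(image)
labels = measure.label(binary)
areas = np.array([p.area for p in measure.regionprops(labels)])
print(f"{labels.max()} particles, area fraction {binary.mean():.3%}, "
      f"median particle area {np.median(areas):.0f} px")
```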

  8. Automated quantitative cytological analysis using portable microfluidic microscopy.

    PubMed

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. PMID:25990413

  9. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877

  10. Quantitative Image Analysis of HIV-1 Infection in Lymphoid Tissue

    NASA Astrophysics Data System (ADS)

    Haase, Ashley T.; Henry, Keith; Zupancic, Mary; Sedgewick, Gerald; Faust, Russell A.; Melroe, Holly; Cavert, Winston; Gebhard, Kristin; Staskus, Katherine; Zhang, Zhi-Qiang; Dailey, Peter J.; Balfour, Henry H., Jr.; Erice, Alejo; Perelson, Alan S.

    1996-11-01

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment.

  11. Large-Scale Quantitative Analysis of Painting Arts

    NASA Astrophysics Data System (ADS)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.

  12. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background: When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives: To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method: Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis to calculate the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results: Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion: The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
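
    A small latent-semantic-analysis sketch in the spirit of the INN: documents are embedded in a low-rank concept space and compared by cosine similarity. The four toy sentences and the two-component rank are placeholders; the INN itself is built from a large multidisciplinary corpus with a more elaborate weighting scheme.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Toy corpus standing in for abstracts that use the candidate terms.
docs = [
    "transition to self-management of chronic illness after diagnosis",
    "patient adaptation and coping during chronic disease self-management",
    "readiness for self-care behaviour change in diabetes management",
    "transition of care and adaptation to long-term treatment regimens",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)

# Latent semantic analysis: project the documents into a low-rank concept
# space and compare them (and, by extension, the terms they contain) with
# cosine similarity.
svd = TruncatedSVD(n_components=2, random_state=0)
doc_vectors = svd.fit_transform(X)
print(cosine_similarity(doc_vectors).round(2))
```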

  13. A quantitative histological analysis of the dilated ureter of childhood.

    PubMed

    Lee, B R; Partin, A W; Epstein, J I; Quinlan, D M; Gosling, J A; Gearhart, J P

    1992-11-01

    A quantitative histological study of the dilated ureter of childhood was performed on 26 ureters. The specimens were from 15 male and 11 female patients 10 days to 12 years old (mean age 2.0 years). A color image analysis system was used to examine and compare collagen and smooth muscle components of the muscularis layers to normal control ureters of similar age. In comparing primary obstructed (12) to primary refluxing (14) megaureters and control ureters (6), there was a statistically different collagen-to-smooth muscle ratio (p < 0.001) between the primary obstructed and primary refluxing megaureter groups. For patients with primary refluxing megaureter there was a 2-fold increase in the tissue matrix ratio of collagen-to-smooth muscle when compared to patients with primary obstructed megaureter. In the primary obstructed megaureters the amount of collagen and smooth muscle was not statistically different from controls (p > 0.01). The increased tissue matrix ratio of 2.0 +/- 0.35 (collagen-to-smooth muscle) in the refluxing megaureter group compared to 0.78 +/- 0.22 in the obstructed megaureter group and 0.52 +/- 0.12 in controls was found to be due not only to a marked increase in collagen but also a significant decrease in the smooth muscle component of the tissue. Primary obstructed and normal control ureters had similar quantitative amounts of smooth muscle with 60 +/- 5% and 61 +/- 6%, respectively, while refluxing megaureters had only 40 +/- 5% smooth muscle. The percentage collagen was 36 +/- 5 in the obstructed megaureter group and 30 +/- 5 in controls, with refluxing megaureters having 58 +/- 5% collagen on analysis. Our findings emphasize the significant differences in the structural components (collagen and smooth muscle) of the dilated ureter of childhood, and provide us with further insight into the pathological nature of these dilated ureters and their surgical repair. PMID:1433552

  14. Quantitative analysis of protein-ligand interactions by NMR.

    PubMed

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. This contrasts with the NMR methods that are used
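
    The simplest of the experiments described, a fast-exchange chemical-shift titration, reduces to fitting a 1:1 binding isotherm. The sketch below fits KD and the maximal shift change to hypothetical titration points using the exact quadratic solution; the concentrations, noise level and the fast-exchange assumption are all illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def shift_change(ligand_total, kd, d_max, protein_total=0.10):
    """Observed shift change (ppm) for 1:1 binding in fast exchange.

    Uses the exact solution of the quadratic binding equation; the protein
    concentration (mM) is assumed known from the titration setup.
    """
    p, l = protein_total, ligand_total
    bound_fraction = ((p + l + kd) - np.sqrt((p + l + kd) ** 2 - 4 * p * l)) / (2 * p)
    return d_max * bound_fraction

# Hypothetical titration points: total ligand (mM) vs observed shift change.
ligand = np.array([0.0, 0.05, 0.1, 0.2, 0.4, 0.8, 1.6, 3.2])
observed = shift_change(ligand, kd=0.35, d_max=0.12)
observed = observed + np.random.default_rng(5).normal(0, 0.002, ligand.size)

(kd_fit, dmax_fit), _ = curve_fit(shift_change, ligand, observed, p0=[0.5, 0.1])
print(f"fitted KD = {kd_fit:.2f} mM, maximal shift change = {dmax_fit:.3f} ppm")
```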

  15. Quantitative analysis of synaptic release at the photoreceptor synapse.

    PubMed

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B

    2010-05-19

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca(2+) and exhibits an unusually shallow dependence on presynaptic Ca(2+). To provide a quantitative description of the photoreceptor Ca(2+) sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca(2+)-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca(2+): exocytotic capacitance changes from individual rods and postsynaptic currents of second-order neurons. The best simulations supported the occupancy of only two Ca(2+) binding sites on the rod Ca(2+) sensor rather than the typical four or five. For most models, the on-rates for Ca(2+) binding and maximal fusion rate were comparable to those of other neurons. However, the off-rates for Ca(2+) unbinding were unexpectedly slow. In addition to contributing to the high-affinity of the photoreceptor Ca(2+) sensor, slow Ca(2+) unbinding may support the fusion of vesicles located at a distance from Ca(2+) channels. In addition, partial sensor occupancy due to slow unbinding may contribute to the linearization of the first synapse in vision. PMID:20483317

  16. Quantitative Analysis of Synaptic Release at the Photoreceptor Synapse

    PubMed Central

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B.

    2010-01-01

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca2+ and exhibits an unusually shallow dependence on presynaptic Ca2+. To provide a quantitative description of the photoreceptor Ca2+ sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca2+-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca2+: exocytotic capacitance changes from individual rods and postsynaptic currents of second-order neurons. The best simulations supported the occupancy of only two Ca2+ binding sites on the rod Ca2+ sensor rather than the typical four or five. For most models, the on-rates for Ca2+ binding and maximal fusion rate were comparable to those of other neurons. However, the off-rates for Ca2+ unbinding were unexpectedly slow. In addition to contributing to the high-affinity of the photoreceptor Ca2+ sensor, slow Ca2+ unbinding may support the fusion of vesicles located at a distance from Ca2+ channels. In addition, partial sensor occupancy due to slow unbinding may contribute to the linearization of the first synapse in vision. PMID:20483317

  17. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
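
    After registration and boundary detection, the per-position measurements reduce to features of a time-intensity profile. The sketch below extracts time-to-peak, maximal upslope and relative enhancement from a synthetic first-pass curve; the curve shape, frame rate and parameter definitions are assumptions standing in for the paper's actual measures.

```python
import numpy as np

# Hypothetical myocardial time-intensity curve from a contrast-enhanced
# perfusion series: baseline, first-pass upslope, then slow wash-out.
t = np.arange(0, 60, 1.0)                                  # s, one frame per beat
curve = 100 + 80 / (1 + np.exp(-(t - 20) / 3)) - 0.4 * np.clip(t - 30, 0, None)
curve = curve + np.random.default_rng(6).normal(0, 2, t.size)

baseline = curve[:10].mean()
peak_idx = int(np.argmax(curve))

# Maximum upslope over the first pass is a common semi-quantitative perfusion
# parameter; relative enhancement normalizes the peak by the baseline signal.
max_upslope = np.max(np.diff(curve[:peak_idx + 1]))
relative_enhancement = (curve[peak_idx] - baseline) / baseline
print(f"time to peak = {t[peak_idx]:.0f} s, max upslope = {max_upslope:.1f} a.u./s, "
      f"relative enhancement = {relative_enhancement:.2f}")
```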

  18. Quantitative Analysis of Cancer Metastasis using an Avian Embryo Model

    PubMed Central

    Palmer, Trenis D.; Lewis, John; Zijlstra, Andries

    2011-01-01

    During metastasis cancer cells disseminate from the primary tumor, invade into surrounding tissues, and spread to distant organs. Metastasis is a complex process that can involve many tissue types, span variable time periods, and often occur deep within organs, making it difficult to investigate and quantify. In addition, the efficacy of the metastatic process is influenced by multiple steps in the metastatic cascade making it difficult to evaluate the contribution of a single aspect of tumor cell behavior. As a consequence, metastasis assays are frequently performed in experimental animals to provide a necessarily realistic context in which to study metastasis. Unfortunately, these models are further complicated by their complex physiology. The chick embryo is a unique in vivo model that overcomes many limitations to studying metastasis, due to the accessibility of the chorioallantoic membrane (CAM), a well-vascularized extra-embryonic tissue located underneath the eggshell that is receptive to the xenografting of tumor cells (figure 1). Moreover, since the chick embryo is naturally immunodeficient, the CAM readily supports the engraftment of both normal and tumor tissues. Most importantly, the avian CAM successfully supports most cancer cell characteristics including growth, invasion, angiogenesis, and remodeling of the microenvironment. This makes the model exceptionally useful for the investigation of the pathways that lead to cancer metastasis and to predict the response of metastatic cancer to new potential therapeutics. The detection of disseminated cells by species-specific Alu PCR makes it possible to quantitatively assess metastasis in organs that are colonized by as few as 25 cells. Using the Human Epidermoid Carcinoma cell line (HEp3) we use this model to analyze spontaneous metastasis of cancer cells to distant organs, including the chick liver and lung. Furthermore, using the Alu-PCR protocol we demonstrate the sensitivity and reproducibility of the

  19. Sensitivity analysis of geometric errors in additive manufacturing medical models.

    PubMed

    Pinto, Jose Miguel; Arrieta, Cristobal; Andia, Marcelo E; Uribe, Sergio; Ramos-Grez, Jorge; Vargas, Alex; Irarrazaval, Pablo; Tejos, Cristian

    2015-03-01

    Additive manufacturing (AM) models are used in medical applications for surgical planning, prosthesis design and teaching. For these applications, the accuracy of the AM models is essential. Unfortunately, this accuracy is compromised due to errors introduced by each of the building steps: image acquisition, segmentation, triangulation, printing and infiltration. However, the contribution of each step to the final error remains unclear. We performed a sensitivity analysis comparing errors obtained from a reference with those obtained modifying parameters of each building step. Our analysis considered global indexes to evaluate the overall error, and local indexes to show how this error is distributed along the surface of the AM models. Our results show that the standard building process tends to overestimate the AM models, i.e. models are larger than the original structures. They also show that the triangulation resolution and the segmentation threshold are critical factors, and that the errors are concentrated at regions with high curvatures. Errors could be reduced choosing better triangulation and printing resolutions, but there is an important need for modifying some of the standard building processes, particularly the segmentation algorithms. PMID:25649961

  20. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    PubMed

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age alone, the combination of the DNA methylation levels of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459
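
    The reported gain in AUC from adding methylation to HPV and age can be illustrated with a generic logistic-regression comparison; the per-subject values below are simulated, the evaluation is in-sample, and no cross-validation is attempted, so this is a sketch of the comparison rather than a reproduction of the study's statistics.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)

# Simulated per-subject data: panel methylation (%), HPV status and age for
# 49 cases and 22 controls, mirroring only the sample sizes in the record.
n = 71
cancer = np.r_[np.ones(49), np.zeros(22)].astype(int)
methylation = np.where(cancer == 1, rng.normal(35, 15, n), rng.normal(8, 5, n)).clip(0, 100)
hpv = np.where(cancer == 1, rng.random(n) < 0.95, rng.random(n) < 0.30).astype(int)
age = rng.normal(45, 10, n)

# Compare discrimination with and without the methylation panel (in-sample).
for name, X in [("HPV + age", np.c_[hpv, age]),
                ("HPV + age + methylation", np.c_[hpv, age, methylation])]:
    model = LogisticRegression(max_iter=1000).fit(X, cancer)
    auc = roc_auc_score(cancer, model.predict_proba(X)[:, 1])
    print(f"{name}: in-sample AUC = {auc:.2f}")
```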

  1. Quantitative DNA Methylation Analysis of Candidate Genes in Cervical Cancer

    PubMed Central

    Siegel, Erin M.; Riggs, Bridget M.; Delmas, Amber L.; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D.

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age alone, the combination of the DNA methylation levels of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97–1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459

  2. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    NASA Astrophysics Data System (ADS)

    Goodfellow, Sebastian David

    This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large-scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range -7.5 < Mw < -6.8, which is consistent with other laboratory AE results, and stress drops were within the range commonly observed for induced seismicity in the field (0.1 - 10 MPa). The second data set was AE collected during a true-triaxial deformation experiment, where the objectives were to characterize laboratory AE sources and identify issues related to moving the analysis from ideal in situ conditions to more complex laboratory conditions in terms of the ability to conduct quantitative AE analysis. We found AE magnitudes in the range -7.8 < Mw < -6.7 and, as with the in situ data, stress release was within the expected range of 0.1 - 10 MPa. We identified four major challenges to quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc^-3 scaling). These challenges were (1) limited knowledge of attenuation, which we showed was continuously evolving, (2
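
    The source parameters listed in this record (moment magnitude, source dimension, stress drop) are commonly derived from spectral fits under a Brune-type model; the sketch below chains those textbook relations together for one illustrative event. The moment, corner frequency, shear-wave speed and model constant are assumed values, not results from the thesis.

```python
import numpy as np

# Illustrative spectral-fit outputs for one laboratory AE event; the numbers
# are assumed, not taken from the thesis.
m0 = 0.04        # seismic moment, N*m (from the low-frequency spectral plateau)
fc = 400e3       # corner frequency, Hz
beta = 3000.0    # shear-wave speed in the rock, m/s
k = 0.37         # Brune-model constant linking corner frequency to source radius

radius = k * beta / fc                        # source radius, m
stress_drop = 7.0 / 16.0 * m0 / radius ** 3   # Pa
mw = (2.0 / 3.0) * (np.log10(m0) - 9.1)       # moment magnitude
print(f"radius = {radius * 1e3:.1f} mm, stress drop = {stress_drop / 1e6:.2f} MPa, "
      f"Mw = {mw:.1f}")
```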

  3. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling

    PubMed Central

    Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T. M.; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies. PMID:26745281

  4. Inside Single Cells: Quantitative Analysis with Advanced Optics and Nanomaterials

    PubMed Central

    Cui, Yi; Irudayaraj, Joseph

    2014-01-01

    Single cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single cell activity. In order to obtain quantitative information (e.g. molecular quantity, kinetics and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single cell studies both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live cell analysis. Although a considerable proportion of single cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single cell analysis. PMID:25430077

  5. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  6. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-01

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  7. Inside single cells: quantitative analysis with advanced optics and nanomaterials.

    PubMed

    Cui, Yi; Irudayaraj, Joseph

    2015-01-01

    Single-cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites, and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single-cell activity. To obtain quantitative information (e.g., molecular quantity, kinetics, and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single-cell studies, both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live-cell analysis. Although a considerable proportion of single-cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single-cell analysis. PMID:25430077

  8. Quantitative genetic analysis of the metabolic syndrome in Hispanic children.

    PubMed

    Butte, Nancy F; Comuzzie, Anthony G; Cole, Shelley A; Mehta, Nitesh R; Cai, Guowen; Tejero, Maria; Bastarrachea, Raul; Smith, E O'Brian

    2005-12-01

    Childhood obesity is associated with a constellation of metabolic derangements including glucose intolerance, hypertension, and dyslipidemia, referred to as metabolic syndrome. The purpose of this study was to investigate genetic and environmental factors contributing to the metabolic syndrome in Hispanic children. Metabolic syndrome, defined as having three or more metabolic risk components, was determined in 1030 Hispanic children, ages 4-19 y, from 319 families enrolled in the VIVA LA FAMILIA study. Anthropometry, body composition by dual energy x-ray absorptiometry, clinical signs, and serum biochemistries were measured using standard techniques. Risk factor analysis and quantitative genetic analysis were performed. Of the overweight children, 20%, or 28% if abnormal liver function is included in the definition, presented with the metabolic syndrome. Odds ratios for the metabolic syndrome were significantly increased by body mass index z-score and fasting serum insulin; independent effects of sex, age, puberty, and body composition were not seen. Heritabilities +/- SE for waist circumference, triglycerides (TG), HDL, systolic blood pressure (SBP), glucose, and alanine aminotransferase (ALT) were highly significant. Pleiotropy (a common set of genes affecting two traits) detected between SBP and waist circumference, SBP and glucose, HDL and waist circumference, ALT and waist circumference, and TG and ALT may underlie the clustering of the components of the metabolic syndrome. Significant heritabilities and pleiotropy seen for the components of the metabolic syndrome indicate a strong genetic contribution to the metabolic syndrome in overweight Hispanic children. PMID:16306201

  9. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    SciTech Connect

    Charland, P.; Peters, T. |

    1996-10-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters--namely the shape, color, and depth cue associated with a cursor--as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and the ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions.

  10. Quantitative analysis of noninvasive diagnostic procedures for induction motor drives

    NASA Astrophysics Data System (ADS)

    Eltabach, Mario; Antoni, Jerome; Najjar, Micheline

    2007-10-01

    This paper reports quantitative analyses of spectral fault components in five noninvasive diagnostic procedures that use input electric signals to detect different types of abnormalities in induction motors. Besides the traditional one-phase current spectrum analysis "SC", the diagnostic procedures based on spectrum analysis of the instantaneous partial powers "Pab", "Pcb", the total power "Pabc", and the current space vector modulus "csvm" are considered. The aim of this comparison study is to improve the diagnosis tools for detection of electromechanical faults in electrical machines by using the most suitable diagnostic procedure, knowing some motor and fault characteristics. Defining a severity factor as the increase in amplitude of the fault characteristic frequency, with respect to the healthy condition, enables us to study the sensitivity of the electrical diagnostic tools. As a result, it is shown that the relationship between the angular displacement of the current side-band components at frequencies (f ± fosc) is directly related to the type of induction motor faults. The total instantaneous power diagnostic procedure is also shown to exhibit the highest values of the detection criterion in the case of mechanical faults, while in the case of electrical faults the most reliable diagnostic procedure is tightly related to the value of the motor power factor angle and the group motor-load inertia. Finally, simulation and experimental results show good agreement with the fault modeling theoretical results.
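
    The severity factor defined in this abstract (the increase in amplitude of the fault characteristic frequency with respect to the healthy condition) can be illustrated with a short sketch. This is not the authors' code: the sampling rate, the 44 Hz fault frequency, and the test signals below are hypothetical, and a plain FFT amplitude spectrum stands in for whatever spectral estimator is actually used.

      import numpy as np

      def amplitude_spectrum(signal, fs):
          """One-sided amplitude spectrum of a real signal sampled at fs Hz."""
          n = len(signal)
          spectrum = 2.0 * np.abs(np.fft.rfft(signal)) / n
          freqs = np.fft.rfftfreq(n, d=1.0 / fs)
          return freqs, spectrum

      def severity_factor_db(healthy, faulty, fs, f_fault):
          """Increase (in dB) of the spectral component nearest f_fault, faulty vs. healthy."""
          freqs, s_healthy = amplitude_spectrum(healthy, fs)
          _, s_faulty = amplitude_spectrum(faulty, fs)
          idx = np.argmin(np.abs(freqs - f_fault))   # bin closest to the fault frequency
          return 20.0 * np.log10(s_faulty[idx] / s_healthy[idx])

      # Hypothetical stator current: 50 Hz supply with a weak 44 Hz side-band that grows under fault.
      fs = 5000.0
      t = np.arange(0, 2.0, 1.0 / fs)
      healthy = np.sin(2 * np.pi * 50 * t) + 1e-4 * np.sin(2 * np.pi * 44 * t)
      faulty = np.sin(2 * np.pi * 50 * t) + 5e-3 * np.sin(2 * np.pi * 44 * t)
      print(f"severity factor: {severity_factor_db(healthy, faulty, fs, 44.0):.1f} dB")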

  11. Semiautomatic Software For Quantitative Analysis Of Cardiac Positron Tomography Studies

    NASA Astrophysics Data System (ADS)

    Ratib, Osman; Bidaut, Luc; Nienaber, Christoph; Krivokapich, Janine; Schelbert, Heinrich R.; Phelps, Michael E.

    1988-06-01

    In order to derive accurate values for true tissue radiotracer concentrations from gated positron emission tomography (PET) images of the heart, which are critical for noninvasively quantifying regional myocardial blood flow and metabolism, appropriate corrections for partial volume effect (PVE) and contamination from adjacent anatomical structures are required. We therefore developed an integrated software package for quantitative analysis of tomographic images which provides for such corrections. A semiautomatic edge detection technique outlines and partitions the myocardium into sectors. Myocardial wall thickness is measured on the images perpendicularly to the detected edges and used to correct for PVE. The programs automatically correct for radioactive decay, activity calibration and cross contaminations for both static and dynamic studies. Parameters derived with these programs include tracer concentrations and their changes over time. They are used for calculating regional metabolic rates and can be further displayed as color coded parametric images. The approach was validated for PET imaging in 11 dog experiments. 2D echocardiograms (Echo) were recorded simultaneously to validate the edge detection and wall thickness measurement techniques. After correction for PVE using automatic wall thickness (WT) measurement, regional tissue tracer concentrations derived from PET images correlated well with true tissue concentrations as determined by well counting (r=0.98). These preliminary studies indicate that the developed automatic image analysis technique allows accurate and convenient evaluation of cardiac PET images for the measurement of both regional tracer tissue concentrations and regional myocardial function.
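
    Two of the corrections mentioned above can be sketched compactly. The decay correction is the standard exponential law; the partial volume correction shown here uses a generic one-dimensional slab approximation (recovery coefficient computed from wall thickness and scanner resolution), which is only a stand-in for the wall-thickness-based correction the authors implemented. All numerical values are hypothetical. A short Python sketch:

      from math import erf, exp, log, sqrt

      def decay_correct(activity, t_elapsed_s, half_life_s):
          """Correct a measured activity back to the reference time (standard decay law)."""
          return activity * exp(log(2.0) * t_elapsed_s / half_life_s)

      def recovery_coefficient(wall_thickness_mm, fwhm_mm):
          """1-D slab approximation: fraction of true activity recovered at the wall centre
          when a slab of the given thickness is blurred by a Gaussian PSF of the given FWHM."""
          sigma = fwhm_mm / 2.355
          return erf(wall_thickness_mm / (2.0 * sqrt(2.0) * sigma))

      # Hypothetical sector value: F-18 tracer (half-life ~6586 s), 10 mm wall, 12 mm FWHM resolution.
      measured = 8.5e3                                   # calibrated counts/s/mL
      corrected = decay_correct(measured, t_elapsed_s=1200.0, half_life_s=6586.0)
      true_concentration = corrected / recovery_coefficient(10.0, 12.0)
      print(f"decay- and PVE-corrected tissue concentration: {true_concentration:.0f} counts/s/mL")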

  12. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  13. Development of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    PubMed

    Oguchi, Taichi; Onishi, Mari; Minegishi, Yasutaka; Kurosawa, Yasunori; Kasahara, Masaki; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2009-06-01

    A duplex real-time PCR method was developed for quantitative screening analysis of GM maize. The duplex real-time PCR simultaneously detected two GM-specific segments, namely the cauliflower mosaic virus (CaMV) 35S promoter (P35S) segment and an event-specific segment for GA21 maize which does not contain P35S. Calibration was performed with a plasmid calibrant specially designed for the duplex PCR. The result of an in-house evaluation suggested that the analytical precision of the developed method was almost equivalent to those of simplex real-time PCR methods, which have been adopted as ISO standard methods for the analysis of GMOs in foodstuffs and have also been employed for the analysis of GMOs in Japan. In addition, this method will reduce both the cost and time requirement of routine GMO analysis by half. The high analytical performance demonstrated in the current study would be useful for the quantitative screening analysis of GM maize. We believe the developed method will be useful for practical screening analysis of GM maize, although interlaboratory collaborative studies should be conducted to confirm this. PMID:19602858
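
    Quantification against a plasmid calibrant of this kind typically runs through a standard curve of Ct versus log copy number for each target. The sketch below shows only that generic calculation: the dilution series and Ct values are hypothetical, and the official screening method additionally applies an event-specific conversion factor that is omitted here. In Python:

      import numpy as np

      def fit_standard_curve(log10_copies, ct_values):
          """Linear fit Ct = slope * log10(copies) + intercept from the plasmid calibrant."""
          slope, intercept = np.polyfit(log10_copies, ct_values, 1)
          return slope, intercept

      def copies_from_ct(ct, slope, intercept):
          """Invert the standard curve to estimate copy number in the unknown."""
          return 10 ** ((ct - intercept) / slope)

      # Hypothetical dilution series of the plasmid calibrant (copies per reaction).
      log10_copies = np.log10([20, 125, 1500, 20000, 250000])
      ct_p35s = np.array([36.1, 33.4, 29.8, 26.0, 22.4])    # P35S segment
      ct_endog = np.array([35.8, 33.2, 29.5, 25.8, 22.1])   # endogenous maize reference

      s1, i1 = fit_standard_curve(log10_copies, ct_p35s)
      s2, i2 = fit_standard_curve(log10_copies, ct_endog)

      # Unknown sample: the duplex reaction returns one Ct per target.
      p35s_copies = copies_from_ct(30.2, s1, i1)
      endog_copies = copies_from_ct(25.1, s2, i2)
      print(f"P35S / endogenous copy ratio: {p35s_copies / endog_copies:.3f}")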

  14. Quantitative analysis of the relationship between nucleotide sequence and functional activity.

    PubMed Central

    Stormo, G D; Schneider, T D; Gold, L

    1986-01-01

    Matrices can be used to evaluate sequences for functional activity. Multiple regression can solve for the matrix that gives the best fit between sequence evaluations and quantitative activities. This analysis shows that the best model for context effects on suppression by su2 involves primarily the two nucleotides 3' to the amber codon, and that their contributions are independent and additive. Context effects on 2AP mutagenesis also involve the two nucleotides 3' to the 2AP insertion, but their effects are not independent. In a construct for producing beta-galactosidase, the effects on translational yields of the tri-nucleotide 5' to the initiation codon are dependent on the entire triplet. Models based on these quantitative results are presented for each of the examples. PMID:3092188
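
    The matrix-fitting idea, evaluating each sequence as a sum of position-specific nucleotide weights and solving for the weights that best reproduce the measured activities, maps directly onto ordinary least squares. The two-base contexts and activity values below are invented for illustration. A minimal Python sketch:

      import numpy as np

      BASES = "ACGT"

      def one_hot(seq):
          """Flatten a sequence into a 0/1 vector with one entry per (position, base)."""
          v = np.zeros(len(seq) * 4)
          for i, b in enumerate(seq):
              v[i * 4 + BASES.index(b)] = 1.0
          return v

      # Hypothetical context sequences (e.g. the two nucleotides 3' of a codon) and activities.
      seqs = ["AA", "AC", "AG", "AT", "CA", "CC", "GA", "GT", "TA", "TC", "TG", "TT"]
      activity = np.array([0.9, 0.7, 0.5, 0.6, 0.4, 0.3, 0.8, 0.5, 0.2, 0.15, 0.1, 0.25])

      X = np.array([one_hot(s) for s in seqs])
      weights, *_ = np.linalg.lstsq(X, activity, rcond=None)   # the fitted "matrix", flattened
      matrix = weights.reshape(-1, 4)                          # rows = positions, cols = A, C, G, T

      def score(seq):
          """Evaluate a new sequence with the fitted matrix (additive, independent positions)."""
          return sum(matrix[i, BASES.index(b)] for i, b in enumerate(seq))

      print("predicted activity for 'GC':", round(score("GC"), 3))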

  15. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
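
    Velocity vector maps of the kind described here can be obtained, for example, with dense optical flow between consecutive bone-only frames. The sketch below uses OpenCV's Farnebäck algorithm on placeholder frames; this is one common way to measure local motion, not necessarily the method used by the authors, and the frame interval and block size are assumptions.

      import cv2
      import numpy as np

      def rib_velocity_map(frame_prev, frame_next, frame_interval_s, block=32):
          """Dense optical flow between two bone-only frames, averaged over local blocks,
          returned in pixels per second."""
          flow = cv2.calcOpticalFlowFarneback(frame_prev, frame_next, None,
                                              0.5, 3, 15, 3, 5, 1.2, 0)
          h, w = frame_prev.shape
          vectors = []
          for y in range(0, h - block, block):
              for x in range(0, w - block, block):
                  v = flow[y:y + block, x:x + block].mean(axis=(0, 1)) / frame_interval_s
                  vectors.append(((x, y), v))      # block origin and (vx, vy)
          return vectors

      # Hypothetical 8-bit bone images from a dynamic FPD sequence (placeholder arrays).
      prev_frame = np.random.randint(0, 255, (256, 256), dtype=np.uint8)
      next_frame = np.roll(prev_frame, 2, axis=0)   # simulate a small caudal shift
      vel = rib_velocity_map(prev_frame, next_frame, frame_interval_s=1 / 15)
      print("first block velocity (px/s):", vel[0][1])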

  16. Quantitative analysis of electroluminescence images from polymer solar cells

    NASA Astrophysics Data System (ADS)

    Seeland, Marco; Rösch, Roland; Hoppe, Harald

    2012-01-01

    We introduce the micro-diode-model (MDM) based on a discrete network of interconnected diodes, which allows for quantitative description of lateral electroluminescence emission images obtained from organic bulk heterojunction solar cells. Besides the distributed solar cell description, the equivalent-circuit (network) model considers interface and bulk resistances as well as the sheet resistance of the semitransparent electrode. The application of this model allows direct calculation of the lateral current and voltage distribution within the solar cell and thus accounts well for effects known as current crowding. In addition, network parameters such as internal resistances and the sheet resistance of the more resistive electrode can be determined. Furthermore, upon introduction of current sources the micro-diode-model is also able to describe and predict current-voltage characteristics for solar cell devices under illumination. The local nature of this description yields important conclusions concerning the geometry-dependent performance and the validity of classical models and equivalent circuits describing thin film solar cells.
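
    The micro-diode idea, a network of diodes coupled through the sheet resistance of the semitransparent electrode, can be reduced to a one-dimensional nodal sketch. All parameter values below are assumed for illustration; the real model is two-dimensional and also includes interface and bulk resistances and photocurrent sources, so this sketch only reproduces the current-crowding behaviour qualitatively. In Python:

      import numpy as np
      from scipy.optimize import fsolve

      N = 30            # number of diode elements along the strip
      I0 = 1e-12        # diode saturation current per element (A), assumed
      n_ideal = 1.5     # diode ideality factor, assumed
      Vt = 0.02585      # thermal voltage at room temperature (V)
      R_sheet = 10.0    # series resistance between neighbouring nodes (ohm), assumed
      V_applied = 0.7   # voltage at the contacted edge (V), assumed

      def residuals(V):
          """Kirchhoff current balance at every node of the 1-D diode network."""
          res = np.empty(N)
          for i in range(N):
              v_left = V_applied if i == 0 else V[i - 1]
              i_in = (v_left - V[i]) / R_sheet
              i_out = 0.0 if i == N - 1 else (V[i] - V[i + 1]) / R_sheet
              i_diode = I0 * (np.exp(V[i] / (n_ideal * Vt)) - 1.0)
              res[i] = i_in - i_out - i_diode
          return res

      V_nodes = fsolve(residuals, np.full(N, V_applied - 0.02))
      I_nodes = I0 * (np.exp(V_nodes / (n_ideal * Vt)) - 1.0)
      print("local diode currents fall off away from the contact:", I_nodes[:3], "...", I_nodes[-3:])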

  17. Quantitative Dopant/Impurity Analysis for ICF Targets

    NASA Astrophysics Data System (ADS)

    Huang, Haibo; Nikroo, Abbas; Stephens, Richard; Eddinger, Samual; Xu, Hongwei; Chen, K. C.; Moreno, Kari

    2008-11-01

    We developed a number of new or improved metrology techniques to measure the spatial distributions of multiple elements in ICF ablator capsules to tight NIF specifications (0.5±0.1 at% Cu, 0.25±0.10 at% Ar, 0.4±0.4 at% O). The elements are either the graded dopants for shock timing, such as Cu in Be, or process-induced impurities, such as Ar and O. Their low concentration, high spatial variation and simultaneous presence make the measurement very difficult. We solved this metrology challenge by combining several techniques: Cu and Ar profiles can be nondestructively measured by operating Contact Radiography (CR) in a differential mode. The result, as well as the O profile, can be checked destructively by a quantitative Energy Dispersive Spectroscopy (EDS) method. Non-spatially resolved methods, such as absorption edge spectroscopy (and to a lesser accuracy, x-ray fluorescence) can calibrate the Ar and Cu measurement in EDS and CR. In addition, oxygen pick-up during mandrel removal can be validated by before-and-after CR and by density change. Use of all these methods gives multiple checks on the reported profiles.

  18. Nonparametric survival analysis using Bayesian Additive Regression Trees (BART).

    PubMed

    Sparapani, Rodney A; Logan, Brent R; McCulloch, Robert E; Laud, Purushottam W

    2016-07-20

    Bayesian additive regression trees (BART) provide a framework for flexible nonparametric modeling of relationships of covariates to outcomes. Recently, BART models have been shown to provide excellent predictive performance for both continuous and binary outcomes, often exceeding that of competing methods. Software is also readily available for such outcomes. In this article, we introduce modeling that extends the usefulness of BART in medical applications by addressing needs arising in survival analysis. Simulation studies of one-sample and two-sample scenarios, in comparison with long-standing traditional methods, establish face validity of the new approach. We then demonstrate the model's ability to accommodate data from complex regression models with a simulation study of a nonproportional hazards scenario with crossing survival functions and survival function estimation in a scenario where hazards are multiplicatively modified by a highly nonlinear function of the covariates. Using data from a recently published study of patients undergoing hematopoietic stem cell transplantation, we illustrate the use and some advantages of the proposed method in medical investigations. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26854022
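
    The survival extension rests on recasting time-to-event data as binary event indicators on the grid of distinct observed event times, which probit BART can then model. A sketch of that discrete-time construction, in notation of my choosing rather than the paper's, is:

      % Event grid: distinct ordered event times t_(1) < t_(2) < ... < t_(K).
      % For subject i with covariates x_i, let y_ij = 1 if the event occurs at t_(j),
      % and 0 otherwise, for every j with t_(j) not exceeding subject i's follow-up time.
      \[
        p_{ij} = \Pr\bigl(y_{ij} = 1 \mid t_{(j)}, x_i\bigr)
               = \Phi\bigl(\mu + f(t_{(j)}, x_i)\bigr), \qquad f \sim \mathrm{BART},
      \]
      \[
        S(t \mid x_i) = \prod_{j\,:\,t_{(j)} \le t} \bigl(1 - p_{ij}\bigr).
      \]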

  19. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli

    PubMed Central

    Hur, Kwang-Ho; Mueller, Joachim D.

    2015-01-01

    The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell. PMID:26099032
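
    The Q-value at the heart of brightness analysis is the excess variance of the photon counts over pure shot noise, and the MSQ statistic averages it over short segments of the record. The simulation below illustrates that calculation only; it omits the corrections for photobleaching and for the partial overlap between the E. coli cell and the point spread function that the actual MSQ method introduces, and all parameter values are invented.

      import numpy as np

      def q_value(counts):
          """Mandel's Q: excess variance of photon counts relative to Poisson shot noise."""
          k = np.asarray(counts, dtype=float)
          return k.var() / k.mean() - 1.0

      def mean_segmented_q(counts, segment_len=5000):
          """MSQ-style statistic: average Q over consecutive segments of the photon record."""
          n_seg = len(counts) // segment_len
          segments = np.reshape(counts[:n_seg * segment_len], (n_seg, segment_len))
          return np.mean([q_value(s) for s in segments])

      # Simulated photon counts: Poisson shot noise on top of fluctuating particle numbers.
      rng = np.random.default_rng(1)
      brightness, mean_particles, n_bins = 0.2, 10.0, 200000   # hypothetical values
      particles = rng.poisson(mean_particles, n_bins)           # number fluctuations
      counts = rng.poisson(brightness * particles)              # detected photons per bin
      print(f"mean segmented Q: {mean_segmented_q(counts):.4f} (close to the brightness, {brightness})")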

  20. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software tools (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of the microorganisms of the same taxa in the sample. The difficulties of the task arise from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of metagenomic software to this "genome length bias." Therefore, we made a simple benchmark for the evaluation of "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software tool fails on that simple task, it will surely fail on most real metagenomes. We applied the three software tools to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
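
    The benchmark construction described here (equal copy numbers of three genomes of different lengths, shredded into ~150 bp reads and mixed) is simple enough to reproduce in a few lines. The sketch below builds such a read set from in-memory placeholder sequences; FASTA parsing, read-length distributions, and sequencing-error models are deliberately left out.

      import random

      def shred(genome, n_reads, read_len=150):
          """Draw n_reads random substrings of length read_len from a genome string."""
          reads = []
          for _ in range(n_reads):
              start = random.randrange(0, len(genome) - read_len)
              reads.append(genome[start:start + read_len])
          return reads

      def make_benchmark(genomes, copies=10, read_len=150):
          """Equal copy numbers of each genome: reads per genome scale with genome length,
          so an unbiased taxon counter should recover equal taxon proportions."""
          mix = []
          for name, seq in genomes.items():
              n_reads = copies * len(seq) // read_len
              mix += [(name, r) for r in shred(seq, n_reads, read_len)]
          random.shuffle(mix)
          return mix

      # Placeholder genomes of very different lengths (a real run would load FASTA files).
      genomes = {"short_taxon": "ACGT" * 2_000,
                 "medium_taxon": "ACGT" * 10_000,
                 "long_taxon": "ACGT" * 30_000}
      reads = make_benchmark(genomes)
      print(len(reads), "reads;", reads[0][0], reads[0][1][:20], "...")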

  1. Salicylate Detection by Complexation with Iron(III) and Optical Absorbance Spectroscopy: An Undergraduate Quantitative Analysis Experiment

    ERIC Educational Resources Information Center

    Mitchell-Koch, Jeremy T.; Reid, Kendra R.; Meyerhoff, Mark E.

    2008-01-01

    An experiment for the undergraduate quantitative analysis laboratory involving applications of visible spectrophotometry is described. Salicylate, a component found in several medications, as well as the active by-product of aspirin decomposition, is quantified. The addition of excess iron(III) to a solution of salicylate generates a deeply…
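
    The quantitation step in an experiment of this kind is an external calibration: absorbance of the iron(III)-salicylate complex is measured for a series of standards, fitted to a straight line (Beer-Lambert behaviour), and the unknown is read back from the fit. The standard concentrations, absorbances, and dilution factor below are hypothetical.

      import numpy as np

      # Hypothetical calibration standards for the Fe(III)-salicylate complex.
      conc_mM = np.array([0.05, 0.10, 0.20, 0.40, 0.80])      # salicylate, mmol/L
      absorbance = np.array([0.062, 0.121, 0.246, 0.488, 0.975])

      slope, intercept = np.polyfit(conc_mM, absorbance, 1)    # A = slope*c + intercept
      r = np.corrcoef(conc_mM, absorbance)[0, 1]

      def concentration(a_sample, dilution_factor=1.0):
          """Back-calculate salicylate concentration from a measured absorbance."""
          return (a_sample - intercept) / slope * dilution_factor

      print(f"calibration: A = {slope:.3f}*c + {intercept:.4f}  (r = {r:.4f})")
      print(f"aspirin tablet extract: {concentration(0.350, dilution_factor=50):.2f} mM salicylate")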

  2. Rapid inorganic ion analysis using quantitative microchip capillary electrophoresis.

    PubMed

    Vrouwe, Elwin X; Luttge, Regina; Olthuis, Wouter; van den Berg, Albert

    2006-01-13

    Rapid quantitative microchip capillary electrophoresis (CE) for online monitoring of drinking water enabling inorganic ion separation in less than 15 s is presented. Comparing cationic and anionic standards at different concentrations the analysis of cationic species resulted in non-linear calibration curves. We interpret this effect as a variation in the volume of the injected sample plug caused by changes of the electroosmotic flow (EOF) due to the strong interaction of bivalent cations with the glass surface. This explanation is supported by the observation of severe peak tailing. Conducting microchip CE analysis in a glass microchannel, optimized conditions are received for the cationic species K+, Na+, Ca2+, Mg2+ using a background electrolyte consisting of 30 mmol/L histidine and 2-(N-morpholino)ethanesulfonic acid, containing 0.5 mmol/L potassium chloride to reduce surface interaction and 4 mmol/L tartaric acid as a complexing agent resulting in a pH-value of 5.8. Applying reversed EOF co-migration for the anionic species Cl-, SO42- and HCO3- optimized separation occurs in a background electrolyte consisting of 10 mmol/L 4-(2-hydroxyethyl)-1-piperazineethanesulfonic acid (HEPES) and 10 mmol/L HEPES sodium salt, containing 0.05 mmol/L CTAB (cetyltrimethylammonium bromide) resulting in a pH-value of 7.5. The detection limits are 20 micromol/L for the monovalent cationic and anionic species and 10 micromol/L for the divalent species. These values make the method very suitable for many applications including the analysis of abundant ions in tap water as demonstrated in this paper. PMID:16310794

  3. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis

    PubMed Central

    Radzikowski, Jacek; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    Background The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. Objective This paper presents a study of Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, both in terms of its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as presenting a quantitative interdisciplinary approach to analyze such open-source data in the context of health narratives. Methods We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. Results The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. Conclusions The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more

  4. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    PubMed

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of the article is to present a trend in patent filings for application of nanotechnology to the automobile sector across the world, using the keyword-based patent search. Overviews of the patents related to nanotechnology in the automobile industry have been provided. The current work has started from the worldwide patent search to find the patents on nanotechnology in the automobile industry and classify the patents according to the various parts of an automobile to which they are related and the solutions which they are providing. In the next step, various graphs have been produced to get an insight into various trends. After that, analysis of patents in various classifications has been performed. The trends shown in graphs provide the quantitative analysis, whereas the qualitative analysis has been done in another section. The classifications of patents based on the solution they provide have been performed by reading the claims, titles, abstracts and full texts separately. Patentability of nanotechnology inventions has been discussed with a view to giving an idea of requirements and statutory bars to the patentability of nanotechnology inventions. Another objective of the current work is to suggest an appropriate framework for companies regarding use of nanotechnology in the automobile industry and a suggestive strategy for patenting of the inventions related to the same. For example, US patent US2008-019426A1 discusses an invention related to a lubricant composition. This patent has been studied and classified to fall under the classification of automobile parts. After studying this patent, it is deduced that the problem of friction in the engine is being solved by this patent. One classification is based on the "automobile part", while the other is based on the "problem being solved". Hence, two classifications, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a similar matrix has been created

  5. Teaching Quantitative Literacy through a Regression Analysis of Exam Performance

    ERIC Educational Resources Information Center

    Lindner, Andrew M.

    2012-01-01

    Quantitative literacy is increasingly essential for both informed citizenship and a variety of careers. Though regression is one of the most common methods in quantitative sociology, it is rarely taught until late in students' college careers. In this article, the author describes a classroom-based activity introducing students to regression…
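
    A scaled-down version of the classroom exercise, regressing exam scores on a few student-level predictors, takes only a handful of lines with statsmodels; the data below are invented for illustration.

      import numpy as np
      import statsmodels.api as sm

      # Invented class data: hours studied, lectures attended, and exam score.
      hours = np.array([2, 5, 1, 8, 4, 7, 3, 6, 9, 5])
      attendance = np.array([10, 18, 8, 22, 15, 20, 12, 19, 24, 16])
      score = np.array([55, 70, 48, 88, 66, 83, 60, 75, 92, 71])

      X = sm.add_constant(np.column_stack([hours, attendance]))  # intercept + 2 predictors
      model = sm.OLS(score, X).fit()

      print(model.params)        # intercept and slopes
      print(model.rsquared)      # share of variance in exam scores explained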

  6. Complete multipoint sib-pair analysis of qualitative and quantitative traits

    SciTech Connect

    Kruglyak, L.; Lander, E.S.

    1995-08-01

    Sib-pair analysis is an increasingly important tool for genetic dissection of complex traits. Current methods for sib-pair analysis are primarily based on studying individual genetic markers one at a time and thus fail to use the full inheritance information provided by multipoint linkage analysis. In this paper, we describe how to extract the complete multipoint inheritance information for each sib pair. We then describe methods that use this information to map loci affecting traits, thereby providing a unified approach to both qualitative and quantitative traits. Specifically, complete multipoint approaches are presented for (1) exclusion mapping of qualitative traits; (2) maximum-likelihood mapping of qualitative traits; (3) information-content mapping, showing the extent to which all inheritance information has been extracted at each location in the genome; and (4) quantitative-trait mapping, by two parametric methods and one nonparametric method. In addition, we explore the effects of marker density, marker polymorphism, and availability of parents on the information content of a study. We have implemented the analysis methods in a new computer package, MAPMAKER/SIBS. With this computer package, complete multipoint analysis with dozens of markers in hundreds of sib pairs can be carried out in minutes. 25 refs., 8 figs.

  7. Copulation patterns in captive hamadryas baboons: a quantitative analysis.

    PubMed

    Nitsch, Florian; Stueckle, Sabine; Stahl, Daniel; Zinner, Dietmar

    2011-10-01

    For primates, as for many other vertebrates, copulation which results in ejaculation is a prerequisite for reproduction. The probability of ejaculation is affected by various physiological and social factors, for example reproductive state of male and female and operational sex-ratio. In this paper, we present quantitative and qualitative data on patterns of sexual behaviour in a captive group of hamadryas baboons (Papio hamadryas), a species with a polygynous-monandric mating system. We observed more than 700 copulations and analysed factors that can affect the probability of ejaculation. Multilevel logistic regression analysis and Akaike's information criterion (AIC) model selection procedures revealed that the probability of successful copulation increased as the size of female sexual swellings increased, indicating increased probability of ovulation, and as the number of females per one-male unit (OMU) decreased. In contrast, occurrence of female copulation calls, sex of the copulation initiator, and previous male aggression toward females did not affect the probability of ejaculation. Synchrony of oestrus cycles also had no effect (most likely because the sample size was too small). We also observed 29 extra-group copulations by two non-adult males. Our results indicate that male hamadryas baboons copulated more successfully around the time of ovulation and that males in large OMUs with many females may be confronted by time or energy-allocation problems. PMID:21710159
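
    The model-selection logic, fitting candidate logistic models for the probability of ejaculation and comparing them by AIC, can be sketched with statsmodels. The records and predictor names below are simulated, and for simplicity an ordinary logistic regression replaces the multilevel model used in the study.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical copulation records: outcome 1 = ejaculation, 0 = no ejaculation.
      rng = np.random.default_rng(0)
      n = 200
      df = pd.DataFrame({
          "ejaculation": rng.integers(0, 2, n),
          "swelling_size": rng.uniform(0, 1, n),     # proxy for probability of ovulation
          "females_per_omu": rng.integers(1, 8, n),
          "copulation_call": rng.integers(0, 2, n),
      })

      candidates = {
          "swelling + OMU size": "ejaculation ~ swelling_size + females_per_omu",
          "swelling only": "ejaculation ~ swelling_size",
          "full": "ejaculation ~ swelling_size + females_per_omu + copulation_call",
      }
      for name, formula in candidates.items():
          fit = smf.logit(formula, data=df).fit(disp=0)
          print(f"{name:22s} AIC = {fit.aic:.1f}")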

  8. Quantitative analysis of protein dynamics during asymmetric cell division.

    PubMed

    Mayer, Bernd; Emery, Gregory; Berdnik, Daniela; Wirtz-Peitz, Frederik; Knoblich, Juergen A

    2005-10-25

    In dividing Drosophila sensory organ precursor (SOP) cells, the fate determinant Numb and its associated adaptor protein Pon localize asymmetrically and segregate into the anterior daughter cell, where Numb influences cell fate by repressing Notch signaling. Asymmetric localization of both proteins requires the protein kinase aPKC and its substrate Lethal (2) giant larvae (Lgl). Because both Numb and Pon localization require actin and myosin, lateral transport along the cell cortex has been proposed as a possible mechanism for their asymmetric distribution. Here, we use quantitative live analysis of GFP-Pon and Numb-GFP fluorescence and fluorescence recovery after photobleaching (FRAP) to characterize the dynamics of Numb and Pon localization during SOP division. We demonstrate that Numb and Pon rapidly exchange between a cytoplasmic pool and the cell cortex and that preferential recruitment from the cytoplasm is responsible for their asymmetric distribution during mitosis. Expression of a constitutively active form of aPKC impairs membrane recruitment of GFP-Pon. This defect can be rescued by coexpression of nonphosphorylatable Lgl, indicating that Lgl is the main target of aPKC. We propose that a high-affinity binding site is asymmetrically distributed by aPKC and Lgl and is responsible for asymmetric localization of cell-fate determinants during mitosis. PMID:16243032
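
    FRAP recovery curves like those analyzed here are commonly summarized by fitting a single-exponential recovery, which yields an exchange time and a mobile fraction. The sketch below does that with scipy on simulated intensities; the single-exponential model is a generic choice, not necessarily the authors' exact fitting procedure.

      import numpy as np
      from scipy.optimize import curve_fit

      def frap_recovery(t, mobile_fraction, tau):
          """Normalized post-bleach intensity: I(t) = mobile_fraction * (1 - exp(-t/tau))."""
          return mobile_fraction * (1.0 - np.exp(-t / tau))

      # Simulated cortical GFP-Pon recovery (normalized to pre-bleach = 1, bleach at t = 0).
      t = np.linspace(0, 60, 61)                         # seconds
      rng = np.random.default_rng(2)
      intensity = frap_recovery(t, 0.85, 8.0) + rng.normal(0, 0.02, t.size)

      (mf, tau), _ = curve_fit(frap_recovery, t, intensity, p0=[0.5, 5.0])
      print(f"mobile fraction = {mf:.2f}, exchange time tau = {tau:.1f} s "
            f"(half-time = {np.log(2) * tau:.1f} s)")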

  9. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    PubMed Central

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.

    2012-01-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data might be able to be used to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488
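
    A scaled-down version of the classification step, training a support vector machine on per-pixel spectral signatures labeled cancer versus normal, can be written with scikit-learn. Least-squares SVM is not available in scikit-learn, so the sketch substitutes a standard kernel SVM, and the spectra are simulated rather than real hyperspectral data.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # Simulated reflectance spectra: 200 pixels x 50 wavelength bands, two tissue classes.
      rng = np.random.default_rng(3)
      normal = rng.normal(0.60, 0.05, (100, 50))
      cancer = rng.normal(0.55, 0.05, (100, 50))
      cancer[:, 20:30] -= 0.08                  # a made-up absorption feature in tumor tissue
      X = np.vstack([normal, cancer])
      y = np.array([0] * 100 + [1] * 100)       # 0 = normal, 1 = cancer

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))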

  10. Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures

    PubMed Central

    de la Fuente, Ildefonso Martínez

    2010-01-01

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111

  11. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    NASA Astrophysics Data System (ADS)

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.; Fei, Baowei

    2012-07-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data might be able to be used to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology.

  12. Quantitative SERS sensors for environmental analysis of naphthalene.

    PubMed

    Péron, O; Rinnert, E; Toury, T; Lamy de la Chapelle, M; Compère, C

    2011-03-01

    In the investigation of chemical pollutants, such as PAHs (Polycyclic Aromatic Hydrocarbons) at low concentration in aqueous medium, Surface-Enhanced Raman Scattering (SERS) offers an alternative to normal Raman scattering, which suffers from an inherently low cross-section. Indeed, SERS is a very sensitive spectroscopic technique due to the excitation of the surface plasmon modes of the nanostructured metallic film. The surface of quartz substrates was coated with a hydrophobic film obtained by silanization and subsequently reacted with polystyrene (PS) beads coated with gold nanoparticles. The hydrophobic surface of the SERS substrates pre-concentrates non-polar molecules such as naphthalene. Under laser excitation, the SERS-active substrates allow the detection and the identification of the target molecules localized close to the gold nanoparticles. The morphology of the SERS substrates based on polystyrene beads surrounded by gold nanoparticles was characterized by scanning electron microscopy (SEM). Furthermore, the Raman fingerprint of the polystyrene serves as an internal spectral reference. On this basis, an innovative method to detect and quantify organic molecules, such as naphthalene in the range of 1 to 20 ppm, in aqueous media was developed. Such SERS-active substrates are promising as quantitative SERS sensors for the environmental analysis of naphthalene. PMID:21165476

  13. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    SciTech Connect

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g. shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  14. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify the trace elements and matrix elements at the single cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. The substantia nigra (SN) tissue obtained from the autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of Parkinsonian SN contained higher levels of Fe than those of the control. The concentrations were in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than those of the control. In particular, Zn was less than 40 ppm in SN tissue from the PDC case while it was 560-810 ppm in the control. These changes are considered to be closely related to the neurodegeneration and cell death.

  15. Hyperspectral imaging and quantitative analysis for prostate cancer detection.

    PubMed

    Akbari, Hamed; Halig, Luma V; Schuster, David M; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T; Chen, Georgia Z; Fei, Baowei

    2012-07-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data might be able to be used to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488

  16. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    SciTech Connect

    Haase, A.T.; Zupancic, M.; Cavert, W.

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.

  17. Quantitative trait locus analysis for hemostasis and thrombosis

    PubMed Central

    Sa, Qila; Hart, Erika; Hill, Annie E.; Nadeau, Joseph H.

    2009-01-01

    Susceptibility to thrombosis varies in human populations as well as many in inbred mouse strains. The objective of this study was to characterize the genetic control of thrombotic risk on three chromosomes. Previously, utilizing a tail-bleeding/rebleeding assay as a surrogate of hemostasis and thrombosis function, three mouse chromosome substitution strains (CSS) (B6-Chr5A/J, Chr11A/J, Chr17A/J) were identified (Hmtb1, Hmtb2, Hmtb3). The tailbleeding/rebleeding assay is widely used and distinguishes mice with genetic defects in blood clot formation or dissolution. In the present study, quantitative trait locus (QTL) analysis revealed a significant locus for rebleeding (clot stability) time (time between cessation of initial bleeding and start of the second bleeding) on chromosome 5, suggestive loci for bleeding time (time between start of bleeding and cessation of bleeding) also on chromosomes 5, and two suggestive loci for clot stability on chromosome 17 and one on chromosome 11. The three CSS and the parent A/J had elevated clot stability time. There was no interaction of genes on chromosome 11 with genes on chromosome 5 or chromosome 17. On chromosome 17, twenty-three candidate genes were identified in synteny with previously identified loci for thrombotic risk on human chromosome 18. Thus, we have identified new QTLs and candidate genes not previously known to influence thrombotic risk. PMID:18787898

  18. Precessing rotating flows with additional shear: Stability analysis

    NASA Astrophysics Data System (ADS)

    Salhi, A.; Cambon, C.

    2009-03-01

    We consider unbounded precessing rotating flows in which vertical or horizontal shear is induced by the interaction between the solid-body rotation (with angular velocity Ω0 ) and the additional “precessing” Coriolis force (with angular velocity -ɛΩ0 ), normal to it. A “weak” shear flow, with rate 2ɛ of the same order of the Poincaré “small” ratio ɛ , is needed for balancing the gyroscopic torque, so that the whole flow satisfies Euler’s equations in the precessing frame (the so-called admissibility conditions). The base flow case with vertical shear (its cross-gradient direction is aligned with the main angular velocity) corresponds to Mahalov’s [Phys. Fluids A 5, 891 (1993)] precessing infinite cylinder base flow (ignoring boundary conditions), while the base flow case with horizontal shear (its cross-gradient direction is normal to both main and precessing angular velocities) corresponds to the unbounded precessing rotating shear flow considered by Kerswell [Geophys. Astrophys. Fluid Dyn. 72, 107 (1993)]. We show that both these base flows satisfy the admissibility conditions and can support disturbances in terms of advected Fourier modes. Because the admissibility conditions cannot select one case with respect to the other, a more physical derivation is sought: Both flows are deduced from Poincaré’s [Bull. Astron. 27, 321 (1910)] basic state of a precessing spheroidal container, in the limit of small ɛ . A Rapid distortion theory (RDT) type of stability analysis is then performed for the previously mentioned disturbances, for both base flows. The stability analysis of the Kerswell base flow, using Floquet’s theory, is recovered, and its counterpart for the Mahalov base flow is presented. Typical growth rates are found to be the same for both flows at very small ɛ , but significant differences are obtained regarding growth rates and widths of instability bands, if larger ɛ values, up to 0.2, are considered. Finally, both flow cases

  19. A general approach for the purification and quantitative glycomic analysis of human plasma.

    PubMed

    Tep, Samnang; Hincapie, Marina; Hancock, William S

    2012-03-01

    The development of a general method for the purification and quantitative glycomic analysis of human plasma samples to characterize global glycosylation changes is presented. The method involves multiple steps, including the depletion of plasma via multi-affinity chromatography to remove highly abundant proteins, the enrichment of the less abundant glycoproteins via multi-lectin affinity chromatography, the isotopic derivatization of released glycans, and quantitative analysis by MALDI-TOF MS. Isotopic derivatization of glycans is accomplished using the well-established chemistry of reductive amination to derivatize glycans with either a light analog ((12)C anthranilic acid) or a heavy analog ((13)C(7) anthranilic acid), which allows for the direct comparison of the alternately labeled glycans by MALDI-TOF MS. The method displays a tenfold linear dynamic range for both neutral and sialylated glycans with sub-picomolar sensitivity. Additionally, by using anthranilic acid, a very sensitive fluorophore, as the derivatization reagent, the glycans can be analyzed by chromatography with fluorescence detection. The utility of this methodology is highlighted by the many diseases and disorders that are known to either show or be the result of changes in glycosylation. A method that provides a generic approach for sample preparation and quantitative data will help to further advance the field of glycomics. PMID:22274286

  20. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because variations in benzodiazepine concentrations in biological samples during bleeding, postmortem changes, and redistribution can bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of these main drug categories. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane:ethyl acetate and subsequent detection by a high-performance liquid chromatography method coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peak due to interfering substances in samples was observed. Conclusion: The present method was selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in the forensic toxicology laboratory.
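
    Validation figures such as the limits of detection and quantitation are often estimated from the calibration regression itself (ICH-style, LOD = 3.3 sigma/S and LOQ = 10 sigma/S, where sigma is the residual standard deviation and S the slope). The sketch below shows that generic calculation; the calibration points are invented and will not reproduce the 30 and 100 ng/mL values reported above.

      import numpy as np

      # Hypothetical diazepam calibration in vitreous humor: concentration vs. peak area.
      conc = np.array([30, 100, 300, 1000, 3000], dtype=float)    # ng/mL
      area = np.array([1.1e3, 3.8e3, 11.5e3, 38.0e3, 115.2e3])

      slope, intercept = np.polyfit(conc, area, 1)
      residuals = area - (slope * conc + intercept)
      sigma = residuals.std(ddof=2)                 # residual SD of the regression
      r = np.corrcoef(conc, area)[0, 1]

      lod = 3.3 * sigma / slope
      loq = 10.0 * sigma / slope
      print(f"r = {r:.4f}, LOD = {lod:.0f} ng/mL, LOQ = {loq:.0f} ng/mL")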

  1. Communication about vaccinations in Italian websites: a quantitative analysis.

    PubMed

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Babies' parents and people who look for information about vaccination often visit anti-vaccine movement websites and blogs by naturopathic physicians or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of websites retrieved through our searches. A quality score was created to evaluate the technical level of websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," with the function "OR" in order to identify the most frequently used websites. The 2 keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies, 32.2% (n = 48) consisted of personal blogs, and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% the webmaster's e-mail, 28.6% indicated the date of the last update, and 46.6% the author's name. The quality score for government sites was higher on average than that of anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, which cannot be left to private efforts alone but must be the result of synergy among Public Health institutions, private and scientific associations, and social movements. PMID:24607988

  2. Quantitative Analysis of Human Cancer Cell Extravasation Using Intravital Imaging.

    PubMed

    Willetts, Lian; Bond, David; Stoletov, Konstantin; Lewis, John D

    2016-01-01

    Metastasis, or the spread of cancer cells from a primary tumor to distant sites, is the leading cause of cancer-associated death. Metastasis is a complex multi-step process comprising invasion, intravasation, survival in circulation, extravasation, and formation of metastatic colonies. Currently, in vitro assays are limited in their ability to investigate these intricate processes and do not faithfully reflect metastasis as it occurs in vivo. Traditional in vivo models of metastasis are limited in their ability to visualize the seemingly sporadic behavior of where and when cancer cells spread (Reymond et al., Nat Rev Cancer 13:858-870, 2013). The avian embryo model of metastasis is a powerful platform to study many of the critical steps in the metastatic cascade including the migration, extravasation, and invasion of human cancer cells in vivo (Sung et al., Nat Commun 6:7164, 2015; Leong et al., Cell Rep 8, 1558-1570, 2014; Kain et al., Dev Dyn 243:216-28, 2014; Leong et al., Nat Protoc 5:1406-17, 2010; Zijlstra et al., Cancer Cell 13:221-234, 2008; Palmer et al., J Vis Exp 51:2815, 2011). The chicken chorioallantoic membrane (CAM) is a readily accessible and well-vascularized tissue that surrounds the developing embryo. When the chicken embryo is grown in a shell-less, ex ovo environment, the nearly transparent CAM provides an ideal environment for high-resolution fluorescent microscopy approaches. In this model, the embryonic chicken vasculature and labeled cancer cells can be visualized simultaneously to investigate specific steps in the metastatic cascade including extravasation. When combined with the proper image analysis tools, the ex ovo chicken embryo model offers a cost-effective and high-throughput platform for the quantitative analysis of tumor cell metastasis in a physiologically relevant in vivo setting. Here we discuss detailed procedures to quantify cancer cell extravasation in the shell-less chicken embryo model with advanced fluorescence

  3. Quantitative analysis of cell-free DNA in ovarian cancer

    PubMed Central

    Shao, Xuefeng; He, Yan; Ji, Min; Chen, Xiaofang; Qi, Jing; Shi, Wei; Hao, Tianbo; Ju, Shaoqing

    2015-01-01

    The aim of the present study was to investigate the association between cell-free DNA (cf-DNA) levels and clinicopathological characteristics of patients with ovarian cancer using a branched DNA (bDNA) technique, and to determine the value of quantitative cf-DNA detection in assisting with the diagnosis of ovarian cancer. Serum specimens were collected from 36 patients with ovarian cancer on days 1, 3 and 7 following surgery, and additional serum samples were also collected from 22 benign ovarian tumor cases and 19 healthy controls without ovarian disease. bDNA techniques were used to detect serum cf-DNA concentrations. All data were analyzed using SPSS version 18.0. The cf-DNA levels were significantly increased in the ovarian cancer group compared with those of the benign ovarian tumor group and healthy ovarian group (P<0.01). Furthermore, cf-DNA levels were significantly increased in stage III and IV ovarian cancer compared with stages I and II (P<0.01). In addition, cf-DNA levels were significantly increased on the first day post-surgery (P<0.01), and subsequently demonstrated a gradual decrease. In the ovarian cancer group, the area under the receiver operating characteristic curve of cf-DNA and the sensitivity were 0.917 and 88.9%, respectively, which were higher than those of cancer antigen 125 (0.724, 75%) and human epididymis protein 4 (0.743, 80.6%). There was a correlation between the levels of serum cf-DNA and the occurrence and development of ovarian cancer in the patients evaluated. bDNA techniques possessed higher sensitivity and specificity than other methods for the detection of serum cf-DNA in patients with ovarian cancer, and are thus more useful for detecting cf-DNA than other factors. Thus, the present study demonstrated the potential value for the use of bDNA as an adjuvant diagnostic method for ovarian cancer. PMID:26788153
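
    As an illustration of the diagnostic comparison described above, the sketch below computes an area under the ROC curve and a sensitivity for a marker against case/control labels. It uses scikit-learn on hypothetical values, not the study's data.

    ```python
    # Illustrative sketch (not the authors' code): comparing a candidate serum marker
    # against case/control labels by ROC analysis. The arrays are hypothetical.
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    # 1 = ovarian cancer, 0 = benign/healthy control (hypothetical labels)
    y = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])
    cf_dna = np.array([8.2, 7.9, 6.5, 2.1, 3.0, 2.8, 9.1, 3.5, 5.9, 2.2])  # hypothetical levels

    auc = roc_auc_score(y, cf_dna)
    fpr, tpr, thresholds = roc_curve(y, cf_dna)
    j = np.argmax(tpr - fpr)  # sensitivity at the Youden-optimal cutoff
    print(f"AUC = {auc:.3f}, sensitivity = {tpr[j]:.1%} at cutoff {thresholds[j]:.2f}")
    ```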

  4. Quantitative analysis of night skyglow amplification under cloudy conditions

    NASA Astrophysics Data System (ADS)

    Kocifaj, Miroslav; Solano Lamphar, Héctor Antonio

    2014-10-01

    The radiance produced by artificial light is a major source of nighttime over-illumination. It can, however, be treated experimentally using ground-based and satellite data. These two types of data complement each other and together have a high information content. For instance, the satellite data enable upward light emissions to be normalized, and this in turn allows skyglow levels at the ground to be modelled under cloudy or overcast conditions. Excessive night lighting imposes an unacceptable burden on nature, humans and professional astronomy. For this reason, there is a pressing need to determine the total amount of downwelling diffuse radiation. Undoubtedly, cloudy periods can cause a significant increase in skyglow as a result of amplification owing to diffuse reflection from clouds. While it is recognized that the amplification factor (AF) varies with cloud cover, the effects of different types of clouds, of atmospheric turbidity and of the geometrical relationships between the positions of an individual observer, the cloud layer, and the light source are in general poorly known. In this paper the AF is quantitatively analysed considering different aerosol optical depths (AODs), urban layout sizes and cloud types with specific albedos and altitudes. The computational results show that the AF peaks near the edges of a city rather than at its centre. In addition, the AF appears to be a decreasing function of AOD, which is particularly important when modelling the skyglow in regions with apparent temporal or seasonal variability of atmospheric turbidity. The findings in this paper will be useful to those designing engineering applications or modelling light pollution, as well as to astronomers and environmental scientists who aim to predict the amplification of skyglow caused by clouds. In addition, the semi-analytical formulae can be used to estimate the AF levels, especially in densely populated metropolitan regions for which detailed computations may be CPU

  5. Quantitative real-time single particle analysis of virions

    SciTech Connect

    Heider, Susanne; Metzner, Christoph

    2014-08-15

    Providing information about single virus particles has for a long time been mainly the domain of electron microscopy. More recently, technologies have been developed—or adapted from other fields, such as nanotechnology—to allow for the real-time quantification of physical virion particles, while supplying additional information such as particle diameter concomitantly. These technologies have progressed to the stage of commercialization, increasing the speed of viral titer measurements from hours to minutes, thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility to determine the ratio of infectious to total particles. A series of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we will discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification. - Highlights: • We introduce four methods for virus particle-based quantification of viruses. • They allow for quantification of a wide range of samples in under an hour. • The additional measurement of size and zeta potential is possible for some.

  6. Quantitative analysis of LISA pathfinder test-mass noise

    NASA Astrophysics Data System (ADS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-12-01

    LISA Pathfinder (LPF) is a mission aiming to test the critical technology for the forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3×10⁻¹⁴ m s⁻²/√Hz at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: i) excess noise detection and ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In such a region, the signal is dominated by the force noise acting on test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Considering those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data, since the spectrum statistics will be far from Gaussian and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum, the other is based on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data, then the algorithms are tested on LPF synthetic data. The test on realistic LPF data allows the effect of spectral data correlations on the efficiency of the different noise excess estimators to be highlighted. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data from a modified version of the standard equations for the inversion of the test statistic. Closely related to excess noise detection, the
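
    The Kolmogorov-Smirnov idea mentioned above can be sketched as follows: for Gaussian noise, each periodogram bin divided by the true power spectral density is approximately exponentially distributed, so a KS test against that distribution can flag excess noise. This is a minimal illustration on simulated white noise, not the LPF pipeline.

    ```python
    # Minimal sketch (assumptions, not the LPF analysis code): KS test of scaled
    # periodogram bins against the exponential distribution expected for Gaussian noise.
    import numpy as np
    from scipy import signal, stats

    rng = np.random.default_rng(0)
    fs = 1.0                       # Hz, hypothetical sampling rate
    x = rng.normal(size=86400)     # one simulated day of white measurement noise
    f, pxx = signal.periodogram(x, fs=fs)

    model_psd = np.full_like(pxx[1:], 2.0 / fs)   # one-sided white-noise PSD model
    ratio = pxx[1:] / model_psd                   # ~Exp(1) under the null hypothesis
    stat, p_value = stats.kstest(ratio, "expon")
    print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}  (small p suggests excess noise)")
    ```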

  7. Characterizing Aging in the Human Brainstem Using Quantitative Multimodal MRI Analysis

    PubMed Central

    Lambert, Christian; Chowdhury, Rumana; FitzGerald, Thomas H. B.; Fleming, Stephen M.; Lutti, Antoine; Hutton, Chloe; Draganski, Bogdan; Frackowiak, Richard; Ashburner, John

    2013-01-01

    Aging is ubiquitous to the human condition. The MRI correlates of healthy aging have been extensively investigated using a range of modalities, including volumetric MRI, quantitative MRI (qMRI), and diffusion tensor imaging. Despite this, the reported brainstem related changes remain sparse. This is, in part, due to the technical and methodological limitations in quantitatively assessing and statistically analyzing this region. By utilizing a new method of brainstem segmentation, a large cohort of 100 healthy adults were assessed in this study for the effects of aging within the human brainstem in vivo. Using qMRI, tensor-based morphometry (TBM), and voxel-based quantification (VBQ), the volumetric and quantitative changes across healthy adults between 19 and 75 years were characterized. In addition to the increased R2* in substantia nigra corresponding to increasing iron deposition with age, several novel findings were reported in the current study. These include selective volumetric loss of the brachium conjunctivum, with a corresponding decrease in magnetization transfer and increase in proton density (PD), accounting for the previously described “midbrain shrinkage.” Additionally, we found increases in R1 and PD in several pontine and medullary structures. We consider these changes in the context of well-characterized, functional age-related changes, and propose potential biophysical mechanisms. This study provides detailed quantitative analysis of the internal architecture of the brainstem and provides a baseline for further studies of neurodegenerative diseases that are characterized by early, pre-clinical involvement of the brainstem, such as Parkinson’s and Alzheimer’s diseases. PMID:23970860

  8. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer during close flybys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is N(aromatic):N(aliphatic) approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundances of aliphatic -CH2- and -CH3- are an indication of the lengths of the molecular chain structures, and hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  9. Quantitative analysis of harmonic convergence in mosquito auditory interactions

    PubMed Central

    Aldersley, Andrew; Champneys, Alan; Robert, Daniel

    2016-01-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the ‘harmonic convergence’ phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male–female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male–male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654
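
    One step implied by the Hilbert spectral analysis above is the extraction of a time-dependent wing-beat frequency from an audio channel via the analytic signal. The sketch below does this for a synthetic tone; it is an assumption-laden illustration, not the authors' toolchain.

    ```python
    # Hedged sketch: instantaneous wing-beat frequency from the analytic (Hilbert) signal.
    import numpy as np
    from scipy.signal import hilbert

    fs = 10000.0                                    # Hz, hypothetical sampling rate
    t = np.arange(0, 2.0, 1.0 / fs)
    tone = np.sin(2 * np.pi * (450 + 30 * t) * t)   # synthetic wing-beat-like tone, rising in pitch

    analytic = hilbert(tone)
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) / (2 * np.pi) * fs   # instantaneous frequency in Hz
    print(f"median wing-beat frequency ≈ {np.median(inst_freq):.1f} Hz")
    ```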

  10. Quantitative analysis of harmonic convergence in mosquito auditory interactions.

    PubMed

    Aldersley, Andrew; Champneys, Alan; Homer, Martin; Robert, Daniel

    2016-04-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the 'harmonic convergence' phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male-female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male-male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654

  11. Analysis of quantitative phase detection based on optical information processing

    NASA Astrophysics Data System (ADS)

    Tao, Wang; Tu, Jiang-Chen; Chun, Kuang-Tao; Yu, Han-Wang; Xin, Du

    2009-07-01

    Phase objects are common in nature, including biological cells, optical components, and atmospheric flow fields. Phase detection is important in basic research, nondestructive testing, aerospace, military systems, and other areas. The usual methods of phase object detection include the interference, grating, schlieren, and phase-contrast methods. Each has its own advantages, but each also has drawbacks in detection precision, environmental requirements, cost, detection rate, detection range, or detection linearity; even the most sophisticated of them, the phase-contrast method used mainly for microscopic structures, lacks a quantitative relationship between the magnitude of the object's phase, the image contrast, and the optical system. In this paper, various phase detection approaches and their application characteristics are analyzed from the viewpoint of optical information processing, and a phase detection system based on optical filtering is constructed. First, the frequency spectrum of the phase object is obtained with a Fourier transform lens; the spectrum is then modified by a suitable filter; finally, an image whose light intensity represents the phase distribution is obtained by the inverse Fourier transform. The advantages, disadvantages, and phase resolution of commonly used filters, such as the quarter-wavelength phase filter, the high-pass filter, and the edge filter, are analyzed within the same optical information processing system, and the factors affecting phase resolution are identified. The paper concludes that for any given application there exists an optimal filter that maximizes detection accuracy. Finally, we discuss how to design such an optimal filter so that the phase-measurement capability of the optical information processing system is improved the most.
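
    The Fourier-filtering scheme described above can be simulated numerically. The sketch below (an illustration, not the paper's code) applies a quarter-wave phase shift to the zero-order spectral component of a weak phase object, so that the output intensity becomes approximately linear in the object's phase.

    ```python
    # Zernike-type phase contrast as spectral filtering: FFT, shift the DC term by pi/2,
    # inverse FFT; for a weak phase object the intensity is ~ 1 + 2*phi.
    import numpy as np

    n = 256
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    phi = 0.2 * np.exp(-(x**2 + y**2) / (2 * 20.0**2))     # weak phase object (radians)
    field = np.exp(1j * phi)

    spectrum = np.fft.fftshift(np.fft.fft2(field))
    spectrum[n // 2, n // 2] *= np.exp(1j * np.pi / 2)      # quarter-wave filter on the DC component
    image = np.fft.ifft2(np.fft.ifftshift(spectrum))
    intensity = np.abs(image) ** 2

    print(f"phase-contrast image contrast ≈ {intensity.max() - intensity.min():.3f}")
    ```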

  12. Descriptive Quantitative Analysis of Rearfoot Alignment Radiographic Parameters.

    PubMed

    Meyr, Andrew J; Wagoner, Matthew R

    2015-01-01

    Although the radiographic parameters of the transverse talocalcaneal angle (tTCA), calcaneocuboid angle (CCA), talar head uncovering (THU), calcaneal inclination angle (CIA), talar declination angle (TDA), lateral talar-first metatarsal angle (lTFA), and lateral talocalcaneal angle (lTCA) form the basis of the preoperative evaluation and procedure selection for pes planovalgus deformity, the so-called normal values of these measurements are not well-established. The objectives of the present study were to retrospectively evaluate the descriptive statistics of these radiographic parameters (tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA) in a large population, and, second, to determine an objective basis for defining "normal" versus "abnormal" measurements. As a secondary outcome, the relationship of these variables to the body mass index was assessed. Anteroposterior and lateral foot radiographs from 250 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated. The results revealed a mean measurement of 24.12°, 13.20°, 74.32%, 16.41°, 26.64°, 8.37°, and 43.41° for the tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA, respectively. These were generally in line with the reported historical normal values. Descriptive statistical analysis demonstrated that the tTCA, THU, and TDA met the standards to be considered normally distributed but that the CCA, CIA, lTFA, and lTCA demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, only the CIA (R = -0.2428) and lTCA (R = -0.2449) demonstrated substantial correlation with the body mass index. No differentiations in deformity progression were observed when the radiographic parameters were plotted against each other to lead to a quantitative basis for defining "normal" versus "abnormal" measurements. PMID:26002682
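
    The statistical workflow described above (summary statistics, a normality check, and correlation with body mass index) can be sketched as follows on simulated angles; the numbers are placeholders, not the study's measurements.

    ```python
    # Illustrative sketch of the descriptive analysis: summary statistics, a normality
    # test, and Pearson correlation of a radiographic parameter with BMI.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    cia = rng.normal(16.4, 4.0, 250)          # calcaneal inclination angle, degrees (simulated)
    bmi = rng.normal(29.0, 6.0, 250)          # body mass index (simulated)

    print(f"CIA mean = {cia.mean():.2f} deg, SD = {cia.std(ddof=1):.2f} deg")
    w_stat, p_norm = stats.shapiro(cia)        # is the parameter normally distributed?
    r, p_corr = stats.pearsonr(cia, bmi)       # correlation with body mass index
    print(f"Shapiro-Wilk p = {p_norm:.3f}; Pearson r = {r:.3f} (p = {p_corr:.3f})")
    ```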

  13. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    PubMed Central

    Salminen, Aino; Kopra, K. A. Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S.; Sinisalo, Juha; Pussinen, Pirkko J.

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4–5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest OR 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and
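
    A minimal sketch of the adjusted logistic-regression step, assuming a binary burden indicator and the covariates named above; the data are random placeholders rather than the Parogene cohort, and statsmodels is simply one convenient choice.

    ```python
    # Hedged sketch: odds ratio for a high bacterial burden index, adjusted for covariates.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 200
    X = np.column_stack([
        rng.integers(0, 2, n),          # high combined bacterial burden (0/1)
        rng.normal(63, 9, n),           # age, years
        rng.integers(0, 2, n),          # gender
        rng.integers(0, 2, n),          # diabetes
        rng.integers(15, 33, n),        # number of teeth and implants
    ])
    y = rng.integers(0, 2, n)           # moderate-to-severe periodontitis (0/1), placeholder

    res = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
    odds_ratios = np.exp(res.params[1:])
    ci = np.exp(res.conf_int()[1:])
    print("OR (burden index, adjusted):", round(float(odds_ratios[0]), 2), "95% CI", np.round(ci[0], 2))
    ```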

  14. PIQMIe: a web server for semi-quantitative proteomics data management and analysis.

    PubMed

    Kuzniar, Arnold; Kanaar, Roland

    2014-07-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. PMID:24861615

  15. PIQMIe: a web server for semi-quantitative proteomics data management and analysis

    PubMed Central

    Kuzniar, Arnold; Kanaar, Roland

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. PMID:24861615

  16. Altered resting-state functional activity in posttraumatic stress disorder: A quantitative meta-analysis

    PubMed Central

    Wang, Ting; Liu, Jia; Zhang, Junran; Zhan, Wang; Li, Lei; Wu, Min; Huang, Hua; Zhu, Hongyan; Kemp, Graham J.; Gong, Qiyong

    2016-01-01

    Many functional neuroimaging studies have reported differential patterns of spontaneous brain activity in posttraumatic stress disorder (PTSD), but the findings are inconsistent and have not so far been quantitatively reviewed. The present study set out to determine consistent, specific regional brain activity alterations in PTSD, using the Effect Size Signed Differential Mapping technique to conduct a quantitative meta-analysis of resting-state functional neuroimaging studies of PTSD that used either a non-trauma (NTC) or a trauma-exposed (TEC) comparison control group. Fifteen functional neuroimaging studies were included, comparing 286 PTSDs, 203 TECs and 155 NTCs. Compared with NTC, PTSD patients showed hyperactivity in the right anterior insula and bilateral cerebellum, and hypoactivity in the dorsal medial prefrontal cortex (mPFC); compared with TEC, PTSD showed hyperactivity in the ventral mPFC. The pooled meta-analysis showed hypoactivity in the posterior insula, superior temporal, and Heschl’s gyrus in PTSD. Additionally, subgroup meta-analysis (non-medicated subjects vs. NTC) identified abnormal activation in the prefrontal-limbic system. In meta-regression analyses, mean illness duration was positively associated with activity in the right cerebellum (PTSD vs. NTC), and illness severity was negatively associated with activity in the right lingual gyrus (PTSD vs. TEC). PMID:27251865

  17. Altered resting-state functional activity in posttraumatic stress disorder: A quantitative meta-analysis.

    PubMed

    Wang, Ting; Liu, Jia; Zhang, Junran; Zhan, Wang; Li, Lei; Wu, Min; Huang, Hua; Zhu, Hongyan; Kemp, Graham J; Gong, Qiyong

    2016-01-01

    Many functional neuroimaging studies have reported differential patterns of spontaneous brain activity in posttraumatic stress disorder (PTSD), but the findings are inconsistent and have not so far been quantitatively reviewed. The present study set out to determine consistent, specific regional brain activity alterations in PTSD, using the Effect Size Signed Differential Mapping technique to conduct a quantitative meta-analysis of resting-state functional neuroimaging studies of PTSD that used either a non-trauma (NTC) or a trauma-exposed (TEC) comparison control group. Fifteen functional neuroimaging studies were included, comparing 286 PTSDs, 203 TECs and 155 NTCs. Compared with NTC, PTSD patients showed hyperactivity in the right anterior insula and bilateral cerebellum, and hypoactivity in the dorsal medial prefrontal cortex (mPFC); compared with TEC, PTSD showed hyperactivity in the ventral mPFC. The pooled meta-analysis showed hypoactivity in the posterior insula, superior temporal, and Heschl's gyrus in PTSD. Additionally, subgroup meta-analysis (non-medicated subjects vs. NTC) identified abnormal activation in the prefrontal-limbic system. In meta-regression analyses, mean illness duration was positively associated with activity in the right cerebellum (PTSD vs. NTC), and illness severity was negatively associated with activity in the right lingual gyrus (PTSD vs. TEC). PMID:27251865

  18. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    ERIC Educational Resources Information Center

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
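
    The isotopic dilution calculation at the heart of such an experiment reduces to simple arithmetic: a known mass of labeled caffeine is spiked into the sample, and the measured unlabeled-to-labeled ion-abundance ratio gives the native caffeine mass. The values below are hypothetical.

    ```python
    # Worked arithmetic sketch of isotope-dilution quantitation (hypothetical numbers).
    spike_mass_ug = 50.0            # labeled caffeine (e.g., caffeine-d3) added, micrograms
    area_unlabeled = 1.32e6         # integrated ion abundance of native caffeine (e.g., m/z 194)
    area_labeled = 1.10e6           # integrated ion abundance of the labeled standard (e.g., m/z 197)
    response_factor = 1.0           # assume equal response; determine by calibration in practice

    caffeine_ug = spike_mass_ug * (area_unlabeled / area_labeled) / response_factor
    sample_volume_ml = 5.0
    print(f"caffeine ≈ {caffeine_ug:.1f} ug ({caffeine_ug / sample_volume_ml:.1f} ug/mL)")
    ```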

  19. iTRAQ-Based Quantitative Proteomic Analysis of Nasopharyngeal Carcinoma.

    PubMed

    Cai, Xin-Zhang; Zeng, Wei-Qun; Xiang, Yi; Liu, Yi; Zhang, Hong-Min; Li, Hong; She, Sha; Yang, Min; Xia, Kun; Peng, Shi-Fang

    2015-07-01

    Nasopharyngeal carcinoma (NPC) is a common disease in the southern provinces of China with a poor prognosis. To better understand the pathogenesis of NPC and identify proteins involved in NPC carcinogenesis, we applied iTRAQ coupled with two-dimensional LC-MS/MS to compare the proteome profiles of NPC tissues and the adjacent non-tumor tissues. We identified 54 proteins with differential expression in NPC and the adjacent non-tumor tissues. The differentially expressed proteins were further determined by RT-PCR and Western blot analysis. In addition, the up-regulation of HSPB1, NPM1 and NCL were determined by immunohistochemistry using tissue microarray. Functionally, we found that siRNA mediated knockdown of NPM1 inhibited the migration and invasion of human NPC CNE1 cell line. In summary, this is the first study on proteome analysis of NPC tissues using an iTRAQ method, and we identified many new differentially expressed proteins which are potential targets for the diagnosis and therapy of NPC. PMID:25648846

  20. Optimized semi-quantitative blot analysis in infection assays using the Stain-Free technology.

    PubMed

    Zeitler, Anna F; Gerrer, Katrin H; Haas, Rainer; Jiménez-Soto, Luisa F

    2016-07-01

    Western blots are a commonly used method for protein detection and quantification in biological samples. Compensation of loading variations is achieved by housekeeping protein (HKP) normalization and/or total protein normalization (TPN). However, under infection conditions, HKP normalization, traditionally used in cell biology for quantification of western blots, can be problematic. Binding of microbes to target cells via specific receptors can induce signal transduction events resulting in drastic changes in the level of expression of HKPs. Additionally, samples collected after infection assays will include cellular and microbial proteins altering the analysis with TPN. Here we demonstrate under experimental infection conditions, how a reliable semi-quantitative analysis of proteins in western blots can be achieved using the Stain-Free technology. PMID:27150675

  1. Chemical fingerprint and quantitative analysis for quality control of polyphenols extracted from pomegranate peel by HPLC.

    PubMed

    Li, Jianke; He, Xiaoye; Li, Mengying; Zhao, Wei; Liu, Liu; Kong, Xianghong

    2015-06-01

    A simple and efficient HPLC fingerprint method was developed and validated for quality control of the polyphenols extracted from pomegranate peel (PPPs). Ten batches of pomegranate collected from different orchards in Lintong, Shaanxi, China, were used to establish the fingerprint. For the fingerprint analysis, 15 characteristic peaks were selected to evaluate the similarities of the 10 batches of the PPPs. The similarities of the PPPs samples were all more than 0.968, indicating that the samples from different areas of Lintong were consistent. Additionally, simultaneous quantification of eight monophenols (including gallic acid, punicalagin, catechin, chlorogenic acid, caffeic acid, epicatechin, rutin, and ellagic acid) in the PPPs was conducted to verify the consistency of the quality assessment. The results demonstrated that the HPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantitative analysis, can be successfully used to assess the quality and to identify the authenticity of the PPPs. PMID:25624199
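
    Fingerprint similarity in such studies is commonly computed as a cosine (congruence) coefficient between a batch's characteristic-peak vector and the reference fingerprint. The sketch below assumes 15 integrated peak areas per batch; the numbers are invented.

    ```python
    # Illustrative cosine-similarity calculation between a batch and a reference fingerprint.
    import numpy as np

    reference = np.array([1.0, 0.8, 3.2, 0.4, 2.1, 0.9, 5.5, 0.3, 1.1, 0.6,
                          0.7, 2.8, 0.5, 1.9, 4.2])              # 15 characteristic peak areas
    batch = np.array([1.1, 0.7, 3.0, 0.5, 2.0, 1.0, 5.3, 0.3, 1.2, 0.7,
                      0.6, 2.9, 0.4, 2.0, 4.0])

    similarity = np.dot(reference, batch) / (np.linalg.norm(reference) * np.linalg.norm(batch))
    print(f"fingerprint similarity = {similarity:.3f}")   # batches above ~0.968 were judged consistent
    ```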

  2. Estimation of crack and damage progression in concrete by quantitative acoustic emission analysis

    SciTech Connect

    Ohtsu, Masayasu

    1999-05-01

    The kinematics of cracking can be represented by the moment tensor. To determine moment tensor components from acoustic emission waveforms, the SiGMA (simplified Green's functions for moment tensor analysis) procedure was developed. By applying the procedure to bending tests of notched beams, cracks in the fracture process zone of cementitious materials can be identified kinematically. In addition to crack identification, estimation of the damage level in structural concrete is also conducted, based on the acoustic emission activity of a concrete sample under compression. Because acoustic emission generation depends on the damage caused by existing microcracks, this behavior is quantitatively estimated by rate process analysis. Damage mechanics is introduced to quantify the degree of damage. Determining the current damage level from acoustic emission without information on the undamaged concrete is attempted by correlating the damage value with the rate process.

  3. Quantitative analysis of localized surface plasmons based on molecular probing.

    PubMed

    Deeb, Claire; Bachelot, Renaud; Plain, Jérôme; Baudrion, Anne-Laure; Jradi, Safi; Bouhelier, Alexandre; Soppera, Olivier; Jain, Prashant K; Huang, Libai; Ecoffet, Carole; Balan, Lavinia; Royer, Pascal

    2010-08-24

    We report on the quantitative characterization of the plasmonic optical near-field of a single silver nanoparticle. Our approach relies on nanoscale molecular molding of the confined electromagnetic field by photoactivated molecules. We were able to directly image the dipolar profile of the near-field distribution with a resolution better than 10 nm and to quantify the near-field depth and its enhancement factor. A single nanoparticle spectral signature was also assessed. This quantitative characterization constitutes a prerequisite for developing nanophotonic applications. PMID:20687536

  4. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-01-01

    Prostate cancer is the leading form of malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at an early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiencies of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  5. Quantitative analysis of real-time tissue elastography for evaluation of liver fibrosis

    PubMed Central

    Shi, Ying; Wang, Xing-Hua; Zhang, Huan-Hu; Zhang, Hai-Qing; Tu, Ji-Zheng; Wei, Kun; Li, Juan; Liu, Xiao-Li

    2014-01-01

    The present study aimed to investigate the feasibility of quantitative analysis of liver fibrosis using real-time tissue elastography (RTE) and its pathological and molecule biological basis. Methods: Fifty-four New Zealand rabbits were subcutaneously injected with thioacetamide (TAA) to induce liver fibrosis as the model group, and another eight New Zealand rabbits served as the normal control group. Four rabbits were randomly taken every two weeks for real-time tissue elastography (RTE) and quantitative analysis of tissue diffusion. The obtained twelve characteristic quantities included relative mean value (MEAN), standard deviation (SD), blue area % (% AREA), complexity (COMP), kurtosis (KURT), skewness (SKEW), contrast (CONT), entropy (ENT), inverse difference moment (IDM), angular second moment (ASM), correlation (CORR) and liver fibrosis index (LF Index). Rabbits were sacrificed and liver tissues were taken for pathological staging of liver fibrosis (grouped by pathological stage into S0 group, S1 group, S2 group, S3 group and S4 group). In addition, the collagen I (Col I) and collagen III (Col III) expression levels in liver tissue were detected by Western blot. Results: Except for KURT, there were significant differences among the other eleven characteristic quantities (P < 0.05). LF Index, Col I and Col III expression levels showed a rising trend with increased pathological staging of liver fibrosis, presenting a positive correlation with the pathological staging of liver fibrosis (r = 0.718, r = 0.693, r = 0.611, P < 0.05). Conclusion: RTE quantitative analysis is expected for noninvasive evaluation of the pathological staging of liver fibrosis. PMID:24955175
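
    Several of the characteristic quantities listed above are first-order texture statistics of the elastogram, which are then correlated with pathological stage. The sketch below illustrates the idea on a synthetic image patch and hypothetical LF Index values; it is not the RTE vendor's algorithm.

    ```python
    # Hedged sketch: first-order texture statistics of a strain image and a Spearman
    # correlation of a per-animal index with fibrosis stage (all data synthetic).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    strain_image = rng.normal(0.5, 0.1, (64, 64))          # placeholder elastogram patch

    features = {
        "MEAN": strain_image.mean(),
        "SD": strain_image.std(ddof=1),
        "SKEW": stats.skew(strain_image.ravel()),
        "KURT": stats.kurtosis(strain_image.ravel()),
        "ENT": stats.entropy(np.histogram(strain_image, bins=32)[0] + 1),
    }
    print({k: round(float(v), 3) for k, v in features.items()})

    lf_index = np.array([0.2, 0.5, 1.1, 1.8, 2.6, 3.1])    # hypothetical per-animal index
    stage = np.array([0, 1, 2, 2, 3, 4])                   # hypothetical pathological stage
    rho, p = stats.spearmanr(lf_index, stage)
    print(f"Spearman rho = {rho:.3f} (p = {p:.3f})")
    ```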

  6. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using the scanning electron microscope equipped with an energy dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but the discrimination power of this technique is limited due to low sensitivity to trace elements and difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and Particle Induced X-ray Emission (μ-PIXE), together with Elastic Backscattering Spectrometry (EBS) is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, the GSR particles were all Pb, Ba, Sb. The precision of the method is assessed. The grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here, with a confidence >99%, irrespective of the firearm or population of particles selected. PMID:23775063

  7. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    ERIC Educational Resources Information Center

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  8. MOLD SPECIFIC QUANTITATIVE PCR: THE EMERGING STANDARD IN MOLD ANALYSIS

    EPA Science Inventory

    Today I will talk about the use of quantitative or Real time PCR for the standardized identification and quantification of molds. There are probably at least 100,000 species of molds or fungi. But there are actually about 100 typically found indoors. Some pose a threat to human...

  9. Features of the Quantitative Analysis in Moessbauer Spectroscopy

    SciTech Connect

    Semenov, V. G.; Panchuk, V. V.; Irkaev, S. M.

    2010-07-13

    The results describing the effect of different factors on errors in the quantitative determination of the phase composition of substances studied by Moessbauer absorption spectroscopy are presented, and ways of accounting for these factors are suggested. The effectiveness of the suggested methods is verified by an example of analyzing standard and unknown compositions.

  10. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  11. Quantitative Analysis of Radionuclides in Process and Environmental Samples

    SciTech Connect

    Boni, A.L.

    2003-02-21

    An analytical method was developed for the radiochemical separation and quantitative recovery of ruthenium, zirconium, niobium, neptunium, cobalt, iron, zinc, strontium, rare earths, chromium and cesium from a wide variety of natural materials. This paper discusses this analytical method, based on the anion exchange properties of the various radionuclides, although both ion exchange and precipitation techniques are incorporated.

  12. Additional analysis of dendrochemical data of Fallon, Nevada.

    PubMed

    Sheppard, Paul R; Helsel, Dennis R; Speakman, Robert J; Ridenour, Gary; Witten, Mark L

    2012-04-01

    Previously reported dendrochemical data showed temporal variability in concentration of tungsten (W) and cobalt (Co) in tree rings of Fallon, Nevada, US. Criticism of this work questioned the use of the Mann-Whitney test for determining change in element concentrations. Here, we demonstrate that Mann-Whitney is appropriate for comparing background element concentrations to possibly elevated concentrations in environmental media. Given that Mann-Whitney tests for differences in shapes of distributions, inter-tree variability (e.g., "coefficient of median variation") was calculated for each measured element across trees within subsites and time periods. For W and Co, the metals of highest interest in Fallon, inter-tree variability was always higher within versus outside of Fallon. For calibration purposes, this entire analysis was repeated at a different town, Sweet Home, Oregon, which has a known tungsten-powder facility, and inter-tree variability of W in tree rings confirmed the establishment date of that facility. Mann-Whitney testing of simulated data also confirmed its appropriateness for analysis of data affected by point-source contamination. This research adds important new dimensions to dendrochemistry of point-source contamination by adding analysis of inter-tree variability to analysis of central tendency. Fallon remains distinctive by a temporal increase in W beginning by the mid 1990s and by elevated Co since at least the early 1990s, as well as by high inter-tree variability for W and Co relative to comparison towns. PMID:22227064
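
    The two statistics discussed above can be sketched as follows on hypothetical tree-ring concentrations: a Mann-Whitney U test comparing background with possibly elevated tungsten levels, and an inter-tree variability measure. The "coefficient of median variation" is implemented here as the median absolute deviation divided by the median, which is an assumption about the exact definition used.

    ```python
    # Sketch with hypothetical tree-ring tungsten concentrations (ppm).
    import numpy as np
    from scipy import stats

    background_w = np.array([0.08, 0.11, 0.09, 0.12, 0.10, 0.07])   # pre-change period, hypothetical
    recent_w = np.array([0.35, 0.52, 0.18, 0.90, 0.27, 0.61])       # possibly elevated period, hypothetical

    u_stat, p = stats.mannwhitneyu(background_w, recent_w, alternative="two-sided")
    print(f"Mann-Whitney U = {u_stat}, p = {p:.4f}")

    def coefficient_of_median_variation(x):
        """Assumed definition: median absolute deviation from the median, relative to the median."""
        x = np.asarray(x, dtype=float)
        return np.median(np.abs(x - np.median(x))) / np.median(x)

    print(f"inter-tree variability (background) = {coefficient_of_median_variation(background_w):.2f}")
    print(f"inter-tree variability (recent)     = {coefficient_of_median_variation(recent_w):.2f}")
    ```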

  13. Quantitative solid state NMR analysis of residues from acid hydrolysis of loblolly pine wood.

    PubMed

    Sievers, Carsten; Marzialetti, Teresita; Hoskins, Travis J C; Valenzuela Olarte, Mariefel B; Agrawal, Pradeep K; Jones, Christopher W

    2009-10-01

    The composition of solid residues from hydrolysis reactions of loblolly pine wood with dilute mineral acids is analyzed by (13)C Cross Polarization Magic Angle Spinning (CP MAS) NMR spectroscopy. Using this method, the carbohydrate and lignin fractions are quantified in less than 3 h, as compared to over a day using wet chemical methods. In addition to the quantitative information, (13)C CP MAS NMR spectroscopy provides information on the formation of additional extractives and pseudo lignin from the carbohydrates. Being a non-destructive technique, NMR spectroscopy provides unambiguous evidence of the presence of side reactions and products, which is a clear advantage over the wet chemical analytical methods. Quantitative results from NMR spectroscopy and proximate analysis are compared for the residues from hydrolysis of loblolly pine wood under 13 different conditions; samples were treated either at 150 degrees C or 200 degrees C in the presence of various acids (HCl, H(2)SO(4), H(3)PO(4), HNO(3) and TFA) or water. The lignin content determined by both methods differed on average by 2.9 wt%, resulting in a standard deviation of 3.5 wt%. It is shown that solid degradation products are formed from saccharide precursors under harsh reaction conditions. These degradation reactions limit the total possible yield of monosaccharides from any subsequent reaction. PMID:19477123

  14. Analysis of Saccharides by the Addition of Amino Acids

    NASA Astrophysics Data System (ADS)

    Ozdemir, Abdil; Lin, Jung-Lee; Gillig, Kent J.; Gulfen, Mustafa; Chen, Chung-Hsuan

    2016-06-01

    In this work, we present an improvement in the detection sensitivity of electrospray ionization (ESI) mass spectrometry for neutral saccharides in positive ion mode, achieved by the addition of various amino acids. Saccharides spanning a broad molecular weight range were chosen as the model compounds in the present study. Saccharides form strong noncovalent interactions with amino acids, and the complex formation enhances the signal intensity and simplifies the mass spectra of saccharides. Polysaccharides provide a polymer-like ESI spectrum with a basic subunit difference between multiply charged chains. The protonated spectra of saccharides are not well resolved because the same molecules produce different charge state distributions. Depending on the solvent used and other ions or molecules present in the solution, noncovalent interactions with saccharides may occur. These interactions are affected by the addition of amino acids. Amino acids with polar side groups show a strong tendency to interact with saccharides. In particular, serine shows a high tendency to interact with saccharides and significantly improves the detection sensitivity of saccharide compounds.

  15. Porosity Measurements and Analysis for Metal Additive Manufacturing Process Control

    PubMed Central

    Slotwinski, John A; Garboczi, Edward J; Hebenstreit, Keith M

    2014-01-01

    Additive manufacturing techniques can produce complex, high-value metal parts, with potential applications as critical metal components such as those found in aerospace engines and as customized biomedical implants. Material porosity in these parts is undesirable for aerospace parts - since porosity could lead to premature failure - and desirable for some biomedical implants - since surface-breaking pores allow for better integration with biological tissue. Changes in a part’s porosity during an additive manufacturing build may also be an indication of an undesired change in the build process. Here, we present efforts to develop an ultrasonic sensor for monitoring changes in the porosity in metal parts during fabrication on a metal powder bed fusion system. The development of well-characterized reference samples, measurements of the porosity of these samples with multiple techniques, and correlation of ultrasonic measurements with the degree of porosity are presented. A proposed sensor design, measurement strategy, and future experimental plans on a metal powder bed fusion system are also presented. PMID:26601041

  16. Porosity Measurements and Analysis for Metal Additive Manufacturing Process Control.

    PubMed

    Slotwinski, John A; Garboczi, Edward J; Hebenstreit, Keith M

    2014-01-01

    Additive manufacturing techniques can produce complex, high-value metal parts, with potential applications as critical metal components such as those found in aerospace engines and as customized biomedical implants. Material porosity in these parts is undesirable for aerospace parts - since porosity could lead to premature failure - and desirable for some biomedical implants - since surface-breaking pores allow for better integration with biological tissue. Changes in a part's porosity during an additive manufacturing build may also be an indication of an undesired change in the build process. Here, we present efforts to develop an ultrasonic sensor for monitoring changes in the porosity in metal parts during fabrication on a metal powder bed fusion system. The development of well-characterized reference samples, measurements of the porosity of these samples with multiple techniques, and correlation of ultrasonic measurements with the degree of porosity are presented. A proposed sensor design, measurement strategy, and future experimental plans on a metal powder bed fusion system are also presented. PMID:26601041

  17. Additional EIPC Study Analysis: Interim Report on High Priority Topics

    SciTech Connect

    Hadley, Stanton W

    2013-11-01

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC) among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that involved creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission- focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phase 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 13 topics was developed for further analysis; this paper discusses the first five.

  18. Disclosure of hydraulic fracturing fluid chemical additives: analysis of regulations.

    PubMed

    Maule, Alexis L; Makey, Colleen M; Benson, Eugene B; Burrows, Isaac J; Scammell, Madeleine K

    2013-01-01

    Hydraulic fracturing is used to extract natural gas from shale formations. The process involves injecting into the ground fracturing fluids that contain thousands of gallons of chemical additives. Companies are not mandated by federal regulations to disclose the identities or quantities of chemicals used during hydraulic fracturing operations on private or public lands. States have begun to regulate hydraulic fracturing fluids by mandating chemical disclosure. These laws have shortcomings including nondisclosure of proprietary or "trade secret" mixtures, insufficient penalties for reporting inaccurate or incomplete information, and timelines that allow for after-the-fact reporting. These limitations leave lawmakers, regulators, public safety officers, and the public uninformed and ill-prepared to anticipate and respond to possible environmental and human health hazards associated with hydraulic fracturing fluids. We explore hydraulic fracturing exemptions from federal regulations, as well as current and future efforts to mandate chemical disclosure at the federal and state level. PMID:23552653

  19. Quantitative analysis of rib kinematics based on dynamic chest bone images: preliminary results

    PubMed Central

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-01-01

    An image-processing technique for separating bones from soft tissue in static chest radiographs has been developed. The present study was performed to evaluate the usefulness of dynamic bone images in quantitative analysis of rib movement. Dynamic chest radiographs of 16 patients were obtained using a dynamic flat-panel detector and processed to create bone images by using commercial software (Clear Read BS, Riverain Technologies). Velocity vectors were measured in local areas of the dynamic images, forming a velocity map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements in scoliosis patients appeared as a reduced rib velocity field, resulting in an asymmetrical distribution of rib movement. Vector maps in all normal cases exhibited left/right symmetric distributions of the velocity field, whereas those in abnormal cases showed asymmetric distributions because of locally limited rib movements. Dynamic bone images were useful for accurate quantitative analysis of rib movements. The present method has potential as an additional functional examination in chest radiography. PMID:26158097
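
    The velocity-vector measurement described above can be approximated with dense optical flow between consecutive bone-image frames, averaged over local regions. The sketch below uses OpenCV's Farneback estimator on synthetic frames; it is an illustration, not the authors' software.

    ```python
    # Minimal sketch: dense velocity map between two frames, summarized per 16x16 block.
    import numpy as np
    import cv2

    rng = np.random.default_rng(4)
    frame0 = (rng.random((256, 256)) * 255).astype(np.uint8)
    frame1 = np.roll(frame0, shift=2, axis=0)        # simulate a 2-pixel downward rib motion

    flow = cv2.calcOpticalFlowFarneback(frame0, frame1, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    speed = np.linalg.norm(flow, axis=2)             # pixels per frame at every pixel
    block = 16
    velocity_map = speed.reshape(256 // block, block, 256 // block, block).mean(axis=(1, 3))
    print(f"median regional speed ≈ {np.median(velocity_map):.2f} px/frame")
    ```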

  20. WormFarm: a quantitative control and measurement device toward automated Caenorhabditis elegans aging analysis.

    PubMed

    Xian, Bo; Shen, Jie; Chen, Weiyang; Sun, Na; Qiao, Nan; Jiang, Dongqing; Yu, Tao; Men, Yongfan; Han, Zhijun; Pang, Yuhong; Kaeberlein, Matt; Huang, Yanyi; Han, Jing-Dong J

    2013-06-01

    Caenorhabditis elegans is a leading model organism for studying the basic mechanisms of aging. Progress has been limited, however, by the lack of an automated system for quantitative analysis of longevity and mean lifespan. To address this barrier, we developed 'WormFarm', an integrated microfluidic device for culturing nematodes. Cohorts of 30-50 animals are maintained throughout their lifespan in each of eight separate chambers on a single WormFarm polydimethylsiloxane chip. Design features allow for automated removal of progeny and efficient control of environmental conditions. In addition, we have developed computational algorithms for automated analysis of video footage to quantitate survival and other phenotypes, such as body size and motility. As proof-of-principle, we show here that WormFarm successfully recapitulates survival data obtained from a standard plate-based assay for both RNAi-mediated and dietary-induced changes in lifespan. Further, using a fluorescent reporter in conjunction with WormFarm, we report an age-associated decrease in fluorescent intensity of GFP in transgenic worms expressing GFP tagged with a mitochondrial import signal under the control of the myo-3 promoter. This marker may therefore serve as a useful biomarker of biological age and aging rate. PMID:23442149
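
    Once the video analysis yields per-worm lifespans, the downstream survival comparison is standard. A minimal sketch using the lifelines package (one possible choice, not necessarily what WormFarm uses) with hypothetical lifespans:

    ```python
    # Hedged sketch: Kaplan-Meier estimate and log-rank comparison of two conditions.
    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    control = np.array([14, 16, 18, 19, 20, 21, 22, 22, 24, 25])       # days, hypothetical
    treated = np.array([18, 20, 22, 23, 25, 26, 27, 28, 30, 31])       # days, hypothetical (e.g., RNAi)

    kmf = KaplanMeierFitter()
    kmf.fit(control, event_observed=np.ones_like(control), label="control")
    print(f"control median lifespan = {kmf.median_survival_time_} days")

    result = logrank_test(control, treated)
    print(f"log-rank p = {result.p_value:.4f}")
    ```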

  1. Global quantitative analysis of phosphorylation underlying phencyclidine signaling and sensorimotor gating in the prefrontal cortex

    PubMed Central

    McClatchy, Daniel B.; Savas, Jeffrey N.; Martínez-Bartolomé, Salvador; Park, Sung Kyu; Maher, Pamela; Powell, Susan B.; Yates, John R.

    2015-01-01

    Prepulse inhibition (PPI) is an example of sensorimotor gating and deficits in PPI have been demonstrated in schizophrenia patients. Phencyclidine (PCP) suppression of PPI in animals has been studied to elucidate the pathological elements of schizophrenia. However, the molecular mechanisms underlying PCP treatment or PPI in the brain are still poorly understood. In this study, quantitative phosphoproteomic analysis was performed on the prefrontal cortex from rats that were subjected to PPI after being systemically injected with PCP or saline. PCP down-regulated phosphorylation events were significantly enriched in proteins associated with long-term potentiation (LTP). Importantly, this dataset identifies functionally novel phosphorylation sites on known LTP-associated signaling molecules. In addition, mutagenesis of a significantly altered phosphorylation site on xCT (SLC7A11), the light chain of system xc-, the cystine/glutamate antiporter, suggests that PCP also regulates the activity of this protein. Finally, new insights were also derived on PPI signaling independent of PCP treatment. This is the first quantitative phosphorylation proteomic analysis providing new molecular insights into sensorimotor gating. PMID:25869802

  2. Quantitative deuterium analysis of titanium samples in ultraviolet laser-induced low-pressure helium plasma.

    PubMed

    Abdulmadjid, Syahrun Nur; Lie, Zener Sukra; Niki, Hideaki; Pardede, Marincan; Hedwig, Rinda; Lie, Tjung Jie; Jobiliong, Eric; Kurniawan, Koo Hendrik; Fukumoto, Ken-Ichi; Kagawa, Kiichiro; Tjia, May On

    2010-04-01

    An experimental study of ultraviolet (UV) laser-induced plasma spectroscopy (LIPS) on Ti samples with low-pressure surrounding He gas has been carried out to demonstrate its applicability to quantitative micro-analysis of deuterium impurities in titanium without the spectral interference from the ubiquitous surface water. This was achieved by adopting the optimal experimental condition ascertained in this study, which is specified by 5 mJ laser energy, 10 Torr helium pressure, and 1-50 μs measurement window, which resulted in consistent D emission enhancement and effective elimination of spectral interference from surface water. As a result, a linear calibration line exhibiting a zero intercept was obtained from Ti samples doped with various D impurity concentrations. An additional measurement also yielded a detection limit of about 40 ppm for D impurity, well below the acceptable threshold of damaging H concentration in Ti and its alloys. Each of these measurements was found to produce a crater size of only 25 μm in diameter, and they may therefore qualify as nondestructive measurements. The result of this study has therefore paved the way for conducting further experiments with hydrogen-doped Ti samples and the technical implementation of quantitative micro-analysis of detrimental hydrogen impurity in Ti metal and its alloys, which is the ultimate goal of this study. PMID:20412619

  3. [The Quantitative Analysis of Raman Spectroscopy to Sulfate Ion in Aqueous Solution].

    PubMed

    Wang, Qian-qian; Sun, Qiang

    2016-02-01

    As a non-destructive and non-contact method, Raman spectroscopy has been widely applied in many research fields. Based on vibrational wavenumbers, Raman spectroscopy is usually applied to identify molecular species, but quantitative Raman analysis is also needed. In this study, according to the theoretical analysis of Raman intensity, quantitative Raman measurement should be based on relative intensity ratios, using either internal or external standards; this eliminates the influence of the measurement conditions. For aqueous solutions, it is reasonable to treat the OH stretching band of water as an internal standard for determining solute concentrations. The Raman spectra of Na₂SO₄-H₂O, K₂SO₄-H₂O and NaCl-Na₂SO₄-H₂O are recorded in the paper. In addition, the Raman OH stretching band of water can be fitted with two Gaussian sub-bands. The intensity ratio I(SO₄²⁻)/I(W) is used to determine the molarity of sulfate in aqueous solution, where I(SO₄²⁻) represents the intensity of the sulfate band and I(W) represents the sum of the two sub-bands of the Raman OH stretching band of water. Therefore, Raman spectroscopy can be utilized to measure SO₄²⁻ concentrations in aqueous solutions. PMID:27209744
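    As an illustration of the internal-standard ratio described above, the following sketch (not the authors' code; band positions and spectra are synthetic assumptions) fits the OH stretching band with two Gaussian sub-bands and forms the ratio I(SO₄²⁻)/I(W).

```python
# Minimal sketch with synthetic spectra: two-Gaussian fit of the water OH
# stretching band and an intensity ratio against the sulfate band.
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, a, mu, sigma):
    return a * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

def two_gauss(x, a1, mu1, s1, a2, mu2, s2):
    return gauss(x, a1, mu1, s1) + gauss(x, a2, mu2, s2)

# Synthetic OH stretching region (~2800-3800 cm^-1), for demonstration only.
x_oh = np.linspace(2800, 3800, 500)
y_oh = two_gauss(x_oh, 1.0, 3250, 120, 0.8, 3450, 100) + 0.01 * np.random.randn(x_oh.size)

p0 = [1.0, 3250, 100, 1.0, 3450, 100]              # initial guesses for the fit
popt, _ = curve_fit(two_gauss, x_oh, y_oh, p0=p0)

# I(W): sum of the areas of the two fitted sub-bands (area = a * sigma * sqrt(2*pi)).
i_w = (popt[0] * abs(popt[2]) + popt[3] * abs(popt[5])) * np.sqrt(2 * np.pi)

# I(SO4^2-): integrated intensity of the ~981 cm^-1 sulfate band (synthetic here).
x_s = np.linspace(950, 1010, 200)
y_s = gauss(x_s, 0.5, 981, 5)
i_so4 = np.trapz(y_s, x_s)

ratio = i_so4 / i_w   # compared against a calibration line to yield molarity
print(f"I(SO4)/I(W) = {ratio:.4f}")
```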

  4. Quantitative analysis of HSV gene expression during lytic infection

    PubMed Central

    Turner, Anne-Marie W.; Arbuckle, Jesse H.; Kristie, Thomas M.

    2014-01-01

    Herpes Simplex Virus (HSV) is a human pathogen that establishes latency and undergoes periodic reactivation, resulting in chronic recurrent lytic infection. HSV lytic infection is characterized by an organized cascade of three gene classes; however, successful transcription and expression of the first, the immediate-early class, is critical to the overall success of viral infection. This initial event of lytic infection is also highly dependent on host cell factors. This unit uses RNA interference and small molecule inhibitors to examine the role of host and viral proteins in HSV lytic infection. Methods detailing isolation of viral and host RNA and genomic DNA, followed by quantitative real-time PCR, allow characterization of effects on viral transcription and replication, respectively. Western blot can be used to confirm quantitative PCR results. This combination of protocols represents a starting point for researchers interested in virus-host interactions during HSV lytic infection. PMID:25367270

  5. Quantitative and qualitative HPLC analysis of thermogenic weight loss products.

    PubMed

    Schaneberg, B T; Khan, I A

    2004-11-01

    An HPLC method for the qualitative and quantitative analysis of seven analytes (caffeine, ephedrine, forskolin, icariin, pseudoephedrine, synephrine, and yohimbine) in thermogenic weight loss preparations available on the market is described in this paper. After 45 min the seven analytes were separated and detected in the acetonitrile:water (80:20) extract. The method uses a Waters XTerra RP18 (5 μm particle size) column as the stationary phase, a gradient mobile phase of water (5.0 mM SDS) and acetonitrile, and UV detection at 210 nm. The correlation coefficients for the calibration curves and the recovery rates ranged from 0.994 to 0.999 and from 97.45% to 101.05%, respectively. The qualitative and quantitative results are discussed. PMID:15587578

  6. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.

    PubMed

    Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-08-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, (15)N, (13)C, or (18)O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25-45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314

  7. Fluorescent microscopy approaches of quantitative soil microbial analysis

    NASA Astrophysics Data System (ADS)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    The classical fluorescent microscopy method has been used over recent decades in various microbiological studies of terrestrial ecosystems. The method provides representative results and is simple to apply, which allows its use both as a routine part of large-scale research and in small laboratories. Furthermore, depending on the research targets, many modifications of the fluorescent microscopy method have been established. Combining and comparing several approaches offers an opportunity for quantitative estimation of the soil microbial community. The first analytical part of the study was dedicated to estimating soil bacterial density by fluorescent microscopy over the course of several 30-day experiments. The purpose of the research was to estimate changes in the soil bacterial community in different soil horizons under aerobic and anaerobic conditions after the addition of nutrients in two experimental sets: cellulose and chitin. The nalidixic acid method for inhibiting cell division of gram-negative bacteria was modified, and the method provides quantification of this bacterial group by fluorescent microscopy. The established approach allowed 3-4 times more gram-negative bacterial cells to be counted in soil. The role of actinomycetes in the destruction of soil polymers is traditionally considered dominant compared with the gram-negative bacterial group; however, the quantification of gram-negative bacteria in chernozem and peatland suggests that the classical view underestimates this group. Chitin introduction had no positive effect on changes in gram-negative bacterial population density in chernozem, but this nutrient did produce fast growth dynamics during the first 3 days of the experiment under both aerobic and anaerobic conditions, confirming the chitinolytic activity of gram-negative bacteria in soil organic matter decomposition. In the next part of the research, the modified method for quantifying soil gram-negative bacteria was compared to fluorescent in situ

  8. Comprehensive objective maps of macromolecular conformations by quantitative SAXS analysis

    PubMed Central

    Hura, Greg L.; Budworth, Helen; Dyer, Kevin N.; Rambo, Robert P.; Hammel, Michal

    2013-01-01

    Comprehensive perspectives of macromolecular conformations are required to connect structure to biology. Here we present a small angle X-ray scattering (SAXS) Structural Similarity Map (SSM) and Volatility of Ratio (VR) metric providing comprehensive, quantitative and objective (superposition-independent) perspectives on solution state conformations. We validate VR and SSM utility on human MutSβ, a key ABC ATPase and chemotherapeutic target, by revealing MutSβ DNA sculpting and identifying multiple conformational states for biological activity. PMID:23624664

  9. Spectroscopic and Chemometric Analysis of Binary and Ternary Edible Oil Mixtures: Qualitative and Quantitative Study.

    PubMed

    Jović, Ozren; Smolić, Tomislav; Primožič, Ines; Hrenar, Tomica

    2016-04-19

    The aim of this study was to investigate the feasibility of FTIR-ATR spectroscopy coupled with the multivariate numerical methodology for qualitative and quantitative analysis of binary and ternary edible oil mixtures. Four pure oils (extra virgin olive oil, high oleic sunflower oil, rapeseed oil, and sunflower oil), as well as their 54 binary and 108 ternary mixtures, were analyzed using FTIR-ATR spectroscopy in combination with principal component and discriminant analysis, partial least-squares, and principal component regression. It was found that the composition of all 166 samples can be excellently represented using only the first three principal components describing 98.29% of total variance in the selected spectral range (3035-2989, 1170-1140, 1120-1100, 1093-1047, and 930-890 cm(-1)). Factor scores in 3D space spanned by these three principal components form a tetrahedral-like arrangement: pure oils being at the vertices, binary mixtures at the edges, and ternary mixtures on the faces of a tetrahedron. To confirm the validity of results, we applied several cross-validation methods. Quantitative analysis was performed by minimization of root-mean-square error of cross-validation values regarding the spectral range, derivative order, and choice of method (partial least-squares or principal component regression), which resulted in excellent predictions for test sets (R(2) > 0.99 in all cases). Additionally, experimentally more demanding gas chromatography analysis of fatty acid content was carried out for all specimens, confirming the results obtained by FTIR-ATR coupled with principal component analysis. However, FTIR-ATR provided a considerably better model for prediction of mixture composition than gas chromatography, especially for high oleic sunflower oil. PMID:26971405
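    A minimal sketch of the chemometric workflow described above, assuming placeholder spectra and mixture fractions rather than the published data: PCA for the low-dimensional representation, then PLS with cross-validation for quantitative prediction.

```python
# Hypothetical sketch of an FTIR-ATR chemometric pipeline (placeholder data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(166, 300))           # placeholder spectra, 166 samples x 300 wavenumbers
Y = rng.dirichlet(np.ones(4), size=166)   # placeholder fractions of the 4 oils

# Low-dimensional view: the first three PCs should capture most of the variance
pca = PCA(n_components=3).fit(X)
scores = pca.transform(X)
print(f"variance explained by 3 PCs: {pca.explained_variance_ratio_.sum():.1%}")

# PLS calibration, with cross-validation to estimate prediction quality (RMSECV)
Y_cv = cross_val_predict(PLSRegression(n_components=6), X, Y, cv=10)
rmsecv = np.sqrt(np.mean((Y - Y_cv) ** 2))
print(f"RMSECV = {rmsecv:.3f}")
```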

  10. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  11. Vervets revisited: A quantitative analysis of alarm call structure and context specificity.

    PubMed

    Price, Tabitha; Wadewitz, Philip; Cheney, Dorothy; Seyfarth, Robert; Hammerschmidt, Kurt; Fischer, Julia

    2015-01-01

    The alarm calls of vervet monkeys (Chlorocebus pygerythrus) constitute the classic textbook example of semantic communication in nonhuman animals, as vervet monkeys give acoustically distinct calls to different predators and these calls elicit appropriate responses in conspecifics. They also give similar sounding calls in aggressive contexts, however. Despite the central role the vervet alarm calls have played for understanding the evolution of communication, a comprehensive, quantitative analysis of the acoustic structure of these calls was lacking. We used 2-step cluster analysis to identify objective call types and discriminant function analysis to assess context specificity. Alarm calls given in response to leopards, eagles, and snakes could be well distinguished, while the inclusion of calls given in aggressive contexts yielded some overlap, specifically between female calls given to snakes, eagles and during aggression, as well as between male vervet barks (additionally recorded in South Africa) in leopard and aggressive contexts. We suggest that both cognitive appraisal of the situation and internal state contribute to the variation in call usage and structure. While the semantic properties of vervet alarm calls bear little resemblance to human words, the existing acoustic variation, possibly together with additional contextual information, allows listeners to select appropriate responses. PMID:26286236

  12. Vervets revisited: A quantitative analysis of alarm call structure and context specificity

    PubMed Central

    Price, Tabitha; Wadewitz, Philip; Cheney, Dorothy; Seyfarth, Robert; Hammerschmidt, Kurt; Fischer, Julia

    2015-01-01

    The alarm calls of vervet monkeys (Chlorocebus pygerythrus) constitute the classic textbook example of semantic communication in nonhuman animals, as vervet monkeys give acoustically distinct calls to different predators and these calls elicit appropriate responses in conspecifics. They also give similar sounding calls in aggressive contexts, however. Despite the central role the vervet alarm calls have played for understanding the evolution of communication, a comprehensive, quantitative analysis of the acoustic structure of these calls was lacking. We used 2-step cluster analysis to identify objective call types and discriminant function analysis to assess context specificity. Alarm calls given in response to leopards, eagles, and snakes could be well distinguished, while the inclusion of calls given in aggressive contexts yielded some overlap, specifically between female calls given to snakes, eagles and during aggression, as well as between male vervet barks (additionally recorded in South Africa) in leopard and aggressive contexts. We suggest that both cognitive appraisal of the situation and internal state contribute to the variation in call usage and structure. While the semantic properties of vervet alarm calls bear little resemblance to human words, the existing acoustic variation, possibly together with additional contextual information, allows listeners to select appropriate responses. PMID:26286236

  13. Quantitative analysis of single amino acid variant peptides associated with pancreatic cancer in serum by an isobaric labeling quantitative method.

    PubMed

    Nie, Song; Yin, Haidi; Tan, Zhijing; Anderson, Michelle A; Ruffin, Mack T; Simeone, Diane M; Lubman, David M

    2014-12-01

    Single amino acid variations are highly associated with many human diseases. The direct detection of peptides containing single amino acid variants (SAAVs) derived from nonsynonymous single nucleotide polymorphisms (SNPs) in serum can provide unique opportunities for SAAV associated biomarker discovery. In the present study, an isobaric labeling quantitative strategy was applied to identify and quantify variant peptides in serum samples of pancreatic cancer patients and other benign controls. The largest number of SAAV peptides to date in serum including 96 unique variant peptides were quantified in this quantitative analysis, of which five variant peptides showed a statistically significant difference between pancreatic cancer and other controls (p-value < 0.05). Significant differences in the variant peptide SDNCEDTPEAGYFAVAVVK from serotransferrin were detected between pancreatic cancer and controls, which was further validated by selected reaction monitoring (SRM) analysis. The novel biomarker panel obtained by combining α-1-antichymotrypsin (AACT), Thrombospondin-1 (THBS1) and this variant peptide showed an excellent diagnostic performance in discriminating pancreatic cancer from healthy controls (AUC = 0.98) and chronic pancreatitis (AUC = 0.90). These results suggest that large-scale analysis of SAAV peptides in serum may provide a new direction for biomarker discovery research. PMID:25393578

  14. Additional challenges for uncertainty analysis in river engineering

    NASA Astrophysics Data System (ADS)

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    the proposed intervention. The implicit assumption underlying such analysis is that both models are commensurable. We hypothesize that they are commensurable only to a certain extent. In an idealised study we have demonstrated that prediction performance loss should be expected with increasingly large engineering works. When accounting for parametric uncertainty of floodplain roughness in model identification, we see uncertainty bounds for predicted effects of interventions increase with increasing intervention scale. Calibration of these types of models therefore seems to have a shelf-life, beyond which calibration no longer improves prediction. Therefore a qualification scheme for model use is required that can be linked to model validity. In this study, we characterize model use along three dimensions: extrapolation (using the model with different external drivers), extension (using the model for different output or indicators) and modification (using modified models). Such use of models is expected to have implications for the applicability of surrogate modelling for efficient uncertainty analysis as well, which is recommended for future research. Warmink, J. J.; Straatsma, M. W.; Huthoff, F.; Booij, M. J. & Hulscher, S. J. M. H. 2013. Uncertainty of design water levels due to combined bed form and vegetation roughness in the Dutch river Waal. Journal of Flood Risk Management 6, 302-318. DOI: 10.1111/jfr3.12014

  15. Pharmaceutical development, composition and quantitative analysis of phthalocyanine as the photosensitizer for cancer photodynamic therapy.

    PubMed

    Jiang, Zhou; Shao, Jingwei; Yang, Tingting; Wang, Jian; Jia, Lee

    2014-01-01

    Phthalocyanine (Pc) and its related derivatives are a class of functional materials that are easily activated by light at a specific wavelength. As such a photosensitizer, Pc has been applied to photodynamic therapy (PDT) for both malignant and benign diseases, in addition to its broad applications in many other fields. One of our long-term research focuses is to develop Pc for cancer therapy. Herein we briefly review the mechanisms of action of Pc used for photodynamic therapy, its pharmaceutical development, and molecular modifications to enhance its druggability and improve its intracellular localization. We also describe the current status of the Pc derivatives under clinical investigation and analyze the methods used for quantitative analysis of those Pc derivatives. PMID:23746989

  16. Modulation of phosphofructokinase action by macromolecular interactions. Quantitative analysis of the phosphofructokinase-aldolase-calmodulin system.

    PubMed

    Orosz, F; Christova, T Y; Ovádi, J

    1988-11-23

    The simultaneous effect of calmodulin and aldolase (D-fructose-1,6-bisphosphate D-glyceraldehyde-3-phosphate-lyase, EC 4.1.2.13) on the concentration-dependent behaviour of muscle phosphofructokinase (ATP: D-fructose-6-phosphate 1-phosphotransferase, EC 2.7.1.11) has been analysed by means of a covalently attached fluorescent probe, gel penetration experiments, and using a kinetic approach. We found that calmodulin-induced inactivation of phosphofructokinase is suspended by addition of an equimolar amount of aldolase. This effect was attributed to an apparent competition of calmodulin and aldolase for the dimeric forms of kinase. Moreover, the direct binding of aldolase to calmodulin has also been demonstrated, which resulted in a significant decrease in the kcat value of the enzyme. The quantitative analysis of these interactions in the system phosphofructokinase-calmodulin-aldolase is presented. A possible molecular model for the modulation of phosphofructokinase action by macromolecular interactions is envisaged. PMID:2973356

  17. Quantitative analysis of a wind energy conversion model

    NASA Astrophysics Data System (ADS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-03-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel with up to 15 m s⁻¹ air velocity. The maximum power is 3.4 W; the power conversion factor from kinetic to electric energy is cp = 0.15. The v³ power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively.
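    The quoted power conversion factor can be checked with a short calculation, assuming a standard air density of about 1.2 kg m⁻³ (an assumption; the abstract does not state it).

```python
# Quick arithmetic check of the power conversion factor cp = P_el / P_wind.
import math

rho = 1.2            # air density, kg m^-3 (assumed)
d = 0.12             # rotor diameter, m (from the abstract)
v = 15.0             # wind speed, m s^-1 (from the abstract)
area = math.pi * (d / 2) ** 2
p_wind = 0.5 * rho * area * v ** 3       # kinetic power through the rotor disc
p_electric = 3.4                         # measured electrical output, W
cp = p_electric / p_wind
print(f"P_wind = {p_wind:.1f} W, cp = {cp:.2f}")   # about 0.15, matching the abstract
```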

  18. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptional and general requirements on a software system for quantitative analysis of radiotherapy. Further we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptional problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. PMID:23523366

  19. Quantitative analysis of radiation-induced changes in sperm morphology.

    PubMed

    Young, I T; Gledhill, B L; Lake, S; Wyrobek, A J

    1982-09-01

    When developing spermatogenic cells are exposed to radiation, chemical carcinogens or mutagens, the transformation in the morphology of the mature sperm can be used to determine the severity of the exposure. In this study five groups of mice with three mice per group received testicular doses of X irradiation at dosage levels ranging from 0 rad to 120 rad. A random sample of 100 mature sperm per mouse was analyzed five weeks later for the quantitative morphologic transformation as a function of dosage level. The cells were stained with gallocyanin chrome alum (GCA) so that only the DNA in the sperm head was visible. The ACUity quantitative microscopy system at Lawrence Livermore National Laboratory was used to scan the sperm at a sampling density of 16 points per linear micrometer and with 256 brightness levels per point. The contour of each cell was extracted using conventional thresholding techniques on the high-contrast images. For each contour a variety of shape features was then computed to characterize the morphology of that cell. Using the control group and the distribution of their shape features to establish the variability of a normal sperm population, the 95% limits on normal morphology were established. Using only four shape features, a doubling dose of approximately 39 rad was determined. That is, at 39 rad exposure the percentage of abnormal cells was twice that occurring in the control population. This compared to a doubling dose of approximately 70 rad obtained from a concurrent visual procedure. PMID:6184000
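    A doubling dose of this kind can be estimated from a linear fit of the abnormal-sperm fraction against dose; the sketch below uses hypothetical percentages, not the study's data.

```python
# Hypothetical sketch: doubling dose from a linear dose-response fit.
import numpy as np

doses = np.array([0, 30, 60, 90, 120])                   # rad (design of the study)
pct_abnormal = np.array([4.0, 7.2, 10.1, 13.5, 16.8])    # hypothetical percentages

slope, intercept = np.polyfit(doses, pct_abnormal, 1)
control = intercept                       # fitted abnormal fraction at 0 rad
doubling_dose = control / slope           # dose at which abnormality = 2 x control
print(f"Doubling dose ~ {doubling_dose:.0f} rad")
```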

  20. Quantitative phenotypic analysis of multistress response in Zygosaccharomyces rouxii complex.

    PubMed

    Solieri, Lisa; Dakal, Tikam C; Bicciato, Silvio

    2014-06-01

    Zygosaccharomyces rouxii complex comprises three yeasts clusters sourced from sugar- and salt-rich environments: haploid Zygosaccharomyces rouxii, diploid Zygosaccharomyces sapae and allodiploid/aneuploid strains of uncertain taxonomic affiliations. These yeasts have been characterized with respect to gene copy number variation, karyotype variability and change in ploidy, but functional diversity in stress responses has not been explored yet. Here, we quantitatively analysed the stress response variation in seven strains of the Z. rouxii complex by modelling growth variables via model and model-free fitting methods. Based on the spline fit as most reliable modelling method, we resolved different interstrain responses to 15 environmental perturbations. Compared with Z. rouxii CBS 732(T) and Z. sapae strains ABT301(T) and ABT601, allodiploid strain ATCC 42981 and aneuploid strains CBS 4837 and CBS 4838 displayed higher multistress resistance and better performance in glycerol respiration even in the presence of copper. μ-based logarithmic phenotypic index highlighted that ABT601 is a slow-growing strain insensitive to stress, whereas ABT301(T) grows fast on rich medium and is sensitive to suboptimal conditions. Overall, the differences in stress response could imply different adaptation mechanisms to sugar- and salt-rich niches. The obtained phenotypic profiling contributes to provide quantitative insights for elucidating the adaptive mechanisms to stress in halo- and osmo-tolerant Zygosaccharomyces yeasts. PMID:24533625

  1. Quantitative analysis of laminin 5 gene expression in human keratinocytes.

    PubMed

    Akutsu, Nobuko; Amano, Satoshi; Nishiyama, Toshio

    2005-05-01

    To examine the expression of laminin 5 genes (LAMA3, LAMB3, and LAMC2) encoding the three polypeptide chains alpha3, beta3, and gamma2, respectively, in human keratinocytes, we developed novel quantitative polymerase chain reaction (PCR) methods utilizing Thermus aquaticus DNA polymerase, specific primers, and fluorescein-labeled probes with the ABI PRISM 7700 sequence detector system. Gene expression levels of LAMA3, LAMB3, and LAMC2 and glyceraldehyde-3-phosphate dehydrogenase were quantitated reproducibly and sensitively in the range from 1 x 10(2) to 1 x 10(8) gene copies. Basal gene expression level of LAMB3 was about one-tenth of that of LAMA3 or LAMC2 in human keratinocytes, although there was no clear difference among immunoprecipitated protein levels of alpha3, beta3, and gamma2 synthesized in radio-labeled keratinocytes. Human serum augmented gene expressions of LAMA3, LAMB3, and LAMC2 in human keratinocytes to almost the same extent, and this was associated with an increase of the laminin 5 protein content measured by a specific sandwich enzyme-linked immunosorbent assay. These results demonstrate that the absolute mRNA levels generated from the laminin 5 genes do not determine the translated protein levels of the laminin 5 chains in keratinocytes, and indicate that the expression of the laminin 5 genes may be controlled by common regulation mechanisms. PMID:15854126
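    For quantitative real-time PCR in general, copy numbers are read off a standard curve of Ct versus log copy number; the sketch below illustrates this with synthetic standards spanning the 10(2)-10(8) copy range mentioned above (not the authors' assay).

```python
# Generic qPCR standard-curve sketch with synthetic Ct values.
import numpy as np

log_copies = np.arange(2, 9)                    # 10^2 ... 10^8 copies
ct_standards = 38.0 - 3.32 * (log_copies - 2)   # synthetic Ct values, ~100% efficiency

slope, intercept = np.polyfit(log_copies, ct_standards, 1)
efficiency = 10 ** (-1 / slope) - 1             # amplification efficiency from the slope

ct_unknown = 27.5                               # hypothetical measured Ct
copies = 10 ** ((ct_unknown - intercept) / slope)
print(f"efficiency = {efficiency:.1%}, estimated copies = {copies:.2e}")
```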

  2. Quantitative Analysis of the Enhanced Permeation and Retention (EPR) Effect

    PubMed Central

    Ulmschneider, Martin B.; Searson, Peter C.

    2015-01-01

    Tumor vasculature is characterized by a variety of abnormalities including irregular architecture, poor lymphatic drainage, and the upregulation of factors that increase the paracellular permeability. The increased permeability is important in mediating the uptake of an intravenously administered drug in a solid tumor and is known as the enhanced permeation and retention (EPR) effect. Studies in animal models have demonstrated a cut-off size of 500 nm - 1 µm for molecules or nanoparticles to extravasate into a tumor, however, surprisingly little is known about the kinetics of the EPR effect. Here we present a pharmacokinetic model to quantitatively assess the influence of the EPR effect on the uptake of a drug into a solid tumor. We use pharmacokinetic data for Doxil and doxorubicin from human clinical trials to illustrate how the EPR effect influences tumor uptake. This model provides a quantitative framework to guide preclinical trials of new chemotherapies and ultimately to develop design rules that can increase targeting efficiency and decrease unwanted side effects in normal tissue. PMID:25938565
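    A minimal sketch of one possible two-compartment (plasma/tumor) formulation of EPR-driven uptake, with assumed rate constants; the published pharmacokinetic model may differ in structure and parameters.

```python
# Hypothetical two-compartment EPR uptake model: first-order plasma
# clearance plus permeability-limited extravasation into the tumor.
import numpy as np
from scipy.integrate import solve_ivp

k_el = 0.05     # plasma elimination rate, 1/h (assumed)
k_in = 0.01     # extravasation into tumor, 1/h (assumed)
k_out = 0.002   # drainage back out of tumor, 1/h (assumed)

def model(t, y):
    c_plasma, c_tumor = y
    dc_plasma = -k_el * c_plasma - k_in * c_plasma + k_out * c_tumor
    dc_tumor = k_in * c_plasma - k_out * c_tumor
    return [dc_plasma, dc_tumor]

sol = solve_ivp(model, (0, 96), [1.0, 0.0], dense_output=True)
t = np.linspace(0, 96, 97)
c_plasma, c_tumor = sol.sol(t)
print(f"peak tumor level: {c_tumor.max():.3f} at t = {t[c_tumor.argmax()]:.0f} h")
```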

  3. Quantitative analysis of thermal spray deposits using stereology

    SciTech Connect

    Leigh, S.H.; Sampath, S.; Herman, H.; Berndt, C.C.; Montavon, G.; Coddet, C.

    1995-12-31

    Stereology deals with protocols for describing a 3-D space, when only 2-D sections through solid bodies are available. This paper describes a stereological characterization of the microstructure of a thermal spray deposit. The aim of this work is to present results on the stereological characterization of a thermal spray deposit, using two approaches known as DeHoff's and Cruz-Orive's protocols. The individual splats are assumed to have an oblate spheroidal shape. The splat size distribution and elongation ratio distribution of splats are calculated using quantitative information from 2-D plane sections. The stereological methods are implemented to investigate the microstructure of a water stabilized plasma spray-formed Al₂O₃-13 wt.% TiO₂. Results are obtained with both protocols. The splat sizes range from 0 to 60 μm and shape factors from 0.4 to 1.0. The splats within the deposit seem to be much smaller and thicker (i.e., lower spreading) than those of the first layer deposited onto the substrate. The approach described in this work provides helpful quantitative information on the 3-D microstructure of thermal spray deposit.

  4. Compositional GC-FID analysis of the additives to PVC, focusing on the gaskets of lids for glass jars.

    PubMed

    Biedermann-Brem, Sandra; Biedermann, Maurus; Fiselier, Katell; Grob, Koni

    2005-12-01

    A gas chromatographic (FID) method is described which aims at the quantitative compositional analysis of the additives in plasticized PVC, particularly the plastisols used as gaskets for lids of glass jars. An extract of the PVC is analysed directly as well as after transesterification to ethyl esters. Transesterification enables the analysis of epoxidized soya bean and linseed oil (ESBO and ELO) as well as polyadipates. For most other additives, the shifts in the chromatogram resulting from transesterification are used to confirm the identifications made by direct analysis. In the gaskets of 69 lids from the European market used for packaging oily foods, a broad variety of plastisol compositions was found, many or possibly all of which do not comply with legal requirements. In 62% of these lids, ESBO was the principal plasticizer, whereas in 25% a phthalate had been used. PMID:16356892

  5. Quantitative analysis of complex three-dimensional microstructures

    NASA Astrophysics Data System (ADS)

    Genau, Amber Lynn

    The morphological evolution due to coarsening is analyzed for two distinctive types of microstructure. First, the feasibility of characterizing spatial correlations of interfacial curvature in topologically complex structures is demonstrated with the analysis of bicontinuous two-phase mixtures produced using phase field modeling. For structures produced with both conserved and nonconserved dynamics, new characteristic length scales are identified. In the nonconserved case, despite the local evolution law governing interfacial motion, long-range correlations develop that lead to a characteristic length scale associated with the distance between high curvature tunnels. In the conserved case the diffusional dynamics leads to a length scale that is related to correlations and anticorrelations between regions of curvature of opposite sign. Positive correlations due to this length scale can be measured out to seven times the characteristic length of the system. Spatial correlations are also compared for symmetric and asymmetric mixtures produced with conserved dynamics. In addition, the microstructure of directionally solidified and isothermally coarsened Pb-Sn samples are examined at various coarsening times. The samples, composed of Pb-69.1wt%Sn, have an overall volume fraction of 22% solid which is not uniformly distributed through the sample but clustered into regions of approximately 37% solid separated by empty eutectic regions. The morphology of the dendrites, both in the dense regions and at the edge of the eutectic spaces is analyzed using three-dimensional reconstructions, Interface Shape Distributions and Interface Normal Distributions. These methods are used to track the evolution of the structures from being dominated by secondary and tertiary arms in the plane perpendicular to the solidification direction to predominance of the primary stalks running in the solidification direction. Finally, the method of characterizing spatial correlations introduced above

  6. Quantitative laser-induced breakdown spectroscopy analysis of calcified tissue samples

    NASA Astrophysics Data System (ADS)

    Samek, O.; Beddows, D. C. S.; Telle, H. H.; Kaiser, J.; Liška, M.; Cáceres, J. O.; Gonzáles Ureña, A.

    2001-06-01

    We report on the application of laser-induced breakdown spectroscopy (LIBS) to the analysis of important minerals and the accumulation of potentially toxic elements in calcified tissue, to trace, e.g., the influence of environmental exposure and other medical or biological factors. This theme was exemplified for quantitative detection and mapping of Al, Pb and Sr in representative samples, including teeth (first teeth of infants, second teeth of children and teeth of adults) and bones (tibia and femur). In addition to identifying and quantifying major and trace elements in the tissues, one- and two-dimensional profiles and maps were generated. Such maps (a) provide time/concentration relations, (b) allow the mineralisation of the hydroxyapatite matrix and the migration of elements within it to be followed, and (c) enable disease states, such as caries in teeth, to be identified. In order to obtain quantitative calibration, reference samples in the form of pressed pellets with calcified tissue-equivalent material (the majority compound of the pellets is CaCO₃) were used whose physical properties closely resembled hydroxyapatite. Compounds of Al, Sr and Pb were added to the pellets, containing atomic concentrations in the range 100-10 000 ppm relative to the Ca content of the matrix. Analytical results based on this calibration against artificial samples for the trace elements under investigation agree with literature values, and with our atomic absorption spectroscopy (AAS) cross-validation measurements.

  7. Quantitative Spectral Morphology Analysis of Unusually Red and Blue L Dwarfs

    NASA Astrophysics Data System (ADS)

    Camnasio, Sara; Khalida Alam, Munazza; Rice, Emily L.; Cruz, Kelle L.; Faherty, Jacqueline K.; Mace, Gregory N.; Martin, Emily; Logsdon, Sarah E.; McLean, Ian S.; Brown Dwarfs in New York City (BDNYC)

    2016-01-01

    In an effort to constrain the properties of photometric color outliers, we present a quantitative spectral morphology analysis of medium-resolution NIRSPEC (R~2000), SpeX cross-dispersed (R~2000), Palomar TripleSpec (R~2600), and Magellan FIRE (R~6000) J-band spectra for a sample of unusually red and blue L dwarfs. Some red L dwarfs are low surface gravity, young objects whose spectra present weak Na I doublets and FeH absorption bands, but strong VO features (Cruz et al. 2009). Some blue L dwarfs are subdwarfs with low metallicity spectral features such as greater H2 absorption, stronger metal hydride bands, and enhanced TiO absorption (Burgasser et al. 2008c). We fit 3rd-order polynomials to the pseudo-continuum in order to provide a quantitative comparison of spectral morphology with other peculiar L dwarfs, field standards, young L dwarfs, and L subdwarfs. The results indicate that the coefficients of the fit correlate with spectral type, but are independent of color. This newly found trend provides a parameter which can be utilized as an additional tool in characterizing quantifiable differences in the spectra of brown dwarfs. Furthermore, this method can be applied in studying the atmospheric properties of exoplanets, given their similarities with brown dwarfs in mass and photospheric properties.
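    The pseudo-continuum fit lends itself to a very short sketch: normalize a J-band spectrum and keep the coefficients of a 3rd-order polynomial fit as morphology parameters. The wavelength range and spectrum below are placeholders, not BDNYC data.

```python
# Illustrative pseudo-continuum fit for a J-band spectrum (placeholder data).
import numpy as np

wavelength = np.linspace(1.15, 1.33, 400)        # microns, approximate J band (assumed range)
flux = 1.0 + 0.3 * (wavelength - 1.24) + 0.05 * np.random.randn(wavelength.size)

flux_norm = flux / np.nanmedian(flux)            # normalize before fitting
coeffs = np.polyfit(wavelength, flux_norm, 3)    # 3rd-order pseudo-continuum coefficients
print("continuum coefficients:", coeffs)         # compared across objects / spectral types
```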

  8. A comparison of the effects of PCR inhibition in quantitative PCR and forensic STR analysis.

    PubMed

    Funes-Huacca, Maribel E; Opel, Kerry; Thompson, Robyn; McCord, Bruce R

    2011-04-01

    In this paper we compare the effects of three representative PCR inhibitors using quantitative PCR (qPCR) and multiplex STR amplification in order to determine the effect of inhibitor concentration on allele dropout and to develop better ways to interpret forensic DNA data. We have used humic acid, collagen and calcium phosphate at different concentrations to evaluate the profiles of alleles inhibited in these amplifications. These data were correlated with previously obtained results from quantitative PCR including melt curve effects, efficiency changes and cycle threshold (Ct) values. Overall, the data show that there are two competing processes that result from PCR inhibition. The first process is a general loss of larger alleles. This appears to occur with all inhibitors. The second process is more sequence specific and occurs when the inhibitor binds DNA, altering the cycle threshold and the melt curve. This sequence-specific inhibition results in patterns of allele loss that occur in addition to the overall loss of larger alleles. The data demonstrate the applicability of utilizing real-time PCR results to predict the presence of certain types of PCR inhibition in STR analysis. PMID:21462225

  9. Quantitative Analysis and Modeling Probe Polarity Establishment in C. elegans Embryos

    PubMed Central

    Blanchoud, Simon; Busso, Coralie; Naef, Félix; Gönczy, Pierre

    2015-01-01

    Cell polarity underlies many aspects of metazoan development and homeostasis, and relies notably on a set of PAR proteins located at the cell cortex. How these proteins interact in space and time remains incompletely understood. We performed a quantitative assessment of polarity establishment in one-cell stage Caenorhabditis elegans embryos by combining time-lapse microscopy and image analysis. We used our extensive data set to challenge and further specify an extant mathematical model. Using likelihood-based calibration, we uncovered that cooperativity is required for both anterior and posterior PAR complexes. Moreover, we analyzed the dependence of polarity establishment on changes in size or temperature. The observed robustness of PAR domain dimensions in embryos of different sizes is in agreement with a model incorporating fixed protein concentrations and variations in embryo surface/volume ratio. In addition, we quantified the dynamics of polarity establishment over most of the viable temperatures range of C. elegans. Modeling of these data suggests that diffusion of PAR proteins is the process most affected by temperature changes, although cortical flows appear unaffected. Overall, our quantitative analytical framework provides insights into the dynamics of polarity establishment in a developing system. PMID:25692585

  10. Bridging the gaps for global sustainable development: a quantitative analysis.

    PubMed

    Udo, Victor E; Jansson, Peter Mark

    2009-09-01

    Global human progress occurs in a complex web of interactions between society, technology and the environment as driven by governance and infrastructure management capacity among nations. In our globalizing world, this complex web of interactions over the last 200 years has resulted in the chronic widening of economic and political gaps between the haves and the have-nots with consequential global cultural and ecosystem challenges. At the bottom of these challenges is the issue of resource limitations on our finite planet with increasing population. The problem is further compounded by pleasure-driven and poverty-driven ecological depletion and pollution by the haves and the have-nots, respectively. These challenges are explored quantitatively in this paper as global sustainable development (SD), in order to assess the gaps that need to be bridged. Although there has been significant rhetoric on SD with very many qualitative definitions offered, very few quantitative definitions of SD exist. The few that do exist tend to measure SD in terms of social, energy, economic and environmental dimensions. In our research, we used several human survival, development, and progress variables to create an aggregate SD parameter that describes the capacity of nations in three dimensions: social sustainability, environmental sustainability and technological sustainability. Using our proposed quantitative definition of SD and data from relatively reputable secondary sources, 132 nations were ranked and compared. Our comparisons indicate a global hierarchy of needs among nations similar to Maslow's at the individual level. As in Maslow's hierarchy of needs, nations that are struggling to survive are less concerned with environmental sustainability than advanced and stable nations. Nations such as the United States, Canada, Finland, Norway and others have higher SD capacity, and thus, are higher on their hierarchy of needs than nations such as Nigeria, Vietnam, Mexico and other

  11. Quantitative inverse modelling of a cylindrical object in the laboratory using ERT: An error analysis

    NASA Astrophysics Data System (ADS)

    Korteland, Suze-Anne; Heimovaara, Timo

    2015-03-01

    Electrical resistivity tomography (ERT) is a geophysical technique that can be used to obtain three-dimensional images of the bulk electrical conductivity of the subsurface. Because the electrical conductivity is strongly related to properties of the subsurface and the flow of water it has become a valuable tool for visualization in many hydrogeological and environmental applications. In recent years, ERT is increasingly being used for quantitative characterization, which requires more detailed prior information than a conventional geophysical inversion for qualitative purposes. In addition, the careful interpretation of measurement and modelling errors is critical if ERT measurements are to be used in a quantitative way. This paper explores the quantitative determination of the electrical conductivity distribution of a cylindrical object placed in a water bath in a laboratory-scale tank. Because of the sharp conductivity contrast between the object and the water, a standard geophysical inversion using a smoothness constraint could not reproduce this target accurately. Better results were obtained by using the ERT measurements to constrain a model describing the geometry of the system. The posterior probability distributions of the parameters describing the geometry were estimated with the Markov chain Monte Carlo method DREAM(ZS). Using the ERT measurements this way, accurate estimates of the parameters could be obtained. The information quality of the measurements was assessed by a detailed analysis of the errors. Even for the uncomplicated laboratory setup used in this paper, errors in the modelling of the shape and position of the electrodes and the shape of the domain could be identified. The results indicate that the ERT measurements have a high information content which can be accessed by the inclusion of prior information and the consideration of measurement and modelling errors.

  12. Quantitative Selection Analysis of Bacteriophage φCbK Susceptibility in Caulobacter crescentus.

    PubMed

    Christen, Matthias; Beusch, Christian; Bösch, Yvonne; Cerletti, Dario; Flores-Tinoco, Carlos Eduardo; Del Medico, Luca; Tschan, Flavia; Christen, Beat

    2016-01-29

    Classical molecular genetics uses stringent selective conditions to identify mutants with distinct phenotypic responses. Mutations giving rise to less pronounced phenotypes are often missed. However, to gain systems-level insights into complex genetic interaction networks requires genome-wide assignment of quantitative phenotypic traits. In this paper, we present a quantitative selection approach coupled with transposon sequencing (QS-TnSeq) to globally identify the cellular components that orchestrate susceptibility of the cell cycle model bacterium Caulobacter crescentus toward bacteriophage φCbK infection. We found that 135 genes representing 3.30% of the Caulobacter genome exhibit significant accumulation of transposon insertions upon φCbK selection. More than 85% thereof consist of new factors not previously associated with phage φCbK susceptibility. Using hierarchical clustering of dose-dependent TnSeq datasets, we grouped these genes into functional modules that correlate with different stages of the φCbK infection process. We assign φCbK susceptibility to eight new genes that represent novel components of the pilus secretion machinery. Further, we demonstrate that, from 86 motility genes, only seven genes encoding structural and regulatory components of the flagellar hook increase phage resistance when disrupted by transposons, suggesting a link between flagellar hook assembly and pili biogenesis. In addition, we observe high recovery of Tn5 insertions within regulatory sequences of the genes encoding the essential NADH:ubiquinone oxidoreductase complex indicating that intact proton motive force is crucial for effective phage propagation. In sum, QS-TnSeq is broadly applicable to perform quantitative and genome-wide systems-genetics analysis of complex phenotypic traits. PMID:26593064

  13. [Quantitative analysis of alloy steel based on laser induced breakdown spectroscopy with partial least squares method].

    PubMed

    Cong, Zhi-Bo; Sun, Lan-Xiang; Xin, Yong; Li, Yang; Qi, Li-Feng; Yang, Zhi-Jia

    2014-02-01

    In the present paper, both the partial least squares (PLS) method and the calibration curve (CC) method are used to quantitatively analyze laser-induced breakdown spectroscopy data obtained from standard alloy steel samples. Both major and trace elements were quantitatively analyzed. Comparing the results of the two calibration methods yielded some useful conclusions: for major elements, the PLS method performs better than the CC method in quantitative analysis; more importantly, for trace elements, the CC method cannot give quantitative results because of the extremely weak characteristic spectral lines, whereas the PLS method retains good quantitative capability. In addition, the regression coefficients of the PLS model are compared with the original spectral data, including background interference, to explain the advantage of the PLS method in LIBS quantitative analysis. The results show that the PLS method applied to laser-induced breakdown spectroscopy is suitable for quantitative analysis of trace elements such as C in the metallurgical industry. PMID:24822436
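    The contrast between the two calibration approaches can be sketched as follows, using synthetic spectra and concentrations (not the paper's data): a univariate calibration curve built from a single line intensity versus a PLS model built on the full spectrum.

```python
# Hypothetical comparison of a univariate calibration curve vs. PLS for LIBS.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
n_samples, n_channels = 20, 500
conc = np.linspace(0.01, 1.0, n_samples)                  # reference concentrations (wt%)
spectra = rng.normal(0.0, 0.05, (n_samples, n_channels))  # synthetic noisy spectra
spectra[:, 250] += conc                                    # analyte line buried in the noise

# Calibration-curve approach: single channel intensity vs. concentration
slope, intercept = np.polyfit(conc, spectra[:, 250], 1)
cc_pred = (spectra[:, 250] - intercept) / slope

# PLS approach: whole spectrum as predictors, cross-validated
pls_pred = cross_val_predict(PLSRegression(n_components=3), spectra, conc, cv=5).ravel()

for name, pred in [("CC", cc_pred), ("PLS", pls_pred)]:
    rmse = np.sqrt(np.mean((pred - conc) ** 2))
    print(f"{name}: RMSE = {rmse:.3f}")
```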

  14. Quantitative analysis on electric dipole energy in Rashba band splitting

    NASA Astrophysics Data System (ADS)

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-09-01

    We report on quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight binding calculations on p-orbitals with spin-orbit coupling. First-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba split bands which are induced by the orbital angular momentum. We calculated the electric dipole energies from coupling of the asymmetric charge distribution and external electric field, and compared it to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both Bi and Sb systems. A perturbative approach for long wave length limit starting from tight binding calculation also supports that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime.

  15. Quantitative error analysis for computer assisted navigation: a feasibility study

    PubMed Central

    Güler, Ö.; Perwög, M.; Kral, F.; Schwarm, F.; Bárdosi, Z. R.; Göbel, G.; Freysinger, W.

    2013-01-01

    Purpose The benefit of computer-assisted navigation depends on the registration process, in which patient features are correlated to some preoperative imagery. The operator-induced uncertainty in localizing patient features, the User Localization Error (ULE), is unknown and most likely dominates the application accuracy. This initial feasibility study aims at providing first data for the ULE with a research navigation system. Methods Active optical navigation was done in CT images of a plastic skull, an anatomic specimen (both with implanted fiducials) and a volunteer with anatomical landmarks exclusively. Each object was registered ten times with 3, 5, 7, and 9 registration points. Measurements were taken at 10 targets (anatomic specimen and volunteer) and 11 targets (plastic skull). The active NDI Polaris system was used under ideal working conditions (tracking accuracy 0.23 mm root mean square, RMS; probe tip calibration was 0.18 mm RMS). Variances of tracking along the principal directions were measured as 0.18 mm², 0.32 mm², and 0.42 mm². The ULE was calculated from predicted application accuracy with isotropic and anisotropic models and from experimental variances, respectively. Results The ULE was determined from the variances as 0.45 mm (plastic skull), 0.60 mm (anatomic specimen), and 4.96 mm (volunteer). The predicted application accuracy did not yield consistent values for the ULE. Conclusions Quantitative data of application accuracy could be tested against prediction models with iso- and anisotropic noise models and revealed some discrepancies. This could potentially be due to the facts that navigation and one prediction model wrongly assume isotropic noise (tracking is anisotropic), while the anisotropic noise prediction model assumes an anisotropic registration strategy (registration is isotropic in typical navigation systems). The ULE data are presumably the first quantitative values for the precision of localizing anatomical landmarks and implanted fiducials

  16. Quantitative and sensitive analysis of CN molecules using laser induced low pressure He plasma

    NASA Astrophysics Data System (ADS)

    Pardede, Marincan; Hedwig, Rinda; Abdulmadjid, Syahrun Nur; Lahna, Kurnia; Idris, Nasrullah; Jobiliong, Eric; Suyanto, Hery; Marpaung, Alion Mangasi; Suliyanti, Maria Margaretha; Ramli, Muliadi; Tjia, May On; Lie, Tjung Jie; Lie, Zener Sukra; Kurniawan, Davy Putra; Kurniawan, Koo Hendrik; Kagawa, Kiichiro

    2015-03-01

    We report the results of experimental study on CN 388.3 nm and C I 247.8 nm emission characteristics using 40 mJ laser irradiation with He and N2 ambient gases. The results obtained with N2 ambient gas show undesirable interference effect between the native CN emission and the emission of CN molecules arising from the recombination of native C ablated from the sample with the N dissociated from the ambient gas. This problem is overcome by the use of He ambient gas at low pressure of 2 kPa, which also offers the additional advantages of cleaner and stronger emission lines. The result of applying this favorable experimental condition to emission spectrochemical measurement of milk sample having various protein concentrations is shown to yield a close to linear calibration curve with near zero extrapolated intercept. Additionally, a low detection limit of 5 μg/g is found in this experiment, making it potentially applicable for quantitative and sensitive CN analysis. The visibility of laser induced breakdown spectroscopy with low pressure He gas is also demonstrated by the result of its application to spectrochemical analysis of fossil samples. Furthermore, with the use of CO2 ambient gas at 600 Pa mimicking the Mars atmosphere, this technique also shows promising applications to exploration in Mars.

  17. Quantitative and sensitive analysis of CN molecules using laser induced low pressure He plasma

    SciTech Connect

    Pardede, Marincan; Hedwig, Rinda; Abdulmadjid, Syahrun Nur; Lahna, Kurnia; Idris, Nasrullah; Ramli, Muliadi; Jobiliong, Eric; Suyanto, Hery; Marpaung, Alion Mangasi; Suliyanti, Maria Margaretha; Tjia, May On

    2015-03-21

    We report the results of experimental study on CN 388.3 nm and C I 247.8 nm emission characteristics using 40 mJ laser irradiation with He and N₂ ambient gases. The results obtained with N₂ ambient gas show undesirable interference effect between the native CN emission and the emission of CN molecules arising from the recombination of native C ablated from the sample with the N dissociated from the ambient gas. This problem is overcome by the use of He ambient gas at low pressure of 2 kPa, which also offers the additional advantages of cleaner and stronger emission lines. The result of applying this favorable experimental condition to emission spectrochemical measurement of milk sample having various protein concentrations is shown to yield a close to linear calibration curve with near zero extrapolated intercept. Additionally, a low detection limit of 5 μg/g is found in this experiment, making it potentially applicable for quantitative and sensitive CN analysis. The visibility of laser induced breakdown spectroscopy with low pressure He gas is also demonstrated by the result of its application to spectrochemical analysis of fossil samples. Furthermore, with the use of CO₂ ambient gas at 600 Pa mimicking the Mars atmosphere, this technique also shows promising applications to exploration in Mars.

  18. Response Neighborhoods in Online Learning Networks: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Aviv, Reuven; Erlich, Zippy; Ravid, Gilad

    2005-01-01

    The theoretical foundations of response mechanisms in networks of online learners are revealed by statistical analysis of p* Markov models for the networks. Our comparative analysis of two networks shows that the minimal-effort hunt-for-social-capital mechanism controls a major behavior of both networks: a negative tendency to respond. Differences in…

  19. [Research progress of quantitative analysis for respiratory sinus arrhythmia].

    PubMed

    Sun, Congcong; Zhang, Zhengbo; Wang, Buqing; Liu, Hongyun; Ang, Qing; Wang, Weidong

    2011-12-01

    Respiratory sinus arrhythmia (RSA) refers to fluctuations of heart rate associated with breathing. It has recently been used increasingly as a noninvasive index of cardiac vagal tone in psychophysiological research. Its analysis is often influenced or distorted by respiratory parameters, posture, activity, and other factors. This paper reviews five methods of quantification: the root mean square of successive differences (RMSSD), peak-valley RSA (pvRSA), cosinor fitting, spectral analysis, and joint time-frequency analysis (JTFA). Paced breathing, analysis of covariance, the residual method, and msRSA per liter of tidal volume are also discussed as strategies for adjusting the measurement and analysis of RSA. Finally, some prospects for solving open problems in RSA research are given. PMID:22295719
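    Of the five quantification methods listed, RMSSD is the simplest to state; a minimal sketch with a hypothetical R-R interval series follows.

```python
# RMSSD from a series of R-R intervals (hypothetical values, in milliseconds).
import numpy as np

rr_ms = np.array([812, 845, 790, 830, 865, 800, 820])   # hypothetical R-R intervals
diffs = np.diff(rr_ms)                                   # successive differences
rmssd = np.sqrt(np.mean(diffs ** 2))
print(f"RMSSD = {rmssd:.1f} ms")
```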

  20. Quantitative analysis of numerical solvers for oscillatory biomolecular system models

    PubMed Central

    Quo, Chang F; Wang, May D

    2008-01-01

    Background This article provides guidelines for selecting optimal numerical solvers for biomolecular system models. Because various parameters of the same system could have drastically different ranges from 10⁻¹⁵ to 10¹⁰, the ODEs can be stiff and ill-conditioned, resulting in non-unique, non-existing, or non-reproducible modeling solutions. Previous studies have not examined in depth how to best select numerical solvers for biomolecular system models, which makes it difficult to experimentally validate the modeling results. To address this problem, we have chosen one of the well-known stiff initial value problems with limit cycle behavior as a test-bed system model. Solving this model, we have illustrated that different answers may result from different numerical solvers. We use MATLAB numerical solvers because they are optimized and widely used by the modeling community. We have also conducted a systematic study of numerical solver performances by using qualitative and quantitative measures such as convergence, accuracy, and computational cost (i.e. in terms of function evaluation, partial derivative, LU decomposition, and "take-off" points). The results show that the modeling solutions can be drastically different using different numerical solvers. Thus, it is important to intelligently select numerical solvers when solving biomolecular system models. Results The classic Belousov-Zhabotinskii (BZ) reaction is described by the Oregonator model and is used as a case study. We report two guidelines in selecting optimal numerical solver(s) for stiff, complex oscillatory systems: (i) for problems with unknown parameters, ode45 is the optimal choice regardless of the relative error tolerance; (ii) for known stiff problems, both ode113 and ode15s are good choices under strict relative tolerance conditions. Conclusions For any given biomolecular model, by building a library of numerical solvers with quantitative performance assessment metric, we show that it is possible
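
    For a rough sense of how such a solver comparison can be carried out, the sketch below (in Python with SciPy rather than the MATLAB solvers named in the abstract; the solver names, tolerances and the Hairer–Wanner parameterization of the Oregonator are assumptions of this illustration) times several stiff-capable integrators on the Oregonator model and reports their cost in function evaluations, Jacobian evaluations and LU decompositions.

      # Sketch: compare stiff-capable SciPy solvers on the Oregonator (BZ reaction)
      # model and report cost counters analogous to those discussed in the abstract.
      import time
      from scipy.integrate import solve_ivp

      def oregonator(t, y):
          # Standard Hairer & Wanner parameterization (an assumption, not from the paper)
          y1, y2, y3 = y
          return [77.27 * (y2 + y1 * (1.0 - 8.375e-6 * y1 - y2)),
                  (y3 - (1.0 + y1) * y2) / 77.27,
                  0.161 * (y1 - y3)]

      y0, t_span = [1.0, 2.0, 3.0], (0.0, 360.0)
      for method in ("Radau", "BDF", "LSODA"):   # explicit RK45 would be impractically slow here
          start = time.perf_counter()
          sol = solve_ivp(oregonator, t_span, y0, method=method, rtol=1e-6, atol=1e-9)
          print(f"{method:5s}  nfev={sol.nfev}  njev={sol.njev}  nlu={sol.nlu}  "
                f"time={time.perf_counter() - start:.2f}s  success={sol.success}")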

  1. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1990-01-01

    In the study of the dynamics and kinematics of the human body, a wide variety of technologies was developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development coupled with recent advances in video technology have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on orbit intravehicular and extravehicular activities. The system is described.

  2. Space-to-Ground Communication for Columbus: A Quantitative Analysis

    PubMed Central

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of such team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers to efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions. PMID:26290898

  3. Quantitative analysis on electric dipole energy in Rashba band splitting.

    PubMed

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-01-01

    We report on quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight binding calculations on p-orbitals with spin-orbit coupling. First-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba split bands which are induced by the orbital angular momentum. We calculated the electric dipole energies from coupling of the asymmetric charge distribution and external electric field, and compared it to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both Bi and Sb systems. A perturbative approach for long wave length limit starting from tight binding calculation also supports that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime. PMID:26323493

  4. Quantitative analysis on electric dipole energy in Rashba band splitting

    PubMed Central

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-01-01

    We report on quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight binding calculations on p-orbitals with spin-orbit coupling. First-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba split bands which are induced by the orbital angular momentum. We calculated the electric dipole energies from coupling of the asymmetric charge distribution and external electric field, and compared it to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both Bi and Sb systems. A perturbative approach for long wave length limit starting from tight binding calculation also supports that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime. PMID:26323493

  5. Quantitative analysis of ultrasound images for computer-aided diagnosis.

    PubMed

    Wu, Jie Ying; Tuomi, Adam; Beland, Michael D; Konrad, Joseph; Glidden, David; Grand, David; Merck, Derek

    2016-01-01

    We propose an adaptable framework for analyzing ultrasound (US) images quantitatively to provide computer-aided diagnosis using machine learning. Our preliminary clinical targets are hepatic steatosis, adenomyosis, and craniosynostosis. For steatosis and adenomyosis, we collected US studies from 288 and 88 patients, respectively, as well as their biopsy or magnetic resonance-confirmed diagnosis. Radiologists identified a region of interest (ROI) on each image. We filtered the US images for various texture responses and used the pixel intensity distribution within each ROI as feature parameterizations. Our craniosynostosis dataset consisted of 22 CT-confirmed cases and 22 age-matched controls. One physician manually measured the vectors from the center of the skull to the outer cortex at every 10 deg for each image and we used the principal directions as shape features for parameterization. These parameters and the known diagnosis were used to train classifiers. Testing with cross-validation, we obtained 72.74% accuracy and 0.71 area under receiver operating characteristics curve for steatosis ([Formula: see text]), 77.27% and 0.77 for adenomyosis ([Formula: see text]), and 88.63% and 0.89 for craniosynostosis ([Formula: see text]). Our framework is able to detect a variety of diseases with high accuracy. We hope to include it as a routinely available support system in the clinic. PMID:26835502
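
    The evaluation step described above, training a classifier on per-ROI feature vectors and reporting cross-validated accuracy and area under the ROC curve, can be sketched as follows. The feature extraction is not reproduced; the feature matrix, labels and classifier choice are placeholders, not the authors' pipeline.

      # Sketch: cross-validated accuracy and ROC AUC for a classifier trained on
      # per-ROI feature vectors (synthetic placeholder data).
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(288, 16))      # e.g. texture-response intensity features per ROI
      y = rng.integers(0, 2, size=288)    # biopsy/MR-confirmed labels (placeholder)

      clf = LogisticRegression(max_iter=1000)
      acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
      auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
      print(f"cross-validated accuracy = {acc:.3f}, AUC = {auc:.3f}")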

  6. Quantitative proteomic analysis of the Salmonella-lettuce interaction

    PubMed Central

    Zhang, Yuping; Nandakumar, Renu; Bartelt-Hunt, Shannon L; Snow, Daniel D; Hodges, Laurie; Li, Xu

    2014-01-01

    Human pathogens can internalize into food crops through root and surface uptake and persist inside crop plants. The goal of the study was to elucidate the global modulation of bacteria and plant protein expression after Salmonella internalizes lettuce. A quantitative proteomic approach was used to analyse the protein expression of Salmonella enterica serovar Infantis and lettuce cultivar Green Salad Bowl 24 h after infiltrating S. Infantis into lettuce leaves. Among the 50 differentially expressed proteins identified by comparing internalized S. Infantis against S. Infantis grown in Luria Broth, proteins involved in glycolysis were down-regulated, while one protein involved in ascorbate uptake was up-regulated. Stress response proteins, especially antioxidant proteins, were up-regulated. The modulation in protein expression suggested that internalized S. Infantis might utilize ascorbate as a carbon source and require multiple stress response proteins to cope with stresses encountered in plants. On the other hand, among the 20 differentially expressed lettuce proteins, proteins involved in defense response to bacteria were up-regulated. Moreover, the secreted effector PipB2 of S. Infantis and R proteins of lettuce were induced after bacterial internalization into lettuce leaves, indicating that the human pathogen S. Infantis triggered the defense mechanisms of lettuce, which normally respond to plant pathogens. PMID:24512637

  7. Quantitative analysis of pheromone-binding protein specificity

    PubMed Central

    Katti, S.; Lokhande, N.; González, D.; Cassill, A.; Renthal, R.

    2012-01-01

    Many pheromones have very low water solubility, posing experimental difficulties for quantitative binding measurements. A new method is presented for determining thermodynamically valid dissociation constants for ligands binding to odorant-binding proteins (OBPs), using β-cyclodextrin as a solubilizer and transfer agent. The method is applied to LUSH, a Drosophila OBP that binds the pheromone 11-cis vaccenyl acetate (cVA). Refolding of LUSH expressed in E. coli was assessed by measuring N-phenyl-1-naphthylamine (NPN) binding and Förster resonance energy transfer between LUSH tryptophan 123 (W123) and NPN. Binding of cVA was measured from quenching of W123 fluorescence as a function of cVA concentration. The equilibrium constant for transfer of cVA between β-cyclodextrin and LUSH was determined from a linked equilibria model. This constant, multiplied by the β-cyclodextrin-cVA dissociation constant, gives the LUSH-cVA dissociation constant: ~100 nM. It was also found that other ligands quench W123 fluorescence. The LUSH-ligand dissociation constants were determined to be ~200 nM for the silk moth pheromone bombykol and ~90 nM for methyl oleate. The results indicate that the ligand-binding cavity of LUSH can accommodate a variety of ligands with strong binding interactions. Implications of this for the pheromone receptor model proposed by Laughlin et al. (Cell 133: 1255–65, 2008) are discussed. PMID:23121132

  8. Analysis of alpha-synuclein-associated proteins by quantitative proteomics.

    PubMed

    Zhou, Yong; Gu, Guangyu; Goodlett, David R; Zhang, Terry; Pan, Catherine; Montine, Thomas J; Montine, Kathleen S; Aebersold, Ruedi H; Zhang, Jing

    2004-09-10

    To identify the proteins associated with soluble alpha-synuclein (AS) that might promote AS aggregation, a key event leading to neurodegeneration, we quantitatively compared protein profiles of AS-associated protein complexes in MES cells exposed to rotenone, a pesticide that produces parkinsonism in animals and induces Lewy body (LB)-like inclusions in the remaining dopaminergic neurons, and to vehicle. We identified more than 250 proteins associated with Nonidet P-40 soluble AS, and demonstrated that at least 51 of these proteins displayed significant differences in their relative abundance in AS complexes under conditions where rotenone was cytotoxic and induced formation of cytoplasmic inclusions immunoreactive to anti-AS. Overexpressing one of these proteins, heat shock protein (hsp) 70, not only protected cells from rotenone-mediated cytotoxicity but also decreased soluble AS aggregation. Furthermore, the protection afforded by hsp70 transfection appeared to be related to suppression of rotenone-induced oxidative stress as well as mitochondrial and proteasomal dysfunction. PMID:15234983

  9. Quantitative analysis of virus and plasmid trafficking in cells

    NASA Astrophysics Data System (ADS)

    Lagache, Thibault; Dauty, Emmanuel; Holcman, David

    2009-01-01

    Intracellular transport of DNA carriers is a fundamental step of gene delivery. By combining both theoretical and numerical approaches we study here single and several viruses and DNA particles trafficking in the cell cytoplasm to a small nuclear pore. We present a physical model to account for certain aspects of cellular organization, starting with the observation that a viral trajectory consists of epochs of pure diffusion and epochs of active transport along microtubules. We define a general degradation rate to describe the limitations of the delivery of plasmid or viral particles to a nuclear pore imposed by various types of direct and indirect hydrolysis activity inside the cytoplasm. By replacing the switching dynamics by a single steady state stochastic description, we obtain estimates for the probability and the mean time for the first one of many particles to go from the cell membrane to a small nuclear pore. Computational simulations confirm that our model can be used to analyze and interpret viral trajectories and estimate quantitatively the success of nuclear delivery.
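
    A toy Monte Carlo version of the switching dynamics described above, a particle alternating between diffusive and active-transport epochs while a degradation rate limits delivery, is sketched below; the rates, distances and one-dimensional geometry are illustrative assumptions, not the authors' model.

      # Sketch: estimate the probability and mean time for a particle to reach the
      # nuclear pore (x <= 0) before degradation, under intermittent transport.
      import numpy as np

      rng = np.random.default_rng(1)
      R, dt = 10.0, 0.01            # starting distance from the pore, time step (assumed)
      D, v, k = 1.0, 2.0, 0.05      # diffusivity, motor speed, degradation rate (assumed)
      p_active = 0.3                # fraction of steps spent on microtubules

      def arrival_time():
          x, t = R, 0.0
          while x > 0.0:
              if rng.random() < k * dt:               # degraded (hydrolysis, etc.)
                  return None
              if rng.random() < p_active:             # active transport epoch
                  x -= v * dt
              else:                                   # diffusive epoch
                  x += np.sqrt(2.0 * D * dt) * rng.normal()
              t += dt
          return t

      times = [arrival_time() for _ in range(2000)]
      hits = [t for t in times if t is not None]
      print(f"P(arrival) = {len(hits) / len(times):.2f}, mean arrival time = {np.mean(hits):.1f}")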

  10. Analysis of copy number variation using quantitative interspecies competitive PCR.

    PubMed

    Williams, Nigel M; Williams, Hywel; Majounie, Elisa; Norton, Nadine; Glaser, Beate; Morris, Huw R; Owen, Michael J; O'Donovan, Michael C

    2008-10-01

    Over recent years small submicroscopic DNA copy-number variants (CNVs) have been highlighted as an important source of variation in the human genome, human phenotypic diversity and disease susceptibility. Consequently, there is a pressing need for the development of methods that allow the efficient, accurate and cheap measurement of genomic copy number polymorphisms in clinical cohorts. We have developed a simple competitive PCR based method to determine DNA copy number which uses the entire genome of a single chimpanzee as a competitor thus eliminating the requirement for competitive sequences to be synthesized for each assay. This results in the requirement for only a single reference sample for all assays and dramatically increases the potential for large numbers of loci to be analysed in multiplex. In this study we establish proof of concept by accurately detecting previously characterized mutations at the PARK2 locus and then demonstrating the potential of quantitative interspecies competitive PCR (qicPCR) to accurately genotype CNVs in association studies by analysing chromosome 22q11 deletions in a sample of previously characterized patients and normal controls. PMID:18697816
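
    The quantitation logic behind such a competitive assay can be sketched as below: the human-to-competitor signal ratio at the test locus is normalized against the same ratio at a reference locus of known (diploid) copy number. The function name and example values are hypothetical and are not taken from the published protocol.

      # Sketch: copy-number estimate from competitive PCR signal ratios.
      def copy_number(test_human, test_competitor, ref_human, ref_competitor, ref_copies=2):
          """Estimate copy number at a test locus relative to a two-copy reference locus."""
          test_ratio = test_human / test_competitor
          ref_ratio = ref_human / ref_competitor
          return ref_copies * test_ratio / ref_ratio

      # Example: a roughly halved ratio at the test locus suggests a heterozygous deletion.
      print(copy_number(test_human=480.0, test_competitor=1000.0,
                        ref_human=990.0, ref_competitor=1000.0))   # ~0.97, i.e. about one copy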

  11. Quantitative proteomic analysis of the Salmonella-lettuce interaction.

    PubMed

    Zhang, Yuping; Nandakumar, Renu; Bartelt-Hunt, Shannon L; Snow, Daniel D; Hodges, Laurie; Li, Xu

    2014-11-01

    Human pathogens can internalize into food crops through root and surface uptake and persist inside crop plants. The goal of the study was to elucidate the global modulation of bacteria and plant protein expression after Salmonella internalizes lettuce. A quantitative proteomic approach was used to analyse the protein expression of Salmonella enterica serovar Infantis and lettuce cultivar Green Salad Bowl 24 h after infiltrating S. Infantis into lettuce leaves. Among the 50 differentially expressed proteins identified by comparing internalized S. Infantis against S. Infantis grown in Luria Broth, proteins involved in glycolysis were down-regulated, while one protein involved in ascorbate uptake was up-regulated. Stress response proteins, especially antioxidant proteins, were up-regulated. The modulation in protein expression suggested that internalized S. Infantis might utilize ascorbate as a carbon source and require multiple stress response proteins to cope with stresses encountered in plants. On the other hand, among the 20 differentially expressed lettuce proteins, proteins involved in defense response to bacteria were up-regulated. Moreover, the secreted effector PipB2 of S. Infantis and R proteins of lettuce were induced after bacterial internalization into lettuce leaves, indicating that the human pathogen S. Infantis triggered the defense mechanisms of lettuce, which normally respond to plant pathogens. PMID:24512637

  12. Quantitative Analysis of CME Deflections in the Corona

    NASA Astrophysics Data System (ADS)

    Gui, Bin; Shen, Chenglong; Wang, Yuming; Ye, Pinzhong; Liu, Jiajia; Wang, Shui; Zhao, Xuepu

    2011-07-01

    In this paper, ten CME events viewed by the STEREO twin spacecraft are analyzed to study the deflections of CMEs during their propagation in the corona. Based on the three-dimensional information of the CMEs derived by the graduated cylindrical shell (GCS) model (Thernisien, Howard, and Vourlidas in Astrophys. J. 652, 1305, 2006), it is found that the propagation directions of eight CMEs had changed. By applying the theoretical method proposed by Shen et al. ( Solar Phys. 269, 389, 2011) to all the CMEs, we found that the deflections are consistent, in strength and direction, with the gradient of the magnetic energy density. There is a positive correlation between the deflection rate and the strength of the magnetic energy density gradient and a weak anti-correlation between the deflection rate and the CME speed. Our results suggest that the deflections of CMEs are mainly controlled by the background magnetic field and can be quantitatively described by the magnetic energy density gradient (MEDG) model.

  13. Analysis of quantitative trait loci for behavioral laterality in mice.

    PubMed Central

    Roubertoux, Pierre L; Le Roy, Isabelle; Tordjman, Sylvie; Cherfou, Améziane; Migliore-Samour, Danièle

    2003-01-01

    Laterality is believed to have genetic components, as has been deduced from family studies in humans and responses to artificial selection in mice, but these genetic components are unknown and the underlying physiological mechanisms are still a subject of dispute. We measured direction of laterality (preferential use of left or right paws) and degree of laterality (absolute difference between the use of left and right paws) in C57BL/6ByJ (B) and NZB/BlNJ (N) mice and in their F(1) and F(2) intercrosses. Measurements were taken of both forepaws and hind paws. Quantitative trait loci (QTL) did not emerge for direction but did for degree of laterality. One QTL for forepaw (LOD score = 5.6) and the second QTL for hind paw (LOD score = 7.2) were both located on chromosome 4 and their peaks were within the same confidence interval. A QTL for plasma luteinizing hormone concentration was also found in the confidence interval of these two QTL. These results suggest that the physiological mechanisms underlying degree of laterality react to gonadal steroids. PMID:12663540

  14. Quantitative Proteome Analysis of Leishmania donovani under Spermidine Starvation

    PubMed Central

    Singh, Shalini; Dubey, Vikash Kumar

    2016-01-01

    We have earlier reported antileishmanial activity of hypericin through spermidine starvation. In the current report, we have used a label-free proteome quantitation approach to identify differentially modulated proteins after hypericin treatment. A total of 141 proteins were found to be differentially regulated, with ANOVA P values less than 0.05, in hypericin-treated Leishmania promastigotes. The differentially modulated proteins have been broadly classified into nine major categories. An increase in ribosomal protein S7 suggests repression of translation. Inhibition of proteins related to the ubiquitin proteasome system, an RNA binding protein and a translation initiation factor also suggests altered translation. We have also observed increased expression of Hsp 90, Hsp 83–1 and stress inducible protein 1, together with a significantly decreased level of cyclophilin. These stress-related proteins could represent the cellular response of the parasite to hypericin-induced stress. Altered expression of proteins involved in metabolism, biosynthesis and replication of nucleic acids, flagellar movement and signalling indicates that these processes of the parasite are also affected. The data were analyzed rigorously to get further insight into hypericin-induced parasite death. PMID:27123864

  15. Quantitative dual-probe microdialysis: mathematical model and analysis.

    PubMed

    Chen, Kevin C; Höistad, Malin; Kehr, Jan; Fuxe, Kjell; Nicholson, Charles

    2002-04-01

    Steady-state microdialysis is a widely used technique to monitor the concentration changes and distributions of substances in tissues. To obtain more information about brain tissue properties from microdialysis, a dual-probe approach was applied to infuse and sample the radiotracer, [3H]mannitol, simultaneously both in agar gel and in the rat striatum. Because the molecules released by one probe and collected by the other must diffuse through the interstitial space, the concentration profile exhibits dynamic behavior that permits the assessment of the diffusion characteristics in the brain extracellular space and the clearance characteristics. In this paper a mathematical model for dual-probe microdialysis was developed to study brain interstitial diffusion and clearance processes. Theoretical expressions for the spatial distribution of the infused tracer in the brain extracellular space and the temporal concentration at the probe outlet were derived. A fitting program was developed using the simplex algorithm, which finds local minima of the standard deviations between experiments and theory by adjusting the relevant parameters. The theoretical curves accurately fitted the experimental data and generated realistic diffusion parameters, implying that the mathematical model is capable of predicting the interstitial diffusion behavior of [3H]mannitol and that it will be a valuable quantitative tool in dual-probe microdialysis. PMID:12067242
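
    The fitting step described above can be sketched as follows: parameters are adjusted with the Nelder-Mead simplex algorithm to minimize the squared deviation between measured outlet concentrations and a model prediction. The model function here is a generic placeholder, not the paper's dual-probe diffusion solution, and the data are synthetic.

      # Sketch: simplex fitting of model parameters to dual-probe outlet data.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(2)
      t = np.linspace(1, 60, 30)                                    # sampling times (min)
      measured = 0.8 * (1 - np.exp(-t / 15.0)) + rng.normal(0, 0.02, t.size)

      def model(params, t):
          amplitude, tau = params
          return amplitude * (1 - np.exp(-t / tau))                 # placeholder response curve

      def sse(params):
          return np.sum((measured - model(params, t)) ** 2)

      fit = minimize(sse, x0=[1.0, 10.0], method="Nelder-Mead")     # simplex algorithm
      print("fitted parameters:", fit.x)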

  16. Space-to-Ground Communication for Columbus: A Quantitative Analysis.

    PubMed

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of such team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers to efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions. PMID:26290898

  17. Temporal kinetics and quantitative analysis of Cryptococcus neoformans nonlytic exocytosis.

    PubMed

    Stukes, Sabriya A; Cohen, Hillel W; Casadevall, Arturo

    2014-05-01

    Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

  18. Quantitative analysis of task selection for brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.

  19. Temporal Kinetics and Quantitative Analysis of Cryptococcus neoformans Nonlytic Exocytosis

    PubMed Central

    Stukes, Sabriya A.; Cohen, Hillel W.

    2014-01-01

    Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

  20. [Quantitative spectrum analysis of characteristic gases of spontaneous combustion coal].

    PubMed

    Liang, Yun-Tao; Tang, Xiao-Jun; Luo, Hai-Zhu; Sun, Yong

    2011-09-01

    Aimed at the characteristics of spontaneous combustion gas, such as the variety of gases involved, the low limits of detection required, and the critical safety requirements, Fourier transform infrared (FTIR) spectral analysis is presented for analyzing the characteristic gases of spontaneous combustion. In this paper, the analysis method is first introduced by combining the characteristics of the absorption spectra of the analytes with the analysis requirements. Parameter setting, sample preparation, feature variable extraction and analysis model building are taken into consideration, and the methods of sample preparation, feature extraction and analysis model building are described in detail. Eleven kinds of gases were then tested with a Tensor 27 spectrometer: CH4, C2H6, C3H8, iC4H10, nC4H10, C2H4, C3H6, C2H2, SF6, CO and CO2. The optical path length was 10 cm and the spectral resolution was set to 1 cm⁻¹. The testing results show that the detection limit for all analytes is less than 2 × 10⁻⁶. All the detection limits meet the measurement requirements for spontaneous combustion gas, which means that FTIR may be an ideal instrument and the analysis method used in this paper is suitable for online measurement of spontaneous combustion gases. PMID:22097853

  1. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated by combining GIS data on loads, system response, and consequences, and using event tree modelling for the risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced by up to 51 % by combining both structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (the annual expected affected population ranges from 82 to 107 % of that in the current situation, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods so that urban flood risk analysis can be replicated at regional and national scales.
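
    The core of the risk calculation described above can be illustrated with a toy event tree: annual probabilities of flood loads are combined with system-response branches (for example, whether the warning scheme works) and their consequences to give an expected annual affected population. All probabilities and populations below are illustrative assumptions.

      # Sketch: expected annual affected population from a toy event tree.
      flood_events = [
          # (annual probability, affected population if warning works, if warning fails)
          (1 / 10,   50,   200),
          (1 / 100,  400,  1500),
          (1 / 500,  900,  4000),
      ]
      p_warning_ok = 0.7     # probability that the warning/communication scheme works

      expected_affected = sum(
          p_event * (p_warning_ok * pop_ok + (1 - p_warning_ok) * pop_fail)
          for p_event, pop_ok, pop_fail in flood_events
      )
      print(f"expected annual affected population ≈ {expected_affected:.1f}")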

  2. GPR-Analyzer: a simple tool for quantitative analysis of hierarchical multispecies microarrays.

    PubMed

    Dittami, Simon M; Edvardsen, Bente

    2013-10-01

    Monitoring of marine microalgae is important to predict and manage harmful algae blooms. It currently relies mainly on light-microscopic identification and enumeration of algal cells, yet several molecular tools are currently being developed to complement traditional methods. MIcroarray Detection of Toxic ALgae (MIDTAL) is an FP7-funded EU project aiming to establish a hierarchical multispecies microarray as one of these tools. Prototype arrays are currently being tested with field samples, yet the analysis of the large quantities of data generated by these arrays presents a challenge as suitable analysis tools or protocols are scarce. This paper proposes a two-part protocol for the analysis of the MIDTAL and other hierarchical multispecies arrays: Signal-to-noise ratios can be used to determine the presence or absence of signals and to identify potential false-positives considering parallel and hierarchical probes. In addition, normalized total signal intensities are recommended for comparisons between microarrays and in order to relate signals for specific probes to cell concentrations using external calibration curves. Hybridization- and probe-specific detection limits can be calculated to help evaluate negative results. The suggested analyses were implemented in "GPR-Analyzer", a platform-independent and graphical user interface-based application, enabling non-specialist users to quickly and quantitatively analyze hierarchical multispecies microarrays. It is available online at http://folk.uio.no/edvardse/gpranalyzer . PMID:22767354
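
    The two-part protocol suggested above can be sketched as follows: a probe is called present when its signal-to-noise ratio exceeds a threshold, and its intensity is normalized to the total signal for between-array comparison. The intensities, background estimates and the threshold of 3 are illustrative assumptions, not values from GPR-Analyzer.

      # Sketch: SNR-based presence/absence calls and normalized total signal intensities.
      import numpy as np

      signal = np.array([5200.0, 310.0, 47.0, 1800.0])   # probe foreground intensities
      background = np.array([40.0, 35.0, 42.0, 38.0])    # local background estimates
      noise_sd = 15.0                                     # background standard deviation

      snr = (signal - background) / noise_sd
      present = snr > 3.0                                 # presence/absence call
      normalized = signal / signal.sum()                  # normalized total signal intensity
      for i, (s, p, n) in enumerate(zip(snr, present, normalized)):
          print(f"probe {i}: SNR = {s:.1f}, present = {p}, normalized intensity = {n:.3f}")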

  3. [Quantitative Analysis of Immuno-fluorescence of Nuclear Factor-κB Activation].

    PubMed

    Xiu, Min; He, Feng; Lou, Yuanlei; Xu, Lu; Xiong Jieqi; Wang, Ping; Liu, Sisun; Guo, Fei

    2015-06-01

    Immunofluorescence can qualitatively demonstrate nuclear translocation, and the translocation of NF-κB/p65 indicates activation of NF-κB signalling pathways. Immunofluorescence analysis software developed in-house is able to quantitatively analyze the dynamic localization of NF-κB/p65 by computing relative fluorescence units in the nuclei and cytoplasm. We verified the quantitative analysis by Western blot. When we applied the software to the analysis of nuclear translocation in lipopolysaccharide (LPS)-induced (0.5 h, 1 h, 2 h, 4 h) primary human umbilical vein endothelial cells (HUVECs), we found that the nuclear translocation peak occurred at 2 h, consistent with the Western blot verification results, indicating that this immunofluorescence analysis software can be applied to the quantitative analysis of immunofluorescence. PMID:26485997

  4. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  5. SearchLight: a freely available web-based quantitative spectral analysis tool (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Prabhat, Prashant; Peet, Michael; Erdogan, Turan

    2016-03-01

    In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength dependent variables - even the excitation light sources have been changing. For example, LED Light Engines may have a significantly different spectral response compared to the traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of the system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that can be used to address the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. This tool is available at: http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexing imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
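
    The kind of quantitative spectral calculation such a tool performs can be sketched as a wavelength-wise product of source, filter, fluorophore and detector curves, integrated over wavelength. The Gaussian spectra and bandpass edges below are synthetic stand-ins, not SearchLight's spectral library or its exact signal model.

      # Sketch: relative detected signal from overlap of synthetic spectra on a 1 nm grid.
      import numpy as np

      wl = np.arange(400, 701, dtype=float)                   # wavelength grid (nm)
      gauss = lambda centre, width: np.exp(-0.5 * ((wl - centre) / width) ** 2)

      source     = gauss(470, 15)                             # LED emission spectrum
      ex_filter  = (np.abs(wl - 470) < 20).astype(float)      # excitation bandpass
      absorption = gauss(490, 25)                             # fluorophore excitation
      emission   = gauss(525, 25)                             # fluorophore emission
      em_filter  = (np.abs(wl - 525) < 25).astype(float)      # emission bandpass
      detector   = np.full_like(wl, 0.6)                      # flat detector response

      excitation_rate = np.sum(source * ex_filter * absorption)   # sum over the 1 nm grid
      collected       = np.sum(emission * em_filter * detector)
      print(f"relative signal ∝ {excitation_rate * collected:.1f}")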

  6. Quantitative Analysis of 3′-Hydroxynorcotinine in Human Urine

    PubMed Central

    Upadhyaya, Pramod

    2015-01-01

    Introduction: Based on previous metabolism studies carried out in patas monkeys, we hypothesized that urinary 3′-hydroxynorcotinine could be a specific biomarker for uptake and metabolism of the carcinogen N′-nitrosonornicotine in people who use tobacco products. Methods: We developed a method for quantitation of 3′-hydroxynorcotinine in human urine. [Pyrrolidinone-13C4]3′-hydroxynorcotinine was added to urine as an internal standard, the samples were treated with β-glucuronidase, partially purified by solid supported liquid extraction and quantified by liquid chromatography–electrospray ionization–tandem mass spectrometry. Results: The method was accurate (average accuracy = 102%) and precise (coefficient of variation = 5.6%) in the range of measurement. 3′-Hydroxynorcotinine was detected in 48 urine samples from smokers (mean 393±287 pmol/ml urine) and 12 samples from individuals who had stopped smoking and were using the nicotine patch (mean 658±491 pmol/ml urine), but not in any of 10 samples from nonsmokers. Conclusions: Since the amounts of 3′-hydroxynorcotinine found in smokers’ urine were approximately 50 times greater than the anticipated daily dose of N′-nitrosonornicotine, we concluded that it is a metabolite of nicotine or one of its metabolites, comprising perhaps 1% of nicotine intake in smokers. Therefore, it would not be suitable as a specific biomarker for uptake and metabolism of N′-nitrosonornicotine. Since 3′-hydroxynorcotinine has never been previously reported as a constituent of human urine, further studies are required to determine its source and mode of formation. PMID:25324430
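
    The quantitation step of such an internal-standard method reduces to a ratio calculation: the analyte/internal-standard peak-area ratio, a calibration response factor, and the spiked amount of labeled standard give the analyte concentration. The function and numbers below are illustrative, not the published calibration.

      # Sketch: stable-isotope-dilution quantitation from peak-area ratios.
      def quantify(analyte_area, is_area, is_amount_pmol, urine_volume_ml, response_factor=1.0):
          """Return analyte concentration (pmol/ml) by internal-standard isotope dilution."""
          ratio = analyte_area / is_area
          return response_factor * ratio * is_amount_pmol / urine_volume_ml

      # A 2:1 area ratio with 400 pmol of labeled standard in 1 ml of urine -> 800 pmol/ml.
      print(quantify(analyte_area=8.4e5, is_area=4.2e5, is_amount_pmol=400.0, urine_volume_ml=1.0))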

  7. Quantitative analysis of nailfold capillary morphology in patients with fibromyalgia

    PubMed Central

    Choi, Dug-Hyun

    2015-01-01

    Background/Aims Nailfold capillaroscopy (NFC) has been used to examine morphological and functional microcirculation changes in connective tissue diseases. It has been demonstrated that NFC patterns reflect abnormal microvascular dynamics, which may play a role in fibromyalgia (FM) syndrome. The aim of this study was to determine NFC patterns in FM, and their association with clinical features of FM. Methods A total of 67 patients with FM, and 30 age- and sex-matched healthy controls, were included. Nailfold capillary patterns were quantitatively analyzed using computerized NFC. The parameters of interest were as follows: number of capillaries within the central 3 mm, deletion score, apical limb width, capillary width, and capillary dimension. Capillary dimension was determined by calculating the number of capillaries using the Adobe Photoshop version 7.0. Results FM patients had a lower number of capillaries and higher deletion scores on NFC compared to healthy controls (17.3 ± 1.7 vs. 21.8 ± 2.9, p < 0.05; 2.2 ± 0.9 vs. 0.7 ± 0.6, p < 0.05, respectively). Both apical limb width (µm) and capillary width (µm) were significantly decreased in FM patients (1.1 ± 0.2 vs. 3.7 ± 0.6; 5.4 ± 0.5 vs. 7.5 ± 1.4, respectively), indicating that FM patients have abnormally decreased digital capillary diameter and density. Interestingly, there was no difference in capillary dimension between the two groups, suggesting that the length or tortuosity of capillaries in FM patients is increased to compensate for diminished microcirculation. Conclusions FM patients had altered capillary density and diameter in the digits. Diminished microcirculation on NFC may alter capillary density and increase tortuosity. PMID:26161020

  8. Quantitative analysis of wrist electrodermal activity during sleep.

    PubMed

    Sano, Akane; Picard, Rosalind W; Stickgold, Robert

    2014-12-01

    We present the first quantitative characterization of electrodermal activity (EDA) patterns on the wrists of healthy adults during sleep using dry electrodes. We compare the new results on the wrist to the prior findings on palmar or finger EDA by characterizing data measured from 80 nights of sleep consisting of 9 nights of wrist and palm EDA from 9 healthy adults sleeping at home, 56 nights of wrist and palm EDA from one healthy adult sleeping at home, and 15 nights of wrist EDA from 15 healthy adults in a sleep laboratory, with the latter compared to concurrent polysomnography. While high frequency patterns of EDA called "storms" were identified by eye in the 1960s, we systematically compare thresholds for automatically detecting EDA peaks and establish criteria for EDA storms. We found that more than 80% of the EDA peaks occurred in non-REM sleep, specifically during slow-wave sleep (SWS) and non-REM stage 2 sleep (NREM2). Also, EDA amplitude is higher in SWS than in other sleep stages. Longer EDA storms were more likely to occur in the first two quarters of sleep and during SWS and NREM2. We also found from the home studies (65 nights) that EDA levels were higher and the skin conductance peaks were larger and more frequent when measured on the wrist than when measured on the palm. These EDA high frequency peaks and high amplitude were sometimes associated with higher skin temperature, but more work is needed looking at neurological and other EDA elicitors in order to elucidate their complete behavior. PMID:25286449
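
    A minimal sketch of threshold-based EDA peak detection and a simple storm criterion (a minimum number of peaks within a time window) is given below using scipy.signal; the synthetic signal, prominence threshold and the five-peaks-per-minute criterion are assumptions for illustration, not the thresholds established in the paper.

      # Sketch: detect skin-conductance peaks and flag "storm" windows.
      import numpy as np
      from scipy.signal import find_peaks

      rng = np.random.default_rng(3)
      fs = 4.0                                              # sampling rate (Hz)
      t = np.arange(0, 600, 1 / fs)                         # ten minutes of data
      eda = 2 + 0.2 * np.sin(2 * np.pi * t / 300) + rng.normal(0, 0.01, t.size)
      eda[1200:1600] += np.abs(rng.normal(0, 0.15, 400))    # a burst of electrodermal activity

      peaks, _ = find_peaks(eda, prominence=0.05, distance=int(fs))   # amplitude/spacing threshold
      peak_times = t[peaks]

      # Storm criterion (illustrative): at least 5 peaks within a 60-second window.
      storms = [w for w in np.arange(0, 600, 60)
                if np.sum((peak_times >= w) & (peak_times < w + 60)) >= 5]
      print(f"{peaks.size} peaks detected; storm windows starting at {storms}")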

  9. A comparative analysis of British and Taiwanese students' conceptual and procedural knowledge of fraction addition

    NASA Astrophysics Data System (ADS)

    Li, Hui-Chuan

    2014-10-01

    This study examines students' procedural and conceptual achievement in fraction addition in England and Taiwan. A total of 1209 participants (561 British students and 648 Taiwanese students) at ages 12 and 13 were recruited from England and Taiwan to take part in the study. A quantitative design by means of a self-designed written test is adopted as central to the methodological considerations. The test has two major parts: the concept part and the skill part. The former is concerned with students' conceptual knowledge of fraction addition and the latter is interested in students' procedural competence when adding fractions. There were statistically significant differences both in the concept and skill parts between the British and Taiwanese groups, with the latter having higher scores. The analysis of the students' responses to the skill section indicates that the superiority of Taiwanese students' procedural achievements over those of their British peers is because most of the former are able to apply algorithms to adding fractions far more successfully than the latter. Earlier, Hart [1] reported that around 30% of the British students in their study used an erroneous strategy (adding tops and bottoms, for example, 2/3 + 1/7 = 3/10) while adding fractions. This study also finds that nearly the same percentage of the British group still used this erroneous strategy to add fractions as Hart found in 1981. The study also provides evidence that students' understanding of fractions is confused and incomplete, even among those who are able to perform the operations successfully. More research needs to be done to help students make sense of the operations and eventually attain computational competence with meaningful grounding in the domain of fractions.
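
    For reference, the correct common-denominator algorithm and the erroneous "add tops and bottoms" strategy cited above can be contrasted with exact rational arithmetic, as in this short sketch.

      # Sketch: correct fraction addition versus the erroneous strategy 2/3 + 1/7 = 3/10.
      from fractions import Fraction

      a, b = Fraction(2, 3), Fraction(1, 7)

      correct = a + b                                       # 14/21 + 3/21 = 17/21
      erroneous = Fraction(a.numerator + b.numerator,       # adding tops and bottoms
                           a.denominator + b.denominator)

      print(f"correct: {correct}, erroneous: {erroneous}")  # correct: 17/21, erroneous: 3/10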

  10. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions

    PubMed Central

    Shenoy, Shailesh M.

    2016-01-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software’s support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity. PMID:27516727

  11. EXPLoRA-web: linkage analysis of quantitative trait loci using bulk segregant analysis.

    PubMed

    Pulido-Tamayo, Sergio; Duitama, Jorge; Marchal, Kathleen

    2016-07-01

    Identification of genomic regions associated with a phenotype of interest is a fundamental step toward solving questions in biology and improving industrial research. Bulk segregant analysis (BSA) combined with high-throughput sequencing is a technique to efficiently identify these genomic regions associated with a trait of interest. However, distinguishing true from spuriously linked genomic regions and accurately delineating the genomic positions of these truly linked regions requires the use of complex statistical models currently implemented in software tools that are generally difficult to operate for non-expert users. To facilitate the exploration and analysis of data generated by bulked segregant analysis, we present EXPLoRA-web, a web service wrapped around our previously published algorithm EXPLoRA, which exploits linkage disequilibrium to increase the power and accuracy of quantitative trait loci identification in BSA analysis. EXPLoRA-web provides a user friendly interface that enables easy data upload and parallel processing of different parameter configurations. Results are provided graphically and as BED file and/or text file and the input is expected in widely used formats, enabling straightforward BSA data analysis. The web server is available at http://bioinformatics.intec.ugent.be/explora-web/. PMID:27105844

  12. Quantitative analysis and purity evaluation of medroxyprogesterone acetate by HPLC.

    PubMed

    Cavina, G; Valvo, L; Alimenti, R

    1985-01-01

    A reversed-phase high-performance liquid chromatographic method was developed for the assay of medroxyprogesterone acetate and for the detection and determination of related steroids present as impurities in the drug. The method was compared with the normal-phase technique of the USP XX and was also applied to the analysis of tablets and injectable suspensions. PMID:16867645

  13. Quantitative modeling and analysis in environmental studies. Technical report

    SciTech Connect

    Gaver, D.P.

    1994-10-01

    This paper reviews some of the many mathematical modeling and statistical data analysis problems that arise in environmental studies. It makes no claim to be comprehensive nor truly up-to-date. It will appear as a chapter in a book on ecotoxicology to be published by CRC Press, probably in 1995. Workshops leading to the book creation were sponsored by The Conte Foundation.

  14. Procedures for Quantitative Analysis of Change Facilitator Interventions.

    ERIC Educational Resources Information Center

    Hord, Shirley M.; Hall, Gene E.

    The procedures and coding schema that have been developed by the Research on the Improvement Process (RIP) Program for analyzing the frequency of interventions and for examining their internal characteristics are described. In two in-depth ethnographic studies of implementation efforts, interventions were the focus of data collection and analysis.…

  15. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
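
    In the two-predictor case, commonality analysis partitions the full-model R² into variance unique to each predictor and variance common to both, as in the sketch below (synthetic data; with k predictors the partition contains 2^k − 1 components).

      # Sketch: two-predictor regression commonality analysis on synthetic data.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(5)
      n = 300
      x1 = rng.normal(size=n)
      x2 = 0.6 * x1 + rng.normal(scale=0.8, size=n)         # correlated predictors
      y = 0.5 * x1 + 0.3 * x2 + rng.normal(size=n)

      def r2(X):
          return LinearRegression().fit(X, y).score(X, y)

      r2_full = r2(np.c_[x1, x2])
      r2_x1, r2_x2 = r2(x1.reshape(-1, 1)), r2(x2.reshape(-1, 1))
      unique_x1 = r2_full - r2_x2                           # variance explained only by x1
      unique_x2 = r2_full - r2_x1                           # variance explained only by x2
      common = r2_x1 + r2_x2 - r2_full                      # variance shared by x1 and x2
      print(f"R2 = {r2_full:.3f}, unique(x1) = {unique_x1:.3f}, "
            f"unique(x2) = {unique_x2:.3f}, common = {common:.3f}")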

  16. Concentration Analysis: A Quantitative Assessment of Student States.

    ERIC Educational Resources Information Center

    Bao, Lei; Redish, Edward F.

    2001-01-01

    Explains that multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. Introduces a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. (Contains 18 references.) (Author/YDS)

  17. Concentration Analysis: A Quantitative Assessment of Student States.

    ERIC Educational Resources Information Center

    Bao, Lei; Redish, Edward F.

    Multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. However, traditional analysis often relies solely on scores (number of students giving the correct answer). This ignores what can be significant and important information: the distribution…
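
    The concentration measure referred to in these two entries is, as best recalled from Bao and Redish's definition, C = (√m/(√m − 1)) · (√(Σᵢ nᵢ²)/N − 1/√m), where m is the number of answer choices, nᵢ the number of students selecting choice i, and N the total number of students; it is 0 for uniformly spread responses and 1 when everyone picks the same choice. The sketch below implements the formula under that assumption.

      # Sketch: concentration factor for a single multiple-choice item (assumed formula).
      import math

      def concentration_factor(counts):
          m, N = len(counts), sum(counts)
          return (math.sqrt(m) / (math.sqrt(m) - 1)) * (
              math.sqrt(sum(c * c for c in counts)) / N - 1 / math.sqrt(m))

      print(concentration_factor([20, 20, 20, 20, 20]))   # uniform responses -> 0.0
      print(concentration_factor([100, 0, 0, 0, 0]))      # everyone picks choice A -> 1.0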

  18. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented.-from Authors

  19. Quantitative histology analysis of the ovarian tumour microenvironment.

    PubMed

    Lan, Chunyan; Heindl, Andreas; Huang, Xin; Xi, Shaoyan; Banerjee, Susana; Liu, Jihong; Yuan, Yinyin

    2015-01-01

    Concerted efforts in genomic studies examining RNA transcription and DNA methylation patterns have revealed profound insights into prognostic ovarian cancer subtypes. On the other hand, abundant histology slides have been generated to date, yet their uses remain very limited and largely qualitative. Our goal is to develop automated histology analysis as an alternative subtyping technology for ovarian cancer that is cost-efficient and does not rely on DNA quality. We developed an automated system for scoring primary tumour sections from 91 late-stage ovarian cancers to identify single cells. We demonstrated high accuracy of our system based on expert pathologists' scores (cancer = 97.1%, stromal = 89.1%) as well as compared to immunohistochemistry scoring (correlation = 0.87). The percentage of stromal cells among all cells is significantly associated with poor overall survival after controlling for clinical parameters including debulking status and age (multivariate analysis p = 0.0021, HR = 2.54, CI = 1.40-4.60) and progression-free survival (multivariate analysis p = 0.022, HR = 1.75, CI = 1.09-2.82). We demonstrate how automated image analysis enables objective quantification of the microenvironmental composition of ovarian tumours. Our analysis reveals a strong effect of the tumour microenvironment on ovarian cancer progression and highlights the potential of therapeutic interventions that target the stromal compartment or cancer-stroma signalling in the stroma-high, late-stage ovarian cancer subset. PMID:26573438

  20. Clinical value of quantitative analysis of ST slope during exercise.

    PubMed Central

    Ascoop, C A; Distelbrink, C A; De Lang, P A

    1977-01-01

    The diagnostic performance of automatic analysis of the exercise electrocardiogram in detecting ischaemic heart disease was studied in 147 patients with angiographically documented coronary disease. The results were compared with the results of visual analysis of the same recordings. Using a bicycle ergometer we tried to reach at least 90 per cent of the predicted maximal heart rate of the patient. Two bipolar thoracic leads (CM5, CC5) were used. In the visual analysis the criterion of the so-called ischaemic ST segment was applied. For the automatic analysis the population was divided into a learning group (N=87) and a testing group (N=60). In the learning group, critical values were first computed for different ST measurements that provided optimal separation between patients with (CAG POS.) and without (CAG NEG.) significant coronary stenoses as revealed by coronary arteriography. These critical values were kept unchanged when applied to the testing group. With respect to the visual method, an increase of the sensitivity by 0.45 and 0.36 was obtained by the automatic analysis in the learning and testing groups, respectively. The best separation between the CAG POS. and CAG NEG. groups was reached using a criterion consisting of a linear combination of the slope of the initial part of the ST segment and the ST depression, the sensitivity being 0.70 and 0.60, respectively, in the learning and testing groups. Using a criterion based on the area between the baseline and the ST segment (the SX integral) these values were 0.42 and 0.49, respectively. All specificities were kept to at least 0.90. PMID:319813
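
    Evaluating such a linear ST criterion amounts to thresholding a weighted combination of ST-segment slope and ST depression and tabulating sensitivity and specificity against the angiographic reference, as in the sketch below; the coefficients, threshold and simulated measurements are placeholders, not the paper's fitted values.

      # Sketch: sensitivity/specificity of a linear ST-slope + ST-depression criterion.
      import numpy as np

      rng = np.random.default_rng(6)
      n = 200
      cag_positive = rng.integers(0, 2, n).astype(bool)                  # angiography result
      st_slope = rng.normal(1.0, 0.8, n) - 0.8 * cag_positive            # mV/s, flatter if diseased
      st_depression = rng.normal(0.02, 0.04, n) + 0.08 * cag_positive    # mV, deeper if diseased

      score = -1.0 * st_slope + 10.0 * st_depression                     # linear combination
      predicted_positive = score > 0.0                                   # decision threshold

      sensitivity = np.sum(predicted_positive & cag_positive) / np.sum(cag_positive)
      specificity = np.sum(~predicted_positive & ~cag_positive) / np.sum(~cag_positive)
      print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")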

  1. Quantitative analysis of acrylamide labeled serum proteins by LC-MS/MS.

    PubMed

    Faca, Vitor; Coram, Marc; Phanstiel, Doug; Glukhova, Veronika; Zhang, Qing; Fitzgibbon, Matthew; McIntosh, Martin; Hanash, Samir

    2006-08-01

    Isotopic labeling of cysteine residues with acrylamide was previously utilized for relative quantitation of proteins by MALDI-TOF. Here, we explored and compared the application of deuterated and ¹³C isotopes of acrylamide for quantitative proteomic analysis using LC-MS/MS and high-resolution FTICR mass spectrometry. The method was applied to human serum samples that were immunodepleted of abundant proteins. Our results show reliable quantitation of proteins across an abundance range that spans 5 orders of magnitude based on ion intensities and known protein concentration in plasma. The use of the ¹³C isotope of acrylamide had a slight advantage relative to deuterated acrylamide, because of shifts in the elution of deuterated acrylamide relative to its corresponding nondeuterated compound in reversed-phase chromatography. Overall, the use of acrylamide for differentially labeling intact proteins in complex mixtures, in combination with LC-MS/MS, provides a robust method for quantitative analysis of complex proteomes. PMID:16889424

  2. Quantitative chemical analysis of nickel-chromium dental casting alloys.

    PubMed

    Nagayama, K; Kuroiwa, A; Ando, Y; Hashimoto, H

    1990-01-01

    Twenty-nine brands of dental casting nickel-chromium alloys made in Japan for small castings were analyzed by electron probe X-ray microanalyzer. The nickel-chromium alloys for metal-ceramic application were composed primarily of nickel, chromium, and molybdenum, with the exception of one brand. Of the 22 nickel-chromium alloys for inlay, crown, and bridgework applications, 11 met the standard of the Ministry of Welfare specifications. The additive metal elements of these alloys were molybdenum, iron, copper, manganese, aluminum, silicon, tin, indium, silver, titanium, and gallium. PMID:2134288

  3. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  4. Optimized Protocol for Quantitative Multiple Reaction Monitoring-Based Proteomic Analysis of Formalin-Fixed, Paraffin-Embedded Tissues.

    PubMed

    Kennedy, Jacob J; Whiteaker, Jeffrey R; Schoenherr, Regine M; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N; Baird, Geoffrey Stuart; Paulovich, Amanda G

    2016-08-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin-embedded (FFPE) tissues. Although the feasibility of using targeted, multiple reaction monitoring mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope-labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e., nine processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R(2) = 0.94) and immuno-MRM (R(2) = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  5. Probabilistic reliability analysis, quantitative safety goals, and nuclear licensing in the United Kingdom.

    PubMed

    Cannell, W

    1987-09-01

    Although unpublicized, the use of quantitative safety goals and probabilistic reliability analysis for licensing nuclear reactors has become a reality in the United Kingdom. This conclusion results from an examination of the process leading to the licensing of the Sizewell B PWR in England. The licensing process for this reactor has substantial implications for nuclear safety standards in Britain, and is examined in the context of the growing trend towards quantitative safety goals in the United States. PMID:3685540

  6. Integrated quantitative fractal polarimetric analysis of monolayer lung cancer cells

    NASA Astrophysics Data System (ADS)

    Shrestha, Suman; Zhang, Lin; Quang, Tri; Farrahi, Tannaz; Narayan, Chaya; Deshpande, Aditi; Na, Ying; Blinzler, Adam; Ma, Junyu; Liu, Bo; Giakos, George C.

    2014-05-01

    Digital diagnostic pathology has become one of the most valuable and convenient technological advancements of recent years. It allows us to acquire, store and analyze pathological information from images of histological and immunohistochemical glass slides that are scanned to create digital slides. In this study, efficient fractal, wavelet-based polarimetric techniques for histological analysis of monolayer lung cancer cells are introduced and different monolayer cancer lines are studied. The outcome of this study indicates that applying fractal, wavelet polarimetric principles to the analysis of squamous carcinoma and adenocarcinoma cell lines may prove extremely useful in discriminating between healthy and lung cancer cells as well as in differentiating among different lung cancer cell lines.

  7. Quantitative Immunofluorescence Analysis of Nucleolus-Associated Chromatin.

    PubMed

    Dillinger, Stefan; Németh, Attila

    2016-01-01

    The nuclear distribution of eu- and heterochromatin is nonrandom, heterogeneous, and dynamic, which is mirrored by specific spatiotemporal arrangements of histone posttranslational modifications (PTMs). Here we describe a semiautomated method for the analysis of histone PTM localization patterns within the mammalian nucleus, using confocal laser scanning microscope images of fixed, immunofluorescence-stained cells as the data source. The ImageJ-based process includes the segmentation of the nucleus, as well as measurements of total fluorescence intensities, the heterogeneity of the staining, and the frequency of the brightest pixels in the region of interest (ROI). In the presented image analysis pipeline, the perinucleolar chromatin is selected as the primary ROI, and the nuclear periphery as the secondary ROI. PMID:27576710
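
    The measurements listed above (total intensity, staining heterogeneity, brightest-pixel frequency per ROI) are performed in ImageJ in the published workflow; the short Python sketch below only illustrates the same quantities on numpy arrays. The array names, the Otsu-based nucleus segmentation, and the top-5% cutoff for "brightest pixels" are assumptions for illustration, not the authors' pipeline.

      import numpy as np
      from skimage import filters, measure

      def nucleus_mask(dapi):
          # Segment the nucleus from the DNA counterstain and keep the largest object.
          labels = measure.label(dapi > filters.threshold_otsu(dapi))
          largest = 1 + np.argmax(np.bincount(labels.ravel())[1:])
          return labels == largest

      def roi_stats(ptm, roi_mask, bright_cutoff):
          vals = ptm[roi_mask]
          return {
              "total_intensity": float(vals.sum()),
              "heterogeneity_cv": float(vals.std() / vals.mean()),
              "bright_pixel_freq": float((vals >= bright_cutoff).mean()),
          }

      # Hypothetical usage: cutoff for "brightest pixels" from the top 5% of the whole
      # nucleus, then compared between a perinucleolar ROI and a nuclear-periphery ROI
      # (both masks would come from the segmentation step; they are placeholders here).
      # nuc = nucleus_mask(dapi)
      # cutoff = np.quantile(ptm[nuc], 0.95)
      # perinucleolar = roi_stats(ptm, perinucleolar_mask, cutoff)
      # periphery = roi_stats(ptm, periphery_mask, cutoff)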

  8. Quantitative trait locus analysis in crosses between outbred lines with dominance and inbreeding.

    PubMed Central

    Pérez-Enciso, M; Fernando, R L; Bidanel, J P; Le Roy, P

    2001-01-01

    We provide a theoretical framework for quantitative trait locus (QTL) analysis of a crossed population where parental lines may be outbred and dominance as well as inbreeding are allowed for. It can be applied to any pedigree. A biallelic QTL is assumed, and the QTL allele frequencies can be different in each breed. The genetic covariance between any two individuals is expressed as a nonlinear function of the probability of up to 15 possible identity modes and of the additive and dominance effects, together with the allelic frequencies in each of the two parental breeds. The probabilities of each identity mode are obtained at the desired genome positions using a Monte Carlo Markov chain method. Unbiased estimates of the actual genetic parameters are recovered in a simulated F(2) cross and in a six-generation complex pedigree under a variety of genetic models (allele fixed or segregating in the parental populations and additive or dominance action). Results from analyzing an F(2) cross between Meishan and Large White pigs are also presented. PMID:11560915

  9. Quantitative Analysis of PMLA Nanoconjugate Components after Backbone Cleavage

    PubMed Central

    Ding, Hui; Patil, Rameshwar; Portilla-Arias, Jose; Black, Keith L.; Ljubimova, Julia Y.; Holler, Eggehard

    2015-01-01

    Multifunctional polymer nanoconjugates containing multiple components show great promise in cancer therapy, but in most cases complete analysis of each component is difficult. Polymalic acid (PMLA) based nanoconjugates have demonstrated successful brain and breast cancer treatment. They consist of multiple components including targeting antibodies, Morpholino antisense oligonucleotides (AONs), and endosome escape moieties. The component analysis of PMLA nanoconjugates is extremely difficult using conventional spectrometry and HPLC method. Taking advantage of the nature of polyester of PMLA, which can be cleaved by ammonium hydroxide, we describe a method to analyze the content of antibody and AON within nanoconjugates simultaneously using SEC-HPLC by selectively cleaving the PMLA backbone. The selected cleavage conditions only degrade PMLA without affecting the integrity and biological activity of the antibody. Although the amount of antibody could also be determined using the bicinchoninic acid (BCA) method, our selective cleavage method gives more reliable results and is more powerful. Our approach provides a new direction for the component analysis of polymer nanoconjugates and nanoparticles. PMID:25894227

  10. Quantitative Analysis of Calcium Spikes in Noisy Fluorescent Background

    PubMed Central

    Janicek, Radoslav; Hotka, Matej; Zahradníková, Alexandra; Zahradníková, Alexandra; Zahradník, Ivan

    2013-01-01

    Intracellular calcium signals are studied by laser-scanning confocal fluorescence microscopy. The required spatio-temporal resolution makes description of calcium signals difficult because of the low signal-to-noise ratio. We designed a new procedure of calcium spike analysis based on their fitting with a model. The accuracy and precision of calcium spike description were tested on synthetic datasets generated either with randomly varied spike parameters and Gaussian noise of constant amplitude, or with constant spike parameters and Gaussian noise of various amplitudes. Statistical analysis was used to evaluate the performance of spike fitting algorithms. The procedure was optimized for reliable estimation of calcium spike parameters and for dismissal of false events. A new algorithm was introduced that corrects the acquisition time of pixels in line-scan images that is in error due to sequential acquisition of individual pixels along the space coordinate. New software was developed in Matlab and provided for general use. It allows interactive dissection of temporal profiles of calcium spikes from x-t images, their fitting with predefined function(s) and acceptance of results on statistical grounds, thus allowing efficient analysis and reliable description of calcium signaling in cardiac myocytes down to the in situ function of ryanodine receptors. PMID:23741324
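
    As a rough illustration of fitting calcium spikes with a predefined function and accepting results on statistical grounds, the sketch below fits a simple rise-and-decay model with SciPy. The spike shape, parameter names, noise level, and acceptance criterion are illustrative assumptions, not the authors' published model or Matlab software.

      import numpy as np
      from scipy.optimize import curve_fit

      def spike(t, baseline, amplitude, t0, tau_rise, tau_decay):
          # Exponential rise and decay starting at t0, on a constant baseline.
          s = np.zeros_like(t, dtype=float)
          m = t >= t0
          s[m] = amplitude * (1 - np.exp(-(t[m] - t0) / tau_rise)) * np.exp(-(t[m] - t0) / tau_decay)
          return baseline + s

      t = np.linspace(0, 100, 500)                                   # ms
      noisy = spike(t, 1.0, 2.0, 20.0, 3.0, 25.0) + np.random.normal(0, 0.3, t.size)

      p0 = [1.0, 1.5, 15.0, 2.0, 20.0]                               # initial guesses
      popt, pcov = curve_fit(spike, t, noisy, p0=p0)
      perr = np.sqrt(np.diag(pcov))                                  # parameter standard errors

      # A simple statistical acceptance test: reject fits whose amplitude is not
      # clearly above its own uncertainty (illustrative criterion).
      accepted = popt[1] > 3 * perr[1]
      print(popt, accepted)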

  11. Watershed Planning within a Quantitative Scenario Analysis Framework.

    PubMed

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-01-01

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation. PMID:27501287
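
    A minimal sketch of the landscape-based cumulative-effects step, assuming a simple design in which an aquatic-condition metric at each site is regressed on a few land-use stressor gradients and the fitted model is then used to score a hypothetical future scenario. The covariates, values, and scenario below are placeholders, not the study's data.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      # Columns (hypothetical): % mined area, % residential area, road density
      X = np.array([[12.0, 3.1, 0.8],
                    [ 2.0, 0.5, 0.2],
                    [30.0, 1.0, 1.5],
                    [ 8.0, 4.0, 0.6],
                    [22.0, 2.2, 1.1]])
      y = np.array([55.0, 82.0, 31.0, 60.0, 40.0])   # e.g. a biotic index at each site

      model = LinearRegression().fit(X, y)            # multiple linear regression

      # Scenario analysis: predict aquatic condition under a proposed land-use scenario.
      scenario = np.array([[15.0, 3.5, 0.9]])
      print(model.predict(scenario), model.coef_, model.intercept_)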

  12. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    NASA Astrophysics Data System (ADS)

    Michalska, J.; Chmiela, B.

    2014-03-01

    The purpose of the research was to work out the qualitative and quantitative analysis of phases in DSS in as-received state and after thermal aging. For quantitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Qualitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for the revealing of microstructure and brief review of sample preparation methods for EBSD studies were presented. Different ways of sample preparation were tested and based on these results a detailed methodology of DSS phase analysis was developed including: surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of applied methods were pointed out and compared the accuracy of the analysis phase performed by both methods.

  13. Quantitative PCR analysis of CYP1A induction in Atlantic salmon (Salmo salar)

    USGS Publications Warehouse

    Rees, C.B.; McCormick, S.D.; Vanden, Heuvel J.P.; Li, W.

    2003-01-01

    Environmental pollutants are hypothesized to be one of the causes of recent declines in wild populations of Atlantic salmon (Salmo salar) across Eastern Canada and the United States. Some of these pollutants, such as polychlorinated biphenyls and dioxins, are known to induce expression of the CYP1A subfamily of genes. We applied a highly sensitive technique, quantitative reverse transcription-polymerase chain reaction (RT-PCR), for measuring the levels of CYP1A induction in Atlantic salmon. This assay was used to detect patterns of CYP1A mRNA levels, a direct measure of CYP1A expression, in Atlantic salmon exposed to pollutants under both laboratory and field conditions. Two groups of salmon were acclimated to 11 and 17 °C, respectively. Each subject then received an intraperitoneal injection (50 mg kg-1) of either β-naphthoflavone (BNF) in corn oil (10 mg BNF ml-1 corn oil) or corn oil alone. After 48 h, salmon gill, kidney, liver, and brain were collected for RNA isolation and analysis. All tissues showed induction of CYP1A by BNF. The highest base level of CYP1A expression (2.56 × 10(10) molecules/µg RNA) was found in gill tissue. Kidney had the highest mean induction at five orders of magnitude while gill tissue showed the lowest mean induction at two orders of magnitude. The quantitative RT-PCR was also applied to salmon sampled from two streams in Massachusetts, USA. Salmon liver and gill tissue sampled from Millers River (South Royalston, Worcester County), known to contain polychlorinated biphenyls (PCBs), showed on average a two orders of magnitude induction over those collected from a stream with no known contamination (Fourmile Brook, Northfield, Franklin County). Overall, the data show CYP1A exists and is inducible in Atlantic salmon gill, brain, kidney, and liver tissue. In addition, the results obtained demonstrate that quantitative PCR analysis of CYP1A expression is useful in studying ecotoxicity in populations of Atlantic salmon in the wild. © 2003

  14. A quantitative analysis of nuclear factor I/DNA interactions.

    PubMed Central

    Meisterernst, M; Gander, I; Rogge, L; Winnacker, E L

    1988-01-01

    Nuclear factor I (NFI) was purified to homogeneity from porcine liver by DNA-affinity chromatography and displays a single band with a molecular weight of 36 kDa in SDS-polyacrylamide gels. The purified protein was used to determine absolute equilibrium binding constants by gel retardation techniques for a variety of DNA fragments with genuine or mutated NFI binding sites and a number of DNA fragments derived from various eukaryotic promoters carrying the CCAAT-box as a half-site for NFI binding. We present a model which allows prediction of the functional significance of mutated NFI binding sites from sequence data. The data suggest that the single molecular species of NFI from porcine liver may not be able to recognize and activate the -CCAAT- promoter element in vivo without additional interactions, e.g. with other proteins. PMID:3380685

  15. Digitally Enhanced Thin-Layer Chromatography: An Inexpensive, New Technique for Qualitative and Quantitative Analysis

    ERIC Educational Resources Information Center

    Hess, Amber Victoria Irish

    2007-01-01

    The study shows that combining digital photography with conventional thin-layer chromatography (TLC) greatly improves qualitative analysis and makes accurate quantitative analysis possible at a much lower cost than commercial equipment. The findings suggest that digitally enhanced TLC (DE-TLC) is low-cost and easy…

  16. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

  17. Integrating Data Analysis (IDA): Working with Sociology Departments to Address the Quantitative Literacy Gap

    ERIC Educational Resources Information Center

    Howery, Carla B.; Rodriguez, Havidan

    2006-01-01

    The NSF-funded Integrating Data Analysis (IDA) Project undertaken by the American Sociological Association (ASA) and the Social Science Data Analysis Network sought to close the quantitative literacy gap for sociology majors. Working with twelve departments, the project built on lessons learned from ASA's Minority Opportunities through School…

  18. A Quantitative Analysis of the Extrinsic and Intrinsic Turnover Factors of Relational Database Support Professionals

    ERIC Educational Resources Information Center

    Takusi, Gabriel Samuto

    2010-01-01

    This quantitative analysis explored the intrinsic and extrinsic turnover factors of relational database support specialists. Two hundred and nine relational database support specialists were surveyed for this research. The research was conducted based on Hackman and Oldham's (1980) Job Diagnostic Survey. Regression analysis and a univariate ANOVA…

  19. A Quantitative Content Analysis of Mercer University MEd, EdS, and Doctoral Theses

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Gaiek, Lura S.; White, Torian A.; Slappey, Lisa A.; Chastain, Andrea; Harris, Rose Prejean

    2010-01-01

    Quantitative content analysis of a body of research not only helps budding researchers understand the culture, language, and expectations of scholarship, it helps identify deficiencies and inform policy and practice. Because of these benefits, an analysis of a census of 980 Mercer University MEd, EdS, and doctoral theses was conducted. Each thesis…

  20. Some remarks on the quantitative analysis of behavior

    PubMed Central

    Marr, M. Jackson

    1989-01-01

    This paper discusses similarities between the mathematization of operant behavior and the early history of the most mathematical of sciences—physics. Galileo explored the properties of motion without dealing with the causes of motion, focusing on changes in motion. Newton's dynamics were concerned with the action of forces as causes of change. Skinner's rationale for using rate to describe behavior derived from an interest in changes in rate. Reinforcement has played the role of force in the dynamics of behavior. Behavioral momentum and maximization have received mathematical formulations in behavior analysis. Yet to be worked out are the relations between molar and molecular formulations of behavioral theory. PMID:22478028

  1. Identifying severity of electroporation through quantitative image analysis

    NASA Astrophysics Data System (ADS)

    Morshed, Bashir I.; Shams, Maitham; Mussivand, Tofy

    2011-04-01

    Electroporation is the formation of reversible hydrophilic pores in the cell membrane under electric fields. Severity of electroporation is challenging to measure and quantify. An image analysis method is developed, and the initial results with a fabricated microfluidic device are reported. The microfluidic device contains integrated microchannels and coplanar interdigitated electrodes allowing low-voltage operation and low-power consumption. Noninvasive human buccal cell samples were specifically stained, and electroporation was induced. Captured image sequences were analyzed for pixel color ranges to quantify the severity of electroporation. The method can detect even a minor occurrence of electroporation and can perform comparative studies.
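
    The pixel colour-range counting described above might be sketched as follows, assuming `frame` is an RGB image (height x width x 3, uint8) from the captured sequence; the colour window for the stain is a made-up placeholder rather than the authors' calibrated range.

      import numpy as np

      def stained_fraction(frame, lo=(120, 0, 0), hi=(255, 80, 80)):
          """Fraction of pixels whose RGB values fall inside the stain colour range."""
          lo = np.array(lo)
          hi = np.array(hi)
          in_range = np.all((frame >= lo) & (frame <= hi), axis=-1)
          return in_range.mean()

      # Severity could then be tracked as the stained fraction over the image sequence:
      # severity_curve = [stained_fraction(f) for f in frames]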

  2. Quantitative Trait Locus Analysis of Mating Behavior and Male Sex Pheromones in Nasonia Wasps.

    PubMed

    Diao, Wenwen; Mousset, Mathilde; Horsburgh, Gavin J; Vermeulen, Cornelis J; Johannes, Frank; van de Zande, Louis; Ritchie, Michael G; Schmitt, Thomas; Beukeboom, Leo W

    2016-01-01

    A major focus in speciation genetics is to identify the chromosomal regions and genes that reduce hybridization and gene flow. We investigated the genetic architecture of mating behavior in the parasitoid wasp species pair Nasonia giraulti and Nasonia oneida that exhibit strong prezygotic isolation. Behavioral analysis showed that N. oneida females had consistently higher latency times, and broke off the mating sequence more often in the mounting stage when confronted with N. giraulti males compared with males of their own species. N. oneida males produce a lower quantity of the long-range male sex pheromone (4R,5S)-5-hydroxy-4-decanolide (RS-HDL). Crosses between the two species yielded hybrid males with various pheromone quantities, and these males were used in mating trials with females of either species to measure female mate discrimination rates. A quantitative trait locus (QTL) analysis involving 475 recombinant hybrid males (F2), 2148 reciprocally backcrossed females (F3), and a linkage map of 52 equally spaced neutral single nucleotide polymorphism (SNP) markers plus SNPs in 40 candidate mating behavior genes revealed four QTL for male pheromone amount, depending on partner species. Our results demonstrate that the RS-HDL pheromone plays a role in the mating system of N. giraulti and N. oneida, but also that additional communication cues are involved in mate choice. No QTL were found for female mate discrimination, which points at a polygenic architecture of female choice with strong environmental influences. PMID:27172207

  3. STIM evaluation in GeoPIXE to complement the quantitative dynamic analysis

    NASA Astrophysics Data System (ADS)

    Pallon, J.; Ryan, C. G.; Arteaga Marrero, N.; Elfman, M.; Kristiansson, P.; Nilsson, E. J. C.; Nilsson, C.

    2009-06-01

    The GeoPIXE software for quantitative PIXE trace element imaging and analysis is a well established package for evaluation of characteristic X-ray data for both PIXE and SXRF. For the case of microbeam applications on semi-thick samples knowledge of the local areal density distribution is important for precise quantification. A technique is reported to achieve this using the measurement of beam particle energy loss as it traverses the sample, as in scanning transmission ion microscopy (STIM). New functionality is added to the GeoPIXE code through integration of routines for STIM sorting of event-by-event data to create elemental maps of the mean energy after traversing the sample. Integration of stopping powers for a given sample matrix then permits the measured energy loss to be related to the local areal density. In a further step, this information is used for X-ray absorption corrections made directly to the PIXE analysis results. As a complement, user-written plugins operating on single STIM spectra have been used to compare the estimated areal density from chosen spots with the corresponding values calculated with the new GeoPIXE routines. The additions made to the code allow a more precise quantification to be done on inhomogeneous, semi-thick samples.

  4. New tools for comparing microscopy images: quantitative analysis of cell types in Bacillus subtilis.

    PubMed

    van Gestel, Jordi; Vlamakis, Hera; Kolter, Roberto

    2015-02-15

    Fluorescence microscopy is a method commonly used to examine individual differences between bacterial cells, yet many studies still lack a quantitative analysis of fluorescence microscopy data. Here we introduce some simple tools that microbiologists can use to analyze and compare their microscopy images. We show how image data can be converted to distribution data. These data can be subjected to a cluster analysis that makes it possible to objectively compare microscopy images. The distribution data can further be analyzed using distribution fitting. We illustrate our methods by scrutinizing two independently acquired data sets, each containing microscopy images of a doubly labeled Bacillus subtilis strain. For the first data set, we examined the expression of srfA and tapA, two genes which are expressed in surfactin-producing and matrix-producing cells, respectively. For the second data set, we examined the expression of eps and tapA; these genes are expressed in matrix-producing cells. We show that srfA is expressed by all cells in the population, a finding which contrasts with a previously reported bimodal distribution of srfA expression. In addition, we show that eps and tapA do not always have the same expression profiles, despite being expressed in the same cell type: both operons are expressed in cell chains, while single cells mainly express eps. These findings exemplify that the quantification and comparison of microscopy data can yield insights that otherwise would go unnoticed. PMID:25448819

  5. Quantitative Trait Locus Analysis of Mating Behavior and Male Sex Pheromones in Nasonia Wasps

    PubMed Central

    Diao, Wenwen; Mousset, Mathilde; Horsburgh, Gavin J.; Vermeulen, Cornelis J.; Johannes, Frank; van de Zande, Louis; Ritchie, Michael G.; Schmitt, Thomas; Beukeboom, Leo W.

    2016-01-01

    A major focus in speciation genetics is to identify the chromosomal regions and genes that reduce hybridization and gene flow. We investigated the genetic architecture of mating behavior in the parasitoid wasp species pair Nasonia giraulti and Nasonia oneida that exhibit strong prezygotic isolation. Behavioral analysis showed that N. oneida females had consistently higher latency times, and broke off the mating sequence more often in the mounting stage when confronted with N. giraulti males compared with males of their own species. N. oneida males produce a lower quantity of the long-range male sex pheromone (4R,5S)-5-hydroxy-4-decanolide (RS-HDL). Crosses between the two species yielded hybrid males with various pheromone quantities, and these males were used in mating trials with females of either species to measure female mate discrimination rates. A quantitative trait locus (QTL) analysis involving 475 recombinant hybrid males (F2), 2148 reciprocally backcrossed females (F3), and a linkage map of 52 equally spaced neutral single nucleotide polymorphism (SNP) markers plus SNPs in 40 candidate mating behavior genes revealed four QTL for male pheromone amount, depending on partner species. Our results demonstrate that the RS-HDL pheromone plays a role in the mating system of N. giraulti and N. oneida, but also that additional communication cues are involved in mate choice. No QTL were found for female mate discrimination, which points at a polygenic architecture of female choice with strong environmental influences. PMID:27172207

  6. Quantitative proteomic analysis of cold-responsive proteins in rice.

    PubMed

    Neilson, Karlie A; Mariani, Michael; Haynes, Paul A

    2011-05-01

    Rice is susceptible to cold stress and with a future of climatic instability we will be unable to produce enough rice to satisfy increasing demand. A thorough understanding of the molecular responses to thermal stress is imperative for engineering cultivars, which have greater resistance to low temperature stress. In this study we investigated the proteomic response of rice seedlings to 48, 72 and 96 h of cold stress at 12-14°C. The use of both label-free and iTRAQ approaches in the analysis of global protein expression enabled us to assess the complementarity of the two techniques for use in plant proteomics. The approaches yielded a similar biological response to cold stress despite a disparity in proteins identified. The label-free approach identified 236 cold-responsive proteins compared to 85 in iTRAQ results, with only 24 proteins in common. Functional analysis revealed differential expression of proteins involved in transport, photosynthesis, generation of precursor metabolites and energy; and, more specifically, histones and vitamin B biosynthetic proteins were observed to be affected by cold stress. PMID:21433000

  7. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the use of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562

  8. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed as detailed and as accurately as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along the front, the back and the thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness so that it will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).
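
    The study applies a 3D extension of elliptic Fourier analysis; as a simplified 2-D illustration of the underlying idea, the sketch below turns a closed outline into a small set of size-normalised Fourier coefficients that could then be fed into multivariate statistics or regressed onto stature, weight and hip circumference. The harmonic count and the toy outline are assumptions for illustration only.

      import numpy as np

      def fourier_descriptors(x, y, n_harmonics=10):
          # Treat the equally spaced, closed outline as a complex signal and keep
          # the low-order harmonics, normalised by the first harmonic (size).
          z = x + 1j * y
          coeffs = np.fft.fft(z) / z.size
          kept = np.concatenate([coeffs[1:n_harmonics + 1], coeffs[-n_harmonics:]])
          return kept / np.abs(coeffs[1])

      # Example: an ellipse-like torso cross-section with a small third harmonic.
      t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
      fd = fourier_descriptors(15 * np.cos(t), 10 * np.sin(t) + 0.5 * np.sin(3 * t))
      print(fd[:4])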

  9. [Quantitative analysis of seven phenolic acids in eight Yinqiao Jiedu serial preparations by quantitative analysis of multi-components with single-marker].

    PubMed

    Wang, Jun-jun; Zhang, Li; Guo, Qing; Kou, Jun-ping; Yu, Bo-yang; Gu, Dan-hua

    2015-04-01

    The study aims to develop a unified method to determine seven phenolic acids (neochlorogenic acid, chlorogenic acid, 4-caffeoylquinic acid, caffeic acid, isochlorogenic acid B, isochlorogenic acid A and isochlorogenic acid C) contained in honeysuckle flower, the monarch drug of all eight Yinqiao Jiedu serial preparations, using quantitative analysis of multi-components by a single marker (QAMS). Firstly, chlorogenic acid was used as the reference to obtain the average relative correction factors (RCFs) of the other phenolic acids relative to the reference; columns and instruments from different companies were used to validate the durability of the achieved RCFs at different levels of standard solutions; and honeysuckle flower extract was used as the reference substance to fix the positions of the chromatographic peaks. Secondly, the contents of the seven phenolic acids in eight different Yinqiao Jiedu serial preparation samples were calculated based on the durable RCFs. Finally, the quantitative results were compared between QAMS and the external standard (ES) method. The results showed that the durability of the achieved RCFs is good (RSD 0.80% - 2.56%), and that there is no difference between the quantitative results of QAMS and ES (relative average deviation < 0.93%). Thus QAMS can be successfully applied to the quantitative control of honeysuckle flower as principally prescribed in Yinqiao Jiedu serial preparations. PMID:26223132
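
    To make the single-marker calculation concrete, here is a small numerical sketch using one common formulation of the relative correction factor; the peak areas and concentrations are made-up placeholders, and this is not the paper's validated procedure.

      # RCF of analyte i against the reference s (from mixed standard solutions):
      #   f_i = (A_s * C_i) / (A_i * C_s)
      # Content of analyte i in a sample, with only the reference quantified conventionally:
      #   C_i = f_i * A_i * C_s / A_s

      def relative_correction_factor(a_ref, c_ref, a_i, c_i):
          return (a_ref * c_i) / (a_i * c_ref)

      def qams_content(area_i, rcf_i, area_ref, conc_ref):
          return rcf_i * area_i * conc_ref / area_ref

      # RCF established from standards (would be averaged over several levels in practice).
      f_neochlorogenic = relative_correction_factor(a_ref=1.00e6, c_ref=50.0,
                                                    a_i=0.85e6, c_i=50.0)

      # Quantify the analyte in a preparation from its peak area, the marker's peak
      # area, and the marker's externally determined concentration (all placeholders).
      print(qams_content(area_i=0.42e6, rcf_i=f_neochlorogenic,
                         area_ref=1.10e6, conc_ref=48.0))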

  10. Quantitative flow analysis of swimming dynamics with coherent Lagrangian vortices

    NASA Astrophysics Data System (ADS)

    Huhn, F.; van Rees, W. M.; Gazzola, M.; Rossinelli, D.; Haller, G.; Koumoutsakos, P.

    2015-08-01

    Undulatory swimmers flex their bodies to displace water, and in turn, the flow feeds back into the dynamics of the swimmer. At moderate Reynolds number, the resulting flow structures are characterized by unsteady separation and alternating vortices in the wake. We use the flow field from simulations of a two-dimensional, incompressible viscous flow of an undulatory, self-propelled swimmer and detect the coherent Lagrangian vortices in the wake to dissect the driving momentum transfer mechanisms. The detected material vortex boundary encloses a Lagrangian control volume that serves to track back the vortex fluid and record its circulation and momentum history. We consider two swimming modes: the C-start escape and steady anguilliform swimming. The backward advection of the coherent Lagrangian vortices elucidates the geometry of the vorticity field and allows for monitoring the gain and decay of circulation and momentum transfer in the flow field. For steady swimming, momentum oscillations of the fish can largely be attributed to the momentum exchange with the vortex fluid. For the C-start, an additionally defined jet fluid region turns out to balance the high momentum change of the fish during the rapid start.

  11. Quantitative Analysis of Spectral Impacts on Silicon Photodiode Radiometers: Preprint

    SciTech Connect

    Myers, D. R.

    2011-04-01

    Inexpensive broadband pyranometers with silicon photodiode detectors have a non-uniform spectral response over the spectral range of 300-1100 nm. The response region includes only about 70% to 75% of the total energy in the terrestrial solar spectral distribution from 300 nm to 4000 nm. The solar spectrum constantly changes with solar position and atmospheric conditions. Relative spectral distributions of diffuse hemispherical irradiance sky radiation and total global hemispherical irradiance are drastically different. This analysis convolves a typical photodiode response with SMARTS 2.9.5 spectral model spectra for different sites and atmospheric conditions. Differences in solar component spectra lead to differences on the order of 2% in global hemispherical and 5% or more in diffuse hemispherical irradiances from silicon radiometers. The result is that errors of more than 7% can occur in the computation of direct normal irradiance from global hemispherical irradiance and diffuse hemispherical irradiance using these radiometers.
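
    A hedged sketch of the spectral-mismatch calculation described above: weight a solar spectrum by a silicon-photodiode spectral response and compare the detector-weighted signal with the true broadband irradiance. The toy Gaussian spectra and piecewise-linear response below stand in for SMARTS model spectra and a real photodiode curve.

      import numpy as np

      wl = np.arange(300.0, 4001.0, 5.0)                      # wavelength grid, nm
      spectrum = np.exp(-((wl - 800.0) / 600.0) ** 2)         # toy spectral irradiance
      response = np.where((wl >= 300) & (wl <= 1100),
                          np.clip((wl - 300.0) / 650.0, 0.0, 1.0), 0.0)  # toy Si response

      dx = 5.0                                                # nm, grid spacing
      broadband = (spectrum * dx).sum()                       # "true" broadband irradiance
      detector_signal = (spectrum * response * dx).sum()      # what the photodiode sees
      in_band = (spectrum[wl <= 1100] * dx).sum()
      print(f"energy fraction within 300-1100 nm: {in_band / broadband:.2f}")

      # A single calibration factor derived under one reference spectrum is applied to
      # all sky conditions; under a shifted spectrum (e.g. diffuse sky) errors appear.
      calibration = broadband / detector_signal
      spectrum2 = np.exp(-((wl - 550.0) / 300.0) ** 2)        # toy blue-shifted spectrum
      true2 = (spectrum2 * dx).sum()
      est2 = calibration * (spectrum2 * response * dx).sum()
      print(f"relative error under changed spectrum: {(est2 - true2) / true2:+.1%}")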

  12. Quantitative radiographic analysis of fiber reinforced polymer composites.

    PubMed

    Baidya, K P; Ramakrishna, S; Rahman, M; Ritchie, A

    2001-01-01

    X-ray radiographic examination of the bone fracture healing process is a widely used method in the treatment and management of patients. Medical devices made of metallic alloys reportedly produce considerable artifacts that make the interpretation of radiographs difficult. Fiber reinforced polymer composite materials have been proposed to replace metallic alloys in certain medical devices because of their radiolucency, light weight, and tailorable mechanical properties. The primary objective of this paper is to provide a comparative radiographic analysis of different fiber reinforced polymer composites that are considered suitable for biomedical applications. The composite materials investigated consist of glass, aramid (Kevlar-29), and carbon reinforcement fibers, and epoxy and polyether-ether-ketone (PEEK) matrices. The total mass attenuation coefficient of each material was measured using clinical X-rays (50 keV). The carbon fiber reinforced composites were found to be more radiolucent than the glass and Kevlar fiber reinforced composites. PMID:11261603

  13. Automated monitoring and quantitative analysis of feeding behaviour in Drosophila

    PubMed Central

    Itskov, Pavel M.; Moreira, José-Maria; Vinnik, Ekaterina; Lopes, Gonçalo; Safarik, Steve; Dickinson, Michael H.; Ribeiro, Carlos

    2014-01-01

    Food ingestion is one of the defining behaviours of all animals, but its quantification and analysis remain challenging. This is especially the case for feeding behaviour in small, genetically tractable animals such as Drosophila melanogaster. Here, we present a method based on capacitive measurements, which allows the detailed, automated and high-throughput quantification of feeding behaviour. Using this method, we were able to measure the volume ingested in single sips of an individual, and monitor the absorption of food with high temporal resolution. We demonstrate that flies ingest food by rhythmically extending their proboscis with a frequency that is not modulated by the internal state of the animal. Instead, hunger and satiety homeostatically modulate the microstructure of feeding. These results highlight similarities of food intake regulation between insects, rodents, and humans, pointing to a common strategy in how the nervous systems of different animals control food intake. PMID:25087594

  14. Digital photogrammetry for quantitative wear analysis of retrieved TKA components.

    PubMed

    Grochowsky, J C; Alaways, L W; Siskey, R; Most, E; Kurtz, S M

    2006-11-01

    The use of new materials in knee arthroplasty demands a way to accurately quantify wear in retrieved components. Methods such as damage scoring, coordinate measurement, and in vivo wear analysis have been used in the past. The limitations of these methods illustrate the need for a different methodology that can accurately quantify wear, is relatively easy to perform, and uses a minimal amount of expensive equipment. Off-the-shelf digital photogrammetry represents a potentially quick and easy alternative using readily available equipment. Eighty tibial inserts were visually examined for front- and backside wear and digitally photographed in the presence of two calibrated reference fields. All images were segmented (via manual and automated algorithms) using Adobe Photoshop and National Institutes of Health ImageJ. Finally, wear was determined using ImageJ and Rhinoceros software. The absolute accuracy of the method and its repeatability/reproducibility by different observers were measured in order to determine the uncertainty of the wear measurements. To determine whether variation in wear measurements was due to implant design, 35 implants of the three most prevalent designs were subjected to retrieval analysis. The overall accuracy of area measurements was 97.8%. The error in automated segmentation was found to be significantly lower than that of manual segmentation. The photogrammetry method was found to be reasonably accurate and repeatable in measuring 2-D areas and applicable to determining wear. There was no significant variation in uncertainty detected among different implant designs. Photogrammetry has a broad range of applicability since it is size- and design-independent. A minimal amount of off-the-shelf equipment is needed for the procedure and no proprietary knowledge of the implant is needed. PMID:16649169
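
    A minimal sketch of the calibrated-area step, assuming a segmented reference object of known physical size fixes the millimetre-per-pixel scale, which then converts a segmented wear region from pixels to square millimetres; the masks and sizes below are hypothetical placeholders rather than the retrieval-study data.

      import numpy as np

      def mm_per_pixel(reference_mask, reference_area_mm2):
          """Scale derived from a segmented reference object of known physical area."""
          return np.sqrt(reference_area_mm2 / reference_mask.sum())

      def wear_area_mm2(wear_mask, scale_mm_per_px):
          return wear_mask.sum() * scale_mm_per_px ** 2

      # Example with synthetic masks: a 100 x 100 px reference square of 25 x 25 mm.
      ref = np.zeros((1000, 1000), dtype=bool)
      ref[:100, :100] = True
      wear = np.zeros_like(ref)
      wear[200:350, 200:300] = True

      scale = mm_per_pixel(ref, 625.0)
      print(wear_area_mm2(wear, scale))      # wear area in mm^2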

  15. Quantitative analysis by mid-infrared spectrometry in food and agro-industrial fields

    NASA Astrophysics Data System (ADS)

    Dupuy, Nathalie; Huvenne, J. P.; Sombret, B.; Legrand, P.

    1993-03-01

    Thanks to the Fourier transform, infrared spectroscopy can now serve as a state-of-the-art tool in quality control laboratories, given its precision and the time it saves compared with traditional analysis methods such as HPLC chromatography. Moreover, the increasing number of new mathematical regression methods, such as Partial Least Squares (PLS) regression, allows multicomponent quantitative analysis of mixtures. Nevertheless, the efficiency of infrared spectrometry as a quantitative analysis method often depends on the choice of an adequate presentation of the sample. In this document, we demonstrate several techniques, such as diffuse reflectance and Attenuated Total Reflectance (ATR), which can be used according to the various physical states of the mixtures. The quantitative analysis of real samples from the food industry enables us to estimate its precision. For instance, the analysis of the three main components (glucose, fructose and maltose) in glucose syrups can be done (using ATR) with a precision in the region of 3%, whereas the time required to obtain an analysis report is about 5 minutes. Finally, multicomponent quantitative analysis is quite feasible by mid-IR spectroscopy.

  16. A quantitative analysis to objectively appraise drought indicators and model drought impacts

    NASA Astrophysics Data System (ADS)

    Bachmair, S.; Svensson, C.; Hannaford, J.; Barker, L. J.; Stahl, K.

    2016-07-01

    Drought monitoring and early warning is an important measure to enhance resilience towards drought. While there are numerous operational systems using different drought indicators, there is no consensus on which indicator best represents drought impact occurrence for any given sector. Furthermore, thresholds are widely applied in these indicators but, to date, little empirical evidence exists as to which indicator thresholds trigger impacts on society, the economy, and ecosystems. The main obstacle for evaluating commonly used drought indicators is a lack of information on drought impacts. Our aim was therefore to exploit text-based data from the European Drought Impact report Inventory (EDII) to identify indicators that are meaningful for region-, sector-, and season-specific impact occurrence, and to empirically determine indicator thresholds. In addition, we tested the predictability of impact occurrence based on the best-performing indicators. To achieve these aims we applied a correlation analysis and an ensemble regression tree approach, using Germany and the UK (the most data-rich countries in the EDII) as test beds. As candidate indicators we chose two meteorological indicators (Standardized Precipitation Index, SPI, and Standardized Precipitation Evaporation Index, SPEI) and two hydrological indicators (streamflow and groundwater level percentiles). The analysis revealed that accumulation periods of SPI and SPEI best linked to impact occurrence are longer for the UK compared with Germany, but there is variability within each country, among impact categories and, to some degree, seasons. The median of regression tree splitting values, which we regard as estimates of thresholds of impact occurrence, was around -1 for SPI and SPEI in the UK; distinct differences between northern/northeastern vs. southern/central regions were found for Germany. Predictions with the ensemble regression tree approach yielded reasonable results for regions with good impact data

  17. A quantitative analysis to objectively appraise drought indicators and model drought impacts

    NASA Astrophysics Data System (ADS)

    Bachmair, S.; Svensson, C.; Hannaford, J.; Barker, L. J.; Stahl, K.

    2015-09-01

    Drought monitoring and early warning is an important measure to enhance resilience towards drought. While there are numerous operational systems using different drought indicators, there is no consensus on which indicator best represents drought impact occurrence for any given sector. Furthermore, thresholds are widely applied in these indicators but, to date, little empirical evidence exists as to which indicator thresholds trigger impacts on society, the economy, and ecosystems. The main obstacle for evaluating commonly used drought indicators is a lack of information on drought impacts. Our aim was therefore to exploit text-based data from the European Drought Impact report Inventory (EDII) to identify indicators which are meaningful for region-, sector-, and season-specific impact occurrence, and to empirically determine indicator thresholds. In addition, we tested the predictability of impact occurrence based on the best performing indicators. To achieve these aims we applied a correlation analysis and an ensemble regression tree approach ("random forest"), using Germany and the UK (the most data-rich countries in the EDII) as a testbed. As candidate indicators we chose two meteorological indicators (Standardized Precipitation Index (SPI) and Standardized Precipitation Evaporation Index (SPEI)) and two hydrological indicators. The analysis revealed that accumulation periods of SPI and SPEI best linked to impact occurrence are longer for the UK compared with Germany, but there is variability within each country, among impact categories and, to some degree, seasons. The median of regression tree splitting values, which we regard as estimates of thresholds of impact occurrence, was around -1 for SPI and SPEI in the UK; distinct differences between northern/northeastern vs. southern/central regions were found for Germany. Predictions with the ensemble regression tree approach yielded reasonable results for regions with good impact data coverage. The predictions
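
    As an illustration of the ensemble-tree idea, the sketch below fits a random forest linking indicator values to binary impact occurrence and takes the median of the splitting values on one indicator as an empirical threshold, in the spirit described above. The synthetic SPI data, the choice of a classifier for binary occurrence, and the forest settings are assumptions, not EDII results.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      spi = rng.normal(0, 1, size=(300, 3))                 # columns: SPI-3, SPI-6, SPI-12
      impact = (spi[:, 1] < -1.0 + rng.normal(0, 0.3, 300)).astype(int)

      forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(spi, impact)

      feature_index = 1                                     # SPI-6
      splits = []
      for est in forest.estimators_:
          tree = est.tree_
          mask = tree.feature == feature_index              # internal nodes splitting on SPI-6
          splits.extend(tree.threshold[mask])

      print("median SPI-6 splitting value:", np.median(splits))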

  18. Putting tools in the toolbox: Development of a free, open-source toolbox for quantitative image analysis of porous media.

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.

    2014-12-01

    X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil and rock. Given that most synchrotron facilities have user programs which grant academic researchers access to facilities and x-ray imaging equipment free of charge, a key limitation for small research groups interested in conducting x-ray imaging experiments is the financial cost associated with post-experiment data analysis. While the cost of high-performance computing hardware continues to decrease, the expense of licensing commercial software packages for quantitative image analysis continues to increase, with current prices as high as $24,000 (USD) for a single-user license. As construction of the Nation's newest synchrotron accelerator nears completion, a significant effort is being made here at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required for performing sophisticated quantitative analysis of multidimensional porous media data sets collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite in development at BNL, including major design decisions, a demonstration of several test cases illustrating currently available quantitative tools for the analysis and characterization of multidimensional porous media image data sets, and plans for their future development.

  19. A quantitative analysis of rock cliff erosion environments

    NASA Astrophysics Data System (ADS)

    Lim, M.; Rosser, N.; Petley, D. N.; Norman, E. C.; Barlow, J.

    2009-12-01

    The spatial patterns and temporal sequencing of failures from coastal rock cliffs are complex and typically generate weak correlations with environmental variables such as tidal inundation, wave energy, wind and rain. Consequently, understanding of rock cliff behaviour, its response to predicted changes in environmental forcing and, more specifically, the interaction between marine and climatic factors in influencing failure processes has remained limited. This work presents the results from the first attempt to characterise and quantify the conditions on coastal cliffs that lead to accelerated rates of material detachment. The rate of change in an 80 m high section of coastal rock cliffs has been surveyed annually with high-resolution terrestrial laser scanning (TLS). The rockfall data have been analysed according to a simplified source geology that exhibit distinct magnitude-frequency distributions relating to the dominance of particular failure types. An integrated network of sensors and instrumentation designed to reflect the lithological control on failure has been installed to examine both the distinction between prevailing conditions and those affecting the local cliff environment and the physical response of different rock types to micro-climatic processes. The monitoring system records near-surface rock strain, temperature, moisture and micro-seismic displacement in addition to air temperature, humidity, radiation, precipitation, water-level and three-dimensional wind characteristics. A characteristic environmental signal, unique to the cliff face material, has been identified that differs substantially from that experienced by the surrounding area; suggesting that established methods of meteorological and tidal data collection are insufficient and inappropriate to represent erosive processes. The interaction between thermo- and hydro-dynamics of the cliff environment and the physical response of the rock highlights the composite environmental effects

  20. Quantitative Nutrient Limitation Analysis of Global Forests by Remote Sensing

    NASA Astrophysics Data System (ADS)

    Lopez, A. M.; Badgley, G. M.; Field, C. B.

    2015-12-01

    Nutrient availability in terrestrial ecosystems may be the primary determinant of the long-term carbon storage capacity of vegetation. Both nutrient availability and carbon storage capacity are highly uncertain and limit our ability to predict atmospheric CO2 concentrations. Terrestrial vegetation, especially forests, play a critical role in regulating the global carbon cycle and Earth's climate by sequestering carbon from the atmosphere. The broad relationship between nutrient availability and increased biomass production can be captured using remotely-sensed spectral information. We develop an approach to estimate total nutrient availability in 848 global forest sites at 1-km spatial resolution by combining the ecological principle of functional convergence with MODIS gross primary productivity (GPP) and evapotranspiration (ET) products from 2000-2013. Convergence in the relationship between maximum GPP and ET of nutrient-rich forests indicate that any sites deviating from this upper-limit are associated with a lower availability of nutrients. This method offers a way to examine the severity, as well as the spatial extent of nutrient limitation at the global scale. We find that the degree to which forests are nutrient limited range between 0% and 81% with an average limitation of 16 ± 17%. Our method agrees with regional nutrient gradients (i.e. SW-NE Amazon), but does not tightly correspond with recently published nutrient limitation classification standards (Fernandez-Martinez et al., 2014). A global terrestrial nutrient limitation map can assist in diagnosing the health of vegetation while removing the necessity for extensive field sampling or local nutrient addition experiments. Further research will expand the study sites to obtain a complete global terrestrial nutrient limitation map.
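
    A rough sketch of the envelope approach, assuming the nutrient-unlimited upper limit of GPP at a given ET is estimated as a high quantile within ET bins and each site's limitation is its fractional shortfall from that envelope; the binning, quantile, and synthetic site data are illustrative choices only, not the authors' method details.

      import numpy as np

      rng = np.random.default_rng(1)
      et = rng.uniform(200, 1200, 848)                          # mm yr-1, per forest site
      gpp = 2.2 * et * rng.uniform(0.2, 1.0, et.size)           # gC m-2 yr-1 (toy values)

      # Upper envelope: 95th percentile of GPP within each ET bin.
      bins = np.linspace(et.min(), et.max(), 21)
      which = np.digitize(et, bins)
      envelope = np.array([np.quantile(gpp[which == b], 0.95) if np.any(which == b) else np.nan
                           for b in range(1, bins.size)])
      gpp_max = envelope[np.clip(which - 1, 0, envelope.size - 1)]

      # Limitation: fractional shortfall from the envelope (0 = unlimited, 1 = fully limited).
      limitation = np.clip(1.0 - gpp / gpp_max, 0.0, 1.0)
      print(f"mean limitation: {np.nanmean(limitation):.0%}")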

  1. High throughput comparative proteome analysis using a quantitative cysteinyl-peptide enrichment technology

    SciTech Connect

    Liu, Tao; Qian, Weijun; Strittmatter, Eric F.; Camp, David G.; Anderson, Gordon A.; Thrall, Brian D.; Smith, Richard D.

    2004-09-15

    A new quantitative cysteinyl-peptide enrichment technology (QCET) was developed to achieve higher efficiency, greater dynamic range, and higher throughput in quantitative proteomics that use stable-isotope labeling techniques combined with high resolution liquid chromatography (LC)-mass spectrometry (MS). This approach involves (18)O labeling of tryptic peptides, high efficiency enrichment of cysteine-containing peptides, and confident protein identification and quantification using the accurate mass and time tag strategy. Proteome profiling of naive and in vitro-differentiated human mammary epithelial cells using QCET resulted in the identification and quantification of 603 proteins in a single LC-Fourier transform ion cyclotron resonance MS analysis. Advantages of this technology include: (1) a simple, highly efficient method for enriching cysteinyl-peptides; (2) a high throughput strategy suitable for extensive proteome analysis; and (3) improved labeling efficiency for better quantitative measurements. This technology enhances both the functional analysis of biological systems and the detection of potential clinical biomarkers.

  2. Statistical shape analysis using 3D Poisson equation-A quantitatively validated approach.

    PubMed

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied to real shape data sets of brain structures. PMID:26874288

  3. Quantitative autoradiography with radiopharmaceuticals, Part 1: Digital film-analysis system by videodensitometry: concise communication

    SciTech Connect

    Yonekura, Y.; Brill, A.B.; Som, P.; Bennett, G.W.; Fand, I.

    1983-03-01

    A simple low-cost digital film-analysis system using videodensitometry was developed to quantitate autoradiograms. It is based on a TV-film analysis system coupled to a minicomputer. Digital sampling of transmitted light intensities through the autoradiogram is performed with 8-bit gray levels according to the selected array size (128 X 128 to 1024 X 1024). The performance characteristics of the system provide sufficient stability, uniformity, linearity, and intensity response for use in quantitative analysis. Digital images of the autoradiograms are converted to radioactivity content, pixel by pixel, using step-wedge standards. This type of low-cost system can be installed on conventional mini-computers commonly used in modern nuclear medical facilities. Quantitative digital autoradiography can play an important role, with applications stretching from dosimetry calculations of radiopharmaceuticals to metabolic studies in conjunction with positron-emission tomography.
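
    A step-wedge calibration of the kind described can be sketched as follows; the gray levels, blank value, and activities are invented for illustration, and a simple linear fit of activity against optical density stands in for whatever response function the original system used.

      import numpy as np

      # Hypothetical step-wedge standards: measured 8-bit gray level vs.
      # known activity (arbitrary units). Transmitted intensity falls with
      # activity, so optical density ~ -log10(gray / gray_blank).
      gray_blank = 250.0
      wedge_gray = np.array([230, 200, 160, 120, 90, 60])
      wedge_act = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 9.0])

      od = -np.log10(wedge_gray / gray_blank)
      slope, intercept = np.polyfit(od, wedge_act, 1)   # linear OD-activity fit

      def pixel_activity(gray):
          """Convert a pixel's gray level to radioactivity via the wedge fit."""
          return slope * -np.log10(gray / gray_blank) + intercept

      image = np.array([[210, 150], [100, 70]], float)  # toy autoradiogram
      print(np.round(pixel_activity(image), 2))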

  4. Hydrological drought types in cold climates: quantitative analysis of causing factors and qualitative survey of impacts

    NASA Astrophysics Data System (ADS)

    Van Loon, A. F.; Ploum, S. W.; Parajka, J.; Fleig, A. K.; Garnier, E.; Laaha, G.; Van Lanen, H. A. J.

    2015-04-01

    For drought management and prediction, knowledge of causing factors and socio-economic impacts of hydrological droughts is crucial. Propagation of meteorological conditions in the hydrological cycle results in different hydrological drought types that require separate analysis. In addition to the existing hydrological drought typology, we here define two new drought types related to snow and ice. A snowmelt drought is a deficiency in the snowmelt discharge peak in spring in snow-influenced basins and a glaciermelt drought is a deficiency in the glaciermelt discharge peak in summer in glacierised basins. In 21 catchments in Austria and Norway we studied the meteorological conditions in the seasons preceding and at the time of snowmelt and glaciermelt drought events. Snowmelt droughts in Norway were mainly controlled by below-average winter precipitation, while in Austria both temperature and precipitation played a role. For glaciermelt droughts, the effect of below-average summer air temperature was dominant, both in Austria and Norway. Subsequently, we investigated the impacts of temperature-related drought types (i.e. snowmelt and glaciermelt drought, but also cold and warm snow season drought and rain-to-snow-season drought). In historical archives and drought databases for the US and Europe many impacts were found that can be attributed to these temperature-related hydrological drought types, mainly in the agriculture and electricity production (hydropower) sectors. However, drawing conclusions on the frequency of occurrence of different drought types from reported impacts is difficult, mainly because of reporting biases and the inevitably limited spatial and temporal scales of the information. Finally, this study shows that complete integration of quantitative analysis of causing factors and qualitative analysis of impacts of temperature-related droughts is not yet possible. Analysis of selected events, however, points out that it can be a promising research

  5. Mammographic quantitative image analysis and biologic image composition for breast lesion characterization and classification

    SciTech Connect

    Drukker, Karen; Giger, Maryellen L.; Li, Hui; Duewer, Fred; Malkov, Serghei; Joe, Bonnie; Kerlikowske, Karla; Shepherd, John A.; Flowers, Chris I.; Drukteinis, Jennifer S.

    2014-03-15

    Purpose: To investigate whether biologic image composition of mammographic lesions can improve upon existing mammographic quantitative image analysis (QIA) in estimating the probability of malignancy. Methods: The study population consisted of 45 breast lesions imaged with dual-energy mammography prior to breast biopsy with final diagnosis resulting in 10 invasive ductal carcinomas, 5 ductal carcinomas in situ, 11 fibroadenomas, and 19 other benign diagnoses. Analysis was threefold: (1) The raw low-energy mammographic images were analyzed with an established in-house QIA method, “QIA alone,” (2) the three-compartment breast (3CB) composition measure—derived from the dual-energy mammography—of water, lipid, and protein thickness was assessed, “3CB alone”, and (3) information from QIA and 3CB was combined, “QIA + 3CB.” Analysis was initiated from radiologist-indicated lesion centers and was otherwise fully automated. Steps of the QIA and 3CB methods were lesion segmentation, characterization, and subsequent classification for malignancy in leave-one-case-out cross-validation. Performance assessment included box plots, Bland–Altman plots, and Receiver Operating Characteristic (ROC) analysis. Results: The area under the ROC curve (AUC) for distinguishing between benign and malignant lesions (invasive and DCIS) was 0.81 (standard error 0.07) for the “QIA alone” method, 0.72 (0.07) for the “3CB alone” method, and 0.86 (0.04) for “QIA + 3CB” combined. The difference in AUC was 0.043 between “QIA + 3CB” and “QIA alone” but failed to reach statistical significance (95% confidence interval [–0.17 to +0.26]). Conclusions: In this pilot study analyzing the new 3CB imaging modality, knowledge of the composition of breast lesions and their periphery appeared additive in combination with existing mammographic QIA methods for the distinction between different benign and malignant lesion types.
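
    The leave-one-case-out classification and ROC comparison outlined above can be mimicked with scikit-learn; the feature matrices below are random stand-ins for the QIA and 3CB descriptors and a logistic regression stands in for the unspecified classifier, so the numbers only demonstrate the mechanics.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import LeaveOneOut, cross_val_predict
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      n = 45                                   # lesions, as in the study design
      y = rng.integers(0, 2, n)                # 1 = malignant, 0 = benign (synthetic)
      qia = rng.normal(size=(n, 6))            # stand-in QIA features
      c3b = rng.normal(size=(n, 3))            # stand-in 3CB composition features

      def loo_auc(X, y):
          """AUC from leave-one-case-out predicted malignancy probabilities."""
          clf = LogisticRegression(max_iter=1000)
          p = cross_val_predict(clf, X, y, cv=LeaveOneOut(),
                                method="predict_proba")[:, 1]
          return roc_auc_score(y, p)

      print("QIA alone :", round(loo_auc(qia, y), 2))
      print("3CB alone :", round(loo_auc(c3b, y), 2))
      print("QIA + 3CB :", round(loo_auc(np.hstack([qia, c3b]), y), 2))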

  6. Segmentation and learning in the quantitative analysis of microscopy images

    NASA Astrophysics Data System (ADS)

    Ruggiero, Christy; Ross, Amy; Porter, Reid

    2015-02-01

    In material science and bio-medical domains the quantity and quality of microscopy images is rapidly increasing and there is a great need to automatically detect, delineate and quantify particles, grains, cells, neurons and other functional "objects" within these images. These are challenging problems for image processing because of the variability in object appearance that inevitably arises in real world image acquisition and analysis. One of the most promising (and practical) ways to address these challenges is interactive image segmentation. These algorithms are designed to incorporate input from a human operator to tailor the segmentation method to the image at hand. Interactive image segmentation is now a key tool in a wide range of applications in microscopy and elsewhere. Historically, interactive image segmentation algorithms have tailored segmentation on an image-by-image basis, and information derived from operator input is not transferred between images. But recently there has been increasing interest to use machine learning in segmentation to provide interactive tools that accumulate and learn from the operator input over longer periods of time. These new learning algorithms reduce the need for operator input over time, and can potentially provide a more dynamic balance between customization and automation for different applications. This paper reviews the state of the art in this area, provides a unified view of these algorithms, and compares the segmentation performance of various design choices.

  7. Quantitative analysis of cardiac lesions in chronic canine chagasic cardiomyopathy.

    PubMed

    Caliari, Marcelo Vidigal; do Pilar Machado, Raquel; de Lana, Marta; Caja, Rosângela Aparecida França; Carneiro, Cláudia Martins; Bahia, Maria Teresinha; dos Santos, César Augusto Bueno; Magalhaes, Gustavo Albergaria; Sampaio, Ivan Barbosa Machado; Tafuri, Washington Luiz

    2002-01-01

    Lesions observed in chronic chagasic cardiopathy frequently produce electrocardiographic alterations and affect cardiac function. Through a computerized morphometrical analysis we quantified the areas occupied by cardiac muscle, connective and adipose tissues in the right atrium of dogs experimentally infected with Trypanosoma cruzi. All of the infected dogs showed chronic myocarditis with variable levels of reduction of cardiac muscle, fibrosis and adipose tissue replacement. In the atrial myocardium of dogs infected with Be78 and Be62, cardiac muscle represented 34 and 50%, fibrosis 28 and 32%, and adipose tissue 38 and 18%, respectively. The fibrosis observed was both diffuse and focal and mostly intrafascicular, either partially or completely interrupting the path of muscle bundles. Such histological alterations probably contributed to the appearance of the electrocardiographic disturbances verified in 10 out of 11 dogs, which are also common in human chronic chagasic cardiopathy. Fibrosis was the most important microscopic finding, since it produces rearrangements of collagen fibers in relation to myocardiocytes that change the anatomical physiognomy and mechanical behavior of the myocardium. These abnormalities can contribute to the appearance of cardiac malfunction, arrhythmias and congestive cardiac insufficiency, as observed in two of the analyzed dogs. Strain Be78 caused greater destruction of atrial cardiac muscle than strain Be62. PMID:12436168

  8. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
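
    As a rough illustration of the prediction sub-task, the sketch below pairs a crude feature-reduction step (a simple variance ranking, standing in for the paper's Genetic Algorithm) with k-Nearest Neighbours regression on synthetic spectra; all data and parameter choices are invented.

      import numpy as np
      from sklearn.neighbors import KNeighborsRegressor

      rng = np.random.default_rng(2)
      n_mix, n_wavenumbers = 30, 500
      conc = rng.uniform(0, 1, n_mix)                       # known concentrations
      peaks = np.exp(-((np.arange(n_wavenumbers) - 120) / 8.0) ** 2)
      spectra = np.outer(conc, peaks) + 0.05 * rng.normal(size=(n_mix, n_wavenumbers))

      # Data reduction: keep the wavenumbers that vary most across mixtures
      # (a stand-in for the Genetic Algorithm selection used in the paper).
      keep = np.argsort(spectra.var(axis=0))[-20:]
      X = spectra[:, keep]

      knn = KNeighborsRegressor(n_neighbors=3).fit(X[:-5], conc[:-5])
      print("predicted:", np.round(knn.predict(X[-5:]), 2))
      print("true     :", np.round(conc[-5:], 2))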

  9. Funtools: Fits Users Need Tools for Quick, Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Mandel, Eric; Brederkamp, Joe (Technical Monitor)

    2001-01-01

    The Funtools project arose out of conversations with astronomers about the decline in their software development efforts over the past decade. A stated reason for this decline is that it takes too much effort to master one of the existing FITS libraries simply in order to write a few analysis programs. This problem is exacerbated by the fact that astronomers typically develop new programs only occasionally, and the long interval between coding efforts often necessitates re-learning the FITS interfaces. We therefore set ourselves the goal of developing a minimal buy-in FITS library for researchers who are occasional (but serious) coders. In this case, "minimal buy-in" meant "easy to learn, easy to use, and easy to re-learn next month". Based on conversations with astronomers interested in writing code, we concluded that this goal could be achieved by emphasizing two essential capabilities. The first was the ability to write FITS programs without knowing much about FITS, i.e., without having to deal with the arcane rules for generating a properly formatted FITS file. The second was to support the use of already-familiar C/Unix facilities, especially C structs and Unix stdio. Taken together, these two capabilities would allow researchers to leverage their existing programming expertise while minimizing the need to learn new and complex coding rules.

  10. Quantitative image analysis of histological sections of coronary arteries

    NASA Astrophysics Data System (ADS)

    Holmes, David R., III; Robb, Richard A.

    2000-06-01

    The study of coronary arteries has evolved from examining gross anatomy and morphology to scrutinizing micro-anatomy and cellular composition. Technological advances such as high-resolution digital microscopes and high precision cutting devices have allowed examination of coronary artery morphology and pathology at micron resolution. We have developed a software toolkit to analyze histological sections. In particular, we are currently engaged in examining normal coronary arteries in order to provide the foundation for study of remodeled tissue. The first of two coronary arteries was stained for elastin and collagen. The second coronary artery was sectioned and stained for cellular nuclei and smooth muscle. High resolution light microscopy was used to image the sections. Segmentation was accomplished initially with slice-to-slice thresholding algorithms. These segmentation techniques choose optimal threshold values by modeling the tissue as one or more distributions. Morphology and image statistics were used to further differentiate the thresholded data into different tissue categories and thereby refine the results of the segmentation. Specificity/sensitivity analysis suggests that automatic segmentation can be very effective. For both tissue samples, greater than 90% specificity was achieved. Summed voxel projection and maximum intensity projection appear to be effective 3-D visualization tools. Shading methods also provide useful visualization; however, it is important to incorporate combined 2-D and 3-D displays. Surface rendering techniques (e.g. color mapping) can be used for visualizing parametric data. Preliminary results are promising, but continued development of algorithms is needed.

  11. Quantitative analysis of bloggers' collective behavior powered by emotions

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Paltoglou, Georgios; Tadić, Bosiljka

    2011-02-01

    Large-scale data resulting from users' online interactions provide the ultimate source of information to study emergent social phenomena on the Web. From individual actions of users to observable collective behaviors, different mechanisms involving emotions expressed in the posted text play a role. Here we combine approaches of statistical physics with machine-learning methods of text analysis to study the emergence of emotional behavior among Web users. Mapping the high-resolution data from digg.com onto bipartite networks of users and their comments onto posted stories, we identify user communities centered around certain popular posts and determine emotional contents of the related comments by the emotion classifier developed for this type of text. Applied over different time periods, this framework reveals strong correlations between the excess of negative emotions and the evolution of communities. We observe avalanches of emotional comments exhibiting significant self-organized critical behavior and temporal correlations. To explore the robustness of these critical states, we design a network-automaton model on realistic network connections and several control parameters, which can be inferred from the dataset. Dissemination of emotions by a small fraction of very active users appears to critically tune the collective states.

  12. Quantitative Analysis of the Microstructure of Auxetic Foams

    SciTech Connect

    Gaspar, N.; Smith, C.W.; Miller, E.A.; Seidler, G.T.; Evans, K.E.

    2008-07-28

    The auxetic foams first produced by Lakes have been modelled in a variety of ways, each model trying to reproduce some observed feature of the microscale of the foams. Such features include bent or broken ribs or inverted angles between ribs. These models can reproduce the Poisson's ratio or Poisson's function of auxetic foam if the model parameters are carefully chosen. However, these model parameters may not actually reflect the internal structure of the foams. A big problem is that measurement of parameters such as lengths and angles is not straightforward within a 3-d sample. In this work a sample of auxetic foam has been imaged by 3-d X-ray computed tomography. The resulting image is translated to a form that emphasises the geometrical structure of connected ribs. These connected rib data are analysed to describe both the microstructural construction of auxetic foams and the statistical spread of structure, that is, the heterogeneity of an auxetic foam. From the analysis of the microstructure, observations are made about the requirements for microstructural models and comparisons are made to previously existing models. From the statistical data, measures of heterogeneity are made that will help with future modelling that includes the heterogeneous aspect of auxetic foams.

  13. Quantitative Analysis with Heavy Ion E-TOF ERD

    SciTech Connect

    Banks, J.C.; Doyle, B.L.; Font, A. Climent

    1999-07-23

    Heavy ion TOF ERD combined with energy detection (E-TOF-ERD) is a powerful analytical technique taking advantage of the following facts: the scattering cross section is usually very high (~10⁻²¹ cm²/sr) compared with regular He RBS (~10⁻²⁵ cm²/sr); contrary to the energy resolution of ordinary surface barrier detectors, the time resolution is almost independent of the atomic mass of the detected element; and the detection of time and energy signals in coincidence allows mass separation of overlapping signals with the same energy (or time of flight). Measurements on several oxides have been performed with the E-TOF-ERD setup at Sandia National Laboratories using an incident beam of 10-15 MeV Au. The information on the composition of the sample is obtained from the time-domain spectrum, which is converted to the energy domain, and the analysis is then performed using existing software codes. During quantification of the results, problems were found related to the interaction of the beam with the sample and to the tabulated values of the stopping powers for heavy ions.
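
    The time-to-energy conversion mentioned above is just the non-relativistic kinetic-energy relation E = m L^2 / (2 t^2); the flight length, recoil mass, and times below are illustrative placeholders rather than the actual spectrometer geometry.

      import numpy as np

      AMU_KG = 1.66053906660e-27
      EV_J = 1.602176634e-19

      def tof_to_energy_mev(t_ns, mass_amu, flight_path_m=0.5):
          """Non-relativistic recoil energy (MeV) from time of flight (ns)."""
          t = np.asarray(t_ns, float) * 1e-9
          m = mass_amu * AMU_KG
          e_joule = 0.5 * m * (flight_path_m / t) ** 2
          return e_joule / EV_J / 1e6

      # Hypothetical oxygen recoils (mass 16) over a 0.5 m flight path
      print(np.round(tof_to_energy_mev([60, 80, 120], mass_amu=16.0), 3))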

  14. Direct Quantitative Analysis of Arsenic in Coal Fly Ash

    PubMed Central

    Hartuti, Sri; Kambara, Shinji; Takeyama, Akihiro; Kumabe, Kazuhiro; Moritomi, Hiroshi

    2012-01-01

    A rapid, simple method based on graphite furnace atomic absorption spectrometry is described for the direct determination of arsenic in coal fly ash. Solid samples were directly introduced into the atomizer without preliminary treatment. The direct analysis method was not always free of spectral matrix interference, but the stabilization of arsenic by adding palladium nitrate (chemical modifier) and the optimization of the parameters in the furnace program (temperature, rate of temperature increase, hold time, and argon gas flow) gave good results for the total arsenic determination. The optimal furnace program was determined by analyzing different concentrations of a reference material (NIST1633b), which showed the best linearity for calibration. The optimized parameters for the furnace programs for the ashing and atomization steps were as follows: temperatures of 500–1200 and 2150°C, heating rates of 100 and 500°C s−1, hold times of 90 and 7 s, and medium then maximum and medium argon gas flows, respectively. The calibration plots were linear with a correlation coefficient of 0.9699. This method was validated using arsenic-containing raw coal samples in accordance with the requirements of the mass balance calculation; the distribution rate of As in the fly ashes ranged from 101 to 119%. PMID:23251836

  15. Fibrin Architecture in Clots: A Quantitative Polarized Light Microscopy Analysis

    PubMed Central

    Whittaker, Peter; Przyklenk, Karin

    2009-01-01

    Fibrin plays a vital structural role in thrombus integrity. Thus, the ability to assess fibrin architecture has potential to provide insight into thrombosis and thrombolysis. Fibrin has an anisotropic molecular structure, which enables it to be seen with polarized light. Therefore, we aimed to determine whether automated polarized light microscopy methods of quantifying two structural parameters, fibrin fiber bundle orientation and fibrin's optical retardation (OR; a measure of molecular anisotropy), could be used to assess thrombi. To compare fibrin fiber bundle orientation we analyzed picrosirius red-stained sections obtained from clots formed: (A) in vitro, (B) in injured and stenotic coronary arteries, and (C) in surgically created aortic aneurysms (n = 6 for each group). To assess potential changes in OR, we examined fibrin in picrosirius red-stained clots formed after ischemic preconditioning (10 minutes ischemia + 10 minutes reflow; a circumstance shown to enhance lysability) and in control clots (n = 8 per group). The degree of fibrin organization differed significantly according to the location of clot formation; fibrin was most aligned in the aneurysms and least aligned in vitro, whereas fibrin in the coronary clots had an intermediate organization. The OR of fibrin in the clots formed after ischemic preconditioning was lower than that in controls (2.9 ± 0.5 nm versus 5.4 ± 1.0 nm, P < 0.05). The automated polarized light analysis methods not only enabled fibrin architecture to be assessed, but also revealed structural differences in clots formed under different circumstances. PMID:19054699

  16. Quantitative analysis of American woodcock nest and brood habitat

    USGS Publications Warehouse

    Bourgeois, A.

    1977-01-01

    Sixteen nest and 19 brood sites of American woodcock (Philohela minor) were examined in northern lower Michigan between 15 April and 15 June 1974 to determine habitat structure associated with these sites. Woodcock hens utilized young, second-growth forest stands which were similar in species composition for both nesting and brood rearing. A multivariate discriminant function analysis revealed a significant (P < 0.05) difference, however, in habitat structure. Nest habitat was characterized by lower tree density (2176 trees/ha) and basal area (8.6 m2/ha), by being close to forest openings (7 m) and by being situated on dry, relatively well-drained sites. In contrast, woodcock broods were located in sites that had nearly twice the tree density (3934 trees/ha) and basal area (16.5 m2/ha), were located over twice as far from forest openings (18 m), and generally occurred on damp sites near (8 m) standing water. The importance of these habitat features to the species and possible management implications are discussed.

  17. Combination of quantitative analysis and chemometric analysis for the quality evaluation of three different frankincenses by ultra high performance liquid chromatography and quadrupole time of flight mass spectrometry.

    PubMed

    Zhang, Chao; Sun, Lei; Tian, Run-tao; Jin, Hong-yu; Ma, Shuang-Cheng; Gu, Bing-ren

    2015-10-01

    Frankincense has gained increasing attention in the pharmaceutical industry because of its pharmacologically active components such as boswellic acids. However, the identity and overall quality evaluation of the three different frankincense species in different pharmacopeias and in the literature have rarely been reported. In this paper, quantitative analysis and chemometric evaluation were established and applied for the quality control of frankincense. Meanwhile, quantitative and chemometric analysis could be conducted under the same analytical conditions. In total, 55 samples from four habitats (three species) of frankincense were collected, and six boswellic acids were chosen for quantitative analysis. Chemometric analyses such as similarity analysis, hierarchical cluster analysis, and principal component analysis were used to identify the three frankincense species and to reveal the correlation between their components and species. In addition, 12 chromatographic peaks were tentatively identified using reference substances and quadrupole time-of-flight mass spectrometry. The results indicated that the total boswellic acid profiles of the three species of frankincense are similar and that their fingerprints can be used to differentiate between them. PMID:26228790
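
    The chemometric part of such a workflow (hierarchical clustering and principal component analysis on the six boswellic acid contents) can be sketched with scikit-learn and SciPy; the concentration table below is synthetic and only demonstrates the mechanics, not the published results.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(3)
      # Synthetic table: 55 samples x 6 boswellic acid contents, three species
      # with slightly different mean profiles.
      means = np.array([[5, 3, 8, 1, 2, 4],
                        [6, 2, 7, 2, 3, 3],
                        [4, 4, 9, 1, 1, 5]], float)
      species = rng.integers(0, 3, 55)
      X = means[species] + rng.normal(scale=0.5, size=(55, 6))

      Xs = StandardScaler().fit_transform(X)            # autoscale each acid
      pca = PCA(n_components=2)
      scores = pca.fit_transform(Xs)                    # PCA score plot coordinates
      clusters = fcluster(linkage(Xs, method="ward"), t=3, criterion="maxclust")

      print("variance explained by PC1, PC2:", np.round(pca.explained_variance_ratio_, 2))
      for k in range(1, 4):
          print(f"cluster {k}: {np.sum(clusters == k)} samples")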

  18. Segmentation of vascular structures and hematopoietic cells in 3D microscopy images and quantitative analysis

    NASA Astrophysics Data System (ADS)

    Mu, Jian; Yang, Lin; Kamocka, Malgorzata M.; Zollman, Amy L.; Carlesso, Nadia; Chen, Danny Z.

    2015-03-01

    In this paper, we present image processing methods for the quantitative study of changes in the bone marrow microenvironment (characterized by altered vascular structure and hematopoietic cell distribution) caused by disease or other factors. We develop algorithms that automatically segment vascular structures and hematopoietic cells in 3-D microscopy images, perform quantitative analysis of the properties of the segmented vascular structures and cells, and examine how such properties change. In processing images, we apply local thresholding to segment vessels and add post-processing steps to deal with imaging artifacts. We propose an improved watershed algorithm that relies on both intensity and shape information and can separate multiple overlapping cells better than common watershed methods. We then quantitatively compute various features of the vascular structures and hematopoietic cells, such as the branches and sizes of vessels and the distribution of cells. In analyzing vascular properties, we provide algorithms for pruning false vessel segments and branches based on vessel skeletons. Our algorithms can segment vascular structures and hematopoietic cells with good quality. We use our methods to quantitatively examine the changes in the bone marrow microenvironment caused by deletion of the Notch pathway. Our quantitative analysis reveals property changes in samples with the deleted Notch pathway. Our tool is useful for biologists to quantitatively measure changes in the bone marrow microenvironment and to develop possible therapeutic strategies that aid recovery of the bone marrow microenvironment.
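
    The cell-separation step can be illustrated with a standard marker-controlled watershed that combines intensity (a global threshold) with shape (the distance transform); this is the common baseline that the paper's improved algorithm builds on, not the authors' code, and the blob image and scikit-image calls below are illustrative.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.filters import threshold_otsu
      from skimage.feature import peak_local_max
      from skimage.segmentation import watershed

      # Synthetic image: two overlapping bright cells on a dark background
      yy, xx = np.mgrid[0:100, 0:100]
      img = (np.exp(-((yy - 45) ** 2 + (xx - 40) ** 2) / 200.0)
             + np.exp(-((yy - 55) ** 2 + (xx - 62) ** 2) / 200.0))

      mask = img > threshold_otsu(img)                  # intensity information
      dist = ndi.distance_transform_edt(mask)           # shape information
      blob_labels, _ = ndi.label(mask)
      peaks = peak_local_max(dist, labels=blob_labels, min_distance=5)
      markers = np.zeros(mask.shape, dtype=int)
      markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)

      labels = watershed(-dist, markers, mask=mask)     # separate touching cells
      print("objects found:", labels.max())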

  19. Quantitative flux analysis reveals folate-dependent NADPH production

    NASA Astrophysics Data System (ADS)

    Fan, Jing; Ye, Jiangbin; Kamphorst, Jurre J.; Shlomi, Tomer; Thompson, Craig B.; Rabinowitz, Joshua D.

    2014-06-01

    ATP is the dominant energy source in animals for mechanical and electrical work (for example, muscle contraction or neuronal firing). For chemical work, there is an equally important role for NADPH, which powers redox defence and reductive biosynthesis. The most direct route to produce NADPH from glucose is the oxidative pentose phosphate pathway, with malic enzyme sometimes also important. Although the relative contribution of glycolysis and oxidative phosphorylation to ATP production has been extensively analysed, similar analysis of NADPH metabolism has been lacking. Here we demonstrate the ability to directly track, by liquid chromatography-mass spectrometry, the passage of deuterium from labelled substrates into NADPH, and combine this approach with carbon labelling and mathematical modelling to measure NADPH fluxes. In proliferating cells, the largest contributor to cytosolic NADPH is the oxidative pentose phosphate pathway. Surprisingly, a nearly comparable contribution comes from serine-driven one-carbon metabolism, in which oxidation of methylene tetrahydrofolate to 10-formyl-tetrahydrofolate is coupled to reduction of NADP+ to NADPH. Moreover, tracing of mitochondrial one-carbon metabolism revealed complete oxidation of 10-formyl-tetrahydrofolate to make NADPH. As folate metabolism has not previously been considered an NADPH producer, confirmation of its functional significance was undertaken through knockdown of methylenetetrahydrofolate dehydrogenase (MTHFD) genes. Depletion of either the cytosolic or mitochondrial MTHFD isozyme resulted in decreased cellular NADPH/NADP+ and reduced/oxidized glutathione ratios (GSH/GSSG) and increased cell sensitivity to oxidative stress. Thus, although the importance of folate metabolism for proliferating cells has been long recognized and attributed to its function of producing one-carbon units for nucleic acid synthesis, another crucial function of this pathway is generating reducing power.

  20. Quantitative ultrasound texture analysis for clinical decision making support

    NASA Astrophysics Data System (ADS)

    Wu, Jie Ying; Beland, Michael; Konrad, Joseph; Tuomi, Adam; Glidden, David; Grand, David; Merck, Derek

    2015-03-01

    We propose a general ultrasound (US) texture-analysis and machine-learning framework for detecting the presence of disease that is suitable for clinical application across clinicians, disease types, devices, and operators. Its stages are image selection, image filtering, ROI selection, feature parameterization, and classification. Each stage is modular and can be replaced with alternate methods. Thus, this framework is adaptable to a wide range of tasks. Our two preliminary clinical targets are hepatic steatosis and adenomyosis diagnosis. For steatosis, we collected US images from 288 patients and their pathology-determined values of steatosis (%) from biopsies. Two radiologists independently reviewed all images and identified the region of interest (ROI) most representative of the hepatic echotexture for each patient. To parameterize the images into comparable quantities, we filter the US images at multiple scales for various texture responses. For each response, we collect a histogram of pixel features within the ROI, and parameterize it as a Gaussian function using its mean, standard deviation, kurtosis, and skew to create a 36-feature vector. Our algorithm uses a support vector machine (SVM) for classification. Using a threshold of 10%, we achieved 72.81% overall accuracy, 76.18% sensitivity, and 65.96% specificity in identifying steatosis with leave-ten-out cross-validation (p<0.0001). Extending this framework to adenomyosis, we identified 38 patients with MR-confirmed findings of adenomyosis and previous US studies and 50 controls. A single rater picked the best US-image and ROI for each case. Using the same processing pipeline, we obtained 76.14% accuracy, 86.00% sensitivity, and 63.16% specificity with leave-one-out cross-validation (p<0.0001).
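
    A stripped-down version of the parameterization and classification steps, filter responses summarized by histogram statistics and fed to an SVM, might look like the sketch below; the filter bank, feature count, and synthetic data are placeholders rather than the clinical pipeline.

      import numpy as np
      from scipy.ndimage import gaussian_filter, laplace
      from scipy.stats import kurtosis, skew
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      def roi_features(roi):
          """Histogram statistics of multi-scale texture responses inside an ROI."""
          responses = [gaussian_filter(roi, s) for s in (1, 2, 4)] + [laplace(roi)]
          feats = []
          for r in responses:
              v = r.ravel()
              feats += [v.mean(), v.std(), kurtosis(v), skew(v)]
          return np.array(feats)                        # 16 features in this sketch

      rng = np.random.default_rng(4)
      rois = rng.normal(size=(60, 32, 32))
      rois[:30] += 0.5 * rng.normal(size=(30, 32, 32)).cumsum(axis=2)  # "diseased" texture
      y = np.r_[np.ones(30), np.zeros(30)]

      X = np.array([roi_features(r) for r in rois])
      acc = cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean()
      print(f"cross-validated accuracy: {acc:.2f}")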

  1. A quantitative analysis of 3-D coronary modeling from two or more projection images.

    PubMed

    Movassaghi, B; Rasche, V; Grass, M; Viergever, M A; Niessen, W J

    2004-12-01

    A method is introduced to examine the geometrical accuracy of the three-dimensional (3-D) representation of coronary arteries from multiple (two or more) calibrated two-dimensional (2-D) angiographic projections. When more than two projections are involved (multiprojection modeling), a novel procedure is presented that consists of fully automated centerline and width determination in all available projections based on the information provided by the semi-automated centerline detection in two initial calibrated projections. The accuracy of the 3-D coronary modeling approach is determined by a quantitative examination of the 3-D centerline point position and the 3-D cross-sectional area of the reconstructed objects. The measurements are based on the analysis of calibrated phantom and calibrated coronary 2-D projection data. From this analysis a confidence region (alpha degrees approximately equal to [35 degrees - 145 degrees]) for the angular distance of two initial projection images is determined for which the modeling procedure is sufficiently accurate for the applied system. Within this angular range the centerline position error is less than 0.8 mm, in terms of the Euclidean distance to a predefined ground truth. When involving more projections using our new procedure, experiments show that when the initial pair of projection images has an angular distance in the range alpha degrees approximately equal to [35 degrees - 145 degrees], the centerlines in all other projections (gamma = 0 degrees - 180 degrees) were indicated very precisely without any additional centering procedure. Involving additional projection images in the modeling procedure can provide a more realistic shape of the structure. In the case of a concave segment, however, the involvement of multiple projections does not necessarily provide a more realistic shape of the reconstructed structure. PMID:15575409

  2. Analysis of liver connexin expression using reverse transcription quantitative real-time polymerase chain reaction

    PubMed Central

    Maes, Michaël; Willebrords, Joost; Crespo Yanguas, Sara; Cogliati, Bruno; Vinken, Mathieu

    2016-01-01

    Summary Although connexin production is mainly regulated at the protein level, altered connexin gene expression has been identified as the underlying mechanism of several pathologies. When studying the latter, appropriate methods to quantify connexin mRNA levels are required. The present chapter describes a well-established reverse transcription quantitative real-time polymerase chain reaction procedure optimized for analysis of hepatic connexins. The method includes RNA extraction and subsequent quantification, generation of complementary DNA, quantitative real-time polymerase chain reaction and data analysis. PMID:27207283

  3. Analysis of Liver Connexin Expression Using Reverse Transcription Quantitative Real-Time Polymerase Chain Reaction.

    PubMed

    Maes, Michaël; Willebrords, Joost; Crespo Yanguas, Sara; Cogliati, Bruno; Vinken, Mathieu

    2016-01-01

    Although connexin production is mainly regulated at the protein level, altered connexin gene expression has been identified as the underlying mechanism of several pathologies. When studying the latter, appropriate methods to quantify connexin RNA levels are required. The present chapter describes a well-established reverse transcription quantitative real-time polymerase chain reaction procedure optimized for analysis of hepatic connexins. The method includes RNA extraction and subsequent quantification, generation of complementary DNA, quantitative real-time polymerase chain reaction, and data analysis. PMID:27207283

  4. New bone formation in the in vivo implantation of bioceramics. A quantitative analysis.

    PubMed

    Wu, H; Zhu, T B; Du, J Y; Hong, G X; Sun, S Z; Xu, X H

    1992-09-01

    Two kinds of synthetic biomaterial, porous tricalcium phosphate (PTCP) and magnetic porous tricalcium phosphate (MPTCP) ceramic granules, were implanted in rat femurs. Over a period of 4 months, the assessment of serial histological sections, scanning electron micrographs and quantitative analysis of bone formation in the sections showed that both ceramics are biocompatible and degradable in vivo. More new bone formation occurred in the MPTCP group. Endochondral ossification was seen in both groups. The quantitative analysis in this study is reliable and may be suitable for similar experimental models. PMID:1288979

  5. Quantitative analysis of polydisperse systems via solvent-free matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Kulkarni, Sourabh U; Thies, Mark C

    2012-02-15

    Quantitative analysis of partially soluble and insoluble polydisperse materials is challenging due to the lack of both appropriate standards and reliable analytical techniques. To this end, matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) incorporating a solvent-free sample preparation technique was investigated for the quantitative analysis of partially soluble, polydisperse, polycyclic aromatic hydrocarbon (PAH) oligomers. Molecular weight standards consisting of narrow molecular weight dimer and trimer oligomers of the starting M-50 petroleum pitch were produced using both dense-gas/supercritical extraction (DGE/SCE) and preparative-scale, gel permeation chromatography (GPC). The validity of a MALDI-based, quantitative analysis technique using solvent-free sample preparation was first demonstrated by applying the method of standard addition to a pitch of known composition. The standard addition method was then applied to the quantitative analysis of two insoluble petroleum pitch fractions of unknown oligomeric compositions, with both the dimer and trimer compositions of these fractions being accurately determined. To our knowledge, this study represents the first successful MALDI application of solvent-free quantitative analysis to insoluble, polydisperse materials. PMID:22223328
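
    The standard addition quantification used here follows the usual extrapolation: the analyte signal is measured with increasing spikes of standard, a line is fitted, and the magnitude of the x-intercept gives the amount originally present. A minimal numeric sketch with invented signals:

      import numpy as np

      # Hypothetical MALDI responses: signal vs. amount of dimer standard spiked
      # into a fixed amount of the unknown pitch fraction.
      spike = np.array([0.0, 5.0, 10.0, 20.0, 40.0])        # micrograms added
      signal = np.array([1.10, 1.62, 2.15, 3.18, 5.25])      # relative intensity

      slope, intercept = np.polyfit(spike, signal, 1)
      unknown = intercept / slope        # amount in the sample = -(x-intercept)
      print(f"estimated analyte in sample: {unknown:.1f} micrograms")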

  6. Scattering influences in quantitative fission neutron radiography for the in situ analysis of hydrogen distribution in metal hydrides

    NASA Astrophysics Data System (ADS)

    Börries, S.; Metz, O.; Pranzas, P. K.; Bücherl, T.; Söllradl, S.; Dornheim, M.; Klassen, T.; Schreyer, A.

    2015-10-01

    In situ neutron radiography allows for the time-resolved study of hydrogen distribution in metal hydrides. However, for a precise quantitative investigation of a time-dependent hydrogen content within a host material, exact knowledge of the corresponding attenuation coefficient is necessary. Additionally, the effect of scattering has to be considered, as it is known to cause deviations from Beer's law, which is used to determine the amount of hydrogen from a measured intensity distribution. Within this study, we used a metal hydride inside two different hydrogen storage tanks as host systems, consisting of steel and aluminum. The neutron beam attenuation by hydrogen was investigated in these two different setups during the hydrogen absorption process. A linear correlation to the amount of absorbed hydrogen was found, allowing for a straightforward quantitative investigation. Further, an analysis of the scattering contributions to the measured intensity distributions was performed and is described in detail.
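
    The quantification rests on the Beer-Lambert relation I = I0 exp(-mu_H n_H) applied to the hydrogen contribution, so the areal hydrogen content follows from the logarithm of the intensity ratio once the effective, scattering-corrected attenuation coefficient is known. The sketch below uses made-up intensities and an arbitrary coefficient.

      import numpy as np

      def hydrogen_areal_density(i, i0, mu_h_eff):
          """Areal hydrogen content from transmitted neutron intensity.

          Beer-Lambert: I = I0 * exp(-mu_h_eff * n_H), so
          n_H = ln(I0 / I) / mu_h_eff.  mu_h_eff is the effective attenuation
          coefficient per unit areal hydrogen density (scattering-corrected).
          """
          return np.log(np.asarray(i0, float) / np.asarray(i, float)) / mu_h_eff

      # Hypothetical open-beam and transmitted intensities, arbitrary coefficient
      i0 = np.array([1000.0, 1000.0, 1000.0])
      i = np.array([900.0, 700.0, 450.0])
      print(np.round(hydrogen_areal_density(i, i0, mu_h_eff=3.2), 3))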

  7. Analysis of mixed cell cultures with quantitative digital holographic phase microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Wibbeling, Jana; Ketelhut, Steffi

    2014-05-01

    In order to study, for example, the influence of pharmaceuticals or pathogens on different cell types under identical measurement conditions and to analyze interactions between different cellular specimens, a minimally invasive quantitative observation of mixed cell cultures is of particular interest. Quantitative phase microscopy (QPM) provides high resolution detection of optical path length changes that is suitable for stain-free, minimally invasive live cell analysis. Due to low light intensities for object illumination, QPM minimizes the interaction with the sample and is particularly suitable for long term time-lapse investigations, e.g., for the detection of cell morphology alterations due to drugs and toxins. Furthermore, QPM has been demonstrated to be a versatile tool for the quantification of cellular growth, the extraction of morphological parameters and cell motility. We studied the feasibility of QPM for the analysis of mixed cell cultures. It was explored whether quantitative phase images provide sufficient information to distinguish between different cell types and to extract cell specific parameters. For the experiments, quantitative phase imaging with digital holographic microscopy (DHM) was utilized. Mixed cell cultures with different types of human pancreatic tumor cells were observed with quantitative DHM phase contrast for up to 35 h. The obtained series of quantitative phase images were evaluated by adapted algorithms for image segmentation. From the segmented images the cellular dry mass and the mean cell thickness were calculated and used in the further analysis as parameters to quantify the reliability of the measurement principle. The obtained results demonstrate that it is possible to characterize the growth of cell types with different morphologies in a mixed cell culture separately by consideration of specimen size and cell thickness in the evaluation of quantitative DHM phase images.
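
    Cellular dry mass is commonly obtained from a quantitative phase image via DM = (lambda / (2 pi alpha)) times the integral of the phase shift over the cell area, with alpha the specific refractive increment (about 0.18-0.2 ml/g); the sketch below applies this relation to a synthetic segmented phase map and is not the authors' evaluation code.

      import numpy as np

      def dry_mass_pg(phase_rad, mask, pixel_area_um2, wavelength_um=0.532,
                      alpha_ml_per_g=0.19):
          """Dry mass (pg) from a phase image (rad) over a segmented cell mask."""
          # optical path difference integrated over the cell footprint (um^3)
          opd_integral = (wavelength_um / (2 * np.pi)) * phase_rad[mask].sum() * pixel_area_um2
          # alpha in ml/g equals um^3/pg, so dividing gives picograms
          return opd_integral / alpha_ml_per_g

      phase = np.zeros((64, 64))
      phase[20:40, 20:40] = 1.5                     # synthetic cell, 1.5 rad phase shift
      mask = phase > 0
      print(f"dry mass: {dry_mass_pg(phase, mask, pixel_area_um2=0.1):.1f} pg")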

  8. Quantitative Computed Tomography Protocols Affect Material Mapping and Quantitative Computed Tomography-Based Finite-Element Analysis Predicted Stiffness.

    PubMed

    Giambini, Hugo; Dragomir-Daescu, Dan; Nassr, Ahmad; Yaszemski, Michael J; Zhao, Chunfeng

    2016-09-01

    Quantitative computed tomography-based finite-element analysis (QCT/FEA) has become increasingly popular in an attempt to understand and possibly reduce vertebral fracture risk. It is known that scanning acquisition settings affect Hounsfield units (HU) of the CT voxels. Material properties assignments in QCT/FEA, relating HU to Young's modulus, are performed by applying empirical equations. The purpose of this study was to evaluate the effect of QCT scanning protocols on predicted stiffness values from finite-element models. One fresh frozen cadaveric torso and a QCT calibration phantom were scanned six times varying voltage and current and reconstructed to obtain a total of 12 sets of images. Five vertebrae from the torso were experimentally tested to obtain stiffness values. QCT/FEA models of the five vertebrae were developed for the 12 image data resulting in a total of 60 models. Predicted stiffness was compared to the experimental values. The highest percent difference in stiffness was approximately 480% (80 kVp, 110 mAs, U70), while the lowest outcome was ∼1% (80 kVp, 110 mAs, U30). There was a clear distinction between reconstruction kernels in predicted outcomes, whereas voltage did not present a clear influence on results. The potential of QCT/FEA as an improvement to conventional fracture risk prediction tools is well established. However, it is important to establish research protocols that can lead to results that can be translated to the clinical setting. PMID:27428281
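
    The material-mapping step typically proceeds in two stages: a linear phantom calibration converts Hounsfield units to equivalent bone mineral density, and an empirical power law converts density to Young's modulus. The sketch below shows that chain with generic placeholder coefficients; the actual equations and constants vary between studies and scanners.

      import numpy as np

      def hu_to_modulus(hu, phantom_hu, phantom_rho_mg_cc, a_mpa=4730.0, b=1.56):
          """Map CT Hounsfield units to Young's modulus (MPa).

          1) linear phantom calibration: HU -> equivalent density (mg/cm^3)
          2) empirical power law: E = a * (rho in g/cm^3) ** b
          The coefficients a and b are placeholders; published values differ
          by anatomical site and study.
          """
          slope, intercept = np.polyfit(phantom_hu, phantom_rho_mg_cc, 1)
          rho_g_cc = np.clip(slope * np.asarray(hu, float) + intercept, 1.0, None) / 1000.0
          return a_mpa * rho_g_cc ** b

      # Hypothetical calibration phantom rods and a few voxel HU values
      phantom_hu = np.array([-50.0, 60.0, 180.0, 380.0])
      phantom_rho = np.array([0.0, 100.0, 200.0, 400.0])       # mg/cm^3
      print(np.round(hu_to_modulus([150, 300, 600], phantom_hu, phantom_rho), 1))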

  9. Quantitative carbide analysis using the Rietveld method for 2.25Cr-1Mo-0.25V steel

    SciTech Connect

    Zhang Yongtao; Han Haibo; Miao Lede; Zhang Hanqian; Li Jinfu

    2009-09-15

    It is usually difficult to quantitatively determine the mass fraction of each type of precipitate in steels using transmission electron microscopy and traditional X-ray powder diffraction analysis methods. In this paper the Rietveld full-pattern fitting algorithm was employed to calculate the relative mass fractions of the precipitates in 2.25Cr-1Mo-0.25V steel. The results suggest that the fractions of MC, M₇C₃ and M₂₃C₆ carbides were evaluated precisely and relatively quickly. In addition, it was found that the fine MC phase dissolved into the matrix with prolonged tempering.
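
    In Rietveld quantitative phase analysis the relative mass fractions follow from the refined scale factors via W_i = S_i (Z M V)_i / sum_j S_j (Z M V)_j, where Z is the number of formula units per cell, M the formula mass, and V the unit-cell volume. A tiny numeric sketch with made-up refined values:

      import numpy as np

      def rietveld_weight_fractions(scale, z, m, v):
          """Relative mass fractions from Rietveld scale factors.

          W_i = S_i * (Z*M*V)_i / sum_j S_j * (Z*M*V)_j, where Z is formula
          units per cell, M the formula mass and V the unit-cell volume.
          """
          zmv = np.asarray(z, float) * np.asarray(m, float) * np.asarray(v, float)
          w = np.asarray(scale, float) * zmv
          return w / w.sum()

      # Made-up refined values for three carbide phases (MC, M7C3, M23C6)
      scale = [2.1e-5, 8.4e-6, 3.0e-6]
      z, m, v = [4, 4, 4], [107.0, 400.0, 1400.0], [80.0, 300.0, 1200.0]
      print(np.round(rietveld_weight_fractions(scale, z, m, v), 3))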

  10. Possibility of quantitative estimation of blood cell forms by the spatial-frequency spectrum analysis

    NASA Astrophysics Data System (ADS)

    Spiridonov, Igor N.; Safonova, Larisa P.; Samorodov, Andrey V.

    2000-05-01

    At present in hematology there are no quantitative estimates of parameters that are important for cell classification, such as cell form and nuclear form. Because morphological parameters do not correlate with the parameters measured by hemoanalyzers, neither flow cytometers nor computer recognition systems provide a complete clinical blood analysis. Analysis of the spatial-frequency spectra (SFS) of blood samples (smears and liquid probes) permits these forms to be estimated quantitatively. Based on the results of theoretical and experimental research, an algorithm for the quantitative estimation of form from SFS parameters has been created, and criteria for the quality of these estimates have been proposed. A test bench based on coherent optical and digital processors was used. The results could be applied to the automated classification of either normal or pathological blood cells in standard blood smears.

  11. Applied research for quantitative analysis of fluorescent whitening agent in emulsion paint

    NASA Astrophysics Data System (ADS)

    Zhang, Lin

    Fluorescent whitening agents (FWAs) are widely used in emulsion paint for their brightening effect. Despite this extensive use, there are no reports on methods for measuring FWAs in emulsion paint. In this work, a very simple quantitative approach is proposed. Based on the digital grayscale images of three-dimensional fluorescence spectra and two-dimensional fluorescence images, several wavelet moment invariants are calculated and used to establish standard models for quantitative analysis. The influence of storage time and exposure time is also studied. The measurement results indicate the feasibility and precision of this method for the quantitative analysis of FWAs. The results also provide a reliable basis for the application of FWAs in emulsion paint. Keywords: fluorescent whitening agents, three-dimensional fluorescence spectra, fluorescence image, wavelet moment invariants

  12. Quantitative Glycoproteomics Analysis Reveals Changes in N-Glycosylation Level Associated with Pancreatic Ductal Adenocarcinoma

    PubMed Central

    2015-01-01

    Glycosylation plays an important role in epithelial cancers, including pancreatic ductal adenocarcinoma. However, little is known about the glycoproteome of the human pancreas or its alterations associated with pancreatic tumorigenesis. Using a quantitative glycoproteomics approach, we investigated protein N-glycosylation in pancreatic tumor tissue in comparison with normal pancreas and chronic pancreatitis tissue. The study led to the discovery of a roster of glycoproteins with aberrant N-glycosylation levels associated with pancreatic cancer, including mucin-5AC (MUC5AC), carcinoembryonic antigen-related cell adhesion molecule 5 (CEACAM5), insulin-like growth factor binding protein (IGFBP3), and galectin-3-binding protein (LGALS3BP). Pathway analysis of cancer-associated aberrant glycoproteins revealed an emerging phenomenon: increased N-glycosylation activity was implicated in several pancreatic cancer pathways, including TGF-β, TNF, NF-kappa-B, and TFEB-related lysosomal changes. In addition, the study provided evidence that specific N-glycosylation sites within certain individual proteins can have significantly altered glycosylation occupancy in pancreatic cancer, reflecting the complexity of the molecular mechanisms underlying cancer-associated glycosylation events. PMID:24471499

  13. Quantitative Analysis of Myelin and Axonal Remodeling in the Uninjured Motor Network After Stroke.

    PubMed

    Lin, Ying-Chia; Daducci, Alessandro; Meskaldji, Djalel Eddine; Thiran, Jean-Philippe; Michel, Patrik; Meuli, Reto; Krueger, Gunnar; Menegaz, Gloria; Granziera, Cristina

    2015-09-01

    Contralesional brain connectivity plasticity was previously reported after stroke. This study aims at disentangling the biological mechanisms underlying connectivity plasticity in the uninjured motor network after an ischemic lesion. In particular, we measured generalized fractional anisotropy (GFA) and magnetization transfer ratio (MTR) to assess whether poststroke connectivity remodeling depends on axonal and/or myelin changes. Diffusion-spectrum imaging and magnetization transfer MRI at 3T were performed in 10 patients in the acute phase and at 1 and 6 months after stroke affecting motor cortical and/or subcortical areas. Ten age- and gender-matched healthy volunteers were scanned 1 month apart for longitudinal comparison. Clinical assessment was also performed in patients prior to magnetic resonance imaging (MRI). In the contralesional hemisphere, average measures and tract-based quantitative analysis of GFA and MTR were performed to assess axonal integrity and myelination along motor connections as well as their variations over time. Mean and tract-based measures of MTR and GFA showed significant changes in a number of contralesional motor connections, confirming both axonal and myelin plasticity in our cohort of patients. Moreover, density-derived features (peak height, standard deviation, and skewness) of GFA and MTR along the tracts correlated with clinical scores more strongly than mean values did. These findings reveal the interplay between contralateral myelin and axonal remodeling after stroke. PMID:25296185

  14. Lack of efficacy of music to improve sleep: a polysomnographic and quantitative EEG analysis.

    PubMed

    Lazic, Stanley E; Ogilvie, Robert D

    2007-03-01

    An increasing number of studies have examined non-pharmacological methods to improve the quality of sleep, including the use of music and other types of auditory stimulation. While many of these studies have reported significant results, they suffer from a combination of shortcomings: subjective self-report measures as the primary outcome, a lack of proper controls, the combination of music with some type of relaxation therapy, or a failure to randomise subjects to control and treatment conditions. It is therefore difficult to assess the efficacy of music to induce or improve sleep. The present study therefore examined the effects of music using standard polysomnographic measures and quantitative analysis of the electroencephalogram, along with subjective ratings of sleep quality. In addition, a tones condition was used to compare any effects of music with the effects of general auditory stimulation. Using a counter-balanced within-subjects design, the music was not significantly better than the tones or control conditions in improving sleep onset latency, sleep efficiency, wake time after sleep onset, or percent slow wave sleep, as determined by objective physiological criteria. PMID:17123654

  15. Quantitative in silico Analysis of Neurotransmitter Pathways Under Steady State Conditions

    PubMed Central

    Calvetti, Daniela; Somersalo, Erkki

    2013-01-01

    The modeling of glutamate/GABA-glutamine cycling in the brain tissue involving astrocytes, glutamatergic and GABAergic neurons leads to a complex compartmentalized metabolic network that comprises neurotransmitter synthesis, shuttling, and degradation. Without advanced computational tools, it is difficult to quantitatively track possible scenarios and identify viable ones. In this article, we follow a sampling-based computational paradigm to analyze the biochemical network in a multi-compartment system modeling astrocytes, glutamatergic, and GABAergic neurons, and address some questions about the details of transmitter cycling, with particular emphasis on the ammonia shuttling between astrocytes and neurons, and the synthesis of transmitter GABA. More specifically, we consider the joint action of the alanine-lactate shuttle, the branched chain amino acid shuttle, and the glutamine-glutamate cycle, as well as the role of glutamate dehydrogenase (GDH) activity. When imposing a minimal set of bound constraints on reaction and transport fluxes, a preferred stoichiometric steady state equilibrium requires an unrealistically high reductive GDH activity in neurons, indicating the need for additional bound constraints, which were included in subsequent computer simulations. The statistical flux balance analysis also suggests a stoichiometrically viable role for leucine transport as an alternative to glutamine for replenishing the glutamate pool in neurons. PMID:24115944

  16. Identification of Salmonella Typhimurium deubiquitinase SseL substrates by immunoaffinity enrichment and quantitative proteomic analysis

    SciTech Connect

    Nakayasu, Ernesto S.; Sydor, Michael A.; Brown, Roslyn N.; Sontag, Ryan L.; Sobreira, Tiago; Slysz, Gordon W.; Humphrys, Daniel R.; Skarina, Tatiana; Onoprienko, Olena; Di Leo, Rosa; Kaiser, Brooke LD; Li, Jie; Ansong, Charles; Cambronne, Eric; Smith, Richard D.; Savchenko, Alexei; Adkins, Joshua N.

    2015-07-06

    Ubiquitination is a key protein post-translational modification that regulates many important cellular pathways and whose levels are regulated by the equilibrium between the activities of ubiquitin ligases and deubiquitinases. Here we present a method to identify specific deubiquitinase substrates based on treatment of cell lysates with recombinant enzymes, immunoaffinity purification and global quantitative proteomic analysis. As a model system to identify substrates, we used a virulence-related deubiquitinase secreted by Salmonella enterica serovar Typhimurium into host cells, SseL. Using this approach, two SseL substrates were identified in the RAW 264.7 murine macrophage-like cell line, S100A6 and heterogeneous nuclear ribonucleoprotein K, in addition to the previously reported K63-linked ubiquitin chains. These substrates were further validated by a combination of enzymatic and binding assays. This method can be used for the systematic identification of substrates of deubiquitinases from other organisms and applied to study their functions in physiology and disease.

  17. Quantitative analysis of core-shell catalyst nanoparticles for industrial applications

    NASA Astrophysics Data System (ADS)

    E, H.; Nellist, P. D.; Lozano-Perez, S.; Ozkaya, D.

    2012-07-01

    Pd@Pt core-shell designed nanoparticle catalysts have been shown to dramatically increase the activity and selectivity of the oxygen reduction reaction in fuel cells. Aberration corrected electron microscopy offers the spatial resolution and chemical sensitivity to unlock these structures at the atomic scale. Understanding the particle size, shape and the exact nature of the shell coverage (whether it is full, partial, or whether the particle is alloyed) is vital to understanding their behaviour. This paves the way for even more effective catalyst designs. We present a semi-statistical investigation into the size, morphology and bimetallic content of various core-shell particle designs, pre- and post-fuel-cell cycling, using high resolution HAADF STEM and EDX. In addition, careful quantitative analysis of our datasets allows us to extract information not only on the morphology, but also on the thickness and coverage of the particle shells. We compare this with chemical findings about activity and selectivity to understand how shell coverage and content affect catalytic activity.

  18. Quantitative analysis of cytoskeletal reorganization during epithelial tissue sealing by large-volume electron tomography.

    PubMed

    Eltsov, Mikhail; Dubé, Nadia; Yu, Zhou; Pasakarnis, Laurynas; Haselmann-Weiss, Uta; Brunner, Damian; Frangakis, Achilleas S

    2015-05-01

    The closure of epidermal openings is an essential biological process that causes major developmental problems such as spina bifida in humans if it goes awry. At present, the mechanism of closure remains elusive. Therefore, we reconstructed a model closure event, dorsal closure in fly embryos, by large-volume correlative electron tomography. We present a comprehensive, quantitative analysis of the cytoskeletal reorganization, enabling separated epidermal cells to seal the epithelium. After establishing contact through actin-driven exploratory filopodia, cells use a single lamella to generate 'roof tile'-like overlaps. These shorten to produce the force, 'zipping' the tissue closed. The shortening overlaps lack detectable actin filament ensembles but are crowded with microtubules. Cortical accumulation of shrinking microtubule ends suggests a force generation mechanism in which cortical motors pull on microtubule ends as for mitotic spindle positioning. In addition, microtubules orient filopodia and lamellae before zipping. Our 4D electron microscopy picture describes an entire developmental process and provides fundamental insight into epidermal closure. PMID:25893916

  19. Quantitative assessment of chemical artefacts produced by propionylation of histones prior to mass spectrometry analysis.

    PubMed

    Soldi, Monica; Cuomo, Alessandro; Bonaldi, Tiziana

    2016-07-01

    Histone PTMs play a crucial role in regulating chromatin structure and function, with impact on gene expression. MS is nowadays widely applied to study histone PTMs systematically. Because histones are rich in arginine and lysine, classical shot-gun approaches based on trypsin digestion are typically not employed for histone modifications mapping. Instead, different protocols of chemical derivatization of lysines in combination with trypsin have been implemented to obtain "Arg-C like" digestion products that are more suitable for LC-MS/MS analysis. Although widespread, these strategies have been recently described to cause various side reactions that result in chemical modifications prone to be misinterpreted as native histone marks. These artefacts can also interfere with the quantification process, causing errors in histone PTMs profiling. The work of Paternoster V. et al. is a quantitative assessment of methyl-esterification and other side reactions occurring on histones after chemical derivatization of lysines with propionic anhydride [Proteomics 2016, 16, 2059-2063]. The authors estimate the effect of different solvents, incubation times, and pH on the extent of these side reactions. The results collected indicate that the replacement of methanol with isopropanol or ACN not only blocks methyl-esterification, but also significantly reduces other undesired unspecific reactions. Carefully titrating the pH after propionic anhydride addition is another way to keep methyl-esterification under control. Overall, the authors describe a set of experimental conditions that allow reducing the generation of various artefacts during histone propionylation. PMID:27373704

  20. Identification of Salmonella Typhimurium deubiquitinase SseL substrates by immunoaffinity enrichment and quantitative proteomic analysis

    DOE PAGESBeta

    Nakayasu, Ernesto S.; Sydor, Michael A.; Brown, Roslyn N.; Sontag, Ryan L.; Sobreira, Tiago; Slysz, Gordon W.; Humphrys, Daniel R.; Skarina, Tatiana; Onoprienko, Olena; Di Leo, Rosa; et al

    2015-07-06

    Ubiquitination is a key protein post-translational modification that regulates many important cellular pathways and whose levels are regulated by the equilibrium between the activities of ubiquitin ligases and deubiquitinases. Here we present a method to identify specific deubiquitinase substrates based on treatment of cell lysates with recombinant enzymes, immunoaffinity purification and global quantitative proteomic analysis. As a model system to identify substrates, we used a virulence-related deubiquitinase secreted by Salmonella enterica serovar Typhimurium into the host cells, SseL. By using this approach two SseL substrates were identified in the RAW 264.7 murine macrophage-like cell line, S100A6 and heterogeneous nuclear ribonucleoprotein K, in addition to the previously reported K63-linked ubiquitin chains. These substrates were further validated by a combination of enzymatic and binding assays. Finally, this method can be used for the systematic identification of substrates of deubiquitinases from other organisms and applied to study their functions in physiology and disease.

  1. Identification of Salmonella Typhimurium deubiquitinase SseL substrates by immunoaffinity enrichment and quantitative proteomic analysis

    SciTech Connect

    Nakayasu, Ernesto S.; Sydor, Michael A.; Brown, Roslyn N.; Sontag, Ryan L.; Sobreira, Tiago; Slysz, Gordon W.; Humphrys, Daniel R.; Skarina, Tatiana; Onoprienko, Olena; Di Leo, Rosa; Kaiser, Brooke L. Deatherage; Li, Jie; Ansong, Charles; Cambronne, Eric; Smith, Richard D.; Savchenko, Alexei; Adkins, Joshua N.

    2015-07-06

    Ubiquitination is a key protein post-translational modification that regulates many important cellular pathways and whose levels are regulated by the equilibrium between the activities of ubiquitin ligases and deubiquitinases. Here we present a method to identify specific deubiquitinase substrates based on treatment of cell lysates with recombinant enzymes, immunoaffinity purification and global quantitative proteomic analysis. As a model system to identify substrates, we used a virulence-related deubiquitinase secreted by Salmonella enterica serovar Typhimurium into the host cells, SseL. By using this approach two SseL substrates were identified in the RAW 264.7 murine macrophage-like cell line, S100A6 and heterogeneous nuclear ribonucleoprotein K, in addition to the previously reported K63-linked ubiquitin chains. These substrates were further validated by a combination of enzymatic and binding assays. Finally, this method can be used for the systematic identification of substrates of deubiquitinases from other organisms and applied to study their functions in physiology and disease.

  2. Quantitative Phosphokinome Analysis of the Met Pathway Activated by the Invasin Internalin B from Listeria monocytogenes*

    PubMed Central

    Reinl, Tobias; Nimtz, Manfred; Hundertmark, Claudia; Johl, Thorsten; Kéri, György; Wehland, Jürgen; Daub, Henrik; Jänsch, Lothar

    2009-01-01

    Stimulated by its physiological ligand, hepatocyte growth factor, the transmembrane receptor tyrosine kinase Met activates a signaling machinery that leads to mitogenic, motogenic, and morphogenic responses. Remarkably, the food-borne human pathogen Listeria monocytogenes also promotes autophosphorylation of Met through its virulence factor internalin B (InlB) and subsequently exploits Met signaling to induce phagocytosis into a broad range of host cells. Although the interaction between InlB and Met has been studied in detail, the signaling specificity of components involved in InlB-triggered cellular responses remains poorly characterized. The analysis of regulated phosphorylation events on protein kinases is therefore of particular relevance, although these events had not yet been characterized systematically by proteomics. Here, we implemented a new pyridopyrimidine-based strategy that enabled the efficient capture of a considerable subset of the human kinome in a robust one-step affinity chromatographic procedure. Additionally, and to gain functional insights into the InlB/Met-induced bacterial invasion process, a quantitative survey of the phosphorylation pattern of these protein kinases was accomplished. In total, the experimental design of this study comprises affinity chromatographic procedures for the systematic enrichment of kinases, as well as phosphopeptides; the quantification of all peptides based on the iTRAQ™ reporter system; and a rational statistical strategy to evaluate the quality of phosphosite regulations. With this improved chemical proteomics strategy, we determined and relatively quantified 143 phosphorylation sites detected on 94 human protein kinases. Interestingly, InlB-mediated signaling shows striking similarities to signaling by the natural ligand hepatocyte growth factor, which was intensively studied in the past. In addition, this systematic approach suggests a new subset of protein kinases including Nek9, which are differentially

  3. Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development

    NASA Technical Reports Server (NTRS)

    Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.

    2003-01-01

    BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.

  4. Forty Years of the "Journal of Librarianship and Information Science": A Quantitative Analysis, Part I

    ERIC Educational Resources Information Center

    Furner, Jonathan

    2009-01-01

    This paper reports on the first part of a two-part quantitative analysis of volume 1-40 (1969-2008) of the "Journal of Librarianship and Information Science" (formerly the "Journal of Librarianship"). It provides an overview of the current state of LIS research journal publishing in the UK; a review of the publication and printing history of…

  5. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    ERIC Educational Resources Information Center

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  6. A Computer Program for Calculation of Calibration Curves for Quantitative X-Ray Diffraction Analysis.

    ERIC Educational Resources Information Center

    Blanchard, Frank N.

    1980-01-01

    Describes a FORTRAN IV program written to supplement a laboratory exercise dealing with quantitative x-ray diffraction analysis of mixtures of polycrystalline phases in an introductory course in x-ray diffraction. Gives an example of the use of the program and compares calculated and observed calibration data. (Author/GS)

  7. Quantitative Analysis of Organic Compounds: A Simple and Rapid Method for Use in Schools

    ERIC Educational Resources Information Center

    Schmidt, Hans-Jurgen

    1973-01-01

    Describes the procedure for making a quantitative analysis of organic compounds suitable for secondary school chemistry classes. Using the Schoniger procedure, the organic compound, such as PVC, is decomposed in a conical flask with oxygen. The products are absorbed in a suitable liquid and analyzed by titration. (JR)

  8. Whose American Government? A Quantitative Analysis of Gender and Authorship in American Politics Texts

    ERIC Educational Resources Information Center

    Cassese, Erin C.; Bos, Angela L.; Schneider, Monica C.

    2014-01-01

    American government textbooks signal to students the kinds of topics that are important and, by omission, the kinds of topics that are not important to the discipline of political science. This article examines portrayals of women in introductory American politics textbooks through a quantitative content analysis of 22 widely used texts. We find…

  9. QUANTITATIVE PCR ANALYSIS OF MOLDS IN THE DUST FROM HOMES OF ASTHMATIC CHILDREN IN NORTH CAROLINA

    EPA Science Inventory

    The vacuum bag (VB) dust was analyzed by mold-specific quantitative PCR. These results were used to calculate the Environmental Relative Moldiness Index (ERMI) value for each of the homes. The mean and standard deviation (SD) of the ERMI values in the homes of the NC asthmatic children was 16.4 (6.77), compa...

  10. A Quantitative Features Analysis of Recommended No- and Low-Cost Preschool E-Books

    ERIC Educational Resources Information Center

    Parette, Howard P.; Blum, Craig; Luthin, Katie

    2015-01-01

    In recent years, recommended e-books have drawn increasing attention from early childhood education professionals. This study applied a quantitative descriptive features analysis of cost (n = 70) and no-cost (n = 60) e-books recommended by the Texas Computer Education Association. While t tests revealed no statistically significant differences…

  11. A Quantitative Categorical Analysis of Metadata Elements in Image-Applicable Metadata Schemas.

    ERIC Educational Resources Information Center

    Greenberg, Jane

    2001-01-01

    Reports on a quantitative categorical analysis of metadata elements in the Dublin Core, VRA (Visual Resource Association) Core, REACH (Record Export for Art and Cultural Heritage), and EAD (Encoded Archival Description) metadata schemas, all of which can be used for organizing and describing images. Introduces a new schema comparison methodology…

  12. Qualitative and quantitative analysis of mixtures of compounds containing both hydrogen and deuterium

    NASA Technical Reports Server (NTRS)

    Crespi, H. L.; Harkness, L.; Katz, J. J.; Norman, G.; Saur, W.

    1969-01-01

    Method allows qualitative and quantitative analysis of mixtures of partially deuterated compounds. Nuclear magnetic resonance spectroscopy determines location and amount of deuterium in organic compounds but not fully deuterated compounds. Mass spectroscopy can detect fully deuterated species but not the location.

  13. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    SciTech Connect

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  14. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  15. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    ERIC Educational Resources Information Center

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  16. Quantitative Intersectionality: A Critical Race Analysis of the Chicana/o Educational Pipeline

    ERIC Educational Resources Information Center

    Covarrubias, Alejandro

    2011-01-01

    Utilizing the critical race framework of intersectionality, this research reexamines the Chicana/o educational pipeline through a quantitative intersectional analysis. This approach disaggregates data along the intersection of race, class, gender, and citizenship status to provide a detailed portrait of the educational trajectory of Mexican-origin…

  17. A Colorimetric Analysis Experiment Not Requiring a Spectrophotometer: Quantitative Determination of Albumin in Powdered Egg White

    ERIC Educational Resources Information Center

    Charlton, Amanda K.; Sevcik, Richard S.; Tucker, Dorie A.; Schultz, Linda D.

    2007-01-01

    A general science experiment for high school chemistry students might serve as an excellent review of the concepts of solution preparation, solubility, pH, and qualitative and quantitative analysis of a common food product. The students could learn to use safe laboratory techniques, collect and analyze data using proper scientific methodology and…

  18. A Quantitative Analysis of Cognitive Strategy Usage in the Marking of Two GCSE Examinations

    ERIC Educational Resources Information Center

    Suto, W. M. Irenka; Greatorex, Jackie

    2008-01-01

    Diverse strategies for marking GCSE examinations have been identified, ranging from simple automatic judgements to complex cognitive operations requiring considerable expertise. However, little is known about patterns of strategy usage or how such information could be utilised by examiners. We conducted a quantitative analysis of previous verbal…

  19. Gas chromatograph-mass spectrometer (GC/MS) system for quantitative analysis of reactive chemical compounds

    DOEpatents

    Grindstaff, Quirinus G.

    1992-01-01

    Described is a new gas chromatograph-mass spectrometer (GC/MS) system and method for quantitative analysis of reactive chemical compounds. All components of such a GC/MS system external to the oven of the gas chromatograph are programmably temperature controlled to operate at a volatilization temperature specific to the compound(s) sought to be separated and measured.

  20. A Quantitative Discourse Analysis of Student-Initiated Checks of Understanding during Teacher-Fronted Lessons

    ERIC Educational Resources Information Center

    Shepherd, Michael A.

    2012-01-01

    Recent research highlights the paradoxical importance of students' being able to check their understanding with teachers and of teachers' constraining student participation. Using quantitative discourse analysis, this paper examines third graders' discursive strategies in initiating such checks and teachers' strategies in constraining them. The…

  1. Clinical applications of a quantitative analysis of regional left ventricular wall motion

    NASA Technical Reports Server (NTRS)

    Leighton, R. F.; Rich, J. M.; Pollack, M. E.; Altieri, P. I.

    1975-01-01

    Observations were summarized which may have clinical application. These were obtained from a quantitative analysis of wall motion that was used to detect both hypokinesis and tardokinesis in left ventricular cineangiograms. The method was based on statistical comparisons with normal values for regional wall motion derived from the cineangiograms of patients who were found not to have heart disease.

  2. Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.

    PubMed

    Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

    2006-09-01

    Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection. PMID:16730627

  3. Distance measures and optimization spaces in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Rode, Karyn D.; Budge, Suzanne M.; Thiemann, Gregory W.

    2015-01-01

    Quantitative fatty acid signature analysis has become an important method of diet estimation in ecology, especially marine ecology. Controlled feeding trials to validate the method and estimate the calibration coefficients necessary to account for differential metabolism of individual fatty acids have been conducted with several species from diverse taxa. However, research into potential refinements of the estimation method has been limited. We compared the performance of the original method of estimating diet composition with that of five variants based on different combinations of distance measures and calibration-coefficient transformations between prey and predator fatty acid signature spaces. Fatty acid signatures of pseudopredators were constructed using known diet mixtures of two prey data sets previously used to estimate the diets of polar bears Ursus maritimus and gray seals Halichoerus grypus, and their diets were then estimated using all six variants. In addition, previously published diets of Chukchi Sea polar bears were re-estimated using all six methods. Our findings reveal that the selection of an estimation method can meaningfully influence estimates of diet composition. Among the pseudopredator results, which allowed evaluation of bias and precision, differences in estimator performance were rarely large, and no one estimator was universally preferred, although estimators based on the Aitchison distance measure tended to have modestly superior properties compared to estimators based on the Kullback-Leibler distance measure. However, greater differences were observed among estimated polar bear diets, most likely due to differential estimator sensitivity to assumption violations. Our results, particularly the polar bear example, suggest that additional research into estimator performance and model diagnostics is warranted.
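
    As an illustration of the estimation step described above, the sketch below sets up a toy QFASA-style optimization in Python: diet proportions are chosen to minimize a Kullback-Leibler-type distance between a calibration-adjusted predator signature and a weighted mixture of prey signatures. The signatures, calibration coefficients and the exact form of the distance are invented for illustration and are not the values or implementation used by the authors.

```python
# Toy sketch of the core QFASA optimization; all data are invented.
import numpy as np
from scipy.optimize import minimize

def kl_distance(p, q, eps=1e-12):
    """Kullback-Leibler-type distance between two compositions."""
    p, q = p + eps, q + eps
    return float(np.sum((p - q) * np.log(p / q)))

def estimate_diet(predator_sig, prey_sigs, calibration):
    """Estimate diet proportions (one per prey type) for one predator."""
    # Adjust the predator signature for differential metabolism, then renormalize.
    adjusted = predator_sig / calibration
    adjusted /= adjusted.sum()

    n_prey = prey_sigs.shape[0]

    def objective(alpha):
        mixture = alpha @ prey_sigs          # weighted average prey signature
        mixture /= mixture.sum()
        return kl_distance(adjusted, mixture)

    constraints = {"type": "eq", "fun": lambda a: a.sum() - 1.0}
    bounds = [(0.0, 1.0)] * n_prey
    x0 = np.full(n_prey, 1.0 / n_prey)
    result = minimize(objective, x0, bounds=bounds, constraints=constraints,
                      method="SLSQP")
    return result.x

# Toy example: 3 prey types, 5 fatty acids (rows of prey_sigs sum to 1).
prey_sigs = np.array([[0.40, 0.25, 0.15, 0.10, 0.10],
                      [0.10, 0.45, 0.20, 0.15, 0.10],
                      [0.20, 0.10, 0.40, 0.20, 0.10]])
calibration = np.array([1.1, 0.9, 1.0, 1.2, 0.8])   # invented coefficients
true_diet = np.array([0.5, 0.3, 0.2])
predator_sig = (true_diet @ prey_sigs) * calibration
predator_sig /= predator_sig.sum()

print(estimate_diet(predator_sig, prey_sigs, calibration))  # ~[0.5, 0.3, 0.2]
```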

  4. Quantitative analysis of glycerol in dicarboxylic acid-rich cutins provides insights into Arabidopsis cutin structure.

    PubMed

    Yang, Weili; Pollard, Mike; Li-Beisson, Yonghua; Ohlrogge, John

    2016-10-01

    Cutin is an extracellular lipid polymer that contributes to protective cuticle barrier functions against biotic and abiotic stresses in land plants. Glycerol has been reported as a component of cutin, contributing up to 14% by weight of total released monomers. Previous studies using partial hydrolysis of cuticle-enriched preparations established the presence of oligomers with glycerol-aliphatic ester links. Furthermore, glycerol-3-phosphate 2-O-acyltransferases (sn-2-GPATs) are essential for cutin biosynthesis. However, precise roles of glycerol in cutin assembly and structure remain uncertain. Here, a stable isotope-dilution assay was developed for the quantitative analysis of glycerol by GC/MS of triacetin with simultaneous determination of aliphatic monomers. To provide clues about the role of glycerol in dicarboxylic acid (DCA)-rich cutins, this methodology was applied to compare wild-type (WT) Arabidopsis cutin with a series of mutants that are defective in cutin synthesis. The molar ratio of glycerol to total DCAs in WT cutins was 2:1. Even when allowing for a small additional contribution from hydroxy fatty acids, this is a substantially higher glycerol to aliphatic monomer ratio than previously reported for any cutin. Glycerol content was strongly reduced in both stem and leaf cutin from all Arabidopsis mutants analyzed (gpat4/gpat8, att1-2 and lacs2-3). In addition, the molar reduction of glycerol was proportional to the molar reduction of total DCAs. These results suggest "glycerol-DCA-glycerol" may be the dominant motif in DCA-rich cutins. The ramifications and caveats for this hypothesis are presented. PMID:27211345
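
    The quantification principle of a stable isotope-dilution assay can be sketched in a few lines: the analyte amount follows from the analyte/internal-standard peak-area ratio, the known amount of labeled standard spiked into the sample, and a response factor determined from calibration. The function and numbers below are illustrative assumptions, not the authors' actual workflow or values.

```python
# Illustrative isotope-dilution calculation; numbers are hypothetical.
def isotope_dilution_amount(area_analyte, area_internal_std,
                            amount_internal_std_nmol, response_factor=1.0):
    """Return the analyte amount (nmol) in the analyzed sample."""
    return (area_analyte / area_internal_std) * amount_internal_std_nmol / response_factor

# Hypothetical numbers: 50 nmol of labeled standard spiked, RF from calibration.
glycerol_nmol = isotope_dilution_amount(area_analyte=2.4e6,
                                        area_internal_std=1.6e6,
                                        amount_internal_std_nmol=50.0,
                                        response_factor=1.05)
print(f"{glycerol_nmol:.1f} nmol glycerol")
```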

  5. Distance measures and optimization spaces in quantitative fatty acid signature analysis

    PubMed Central

    Bromaghin, Jeffrey F; Rode, Karyn D; Budge, Suzanne M; Thiemann, Gregory W

    2015-01-01

    Quantitative fatty acid signature analysis has become an important method of diet estimation in ecology, especially marine ecology. Controlled feeding trials to validate the method and estimate the calibration coefficients necessary to account for differential metabolism of individual fatty acids have been conducted with several species from diverse taxa. However, research into potential refinements of the estimation method has been limited. We compared the performance of the original method of estimating diet composition with that of five variants based on different combinations of distance measures and calibration-coefficient transformations between prey and predator fatty acid signature spaces. Fatty acid signatures of pseudopredators were constructed using known diet mixtures of two prey data sets previously used to estimate the diets of polar bears Ursus maritimus and gray seals Halichoerus grypus, and their diets were then estimated using all six variants. In addition, previously published diets of Chukchi Sea polar bears were re-estimated using all six methods. Our findings reveal that the selection of an estimation method can meaningfully influence estimates of diet composition. Among the pseudopredator results, which allowed evaluation of bias and precision, differences in estimator performance were rarely large, and no one estimator was universally preferred, although estimators based on the Aitchison distance measure tended to have modestly superior properties compared to estimators based on the Kullback–Leibler distance measure. However, greater differences were observed among estimated polar bear diets, most likely due to differential estimator sensitivity to assumption violations. Our results, particularly the polar bear example, suggest that additional research into estimator performance and model diagnostics is warranted. PMID:25859330

  6. Genome-scan analysis for quantitative trait loci in an F2 tilapia hybrid.

    PubMed

    Cnaani, A; Zilberman, N; Tinman, S; Hulata, G; Ron, M

    2004-09-01

    We searched for genetic linkage between DNA markers and quantitative trait loci (QTLs) for innate immunity, response to stress, biochemical parameters of blood, and fish size in an F2 population derived from an interspecific tilapia hybrid (Oreochromis mossambicus × O. aureus). A family of 114 fish was scanned for 40 polymorphic microsatellite DNA markers and two polymorphic genes, covering approximately 80% of the tilapia genome. These fish had previously been phenotyped for seven immune-response traits and six blood parameters. Critical values for significance were P < 0.05 with the false discovery rate (FDR) controlled at 40%. The genome-scan analysis resulted in 35 significant marker-trait associations, involving 26 markers in 16 linkage groups. In a second experiment, nine markers were re-sampled in a second family of 79 fish of the same species hybrid. Seven markers (GM180, GM553, MHC-I, UNH848, UNH868, UNH898 and UNH925) in five linkage groups (LG 1, 3, 4, 22 and 23) were associated with stress response traits. An additional six markers (GM47, GM552, UNH208, UNH881, UNH952, UNH998) in five linkage groups (LG 4, 16, 19, 20 and 23) were verified for their associations with immune response traits, by linkage to several different traits. The proportion of variance explained by each QTL was 11% on average, with a maximum of 29%. The average additive effect of QTLs was 0.2 standard deviation units of stress response traits and fish size, with a maximum of 0.33. In three linkage groups (LG 1, 3 and 23) markers were associated with stress response, body weight and sex determination, confirming the location of QTLs reported by several other studies. PMID:15449174
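
    The abstract states that significance was declared at P < 0.05 with the false discovery rate controlled at 40%. One standard way to control the FDR is the Benjamini-Hochberg procedure, sketched below with invented p-values; the authors' exact procedure is not specified in the abstract.

```python
# Minimal Benjamini-Hochberg sketch for controlling the FDR at q = 0.40.
# The p-values below are invented for illustration.
import numpy as np

def benjamini_hochberg(p_values, q=0.40):
    """Return a boolean array marking p-values significant at FDR level q."""
    p = np.asarray(p_values)
    order = np.argsort(p)
    ranked = p[order]
    m = len(p)
    thresholds = q * (np.arange(1, m + 1) / m)
    below = ranked <= thresholds
    significant = np.zeros(m, dtype=bool)
    if below.any():
        cutoff = np.max(np.where(below)[0])       # largest rank passing the test
        significant[order[:cutoff + 1]] = True
    return significant

p_values = [0.001, 0.008, 0.02, 0.04, 0.11, 0.30, 0.52, 0.85]
print(benjamini_hochberg(p_values, q=0.40))
```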

  7. Distance measures and optimization spaces in quantitative fatty acid signature analysis.

    PubMed

    Bromaghin, Jeffrey F; Rode, Karyn D; Budge, Suzanne M; Thiemann, Gregory W

    2015-03-01

    Quantitative fatty acid signature analysis has become an important method of diet estimation in ecology, especially marine ecology. Controlled feeding trials to validate the method and estimate the calibration coefficients necessary to account for differential metabolism of individual fatty acids have been conducted with several species from diverse taxa. However, research into potential refinements of the estimation method has been limited. We compared the performance of the original method of estimating diet composition with that of five variants based on different combinations of distance measures and calibration-coefficient transformations between prey and predator fatty acid signature spaces. Fatty acid signatures of pseudopredators were constructed using known diet mixtures of two prey data sets previously used to estimate the diets of polar bears Ursus maritimus and gray seals Halichoerus grypus, and their diets were then estimated using all six variants. In addition, previously published diets of Chukchi Sea polar bears were re-estimated using all six methods. Our findings reveal that the selection of an estimation method can meaningfully influence estimates of diet composition. Among the pseudopredator results, which allowed evaluation of bias and precision, differences in estimator performance were rarely large, and no one estimator was universally preferred, although estimators based on the Aitchison distance measure tended to have modestly superior properties compared to estimators based on the Kullback-Leibler distance measure. However, greater differences were observed among estimated polar bear diets, most likely due to differential estimator sensitivity to assumption violations. Our results, particularly the polar bear example, suggest that additional research into estimator performance and model diagnostics is warranted. PMID:25859330

  8. Quantitative analysis of defects in silicon. Silicon sheet growth development for the large area silicon sheet task of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Smith, J. M.; Bruce, T.; Oidwai, H. A.

    1980-01-01

    One hundred and seventy four silicon sheet samples were analyzed for twin boundary density, dislocation pit density, and grain boundary length. Procedures were developed for the quantitative analysis of the twin boundary and dislocation pit densities using a QTM-720 Quantitative Image Analyzing system. The QTM-720 system was upgraded with the addition of a PDP 11/03 mini-computer with dual floppy disc drive, a digital equipment writer high speed printer, and a field-image feature interface module. Three versions of a computer program that controls the data acquisition and analysis on the QTM-720 were written. Procedures for the chemical polishing and etching were also developed.

  9. Electroencephalography reactivity for prognostication of post-anoxic coma after cardiopulmonary resuscitation: A comparison of quantitative analysis and visual analysis.

    PubMed

    Liu, Gang; Su, Yingying; Jiang, Mengdi; Chen, Weibi; Zhang, Yan; Zhang, Yunzhou; Gao, Daiquan

    2016-07-28

    Electroencephalogram reactivity (EEG-R) is a positive predictive factor for assessing outcomes in comatose patients. Most studies assess the prognostic value of EEG-R using visual analysis; however, this method is prone to subjectivity. We sought to categorize EEG-R with a quantitative approach. We retrospectively studied consecutive comatose patients who had an EEG-R recording performed 1-3 days after cardiopulmonary resuscitation (CPR) or during normothermia after therapeutic hypothermia. EEG-R was assessed by visual analysis and quantitative analysis separately. Clinical outcomes were followed up at 3 months and dichotomized as recovery of awareness or no recovery of awareness. A total of 96 patients met the inclusion criteria, and 38 (40%) patients had recovered awareness at the 3-month follow-up. Of the 27 patients with EEG-R on visual analysis, 22 recovered awareness; of the 69 patients without EEG-R on visual analysis, 16 recovered awareness. The sensitivity and specificity of visually assessed EEG-R were 58% and 91%, respectively. The area under the receiver operating characteristic curve for the quantitative analysis was 0.92 (95% confidence interval, 0.87-0.97), with a best cut-off value of 0.10. EEG-R assessed by quantitative analysis might be a good method for predicting the recovery of awareness in patients with post-anoxic coma after CPR. PMID:27181515
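
    The reported sensitivity and specificity of visually assessed EEG-R can be reconstructed directly from the counts given in the abstract, as the short check below shows.

```python
# Recomputing sensitivity and specificity of visually assessed EEG-R from the
# abstract's counts: 38 of 96 patients recovered awareness; 22 of the 27
# EEG-R-positive and 16 of the 69 EEG-R-negative patients recovered.
recovered_total = 38
recovered_with_eegr = 22      # true positives (EEG-R predicting recovery)
recovered_without_eegr = 16   # false negatives
no_recovery_total = 96 - recovered_total               # 58
no_recovery_with_eegr = 27 - recovered_with_eegr       # 5 false positives
no_recovery_without_eegr = 69 - recovered_without_eegr # 53 true negatives

sensitivity = recovered_with_eegr / recovered_total          # 22/38 ≈ 0.58
specificity = no_recovery_without_eegr / no_recovery_total   # 53/58 ≈ 0.91
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```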

  10. Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Jiang, Dejun; Zhao, Shusen; Shen, Jingling

    2008-03-01

    A method was proposed to quantitatively inspect mixtures of illicit drugs with terahertz time-domain spectroscopy. The mass percentages of all components in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, we first used common chemicals, benzophenone, anthraquinone, pyridoxine hydrochloride and L-ascorbic acid, in the experiment. Then an illicit drug and a common adulterant, methamphetamine and flour, were selected for our experiment. Experimental results were in good agreement with the actual content, which suggests that this could be an effective method for the quantitative identification of illicit drugs.
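
    The linear-regression idea behind this approach can be sketched as follows: the mixture absorption spectrum is modeled as a non-negative weighted sum of the known pure-component spectra, and the weights (mass fractions after normalization) are obtained by least squares. The spectra below are synthetic Gaussian stand-ins rather than real THz data, and non-negative least squares is one reasonable solver choice rather than the authors' stated method.

```python
# Sketch of mixture decomposition against known component spectra (synthetic data).
import numpy as np
from scipy.optimize import nnls

freqs = np.linspace(0.2, 2.5, 300)                       # THz axis

def peak(center, width, height):
    return height * np.exp(-((freqs - center) / width) ** 2)

# Pure-component reference spectra (columns of the design matrix).
components = np.column_stack([
    peak(0.8, 0.05, 1.0) + peak(1.4, 0.08, 0.6),         # component A
    peak(1.1, 0.06, 0.9),                                 # component B
    peak(1.8, 0.07, 1.2) + peak(0.6, 0.05, 0.4),          # component C
])

true_fractions = np.array([0.5, 0.3, 0.2])
mixture = components @ true_fractions + np.random.normal(0, 0.01, freqs.size)

weights, _ = nnls(components, mixture)
mass_fractions = weights / weights.sum()                  # normalize to fractions
print(np.round(mass_fractions, 3))                        # ≈ [0.5, 0.3, 0.2]
```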

  11. A combined qualitative and quantitative procedure for the chemical analysis of urinary calculi

    PubMed Central

    Hodgkinson, A.

    1971-01-01

    A better understanding of the physico-chemical principles underlying the formation of calculus has led to a need for more precise information on the chemical composition of stones. A combined qualitative and quantitative procedure for the chemical analysis of urinary calculi which is suitable for routine use is presented. The procedure involves five simple qualitative tests followed by the quantitative determination of calcium, magnesium, inorganic phosphate, and oxalate. These data are used to calculate the composition of the stone in terms of calcium oxalate, apatite, and magnesium ammonium phosphate. Analytical results and derived values for five representative types of calculi are presented. PMID:5551382
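
    A rough sketch of how measured calcium, magnesium, inorganic phosphate and oxalate amounts might be converted into a mineral composition is given below. The allocation rules and molar masses are our own simplifying assumptions for illustration; the paper's actual calculation scheme may differ.

```python
# Hypothetical allocation of measured millimole amounts to stone minerals.
def stone_composition(ca_mmol, mg_mmol, po4_mmol, ox_mmol):
    # Assume all oxalate is bound as calcium oxalate (CaC2O4, ~128 g/mol).
    caox_mmol = min(ca_mmol, ox_mmol)
    # Assume all magnesium is bound as struvite (MgNH4PO4·6H2O, ~245 g/mol).
    struvite_mmol = min(mg_mmol, po4_mmol)
    # Remaining phosphate assigned to apatite (Ca5(PO4)3OH, ~502 g/mol; 3 PO4 per unit).
    apatite_mmol = max(po4_mmol - struvite_mmol, 0.0) / 3.0
    return {
        "calcium oxalate (mg)": caox_mmol * 128.1,
        "struvite (mg)": struvite_mmol * 245.4,
        "apatite (mg)": apatite_mmol * 502.3,
    }

print(stone_composition(ca_mmol=1.2, mg_mmol=0.1, po4_mmol=0.5, ox_mmol=1.0))
```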

  12. Quantitative analysis of proteome extracted from barley crowns grown under different drought conditions

    PubMed Central

    Vítámvás, Pavel; Urban, Milan O.; Škodáček, Zbynek; Kosová, Klára; Pitelková, Iva; Vítámvás, Jan; Renaut, Jenny; Prášil, Ilja T.

    2015-01-01

    Barley cultivar Amulet was used to study the quantitative proteome changes through different drought conditions utilizing two-dimensional difference gel electrophoresis (2D-DIGE). Plants were cultivated for 10 days under different drought conditions. To obtain control and differentially drought-treated plants, the soil water content was kept at 65, 35, and 30% of soil water capacity (SWC), respectively. Osmotic potential, water saturation deficit, 13C discrimination, and dehydrin accumulation were monitored during sampling of the crowns for proteome analysis. Analysis of the 2D-DIGE gels revealed 105 differentially abundant spots; most were differentially abundant between the controls and drought-treated plants, and 25 spots displayed changes between both drought conditions. Seventy-six protein spots were successfully identified by tandem mass spectrometry. The most frequent functional categories of the identified proteins can be put into the groups of: stress-associated proteins, amino acid metabolism, carbohydrate metabolism, as well as DNA and RNA regulation and processing. Their possible role in the response of barley to drought stress is discussed. Our study has shown that under drought conditions barley cv. Amulet decreased its growth and developmental rates, displayed a shift from aerobic to anaerobic metabolism, and exhibited increased levels of several protective proteins. Comparison of the two drought treatments revealed plant acclimation to milder drought (35% SWC); but plant damage under more severe drought treatment (30% SWC). The results obtained revealed that cv. Amulet is sensitive to drought stress. Additionally, four spots revealing a continuous and significant increase with decreasing SWC (UDP-glucose 6-dehydrogenase, glutathione peroxidase, and two non-identified) could be good candidates for testing of their protein phenotyping capacity together with proteins that were significantly distinguished in both drought treatments. PMID:26175745

  13. Quantitative trace analysis of fullerenes in river sediment from Spain and soils from Saudi Arabia.

    PubMed

    Sanchís, Josep; Božović, Dalibor; Al-Harbi, Naif A; Silva, Luis F; Farré, Marinella; Barceló, Damià

    2013-07-01

    A quantitative method based on ultrasound-assisted toluene extraction followed by liquid chromatography-electrospray ionization-tandem mass spectrometry for the analysis of C60 and C70 fullerenes, N-methylfulleropyrrolidine, [6, 6]-phenyl C61 butyric acid methyl ester and [6, 6]-thienyl C61 butyric acid methyl ester has been developed. The method was validated using fortified blank river sediments according to the criteria of Commission Decision 2002/657/EC. The method limits of detection ranged from 14 to 290 pg/g, making it suitable for its application in environmental analysis. The method has been applied to investigate fullerene content in 58 soil samples collected from different urban and industrial areas in Saudi Arabia and in river sediment from six different sites in the Llobregat River Basin. In addition, in the case of the Llobregat River, superficial water samples from the same sites of the sediments were collected and analysed using a previous method. In soils from Saudi Arabia, C60-fullerene was the only compound that was detected and quantified in 19% of samples. In the sediments of the Llobregat River, C60-fullerene was also the only one detected (33% of the samples), while in river water, C70-fullerene was the most frequent compound, and it was quantified in 67% of the samples. However, C60-fullerene was present in two of the six samples, but at higher concentrations than C70-fullerene, ranging from 0.9 to 7.8 ng/L. PMID:23545859

  14. Perspective - synthetic DEMs: A vital underpinning for the quantitative future of landform analysis?

    NASA Astrophysics Data System (ADS)

    Hillier, J. K.; Sofia, G.; Conway, S. J.

    2015-12-01

    Physical processes, including anthropogenic feedbacks, sculpt planetary surfaces (e.g. Earth's). A fundamental tenet of geomorphology is that the shapes created, when combined with other measurements, can be used to understand those processes. Artificial or synthetic digital elevation models (DEMs) might be vital in progressing further with this endeavour in two ways. First, synthetic DEMs can be built (e.g. by directly using governing equations) to encapsulate the processes, making predictions from theory. A second, arguably underutilised, role is to perform checks on accuracy and robustness that we dub "synthetic tests". Specifically, synthetic DEMs can contain a priori known, idealised morphologies that numerical landscape evolution models, DEM-analysis algorithms, and even manual mapping can be assessed against. Some such tests, for instance examining inaccuracies caused by noise, are moderately commonly employed, whilst others are much less so. Derived morphological properties, including metrics and mapping (manual and automated), are required to establish whether or not conceptual models represent reality well, but at present their quality is typically weakly constrained (e.g. by mapper inter-comparison). Relatively rare examples illustrate how synthetic tests can make strong "absolute" statements about landform detection and quantification; for example, 84 % of valley heads in the real landscape are identified correctly. From our perspective, it is vital to verify such statistics quantifying the properties of landscapes as ultimately this is the link between physics-driven models of processes and morphological observations that allows quantitative hypotheses to be tested. As such the additional rigour possible with this second usage of synthetic DEMs feeds directly into a problem central to the validity of much of geomorphology. Thus, this note introduces synthetic tests and DEMs and then outlines a typology of synthetic DEMs along with their benefits
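
    The "synthetic test" idea can be illustrated with a minimal example: construct a DEM whose morphology is known a priori, run a simple detection step, and compare against the known ground truth. The construction (a Gaussian hill on a sloping plane plus noise) and the naive detector below are illustrative only, not taken from the paper.

```python
# Minimal synthetic-DEM test: known morphology, simple detector, ground-truth check.
import numpy as np

nx = ny = 200
x, y = np.meshgrid(np.linspace(0, 1000, nx), np.linspace(0, 1000, ny))  # metres

plane = 0.02 * x                                        # regional 2% slope
hill = 50.0 * np.exp(-(((x - 600) ** 2 + (y - 400) ** 2) / (2 * 120.0 ** 2)))
noise = np.random.normal(0.0, 0.5, (ny, nx))            # instrument-like noise
dem = plane + hill + noise

# Naive summit detector: highest point after removing the regional trend.
detrended = dem - plane
iy, ix = np.unravel_index(np.argmax(detrended), dem.shape)
print(f"detected summit at x={x[iy, ix]:.0f} m, y={y[iy, ix]:.0f} m "
      f"(true summit at x=600 m, y=400 m)")
```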

  15. Quantitative hopanoid analysis enables robust pattern detection and comparison between laboratories.

    PubMed

    Wu, C-H; Kong, L; Bialecka-Fornal, M; Park, S; Thompson, A L; Kulkarni, G; Conway, S J; Newman, D K

    2015-07-01

    Hopanoids are steroid-like lipids from the isoprenoid family that are produced primarily by bacteria. Hopanes, molecular fossils of hopanoids, offer the potential to provide insight into environmental transitions on the early Earth, if their sources and biological functions can be constrained. Semiquantitative methods for mass spectrometric analysis of hopanoids from cultures and environmental samples have been developed in the last two decades. However, the structural diversity of hopanoids, and possible variability in their ionization efficiencies on different instruments, have thus far precluded robust quantification and hindered comparison of results between laboratories. These ionization inconsistencies give rise to the need to calibrate individual instruments with purified hopanoids to reliably quantify hopanoids. Here, we present new approaches to obtain both purified and synthetic quantification standards. We optimized 2-methylhopanoid production in Rhodopseudomonas palustris TIE-1 and purified 2Me-diplopterol, 2Me-bacteriohopanetetrol (2Me-BHT), and their unmethylated species (diplopterol and BHT). We found that 2-methylation decreases the signal intensity of diplopterol between 2 and 34% depending on the instrument used to detect it, but decreases the BHT signal less than 5%. In addition, 2Me-diplopterol produces 10× higher ion counts than equivalent quantities of 2Me-BHT. Similar deviations were also observed using a flame ionization detector for signal quantification in GC. In LC-MS, however, 2Me-BHT produces 11× higher ion counts than 2Me-diplopterol but only 1.2× higher ion counts than the sterol standard pregnane acetate. To further improve quantification, we synthesized tetradeuterated (D4) diplopterol, a precursor for a variety of hopanoids. LC-MS analysis on a mixture of (D4)-diplopterol and phospholipids showed that under the influence of co-eluted phospholipids, the D4-diplopterol internal standard quantifies diplopterol more accurately than

  16. Quantitative hopanoid analysis enables robust pattern detection and comparison between laboratories

    PubMed Central

    Wu, C-H; Kong, L; Bialecka-Fornal, M; Park, S; Thompson, A L; Kulkarni, G; Conway, S J; Newman, D K

    2015-01-01

    Hopanoids are steroid-like lipids from the isoprenoid family that are produced primarily by bacteria. Hopanes, molecular fossils of hopanoids, offer the potential to provide insight into environmental transitions on the early Earth, if their sources and biological functions can be constrained. Semiquantitative methods for mass spectrometric analysis of hopanoids from cultures and environmental samples have been developed in the last two decades. However, the structural diversity of hopanoids, and possible variability in their ionization efficiencies on different instruments, have thus far precluded robust quantification and hindered comparison of results between laboratories. These ionization inconsistencies give rise to the need to calibrate individual instruments with purified hopanoids to reliably quantify hopanoids. Here, we present new approaches to obtain both purified and synthetic quantification standards. We optimized 2-methylhopanoid production in Rhodopseudomonas palustris TIE-1 and purified 2Me-diplopterol, 2Me-bacteriohopanetetrol (2Me-BHT), and their unmethylated species (diplopterol and BHT). We found that 2-methylation decreases the signal intensity of diplopterol between 2 and 34% depending on the instrument used to detect it, but decreases the BHT signal less than 5%. In addition, 2Me-diplopterol produces 10× higher ion counts than equivalent quantities of 2Me-BHT. Similar deviations were also observed using a flame ionization detector for signal quantification in GC. In LC-MS, however, 2Me-BHT produces 11× higher ion counts than 2Me-diplopterol but only 1.2× higher ion counts than the sterol standard pregnane acetate. To further improve quantification, we synthesized tetradeuterated (D4) diplopterol, a precursor for a variety of hopanoids. LC-MS analysis on a mixture of (D4)-diplopterol and phospholipids showed that under the influence of co-eluted phospholipids, the D4-diplopterol internal standard quantifies diplopterol more accurately than

  17. Qualitative and quantitative analysis of uroliths in dogs: definitive determination of chemical type.

    PubMed

    Bovee, K C; McGuire, T

    1984-11-01

    Effective treatment and prevention of urolithiasis depends on accurate determination of the chemical nature of the uroliths. A widely used qualitative chemical procedure was compared with quantitative crystallographic analysis of 272 canine uroliths. Agreement between the 2 methods was 78%. Qualitative analysis failed to detect 62% of calcium-containing uroliths and 83% of carbonate apatite uroliths. Qualitative analysis gave false-positive results for urates in 55% of cystine uroliths. Mixed uroliths comprising 6% of the total could not be classified without quantitative analysis. Silicate, cystine, and urate uroliths generally were of pure composition. Crystallographic analysis indicated the following distribution of major types: struvite, 69%; calcium oxalate, 10%; urate, 7%; silicate, 3.5%; cystine, 3.2%; calcium phosphate, 1%; and mixed, 6%. Among dogs with struvite uroliths, 66% had positive results of bacterial culturing from the urinary bladder. Six breeds (Miniature Schnauzer, Welsh Corgi, Lhasa Apso, Yorkshire Terrier, Pekingese, and Pug) had a significantly higher risk for urolithiasis, compared with other breeds. The German Shepherd Dog had a significantly lowered risk, compared with other breeds. Two breeds had significant relationship to a specific type of urolith: Miniature Schnauzer for oxalate, and Dalmatian for urate (P less than 0.001). It was concluded that quantitative analysis, using crystallography, was superior for the detection of calcium oxalate, carbonate apatite, cystine, urate, and mixed uroliths. PMID:6511641

  18. Advances in liquid chromatography-high-resolution mass spectrometry for quantitative and qualitative environmental analysis.

    PubMed

    Aceña, Jaume; Stampachiacchiere, Serena; Pérez, Sandra; Barceló, Damià

    2015-08-01

    This review summarizes the advances in environmental analysis by liquid chromatography-high-resolution mass spectrometry (LC-HRMS) during the last decade and discusses different aspects of their application. LC-HRMS has become a powerful tool for simultaneous quantitative and qualitative analysis of organic pollutants, enabling their quantitation and the search for metabolites and transformation products or the detection of unknown compounds. LC-HRMS provides more information than low-resolution (LR) MS for each sample because it can accurately determine the mass of the molecular ion and its fragment ions if it can be used for MS-MS. Another advantage is that the data can be processed using either target analysis, suspect screening, retrospective analysis, or non-target screening. With the growing popularity and acceptance of HRMS analysis, current guidelines for compound confirmation need to be revised for quantitative and qualitative purposes. Furthermore, new commercial software and user-built libraries are required to mine data in an efficient and comprehensive way. The scope of this critical review is not to provide a comprehensive overview of the many studies performed with LC-HRMS in the field of environmental analysis, but to reveal its advantages and limitations using different workflows. PMID:26138893

  19. Analysis methods for the determination of anthropogenic additions of P to agricultural soils

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Phosphorus additions and measurement in soil is of concern on lands where biosolids have been applied. Colorimetric analysis for plant-available P may be inadequate for the accurate assessment of soil P. Phosphate additions in a regulatory environment need to be accurately assessed as the reported...

  20. Application of BP Neural Network Based on Genetic Algorithm in Quantitative Analysis of Mixed GAS

    NASA Astrophysics Data System (ADS)

    Chen, Hongyan; Liu, Wenzhen; Qu, Jian; Zhang, Bing; Li, Zhibin

    Aiming at the problem of mixed-gas detection with neural networks, and based on the principles of gas detection, a quantitative analysis system for mixed gases was designed by combining a BP network optimized with a genetic algorithm and an array of gas sensors. The local minima encountered during network learning are the main factor limiting the precision of gas analysis. The learning algorithm was therefore improved on this basis, and analyses and tests for CO, CO2 and HC compounds were carried out. The results show that these measures effectively improve the accuracy of the neural network for gas analysis.
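
    A minimal sketch of the GA-plus-BP idea is given below: a genetic algorithm searches for good initial weights of a small feed-forward network (helping it avoid poor local minima), and backpropagation then fine-tunes the best candidate. The synthetic sensor data, network size and GA settings are assumptions for illustration, not the configuration used in the paper.

```python
# GA-initialized BP network for mapping sensor readings to gas concentrations.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 6, 3                 # 4 sensors -> 3 gases (CO, CO2, HC)

# Synthetic calibration data: sensor responses as a noisy function of concentrations.
C = rng.uniform(0, 1, (200, n_out))                       # true concentrations
M = rng.uniform(0.2, 1.0, (n_out, n_in))                  # sensor sensitivities
X = np.tanh(C @ M) + rng.normal(0, 0.01, (200, n_in))     # sensor readings

def unpack(w):
    w1 = w[: n_in * n_hid].reshape(n_in, n_hid)
    w2 = w[n_in * n_hid:].reshape(n_hid, n_out)
    return w1, w2

def forward(w, X):
    w1, w2 = unpack(w)
    return np.tanh(X @ w1) @ w2

def mse(w):
    return np.mean((forward(w, X) - C) ** 2)

n_w = n_in * n_hid + n_hid * n_out

# --- Genetic algorithm over initial weight vectors -------------------------
pop = rng.normal(0, 0.5, (40, n_w))
for _ in range(60):
    fitness = np.array([mse(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:20]]               # truncation selection
    idx_a = rng.integers(0, 20, 20)
    idx_b = rng.integers(0, 20, 20)
    children = 0.5 * (parents[idx_a] + parents[idx_b])     # arithmetic crossover
    children += rng.normal(0, 0.05, children.shape)        # mutation
    pop = np.vstack([parents, children])

w = pop[np.argmin([mse(ind) for ind in pop])].copy()

# --- Backpropagation fine-tuning --------------------------------------------
lr = 0.05
for _ in range(2000):
    w1, w2 = unpack(w)
    h = np.tanh(X @ w1)
    err = h @ w2 - C
    grad2 = h.T @ err / len(X)
    grad1 = X.T @ ((err @ w2.T) * (1 - h ** 2)) / len(X)
    w -= lr * np.concatenate([grad1.ravel(), grad2.ravel()])

print(f"final training MSE: {mse(w):.5f}")
```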

  1. [Application of calibration curve method and partial least squares regression analysis to quantitative analysis of nephrite samples using XRF].

    PubMed

    Liu, Song; Su, Bo-min; Li, Qing-hui; Gan, Fu-xi

    2015-01-01

    The authors sought a method for quantitative analysis using pXRF without solid bulk stone/jade reference samples. Twenty-four nephrite samples were selected; 17 were calibration samples and the other 7 were test samples. All the nephrite samples were analyzed quantitatively by proton-induced X-ray emission spectroscopy (PIXE). Based on the PIXE results of the calibration samples, calibration curves were created for the components/elements of interest and used to analyze the test samples quantitatively; qualitative spectra of all nephrite samples were then obtained by pXRF. Using the PIXE results and the qualitative spectra of the calibration samples, partial least squares (PLS) regression was applied for quantitative analysis of the test samples. Finally, the results for the test samples obtained by the calibration-curve method, the PLS method and PIXE were compared, and the accuracy of the calibration-curve method and the PLS method was estimated. The results indicate that the PLS method is a suitable alternative for quantitative analysis of stone/jade samples. PMID:25993858
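
    The PLS step can be sketched as follows: a partial least squares model is trained on the spectra of the calibration samples against their PIXE-determined compositions and then used to predict the test samples. The spectra and compositions below are synthetic stand-ins, and scikit-learn's PLSRegression is one convenient implementation rather than the authors' software.

```python
# PLS calibration sketch on synthetic "spectra" and compositions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_channels = 512

def make_spectra(compositions):
    """Composition-weighted element peaks plus noise (synthetic stand-in)."""
    peaks = rng.uniform(0, 1, (compositions.shape[1], n_channels))
    return compositions @ peaks + rng.normal(0, 0.02, (len(compositions), n_channels))

cal_comp = rng.dirichlet(np.ones(4), size=17)      # 17 calibration samples, 4 components
test_comp = rng.dirichlet(np.ones(4), size=7)      # 7 test samples (ground truth)
all_comp = np.vstack([cal_comp, test_comp])
spectra = make_spectra(all_comp)

pls = PLSRegression(n_components=5)
pls.fit(spectra[:17], cal_comp)                    # train on calibration samples
predicted = pls.predict(spectra[17:])              # predict test samples

print(np.round(predicted, 3))
print(np.round(test_comp, 3))
```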

  2. Quantitative Analysis of Pork and Chicken Products by Droplet Digital PCR

    PubMed Central

    Cai, Yicun; Li, Xiang; Lv, Rong; Yang, Jielin; Li, Jian; He, Yuping; Pan, Liangwen

    2014-01-01

    In this project, a highly precise quantitative method based on the digital polymerase chain reaction (dPCR) technique was developed to determine the weight of pork and chicken in meat products. Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of species-specific DNAs in meat products. However, it is limited by amplification efficiency and by its reliance on standard curves based on Ct values when detecting and quantifying low-copy-number target DNA, as in some complex mixed meat products. Using the dPCR method, we found that the relationships between raw meat weight and DNA weight, and between DNA weight and DNA copy number, were both close to linear. This enabled us to establish formulae to calculate the raw meat weight based on the DNA copy number. The accuracy and applicability of this method were tested and verified using samples of pork and chicken powder mixed in known proportions. Quantitative analysis indicated that dPCR is highly precise in quantifying pork and chicken in meat products and therefore has the potential to be used in routine analysis by government regulators and quality control departments of commercial food and feed enterprises. PMID:25243184
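
    The two linear relationships described above imply a simple two-step conversion from copy number to raw meat weight, sketched below with invented calibration constants; the paper's actual formulae and constants are not reproduced here.

```python
# Two-step conversion: DNA copy number -> DNA mass -> raw meat mass (hypothetical constants).
def meat_weight_from_copies(copies, copies_per_ng_dna, ng_dna_per_mg_meat):
    dna_ng = copies / copies_per_ng_dna            # copy number -> DNA mass (ng)
    return dna_ng / ng_dna_per_mg_meat             # DNA mass -> raw meat mass (mg)

# Hypothetical calibration for pork and chicken targets.
pork_mg = meat_weight_from_copies(copies=1.8e5, copies_per_ng_dna=300.0,
                                  ng_dna_per_mg_meat=2.5)
chicken_mg = meat_weight_from_copies(copies=6.0e4, copies_per_ng_dna=280.0,
                                     ng_dna_per_mg_meat=2.2)
total = pork_mg + chicken_mg
print(f"pork: {100 * pork_mg / total:.1f}%  chicken: {100 * chicken_mg / total:.1f}%")
```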

  3. Stable isotope labeling of mammals (SILAM) for in vivo quantitative proteomic analysis.

    PubMed

    Rauniyar, Navin; McClatchy, Daniel B; Yates, John R

    2013-06-15

    Metabolic labeling of rodent proteins with ¹⁵N, a heavy stable isotope of nitrogen, provides an efficient way for relative quantitation of differentially expressed proteins. Here we describe a protocol for metabolic labeling of rats with an ¹⁵N-enriched spirulina diet. As a case study, we also demonstrate the application of ¹⁵N-enriched tissue as a common internal standard in quantitative analysis of differentially expressed proteins in neurodevelopment in rats at two different time points, postnatal day 1 and 45. We briefly discuss the bioinformatics tools, ProLucid and Census, which can easily be used in a sequential manner to identify and quantitate relative protein levels on a proteomic scale. PMID:23523555

  4. Quantitative LC-MS/MS Glycomic Analysis of Biological Samples Using AminoxyTMT.

    PubMed

    Zhou, Shiyue; Hu, Yunli; Veillon, Lucas; Snovida, Sergei I; Rogers, John C; Saba, Julian; Mechref, Yehia

    2016-08-01

    Protein glycosylation plays an important role in various biological processes, such as modification of protein function, regulation of protein-protein interactions, and control of turnover rates of proteins. Moreover, glycans have been considered as potential biomarkers for many mammalian diseases and development of aberrant glycosylation profiles is an important indicator of the pathology of a disease or cancer. Hence, quantitation is an important aspect of a comprehensive glycomics study. Although numerous MS-based quantitation strategies have been developed in the past several decades, some issues affecting sensitivity and accuracy of quantitation still exist, and the development of more effective quantitation strategies is still required. Aminoxy tandem mass tag (aminoxyTMT) reagents are recently commercialized isobaric tags which enable relative quantitation of up to six different glycan samples simultaneously. In this study, liquid chromatography and mass spectrometry conditions have been optimized to achieve reliable LC-MS/MS quantitative glycomic analysis using aminoxyTMT reagents. Samples were resuspended in 0.2 M sodium chloride solution to promote the formation of sodium adduct precursor ions, which leads to higher MS/MS reporter ion yields. This method was first evaluated with glycans from model glycoproteins and pooled human blood serum samples. The observed variation of reporter ion ratios was generally less than 10% relative to the theoretical ratio. Even for the highly complex minor N-glycans, the variation was still below 15%. This strategy was further applied to the glycomic profiling of N-glycans released from blood serum samples of patients with different esophageal diseases. Our results demonstrate the benefits of utilizing aminoxyTMT reagents for reliable quantitation of biological glycomic samples. PMID:27377957

  5. A Quantitative Threats Analysis for the Florida Manatee (Trichechus manatus latirostris)

    USGS Publications Warehouse

    Runge, Michael C.; Sanders-Reed, Carol A.; Langtimm, Catherine A.; Fonnesbeck, Christopher J.

    2007-01-01

    The Florida manatee (Trichechus manatus latirostris) is an endangered marine mammal endemic to the southeastern United States. The primary threats to manatee populations are collisions with watercraft and the potential loss of warm-water refuges. For the purposes of listing, recovery, and regulation under the Endangered Species Act (ESA), an understanding of the relative effects of the principal threats is needed. This work is a quantitative approach to threats analysis, grounded in the assumption that an appropriate measure of status under the ESA is based on the risk of extinction, as quantified by the probability of quasi-extinction. This is related to the qualitative threats analyses that are more common under the ESA, but provides an additional level of rigor, objectivity, and integration. In this approach, our philosophy is that analysis of the five threat factors described in Section 4(a)(1) of the ESA can be undertaken within an integrated quantitative framework. The basis of this threats analysis is a comparative population viability analysis. This involves forecasting the Florida manatee population under different scenarios regarding the presence of threats, while accounting for process variation (environmental, demographic, and catastrophic stochasticity) as well as parametric and structural uncertainty. We used the manatee core biological model (CBM) for this viability analysis, and considered the role of five threats: watercraft-related mortality, loss of warm-water habitat in winter, mortality in water-control structures, entanglement, and red tide. All scenarios were run with an underlying parallel structure that allowed a more powerful estimation of the effects of the various threats. The results reflect our understanding of manatee ecology (as captured in the structure of the CBM), our estimates of manatee demography (as described by the parameters in the model), and our characterization of the mechanisms by which the threats act on manatees. As an
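
    The core output of such a comparative viability analysis is the change in quasi-extinction probability between scenarios. The toy Monte Carlo sketch below illustrates that idea with an invented exponential-growth model; the manatee core biological model is a far more detailed, stage-structured model, and none of the rates or thresholds below come from it.

```python
# Toy comparison of quasi-extinction probability with and without an extra threat.
import numpy as np

rng = np.random.default_rng(42)

def quasi_extinction_prob(extra_mortality, n_reps=5000, years=100,
                          n0=5000, threshold=250):
    mean_r, sd_r = 0.02, 0.06                      # invented demographic rates
    extinct = 0
    for _ in range(n_reps):
        n = n0
        for _ in range(years):
            r = rng.normal(mean_r, sd_r) - extra_mortality   # env. stochasticity
            n *= np.exp(r)
            if n < threshold:
                extinct += 1
                break
    return extinct / n_reps

baseline = quasi_extinction_prob(extra_mortality=0.0)
with_threat = quasi_extinction_prob(extra_mortality=0.015)
print(f"P(quasi-extinction) baseline: {baseline:.3f}, with threat: {with_threat:.3f}")
```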

  6. Adenoid tissue lymphocyte subpopulations--evaluation of a quantitative analysis with flow cytometry.

    PubMed

    Hemlin, C; Carenfelt, C; Halldén, G; Hed, J; Scheynius, A

    1993-07-01

    Secretory otitis media (SOM) is a common childhood disease without a completely clarified etiology. A chronic inflammatory condition in the nasopharynx, presumably caused by an increased bacterial load, is one factor of probable etiological importance. In the present study a flow cytometric method for analysis of adenoid lymphoid cell populations was developed to facilitate quantitative comparisons between children with SOM and children without ear disease. Adenoids removed from 18 children due to adenoid hyperplasia and obstructive symptoms were studied. Results of the flow cytometric analysis correlated well with the findings from immunohistological studies of five of the adenoids. PCA-1 and CD25 were found to be good markers of increased cellular activity after non-specific stimulation in cell culture. It is concluded that the flow cytometric method is suitable for further quantitative analysis of adenoid tissue. PMID:8398095

  7. Identification and quantitative analysis of chemical compounds based on multiscale linear fitting of terahertz spectra

    NASA Astrophysics Data System (ADS)

    Qiao, Lingbo; Wang, Yingxin; Zhao, Ziran; Chen, Zhiqiang

    2014-07-01

    Terahertz (THz) time-domain spectroscopy is considered an attractive tool for the analysis of chemical composition. The traditional methods for identification and quantitative analysis of chemical compounds by THz spectroscopy are all based on full-spectrum data. However, because of disturbances such as unexpected components, scattering effects, and barrier materials, the intrinsic features of a THz spectrum lie only in its absorption peaks. We propose a strategy that utilizes Lorentzian parameters of THz absorption peaks, extracted by a multiscale linear fitting method, for both identification of pure chemicals and quantitative analysis of mixtures. The multiscale linear fitting method can automatically remove background content and accurately determine the Lorentzian parameters of the absorption peaks. The high recognition rate for 16 pure chemical compounds and the accurate predicted concentrations for theophylline-lactose mixtures demonstrate the practicability of our approach.
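
    As a minimal illustration of extracting Lorentzian parameters from an absorption peak (a plain least-squares fit, not the multiscale linear fitting method proposed in the paper), the Python sketch below fits a single Lorentzian plus a linear background to a synthetic THz absorption feature; the peak position, width, and amplitude are hypothetical.

    import numpy as np
    from scipy.optimize import curve_fit

    # Fit one Lorentzian plus a linear background to a synthetic THz peak.

    def lorentzian_with_baseline(f, amp, f0, gamma, slope, offset):
        return amp * gamma**2 / ((f - f0)**2 + gamma**2) + slope * f + offset

    freq = np.linspace(0.2, 2.0, 400)                                    # THz
    clean = lorentzian_with_baseline(freq, 8.0, 1.1, 0.04, 2.0, 1.0)
    spectrum = clean + np.random.default_rng(0).normal(0, 0.1, freq.size)

    p0 = [5.0, 1.0, 0.05, 0.0, 0.0]                                      # rough initial guess
    popt, _ = curve_fit(lorentzian_with_baseline, freq, spectrum, p0=p0)
    amp, f0, gamma, slope, offset = popt
    print(f"peak centre {f0:.3f} THz, half-width {gamma:.3f} THz, amplitude {amp:.2f}")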

  8. Quantitative and qualitative analysis of the electrical activity of rectus abdominis muscle portions.

    PubMed

    Negrão Filho, R de Faria; Bérzin, F; Souza, G da Cunha

    2003-01-01

    The purpose of this study was to investigate the electrical behavior pattern of the Rectus abdominis muscle by qualitative and quantitative analysis of the electromyographic signal obtained from its superior, middle, and inferior portions during dynamic and static activities. Ten athletic male volunteers (mean age 17.8 years, SD = 1.6) with no history of musculoskeletal dysfunction were studied. For the quantitative analysis, the RMS (root mean square) values obtained from the electromyographic signal during the isometric exercises were normalized and expressed as percentages of the maximum voluntary isometric contraction. For the qualitative analysis of the dynamic activity, the electromyographic signal was processed by full-wave rectification, linear envelope, and normalization (amplitude and time), and the resulting curve of the processed signal was submitted to descriptive graphic analysis. The results of the quantitative study show that there is no statistically significant difference among the portions of the muscle. The qualitative analysis demonstrated two aspects: the presence of a common electrical activation pattern across the portions of the Rectus abdominis muscle, and the absence of a significant difference in the inclination angles of the electrical activity curve during the isotonic exercises. PMID:12964259
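
    The processing chain named above (full-wave rectification, linear envelope, normalization to the maximum voluntary isometric contraction) is sketched below in Python on a synthetic signal; the sampling rate, envelope cut-off frequency, and MVIC reference value are assumptions for illustration only.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 1000.0                                    # Hz, assumed sampling rate
    t = np.arange(0, 5, 1 / fs)
    emg = np.random.default_rng(1).normal(0, 0.2, t.size) * (1 + np.sin(2 * np.pi * 0.5 * t))

    rectified = np.abs(emg)                        # full-wave rectification

    b, a = butter(4, 6 / (fs / 2), btype="low")    # 6 Hz low-pass for the linear envelope (assumed)
    envelope = filtfilt(b, a, rectified)

    mvic_rms = 0.35                                # hypothetical RMS during maximum voluntary isometric contraction
    rms = np.sqrt(np.mean(emg**2))
    print(f"RMS = {rms:.3f}, %MVIC = {100 * rms / mvic_rms:.1f}%")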

  9. Application of terahertz time-domain spectroscopy combined with chemometrics to quantitative analysis of imidacloprid in rice samples

    NASA Astrophysics Data System (ADS)

    Chen, Zewei; Zhang, Zhuoyong; Zhu, Ruohua; Xiang, Yuhong; Yang, Yuping; Harrington, Peter B.

    2015-12-01

    Terahertz time-domain spectroscopy (THz-TDS) has been utilized as an effective tool for quantitative analysis of imidacloprid in rice powder samples. Unlike previous studies, our method for sample preparation was to mix imidacloprid with rice powder instead of polyethylene. Terahertz time-domain transmission spectra of these mixed samples were measured, and the absorption coefficient spectra of the samples over the frequency range from 0.3 to 1.7 THz were obtained. The asymmetric least squares (AsLS) method was utilized to correct the sloping baselines present in the THz absorption coefficient spectra and to improve their signal-to-noise ratio. Chemometrics methods, including partial least squares (PLS), support vector regression (SVR), interval partial least squares (iPLS), and backward interval partial least squares (biPLS), were used for quantitative model building and prediction. To achieve a reliable and unbiased estimation, bootstrapped Latin partition was chosen as the approach for statistical cross-validation. Results showed that the mean root mean square error of prediction (RMSEP) for PLS (0.5%) was smaller than that for SVR (0.7%); both of these methods were based on the whole absorption coefficient spectra. In addition, PLS performed better, with a lower RMSEP (0.3%), when applied to the THz absorption coefficient spectra after AsLS baseline correction. The two variable-selection methods, iPLS and biPLS, also yielded models with improved predictions: compared with conventional PLS and SVR, the mean RMSEP values were 0.4% (iPLS) and 0.3% (biPLS) when informative frequency ranges were selected. The results demonstrated that an accurate quantitative analysis of imidacloprid in rice powder samples could be achieved by terahertz time-domain transmission spectroscopy combined with chemometrics. Furthermore, these results demonstrate that THz time-domain spectroscopy can be used for quantitative determinations of other
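
    A minimal sketch of asymmetric least squares baseline correction, in the spirit of the widely used Eilers-Boelens formulation, is given below in Python; the smoothing parameter lam, asymmetry p, and the synthetic spectrum are generic illustrations rather than the settings used in this study.

    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import spsolve

    def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
        """Estimate a smooth baseline with asymmetric least squares."""
        n = y.size
        D = sparse.diags([1, -2, 1], [0, -1, -2], shape=(n, n - 2))   # second-difference operator
        w = np.ones(n)
        for _ in range(n_iter):
            W = sparse.spdiags(w, 0, n, n)
            Z = W + lam * D @ D.T
            z = spsolve(Z.tocsc(), w * y)
            w = p * (y > z) + (1 - p) * (y < z)                       # asymmetric re-weighting
        return z

    x = np.linspace(0.3, 1.7, 300)                                    # THz axis
    spectrum = np.exp(-(x - 1.0) ** 2 / 0.002) + 0.5 * x              # synthetic peak on a sloping baseline
    corrected = spectrum - asls_baseline(spectrum)
    print(f"baseline-corrected peak maximum: {corrected.max():.3f}")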

  10. The Isotropic Fractionator as a Tool for Quantitative Analysis in Central Nervous System Diseases.

    PubMed

    Repetto, Ivan E; Monti, Riccardo; Tropiano, Marta; Tomasi, Simone; Arbini, Alessia; Andrade-Moraes, Carlos-Humberto; Lent, Roberto; Vercelli, Alessandro

    2016-01-01

    brain and also in discrete regions of interest, with the potential to investigate non-neuronal alterations. Moreover, IF could be used in addition to, or in substitution for, the classical stereological techniques or TTC staining used so far, since it is fast, precise, and easily combined with complex molecular analysis. PMID:27547177

  11. The Isotropic Fractionator as a Tool for Quantitative Analysis in Central Nervous System Diseases

    PubMed Central

    Repetto, Ivan E.; Monti, Riccardo; Tropiano, Marta; Tomasi, Simone; Arbini, Alessia; Andrade-Moraes, Carlos-Humberto; Lent, Roberto; Vercelli, Alessandro

    2016-01-01

    brain and also in discrete regions of interest, with the potential to investigate non-neuronal alterations. Moreover, IF could be used in addition to, or in substitution for, the classical stereological techniques or TTC staining used so far, since it is fast, precise, and easily combined with complex molecular analysis. PMID:27547177

  12. Quantitative material analysis by dual-energy computed tomography for industrial NDT applications

    NASA Astrophysics Data System (ADS)

    Nachtrab, F.; Weis, S.; Keßling, P.; Sukowski, F.; Haßler, U.; Fuchs, T.; Uhlmann, N.; Hanke, R.

    2011-05-01

    Dual-energy computed tomography (DECT) is an established method in medical CT for obtaining quantitative information about a material of interest instead of mean attenuation coefficients only. In industrial X-ray imaging, dual-energy techniques have been used to solve special problems on a case-by-case basis rather than as a standard tool. Our goal is to develop an easy-to-use dual-energy solution that can be handled by the average industrial operator without the need for a specialist. We aim to provide dual-energy CT as a measurement tool for those cases where qualitative images are not enough and additional quantitative information (e.g. mass density ρ and atomic number Z) about the sample at hand is needed. Our solution is based on an algorithm proposed by Heismann et al. (2003) [1] for application in medical CT. As input data this algorithm needs two CT data sets, one acquired at low (LE) and one at high effective energy (HE). A first-order linearization is applied to the raw data, and two volumes are reconstructed thereafter. The dual-energy analysis is done voxel by voxel, using a pre-calculated function F(Z) that incorporates the parameters of the low- and high-energy measurements (such as tube voltage, filtration, and detector sensitivity). As a result, two volume data sets are obtained, one providing the mass density ρ in each voxel, the other providing the effective atomic number Z of the material therein. One main difference between medical and industrial CT is that the range of materials that can be contained in a sample is much wider and can cover the whole range of elements, from hydrogen to uranium. Heismann's algorithm is limited to the range of elements Z = 1-30, because for Z > 30 the function F(Z) as given by Heismann is no longer bijective. While this still seems very suitable for medical applications, it is not enough to cover the complete range of industrial applications. We therefore investigated the
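
    A schematic of the voxel-wise decomposition step is sketched below in Python: the ratio of the low- and high-energy attenuation values is inverted through a monotonic, pre-calculated function F(Z) to obtain an effective atomic number, from which a mass density follows. The F(Z) table, mass attenuation values, and voxel attenuations are hypothetical placeholders and do not reproduce Heismann's calibrated formulation.

    import numpy as np

    Z_table = np.arange(1, 31)                     # Z = 1-30, the range discussed above
    F_table = 1.0 + 0.002 * Z_table**2             # placeholder monotonic F(Z)
    mu_over_rho_HE = 0.05 + 0.005 * Z_table        # placeholder mass attenuation (cm^2/g) at high energy

    def decompose(mu_le, mu_he):
        """Return (effective Z, mass density) for one voxel."""
        ratio = mu_le / mu_he
        z_eff = np.interp(ratio, F_table, Z_table)                   # invert the monotonic F(Z)
        rho = mu_he / np.interp(z_eff, Z_table, mu_over_rho_HE)
        return z_eff, rho

    z_eff, rho = decompose(mu_le=0.40, mu_he=0.30)                   # hypothetical voxel values (1/cm)
    print(f"effective Z = {z_eff:.1f}, density = {rho:.2f} g/cm^3")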

  13. Assimilation of radar quantitative precipitation estimations in the Canadian Precipitation Analysis (CaPA)

    NASA Astrophysics Data System (ADS)

    Fortin, Vincent; Roy, Guy; Donaldson, Norman; Mahidjiba, Ahmed

    2015-12-01

    The Canadian Precipitation Analysis (CaPA) is a data analysis system used operationally at the Canadian Meteorological Center (CMC) since April 2011 to produce gridded 6-h and 24-h precipitation accumulations in near real-time on a regular grid covering all of North America. The current resolution of the product is 10 km. Due to the low density of the observational network in most of Canada, the system relies on a background field provided by the Regional Deterministic Prediction System (RDPS) of Environment Canada, which is a short-term weather forecasting system for North America. For this reason, the North American configuration of CaPA is known as the Regional Deterministic Precipitation Analysis (RDPA). Early in the development of the CaPA system, weather radar reflectivity was identified as a very promising additional data source for the precipitation analysis, but necessary quality control procedures and bias-correction algorithms were lacking for the radar data. After three years of development and testing, a new version of the CaPA-RDPA system was implemented in November 2014 at CMC. This version is able to assimilate radar quantitative precipitation estimates (QPEs) from all 31 operational Canadian weather radars. The radar QPE is used as an observation source and not as a background field, and is subject to a strict quality control procedure, like any other observation source. The November 2014 upgrade to CaPA-RDPA was implemented at the same time as an upgrade to the RDPS system, which brought minor changes to the skill and bias of CaPA-RDPA. This paper uses the frequency bias indicator (FBI), the equitable threat score (ETS), and the departure from the partial mean (DPM) in order to assess the improvements to CaPA-RDPA brought by the assimilation of radar QPE. Verification focuses on the 6-h accumulations, and is done against a network of 65 synoptic stations (approximately two stations per radar) that were withheld from the station data assimilated by CaPA.
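
    Two of the verification scores named above are simple functions of a 2x2 contingency table of forecast versus observed precipitation exceedance. The Python sketch below computes the frequency bias indicator and the equitable threat score from hypothetical counts using their standard definitions; these are not CaPA verification data.

    # Standard contingency-table scores; the counts are hypothetical.

    def frequency_bias(hits, misses, false_alarms):
        return (hits + false_alarms) / (hits + misses)

    def equitable_threat_score(hits, misses, false_alarms, correct_negatives):
        total = hits + misses + false_alarms + correct_negatives
        hits_random = (hits + misses) * (hits + false_alarms) / total
        return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

    hits, misses, false_alarms, correct_negatives = 120, 30, 40, 810
    print(f"FBI = {frequency_bias(hits, misses, false_alarms):.2f}")
    print(f"ETS = {equitable_threat_score(hits, misses, false_alarms, correct_negatives):.2f}")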

  14. Analysis of CNT additives in porous layered thin film lubrication with electric double layer

    NASA Astrophysics Data System (ADS)

    Rao, T. V. V. L. N.; Rani, A. M. A.; Sufian, S.; Mohamed, N. M.

    2015-07-01

    This paper presents an analysis of thin-film lubrication of a porous-layered slider bearing with a carbon nanotube (CNT) additive lubricant and an electric double layer. The flow of the CNT-additive lubricant in the thin fluid film and in the porous layers is governed by the Stokes and Brinkman equations, respectively, including the electro-kinetic force. Expressions for the apparent viscosity and the nondimensional pressure are derived. The nondimensional load capacity increases under the influence of electro-viscosity, CNT additive volume fraction, and the permeability and thickness of the porous layer. A porous thin-film slider bearing lubricated with CNT additives and operating with an electric double layer therefore provides higher load capacity.

  15. Nanoparticle-mediated photothermal effect enables a new method for quantitative biochemical analysis using a thermometer

    NASA Astrophysics Data System (ADS)

    Fu, Guanglei; Sanjay, Sharma T.; Dou, Maowei; Li, Xiujun

    2016-03-01

    A new biomolecular quantitation method, nanoparticle-mediated photothermal bioassay, using a common thermometer as the signal reader was developed. Using an immunoassay as a proof of concept, iron oxide nanoparticles (NPs) captured in the sandwich-type assay system were transformed into a near-infrared (NIR) laser-driven photothermal agent, Prussian blue (PB) NPs, which acted as a photothermal probe to convert the assay signal into heat through the photothermal effect, thus allowing sensitive biomolecular quantitation using a thermometer. This is the first report of biomolecular quantitation using a thermometer and also serves as the first attempt to introduce the nanoparticle-mediated photothermal effect for bioassays. Electronic supplementary information (ESI) available: Additional information on FTIR characterization (Fig. S1), photothermal immunoassay of PSA in human serum samples (Table S1), and the Experimental section, including preparation of antibody-conjugated iron oxide NPs, sandwich-type immunoassay, characterization, and photothermal detection protocol. See DOI: 10.1039/c5nr09051b

  16. HPTLC Hyphenated with FTIR: Principles, Instrumentation and Qualitative Analysis and Quantitation

    NASA Astrophysics Data System (ADS)

    Cimpoiu, Claudia

    In recent years, much effort has been devoted to the coupling of high-performance thin-layer chromatography (HPTLC) with spectrometric methods because of the robustness and simplicity of HPTLC and the need for detection techniques that provide identification and determination of sample constituents. IR is one of the spectroscopic methods that have been coupled with HPTLC. IR spectroscopy has a high potential for the elucidation of molecular structures, and the characteristic absorption bands can be used for compound-specific detection. The coupled HPTLC-FTIR method has been widely used in modern laboratories for qualitative and quantitative analysis. The potential of this method is demonstrated by its application in different fields of analysis, such as drug analysis, forensic analysis, food analysis, environmental analysis, and biological analysis. The hyphenated HPTLC-FTIR technique will continue to be developed with the aim of taking full advantage of this method.

  17. Quantitative Proteomic Analysis of Human Lung Tumor Xenografts Treated with the Ectopic ATP Synthase Inhibitor Citreoviridin

    PubMed Central

    Wu, Yi-Hsuan; Hu, Chia-Wei; Chien, Chih-Wei; Chen, Yu-Ju; Huang, Hsuan-Cheng; Juan, Hsueh-Fen

    2013-01-01

    ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ) and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With highly reproducible quantitation, we obtained a quantitative proteomic profile comprising 2,659 identified proteins. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates available for macromolecule synthesis and inhibit cell proliferation. These comprehensive proteomic results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy. PMID:23990911

  18. Real-time cell analysis--a new method for dynamic, quantitative measurement of infectious viruses and antiserum neutralizing activity.

    PubMed

    Teng, Zheng; Kuang, Xiaozhou; Wang, Jiayu; Zhang, Xi

    2013-11-01

    A newly developed electronic cell sensor array--the xCELLigence real-time cell analysis (RTCA) system--is currently being tested for dynamic monitoring of cell attachment, proliferation, damage, and death. In this study, human enterovirus 71 (HEV71) infection of human rhabdomyosarcoma (RD) cells was used as an in vitro model to validate the application of this novel system as a straightforward and efficient assay for quantitative measurement of infectious viruses based on the virus-induced cytopathic effect (CPE). Several experiments were performed, including determination of the optimal seeding density of RD cells in 96-well E-plates, real-time RTCA monitoring of the virus-induced CPE and virus titer calculation, and a viral neutralization test to determine HEV71 antibody titer. The traditional 50% tissue culture infective dose (TCID50) assay was also conducted for methodological comparison and validation, and the two assays gave consistent results. These findings indicate that the xCELLigence RTCA system can be a valuable addition to current viral assays for quantitative measurement of infectious viruses and real-time quantitation of neutralizing antibody titers, warranting future research and exploration of applications to many other animal and human viruses. PMID:23835032
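
    For comparison with the endpoint titration mentioned above, the Python sketch below computes a TCID50 by the classical Reed-Muench interpolation from well scores in a ten-fold dilution series; the dilution series and infected-well counts are hypothetical, not data from this study.

    # Reed-Muench 50% endpoint calculation from hypothetical well scores.

    log10_dilutions = [-1, -2, -3, -4, -5]          # ten-fold dilution series
    infected        = [ 8,  8,  6,  2,  0]          # wells showing CPE
    total_wells     = [ 8,  8,  8,  8,  8]

    def reed_muench_log10_tcid50(log10_dil, infected, total):
        uninfected = [t - i for i, t in zip(infected, total)]
        # Cumulative infected from the most dilute end upward; cumulative
        # uninfected from the most concentrated end downward.
        cum_inf = [sum(infected[i:]) for i in range(len(infected))]
        cum_uninf = [sum(uninfected[: i + 1]) for i in range(len(uninfected))]
        pct = [100.0 * ci / (ci + cu) for ci, cu in zip(cum_inf, cum_uninf)]
        for i in range(len(pct) - 1):
            if pct[i] >= 50 >= pct[i + 1]:
                prop = (pct[i] - 50) / (pct[i] - pct[i + 1])
                step = log10_dil[i] - log10_dil[i + 1]                # 1 for a ten-fold series
                return log10_dil[i] - prop * step
        raise ValueError("50% endpoint not bracketed by the dilution series")

    log_titer = reed_muench_log10_tcid50(log10_dilutions, infected, total_wells)
    print(f"TCID50 = 10^{log_titer:.2f} per inoculum volume")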

  19. The Quantitative and Qualitative Analysis of Cohorts' Early Enrollment in Physics: concurrent with enrollment in mathematics, biology and chemistry

    NASA Astrophysics Data System (ADS)

    Lynch, Robert Bruce Rodes

    Cohorts of 48 entering biological science majors were recruited in the fall of 2007 and again in 2008 and 2009 for the Interdisciplinary Science Experience (ISE). These ISE students enrolled in their own sections of standard physics, chemistry, and biology courses. In these courses the average ISE student outperformed their non-cohort peers by up to a full letter grade. A qualitative analysis of ISE student interviews illuminates the student experience and shows how the ISE students perceived themselves to be different from their non-cohort peers. Quantitative modeling of student performance shows that higher grades are correlated with multiple factors. These factors include admissions characteristics such as high school GPA and SAT scores, as well as demographic information. These trends support and elaborate on the selection narratives told by participants. Additionally, the quantitative model found that higher student performance is predicted by structural aspects of the ISE program, specifically the timing of courses, enrolling as freshmen in many of their courses, and the sequencing of physics and chemistry courses. There is a statistically significant benefit to student performance in general and organic chemistry courses associated with completing the first quarter of the Physics for Bio-Science majors course prior to enrollment. Further, the combination of quantitative and qualitative data suggests that there is an epistemological transfer of problem-solving skills and outlook from the physics courses to the chemistry courses.

  20. Efficient antifouling surface for quantitative surface plasmon resonance based biosensor analysis.

    PubMed

    Nogues, Claude; Leh, Hervé; Lautru, Joseph; Delelis, Olivier; Buckle, Malcolm

    2012-01-01

    Non-specific binding to biosensor surfaces is a major obstacle to quantitative analysis of the selective retention of analytes at immobilized target molecules. Although a range of chemical antifouling monolayers has been developed to address this problem, many macromolecular interactions still remain refractory to analysis due to the prevailing high degree of non-specific binding. In this manuscript we explore the dynamic process of self-assembled monolayer formation and optimize its physical and chemical properties, thereby considerably reducing non-specific binding while maintaining the integrity of the immobilized biomolecules. As a result, analysis of the specific binding of analytes to immobilized target molecules is significantly facilitated. PMID:22984487