Sample records for quantitative analysis techniques

  1. Comparison of selected analytical techniques for protein sizing, quantitation and molecular weight determination.

    PubMed

    Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C

    2004-09-30

    Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques used for protein sizing, quantitation and molecular weight determination: sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare the advantages and limitations of each technique with respect to different application areas, analysis time, and protein sizing and quantitation performance.

  2. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph

    PubMed Central

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

    The cardiac dynamics during meditation are explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rates (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation, from PhysioNet. The results show consistent differences between the quantitative parameters obtained by the two analysis techniques. This indicates an interesting phenomenon: a change in the complexity of the cardiac dynamics during meditation, supported by quantitative parameters. The results also provide preliminary evidence that these techniques can be used to measure the physiological impact of meditation on subjects. PMID:26909045

  3. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph.

    PubMed

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

    The cardiac dynamics during meditation are explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rates (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation, from PhysioNet. The results show consistent differences between the quantitative parameters obtained by the two analysis techniques. This indicates an interesting phenomenon: a change in the complexity of the cardiac dynamics during meditation, supported by quantitative parameters. The results also provide preliminary evidence that these techniques can be used to measure the physiological impact of meditation on subjects.
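
    As a minimal illustration of the visibility-graph construction named in this record (a sketch in Python using the standard natural-visibility edge criterion from the literature; the heart-rate values are toy data, not PhysioNet's):

      # Natural visibility graph: each sample is a node; samples i and j are
      # linked if every intermediate sample lies strictly below the straight
      # line joining (i, y[i]) and (j, y[j]).
      def visibility_edges(y):
          edges = []
          for i in range(len(y)):
              for j in range(i + 1, len(y)):
                  if all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                         for k in range(i + 1, j)):
                      edges.append((i, j))
          return edges

      heart_rate = [72, 75, 71, 78, 74, 73, 77]  # beats/minute, toy series
      print(visibility_edges(heart_rate))

    Network measures computed on such graphs (degree distribution, clustering) are the kind of quantitative parameters this style of analysis compares between meditative and baseline dynamics.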

  4. Reinventing the Ames test as a quantitative lab that connects classical and molecular genetics.

    PubMed

    Goodson-Gregg, Nathan; De Stasio, Elizabeth A

    2009-01-01

    While many institutions use a version of the Ames test in the undergraduate genetics laboratory, students typically are not exposed to techniques or procedures beyond qualitative analysis of phenotypic reversion, thereby seriously limiting the scope of learning. We have extended the Ames test to include both quantitative analysis of reversion frequency and molecular analysis of revertant gene sequences. By giving students a role in designing their quantitative methods and analyses, students practice and apply quantitative skills. To help students connect classical and molecular genetic concepts and techniques, we report here procedures for characterizing the molecular lesions that confer a revertant phenotype. We suggest undertaking reversion of both missense and frameshift mutants to allow a more sophisticated molecular genetic analysis. These modifications and additions broaden the educational content of the traditional Ames test teaching laboratory, while simultaneously enhancing students' skills in experimental design, quantitative analysis, and data interpretation.

  5. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  6. Analysis of defect structure in silicon. Characterization of SEMIX material. Silicon sheet growth development for the large area silicon sheet task of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Stringfellow, G. B.; Virkar, A. V.; Dunn, J.; Guyer, T.

    1983-01-01

    Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. Important correlations were obtained between defect densities, cell efficiency, and diffusion length. Grain boundary substructure displayed a strong influence on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements gave statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for Quantimet quantitative image analyzer (QTM) analysis was perfected and is used routinely. The relationship between hole mobility and grain boundary density was determined: mobility was measured using the van der Pauw technique, grain boundary density was measured using quantitative microscopy, and mobility was found to decrease with increasing grain boundary density.
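
    For reference, the standard van der Pauw relation that underlies such sheet-resistance and mobility measurements (textbook form, not a detail taken from this report):

      \[
        e^{-\pi R_{AB,CD}/R_s} \;+\; e^{-\pi R_{BC,DA}/R_s} \;=\; 1
      \]

    where R_{AB,CD} and R_{BC,DA} are the two four-terminal resistances measured on the sample and R_s is the sheet resistance, solved for numerically.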

  7. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
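
    A hedged sketch of the kind of Monte Carlo baseline-and-prediction exercise described above (the driver weights and input distributions are illustrative assumptions, not the dissertation's fitted model):

      import random

      # Propagate uncertainty in assumed ACSI drivers to a predicted score.
      def simulate_acsi(trials=100_000):
          scores = []
          for _ in range(trials):
              expectations = random.gauss(78, 4)   # customer expectations
              quality = random.gauss(82, 3)        # perceived quality
              value = random.gauss(75, 5)          # perceived value
              scores.append(0.3 * expectations + 0.5 * quality + 0.2 * value)
          mean = sum(scores) / trials
          sd = (sum((s - mean) ** 2 for s in scores) / (trials - 1)) ** 0.5
          return mean, sd

      mean, sd = simulate_acsi()
      print(f"predicted ACSI baseline: {mean:.1f} +/- {sd:.1f}")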

  8. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...

  9. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...

  10. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
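
    A minimal sketch of the three model classes the standard names, fitted to toy time-series data (assumed values; linear and quadratic fits via least squares, the exponential fit via a log transform):

      import numpy as np

      t = np.arange(10, dtype=float)
      y = np.array([2.1, 2.9, 4.2, 5.8, 8.3, 11.9, 17.2, 24.5, 35.1, 50.4])

      linear = np.polyfit(t, y, 1)            # y ~ a*t + b
      quadratic = np.polyfit(t, y, 2)         # y ~ a*t^2 + b*t + c
      k, log_A = np.polyfit(t, np.log(y), 1)  # log y ~ k*t + log A
      exponential = (np.exp(log_A), k)        # y ~ A*exp(k*t)

      print("linear:", linear)
      print("quadratic:", quadratic)
      print("exponential A, k:", exponential)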

  11. Identification and quantitation of semi-crystalline microplastics using image analysis and differential scanning calorimetry.

    PubMed

    Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura

    2018-06-01

    There are several techniques used to analyze microplastics, often based on a combination of visual and spectroscopic techniques. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low- and high-density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastic mass based on the DSC signal. We describe the potential and limitations of these techniques to increase the reliability of microplastic analysis. Particle size proved to have a particular influence on the qualitative and quantitative performance of the DSC signals: both identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) were affected by it. As a result, proper sample treatment, including sieving of suspended particles, is particularly required for this analytical approach.
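
    A sketch of the heat-flow-based mass quantitation step, under the generic DSC assumption (symbols are ours, not the authors' notation) that melting enthalpy scales linearly with polymer mass:

      \[
        m_{\mathrm{polymer}} = \frac{\Delta H_{\mathrm{peak}}}{\Delta h_{\mathrm{cal}}},
        \qquad
        \Delta H_{\mathrm{peak}} = \int_{T_1}^{T_2} \frac{\dot{q}(T)}{\beta}\, dT
      \]

    where the melting peak is integrated over the measured heat flow at heating rate beta, and the per-gram calibration enthalpy comes from standards of the same polymer; the particle-size effects the record reports would plausibly enter through that calibration.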

  12. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  13. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  14. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  15. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  16. Mathematics Competency for Beginning Chemistry Students Through Dimensional Analysis.

    PubMed

    Pursell, David P; Forlemu, Neville Y; Anagho, Leonard E

    2017-01-01

    Mathematics competency in nursing education and practice may be addressed by an instructional variation of the traditional dimensional analysis technique typically presented in beginning chemistry courses. The authors studied 73 beginning chemistry students using the typical dimensional analysis technique and the variation technique. Student quantitative problem-solving performance was evaluated. Students using the variation technique scored significantly better (18.3 of 20 points, p < .0001) on the final examination quantitative titration problem than those who used the typical technique (10.9 of 20 points). American Chemical Society examination scores and in-house assessment indicate that better performing beginning chemistry students were more likely to use the variation technique rather than the typical technique. The variation technique may be useful as an alternative instructional approach to enhance beginning chemistry students' mathematics competency and problem-solving ability in both education and practice. [J Nurs Educ. 2017;56(1):22-26.].
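
    As a toy instance of the kind of quantitative titration problem mentioned above (values and 1:1 stoichiometry assumed for illustration, not taken from the examination), dimensional analysis chains the conversion factors in one line:

      \[
        M_{\text{acid}}
        = 25.0\ \text{mL base} \times \frac{0.100\ \text{mol base}}{1000\ \text{mL base}}
          \times \frac{1\ \text{mol acid}}{1\ \text{mol base}}
          \times \frac{1}{20.0\ \text{mL acid}} \times \frac{1000\ \text{mL}}{1\ \text{L}}
        = 0.125\ \text{mol/L}
      \]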

  17. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and highlights various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis will lead to confident results that will ensure quantitative proteomics delivers.

  18. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    ERIC Educational Resources Information Center

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
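
    A sketch of the isotopic dilution arithmetic such an experiment relies on (generic single-ratio form; the m/z values assume a trideuterated caffeine internal standard and are illustrative, not taken from the article):

      \[
        m_{\text{caffeine}} = m_{\text{spike}} \times \frac{I_{m/z\,194}}{I_{m/z\,197}}
      \]

    where I denotes the integrated ion abundance of the unlabeled analyte (m/z 194) and the labeled spike (m/z 197), and m_spike is the known mass of internal standard added before workup, so losses during sample preparation cancel in the ratio.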

  19. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  20. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  21. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After extensive kinetic studies, PE company found a linear relation between the initial template number and the cycle number at which the accumulating fluorescent product becomes detectable, and on this basis developed a quantitative PCR technique for the PE7700 and PE5700. But the error of this technique is too great to satisfy the needs of biotechnology development and clinical research, so a better quantitative PCR technique is needed. The mathematical model presented here draws on results from related fields and is based on the PCR principle and a careful analysis of the molecular relationships among the main components of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity), the initial template number, and the other reaction conditions, and accurately reflects the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be performed using this functional relation, and the accumulated PCR product quantity can be obtained from the initial template number. Using this model for quantitative PCR analysis, the result error depends only on the accuracy of the fluorescence intensity measurement, i.e., on the instrument used. For example, when the fluorescence intensity is accurate to six digits and the initial template number is between 100 and 1,000,000, the accuracy of the quantitative result exceeds 99%. Result errors differ markedly under the same conditions on the same instrument when different analysis methods are used; processing the data with the proposed quantitative PCR analysis system yields results about 80 times more accurate than the CT method.
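
    For orientation, a generic form of the amplification relation that such models build on (a sketch, not the authors' exact model): with amplification efficiency E and a fluorescence-per-molecule constant alpha,

      \[
        F_c \approx \alpha\, N_0\, (1+E)^{c}
        \quad\Longrightarrow\quad
        N_0 = \frac{F_c}{\alpha\,(1+E)^{c}}
      \]

    so the initial template number N_0 follows from the fluorescence F_c measured at cycle c while amplification is still exponential; the CT method is the special case that uses only the cycle at which F_c crosses a fixed threshold.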

  22. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods.
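
    A sketch of the standard root-sum-of-squares combination implied by "combining them mathematically" (the component labels are illustrative assumptions):

      \[
        u_{c,\mathrm{rel}} = \sqrt{u_{\mathrm{organism}}^2 + u_{\mathrm{product}}^2 + u_{\mathrm{reading}}^2}
      \]

    with each u a relative standard uncertainty estimated from the ANOVA variance components.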

  23. Viewpoint on ISA TR84.0.02--simplified methods and fault tree analysis.

    PubMed

    Summers, A E

    2000-01-01

    ANSI/ISA-S84.01-1996 and IEC 61508 require the establishment of a safety integrity level for any safety instrumented system or safety related system used to mitigate risk. Each stage of design, operation, maintenance, and testing is judged against this safety integrity level. Quantitative techniques can be used to verify whether the safety integrity level is met. ISA-dTR84.0.02 is a technical report under development by ISA, which discusses how to apply quantitative analysis techniques to safety instrumented systems. This paper discusses two of those techniques: (1) Simplified equations and (2) Fault tree analysis.

  24. Characterization of shape and deformation of MEMS by quantitative optoelectronic metrology techniques

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.

  25. Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.

    NASA Technical Reports Server (NTRS)

    Leonard, Desiree M.

    1991-01-01

    Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in three main phases of the application and are categorized as: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
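
    A minimal sketch of the object-recognition and quantitative-analysis phases on an already-binary toy image (illustrative Python with SciPy; not the thesis's own algorithms):

      import numpy as np
      from scipy import ndimage

      image = np.array([[0, 1, 1, 0, 0],
                        [0, 1, 1, 0, 1],
                        [0, 0, 0, 0, 1],
                        [1, 0, 0, 0, 0]])

      # Object recognition: connected-component labeling.
      labels, count = ndimage.label(image)
      # Quantitative analysis: per-object area and centroid.
      areas = ndimage.sum(image, labels, index=range(1, count + 1))
      centroids = ndimage.center_of_mass(image, labels, range(1, count + 1))
      print(count, areas, centroids)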

  26. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.
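
    For context, the textbook schlieren relation connecting a measured deviation angle to the refractive-index gradient (standard thin-cell form; not necessarily the paper's exact treatment):

      \[
        \varepsilon \approx \frac{L}{n_0}\,\frac{\partial n}{\partial x}
      \]

    where L is the optical path length through the cell and n_0 the undisturbed refractive index, so the quoted 0.5 mrad resolution in the deviation angle bounds the smallest resolvable gradient; concentration maps then follow from a known dn/dc for the solute.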

  27. Recommendations for Quantitative Analysis of Small Molecules by Matrix-assisted laser desorption ionization mass spectrometry

    PubMed Central

    Wang, Poguang; Giese, Roger W.

    2017-01-01

    Matrix-assisted laser desorption ionization mass spectrometry (MALDI-MS) has been used for quantitative analysis of small molecules for many years. It is usually preceded by an LC separation step when complex samples are tested. With the development several years ago of "modern MALDI" (automation, high repetition-rate lasers, high resolution peaks), the ease of use and performance of MALDI as a quantitative technique greatly increased. This review focuses on practical aspects of modern MALDI for quantitation of small molecules conducted in an ordinary way (no special reagents, devices or techniques for the spotting step of MALDI), and includes our ordinary, preferred methods. The review is organized as 18 recommendations with accompanying explanations, criticisms and exceptions. PMID:28118972

  28. Holographic Interferometry and Image Analysis for Aerodynamic Testing

    DTIC Science & Technology

    1980-09-01

    tunnels, (2) development of automated image analysis techniques for reducing quantitative flow-field data from holographic interferograms, and (3...investigation and development of software for the application of digital image analysis to other photographic techniques used in wind tunnel testing.

  29. A quantitative image cytometry technique for time series or population analyses of signaling networks.

    PubMed

    Ozaki, Yu-ichi; Uda, Shinsuke; Saito, Takeshi H; Chung, Jaehoon; Kubota, Hiroyuki; Kuroda, Shinya

    2010-04-01

    Modeling of cellular functions on the basis of experimental observation is increasingly common in the field of cellular signaling. However, such modeling requires a large amount of quantitative data on signaling events with high spatio-temporal resolution, and a novel technique to obtain such data is needed for the systems biology of cellular signaling. We developed a fully automatable assay technique, termed quantitative image cytometry (QIC), which integrates a quantitative immunostaining technique and a high-precision image-processing algorithm for cell identification. With the aid of an automated sample preparation system, this device can quantify protein expression, phosphorylation and localization with subcellular resolution at one-minute intervals. The signaling activities quantified by the assay system showed good correlation with, as well as comparable reproducibility to, western blot analysis. Taking advantage of the high spatio-temporal resolution, we investigated the signaling dynamics of the ERK pathway in PC12 cells. The QIC technique is highly quantitative and versatile, and can be a convenient replacement for conventional techniques including western blot, flow cytometry and live cell imaging. Thus, the QIC technique can be a powerful tool for investigating the systems biology of cellular signaling.

  30. Quantitative filter forensics for indoor particle sampling.

    PubMed

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust which has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the enormous possibility of this approach.
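
    A hedged sketch of the proposed quantitative link (the general form used in filter-based sampling; all numbers are assumed for illustration):

      # Time-averaged airborne concentration inferred from filter dust.
      mass_ug = 250.0        # contaminant mass extracted from the filter (assumed)
      flow_m3_h = 1700.0     # airflow through the HVAC filter (assumed)
      runtime_h = 720.0      # fan runtime over the sampling period (assumed)
      efficiency = 0.6       # filter capture efficiency for the particles (assumed)

      conc_ug_m3 = mass_ug / (flow_m3_h * runtime_h * efficiency)
      print(f"time-averaged concentration ~ {conc_ug_m3:.2e} ug/m^3")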

  31. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Remotely sensed data were collected in conjunction with sea-truth measurements in three experiments in the New York Bight. Pollution features of primary interest were ocean dumped materials, such as sewage sludge and acid waste. Sewage-sludge and acid-waste plumes, including plumes from sewage sludge dumped by the 'line-dump' and 'spot-dump' methods, were located, identified, and mapped. Previously developed quantitative analysis techniques for determining quantitative distributions of materials in sewage sludge dumps were evaluated, along with multispectral analysis techniques developed to identify ocean dumped materials. Results of these experiments and the associated data analysis investigations are presented and discussed.

  32. Analysis of atomic force microscopy data for surface characterization using fuzzy logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.

    2011-07-15

    In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses our approach to quantitatively identify particle sizes of two specimens each with a unique gold nanoparticle size distribution. Research highlights: (1) a fuzzy logic analysis technique capable of characterizing AFM images of thin films; (2) the technique is applicable to different surfaces regardless of their densities; (3) the fuzzy logic technique does not require manual adjustment of the algorithm parameters; (4) the technique can quantitatively capture differences between surfaces; (5) this technique yields more realistic structure boundaries compared to other methods.
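
    A toy sketch of the point-classification idea (triangular fuzzy memberships over the local slope of a height profile; the labels and thresholds are our illustrative assumptions, not the paper's tuned inference engine):

      def memberships(slope, flat_width=0.2):
          # Degree to which a local slope reads as uphill, downhill, or flat.
          uphill = min(1.0, max(0.0, (slope - flat_width) / flat_width))
          downhill = min(1.0, max(0.0, (-slope - flat_width) / flat_width))
          flat = max(0.0, 1.0 - uphill - downhill)
          return {"uphill": uphill, "downhill": downhill, "flat": flat}

      profile = [0.0, 0.1, 0.5, 1.2, 1.3, 1.1, 0.4, 0.0]  # toy AFM line scan
      for i in range(1, len(profile)):
          m = memberships(profile[i] - profile[i - 1])
          print(i, max(m, key=m.get))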

  33. Quantitative analysis of terahertz spectra for illicit drugs using adaptive-range micro-genetic algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Yi; Ma, Yong; Lu, Zheng; Peng, Bei; Chen, Qin

    2011-08-01

    In the field of anti-illicit drug applications, many suspicious mixture samples might consist of various drug components—for example, a mixture of methamphetamine, heroin, and amoxicillin—which makes spectral identification very difficult. A terahertz spectroscopic quantitative analysis method using an adaptive-range micro-genetic algorithm with a variable internal population (ARVIPɛμGA) has been proposed. Five mixture cases are discussed using ARVIPɛμGA-driven quantitative terahertz spectroscopic analysis in this paper. The simulation results agree with previous experimental results and with results obtained using other experimental and numerical techniques, suggesting that the proposed technique has potential applications in the terahertz spectral identification of drug mixture components.
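
    A sketch of the linear-mixture step underlying such quantitation (non-negative least squares on toy spectra; this illustrates the mixture model only, not the ARVIPɛμGA optimizer itself):

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(0)
      refs = np.abs(rng.normal(size=(50, 3)))   # toy reference spectra, one
                                                # column per pure component
      true_frac = np.array([0.5, 0.3, 0.2])
      mixture = refs @ true_frac + 0.01 * rng.normal(size=50)

      fractions, _ = nnls(refs, mixture)        # solve refs @ x ~ mixture, x >= 0
      print(fractions / fractions.sum())        # estimated component fractions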

  34. Leukotriene B4 catabolism: quantitation of leukotriene B4 and its omega-oxidation products by reversed-phase high-performance liquid chromatography.

    PubMed

    Shak, S

    1987-01-01

    LTB4 and its omega-oxidation products may be rapidly, sensitively, and specifically quantitated by the methods of solid-phase extraction and reversed-phase high-performance liquid chromatography (HPLC), which are described in this chapter. Although other techniques, such as radioimmunoassay or gas chromatography-mass spectrometry, may be utilized for quantitative analysis of the lipoxygenase products of arachidonic acid, only the technique of reversed-phase HPLC can quantitate as many as 10 metabolites in a single analysis, without prior derivatization. In this chapter, we also reviewed the chromatographic theory which we utilized in order to optimize reversed-phase HPLC analysis of LTB4 and its omega-oxidation products. With this information and a gradient HPLC system, it is possible for any investigator to develop a powerful assay for the potent inflammatory mediator, LTB4, or for any other lipoxygenase product of arachidonic acid.

  35. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.

  36. Analysis of Synthetic Polymers.

    ERIC Educational Resources Information Center

    Smith, Charles G.; And Others

    1989-01-01

    Reviews techniques for the characterization and analysis of synthetic polymers, copolymers, and blends. Includes techniques for structure determination, separation, and quantitation of additives and residual monomers; determination of molecular weight; and the study of thermal properties including degradation mechanisms. (MVL)

  37. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  38. [Research progress and development trend of quantitative assessment techniques for urban thermal environment].

    PubMed

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has become a focus of urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. These techniques have developed from statistical analysis of the urban-scale thermal environment based on historical weather station data to dynamic simulation and forecasting of the thermal environment at various scales. This study reviews the development of ground meteorological observation, thermal infrared remote sensing, and numerical simulation. The potential advantages and disadvantages, applicability, and development trends of these techniques are also summarized, aiming to provide fundamental knowledge for understanding urban thermal environment assessment and optimization.

  39. High performance thin layer chromatography (HPTLC) and high performance liquid chromatography (HPLC) for the qualitative and quantitative analysis of Calendula officinalis-advantages and limitations.

    PubMed

    Loescher, Christine M; Morton, David W; Razic, Slavica; Agatonovic-Kustrin, Snezana

    2014-09-01

    Chromatography techniques such as HPTLC and HPLC are commonly used to produce a chemical fingerprint of a plant, allowing identification and quantification of its main constituents. The aims of this study were to compare HPTLC and HPLC for qualitative and quantitative analysis of the major constituents of Calendula officinalis and to investigate the effect of different extraction techniques on the composition of C. officinalis extracts from different parts of the plant. HPTLC was found to be effective for qualitative analysis; however, HPLC was more accurate for quantitative analysis. A combination of the two methods may be useful in a quality control setting, as it would allow rapid qualitative analysis of herbal material while maintaining accurate quantification of extract composition.

  40. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  41. Quantitative Analysis of Tissue Samples by Combining iTRAQ Isobaric Labeling with Selected/Multiple Reaction Monitoring (SRM/MRM).

    PubMed

    Narumi, Ryohei; Tomonaga, Takeshi

    2016-01-01

    Mass spectrometry-based phosphoproteomics is an indispensable technique used in the discovery and quantification of phosphorylation events on proteins in biological samples. The application of this technique to tissue samples is especially useful for the discovery of biomarkers as well as for biological studies. We herein describe the application of large-scale phosphoproteome analysis and SRM/MRM-based quantitation as a strategy for the systematic discovery and validation of biomarkers using tissue samples.

  42. Analysis of Ergot Alkaloids

    PubMed Central

    Crews, Colin

    2015-01-01

    The principles and application of established and newer methods for the quantitative and semi-quantitative determination of ergot alkaloids in food, feed, plant materials and animal tissues are reviewed. The techniques of sampling, extraction, clean-up, detection, quantification and validation are described. The major procedures for ergot alkaloid analysis comprise liquid chromatography with tandem mass spectrometry (LC-MS/MS) and liquid chromatography with fluorescence detection (LC-FLD). Other methods based on immunoassays are under development and variations of these and minor techniques are available for specific purposes. PMID:26046699

  43. Analysis of defect structure in silicon. Characterization of samples from UCP ingot 5848-13C

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Guyer, T.; Stringfellow, G. B.

    1982-01-01

    Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848 - 13 C. Important trends were noticed between the measured data, cell efficiency, and diffusion length. Grain boundary substructure appears to have an important effect on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements give statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for QTM analysis was perfected.

  44. [Development of sample pretreatment techniques-rapid detection coupling methods for food safety analysis].

    PubMed

    Huang, Yichun; Ding, Weiwei; Zhang, Zhuomin; Li, Gongke

    2013-07-01

    This paper summarizes recent developments in rapid detection methods for food safety, such as sensors, optical techniques, portable spectral analysis, enzyme-linked immunosorbent assays, and portable gas chromatographs. Additionally, the applications of these rapid detection methods coupled with sample pretreatment techniques in real food safety analysis are reviewed. Such coupling has the potential to provide a basis for establishing selective, precise, and quantitative rapid detection methods in food safety analysis.

  45. 40 CFR 260.11 - References.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...). (7) ASTM E 168-88, “Standard Practices for General Techniques of Infrared Quantitative Analysis,” IBR...-Visible Quantitative Analysis,” IBR approved for § 264.1063. (9) ASTM E 260-85, “Standard Practice for... materials are available for purchase from the Environmental Protection Agency, Research Triangle Park, NC...

  46. Qualitative and quantitative analysis of lignocellulosic biomass using infrared techniques: A mini-review

    USDA-ARS?s Scientific Manuscript database

    Current wet chemical methods for biomass composition analysis using two-step sulfuric acid hydrolysis are time-consuming, labor-intensive, and unable to provide structural information about biomass. Infrared techniques provide fast, low-cost analysis, are non-destructive, and have shown promising re...

  47. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

  48. Self-Normalized Photoacoustic Technique for the Quantitative Analysis of Paper Pigments

    NASA Astrophysics Data System (ADS)

    Balderas-López, J. A.; Gómez y Gómez, Y. M.; Bautista-Ramírez, M. E.; Pescador-Rojas, J. A.; Martínez-Pérez, L.; Lomelí-Mejía, P. A.

    2018-03-01

    A self-normalized photoacoustic technique was applied for quantitative analysis of pigments embedded in solids. Paper samples (Whatman No. 1 filter paper) dyed with the pigment Direct Fast Turquoise Blue GL were used for this study. This pigment is a blue dye commonly used in industry to color paper and other fabrics. The optical absorption coefficient at a wavelength of 660 nm was measured for this pigment at various concentrations in the paper substrate. It was shown that the Beer-Lambert model for light absorption applies well to pigments in solid substrates, and optical absorption coefficients as large as 220 cm^{-1} can be measured with this photoacoustic technique.
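
    A minimal sketch of the linear Beer-Lambert calibration the record invokes (toy values chosen only to match the quoted 220 cm^{-1} upper end; the actual concentrations are not given here):

      import numpy as np

      conc = np.array([0.5, 1.0, 2.0, 4.0])        # pigment loading (assumed units)
      beta = np.array([28.0, 55.0, 112.0, 220.0])  # absorption coefficient, 1/cm

      slope, intercept = np.polyfit(conc, beta, 1)  # calibration line beta ~ s*c + b
      print(f"beta ~ {slope:.1f}*c + {intercept:.1f} (1/cm)")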

  49. 40 CFR 260.11 - References.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...). (7) ASTM E 168-88, “Standard Practices for General Techniques of Infrared Quantitative Analysis,” IBR...-Visible Quantitative Analysis,” IBR approved for § 264.1063. (9) ASTM E 260-85, “Standard Practice for..., Research Triangle Park, NC. (1) “Screening Procedures for Estimating the Air Quality Impact of Stationary...

  50. 40 CFR 260.11 - References.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...). (7) ASTM E 168-88, “Standard Practices for General Techniques of Infrared Quantitative Analysis,” IBR...-Visible Quantitative Analysis,” IBR approved for § 264.1063. (9) ASTM E 260-85, “Standard Practice for..., Research Triangle Park, NC. (1) “Screening Procedures for Estimating the Air Quality Impact of Stationary...

  51. 40 CFR 260.11 - References.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...). (7) ASTM E 168-88, “Standard Practices for General Techniques of Infrared Quantitative Analysis,” IBR...-Visible Quantitative Analysis,” IBR approved for § 264.1063. (9) ASTM E 260-85, “Standard Practice for..., Research Triangle Park, NC. (1) “Screening Procedures for Estimating the Air Quality Impact of Stationary...

  52. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    PubMed

    Chavez, Juan D; Eng, Jimmy K; Schweppe, Devin K; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E

    2016-01-01

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR and cryo-electron microscopy[1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise that ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by and validate previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  53. Analysis of a document/reporting system

    NASA Technical Reports Server (NTRS)

    Narrow, B.

    1971-01-01

    An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. It is believed that this is the first documented study that utilizes quantitative measures for full-scale system analysis. The quantitative measures and the techniques for collecting and qualifying the basic data, as described, are applicable to any information system. This report is therefore of interest to anyone concerned with the management, design, analysis, or evaluation of information systems.

  54. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis.

  15. Microstructural study of the nickel-base alloy WAZ-20 using qualitative and quantitative electron optical techniques

    NASA Technical Reports Server (NTRS)

    Young, S. G.

    1973-01-01

    The NASA nickel-base alloy WAZ-20 was analyzed by advanced metallographic techniques to qualitatively and quantitatively characterize its phases and stability. The as-cast alloy contained primary gamma prime, a coarse gamma-gamma prime eutectic, a matrix of gamma and fine gamma prime, and MC carbides. A specimen aged at 870 C for 1000 hours contained these same constituents and a few widely scattered high-W particles. No detrimental phases (such as sigma or mu) were observed. Scanning electron microscopy, light metallography, and replica electron microscopy methods are compared. The value of quantitative electron microprobe techniques such as spot and area analysis is demonstrated.

  16. Development of a Fourier transform infrared spectroscopy coupled to UV-Visible analysis technique for aminosides and glycopeptides quantitation in antibiotic locks.

    PubMed

    Sayet, G; Sinegre, M; Ben Reguiga, M

    2014-01-01

    The antibiotic lock technique maintains catheter sterility in high-risk patients on long-term parenteral nutrition. In our institution, vancomycin, teicoplanin, amikacin and gentamicin locks are prepared in the pharmaceutical department. To ensure patient safety and comply with regulatory requirements, antibiotic locks are submitted to qualitative and quantitative assays prior to their release. The aim of this study was to develop an alternative quantitation technique for each of these 4 antibiotics, using Fourier transform infrared (FTIR) spectroscopy coupled to UV-Visible spectroscopy, and to compare the results to HPLC or immunochemistry assays. Prevalidation studies established the spectroscopic conditions used for antibiotic lock quantitation: FTIR/UV combinations were used for amikacin (1091-1115cm(-1) and 208-224nm), vancomycin (1222-1240cm(-1) and 276-280nm), and teicoplanin (1226-1230cm(-1) and 278-282nm). Gentamicin was quantified with FTIR only (1045-1169cm(-1) and 2715-2850cm(-1)) due to UV-domain interference from parabens, the preservatives present in the commercial brand used to prepare the locks. For all antibiotic locks, the method was linear (R(2)=0.996 to 0.999), accurate, repeatable (intraday RSD%: 2.9 to 7.1%; inter-day RSD%: 2.9 to 5.1%) and precise. Compared to the reference methods, the FTIR/UV method was strongly correlated (Pearson factor: 97.4 to 99.9%) and did not show significant differences in recovery determinations. We developed a new, simple, reliable analysis technique for antibiotic quantitation in locks using an original combination of FTIR and UV analysis, allowing rapid identification and quantification of the studied antibiotics. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  17. Applying Quantitative Genetic Methods to Primate Social Behavior

    PubMed Central

    Brent, Lauren J. N.

    2013-01-01

    Increasingly, behavioral ecologists have applied quantitative genetic methods to investigate the evolution of behaviors in wild animal populations. The promise of quantitative genetics in unmanaged populations opens the door for simultaneous analysis of inheritance, phenotypic plasticity, and patterns of selection on behavioral phenotypes all within the same study. In this article, we describe how quantitative genetic techniques provide studies of the evolution of behavior with information that is unique and valuable. We outline technical obstacles for applying quantitative genetic techniques that are of particular relevance to studies of behavior in primates, especially those living in noncaptive populations (e.g., the need for pedigree information and the prevalence of non-Gaussian phenotypes), and demonstrate how many of these barriers are now surmountable. We illustrate this by applying recent quantitative genetic methods to spatial proximity data, a simple and widely collected primate social behavior, from adult rhesus macaques on Cayo Santiago. Our analysis shows that proximity measures are consistent across repeated measurements on individuals (repeatable) and that kin have similar mean measurements (heritable). Quantitative genetics may hold lessons of considerable importance for studies of primate behavior, even those without a specific genetic focus. PMID:24659839

  18. Incorporating Multiple-Choice Questions into an AACSB Assurance of Learning Process: A Course-Embedded Assessment Application to an Introductory Finance Course

    ERIC Educational Resources Information Center

    Santos, Michael R.; Hu, Aidong; Jordan, Douglas

    2014-01-01

    The authors offer a classification technique to make a quantitative skills rubric more operational, with the groupings of multiple-choice questions to match the student learning levels in knowledge, calculation, quantitative reasoning, and analysis. The authors applied this classification technique to the mid-term exams of an introductory finance…

  19. Variable selection based near infrared spectroscopy quantitative and qualitative analysis on wheat wet gluten

    NASA Astrophysics Data System (ADS)

    Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua

    2017-10-01

    Wet gluten is a useful quality indicator for wheat, and short wave near infrared spectroscopy (NIRS) is a high performance technique with the advantages of being economical, rapid and nondestructive. To study the feasibility of short wave NIRS for analyzing wet gluten directly from wheat seed, 54 representative wheat seed samples were collected and scanned by spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis. For quantitative analysis, normalization was the optimal pretreatment method; 17 wet-gluten-sensitive variables were selected by GA, and the GA model outperformed the all-variable model, with R2V=0.88 and RMSEV=1.47. For qualitative analysis, automatic weighted least squares baseline was the optimal pretreatment method, and the all-variable models outperformed the GA models. The correct classification rates for the three classes of <24%, 24-30%, and >30% wet gluten content were 95.45%, 84.52%, and 90.00%, respectively. The short wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.
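
    A minimal sketch of this kind of workflow (pretreatment, variable selection, PLS regression) is given below. It is not the authors' code: the data are synthetic, and a simple correlation filter stands in for the genetic-algorithm variable selection step.

```python
# Sketch: normalization pretreatment + variable selection + PLS regression
# on synthetic "NIR spectra". A correlation filter replaces GA selection.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 54, 200                      # 54 samples, as in the study
X = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)          # smooth spectra
X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)  # normalization
coef = np.zeros(n_wavelengths)
coef[60:77] = 0.5                                       # 17 "gluten-sensitive" bands
y = X @ coef + rng.normal(scale=0.3, size=n_samples)    # wet gluten content (%)

# Correlation-based stand-in for GA selection: keep the 17 best variables
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_wavelengths)])
selected = np.argsort(corr)[-17:]

X_tr, X_te, y_tr, y_te = train_test_split(X[:, selected], y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()
print(f"R2V={r2_score(y_te, y_hat):.2f}, RMSEV={mean_squared_error(y_te, y_hat) ** 0.5:.2f}")
```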

  20. Analysis of Gold Ores by Fire Assay

    ERIC Educational Resources Information Center

    Blyth, Kristy M.; Phillips, David N.; van Bronswijk, Wilhelm

    2004-01-01

    Students of an Applied Chemistry degree course carried out a fire-assay exercise. The exercise showed fire assay to be a worthwhile quantitative analytical technique and covered interesting theory, including acid-base and redox chemistry, as well as concepts such as inquarting and cupelling.

  1. Quantitative mass spectrometry methods for pharmaceutical analysis

    PubMed Central

    Loos, Glenn; Van Schepdael, Ann

    2016-01-01

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance the ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods capable of automated multi-residue analysis are generally the aim, (less sensitive) miniaturized set-ups have great potential due to their suitability for in-field usage. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644982

  2. Antibodies against toluene diisocyanate protein conjugates. Three methods of measurement.

    PubMed

    Patterson, R; Harris, K E; Zeiss, C R

    1983-12-01

    With the use of canine antisera against toluene diisocyanate (TDI)-dog serum albumin (DSA), techniques for measuring antibody against TDI-DSA were evaluated. The use of an ammonium sulfate precipitation assay showed suggestive evidence of antibody binding but high levels of TDI-DSA precipitation in the absence of antibody limit any usefulness of this technique. Double-antibody co-precipitation techniques will measure total antibody or Ig class antibody against 125I-TDI-DSA. These techniques are quantitative. The polystyrene tube radioimmunoassay is a highly sensitive method of detecting and quantitatively estimating IgG antibody. The enzyme linked immunosorbent assay is a rapidly adaptable method for the quantitative estimation of IgG, IgA, and IgM against TDI-homologous proteins. All these techniques were compared and results are demonstrated by using the same serum sample for analysis.

  3. Cardiac imaging: working towards fully-automated machine analysis & interpretation

    PubMed Central

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-01-01

    Introduction Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804

  4. A collection of flow visualization techniques used in the Aerodynamic Research Branch

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Theoretical and experimental research on unsteady aerodynamic flows is discussed. Complex flow fields that involve separations, vortex interactions, and transonic flow effects were investigated. Flow visualization techniques are used to obtain a global picture of the flow phenomena before detailed quantitative studies are undertaken. A wide variety of methods are used to visualize fluid flow, and a sampling of these methods is presented. It is emphasized that visualization is a first step toward thorough quantitative analysis and subsequent physical understanding of these flow fields.

  5. Sociological Paradoxes and Graduate Statistics Classes. A Response to "The Sociology of Teaching Graduate Statistics"

    ERIC Educational Resources Information Center

    Hardy, Melissa

    2005-01-01

    This article presents a response to Timothy Patrick Moran's article "The Sociology of Teaching Graduate Statistics." In his essay, Moran argues that exciting developments in techniques of quantitative analysis are currently coupled with a much less exciting formulaic approach to teaching sociology graduate students about quantitative analysis. The…

  6. Identification and confirmation of chemical residues by chromatography-mass spectrometry and other techniques

    USDA-ARS?s Scientific Manuscript database

    A quantitative answer cannot exist in an analysis without a qualitative component to give enough confidence that the result meets the analytical needs for the analysis (i.e. the result relates to the analyte and not something else). Just as a quantitative method must typically undergo an empirical ...

  7. Comparison of three‐dimensional analysis and stereological techniques for quantifying lithium‐ion battery electrode microstructures

    PubMed Central

    TAIWO, OLUWADAMILOLA O.; FINEGAN, DONAL P.; EASTWOOD, DAVID S.; FIFE, JULIE L.; BROWN, LEON D.; DARR, JAWWAD A.; LEE, PETER D.; BRETT, DANIEL J.L.

    2016-01-01

    Summary Lithium‐ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium‐ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3‐D imaging techniques, quantitative assessment of 3‐D microstructures from 2‐D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two‐dimensional (2‐D) data sets. In this study, stereological prediction and three‐dimensional (3‐D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium‐ion battery electrodes were imaged using synchrotron‐based X‐ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2‐D image sections generated from tomographic imaging, whereas direct 3‐D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2‐D image sections is bound to be associated with ambiguity and that volume‐based 3‐D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially‐dependent parameters, such as tortuosity and pore‐phase connectivity. PMID:26999804

  8. Comparison of three-dimensional analysis and stereological techniques for quantifying lithium-ion battery electrode microstructures.

    PubMed

    Taiwo, Oluwadamilola O; Finegan, Donal P; Eastwood, David S; Fife, Julie L; Brown, Leon D; Darr, Jawwad A; Lee, Peter D; Brett, Daniel J L; Shearing, Paul R

    2016-09-01

    Lithium-ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium-ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3-D imaging techniques, quantitative assessment of 3-D microstructures from 2-D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two-dimensional (2-D) data sets. In this study, stereological prediction and three-dimensional (3-D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium-ion battery electrodes were imaged using synchrotron-based X-ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2-D image sections generated from tomographic imaging, whereas direct 3-D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2-D image sections is bound to be associated with ambiguity and that volume-based 3-D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially-dependent parameters, such as tortuosity and pore-phase connectivity. © 2016 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.
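
    The core contrast the paper draws can be illustrated on synthetic data (not the paper's tomograms): a 2-D section gives only an area-fraction estimate of a phase, while direct 3-D analysis counts voxels in the full volume.

```python
# Stereological (2-D section) vs direct 3-D estimation of a volume fraction
# in a synthetic binary "electrode" volume of random spherical particles.
import numpy as np

rng = np.random.default_rng(1)
vol = np.zeros((100, 100, 100), dtype=bool)
zz, yy, xx = np.indices(vol.shape)
for _ in range(40):                                  # scatter spherical "particles"
    c = rng.integers(10, 90, size=3)
    r = rng.integers(5, 12)
    vol |= ((zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2) <= r**2

vv_3d = vol.mean()                                   # direct 3-D volume fraction
aa_one = vol[50].mean()                              # area fraction of one section
aa_all = vol.mean(axis=(1, 2)).mean()                # mean over all sections (Delesse)
print(f"3-D Vv={vv_3d:.3f}, single-section Aa={aa_one:.3f}, mean Aa={aa_all:.3f}")
```

    A single section can deviate noticeably from the true volume fraction, while averaging many sections converges to it (the Delesse principle); parameters such as tortuosity and pore connectivity have no such unbiased 2-D estimator, which is the ambiguity the study quantifies.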

  9. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gritsenko, Marina A.; Xu, Zhe; Liu, Tao

    Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into diseases pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labelling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  10. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS.

    PubMed

    Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D

    2016-01-01

    Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into diseases pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
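
    The multiplexed quantitation step can be sketched with hypothetical reporter-ion intensities: per-peptide ratios against a reference channel are rolled up to a protein-level estimate. Real pipelines additionally apply channel-loading normalization and isotope-impurity correction.

```python
# Toy isobaric (TMT/iTRAQ-style) reporter-ion quantitation with made-up
# intensities: median rollup of per-peptide ratios to the protein level.
import numpy as np

# rows: peptides of one protein; columns: 4 multiplexed sample channels
reporters = np.array([[1.0e5, 1.9e5, 0.9e5, 1.1e5],
                      [2.2e5, 4.1e5, 2.0e5, 2.3e5],
                      [0.6e5, 1.2e5, 0.5e5, 0.7e5]])

ratios = reporters / reporters[:, :1]      # per-peptide ratios vs channel 1
protein = np.median(ratios, axis=0)        # robust protein-level rollup
print("relative abundance vs channel 1:", np.round(protein, 2))
```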

  11. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    In order to develop image processing methods of the kind widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results for the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis reveal micron- and nano-scale information characteristic of each rock type and its history. These results could be used for mineral identification and for studies of rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.

  12. Quantitative analysis of virgin coconut oil in cream cosmetics preparations using fourier transform infrared (FTIR) spectroscopy.

    PubMed

    Rohman, A; Man, Yb Che; Sismindari

    2009-10-01

    Today, virgin coconut oil (VCO) is becoming a valuable oil and an attractive topic for researchers because of its several biological activities. In the cosmetics industry, VCO is an excellent material that functions as a skin moisturizer and softener. It is therefore important to develop a fast and reliable quantitative analytical method. Fourier transform infrared (FTIR) spectroscopy with the attenuated total reflectance (ATR) sample handling technique can be successfully used to analyze VCO quantitatively in cream cosmetic preparations. A multivariate analysis using a partial least squares (PLS) calibration model revealed a good relationship between the actual and FTIR-predicted values of VCO, with a coefficient of determination (R2) of 0.998.

  13. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    PubMed

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method making use of multivariate reference signals of fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., shoulder of the prominent fused silica boson peak (~130 cm(-1)); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm(-1))) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R(2) = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R(2) = 0.985) can also be achieved for laser power prediction in real time when we applied the multivariate method independently on the five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms that gives rise to an RMSEP of ~2.0% (R(2) = 0.998) independent of laser excitation power variations. This work demonstrates that multivariate reference technique can be advantageously used to monitor and correct the variations of laser excitation power and fiber coupling efficiency in situ for standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in challenging Raman endoscopic applications.

  14. Improved quantitative analysis of spectra using a new method of obtaining derivative spectra based on a singular perturbation technique.

    PubMed

    Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan

    2015-06-01

    Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data need to be estimated with high accuracy. A new spectral estimator based on a singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and a stability analysis of the estimator is given. Theoretical analysis and simulation results confirm that derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated using the analysis of beer and marzipan spectra. The derivative spectra of both data sets are used to build calibration models using partial least squares (PLS) modeling. The results show that PLS based on the new estimator can achieve better performance compared with the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
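
    The SPSE itself is not a public library routine, but the Savitzky-Golay baseline the authors compare against is standard; a sketch on a synthetic noisy spectrum is shown below.

```python
# Savitzky-Golay estimation of the first derivative of a noisy spectrum,
# the baseline method the paper benchmarks against. Data are synthetic.
import numpy as np
from scipy.signal import savgol_filter

x = np.linspace(0, 10, 500)
spectrum = np.exp(-(x - 4)**2) + 0.6 * np.exp(-(x - 6)**2 / 0.5)   # overlapping bands
noisy = spectrum + np.random.default_rng(2).normal(scale=0.01, size=x.size)

dx = x[1] - x[0]
# window_length and polyorder trade noise suppression against peak distortion
d1 = savgol_filter(noisy, window_length=21, polyorder=3, deriv=1, delta=dx)
rmse = np.sqrt(np.mean((d1 - np.gradient(spectrum, dx))**2))
print(f"RMSE of Savitzky-Golay first derivative: {rmse:.4f}")
```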

  15. On the Applications of IBA Techniques to Biological Samples Analysis: PIXE and RBS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falcon-Gonzalez, J. M.; Bernal-Alvarado, J.; Sosa, M.

    2008-08-11

    The analytical techniques based on ion beams, or IBA techniques, give quantitative information on elemental concentrations in samples of a wide variety of nature. In this work, we focus on the PIXE technique, analyzing thick target biological specimens (TTPIXE) using 3 MeV protons produced by an electrostatic accelerator. A nuclear microprobe was used to perform PIXE and RBS simultaneously, in order to resolve the uncertainties produced in absolute PIXE quantification. The advantages of using both techniques and a nuclear microprobe are discussed. Quantitative results are shown to illustrate the multielemental resolution of the PIXE technique; for this, a blood standard was used.

  16. Secondary Analysis of Qualitative Data.

    ERIC Educational Resources Information Center

    Turner, Paul D.

    The reanalysis of data to answer the original research question with better statistical techniques or to answer new questions with old data is not uncommon in quantitative studies. Meta analysis and research syntheses have increased with the increase in research using similar statistical analyses, refinements of analytical techniques, and the…

  17. Quantitation of Phenol Levels in Oil of Wintergreen Using Gas Chromatography-Mass Spectrometry with Selected Ion Monitoring

    ERIC Educational Resources Information Center

    Sobel, Robert M.; Ballantine, David S.; Ryzhov, Victor

    2005-01-01

    Gas chromatography-mass spectrometry (GC-MS) is a powerful technique with industrial applications that can elucidate the components of a complex mixture while offering the benefits of high-precision quantitative analysis. Natural wintergreen oil is examined for its phenol concentration to determine the level of refining…

  18. qHNMR Analysis of Purity of Common Organic Solvents--An Undergraduate Quantitative Analysis Laboratory Experiment

    ERIC Educational Resources Information Center

    Bell, Peter T.; Whaley, W. Lance; Tochterman, Alyssa D.; Mueller, Karl S.; Schultz, Linda D.

    2017-01-01

    NMR spectroscopy is currently a premier technique for structural elucidation of organic molecules. Quantitative NMR (qNMR) methodology has developed more slowly but is now widely accepted, especially in the areas of natural product and medicinal chemistry. However, many undergraduate students are not routinely exposed to this important concept.…

  19. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    PubMed

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

    Since most high-throughput techniques used in biomarker discovery are very time- and cost-intensive, highly specific and quantitative analytical methods are needed as alternatives for routine analysis. Conventional Western blotting allows detection of specific proteins to the level of single isotypes, but its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r=0.997). By an additional fluorescent staining of all proteins immediately after their transfer to the blot membrane, it is possible to visualise simultaneously the antibody binding and the total protein profile. This allows for an accurate correction for protein load. Applying this normalisation, it could be demonstrated that fluorescence-based Western blotting is able to reproduce a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases, as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest possibilities of use that go far beyond it.
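
    The load-correction arithmetic is simple and can be sketched with hypothetical intensities: each target-band signal is divided by the total-protein stain signal of its lane.

```python
# Total-protein normalisation of Western blot signals (hypothetical numbers):
# band intensity / whole-lane stain intensity corrects for protein load.
band = {"lane1": 1520.0, "lane2": 2980.0}           # antibody signal (a.u.)
total_protein = {"lane1": 8.1e4, "lane2": 1.6e5}    # whole-lane stain (a.u.)

normalized = {k: band[k] / total_protein[k] for k in band}
fold = normalized["lane2"] / normalized["lane1"]
print(normalized, f"lane2 vs lane1: {fold:.2f}-fold")
```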

  20. Diagnostic accuracy of semi-quantitative and quantitative culture techniques for the diagnosis of catheter-related infections in newborns and molecular typing of isolated microorganisms.

    PubMed

    Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza

    2014-05-22

    Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. The semi-quantitative culture method showed higher sensitivity and specificity for the diagnosis of CR-BSIs in newborns when compared to the quantitative technique. In addition, this method is easier to perform and shows better agreement with the gold standard, and should therefore be recommended for routine clinical laboratory use. PFGE may contribute to the control of CR-BSIs by identifying clusters of microorganisms in neonatal ICUs, providing a means of determining potential cross-infection between patients.
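
    The reported metrics follow the standard definitions; the sketch below uses hypothetical confusion-matrix counts chosen only to reproduce the quoted semi-quantitative figures (72.7% sensitivity, 95.7% specificity).

```python
# Sensitivity and specificity from confusion-matrix counts (hypothetical).
def sensitivity(tp, fn):
    return tp / (tp + fn)      # detected fraction of confirmed CR-BSI cases

def specificity(tn, fp):
    return tn / (tn + fp)      # correctly excluded fraction of non-cases

print(f"sensitivity = {sensitivity(16, 6):.1%}")    # 16/22 = 72.7%
print(f"specificity = {specificity(244, 11):.1%}")  # 244/255 = 95.7%
```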

  1. Radiologic-Pathologic Analysis of Contrast-enhanced and Diffusion-weighted MR Imaging in Patients with HCC after TACE: Diagnostic Accuracy of 3D Quantitative Image Analysis

    PubMed Central

    Chapiro, Julius; Wood, Laura D.; Lin, MingDe; Duran, Rafael; Cornish, Toby; Lesage, David; Charu, Vivek; Schernthaner, Rüdiger; Wang, Zhijun; Tacher, Vania; Savic, Lynn Jeanette; Kamel, Ihab R.

    2014-01-01

    Purpose: To evaluate the diagnostic performance of three-dimensional (3D) quantitative enhancement-based and diffusion-weighted volumetric magnetic resonance (MR) imaging assessment of hepatocellular carcinoma (HCC) lesions in determining the extent of pathologic tumor necrosis after transarterial chemoembolization (TACE). Materials and Methods: This institutional review board–approved retrospective study included 17 patients with HCC who underwent TACE before surgery. Semiautomatic 3D volumetric segmentation of target lesions was performed at the last MR examination before orthotopic liver transplantation or surgical resection. The amount of necrotic tumor tissue on contrast material–enhanced arterial phase MR images and the amount of diffusion-restricted tumor tissue on apparent diffusion coefficient (ADC) maps were expressed as a percentage of the total tumor volume. Visual assessment of the extent of tumor necrosis and tumor response according to European Association for the Study of the Liver (EASL) criteria was performed. Pathologic tumor necrosis was quantified by using slide-by-slide segmentation. Correlation analysis was performed to evaluate the predictive values of the radiologic techniques. Results: At histopathologic examination, the mean percentage of tumor necrosis was 70% (range, 10%–100%). Both 3D quantitative techniques demonstrated a strong correlation with tumor necrosis at pathologic examination (R2 = 0.9657 and R2 = 0.9662 for quantitative EASL and quantitative ADC, respectively) and strong intermethod agreement (R2 = 0.9585). Both methods showed a significantly lower discrepancy with pathologically measured necrosis (residual standard error [RSE] = 6.38 and 6.33 for quantitative EASL and quantitative ADC, respectively) when compared with non-3D techniques (RSE = 12.18 for visual assessment). Conclusion: This radiologic-pathologic correlation study demonstrates the diagnostic accuracy of 3D quantitative MR imaging techniques in identifying pathologically measured tumor necrosis in HCC lesions treated with TACE. © RSNA, 2014 Online supplemental material is available for this article. PMID:25028783
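
    The volumetric quantification step reduces to voxel counting once segmentation is done; a sketch on synthetic masks (not the study's data) is shown below.

```python
# Percent tumor necrosis as the share of non-enhancing (or diffusion-
# restricted) voxels within a segmented tumor volume. Masks are synthetic.
import numpy as np

rng = np.random.default_rng(3)
tumor = np.zeros((64, 64, 64), dtype=bool)
tumor[16:48, 16:48, 16:48] = True                         # segmented tumor volume
necrosis = tumor & (rng.random(tumor.shape) < 0.7)        # ~70% necrotic voxels

percent_necrosis = 100.0 * necrosis.sum() / tumor.sum()
print(f"quantitative 3-D necrosis estimate: {percent_necrosis:.1f}%")
```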

  2. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

    In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state analysis based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided for the quantitative estimation of material properties.

  3. Analysis of airborne MAIS imaging spectrometric data for mineral exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Jinnian; Zheng Lanfen; Tong Qingxi

    1996-11-01

    The high spectral resolution imaging spectrometric system makes quantitative analysis and mapping of surface composition possible. The key issue is the quantitative approach for analysis of surface parameters from imaging spectrometer data. This paper describes the methods and the stages of quantitative analysis. (1) Extracting surface reflectance from the imaging spectrometer image: laboratory and in-flight field measurements are conducted to calibrate the imaging spectrometer data, and atmospheric correction is used to obtain ground reflectance by means of the empirical line method and radiation transfer modeling. (2) Determining the quantitative relationship between absorption band parameters from the imaging spectrometer data and the chemical composition of minerals. (3) Comparing spectra from a spectral library with spectra derived from the imagery. A wavelet analysis-based spectrum-matching technique for quantitative analysis of imaging spectrometer data has been developed. Airborne MAIS imaging spectrometer data were used for the analysis, and the results have been applied to mineral and petroleum exploration in the Tarim Basin area, China. 8 refs., 8 figs.

  4. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  5. Respondent Techniques for Reduction of Emotions Limiting School Adjustment: A Quantitative Review and Methodological Critique.

    ERIC Educational Resources Information Center

    Misra, Anjali; Schloss, Patrick J.

    1989-01-01

    The critical analysis of 23 studies using respondent techniques for the reduction of excessive emotional reactions in school children focuses on research design, dependent variables, independent variables, component analysis, and demonstrations of generalization and maintenance. Results indicate widespread methodological flaws that limit the…

  6. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    PubMed

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.

  7. Fundamentals of quantitative dynamic contrast-enhanced MR imaging.

    PubMed

    Paldino, Michael J; Barboriak, Daniel P

    2009-05-01

    Quantitative analysis of dynamic contrast-enhanced MR imaging (DCE-MR imaging) has the power to provide information regarding physiologic characteristics of the microvasculature and is, therefore, of great potential value to the practice of oncology. In particular, these techniques could have a significant impact on the development of novel anticancer therapies as a promising biomarker of drug activity. Standardization of DCE-MR imaging acquisition and analysis to provide more reproducible measures of tumor vessel physiology is of crucial importance to realize this potential. The purpose of this article is to review the pathophysiologic basis and technical aspects of DCE-MR imaging techniques.

  8. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
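
    Why coverage matters can be shown with a toy redundancy model (illustrative numbers, not the F18 model or its failure rates): for a duplex system, an undetected first fault is assumed to fail the system outright.

```python
# Toy coverage model for a duplex system: p is the per-channel failure
# probability over the mission, c the probability a first fault is
# detected and isolated (coverage).
def duplex_unreliability(p, c):
    both_fail = p * p                        # redundancy exhausted
    uncovered = 2 * p * (1 - p) * (1 - c)    # first fault occurs, not covered
    return both_fail + uncovered

p = 1e-3
for c in (1.0, 0.99, 0.9):
    print(f"coverage={c:.2f}: P(system failure) = {duplex_unreliability(p, c):.2e}")
```

    Even 99% coverage raises the failure probability by an order of magnitude over perfect coverage, which is why the analysis above finds coverage to have a significant effect on reliability.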

  9. A Colorimetric Analysis Experiment Not Requiring a Spectrophotometer: Quantitative Determination of Albumin in Powdered Egg White

    ERIC Educational Resources Information Center

    Charlton, Amanda K.; Sevcik, Richard S.; Tucker, Dorie A.; Schultz, Linda D.

    2007-01-01

    A general science experiment for high school chemistry students might serve as an excellent review of the concepts of solution preparation, solubility, pH, and qualitative and quantitative analysis of a common food product. The students could learn to use safe laboratory techniques, collect and analyze data using proper scientific methodology and…

  10. Impact of Oriented Clay Particles on X-Ray Spectroscopy Analysis

    NASA Astrophysics Data System (ADS)

    Lim, A. J. M. S.; Syazwani, R. N.; Wijeyesekera, D. C.

    2016-07-01

    Understanding the engineering properties arising from the mineralogy and microfabric of clayey soils is very complex, which makes soil characterization difficult. Soil micromechanics recognizes that the microstructure and mineralogy of clay have a significant influence on its engineering behaviour. To achieve a more reliable quantitative evaluation of clay mineralogy, a proper sample preparation technique for quantitative clay mineral analysis is necessary. This paper presents the quantitative elemental analysis and chemical characterization of oriented and randomly oriented clay particles using X-ray spectroscopy. Three different types of clay, namely marine clay, bentonite and kaolin clay, were studied. The oriented samples were prepared by placing the clay dispersed in water onto porous ceramic tiles and letting it settle while applying a relatively weak suction through a vacuum pump. Images from a scanning electron microscope (SEM) were also used to compare the orientation patterns produced by the two sample preparation techniques. The quantitative X-ray spectroscopy analysis showed that the oriented sampling method was more accurate in identifying mineral deposits, because it produced better peak intensities in the spectrum and allowed more mineral content to be identified compared to randomly oriented samples.

  11. Quantitative methods for compensation of matrix effects and self-absorption in Laser Induced Breakdown Spectroscopy signals of solids

    NASA Astrophysics Data System (ADS)

    Takahashi, Tomoko; Thornton, Blair

    2017-12-01

    This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of compositions of solids measured using Laser Induced Breakdown Spectroscopy (LIBS) and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions where calibration curves are applicable to quantification of compositions of solid samples and their limitations are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and Saha equation, has been applied in a number of studies, requirements need to be satisfied for the calculation of chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) analysis and partial least squares (PLS) regression analysis, which can extract composition-related information from all spectral data, are widely established methods and have been applied to various fields, including in-situ applications in air and planetary exploration. Artificial neural networks (ANNs), in which non-linear effects can be modelled, have also been investigated as a quantitative method and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. In order to accelerate this process, it is recommended that accuracy be described using common figures of merit which express the overall normalised accuracy, such as the normalised root mean square error (NRMSE), when comparing the accuracy obtained from different setups and analytical methods.
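
    The recommended figure of merit is easy to compute; the sketch below applies a range-normalised RMSE to hypothetical predicted vs. certified concentrations (normalisation conventions vary: range, mean, or maximum; the range is used here).

```python
# Range-normalised root mean square error (NRMSE) for hypothetical LIBS
# predictions against certified reference concentrations.
import numpy as np

reference = np.array([1.2, 3.4, 5.0, 7.8, 9.1])   # certified concentrations (wt%)
predicted = np.array([1.0, 3.7, 4.6, 8.1, 9.5])   # LIBS estimates (wt%)

rmse = np.sqrt(np.mean((predicted - reference) ** 2))
nrmse = rmse / (reference.max() - reference.min())
print(f"RMSE = {rmse:.3f} wt%, NRMSE = {nrmse:.1%}")
```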

  12. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  13. Technique for quantitative RT-PCR analysis directly from single muscle fibers.

    PubMed

    Wacker, Michael J; Tehel, Michelle M; Gallagher, Philip M

    2008-07-01

    The use of single-cell quantitative RT-PCR has greatly aided the study of gene expression in fields such as muscle physiology. For this study, we hypothesized that single muscle fibers from a biopsy can be placed directly into the reverse transcription buffer and that gene expression data can be obtained without having to first extract the RNA. To test this hypothesis, biopsies were taken from the vastus lateralis of five male subjects. Single muscle fibers were isolated and either underwent RNA isolation (technique 1) or were placed directly into reverse transcription buffer (technique 2). After cDNA conversion, individual fiber cDNA was pooled and quantitative PCR was performed using primer-probes for beta(2)-microglobulin, glyceraldehyde-3-phosphate dehydrogenase, insulin-like growth factor I receptor, and glucose transporter subtype 4. The no-extraction method provided quantitative PCR data similar to those of the RNA extraction method. A third technique was also tested in which we used one-quarter of an individual fiber's cDNA for PCR (not pooled), and the average coefficient of variation between fibers was <8% (cycle threshold value) for all genes studied. The no-extraction technique was then tested on isolated muscle fibers using a gene known to increase after exercise (pyruvate dehydrogenase kinase 4). We observed a 13.9-fold change in expression after resistance exercise, which is consistent with what has been previously observed. These results demonstrate a successful method for gene expression analysis directly from single muscle fibers.
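
    Fold changes like the 13.9-fold increase quoted above are commonly computed with the 2^(-ΔΔCt) relative-expression method; the Ct values in the sketch below are hypothetical and assume roughly 100% amplification efficiency.

```python
# Standard 2^(-ddCt) relative-expression calculation with hypothetical
# cycle-threshold (Ct) values for a target gene vs a stable reference gene.
def fold_change(ct_target_post, ct_ref_post, ct_target_pre, ct_ref_pre):
    ddct = (ct_target_post - ct_ref_post) - (ct_target_pre - ct_ref_pre)
    return 2 ** (-ddct)

# e.g. PDK4 before and after resistance exercise
print(f"{fold_change(22.0, 18.5, 25.8, 18.5):.1f}-fold increase")  # ~13.9
```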

  14. Linear regression analysis and its application to multivariate chromatographic calibration for the quantitative analysis of two-component mixtures.

    PubMed

    Dinç, Erdal; Ozdemir, Abdil

    2005-01-01

    A multivariate chromatographic calibration technique was developed for the quantitative analysis of binary mixtures of enalapril maleate (EA) and hydrochlorothiazide (HCT) in tablets in the presence of losartan potassium (LST). The mathematical algorithm of the multivariate chromatographic calibration technique is based on linear regression equations constructed from the relationship between concentration and peak area at a set of five wavelengths. The algorithm of this calibration model, which has a simple mathematical content, is briefly described. This approach is a powerful mathematical tool for optimal chromatographic multivariate calibration and for eliminating fluctuations arising from instrumental and experimental conditions. The multivariate chromatographic calibration involves the reduction of multivariate linear regression functions to a univariate data set. The validation of the model was carried out by analyzing various synthetic binary mixtures and by using the standard addition technique. The developed calibration technique was applied to the analysis of real pharmaceutical tablets containing EA and HCT. The obtained results were compared with those obtained by the classical HPLC method. It was observed that the proposed multivariate chromatographic calibration gives better results than classical HPLC.
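
    The core idea can be sketched with hypothetical numbers: per-analyte linear responses (peak area vs concentration) at five wavelengths are stacked, and the binary mixture is resolved by least squares across wavelengths.

```python
# Resolving a binary mixture from five-wavelength peak areas by least
# squares. Sensitivities and areas are hypothetical, for illustration only.
import numpy as np

# peak-area sensitivity of EA (col 1) and HCT (col 2) at 5 wavelengths
S = np.array([[1.10, 0.20],
              [0.95, 0.35],
              [0.60, 0.80],
              [0.30, 1.05],
              [0.15, 1.20]])
true_conc = np.array([12.0, 25.0])                        # ug/mL EA, HCT
areas = S @ true_conc + np.random.default_rng(4).normal(scale=0.2, size=5)

est, *_ = np.linalg.lstsq(S, areas, rcond=None)
print(f"estimated EA = {est[0]:.1f} ug/mL, HCT = {est[1]:.1f} ug/mL")
```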

  15. Glutenite bodies sequence division of the upper Es4 in northern Minfeng zone of Dongying Sag, Bohai Bay Basin, China

    NASA Astrophysics Data System (ADS)

    Shao, Xupeng

    2017-04-01

    Glutenite bodies are widely developed in the northern Minfeng zone of Dongying Sag. Their litho-electric relationship is not clear. In addition, because the conventional sequence stratigraphic research method has the drawback of involving too many subjective human factors, it has limited the deepening of regional sequence stratigraphic research. The wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data have the advantage over conventional methods of dividing sequence stratigraphy quantitatively. Building on the conventional sequence research method, this paper uses the above techniques to divide the fourth-order sequences of the upper Es4 in the northern Minfeng zone of Dongying Sag. The research shows that the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data are essentially consistent, both dividing sequence stratigraphy quantitatively in the frequency domain. The wavelet transform technique has high resolution and is suitable for areas with wells; the seismic time-frequency analysis technique has wide applicability but low resolution, so the two techniques should be combined. The upper Es4 in the northern Minfeng zone of Dongying Sag is a complete third-order sequence, which can be further subdivided into 5 fourth-order sequences with fining-upward depositional characteristics. Key words: Dongying sag, northern Minfeng zone, wavelet transform technique, time-frequency analysis technique, the upper Es4, sequence stratigraphy

  16. Improved cardiac motion detection from ultrasound images using TDIOF: a combined B-mode/ tissue Doppler approach

    NASA Astrophysics Data System (ADS)

    Tavakoli, Vahid; Stoddard, Marcus F.; Amini, Amir A.

    2013-03-01

    Quantitative motion analysis of echocardiographic images helps clinicians with the diagnosis and therapy of patients suffering from cardiac disease. Quantitative analysis is usually based on TDI (Tissue Doppler Imaging) or speckle tracking. These methods are based on two independent techniques - the Doppler Effect and image registration, respectively. In order to increase the accuracy of the speckle tracking technique and cope with the angle dependency of TDI, herein, a combined approach dubbed TDIOF (Tissue Doppler Imaging Optical Flow) is proposed. TDIOF is formulated based on the combination of B-mode and Doppler energy terms in an optical flow framework and minimized using algebraic equations. In this paper, we report on validations with simulated, physical cardiac phantom, and in-vivo patient data. It is shown that the additional Doppler term is able to increase the accuracy of speckle tracking, the basis for several commercially available echocardiography analysis techniques.

  17. Two imaging techniques for 3D quantification of pre-cementation space for CAD/CAM crowns.

    PubMed

    Rungruanganunt, Patchanee; Kelly, J Robert; Adams, Douglas J

    2010-12-01

    Internal three-dimensional (3D) "fit" of prostheses to prepared teeth is likely more important clinically than "fit" judged only at the level of the margin (i.e. marginal "opening"). This work evaluates two techniques for quantitatively defining 3D "fit", both using pre-cementation space impressions: X-ray microcomputed tomography (micro-CT) and quantitative optical analysis. Both techniques are of interest for comparison of CAD/CAM system capabilities and for documenting "fit" as part of clinical studies. Pre-cementation space impressions were taken of a single zirconia coping on its die using a low viscosity poly(vinyl siloxane) impression material. Calibration specimens of this material were fabricated between the measuring platens of a micrometre. Both calibration curves and pre-cementation space impression data sets were obtained by examination using micro-CT and quantitative optical analysis. Regression analysis was used to compare calibration curves with calibration sets. Micro-CT calibration data showed tighter 95% confidence intervals and was able to measure over a wider thickness range than for the optical technique. Regions of interest (e.g., lingual, cervical) were more easily analysed with optical image analysis and this technique was more suitable for extremely thin impression walls (<10-15μm). Specimen preparation is easier for micro-CT and segmentation parameters appeared to capture dimensions accurately. Both micro-CT and the optical method can be used to quantify the thickness of pre-cementation space impressions. Each has advantages and limitations but either technique has the potential for use as part of clinical studies or CAD/CAM protocol optimization. Copyright © 2010 Elsevier Ltd. All rights reserved.

  18. Ratio of sequential chromatograms for quantitative analysis and peak deconvolution: Application to standard addition method and process monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.

    1990-08-01

    This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been baseline corrected. This ratio chromatogram provides a robust means for the identification and quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and used to calculate a ratio value equal to the ratio of concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and of an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
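
    A sketch of the idea on synthetic Gaussian peaks is given below; squared-signal weighting stands in for the paper's variance weighting of high signal-to-noise points.

```python
# Ratio chromatogram of two baseline-corrected synthetic runs: the weighted
# mean ratio over the pure-elution region recovers the concentration ratio.
import numpy as np

t = np.linspace(0, 10, 1000)
rng = np.random.default_rng(5)
peak = lambda c: c * np.exp(-(t - 4.0)**2 / (2 * 0.15**2))
run1 = peak(1.0) + rng.normal(scale=0.002, size=t.size)
run2 = peak(1.3) + rng.normal(scale=0.002, size=t.size)   # 30% higher concentration

ratio = run2 / np.where(np.abs(run1) > 1e-6, run1, np.nan)
window = (t > 3.7) & (t < 4.3)                 # pure elution of the analyte
w = run1[window] ** 2                          # weight high-S/N points more
r = np.nansum(ratio[window] * w) / np.nansum(w)
print(f"concentration ratio = {r:.2f} (true 1.30)")
```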

  19. Subsurface imaging and cell refractometry using quantitative phase/shear-force feedback microscopy

    NASA Astrophysics Data System (ADS)

    Edward, Kert; Farahi, Faramarz

    2009-10-01

    Over the last few years, several novel quantitative phase imaging techniques have been developed for the study of biological cells. However, many of these techniques are encumbered by inherent limitations, including 2π phase ambiguities and diffraction-limited spatial resolution. In addition, subsurface information in the phase data is not exploited. We hereby present a novel quantitative phase imaging system without 2π ambiguities, which also allows for subsurface imaging and cell refractometry studies. This is accomplished by utilizing simultaneously obtained shear-force topography information. We demonstrate how the quantitative phase and topography data can be used for subsurface and cell refractometry analysis, and present results for a fabricated structure and a malaria-infected red blood cell.

  20. Single shot white light interference microscopy with colour fringe analysis for quantitative phase imaging of biological cells

    NASA Astrophysics Data System (ADS)

    Srivastava, Vishal; Mehta, D. S.

    2013-02-01

    To quantitatively obtain the phase maps of onion cells and human red blood cells (RBCs) from a white-light interferogram, we used a Hilbert-transform colour fringe analysis technique. The red, green and blue colour components were decomposed from a single white-light interferogram, and refractive index profiles for each colour component were computed in a completely non-invasive manner for both the onion cells and the human RBCs. The present technique might be useful for the non-invasive determination of refractive index variation within cells and tissues and of the morphological features of samples, with ease of operation and low cost.

  1. Quantitative analysis of binary polymorphs mixtures of fusidic acid by diffuse reflectance FTIR spectroscopy, diffuse reflectance FT-NIR spectroscopy, Raman spectroscopy and multivariate calibration.

    PubMed

    Guo, Canyong; Luo, Xuefang; Zhou, Xiaohua; Shi, Beijia; Wang, Juanjuan; Zhao, Jinqi; Zhang, Xiaoxia

    2017-06-05

    Vibrational spectroscopic techniques such as infrared, near-infrared and Raman spectroscopy have become popular in detecting and quantifying polymorphism of pharmaceutics since they are fast and non-destructive. This study assessed the ability of three vibrational spectroscopic techniques combined with multivariate analysis to quantify a low-content undesired polymorph within a binary polymorphic mixture. Partial least squares (PLS) regression and support vector machine (SVM) regression were employed to build quantitative models. Fusidic acid, a steroidal antibiotic, was used as the model compound. It was found that PLS regression performed slightly better than SVM regression for all three spectroscopic techniques. Root mean square errors of prediction (RMSEP) ranged from 0.48% to 1.17% for diffuse reflectance FTIR spectroscopy, from 1.60% to 1.93% for diffuse reflectance FT-NIR spectroscopy, and from 1.62% to 2.31% for Raman spectroscopy. The results indicate that diffuse reflectance FTIR spectroscopy offers significant advantages in providing accurate measurement of polymorphic content in the fusidic acid binary mixtures, while Raman spectroscopy is the least accurate technique for quantitative analysis of polymorphs. Copyright © 2017 Elsevier B.V. All rights reserved.
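
    Calibration models of this kind are straightforward to prototype. The following sketch (ours, with synthetic two-component spectra standing in for the measured FTIR data) compares PLS and SVM regression by RMSEP on a held-out set:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVR

        rng = np.random.default_rng(4)
        # Toy binary-polymorph data: spectra are linear mixes of two pure-form
        # spectra plus noise; y is the weight fraction of the undesired form.
        form_a, form_b = rng.random(300), rng.random(300)
        y = rng.uniform(0.0, 0.15, 60)                 # low-content polymorph
        X = (np.outer(y, form_b) + np.outer(1 - y, form_a)
             + rng.normal(0, 0.01, (60, 300)))

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)
        for name, model in [("PLS", PLSRegression(n_components=3)),
                            ("SVM", SVR(kernel="linear", C=10.0))]:
            pred = np.ravel(model.fit(X_tr, y_tr).predict(X_te))
            rmsep = np.sqrt(np.mean((pred - y_te) ** 2))
            print(f"{name}: RMSEP = {100 * rmsep:.2f}%")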

  2. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movements. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.

  3. Integrating multiparametric prostate MRI into clinical practice

    PubMed Central

    2011-01-01

    Abstract Multifunctional magnetic resonance imaging (MRI) techniques are increasingly being used to address bottlenecks in prostate cancer patient management. These techniques yield qualitative, semi-quantitative and fully quantitative biomarkers that reflect on the underlying biological status of a tumour. If these techniques are to have a role in patient management, then standard methods of data acquisition, analysis and reporting have to be developed. Effective communication by the use of scoring systems, structured reporting and a graphical interface that matches prostate anatomy are key elements. Practical guidelines for integrating multiparametric MRI into clinical practice are presented. PMID:22187067

  4. Structural Image Analysis of the Brain in Neuropsychology Using Magnetic Resonance Imaging (MRI) Techniques.

    PubMed

    Bigler, Erin D

    2015-09-01

    Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative as well as quantitative methods to relate brain structure with neuropsychological outcome and are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness and shape. The challenge for neuropsychology is which metric to use, for which disorder and the timing of when image analysis methods are applied to assess brain structure and pathology. A basic overview is provided as to the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.

  5. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    PubMed

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available, or even acquirable, are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
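
    One of the simplest such transforms, optimal scaling of a model prediction against arbitrary-unit data, has a closed form. A minimal sketch (ours, not from the review; function name and data are hypothetical):

        import numpy as np

        def optimal_scale_sse(model, data):
            """Closed-form scale alpha minimizing ||alpha*model - data||^2.

            Useful when data are in arbitrary units (e.g. fluorescence) and
            only relative levels are meaningful; setting d(SSE)/d(alpha) = 0
            gives alpha = (model . data) / (model . model)."""
            model = np.asarray(model, float)
            data = np.asarray(data, float)
            alpha = model @ data / (model @ model)
            residual = alpha * model - data
            return alpha, residual @ residual

        # Example: model-predicted gradient vs. data in arbitrary units.
        model = np.array([0.1, 0.4, 0.9, 1.0, 0.6])
        data = 37.0 * model + np.random.default_rng(1).normal(0, 1.0, 5)
        alpha, sse = optimal_scale_sse(model, data)
        print(alpha, sse)        # alpha recovered close to 37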

  6. Qualitative and quantitative interpretation of SEM image using digital image processing.

    PubMed

    Saladra, Dawid; Kopernik, Magdalena

    2016-10-01

    The aim of this study is to improve the qualitative and quantitative analysis of scanning electron microscope micrographs through the development of a computer program that enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of the qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, the Laplacian 1 and Laplacian 2 filters, Otsu thresholding and reverse binarization. Several modifications of known image processing techniques and combinations of the selected techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed computer program with existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
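
    As an illustration of the kind of pipeline described (Otsu binarization followed by stereological measurements), here is a small sketch using scikit-image; the function, its parameters and the synthetic micrograph are ours, not the authors' program:

        import numpy as np
        from skimage import filters, measure

        def crack_stats(image, pixel_size_um):
            """Binarize a grayscale SEM micrograph with Otsu's threshold and
            return simple stereological parameters of the dark crack network."""
            thresh = filters.threshold_otsu(image)
            cracks = image < thresh                  # cracks darker than matrix
            props = measure.regionprops(measure.label(cracks))
            areas = [p.area * pixel_size_um**2 for p in props]
            # Crack orientation from each region's fitted ellipse.
            angles = [np.degrees(p.orientation) for p in props]
            total_len = sum(p.major_axis_length * pixel_size_um for p in props)
            area_total = image.size * pixel_size_um**2
            return {"crack_count": len(props),
                    "mean_area_um2": float(np.mean(areas)) if areas else 0.0,
                    "mean_angle_deg": float(np.mean(angles)) if angles else 0.0,
                    "crack_length_per_area": total_len / area_total}

        rng = np.random.default_rng(2)
        img = rng.normal(0.8, 0.05, (256, 256))
        img[100:105, 30:220] = 0.2                   # synthetic dark crack
        print(crack_stats(img, pixel_size_um=0.1))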

  7. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

    The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques to this aim. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a non-trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. With the first technique it is possible to produce scaffolds with random, non-regular, rounded pore geometry. The AM technique is instead able to produce scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted, and the resulting model can be used for the validation of the applied imaging and image analysis protocols. We report here an SR μ-CT image analysis approach that is able to effectively and accurately reveal the differences in the pore- and throat-size distributions, as well as the connectivity, of both AM and SCPL scaffolds.

  8. The Metaphors That Elementary School Students Use to Describe the Term "Teacher"

    ERIC Educational Resources Information Center

    Karadag, Ruhan; Gültekin, Mehmet

    2012-01-01

    The aim of this study is to investigate metaphors that elementary school 5th and 8th grade students (N = 567) use in order to describe the term "teacher". The data were collected using a questionnaire consisting of open-ended questions, and analyzed using qualitative and quantitative analysis techniques. Content analysis technique was…

  9. High-coverage quantitative proteomics using amine-specific isotopic labeling.

    PubMed

    Melanson, Jeremy E; Avery, Steven L; Pinto, Devanand M

    2006-08-01

    Peptide dimethylation with isotopically coded formaldehydes was evaluated as a potential alternative to techniques such as the iTRAQ method for comparative proteomics. The isotopic labeling strategy and custom-designed protein quantitation software were tested using protein standards and then applied to measure protein levels associated with Alzheimer's disease (AD). The method provided high accuracy (10% error), precision (14% RSD) and coverage (70%) when applied to the analysis of a standard solution of BSA by LC-MS/MS. The technique was then applied to measure protein abundance levels in brain tissue afflicted with AD relative to normal brain tissue. 2-D LC-MS analysis identified 548 unique proteins (p<0.05). Of these, 349 were quantified with two or more peptides that met the statistical criteria used in this study. Several classes of proteins exhibited significant changes in abundance. For example, elevated levels of antioxidant proteins and decreased levels of mitochondrial electron transport proteins were observed. The results demonstrate the utility of the labeling method for high-throughput quantitative analysis.
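
    The final quantitation step of such label-based workflows, rolling peptide-level heavy/light ratios up to proteins under a minimum-peptide criterion, reduces to a few lines. A toy sketch (made-up intensities and protein names; not the authors' software):

        import pandas as pd

        # Toy table of light/heavy ion intensities for peptides of two proteins.
        peptides = pd.DataFrame({
            "protein": ["ALB", "ALB", "GFAP", "GFAP", "GFAP"],
            "light":   [1.2e6, 8.0e5, 3.1e5, 2.6e5, 2.9e5],   # control channel
            "heavy":   [1.1e6, 8.3e5, 6.3e5, 5.0e5, 6.1e5],   # disease channel
        })
        peptides["ratio"] = peptides["heavy"] / peptides["light"]

        # Protein ratio = median of its peptide ratios; require >= 2 peptides,
        # mirroring the two-peptide criterion used in the study.
        summary = (peptides.groupby("protein")["ratio"]
                   .agg(ratio="median", n="count")
                   .query("n >= 2"))
        print(summary)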

  20. Critical factors determining the quantification capability of matrix-assisted laser desorption/ionization–time-of-flight mass spectrometry

    PubMed Central

    Wang, Chia-Chen; Lai, Yin-Hung; Ou, Yu-Meng; Chang, Huan-Tsung; Wang, Yi-Sheng

    2016-01-01

    Quantitative analysis with mass spectrometry (MS) is important but challenging. Matrix-assisted laser desorption/ionization (MALDI) coupled with time-of-flight (TOF) MS offers superior sensitivity, resolution and speed, but such techniques have numerous disadvantages that hinder quantitative analyses. This review summarizes essential obstacles to analyte quantification with MALDI-TOF MS, including the complex ionization mechanism of MALDI, sensitive characteristics of the applied electric fields and the mass-dependent detection efficiency of ion detectors. General quantitative ionization and desorption interpretations of ion production are described. Important instrument parameters and available methods of MALDI-TOF MS used for quantitative analysis are also reviewed. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644968

  11. A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.

    PubMed

    Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B

    2015-12-04

    A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra high performance (UP)LC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with the quantitation method that involves protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries.

  12. Automated Quantitative Nuclear Cardiology Methods

    PubMed Central

    Motwani, Manish; Berman, Daniel S.; Germano, Guido; Slomka, Piotr J.

    2016-01-01

    Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps and estimate global and local measures of stress/rest perfusion – all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This chapter briefly reviews these techniques, highlights several challenges and discusses the latest developments. PMID:26590779

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carla J. Miller

    This report summarizes a literature review based on previous work performed at the Idaho National Laboratory studying the Three Mile Island 2 (TMI-2) nuclear reactor accident, specifically the melted fuel debris. The purpose of the literature review was to document prior published work that supports the feasibility of the analytical techniques that were developed to provide quantitative results on the make-up of the fuel and reactor component debris located inside and outside the containment. The quantitative analysis provides a technique to perform nuclear fuel accountancy measurements.

  14. A further component analysis for illicit drugs mixtures with THz-TDS

    NASA Astrophysics Data System (ADS)

    Xiong, Wei; Shen, Jingling; He, Ting; Pan, Rui

    2009-07-01

    A new method for quantitative analysis of mixtures of illicit drugs with THz time-domain spectroscopy was proposed and verified experimentally. The traditional method requires the fingerprints of all the pure chemical components. In practice, since only the objective components in a mixture and their absorption features are known, it is necessary and important to present a more practical technique for detection and identification. Our new method for quantitative inspection of mixtures of illicit drugs was developed using the derivative spectrum. In this method, the ratio of objective components in a mixture can be obtained on the assumption that all objective components in the mixture and their absorption features are known, while the unknown components are not needed. Methamphetamine and flour, an illicit drug and a common adulterant, were selected for our experiment. The experimental result verified the effectiveness of the method, suggesting that it could be an effective approach for quantitative identification of illicit drugs. This THz spectroscopy technique is of great significance for real-world applications of quantitative illicit drug analysis, and could be an effective method in the field of security and pharmaceutical inspection.
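
    One plausible reading of the derivative-spectrum idea is that differentiation suppresses the broad, featureless absorption of unknown adulterants, so only the sharp features of the known components need fitting. A sketch under that assumption (synthetic spectra; not the authors' implementation):

        import numpy as np

        def component_fractions(mix_spec, comp_specs, freq):
            """Estimate contributions of known components to a mixture THz
            spectrum by least-squares fitting in the derivative domain."""
            d_mix = np.gradient(mix_spec, freq)
            d_comps = np.vstack([np.gradient(c, freq) for c in comp_specs]).T
            coef, *_ = np.linalg.lstsq(d_comps, d_mix, rcond=None)
            return coef

        freq = np.linspace(0.2, 2.5, 500)                 # THz
        drug = np.exp(-0.5 * ((freq - 1.2) / 0.05) ** 2)  # sharp drug feature
        slope = 0.3 * freq                                # broad adulterant background
        mix = 0.4 * drug + slope
        # The derivative of the linear background is constant and nearly
        # orthogonal to the antisymmetric drug-feature derivative, so the
        # drug fraction is recovered despite the unknown background.
        print(component_fractions(mix, [drug], freq))     # ~[0.4]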

  15. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  16. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy-dispersive X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using the Friedman test followed by the Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded results similar to those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating results similar to those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield results similar to both the SEM analysis and elemental mapping.
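
    The statistical pipeline named here (a Friedman omnibus test, then pairwise Wilcoxon signed-rank tests with Bonferroni correction) is easy to reproduce with SciPy; the scores below are simulated stand-ins, not the study's data:

        import numpy as np
        from scipy.stats import friedmanchisquare, wilcoxon

        # Hypothetical adhesive-remnant scores for 20 brackets under three
        # assessment methods (visual, SEM image analysis, elemental mapping).
        rng = np.random.default_rng(9)
        visual = rng.integers(0, 5, 20)
        sem = np.clip(visual + rng.integers(-1, 2, 20), 0, 4)
        edx = np.clip(visual + rng.integers(0, 2, 20), 0, 4)

        stat, p = friedmanchisquare(visual, sem, edx)
        print(f"Friedman: chi2={stat:.2f}, p={p:.3f}")

        # Post-hoc pairwise Wilcoxon tests, Bonferroni-corrected for 3 pairs.
        for name, a, b in [("visual-sem", visual, sem),
                           ("visual-edx", visual, edx),
                           ("sem-edx", sem, edx)]:
            if np.any(a != b):                 # wilcoxon needs nonzero diffs
                w, pw = wilcoxon(a, b)
                print(f"{name}: corrected p={min(1.0, 3 * pw):.3f}")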

  17. Integration of different data gap filling techniques to facilitate assessment of polychlorinated biphenyls: A proof of principle case study (ASCCT meeting)

    EPA Science Inventory

    Data gap filling techniques are commonly used to predict hazard in the absence of empirical data. The most established techniques are read-across, trend analysis and quantitative structure-activity relationships (QSARs). Toxic equivalency factors (TEFs) are less frequently used d...

  18. SERS quantitative urine creatinine measurement of human subject

    NASA Astrophysics Data System (ADS)

    Wang, Tsuei Lian; Chiang, Hui-hua K.; Lu, Hui-hsin; Hung, Yung-da

    2005-03-01

    The SERS method for biomolecular analysis has several potential advantages over traditional biochemical approaches, including less specimen contact, non-destructiveness to the specimen, and multiple-component analysis. Urine is an easily available body fluid for monitoring the metabolites and renal function of the human body. We developed a surface-enhanced Raman scattering (SERS) technique using 50 nm gold colloidal particles for quantitative human urine creatinine measurements. This paper shows that the SERS band of creatinine (104 mg/dl) in artificial urine lies between 1400 cm-1 and 1500 cm-1, a region which was analyzed for quantitative creatinine measurement. Ten human urine samples were obtained from ten healthy persons and analyzed by the SERS technique. The partial least squares cross-validation (PLSCV) method was utilized to obtain the estimated creatinine concentration in the clinically relevant (55.9 mg/dl to 208 mg/dl) concentration range. The root-mean-square error of cross validation (RMSECV) is 26.1 mg/dl. This research demonstrates the feasibility of using SERS for human urine creatinine detection, and establishes the SERS platform technique for bodily fluid measurement.
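
    Leave-one-out PLS cross-validation of the kind reported here can be prototyped in a few lines with scikit-learn; the spectra and concentrations below are synthetic stand-ins for the SERS data:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict, LeaveOneOut

        rng = np.random.default_rng(3)
        # Toy stand-ins: 10 SERS spectra (200 points across the analyzed
        # 1400-1500 cm-1 window) and reference concentrations in mg/dl.
        conc = rng.uniform(55.9, 208.0, 10)
        shifts = np.linspace(1400, 1500, 200)
        band = np.exp(-0.5 * ((shifts - 1450) / 15) ** 2)
        spectra = conc[:, None] * band + rng.normal(0, 2.0, (10, 200))

        pls = PLSRegression(n_components=2)
        pred = cross_val_predict(pls, spectra, conc, cv=LeaveOneOut())
        rmsecv = np.sqrt(np.mean((pred.ravel() - conc) ** 2))
        print(f"RMSECV = {rmsecv:.1f} mg/dl")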

  19. Quantitative characterization of the spatial distribution of particles in materials: Application to materials processing

    NASA Technical Reports Server (NTRS)

    Parse, Joseph B.; Wert, J. A.

    1991-01-01

    Inhomogeneities in the spatial distribution of second-phase particles in engineering materials are known to affect certain mechanical properties. Progress in this area has been hampered by the lack of a convenient method for quantitative description of the spatial distribution of the second phase. This study intends to develop a broadly applicable method for the quantitative analysis and description of the spatial distribution of second-phase particles. The method was designed to operate on a desktop computer. The Dirichlet tessellation technique (a geometrical method for dividing an area containing an array of points into a set of polygons uniquely associated with the individual particles) was selected as the basis of an analysis technique implemented on a PC. This technique is being applied to the production of Al sheet by PM processing methods: vacuum hot pressing, forging, and rolling. The effect of varying hot-working parameters on the spatial distribution of aluminum oxide particles in consolidated sheet is being studied. Changes in distributions of properties such as through-thickness near-neighbor distance correlate with hot-working reduction.
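
    A Dirichlet (Voronoi) tessellation and near-neighbour statistics of the kind described are available off the shelf in SciPy; a minimal sketch on random centroids (ours, not the authors' PC implementation):

        import numpy as np
        from scipy.spatial import Voronoi, cKDTree, ConvexHull

        rng = np.random.default_rng(5)
        centroids = rng.random((300, 2))             # particle centroids (a.u.)

        # Dirichlet tessellation: one polygon per particle.
        vor = Voronoi(centroids)
        areas = []
        for region_idx in vor.point_region:
            region = vor.regions[region_idx]
            if -1 in region or len(region) == 0:     # skip unbounded edge cells
                continue
            areas.append(ConvexHull(vor.vertices[region]).volume)  # 2-D area

        # Near-neighbour distances (k=2: nearest point other than self).
        dist, _ = cKDTree(centroids).query(centroids, k=2)
        nn = dist[:, 1]

        print(f"mean cell area {np.mean(areas):.4f}, "
              f"CV {np.std(areas) / np.mean(areas):.2f}")
        print(f"mean near-neighbour distance {nn.mean():.4f}")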

  20. A new technique for collection, concentration and determination of gaseous tropospheric formaldehyde

    NASA Astrophysics Data System (ADS)

    Cofer, Wesley R.; Edahl, Robert A.

    This article describes an improved technique for making in situ measurements of gaseous tropospheric formaldehyde (CH2O). The new technique is based on nebulization/reflux principles that have proved very effective in quantitatively scrubbing water-soluble trace gases (e.g. CH2O) into aqueous mediums, which are subsequently analyzed. Atmospheric formaldehyde extractions and analyses have been performed with the nebulization/reflux concentrator using an acidified dinitrophenylhydrazine solution that indicate that quantitative analysis of CH2O at global background levels (~0.1 ppbv) is feasible with 20-min extractions. Analysis of CH2O, once concentrated, is accomplished using high performance liquid chromatography (HPLC) with ultraviolet photometric detection. The CH2O-hydrazone derivative, produced by the reaction of 2,4-dinitrophenylhydrazine in H2SO4-acidified aqueous solution, is detected as CH2O.

  1. Correlative fractography: combining scanning electron microscopy and light microscopes for qualitative and quantitative analysis of fracture surfaces.

    PubMed

    Hein, Luis Rogerio de Oliveira; de Oliveira, José Alberto; de Campos, Kamila Amato

    2013-04-01

    Correlative fractography is a new expression proposed here to describe a new method for the association between scanning electron microscopy (SEM) and light microscopy (LM) for the qualitative and quantitative analysis of fracture surfaces. This article presents a new method involving the fusion of an elevation map, obtained by extended depth-from-focus reconstruction from LM, with exactly the same area imaged by SEM and associated techniques, such as X-ray mapping. With this new technique, the true topographic information is perfectly associated with local fracture mechanisms; it is presented here as an alternative to stereo-pair reconstruction for the investigation of fractured components. The great advantage of this technique resides in the possibility of combining any imaging methods associated with LM and SEM for the same observed field of the fracture surface.

  2. A new technique for collection, concentration and determination of gaseous tropospheric formaldehyde

    NASA Technical Reports Server (NTRS)

    Cofer, W. R., III; Edahl, R. A., Jr.

    1986-01-01

    This article describes an improved technique for making in situ measurements of gaseous tropospheric formaldehyde (CH2O). The new technique is based on nebulization/reflux principles that have proved very effective in quantitatively scrubbing water soluble trace gases (e.g., CH2O) into aqueous mediums, which are subsequently analyzed. Atmospheric formaldehyde extractions and analyses have been performed with the nebulization/reflux concentrator using an acidified dinitrophenylhydrazine solution that indicate that quantitative analysis of CH2O at global background levels (about 0.1 ppbv) is feasible with 20-min extractions. Analysis of CH2O, once concentrated, is accomplished using high performance liquid chromatography with ultraviolet photometric detection. The CH2O-hydrazone derivative, produced by the reaction of 2,4-dinitrophenylhydrazine in H2SO4 acidified aqueous solution, is detected as CH2O.

  3. Analysis techniques for tracer studies of oxidation. M. S. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Basu, S. N.

    1984-01-01

    Analysis techniques to obtain quantitative diffusion data from tracer concentration profiles were developed. Mass balance ideas were applied to determine the mechanism of oxide growth and to separate the fractions of inward and outward growth of oxide scales. The process of inward oxygen diffusion with exchange was theoretically modelled, and the effects of lattice diffusivity, grain boundary diffusivity and grain size on the tracer concentration profile were studied. The development of the tracer concentration profile in a growing oxide scale was simulated. The double oxidation technique was applied to a FeCrAl-Zr alloy using O-18 as a tracer. SIMS was used to obtain the tracer concentration profile. The formation of lacey oxide on the alloy was discussed. Careful consideration was given to the quality of data required to obtain quantitative information.

  4. Quantitative analysis and feature recognition in 3-D microstructural data sets

    NASA Astrophysics Data System (ADS)

    Lewis, A. C.; Suh, C.; Stukowski, M.; Geltmacher, A. B.; Spanos, G.; Rajan, K.

    2006-12-01

    A three-dimensional (3-D) reconstruction of an austenitic stainless-steel microstructure was used as input for an image-based finite-element model to simulate the anisotropic elastic mechanical response of the microstructure. The quantitative data-mining and data-warehousing techniques used to correlate regions of high stress with critical microstructural features are discussed. Initial analysis of elastic stresses near grain boundaries due to mechanical loading revealed low overall correlation with their location in the microstructure. However, the use of data-mining and feature-tracking techniques to identify high-stress outliers revealed that many of these high-stress points are generated near grain boundaries and grain edges (triple junctions). These techniques also allowed for the differentiation between high stresses due to boundary conditions of the finite volume reconstructed, and those due to 3-D microstructural features.

  5. Effectiveness of Myocardial Contrast Echocardiography Quantitative Analysis during Adenosine Stress versus Visual Analysis before Percutaneous Therapy in Acute Coronary Pain: A Coronary Artery TIMI Grading Comparing Study

    PubMed Central

    Yang, Lixia; Mu, Yuming; Quaglia, Luiz Augusto; Tang, Qi; Guan, Lina; Wang, Chunmei; Shih, Ming Chi

    2012-01-01

    The study aim was to compare two different stress echocardiography interpretation techniques based on their correlation with thrombolysis in myocardial infarction (TIMI) flow grading in acute coronary syndrome (ACS) patients. Forty-one patients with suspected ACS were studied before diagnostic coronary angiography with myocardial contrast echocardiography (MCE) at rest and at stress. The correlation between visual interpretation of MCE and TIMI flow grade was significant. The correlation between the quantitative analysis (myocardial perfusion parameters: A, β, and A × β) and TIMI flow grade was also significant. MCE visual interpretation and TIMI flow grade had a high degree of agreement in diagnosing myocardial perfusion abnormality. If one considers TIMI flow grade <3 as abnormal, MCE visual interpretation at rest had 73.1% accuracy with 58.2% sensitivity and 84.2% specificity, and at stress had 80.4% accuracy with 76.6% sensitivity and 83.3% specificity. The MCE quantitative analysis had better accuracy, with 100% agreement across the different TIMI flow grades. MCE quantitative analysis at stress showed a direct correlation with TIMI flow grade, more significant than that of the visual interpretation technique. Further studies could measure the clinical relevance of this more objective approach to managing acute coronary syndrome patients before percutaneous coronary intervention (PCI). PMID:22778555
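
    The perfusion parameters A and β are conventionally obtained by fitting the contrast destruction-replenishment curve y(t) = A(1 - exp(-βt)), with A·β taken as a flow index; a sketch of that fit (the model form is the standard one, but the data and code are ours, not necessarily this study's implementation):

        import numpy as np
        from scipy.optimize import curve_fit

        def replenishment(t, A, beta):
            """Destruction-replenishment model commonly used in MCE: plateau A
            reflects capillary blood volume, rate beta reflects flow velocity,
            and A*beta approximates myocardial blood flow."""
            return A * (1.0 - np.exp(-beta * t))

        t = np.linspace(0.5, 10, 20)        # seconds after bubble destruction
        rng = np.random.default_rng(8)
        signal = replenishment(t, 12.0, 0.9) + rng.normal(0, 0.3, t.size)

        (A, beta), _ = curve_fit(replenishment, t, signal, p0=(10.0, 1.0))
        print(f"A={A:.1f}, beta={beta:.2f}/s, A*beta={A * beta:.1f}")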

  6. Age determination by teeth examination: a comparison between different morphologic and quantitative analyses.

    PubMed

    Amariti, M L; Restori, M; De Ferrari, F; Paganelli, C; Faglia, R; Legnani, G

    1999-06-01

    Age determination by teeth examination is one of the main means of establishing personal identification. Current studies have suggested different techniques for determining the age of a subject by means of the analysis of microscopic and macroscopic structural modifications of the tooth with ageing. The histological approach is useful among the various methodologies utilized for this purpose. It is still unclear which technique is best, as almost all authors suggest the use of the approach they themselves have tested. In the present study, age determination by means of microscopic techniques was based on the quantitative analysis of three parameters, all well recognized in the specialized literature: (1) dentinal tubule density/sclerosis, (2) tooth translucency, and (3) cementum thickness. After a description of the three methodologies (with automatic image processing of the dentinal sclerosis utilizing an appropriate computer program developed by the authors), the results obtained using the three different approaches are presented, and the merits and failings of each technique are identified, with the intention of determining the one offering the least degree of error in age determination.

  7. General quantitative genetic methods for comparative biology: phylogenies, taxonomies and multi-trait models for continuous and categorical characters.

    PubMed

    Hadfield, J D; Nakagawa, S

    2010-03-01

    Although many of the statistical techniques used in comparative biology were originally developed in quantitative genetics, subsequent development of comparative techniques has progressed in relative isolation. Consequently, many of the new and planned developments in comparative analysis already have well-tested solutions in quantitative genetics. In this paper, we take three recent publications that develop phylogenetic meta-analysis, either implicitly or explicitly, and show how they can be considered as quantitative genetic models. We highlight some of the difficulties with the proposed solutions, and demonstrate that standard quantitative genetic theory and software offer solutions. We also show how results from Bayesian quantitative genetics can be used to create efficient Markov chain Monte Carlo algorithms for phylogenetic mixed models, thereby extending their generality to non-Gaussian data. Of particular utility is the development of multinomial models for analysing the evolution of discrete traits, and the development of multi-trait models in which traits can follow different distributions. Meta-analyses often include a nonrandom collection of species for which the full phylogenetic tree has only been partly resolved. Using missing data theory, we show how the presented models can be used to correct for nonrandom sampling and show how taxonomies and phylogenies can be combined to give a flexible framework with which to model dependence.

  8. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  9. LIBS: a potential tool for industrial/agricultural waste water analysis

    NASA Astrophysics Data System (ADS)

    Karpate, Tanvi; K. M., Muhammed Shameem; Nayak, Rajesh; V. K., Unnikrishnan; Santhosh, C.

    2016-04-01

    Laser-induced breakdown spectroscopy (LIBS) is a multi-elemental analysis technique with various advantages, including the ability to detect any element in real time. The technique holds potential for environmental monitoring, and analyses of soil, glass, paint, water, plastic, etc. confirm its robustness for such applications. Compared to currently available water quality monitoring methods and techniques, LIBS has several advantages: no need for sample preparation, fast and easy operation, and a chemical-free process. In LIBS, a powerful pulsed laser generates a plasma, which is then analyzed to obtain quantitative and qualitative details of the elements present in the sample. Another main advantage of the LIBS technique is that it can operate in standoff mode for real-time analysis. Water samples from industrial and agricultural strata tend to contain many pollutants, making the water harmful for consumption. The emphasis of this project is to determine such harmful pollutants present in trace amounts in industrial and agricultural wastewater. When a high-intensity laser is made incident on the sample, a plasma is generated which gives a multielemental emission spectrum. LIBS analysis has shown outstanding success for solid samples. For liquid samples, the analysis is challenging, as the liquid can splash due to the high energy of the laser, making it difficult to generate a plasma. This project also deals with determining the most efficient method for testing water samples for qualitative as well as quantitative analysis using LIBS.

  10. A simple algorithm for quantifying DNA methylation levels on multiple independent CpG sites in bisulfite genomic sequencing electropherograms.

    PubMed

    Leakey, Tatiana I; Zielinski, Jerzy; Siegfried, Rachel N; Siegel, Eric R; Fan, Chun-Yang; Cooney, Craig A

    2008-06-01

    DNA methylation at cytosines is a widely studied epigenetic modification. Methylation is commonly detected using bisulfite modification of DNA followed by PCR and additional techniques such as restriction digestion or sequencing. These additional techniques are either laborious, require specialized equipment, or are not quantitative. Here we describe a simple algorithm that yields quantitative results from analysis of conventional four-dye-trace sequencing. We call this method Mquant and we compare it with the established laboratory method of combined bisulfite restriction assay (COBRA). This analysis of sequencing electropherograms provides a simple, easily applied method to quantify DNA methylation at specific CpG sites.
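
    The core of such an electropherogram-based quantification is a per-site peak-height ratio: after bisulfite conversion and PCR, an unmethylated cytosine reads as T and a methylated one as C, so C/(C+T) estimates methylation. A minimal sketch of that idea (ours; the published Mquant method includes additional normalization steps):

        def cpg_methylation(peak_heights):
            """Per-site methylation fractions from (C, T) dye-trace peak
            heights at interrogated CpG sites in a bisulfite electropherogram."""
            return [c / (c + t) if (c + t) > 0 else float("nan")
                    for c, t in peak_heights]

        # Hypothetical (C, T) peak heights at three CpG sites in one trace.
        sites = [(820, 190), (455, 460), (95, 900)]
        print([f"{m:.0%}" for m in cpg_methylation(sites)])  # 81%, 50%, 10%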

  11. Diffraction enhanced x-ray imaging for quantitative phase contrast studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, A. K.; Singh, B., E-mail: balwants@rrcat.gov.in; Kashyap, Y. S.

    2016-05-23

    Conventional X-ray imaging based on absorption contrast permits limited visibility of features having small density and thickness variations. For imaging of weakly absorbing materials, or materials possessing similar densities, a novel phase contrast imaging technique called diffraction enhanced imaging has been designed and developed at the imaging beamline of Indus-2, RRCAT, Indore. The technique provides improved visibility of interfaces and shows high contrast in the image for small density or thickness gradients in the bulk. This paper presents the basic principle, instrumentation and analysis methods for this technique. Initial results of quantitative phase retrieval carried out on various samples are also presented.

  12. Qualitative and quantitative mass spectrometry imaging of drugs and metabolites.

    PubMed

    Lietz, Christopher B; Gemperline, Erin; Li, Lingjun

    2013-07-01

    Mass spectrometric imaging (MSI) has rapidly increased its presence in the pharmaceutical sciences. While quantitative whole-body autoradiography and microautoradiography are the traditional techniques for molecular imaging of drug delivery and metabolism, MSI provides advantageous specificity that can distinguish the parent drug from metabolites and modified endogenous molecules. This review begins with the fundamentals of MSI sample preparation/ionization, and then moves on to both qualitative and quantitative applications with special emphasis on drug discovery and delivery. Cutting-edge investigations on sub-cellular imaging and endogenous signaling peptides are also highlighted, followed by perspectives on emerging technology and the path for MSI to become a routine analysis technique. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Qualitative and quantitative mass spectrometry imaging of drugs and metabolites

    PubMed Central

    Lietz, Christopher B.; Gemperline, Erin; Li, Lingjun

    2013-01-01

    Mass spectrometric imaging (MSI) has rapidly increased its presence in the pharmaceutical sciences. While quantitative whole-body autoradiography and microautoradiography are the traditional techniques for molecular imaging of drug delivery and metabolism, MSI provides advantageous specificity that can distinguish the parent drug from metabolites and modified endogenous molecules. This review begins with the fundamentals of MSI sample preparation/ionization, and then moves on to both qualitative and quantitative applications with special emphasis on drug discovery and delivery. Cutting-edge investigations on sub-cellular imaging and endogenous signaling peptides are also highlighted, followed by perspectives on emerging technology and the path for MSI to become a routine analysis technique. PMID:23603211

  14. Cognitive task analysis: Techniques applied to airborne weapons training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role of cognitive task analysis and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  15. Evaluation of electrochemical, UV/VIS and Raman spectroelectrochemical detection of Naratriptan with screen-printed electrodes.

    PubMed

    Hernández, Carla Navarro; Martín-Yerga, Daniel; González-García, María Begoña; Hernández-Santos, David; Fanjul-Bolado, Pablo

    2018-02-01

    Naratriptan, an active pharmaceutical ingredient with antimigraine activity, was electrochemically detected on untreated screen-printed carbon electrodes (SPCEs). Cyclic voltammetry and differential pulse voltammetry were used to carry out quantitative analysis of this molecule (in a Britton-Robinson buffer solution at pH 3.0) through its irreversible, diffusion-controlled oxidation at a potential of +0.75 V (vs. the Ag pseudoreference electrode). The Naratriptan oxidation product is an indole-based dimer with a yellowish colour (maximum absorption at 320 nm), so the UV-VIS spectroelectrochemistry technique was used for the very first time as an in situ characterization and quantification technique for this molecule. A reflection configuration approach allowed its measurement on the untreated carbon-based electrode. Finally, time-resolved Raman spectroelectrochemistry is used as a powerful technique to carry out qualitative and quantitative analysis of Naratriptan. Electrochemically treated silver screen-printed electrodes are shown to be easy-to-use and cost-effective SERS substrates for the analysis of Naratriptan. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Comparative analysis of techniques for detection of quiescent Botrytis cinerea in grapes by quantitative PCR

    USDA-ARS?s Scientific Manuscript database

    Quantitative PCR (qPCR) can be used to detect and monitor pathogen colonization, but early attempts to apply the technology to quiescent Botrytis cinerea infections of grape berries identified some specific limitations. In this study, four DNA extraction methods, two tissue-grinding methods, two gra...

  17. Combining Qualitative and Quantitative Data: An Example.

    ERIC Educational Resources Information Center

    Sikka, Anjoo; And Others

    Methodology from an ongoing research study to validate teaching techniques for deaf and blind students provides an example of the ways that several types of quantitative and qualitative data can be combined in analysis. Four teacher and student pairs were selected. The students were between 14 and 21 years old, had both auditory and visual…

  18. The Role of Hemispheral Asymmetry and Regional Activity of Quantitative EEG in Children with Stuttering

    ERIC Educational Resources Information Center

    Ozge, Aynur; Toros, Fevziye; Comelekoglu, Ulku

    2004-01-01

    We investigated the role of delayed cerebral maturation, hemisphere asymmetry and regional differences in children with stuttering and healthy controls during resting state and hyperventilation, using conventional EEG techniques and quantitative EEG (QEEG) analysis. This cross-sectional case control study included 26 children with stuttering and…

  19. Identification and evaluation of reliable reference genes for quantitative real-time PCR analysis in tea plant (Camellia sinensis (L.) O. Kuntze)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...

  20. Application of Person-Centered Approaches to Critical Quantitative Research: Exploring Inequities in College Financing Strategies

    ERIC Educational Resources Information Center

    Malcom-Piqueux, Lindsey

    2014-01-01

    This chapter discusses the utility of person-centered approaches to critical quantitative researchers. These techniques, which identify groups of individuals who share similar attributes, experiences, or outcomes, are contrasted with more commonly used variable-centered approaches. An illustrative example of a latent class analysis of the college…

  1. Collection Evaluation Techniques in the Academic Art Library.

    ERIC Educational Resources Information Center

    Kusnerz, Peggy Ann

    1983-01-01

    Presents an overview of library collection evaluation techniques described in the literature--list-checking, quantitative analysis, use studies, and subject specialist review--and offers suggestions to the librarian for the application of these methods in an art library. Twenty-five references are provided. (EJS)

  2. Two-dimensional fuzzy fault tree analysis for chlorine release from a chlor-alkali industry using expert elicitation.

    PubMed

    Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B

    2010-11-15

    The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst, as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of the hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from the storage and filling facility of a chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and a proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.
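
    To make the fuzzy-FTA idea concrete, here is a minimal sketch (ours, a standard fuzzy construction rather than the paper's TDFFTA) that propagates triangular fuzzy probabilities from hypothetical basic events through AND/OR gates by alpha-cut interval arithmetic:

        import numpy as np

        def tri_cut(tfn, alpha):
            """Alpha-cut [lo, hi] of a triangular fuzzy probability (a, m, b)."""
            a, m, b = tfn
            return np.array([a + alpha * (m - a), b - alpha * (b - m)])

        def gate_and(cuts):
            """AND gate on interval probabilities: product of intervals."""
            out = np.array([1.0, 1.0])
            for lo, hi in cuts:
                out = np.array([out[0] * lo, out[1] * hi])
            return out

        def gate_or(cuts):
            """OR gate: 1 - prod(1 - p); monotone, so endpoints map to endpoints."""
            out = np.array([1.0, 1.0])
            for lo, hi in cuts:
                out *= np.array([1.0 - hi, 1.0 - lo])
            return np.array([1.0 - out[1], 1.0 - out[0]])

        # Hypothetical elicited basic events (triangular fuzzy numbers):
        valve_leak = (1e-3, 2e-3, 4e-3)
        seal_failure = (5e-4, 1e-3, 3e-3)
        no_interlock = (1e-2, 2e-2, 5e-2)

        # Top event = (valve_leak AND seal_failure) OR no_interlock.
        for alpha in (0.0, 0.5, 1.0):
            cuts = [tri_cut(e, alpha) for e in (valve_leak, seal_failure)]
            release = gate_or([gate_and(cuts), tri_cut(no_interlock, alpha)])
            print(f"alpha={alpha}: P(top) in [{release[0]:.2e}, {release[1]:.2e}]")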

  3. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy.

    PubMed

    Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine

    2015-10-27

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy to learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data (1). This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.
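
    A semi-automated count of the kind described can be as simple as peak detection on an intensity profile drawn across the distal capillary row. A sketch with a synthetic profile (ours; the paper's algorithm involves more preprocessing than this):

        import numpy as np
        from scipy.signal import find_peaks

        def count_capillaries(profile, pixel_mm, min_prominence=10.0):
            """Count capillaries along a line across the distal capillary row.
            Capillaries appear as dark dips, so the profile is inverted first.
            Returns (count, capillaries per mm)."""
            peaks, _ = find_peaks(-np.asarray(profile, float),
                                  prominence=min_prominence)
            length_mm = len(profile) * pixel_mm
            return len(peaks), len(peaks) / length_mm

        # Synthetic profile: 9 dark capillaries over a 3 mm line.
        x = np.arange(600)
        profile = 200 - 60 * sum(np.exp(-0.5 * ((x - c) / 5.0) ** 2)
                                 for c in np.linspace(40, 560, 9))
        print(count_capillaries(profile, pixel_mm=3.0 / 600))  # (9, 3.0/mm)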

  4. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy

    PubMed Central

    Daskalakis, Constantine

    2015-01-01

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy to learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient’s microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.1 This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique. PMID:26554744

  5. The potential of statistical shape modelling for geometric morphometric analysis of human teeth in archaeological research

    PubMed Central

    Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia

    2017-01-01

    This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full-field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide highly detailed qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
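
    The modelling core (PCA over registered, point-corresponding shapes, with per-specimen PC weights for inter-sample comparison) fits in a few lines of NumPy. A sketch on synthetic landmark sets (ours; the paper's pipeline additionally carries the enamel/dentin parameter):

        import numpy as np

        def shape_pca(shapes, n_modes=3):
            """PCA statistical shape model from registered landmark sets.

            shapes : (n_samples, n_landmarks, dim) array, already aligned and
                     in point-to-point correspondence (as after registration).
            Returns mean shape, modes of variation, per-sample PC weights and
            the variance fraction explained by each retained mode."""
            n, k, d = shapes.shape
            X = shapes.reshape(n, k * d)
            mean = X.mean(axis=0)
            # SVD of the centred data matrix gives the modes of variation.
            U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
            modes = Vt[:n_modes]                  # covariance eigenvectors
            weights = (X - mean) @ modes.T        # PC weights per specimen
            explained = S**2 / np.sum(S**2)
            return mean.reshape(k, d), modes, weights, explained[:n_modes]

        rng = np.random.default_rng(6)
        base = rng.random((40, 3))                # 40 landmarks in 3-D
        specimens = base + rng.normal(0, 0.02, (25, 40, 3))
        mean, modes, w, ev = shape_pca(specimens)
        print(w.shape, ev)                        # (25, 3) weights to compare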

  6. Techniques for quantitative LC-MS/MS analysis of protein therapeutics: advances in enzyme digestion and immunocapture.

    PubMed

    Fung, Eliza N; Bryan, Peter; Kozhich, Alexander

    2016-04-01

    LC-MS/MS has been investigated to quantify protein therapeutics in biological matrices. The protein therapeutic is digested by an enzyme to generate surrogate peptide(s) before LC-MS/MS analysis. One challenge is isolating the protein therapeutic in the presence of the large number of endogenous proteins in biological matrices. Immunocapture, in which a capture agent is used to preferentially bind the protein therapeutic over other proteins, is gaining traction. The protein therapeutic is eluted for digestion and LC-MS/MS analysis. One area of tremendous potential for immunocapture-LC-MS/MS is to obtain quantitative data where ligand-binding assay alone is not sufficient, for example, quantitation of antidrug antibody complexes. Herein, we present an overview of recent advances in enzyme digestion and immunocapture applicable to protein quantitation.

  7. Boiler Tube Corrosion Characterization with a Scanning Thermal Line

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Jacobstein, Ronald; Reilly, Thomas

    2001-01-01

    Wall thinning due to corrosion in utility boiler water wall tubing is a significant operational concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. Unfortunately, ultrasonic inspection is manpower-intensive and slow. Therefore, thickness measurements are typically taken over a relatively small percentage of the total boiler wall, and statistical analysis is used to determine the overall condition of the boiler tubing. Other inspection techniques, such as the electromagnetic acoustic transducer (EMAT), have recently been evaluated; however, they provide only a qualitative evaluation - identifying areas or spots where corrosion has significantly reduced the wall thickness. NASA Langley Research Center, in cooperation with ThermTech Services, has developed a thermal NDE technique designed to quantitatively measure the wall thickness and thus determine the amount of material thinning present in steel boiler tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing, followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in the inspection speed and accuracy for large structures such as boiler water walls. A theoretical basis for the technique will be presented to establish the quantitative nature of the technique. Further, a dynamic calibration system will be presented for the technique that allows the extraction of thickness information from the temperature data. Additionally, the results of the application of this technology to actual water wall tubing samples and in-situ inspections will be presented.

  8. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas; however, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution combined with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  9. Large-scale quantitative analysis of painting arts.

    PubMed

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly lower color variety in the medieval period. Interestingly, moreover, the increment of the roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.
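
    For concreteness, two of the three measures could be computed along the following lines (a sketch under assumed definitions; the paper's exact estimators may differ):

    ```python
    import numpy as np

    def color_variety(img, bins=16):
        """Shannon entropy of a quantized RGB histogram (img: uint8, HxWx3)."""
        q = (img.astype(int) // (256 // bins)).reshape(-1, 3)
        codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
        p = np.bincount(codes, minlength=bins**3) / len(codes)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def roughness_exponent(brightness, scales=(2, 4, 8, 16, 32)):
        """Fit W(l) ~ l**alpha, with W(l) the mean std of brightness in l x l windows."""
        w = []
        for l in scales:
            h, v = brightness.shape[0] // l, brightness.shape[1] // l
            blocks = brightness[:h * l, :v * l].reshape(h, l, v, l)
            w.append(blocks.std(axis=(1, 3)).mean())
        alpha, _ = np.polyfit(np.log(scales), np.log(w), 1)
        return alpha
    ```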

  10. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
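
    The abstract gives no model details; as a minimal stand-in for the idea, a two-level beta-binomial with hyperparameters scored on a coarse grid already captures source-to-source variability in sparse event counts (full MCMC would replace the grid in practice; all numbers are invented):

    ```python
    import numpy as np
    from scipy.special import betaln

    events = np.array([0, 1, 0, 2])        # accidents observed per data source
    trials = np.array([50, 80, 60, 120])   # exposure units per source

    # Beta-binomial marginal likelihood per source; binomial coefficients
    # are omitted since they are constant in (a, b).
    def log_marglik(a, b):
        return np.sum(betaln(a + events, b + trials - events) - betaln(a, b))

    grid = [(a, b) for a in (0.5, 1.0, 2.0) for b in (20.0, 50.0, 100.0)]
    w = np.array([log_marglik(a, b) for a, b in grid])
    w = np.exp(w - w.max())
    w /= w.sum()

    # Grid-averaged mean event probability for a new, unseen source.
    p_new = sum(wi * a / (a + b) for wi, (a, b) in zip(w, grid))
    ```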

  11. An IBM PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  12. Three-dimensional segmentation of luminal and adventitial borders in serial intravascular ultrasound images

    NASA Technical Reports Server (NTRS)

    Shekhar, R.; Cothren, R. M.; Vince, D. G.; Chandra, S.; Thomas, J. D.; Cornhill, J. F.

    1999-01-01

    Intravascular ultrasound (IVUS) provides exact anatomy of arteries, allowing accurate quantitative analysis. Automated segmentation of IVUS images is a prerequisite for routine quantitative analyses. We present a new three-dimensional (3D) segmentation technique, called active surface segmentation, which detects luminal and adventitial borders in IVUS pullback examinations of coronary arteries. The technique was validated against expert tracings by computing correlation coefficients (range 0.83-0.97) and William's index values (range 0.37-0.66). The technique was statistically accurate, robust to image artifacts, and capable of segmenting a large number of images rapidly. Active surface segmentation enabled geometrically accurate 3D reconstruction and visualization of coronary arteries and volumetric measurements.

  13. Quantitative Analysis of Urine Vapor and Breath by Gas-Liquid Partition Chromatography

    PubMed Central

    Pauling, Linus; Robinson, Arthur B.; Teranishi, Roy; Cary, Paul

    1971-01-01

    When a human being is placed for several days on a completely defined diet, consisting almost entirely of small molecules that are absorbed from the stomach into the blood, intestinal flora disappear because of lack of nutrition. By this technique, the composition of body fluids can be made constant (standard deviation about 10%) after a few days, permitting significant quantitative analyses to be performed. A method of temperature-programmed gas-liquid partition chromatography has been developed for this purpose. It permits the quantitative determination of about 250 substances in a sample of breath, and of about 280 substances in a sample of urine vapor. The technique should be useful in the application of the principles of orthomolecular medicine. PMID:5289873

  14. Quantitative analysis of urine vapor and breath by gas-liquid partition chromatography.

    PubMed

    Pauling, L; Robinson, A B; Teranishi, R; Cary, P

    1971-10-01

    When a human being is placed for several days on a completely defined diet, consisting almost entirely of small molecules that are absorbed from the stomach into the blood, intestinal flora disappear because of lack of nutrition. By this technique, the composition of body fluids can be made constant (standard deviation about 10%) after a few days, permitting significant quantitative analyses to be performed. A method of temperature-programmed gas-liquid partition chromatography has been developed for this purpose. It permits the quantitative determination of about 250 substances in a sample of breath, and of about 280 substances in a sample of urine vapor. The technique should be useful in the application of the principles of orthomolecular medicine.

  15. Quantitative fractography by digital image processing: NIH Image macro tools for stereo pair analysis and 3-D reconstruction.

    PubMed

    Hein, L R

    2001-10-01

    A set of NIH Image macro programs was developed to make qualitative and quantitative analyses from digital stereo pictures produced by scanning electron microscopes. These tools were designed for image alignment, anaglyph representation, animation, reconstruction of true elevation surfaces, reconstruction of elevation profiles, true-scale elevation mapping and, for the quantitative approach, surface area and roughness calculations. Limitations on time processing, scanning techniques and programming concepts are also discussed.
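
    For context, elevation reconstruction from SEM stereo pairs typically rests on the standard parallax relation; one common form (assuming a symmetric tilt of total angle $\alpha$ between the two images, with parallax $p$ measured at specimen scale) is:

    ```latex
    z = \frac{p}{2\sin(\alpha/2)}
    ```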

  16. Quantitation of influenza virus using field flow fractionation and multi-angle light scattering for quantifying influenza A particles

    PubMed Central

    Bousse, Tatiana; Shore, David A.; Goldsmith, Cynthia S.; Hossain, M. Jaber; Jang, Yunho; Davis, Charles T.; Donis, Ruben O.; Stevens, James

    2017-01-01

    Recent advances in instrumentation and data analysis in field flow fractionation and multi-angle light scattering (FFF-MALS) have enabled greater use of this technique to characterize and quantitate viruses. In this study, the FFF-MALS technique was applied to the characterization and quantitation of type A influenza virus particles to assess its usefulness for vaccine preparation. The use of FFF-MALS for quantitation and measurement of control particles provided data accurate to within 5% of known values, reproducible with a coefficient of variation of 1.9%. The method's sensitivity and limit of detection were established by analyzing different volumes of purified virus, which produced a linear regression with an R2 fit of 0.99. FFF-MALS was further applied to detect and quantitate influenza virus in the supernatant of infected MDCK cells and in the allantoic fluids of infected eggs. FFF fractograms of the virus present in these different fluids revealed similar distributions of monomeric and oligomeric virions; however, the monomer fraction of cell-grown virus showed greater size variety. Notably, β-propiolactone (BPL) inactivation of influenza viruses did not influence any of the FFF-MALS measurements. Quantitation analysis by FFF-MALS was compared to infectivity assays and real-time RT-PCR (qRT-PCR), and the limitations of each assay are discussed. PMID:23916678

  17. Quantitative surface topography assessment of directly compressed and roller compacted tablet cores using photometric stereo image analysis.

    PubMed

    Allesø, Morten; Holm, Per; Carstensen, Jens Michael; Holm, René

    2016-05-25

    Surface topography, in the context of surface smoothness/roughness, was investigated by the use of an image analysis technique, MultiRay™, related to photometric stereo, on different tablet batches manufactured either by direct compression or roller compaction. In the present study, oblique illumination of the tablet (darkfield) was considered and the area of cracks and pores in the surface was used as a measure of tablet surface topography; the higher a value, the rougher the surface. The investigations demonstrated a high precision of the proposed technique, which was able to rapidly (within milliseconds) and quantitatively measure the obtained surface topography of the produced tablets. Compaction history, in the form of applied roll force and tablet punch pressure, was also reflected in the measured smoothness of the tablet surfaces. Generally it was found that a higher degree of plastic deformation of the microcrystalline cellulose resulted in a smoother tablet surface. This altogether demonstrated that the technique provides the pharmaceutical developer with a reliable, quantitative response parameter for visual appearance of solid dosage forms, which may be used for process and ultimately product optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
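
    A toy version of this response parameter is sketched below (the threshold rule is an invented stand-in for the MultiRay processing, which is not detailed in the record):

    ```python
    import numpy as np

    def pore_area_fraction(darkfield):
        """Fraction of darkfield pixels classified as cracks/pores."""
        t = darkfield.mean() + 2.0 * darkfield.std()   # illustrative threshold
        return float((darkfield > t).mean())           # higher value = rougher surface
    ```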

  18. Using detailed inter-network simulation and model abstraction to investigate and evaluate joint battlespace infosphere (JBI) support technologies

    NASA Astrophysics Data System (ADS)

    Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.

    2004-08-01

    The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first-century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. A quantitative understanding of the individual and combined effects arising from the application of technologies within a framework is presently far too complex to achieve at more than a cursory depth. In order to facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help address JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high-level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.

  19. Man-machine analysis of translation and work tasks of Skylab films

    NASA Technical Reports Server (NTRS)

    Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.

    1979-01-01

    An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.

  20. Conceptual development and retention within the learning cycle

    NASA Astrophysics Data System (ADS)

    McWhirter, Lisa Jo

    1998-12-01

    This research was designed to achieve two goals: (1) examine concept development and retention within the learning cycle and (2) examine how students' concept development is mediated by classroom discussions and the students' small cooperative learning groups. Forty-eight sixth-grade students and one teacher at an urban middle school participated in the study. The research utilized both quantitative and qualitative analyses. Quantitative assessments included a concept mapping technique as well as teacher-generated multiple-choice tests. Preliminary quantitative analysis found that students' reading levels had an effect on students' pretest scores in both the concept mapping and the multiple-choice assessments; therefore, a covariate design was implemented for the quantitative analyses. When quantitative analysis techniques were used to examine concept development and retention, it was discovered that the students' concept knowledge increased significantly from the conclusion of the term-introduction phase to the conclusion of the expansion phase. These findings would indicate that all three phases of the learning cycle are necessary for conceptual development. However, quantitative analyses of concept maps indicated that this is not true for all students: individual students showed evidence of concept development and integration at each phase. Therefore, concept development is individualized, and all phases of the learning cycle are not necessary for all students. As a result, an individual's assimilation, disequilibration, accommodation and organization may not correlate with the phases of the learning cycle. Quantitative analysis also indicated a significant decrease in the retention of concepts over time. Qualitative analyses were used to examine how students' concept development was mediated by classroom discussions and the students' small cooperative learning groups. It was discovered that there was a correlation between teacher-student interaction, small-group interaction and concept mediation: students who had a high level of teacher-student dialogue, which utilized teacher-led discussions with integrated scaffolding techniques, were the same students who mediated the ideas within the small-group discussions. Those students whose teacher-student interactions consisted of dialogue with little positive teacher feedback made no contributions within the small group, regardless of their level of concept development.

  1. Calibration-free quantitative analysis of elemental ratios in intermetallic nanoalloys and nanocomposites using Laser Induced Breakdown Spectroscopy (LIBS).

    PubMed

    Davari, Seyyed Ali; Hu, Sheng; Mukherjee, Dibyendu

    2017-03-01

    Intermetallic nanoalloys (NAs) and nanocomposites (NCs) have increasingly gained prominence as efficient catalytic materials in electrochemical energy conversion and storage systems. But their morphology and chemical composition play a critical role in tuning their catalytic activities and precious metal contents. While advanced microscopy techniques facilitate morphological characterization, traditional chemical characterizations are either qualitative or extremely involved. In this study, we apply Laser Induced Breakdown Spectroscopy (LIBS) for quantitative compositional analysis of NAs and NCs synthesized with varied elemental ratios by our in-house built pulsed laser ablation technique. Specifically, elemental ratios of binary PtNi, PdCo (NAs) and PtCo (NCs) of different compositions are determined from LIBS measurements employing an internal calibration scheme that uses the bulk matrix species as internal standards. Morphology and qualitative elemental compositions of the aforesaid NAs and NCs are confirmed from Transmission Electron Microscopy (TEM) images and Energy Dispersive X-ray Spectroscopy (EDX) measurements. LIBS experiments are carried out in ambient conditions with the NA and NC samples drop-cast on silicon wafers after centrifugation to increase their concentrations. The technique does not call for cumbersome sample preparation, including acid digestion and external calibration standards, commonly required in Inductively Coupled Plasma-Optical Emission Spectroscopy (ICP-OES) techniques. Yet the quantitative LIBS results are in good agreement with the results from ICP-OES measurements. Our results indicate the feasibility of using LIBS in the future for rapid and in-situ quantitative chemical characterization of wide classes of synthesized NAs and NCs. Copyright © 2016 Elsevier B.V. All rights reserved.
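
    A bare-bones sketch of such an internal-standard ratio estimate follows (line choices, intensities and relative sensitivity factors are all invented; the authors' scheme is only summarized in the abstract):

    ```python
    # Analyte lines are normalized to a line of the bulk matrix (e.g. the Si
    # substrate) and converted to an atomic ratio via relative sensitivity
    # factors (RSFs) assumed known from a reference sample.
    I_pt, I_ni, I_si = 1.8e4, 7.5e3, 2.0e5    # integrated line intensities
    rsf_pt, rsf_ni = 1.3, 0.9                 # relative sensitivity factors

    ratio = (I_pt / I_si / rsf_pt) / (I_ni / I_si / rsf_ni)
    print(f"Pt:Ni atomic ratio ~ {ratio:.2f}")
    ```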

  2. The Application of Operations Research Techniques to the Evaluation of Military Management Information Systems.

    DTIC Science & Technology

    systems such as management information systems. To provide a methodology yielding quantitative results which may assist a commander and his staff in...this analysis, it is proposed that management information systems be evaluated as a whole by a technique defined as the semantic differential. Each

  3. Analysis of defect structure in silicon. Effect of grain boundary density on carrier mobility in UCP material

    NASA Technical Reports Server (NTRS)

    Dunn, J.; Stringfellow, G. B.; Natesh, R.

    1982-01-01

    The relationships between hole mobility and grain boundary density were studied. Mobility was measured using the van der Pauw technique, and grain boundary density was measured using a quantitative microscopy technique. Mobility was found to decrease with increasing grain boundary density.

  4. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  5. Linear Programming for Vocational Education Planning. Interim Report.

    ERIC Educational Resources Information Center

    Young, Robert C.; And Others

    The purpose of the paper is to define for potential users of vocational education management information systems a quantitative analysis technique and its utilization to facilitate more effective planning of vocational education programs. Defining linear programming (LP) as a management technique used to solve complex resource allocation problems…

  6. [Influence of sample surface roughness on mathematical model of NIR quantitative analysis of wood density].

    PubMed

    Huang, An-Min; Fei, Ben-Hua; Jiang, Ze-Hui; Hse, Chung-Yun

    2007-09-01

    Near infrared spectroscopy is widely used as a quantitative method, and the main multivariate techniques consist of regression methods used to build prediction models; however, the accuracy of the analysis results can be affected by many factors. In the present paper, the influence of different sample surface roughness on the mathematical model of NIR quantitative analysis of wood density was studied. The experimental results showed that if the roughness of the predicted samples was consistent with that of the calibration samples, the results were good; otherwise, the error would be much higher. The roughness-mixed model was more flexible and adaptable to different sample roughness, and its prediction ability was much better than that of the single-roughness model.

  7. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, Leon J.; Cremers, David A.

    1985-01-01

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is demonstrated. Significant shortening of analysis time is achieved from those of the usual chemical techniques of analysis.

  8. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, L.J.; Cremers, D.A.

    1982-09-07

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids are described. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is shown. Significant shortening of analysis time is achieved from the usual chemical techniques of analysis.

  9. [Applications of near-infrared spectroscopy to analysis of traditional Chinese herbal medicine].

    PubMed

    Li, Yan-Zhou; Min, Shun-Geng; Liu, Xia

    2008-07-01

    Analysis of traditional Chinese herbal medicine is of great importance to its quality control. Conventional analysis methods cannot meet the requirements of rapid and on-line analysis because they involve complex processing or require considerable experience. In recent years, the near-infrared spectroscopy technique has been used for rapid determination of active components, on-line quality control, identification of counterfeits, discrimination of the geographical origins of herbal medicines, and so on, owing to its advantages of simple pretreatment, high efficiency, and the convenience of solid diffuse-reflectance spectroscopy and fiber optics. In the present paper, the principles and methods of the near-infrared spectroscopy technique are introduced concisely. In particular, the applications of this technique to quantitative and qualitative analysis of traditional Chinese herbal medicine are reviewed.

  10. Laser-induced breakdown spectroscopy application in environmental monitoring of water quality: a review.

    PubMed

    Yu, Xiaodong; Li, Yang; Gu, Xiaofeng; Bao, Jiming; Yang, Huizhong; Sun, Li

    2014-12-01

    Water quality monitoring is a critical part of environmental management and protection, and the ability to qualitatively and quantitatively determine contamination and impurity levels in water is especially important. Compared to currently available water quality monitoring methods and techniques, laser-induced breakdown spectroscopy (LIBS) has several advantages, including no need for sample preparation, fast and easy operation, and a chemical-free process. Therefore, it is of great importance to understand the fundamentals of aqueous LIBS analysis and to effectively apply this technique to environmental monitoring. This article reviews the research conducted on LIBS analysis of liquid samples, covering LIBS theory, history and applications, quantitative analysis of metallic species in liquids, LIBS signal enhancement methods and data processing, characteristics of plasma generated by laser in water, and the factors affecting the accuracy of analysis results. Although many research works have focused on aqueous LIBS analysis, the detection limit and stability of this technique still need to be improved to satisfy the requirements of environmental monitoring standards. In addition, determination of nonmetallic species in liquids by LIBS is equally important and needs immediate attention from the community. This comprehensive review will assist readers to better understand the aqueous LIBS technique and help to identify current research needs for environmental monitoring of water quality.

  11. Inter-rater reliability of motor unit number estimates and quantitative motor unit analysis in the tibialis anterior muscle.

    PubMed

    Boe, S G; Dalton, B H; Harwood, B; Doherty, T J; Rice, C L

    2009-05-01

    To establish the inter-rater reliability of decomposition-based quantitative electromyography (DQEMG) derived motor unit number estimates (MUNEs) and quantitative motor unit (MU) analysis. Using DQEMG, two examiners independently obtained a sample of needle and surface-detected motor unit potentials (MUPs) from the tibialis anterior muscle from 10 subjects. Coupled with a maximal M wave, surface-detected MUPs were used to derive a MUNE for each subject and each examiner. Additionally, size-related parameters of the individual MUs were obtained following quantitative MUP analysis. Test-retest MUNE values were similar with high reliability observed between examiners (ICC=0.87). Additionally, MUNE variability from test-retest as quantified by a 95% confidence interval was relatively low (+/-28 MUs). Lastly, quantitative data pertaining to MU size, complexity and firing rate were similar between examiners. MUNEs and quantitative MU data can be obtained with high reliability by two independent examiners using DQEMG. Establishing the inter-rater reliability of MUNEs and quantitative MU analysis using DQEMG is central to the clinical applicability of the technique. In addition to assessing response to treatments over time, multiple clinicians may be involved in the longitudinal assessment of the MU pool of individuals with disorders of the central or peripheral nervous system.
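
    The core MUNE arithmetic is a simple division of whole-muscle response size by mean single-motor-unit size (the exact size parameter, e.g. negative-peak amplitude versus area, varies between protocols; the values below are illustrative):

    ```python
    import numpy as np

    m_wave_mV = 6.4                                 # maximal M wave
    smup_uV = np.array([42, 55, 61, 38, 70, 49])    # surface-detected MUP sizes

    mune = (m_wave_mV * 1000.0) / smup_uV.mean()    # both in microvolts
    print(f"MUNE ~ {mune:.0f} motor units")
    ```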

  12. Quantitative Image Analysis Techniques with High-Speed Schlieren Photography

    NASA Technical Reports Server (NTRS)

    Pollard, Victoria J.; Herron, Andrew J.

    2017-01-01

    Optical flow visualization techniques such as schlieren and shadowgraph photography are essential to understanding fluid flow when interpreting acquired wind tunnel test data. Output of the standard implementations of these visualization techniques in test facilities is often limited to qualitative interpretation of the resulting images. Although various quantitative optical techniques have been developed, these techniques often require special equipment or are focused on obtaining very precise and accurate data about the visualized flow; such systems are not practical in small, production wind tunnel test facilities. However, high-speed photography capability has become a common upgrade in many test facilities in order to better capture images of unsteady flow phenomena such as oscillating shocks and flow separation. This paper describes novel techniques utilized by the authors to analyze captured high-speed schlieren and shadowgraph imagery from wind tunnel testing for quantification of observed unsteady flow frequency content. Such techniques have applications in parametric geometry studies and in small facilities where more specialized equipment may not be available.
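
    A minimal version of this kind of analysis (assuming a high-speed image stack is available as an array; frame rate and pixel choices are illustrative) tracks grey-level fluctuations at a pixel and inspects their amplitude spectrum:

    ```python
    import numpy as np

    fps = 10_000                               # camera frame rate, assumed
    frames = np.random.rand(2048, 64, 64)      # stand-in image stack (time, y, x)

    signal = frames[:, 32, 40] - frames[:, 32, 40].mean()
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    dominant_hz = freqs[spectrum.argmax()]     # e.g. a shock-oscillation tone
    ```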

  13. A comparative quantitative analysis of the IDEAL (iterative decomposition of water and fat with echo asymmetry and least-squares estimation) and the CHESS (chemical shift selection suppression) techniques in 3.0 T L-spine MRI

    NASA Astrophysics Data System (ADS)

    Kim, Eng-Chan; Cho, Jae-Hwan; Kim, Min-Hye; Kim, Ki-Hong; Choi, Cheon-Woong; Seok, Jong-min; Na, Kil-Ju; Han, Man-Seok

    2013-03-01

    This study was conducted on 20 patients who had undergone pedicle screw fixation between March and December 2010 to quantitatively compare a conventional fat suppression technique, CHESS (chemical shift selective suppression), and a newer technique, IDEAL (iterative decomposition of water and fat with echo asymmetry and least-squares estimation). The general efficacy and usefulness of the IDEAL technique were also evaluated. Fat-suppressed transverse-relaxation-weighted and longitudinal-relaxation-weighted images were obtained before and after contrast injection using the two techniques on a 1.5 T MR (magnetic resonance) scanner. The obtained images were analyzed for image distortion, susceptibility artifacts and homogeneous fat removal in the target region. The results showed that image distortion due to the susceptibility artifacts caused by implanted metal was lower in the images obtained using the IDEAL technique than in those obtained using the CHESS technique. A comparative evaluation of the axial images before and after contrast injection likewise showed fewer susceptibility artifacts and more homogeneous fat removal with the IDEAL technique than with the CHESS technique. In summary, compared to the CHESS technique, the IDEAL technique showed fewer metal-induced susceptibility artifacts, lower image distortion, and more homogeneous fat removal.

  14. Generation of High-Quality SWATH® Acquisition Data for Label-free Quantitative Proteomics Studies Using TripleTOF® Mass Spectrometers

    PubMed Central

    Schilling, Birgit; Gibson, Bradford W.; Hunter, Christie L.

    2017-01-01

    Data-independent acquisition is a powerful mass spectrometry technique that enables comprehensive MS and MS/MS analysis of all detectable species, providing an information rich data file that can be mined deeply. Here, we describe how to acquire high-quality SWATH® Acquisition data to be used for large quantitative proteomic studies. We specifically focus on using variable sized Q1 windows for acquisition of MS/MS data for generating higher specificity quantitative data. PMID:28188533

  15. Quantitative determination and validation of octreotide acetate using 1 H-NMR spectroscopy with internal standard method.

    PubMed

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

    Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (proton exchange), in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including relaxation delay time, number of scans, and pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, as provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicated that there was no significant difference between these two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has potential for the quantitative analysis of other peptide drugs and for standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
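
    For context, the internal-standard relation underlying qNMR assays of this kind is standard (general form, not quoted from the paper): with $I$ the integrated signal, $N$ the number of protons behind that signal, $M$ the molar mass, $m$ the weighed mass and $P$ the purity, for analyte $a$ and internal standard $s$,

    ```latex
    P_a = \frac{I_a}{I_s}\cdot\frac{N_s}{N_a}\cdot\frac{M_a}{M_s}\cdot\frac{m_s}{m_a}\cdot P_s
    ```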

  16. Determination of cell metabolite VEGF₁₆₅ and dynamic analysis of protein-DNA interactions by combination of microfluidic technique and luminescent switch-on probe.

    PubMed

    Lin, Xuexia; Leung, Ka-Ho; Lin, Ling; Lin, Luyao; Lin, Sheng; Leung, Chung-Hang; Ma, Dik-Lung; Lin, Jin-Ming

    2016-05-15

    In this paper, we rationally design a novel G-quadruplex-selective luminescent iridium(III) complex for rapid detection of oligonucleotide and VEGF165 in microfluidics. This new probe is applied as a convenient biosensor for label-free quantitative analysis of VEGF165 protein from cell metabolism, as well as for studying the kinetics of the aptamer-protein interaction, in combination with a microfluidic platform. As a result, we have successfully established a quantitative analysis of VEGF165 from cell metabolism. Furthermore, based on the principles of hydrodynamic focusing and diffusive mixing, different transient states during the kinetic process were monitored and recorded. The combination of the microfluidic technique and the G-quadruplex luminescent probe can thus potentially be applied to studies of intramolecular interactions and molecular recognition in the future. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. The quantitative analysis of silicon carbide surface smoothing by Ar and Xe cluster ions

    NASA Astrophysics Data System (ADS)

    Ieshkin, A. E.; Kireev, D. S.; Ermakov, Yu. A.; Trifonov, A. S.; Presnov, D. E.; Garshev, A. V.; Anufriev, Yu. V.; Prokhorova, I. G.; Krupenin, V. A.; Chernysh, V. S.

    2018-04-01

    The gas cluster ion beam (GCIB) technique was used for smoothing of the silicon carbide crystal surface. The effects of processing with two inert cluster ion species, argon and xenon, were quantitatively compared. While argon is a standard element for GCIB, results for xenon clusters had not previously been reported. Scanning probe microscopy and high resolution transmission electron microscopy were used for the analysis of surface roughness and the quality of the surface crystal layer. Gas cluster ion beam processing smooths the surface relief down to an average roughness of about 1 nm for both elements. It was shown that xenon as the working gas is more effective: the sputtering rate for xenon clusters is 2.5 times higher than for argon at the same beam energy. High resolution transmission electron microscopy analysis of the surface defect layer gives values of 7 ± 2 nm and 8 ± 2 nm for treatment with argon and xenon clusters, respectively.

  18. Quantitative methods in fractography; Proceedings of the Symposium on Evaluation and Techniques in Fractography, Atlanta, GA, Nov. 10, 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strauss, B.M.; Putatunda, S.K.

    1990-01-01

    Papers are presented on the application of quantitative fractography and computed tomography to fracture processes in materials, the relationships between fractographic features and material toughness, the quantitative analysis of fracture surfaces using fractals, and the analysis and interpretation of aircraft component defects by means of quantitative fractography. Also discussed are the characteristics of hydrogen-assisted cracking measured by the holding-load and fractographic method, a fractographic study of isolated cleavage regions in nuclear pressure vessel steels and their weld metals, a fractographic and metallographic study of the initiation of brittle fracture in weldments, cracking mechanisms for mean stress/strain low-cycle multiaxial fatigue loadings, and corrosion fatigue crack arrest in Al alloys.

  19. [A comparison of convenience sampling and purposive sampling].

    PubMed

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation, not by statistical power analysis.

  20. The effects of AVIRIS atmospheric calibration methodology on identification and quantitative mapping of surface mineralogy, Drum Mountains, Utah

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Dwyer, John L.

    1993-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures reflected light in 224 contiguous spectral bands in the 0.4 to 2.45 micron region of the electromagnetic spectrum. Numerous studies have used these data for mineralogic identification and mapping based on the presence of diagnostic spectral features. Quantitative mapping requires conversion of the AVIRIS data to physical units (usually reflectance) so that analysis results can be compared and validated with field and laboratory measurements. This study evaluated two different techniques for calibrating AVIRIS data to ground reflectance, an empirically based method and an atmospheric-model-based method, to determine their effects on quantitative scientific analyses. Expert system analysis and linear spectral unmixing were applied to both calibrated data sets to determine the effect of the calibration on mineral identification and quantitative mapping results. Comparison of the image-map results and image reflectance spectra indicates that the model-calibrated data can be used with automated mapping techniques to produce accurate maps showing the spatial distribution and abundance of surface mineralogy. This has positive implications for future operational mapping using AVIRIS or similar imaging spectrometer data sets without requiring a priori knowledge.
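
    Of the two calibration routes compared, the empirical one is easy to sketch (this is the generic empirical-line method, one band shown; target DNs and reflectances are invented):

    ```python
    import numpy as np

    # Fit a per-band gain/offset from image DNs over a bright and a dark
    # field-calibration target of known reflectance, then apply to the scene.
    dn_bright, dn_dark = 812.0, 143.0      # image DNs over the two targets
    rho_bright, rho_dark = 0.55, 0.04      # field-measured reflectances

    gain = (rho_bright - rho_dark) / (dn_bright - dn_dark)
    offset = rho_dark - gain * dn_dark

    scene_dn = np.array([200.0, 450.0, 700.0])
    reflectance = gain * scene_dn + offset
    ```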

  1. Quantitative X-ray mapping, scatter diagrams and the generation of correction maps to obtain more information about your material

    NASA Astrophysics Data System (ADS)

    Wuhrer, R.; Moran, K.

    2014-03-01

    Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has been reduced considerably, and minor and trace elements can be mapped very accurately thanks to the larger detector areas and higher count rates of modern detectors. Live X-ray imaging can now be performed, with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps, including elemental relationship or scatter diagram creation, elemental ratio mapping, chemical phase mapping (CPM) and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical backscatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow users to predict and verify where problems are likely to occur in their images, and are especially helpful for examining possible interface artefacts. Post-processing techniques to improve the quantitation of X-ray map data and to achieve improved characterisation are covered in this paper.

  2. Multi-frequency local wavenumber analysis and ply correlation of delamination damage.

    PubMed

    Juarez, Peter D; Leckey, Cara A C

    2015-09-01

    Wavenumber domain analysis through use of scanning laser Doppler vibrometry has been shown to be effective for non-contact inspection of damage in composites. Qualitative and semi-quantitative local wavenumber analysis of realistic delamination damage and quantitative analysis of idealized damage scenarios (Teflon inserts) have been performed previously in the literature. This paper presents a new methodology based on multi-frequency local wavenumber analysis for quantitative assessment of multi-ply delamination damage in carbon fiber reinforced polymer (CFRP) composite specimens. The methodology is presented and applied to a real world damage scenario (impact damage in an aerospace CFRP composite). The methodology yields delamination size and also correlates local wavenumber results from multiple excitation frequencies to theoretical dispersion curves in order to robustly determine the delamination ply depth. Results from the wavenumber based technique are validated against a traditional nondestructive evaluation method. Published by Elsevier B.V.
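
    A minimal local-wavenumber estimator, for orientation only (this assumes a single-frequency wavefield snapshot from the vibrometer scan; window size and pixel pitch are illustrative, and the paper's estimator may differ):

    ```python
    import numpy as np

    def local_wavenumber(field, win=32, dx=1e-3):
        """Dominant |k| (rad/m) in each win x win tile of a wavefield snapshot."""
        k = 2 * np.pi * np.fft.fftfreq(win, d=dx)
        kmag = np.hypot(*np.meshgrid(k, k, indexing="ij"))
        taper = np.hanning(win)[:, None] * np.hanning(win)[None, :]
        out = np.zeros((field.shape[0] // win, field.shape[1] // win))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                tile = field[i*win:(i+1)*win, j*win:(j+1)*win]
                spec = np.abs(np.fft.fft2(tile * taper))
                spec[0, 0] = 0.0                      # drop the DC term
                out[i, j] = kmag.flat[spec.argmax()]  # delaminations raise |k|
        return out
    ```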

  3. Micromechanical thermogravimetry

    NASA Astrophysics Data System (ADS)

    Berger, R.; Lang, H. P.; Gerber, Ch.; Gimzewski, J. K.; Fabian, J. H.; Scandella, L.; Meyer, E.; Güntherodt, H.-J.

    1998-09-01

    We demonstrate a new method for thermal analysis of nanogram quantities of material using a micromechanical thermogravimetric technique. The cantilever-type device uses an integrated piezoresistor to sense bending and simultaneously to ramp the temperature and control temperature cycles. It has a mass resolution in the picogram range. A quantitative analysis of the dehydration of copper-sulfate-pentahydrate (CuSO 4·5H 2O) is presented. The technique outperforms current thermogravimetric approaches by five orders of magnitude.

  4. Advanced imaging techniques in brain tumors

    PubMed Central

    2009-01-01

    Perfusion, permeability and magnetic resonance spectroscopy (MRS) are now widely used in research and clinical settings. In the clinical setting, qualitative, semi-quantitative and quantitative approaches, ranging from review of color-coded maps to region-of-interest analysis and analysis of signal intensity curves, are being applied in practice. There are several pitfalls with all of these approaches. Some of these shortcomings are reviewed, such as the relatively low sensitivity of metabolite ratios from MRS and the effect of leakage on the appearance of color-coded maps from dynamic susceptibility contrast (DSC) magnetic resonance (MR) perfusion imaging, along with the correction and normalization methods that can be applied. Combining and applying these different imaging techniques in a multi-parametric, algorithmic fashion in the clinical setting can be shown to increase diagnostic specificity and confidence. PMID:19965287

  5. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly lower color variety in the medieval period. Interestingly, moreover, the increment of the roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877

  6. Continuous EEG monitoring in the intensive care unit.

    PubMed

    Scheuer, Mark L

    2002-01-01

    Continuous EEG (CEEG) monitoring allows uninterrupted assessment of cerebral cortical activity with good spatial resolution and excellent temporal resolution. Thus, this procedure provides a means of constantly assessing brain function in critically ill obtunded and comatose patients. Recent advances in digital EEG acquisition, storage, quantitative analysis, and transmission have made CEEG monitoring in the intensive care unit (ICU) technically feasible and useful. This article summarizes the indications and methodology of CEEG monitoring in the ICU, and discusses the role of some quantitative EEG analysis techniques in near real-time remote observation of CEEG recordings. Clinical examples of CEEG use, including monitoring of status epilepticus, assessment of ongoing therapy for treatment of seizures in critically ill patients, and monitoring for cerebral ischemia, are presented. Areas requiring further development of CEEG monitoring techniques and indications are discussed.

  7. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    PubMed

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often causes violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
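
    As one concrete instance of the pooling step such a meta-analysis involves, the classical DerSimonian-Laird random-effects estimator is sketched below (toy numbers; note the review's caveat that standard estimators like this one can misbehave when the contributing studies are small):

    ```python
    import numpy as np

    y = np.array([0.21, 0.35, 0.18, 0.40])      # per-study performance estimates
    v = np.array([0.010, 0.020, 0.008, 0.030])  # within-study variances

    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)          # Cochran's heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / c)     # between-study variance

    w_star = 1.0 / (v + tau2)
    y_pooled = np.sum(w_star * y) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    ```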

  8. Meta-analysis of the technical performance of an imaging procedure: Guidelines and statistical methodology

    PubMed Central

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2017-01-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often causes violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test–retest repeatability data for illustrative purposes. PMID:24872353

  9. Recent advances on multidimensional liquid chromatography-mass spectrometry for proteomics: from qualitative to quantitative analysis--a review.

    PubMed

    Wu, Qi; Yuan, Huiming; Zhang, Lihua; Zhang, Yukui

    2012-06-20

    With the acceleration of proteome research, increasing attention has been paid to multidimensional liquid chromatography-mass spectrometry (MDLC-MS) due to its high peak capacity and separation efficiency. Recently, many efforts have been made to improve MDLC-based strategies, including "top-down" and "bottom-up" approaches, to enable highly sensitive qualitative and quantitative analysis of proteins and to accelerate the whole analytical procedure. Integrated platforms combining sample pretreatment, multidimensional separations and identification were also developed to achieve high-throughput and sensitive detection of proteomes, facilitating highly accurate and reproducible quantification. This review summarizes the recent advances of such techniques and their applications in qualitative and quantitative analysis of proteomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Risk analysis for veterinary biologicals released into the environment.

    PubMed

    Silva, S V; Samagh, B S; Morley, R S

    1995-12-01

    All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach, which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.

  11. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.

  12. Ultrasound-guided injection for MR arthrography of the hip: comparison of two different techniques.

    PubMed

    Kantarci, Fatih; Ozbayrak, Mustafa; Gulsen, Fatih; Gencturk, Mert; Botanlioglu, Huseyin; Mihmanli, Ismail

    2013-01-01

    The purpose of this study was to prospectively evaluate the two different ultrasound-guided injection techniques for MR arthrography of the hip. Fifty-nine consecutive patients (21 men, 38 women) referred for MR arthrographies of the hip were prospectively included in the study. Three patients underwent bilateral MR arthrography. The two injection techniques were quantitatively and qualitatively compared. Quantitative analysis was performed by the comparison of injected contrast material volume into the hip joint. Qualitative analysis was performed with regard to extraarticular leakage of contrast material into the soft tissues. Extraarticular leakage of contrast material was graded as none, minimal, moderate, or severe according to the MR images. Each patient rated discomfort after the procedure using a visual analogue scale (VAS). The injected contrast material volume was less in femoral head puncture technique (mean 8.9 ± 3.4 ml) when compared to femoral neck puncture technique (mean 11.2 ± 2.9 ml) (p < 0.05). The chi-squared test showed significantly more contrast leakage by femoral head puncture technique (p < 0.05). Statistical analysis showed no difference between the head and neck puncture groups in terms of feeling of pain (p = 0.744) or in the body mass index (p = 0.658) of the patients. The femoral neck injection technique provides high intraarticular contrast volume and produces less extraarticular contrast leakage than the femoral head injection technique when US guidance is used for MR arthrography of the hip.

  13. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    PubMed

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear ideal for pharmaceutical quantitative MSI analysis. However, they are more challenging, as they involve almost no sample preparation and are more susceptible to ion suppression/enhancement. Herein, based on our previously developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilizes a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Tissue sections (heart, kidney, and brain) from rats administered CAT were then analyzed. The quantitative MSI results were cross-validated against LC-MS/MS data from the same tissues. Their consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Jiang, Dejun; Zhao, Shusen; Shen, Jingling

    2008-03-01

    A method was proposed to quantitatively inspect mixtures of illicit drugs with the terahertz time-domain spectroscopy technique. The mass percentages of all components in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, we first used common chemicals in the experiment: Benzophenone, Anthraquinone, Pyridoxine hydrochloride and L-Ascorbic acid. Then an illicit drug and a common adulterant, methamphetamine and flour, were selected for our experiment. Experimental results were in close agreement with the actual content, which suggests that this could be an effective method for quantitative identification of illicit drugs.
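
    The linear regression step lends itself to a compact sketch: given the absorption spectra of the pure components, the mixture spectrum is decomposed by least squares with a non-negativity constraint so the fitted weights stay physical. The spectra below are random placeholders for real THz-TDS absorbance data, assuming ideal Beer-Lambert additivity.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)
        n_freqs = 200

        # Columns of A: pure-component absorption spectra; in practice these
        # come from THz-TDS measurements of each pure substance.
        A = rng.random((n_freqs, 4))
        true_frac = np.array([0.4, 0.3, 0.2, 0.1])
        mixture = A @ true_frac                  # ideal linear mixing

        # Non-negative least squares keeps the fitted weights physical (>= 0).
        weights, _ = nnls(A, mixture)
        mass_percent = 100 * weights / weights.sum()
        print(mass_percent)                      # approximately [40, 30, 20, 10]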

  15. Improved sample preparation of glyphosate and methylphosphonic acid by EPA method 6800A and time-of-flight mass spectrometry using novel solid-phase extraction.

    PubMed

    Wagner, Rebecca; Wetzel, Stephanie J; Kern, John; Kingston, H M Skip

    2012-02-01

    The employment of chemical weapons by rogue states and/or terrorist organizations is an ongoing concern in the United States. The quantitative analysis of nerve agents must be rapid and reliable for use in the private and public sectors. Current methods describe a tedious and time-consuming derivatization for gas chromatography-mass spectrometry and liquid chromatography in tandem with mass spectrometry. Two solid-phase extraction (SPE) techniques for the analysis of glyphosate and methylphosphonic acid are described with the utilization of isotopically enriched analytes for quantitation via atmospheric pressure chemical ionization-quadrupole time-of-flight mass spectrometry (APCI-Q-TOF-MS) that does not require derivatization. Solid-phase extraction-isotope dilution mass spectrometry (SPE-IDMS) involves pre-equilibration of a naturally occurring sample with an isotopically enriched standard. The second extraction method, i-Spike, involves loading an isotopically enriched standard onto the SPE column before the naturally occurring sample. The sample and the spike are then co-eluted from the column enabling precise and accurate quantitation via IDMS. The SPE methods in conjunction with IDMS eliminate concerns of incomplete elution, matrix and sorbent effects, and MS drift. For accurate quantitation with IDMS, the isotopic contribution of all atoms in the target molecule must be statistically taken into account. This paper describes two newly developed sample preparation techniques for the analysis of nerve agent surrogates in drinking water as well as statistical probability analysis for proper molecular IDMS. The methods described in this paper demonstrate accurate molecular IDMS using APCI-Q-TOF-MS with limits of quantitation as low as 0.400 mg/kg for glyphosate and 0.031 mg/kg for methylphosphonic acid. Copyright © 2012 John Wiley & Sons, Ltd.
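
    The core IDMS calculation can be illustrated with the classic single-spike relation; this is a textbook form under ideal spike-sample equilibration, not necessarily the exact expression used by the authors.

        def idms_amount(n_spike_natural, r_spike, r_sample, r_blend):
            """Classic single-spike IDMS relation (a textbook form).

            The ratios r_* are labeled/natural isotopologue signal ratios for
            the pure spike, the unspiked sample, and the equilibrated blend.
            Returns the amount of natural analyte in the sample, in the same
            units as n_spike_natural (the natural-isotopologue content of the
            added spike)."""
            return n_spike_natural * (r_spike - r_blend) / (r_blend - r_sample)

        # Hypothetical numbers: a highly enriched spike (r = 50), a natural-
        # abundance sample (r = 0.01), and a measured blend ratio of 1.8.
        n_analyte = idms_amount(n_spike_natural=2.0e-9, r_spike=50.0,
                                r_sample=0.01, r_blend=1.8)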

  16. Use of a capillary electrophoresis instrument with laser-induced fluorescence detection for DNA quantitation. Comparison of YO-PRO-1 and PicoGreen assays.

    PubMed

    Guillo, Christelle; Ferrance, Jerome P; Landers, James P

    2006-04-28

    Highly selective and sensitive assays are required for detection and quantitation of the small masses of DNA typically encountered in clinical and forensic settings. High detection sensitivity is achieved using fluorescent labeling dyes and detection instruments such as spectrofluorometers, microplate readers and cytometers. This work describes the use of a laser-induced fluorescence (LIF) detector in conjunction with a commercial capillary electrophoresis instrument for DNA quantitation. PicoGreen and YO-PRO-1, two fluorescent DNA labeling dyes, were used to assess the potential of the system for routine DNA analysis. Linearity, reproducibility, sensitivity, limits of detection and quantitation, and sample stability were examined for the two assays. The LIF detector response was found to be linear (R2 > 0.999) and reproducible (RSD < 9%) in both cases. The PicoGreen assay displayed lower limits of detection and quantitation (20 pg and 60 pg, respectively) than the YO-PRO-1 assay (60 pg and 260 pg, respectively). Although a small variation in fluorescence was observed for the DNA/dye complexes over time, quantitation was not significantly affected and the solutions were found to be relatively stable for 80 min. The advantages of the technique include a 4- to 40-fold reduction in the volume of sample required compared to traditional assays, a 2- to 20-fold reduction in the volume of reagents consumed, fast and automated analysis, and low cost (no specific instrumentation required).
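
    Limits of detection and quantitation such as those reported here are commonly estimated from a calibration curve in the ICH style, as 3.3 and 10 times the residual standard deviation divided by the slope. The sketch below uses made-up calibration points, not the paper's data.

        import numpy as np

        # Hypothetical calibration data: DNA mass loaded (pg) vs. LIF signal.
        mass = np.array([0, 50, 100, 250, 500, 1000], dtype=float)
        sig = np.array([1.2, 10.9, 21.3, 52.0, 104.5, 209.8])

        slope, intercept = np.polyfit(mass, sig, 1)
        resid = sig - (slope * mass + intercept)
        sigma = resid.std(ddof=2)            # residual standard deviation

        lod = 3.3 * sigma / slope            # limit of detection
        loq = 10.0 * sigma / slope           # limit of quantitation
        r_squared = np.corrcoef(mass, sig)[0, 1] ** 2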

  17. Photogrammetry of the Human Brain: A Novel Method for Three-Dimensional Quantitative Exploration of the Structural Connectivity in Neurosurgery and Neurosciences.

    PubMed

    De Benedictis, Alessandro; Nocerino, Erica; Menna, Fabio; Remondino, Fabio; Barbareschi, Mattia; Rozzanigo, Umberto; Corsini, Francesco; Olivetti, Emanuele; Marras, Carlo Efisio; Chioffi, Franco; Avesani, Paolo; Sarubbo, Silvio

    2018-04-13

    Anatomic awareness of the structural connectivity of the brain is mandatory for neurosurgeons to select the most effective approaches for brain resections. Although standard microdissection is a validated technique to investigate the different white matter (WM) pathways and to verify the results of tractography, the possibility of interactive exploration of the specimens and reliable acquisition of quantitative information has not been described. Photogrammetry is a well-established technique allowing accurate metrology on highly defined three-dimensional (3D) models. The aim of this work is to propose the application of the photogrammetric technique to support 3D exploration and quantitative analysis of cerebral WM connectivity. The main perisylvian pathways, including the superior longitudinal fascicle and the arcuate fascicle, were exposed using the Klingler technique. Photogrammetric acquisition followed each dissection step. The point clouds were registered to a reference magnetic resonance image of the specimen. All the acquisitions were coregistered into an open-source model. We analyzed 5 steps, including the cortical surface, the short intergyral fibers, the indirect posterior and anterior superior longitudinal fascicle, and the arcuate fascicle. The coregistration between the magnetic resonance imaging mesh and the point cloud models was highly accurate. Multiple measures of distances between specific cortical landmarks and WM tracts were collected on the photogrammetric model. Photogrammetry allows an accurate 3D reproduction of WM anatomy and the acquisition of unlimited quantitative data directly on the real specimen during postdissection analysis. These results open many new promising neuroscientific and educational perspectives and may also optimize the quality of neurosurgical treatments. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    PubMed Central

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists’ goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists—as opposed to a completely automatic computer interpretation—focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous—from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. CAD research by medical physicists includes many aspects—collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137

  19. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-12-15

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists' goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists--as opposed to a completely automatic computer interpretation--focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous--from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. CAD research by medical physicists includes many aspects--collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more--from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  20. Principles, performance, and applications of spectral reconstitution (SR) in quantitative analysis of oils by Fourier transform infrared spectroscopy (FT-IR).

    PubMed

    García-González, Diego L; Sedman, Jacqueline; van de Voort, Frederik R

    2013-04-01

    Spectral reconstitution (SR) is a dilution technique developed to facilitate the rapid, automated, and quantitative analysis of viscous oil samples by Fourier transform infrared spectroscopy (FT-IR). This technique involves determining the dilution factor through measurement of an absorption band of a suitable spectral marker added to the diluent, then spectrally removing the diluent from the sample and multiplying the resulting spectrum to compensate for the effect of dilution on the band intensities. The facsimile spectrum of the neat oil thus obtained can then be qualitatively or quantitatively analyzed for the parameter(s) of interest. The quantitative performance of the SR technique was examined with two transition-metal carbonyl complexes as spectral markers, chromium hexacarbonyl and methylcyclopentadienyl manganese tricarbonyl. The estimation of the volume fraction (VF) of the diluent in a model system, consisting of canola oil diluted to various extents with odorless mineral spirits, served as the basis for assessment of these markers. The relationship between the VF estimates and the true volume fraction (VF(t)) was found to be strongly dependent on the dilution ratio and also depended, to a lesser extent, on the spectral resolution. These dependences are attributable to the effect of changes in matrix polarity on the bandwidth of the ν(CO) marker bands. Excellent VF(t) estimates were obtained by applying a polarity correction devised from a variance-spectrum-delineated correction equation. In the absence of such a correction, SR was shown to introduce only a minor and constant bias, provided that polarity differences among all the diluted samples analyzed were minimal. This bias can be built into the calibration of a quantitative FT-IR analytical method by subjecting appropriate calibration standards to the same SR procedure as the samples to be analyzed. The primary purpose of the SR technique is to simplify the preparation of diluted samples: only approximate proportions need be adhered to, rather than exact weights or volumes, with the marker accounting for minor variations. Additional applications discussed include the use of the SR technique in extraction-based, quantitative, automated FT-IR methods for the determination of moisture, acid number, and base number in lubricating oils, as well as of moisture content in edible oils.
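
    The SR arithmetic itself is compact: the marker band gives the diluent volume fraction, the diluent spectrum is subtracted, and the remainder is rescaled. A minimal sketch, assuming ideal Beer-Lambert additivity and omitting the polarity correction discussed above:

        import numpy as np

        def reconstitute(spectrum_diluted, spectrum_diluent, marker_abs,
                         marker_abs_pure_diluent):
            """Spectral reconstitution sketch: the marker band absorbance
            scales with the diluent volume fraction, so
            VF = A_marker(mixture) / A_marker(pure diluent). The diluent
            contribution is subtracted and the remainder rescaled to a
            facsimile of the neat oil spectrum."""
            vf = marker_abs / marker_abs_pure_diluent
            neat = (spectrum_diluted - vf * spectrum_diluent) / (1.0 - vf)
            return vf, neat

        # Synthetic demonstration with stand-in spectra.
        oil = np.linspace(0.2, 0.8, 100)
        diluent = np.full(100, 0.1)
        vf_true = 0.4
        mixture = (1 - vf_true) * oil + vf_true * diluent
        vf, facsimile = reconstitute(mixture, diluent,
                                     marker_abs=vf_true * 1.5,
                                     marker_abs_pure_diluent=1.5)
        assert np.allclose(facsimile, oil)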

  1. Thermal Nondestructive Characterization of Corrosion in Boiler Tubes by Application of a Moving Line Heat Source

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2000-01-01

    Wall thinning in utility boiler waterwall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proved to be very labor intensive and slow. This has resulted in a "spot check" approach to inspections, making thickness measurements over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing, followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source, coupled with this analysis technique, represents a significant improvement in inspection speed for large structures such as boiler waterwalls, while still providing high-resolution thickness measurements. A theoretical basis for the technique will be presented, demonstrating its quantitative nature. Further, results of laboratory experiments on flat panel specimens with fabricated material loss regions will be presented.

  2. The Evolution of 3D Microimaging Techniques in Geosciences

    NASA Astrophysics Data System (ADS)

    Sahagian, D.; Proussevitch, A.

    2009-05-01

    In the analysis of geomaterials, it is essential to be able to analyze internal structures on a quantitative basis. Techniques have evolved from rough qualitative methods to highly accurate quantitative methods coupled with 3-D numerical analysis. The earliest, primitive method for "seeing" what was inside a rock was multiple sectioning to produce a series of image slices. This technique typically completely destroyed the sample being analyzed. Another destructive method was developed to give more detailed quantitative information by forming plastic casts of internal voids in sedimentary and volcanic rocks. For this, voids were filled with plastic and the rock dissolved away with HF to reveal plastic casts of internal vesicles. Later, new approaches to stereology were developed to extract 3D information from 2D cross-sectional images. This has long been possible for spheres because the probability distribution for cutting a sphere along any small circle is known analytically (greatest probability is near the equator). However, large numbers of objects are required for statistical validity, and geomaterials are seldom spherical, so crystals, vesicles, and other inclusions would need a more sophisticated approach. Consequently, probability distributions were developed using numerical techniques for rectangular solids and various ellipsoids so that stereological techniques could be applied to these. The "holy grail" has always been to obtain 3D quantitative images non-destructively. A key method is Computed X-ray Tomography (CXT), in which attenuation of X-rays is recorded as a function of angular position in a cylindrical sample, providing a 2D "slice" of the interior. When a series of these "slices" is stacked (in increments equivalent to the resolution of the X-ray to make cubic voxels), a 3D image results with quantitative information regarding internal structure, particle/void volumes, nearest neighbors, coordination numbers, preferred orientations, etc. CXT can be done at three basic levels of resolution, with "normal" x-rays providing tens of microns resolution, synchrotron sources providing single to few microns, and emerging XuM techniques providing a practical 300 nm and theoretical 60 nm. The main challenges in CXT imaging have been in segmentation, which delineates material boundaries, and object recognition (registration), in which the individual objects within a material are identified. The former is critical in quantifying object volume, while the latter is essential for preventing the false appearance of individual objects as a continuous structure. Additional new techniques are now being developed to enhance resolution and provide more detailed analysis without the complex infrastructure needed for CXT. One such method is Laser Scanning Confocal Microscopy, in which a laser is reflected from individual interior surfaces of a fluorescing material, providing a series of sharp images of internal slices with quantitative information available, just as in x-ray tomography, after "z-stacking" of planes of pixels. Another novel approach is the use of Stereo Scanning Electron Microscopy to create digital elevation models of 3D surficial features such as partial bubble margins on the surfaces of fine volcanic ash particles. As other novel techniques emerge, new opportunities will be presented to the geological research community to obtain ever more detailed and accurate information regarding the interior structure of geomaterials.
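
    The spherical stereology result alluded to above can be stated exactly. For a sphere of radius R cut by a plane whose offset h from the center is uniform on [0, R], the section radius is r = sqrt(R^2 - h^2), and the density of observed section radii is, in LaTeX form,

        f(r) = \frac{r}{R\,\sqrt{R^{2}-r^{2}}}, \qquad 0 \le r < R,

    which diverges as r approaches R: most random sections are near-equatorial, exactly the "greatest probability near the equator" noted in the passage.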

  3. Change analysis in the United Arab Emirates: An investigation of techniques

    USGS Publications Warehouse

    Sohl, Terry L.

    1999-01-01

    Much of the landscape of the United Arab Emirates has been transformed over the past 15 years by massive afforestation, beautification, and agricultural programs. The "greening" of the United Arab Emirates has had environmental consequences, however, including degraded groundwater quality and possible damage to natural regional ecosystems. Personnel from the Ground-Water Research project, a joint effort between the National Drilling Company of the Abu Dhabi Emirate and the U.S. Geological Survey, were interested in studying landscape change in the Abu Dhabi Emirate using Landsat thematic mapper (TM) data. The EROS Data Center in Sioux Falls, South Dakota was asked to investigate land-cover change techniques that (1) provided locational, quantitative, and qualitative information on land-cover change within the Abu Dhabi Emirate; and (2) could be easily implemented by project personnel who were relatively inexperienced in remote sensing. A number of products were created with 1987 and 1996 Landsat TM data using change-detection techniques, including univariate image differencing, an "enhanced" image differencing, vegetation index differencing, post-classification differencing, and change-vector analysis. The different techniques provided products that varied in levels of adequacy according to the specific application and the ease of implementation and interpretation. Specific quantitative values of change were most accurately and easily provided by the enhanced image-differencing technique, while the change-vector analysis excelled at providing rich qualitative detail about the nature of a change.
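
    Univariate and vegetation-index differencing are straightforward to sketch. The "enhanced" variant below simply standardizes the difference image and thresholds it, which is an assumption on our part; the report does not spell out the EROS enhancement procedure, and the arrays are placeholders for coregistered Landsat TM bands.

        import numpy as np

        def enhanced_difference(band_t1, band_t2, k=2.0):
            """Univariate image differencing with a simple enhancement:
            standardize the difference image and flag pixels beyond k
            standard deviations as change (-1 loss, +1 gain, 0 no change)."""
            diff = band_t2.astype(float) - band_t1.astype(float)
            z = (diff - diff.mean()) / diff.std()
            return np.where(np.abs(z) > k, np.sign(z), 0)

        def ndvi(red, nir):
            """Normalized difference vegetation index."""
            return (nir - red) / (nir + red + 1e-9)

        # Vegetation index differencing between two dates (synthetic bands).
        red87, nir87 = np.random.rand(2, 64, 64)
        red96, nir96 = np.random.rand(2, 64, 64)
        veg_change = enhanced_difference(ndvi(red87, nir87), ndvi(red96, nir96))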

  4. CPTAC Accelerates Precision Proteomics Biomedical Research | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The accurate quantitation of proteins or peptides using Mass Spectrometry (MS) is gaining prominence in the biomedical research community as an alternative method for analyte measurement. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) investigators have been at the forefront in the promotion of reproducible MS techniques, through the development and application of standardized proteomic methods for protein quantitation on biologically relevant samples.

  5. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis.

    PubMed

    Lee, Hyokyeong; Moody-Davis, Asher; Saha, Utsab; Suzuki, Brian M; Asarnow, Daniel; Chen, Steven; Arkin, Michelle; Caffrey, Conor R; Singh, Rahul

    2012-01-01

    Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helmintic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helmintic diseases which together afflict a large part of humankind.
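
    The compare-and-cluster step can be sketched with off-the-shelf tools: z-normalize each phenotype time-series so clustering reflects shape rather than scale, compute pairwise distances, and cut a hierarchical tree. The series below are synthetic, and a dynamic-time-warping or other similarity measure, as the paper discusses, would slot in where the Euclidean metric appears.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        # Rows are per-parasite phenotype time-series (e.g., a motion index
        # sampled at fixed intervals after drug exposure); values are synthetic.
        rng = np.random.default_rng(1)
        responders = rng.normal(0.2, 0.05, size=(10, 48)).cumsum(axis=1)
        non_responders = rng.normal(0.0, 0.05, size=(10, 48)).cumsum(axis=1)
        series = np.vstack([responders, non_responders])

        # z-normalize each series so clustering reflects shape, not scale.
        z = (series - series.mean(axis=1, keepdims=True)) / series.std(
            axis=1, keepdims=True)

        # Pairwise distances and average-linkage clustering into two groups.
        dist = pdist(z, metric="euclidean")
        labels = fcluster(linkage(dist, method="average"),
                          t=2, criterion="maxclust")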

  6. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis

    PubMed Central

    2012-01-01

    Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helmintic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helmintic diseases which together afflict a large part of humankind. PMID:22369037

  7. The computer treatment of remotely sensed data: An introduction to techniques which have geologic applications. [Image enhancement and thematic classification in Brazil]

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Paradella, W. R.; Vitorello, I.

    1982-01-01

    Several aspects of computer-assisted analysis techniques for image enhancement and thematic classification, by which LANDSAT MSS imagery may be treated quantitatively, are explained. For geological applications, computer processing of digital data arguably allows the fullest use of LANDSAT data, by displaying enhanced and corrected data for visual analysis and by evaluating each pixel's spectral information and assigning it to a given class.

  8. Contrast-enhanced magnetic resonance imaging of pulmonary lesions: description of a technique aiming clinical practice.

    PubMed

    Koenigkam-Santos, Marcel; Optazaite, Elzbieta; Sommer, Gregor; Safi, Seyer; Heussel, Claus Peter; Kauczor, Hans-Ulrich; Puderbach, Michael

    2015-01-01

    To propose a technique for the evaluation of pulmonary lesions using contrast-enhanced MRI, to assess morphological patterns of enhancement, and to correlate quantitative analysis with histopathology. Thirty-six patients were prospectively studied. Volumetric interpolated T1W images were obtained during consecutive breath holds after bolus-triggered contrast injection. The volume coverage of the first three acquisitions was limited (for higher temporal resolution), and the last acquisition was obtained at the 4th minute. Two radiologists individually evaluated the patterns of enhancement. Region-of-interest-based signal intensity (SI)-time curves were created to assess quantitative parameters. Readers agreed moderately to substantially on the lesions' enhancement pattern. SI-time curves could be created for all lesions. Compared with benign lesions, malignant lesions showed higher values of maximum enhancement, early peak, slope and 4th-minute enhancement. An early peak >15% showed 100% sensitivity for detecting malignancy; maximum enhancement >40% showed 100% specificity. The proposed technique is robust, simple to perform and can be applied in the clinical setting. It allows visual evaluation of enhancement pattern and progression together with the creation of SI-time curves and assessment of derived quantitative parameters. Perfusion analysis was highly sensitive for detecting malignancy, in accordance with what is recommended by the most recent guidelines on imaging evaluation of pulmonary lesions. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
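
    The quantitative parameters derived from the SI-time curves can be sketched as follows. The definitions are plausible stand-ins (percent enhancement over a pre-contrast baseline); the paper's exact formulas are not reproduced here. By the thresholds above, an early peak over 15% would flag malignancy with high sensitivity.

        import numpy as np

        def enhancement_metrics(si, t, baseline_pts=1):
            """Compute ROI enhancement parameters from an SI-time curve:
            maximum enhancement, early peak (within the first minute),
            steepest wash-in slope, and enhancement at the 4th minute."""
            s0 = si[:baseline_pts].mean()                 # pre-contrast SI
            enh = 100.0 * (si - s0) / s0                  # percent enhancement
            max_enh = enh.max()
            early_peak = enh[t <= 60].max()
            slope = np.max(np.diff(enh) / np.diff(t))
            enh_4min = enh[np.argmin(np.abs(t - 240))]
            return max_enh, early_peak, slope, enh_4min

        t = np.array([0, 20, 40, 60, 240], dtype=float)   # acquisition times (s)
        si = np.array([100, 118, 126, 128, 135], dtype=float)
        print(enhancement_metrics(si, t))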

  9. Using Log Linear Analysis for Categorical Family Variables.

    ERIC Educational Resources Information Center

    Moen, Phyllis

    The Goodman technique of log linear analysis is ideal for family research, because it is designed for categorical (non-quantitative) variables. Variables are dichotomized (for example, married/divorced, childless/with children) or otherwise categorized (for example, level of permissiveness, life cycle stage). Contingency tables are then…

  10. Knowledge Management for the Analysis of Complex Experimentation.

    ERIC Educational Resources Information Center

    Maule, R.; Schacher, G.; Gallup, S.

    2002-01-01

    Describes a knowledge management system that was developed to help provide structure for dynamic and static data and to aid in the analysis of complex experimentation. Topics include quantitative and qualitative data; mining operations using artificial intelligence techniques; information architecture of the system; and transforming data into…

  11. Effects of dynamic diffraction conditions on magnetic parameter determination in a double perovskite Sr2FeMoO6 using electron energy-loss magnetic chiral dichroism.

    PubMed

    Wang, Z C; Zhong, X Y; Jin, L; Chen, X F; Moritomo, Y; Mayer, J

    2017-05-01

    Electron energy-loss magnetic chiral dichroism (EMCD) spectroscopy, which is similar to the well-established X-ray magnetic circular dichroism spectroscopy (XMCD), can determine the quantitative magnetic parameters of materials with high spatial resolution. One of the major obstacles in quantitative analysis using the EMCD technique is the relatively poor signal-to-noise ratio (SNR), compared to XMCD. Here, using the example of the double perovskite Sr2FeMoO6, we predicted optimal dynamical diffraction conditions, such as sample thickness, crystallographic orientation and detection aperture position, by theoretical simulations. Using the optimized conditions, we showed that the SNR of experimental EMCD spectra can be significantly improved and that the error of the quantitative magnetic parameters determined by the EMCD technique can be markedly reduced. Our results demonstrate that, with enhanced SNR, the EMCD technique can be a unique tool for understanding the structure-property relationships of magnetic materials, particularly in high-density magnetic recording and spintronic devices, by quantitatively determining magnetic structure and properties at the nanometer scale. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Melt-Flow Behaviours of Thermoplastic Materials under Fire Conditions: Recent Experimental Studies and Some Theoretical Approaches

    PubMed Central

    Joseph, Paul; Tretsiakova-McNally, Svetlana

    2015-01-01

    Polymeric materials often exhibit complex combustion behaviours encompassing several stages and involving solid phase, gas phase and interphase. A wide range of qualitative, semi-quantitative and quantitative testing techniques are currently available, both at the laboratory scale and for commercial purposes, for evaluating the decomposition and combustion behaviours of polymeric materials. They include, but are not limited to, techniques such as: thermo-gravimetric analysis (TGA), oxygen bomb calorimetry, limiting oxygen index measurements (LOI), Underwriters Laboratory 94 (UL-94) tests, cone calorimetry, etc. However, none of the above mentioned techniques are capable of quantitatively deciphering the underpinning physiochemical processes leading to the melt flow behaviour of thermoplastics. Melt-flow of polymeric materials can constitute a serious secondary hazard in fire scenarios, for example, if they are present as component parts of a ceiling in an enclosure. In recent years, more quantitative attempts to measure the mass loss and melt-drip behaviour of some commercially important chain- and step-growth polymers have been accomplished. The present article focuses, primarily, on the experimental and some theoretical aspects of melt-flow behaviours of thermoplastics under heat/fire conditions. PMID:28793746

  13. Pulmonary nodule characterization, including computer analysis and quantitative features.

    PubMed

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  14. Melt-Flow Behaviours of Thermoplastic Materials under Fire Conditions: Recent Experimental Studies and Some Theoretical Approaches.

    PubMed

    Joseph, Paul; Tretsiakova-McNally, Svetlana

    2015-12-15

    Polymeric materials often exhibit complex combustion behaviours encompassing several stages and involving solid phase, gas phase and interphase. A wide range of qualitative, semi-quantitative and quantitative testing techniques are currently available, both at the laboratory scale and for commercial purposes, for evaluating the decomposition and combustion behaviours of polymeric materials. They include, but are not limited to, techniques such as: thermo-gravimetric analysis (TGA), oxygen bomb calorimetry, limiting oxygen index measurements (LOI), Underwriters Laboratory 94 (UL-94) tests, cone calorimetry, etc. However, none of the above mentioned techniques are capable of quantitatively deciphering the underpinning physiochemical processes leading to the melt flow behaviour of thermoplastics. Melt-flow of polymeric materials can constitute a serious secondary hazard in fire scenarios, for example, if they are present as component parts of a ceiling in an enclosure. In recent years, more quantitative attempts to measure the mass loss and melt-drip behaviour of some commercially important chain- and step-growth polymers have been accomplished. The present article focuses, primarily, on the experimental and some theoretical aspects of melt-flow behaviours of thermoplastics under heat/fire conditions.

  15. Quantitative kinetic analysis of lung nodules by temporal subtraction technique in dynamic chest radiography with a flat panel detector

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Yuichiro; Kodera, Yoshie; Tanaka, Rie; Sanada, Shigeru

    2007-03-01

    Early detection and treatment of lung cancer is one of the most effective means of reducing cancer mortality; chest X-ray radiography has been widely used as a screening examination or health checkup. A new examination method and the development of a computer analysis system make it possible to obtain respiratory kinetics with a flat panel detector (FPD), an extension of conventional chest X-ray radiography. With these advances, functional evaluation of respiratory kinetics in the chest has become available, and its introduction into clinical practice is expected. In this study, we developed a computer analysis algorithm to detect lung nodules and to evaluate their quantitative kinetics. Breathing chest radiographs obtained with a modified FPD were converted into four static feature images by sequential temporal subtraction processing, morphologic enhancement processing, kinetic visualization processing, and lung region detection processing, after breath synchronization based on diaphragmatic analysis of the motion vector. An artificial neural network analyzing the density patterns detected the true nodules in these static images and traced their kinetic tracks. In an evaluation of algorithm performance and clinical effectiveness with 7 normal patients and simulated nodules, the method showed sufficient detection capability and kinetic imaging function, with no statistically significant difference between them. Our technique can quantitatively evaluate the kinetic range of nodules and is effective in detecting nodules on breathing chest radiographs. Moreover, the application of this technique is expected to extend computer-aided diagnosis systems and facilitate the development of an automatic planning system for radiation therapy.

  16. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

    Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a 6-order-of-magnitude range, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
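
    Quantitation via regression of standard curves reduces to a short calculation: fit the natural-to-SIS (NAT/SIS) peak-area ratio against known concentrations, then invert the fit for patient samples. The values below are synthetic, and the simple unweighted linear fit is an assumption; weighted regression is common in practice.

        import numpy as np

        # Hypothetical standard-curve data for one peptide: NAT concentration
        # relative to a fixed SIS amount, and the measured NAT/SIS area ratio.
        conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])    # fmol/uL
        area_ratio = np.array([0.011, 0.052, 0.098, 0.51, 1.02, 4.9])

        slope, intercept = np.polyfit(conc, area_ratio, 1)

        def quantify(sample_ratio):
            """Interpolate a sample's NAT/SIS ratio back to concentration."""
            return (sample_ratio - intercept) / slope

        # Back-calculated concentration for a measured ratio of 0.75.
        print(quantify(0.75))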

  17. Qualitative and quantitative comparison of geostatistical techniques of porosity prediction from the seismic and logging data: a case study from the Blackfoot Field, Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Maurya, S. P.; Singh, K. H.; Singh, N. P.

    2018-05-01

    In the present study, three recently developed geostatistical methods, single-attribute analysis, multi-attribute analysis and a probabilistic neural network algorithm, have been used to predict porosity in the inter-well region of the Blackfoot oil field, Alberta, Canada. These techniques make use of seismic attributes generated by model-based inversion and colored inversion techniques. The principal objective of the study is to find a suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in a 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well-log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate, higher-resolution porosity sections. A low-impedance (6000-8000 m/s g/cc) and high-porosity (> 15%) zone is interpreted from the inverted impedance and porosity sections, respectively, in the 1060-1075 ms time interval and is characterized as a reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network along with model-based inversion is the most efficient for predicting porosity in the inter-well region.
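
    The multi-attribute analysis, in its linear form, is ordinary least squares from seismic attributes to well-log porosity; the probabilistic neural network variant is not shown. A sketch with synthetic attributes (real inputs would be attributes extracted from the inverted volumes at well locations):

        import numpy as np

        rng = np.random.default_rng(2)
        attrs = rng.normal(size=(60, 3))          # e.g., impedance, amplitude, frequency
        coeffs_true = np.array([-4.0, 1.5, 0.7])
        porosity = 15.0 + attrs @ coeffs_true + rng.normal(0, 0.5, 60)

        # Solve for weights mapping attributes to well-log porosity,
        # then apply them to attribute values away from the wells.
        A = np.column_stack([np.ones(len(attrs)), attrs])
        w, *_ = np.linalg.lstsq(A, porosity, rcond=None)

        new_attrs = rng.normal(size=(5, 3))       # attributes between wells
        pred = np.column_stack([np.ones(5), new_attrs]) @ w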

  18. Breast-Lesion Characterization using Textural Features of Quantitative Ultrasound Parametric Maps.

    PubMed

    Sadeghi-Naini, Ali; Suraweera, Harini; Tran, William Tyler; Hadizad, Farnoosh; Bruni, Giancarlo; Rastegar, Rashin Fallah; Curpen, Belinda; Czarnota, Gregory J

    2017-10-20

    This study evaluated, for the first time, the efficacy of quantitative ultrasound (QUS) spectral parametric maps in conjunction with texture-analysis techniques to differentiate benign from malignant breast lesions non-invasively. Ultrasound B-mode images and radiofrequency data were acquired from 78 patients with suspicious breast lesions. QUS spectral-analysis techniques were performed on the radiofrequency data to generate parametric maps of mid-band fit, spectral slope, spectral intercept, spacing among scatterers, average scatterer diameter, and average acoustic concentration. Texture-analysis techniques were applied to determine imaging biomarkers consisting of mean, contrast, correlation, energy and homogeneity features of the parametric maps. These biomarkers were utilized to classify benign versus malignant lesions with leave-one-patient-out cross-validation. Results were compared to histopathology findings from biopsy specimens and radiology reports on MR images to evaluate the accuracy of the technique. Among the biomarkers investigated, one mean-value parameter and 14 textural features demonstrated statistically significant differences (p < 0.05) between the two lesion types. A hybrid biomarker developed using a stepwise feature selection method could classify the lesions with a sensitivity of 96%, a specificity of 84%, and an AUC of 0.97. Findings from this study pave the way towards adapting novel QUS-based frameworks for breast cancer screening and rapid diagnosis in the clinic.
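
    The textural features named above (contrast, correlation, energy, homogeneity) are standard gray-level co-occurrence matrix (GLCM) statistics and can be computed with scikit-image; note that versions before 0.19 spell the functions greycomatrix/greycoprops. The ROI below is random stand-in data, and quantizing a QUS parametric map to discrete gray levels is an assumption.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        # A QUS parametric map (e.g., mid-band fit) quantized to 64 levels;
        # random values stand in for a real lesion ROI.
        rng = np.random.default_rng(3)
        roi = rng.integers(0, 64, size=(64, 64), dtype=np.uint8)

        # Co-occurrence matrix at distance 1 over four directions.
        glcm = graycomatrix(roi, distances=[1],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=64, symmetric=True, normed=True)

        features = {prop: graycoprops(glcm, prop).mean()
                    for prop in ("contrast", "correlation",
                                 "energy", "homogeneity")}
        features["mean"] = roi.mean()    # the mean-value parameter also used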

  19. Quantitative analysis of aircraft multispectral-scanner data and mapping of water-quality parameters in the James River in Virginia

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.; Bahn, G. S.

    1977-01-01

    Statistical analysis techniques were applied to develop quantitative relationships between in situ river measurements and the remotely sensed data that were obtained over the James River in Virginia on 28 May 1974. The remotely sensed data were collected with a multispectral scanner and with photographs taken from an aircraft platform. Concentration differences among water quality parameters such as suspended sediment, chlorophyll a, and nutrients indicated significant spectral variations. Calibrated equations from the multiple regression analysis were used to develop maps that indicated the quantitative distributions of water quality parameters and the dispersion characteristics of a pollutant plume entering the turbid river system. Results from further analyses that use only three preselected multispectral scanner bands of data indicated that regression coefficients and standard errors of estimate were not appreciably degraded compared with results from the 10-band analysis.

  20. To label or not to label: applications of quantitative proteomics in neuroscience research.

    PubMed

    Filiou, Michaela D; Martins-de-Souza, Daniel; Guest, Paul C; Bahn, Sabine; Turck, Christoph W

    2012-02-01

    Proteomics has provided researchers with a sophisticated toolbox of labeling-based and label-free quantitative methods. These are now being applied in neuroscience research where they have already contributed to the elucidation of fundamental mechanisms and the discovery of candidate biomarkers. In this review, we evaluate and compare labeling-based and label-free quantitative proteomic techniques for applications in neuroscience research. We discuss the considerations required for the analysis of brain and central nervous system specimens, the experimental design of quantitative proteomic workflows as well as the feasibility, advantages, and disadvantages of the available techniques for neuroscience-oriented questions. Furthermore, we assess the use of labeled standards as internal controls for comparative studies in humans and review applications of labeling-based and label-free mass spectrometry approaches in relevant model organisms and human subjects. Providing a comprehensive guide of feasible and meaningful quantitative proteomic methodologies for neuroscience research is crucial not only for overcoming current limitations but also for gaining useful insights into brain function and translating proteomics from bench to bedside. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Accurate Quantitation and Analysis of Nitrofuran Metabolites, Chloramphenicol, and Florfenicol in Seafood by Ultrahigh-Performance Liquid Chromatography-Tandem Mass Spectrometry: Method Validation and Regulatory Samples.

    PubMed

    Aldeek, Fadi; Hsieh, Kevin C; Ugochukwu, Obiadada N; Gerard, Ghislain; Hammack, Walter

    2018-05-23

    We developed and validated a method for the extraction, identification, and quantitation of four nitrofuran metabolites, 3-amino-2-oxazolidinone (AOZ), 3-amino-5-morpholinomethyl-2-oxazolidinone (AMOZ), semicarbazide (SC), and 1-aminohydantoin (AHD), as well as chloramphenicol and florfenicol in a variety of seafood commodities. Samples were extracted by liquid-liquid extraction techniques, analyzed by ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), and quantitated using commercially sourced, derivatized nitrofuran metabolites, with their isotopically labeled internal standards in-solvent. We obtained recoveries of 90-100% at various fortification levels. The limit of detection (LOD) was set at 0.25 ng/g for AMOZ and AOZ, 1 ng/g for AHD and SC, and 0.1 ng/g for the phenicols. Various extraction methods, standard stability, derivatization efficiency, and improvements to conventional quantitation techniques were also investigated. We successfully applied this method to the identification and quantitation of nitrofuran metabolites and phenicols in 102 imported seafood products. Our results revealed that four of the samples contained residues from banned veterinary drugs.

  2. A Dimensionally Reduced Clustering Methodology for Heterogeneous Occupational Medicine Data Mining.

    PubMed

    Saâdaoui, Foued; Bertrand, Pierre R; Boudet, Gil; Rouffiac, Karine; Dutheil, Frédéric; Chamoux, Alain

    2015-10-01

    Clustering is a set of statistical learning techniques aimed at partitioning heterogeneous data into homogeneous groups called clusters. There are several fields in which clustering has been successfully applied, such as medicine, biology, finance, and economics. In this paper, we introduce the notion of clustering in multifactorial data analysis problems. A case study is conducted for an occupational medicine problem with the purpose of analyzing patterns in a population of 813 individuals. To reduce the dimensionality of the data set, we base our approach on Principal Component Analysis (PCA), the statistical tool most commonly used in factorial analysis. However, problems in nature, especially in medicine, are often based on heterogeneous qualitative-quantitative measurements, whereas PCA only processes quantitative ones. Besides, qualitative data are originally unobservable quantitative responses that are usually binary-coded. Hence, we propose a new set of strategies allowing quantitative and qualitative data to be handled simultaneously. The principle of this approach is to perform a projection of the qualitative variables onto the subspaces spanned by quantitative ones. Subsequently, an optimal model is allocated to the resulting PCA-regressed subspaces.
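
    One plausible reading of the projection strategy is to run PCA on the quantitative block and then regress each binary-coded qualitative variable on the component scores, i.e., project it onto the subspace spanned by the quantitative variables. The sketch below makes that reading explicit with synthetic data; it is not the authors' exact algorithm.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(4)
        X_quant = rng.normal(size=(813, 6))       # quantitative measurements
        # A binary-coded qualitative variable correlated with the first column.
        y_qual = (X_quant[:, 0] + rng.normal(0, 1, 813) > 0).astype(float)

        # PCA on the quantitative block only, as in standard factorial analysis.
        pca = PCA(n_components=3)
        scores = pca.fit_transform(X_quant)

        # Project the qualitative variable onto the quantitative subspace by
        # regressing it on the PCA scores.
        proj = LinearRegression().fit(scores, y_qual)
        y_in_subspace = proj.predict(scores)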

  3. Quantitative analysis of pork and chicken products by droplet digital PCR.

    PubMed

    Cai, Yicun; Li, Xiang; Lv, Rong; Yang, Jielin; Li, Jian; He, Yuping; Pan, Liangwen

    2014-01-01

    In this project, a highly precise quantitative method based on the digital polymerase chain reaction (dPCR) technique was developed to determine the weight of pork and chicken in meat products. Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of species-specific DNAs in meat products. However, it is limited by amplification efficiency and its reliance on standard-curve-based Ct values when detecting and quantifying low-copy-number target DNA, as in some complex mixed meat products. Using the dPCR method, we found that the relationships between raw meat weight and DNA weight, and between DNA weight and DNA copy number, were both close to linear. This enabled us to establish formulae to calculate the raw meat weight from the DNA copy number. The accuracy and applicability of this method were tested and verified using samples of pork and chicken powder mixed in known proportions. Quantitative analysis indicated that dPCR is highly precise in quantifying pork and chicken in meat products and therefore has the potential to be used in routine analysis by government regulators and quality control departments of commercial food and feed enterprises.
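
    The back-calculation chains the two near-linear relationships the paper reports: copies per ng of DNA, and ng of extractable DNA per mg of raw meat. The coefficients in this sketch are illustrative placeholders, not the calibrated values from the study.

        def meat_weight_mg(copies, copies_per_ng_dna, ng_dna_per_mg_meat):
            """Back-calculate raw meat weight from a dPCR copy number via
            the two near-linear relationships (illustrative coefficients)."""
            ng_dna = copies / copies_per_ng_dna
            return ng_dna / ng_dna_per_mg_meat

        pork_mg = meat_weight_mg(copies=12400, copies_per_ng_dna=310.0,
                                 ng_dna_per_mg_meat=2.5)
        chicken_mg = meat_weight_mg(copies=8100, copies_per_ng_dna=290.0,
                                    ng_dna_per_mg_meat=2.2)
        pork_fraction = pork_mg / (pork_mg + chicken_mg)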

  4. Sub-band denoising and spline curve fitting method for hemodynamic measurement in perfusion MRI

    NASA Astrophysics Data System (ADS)

    Lin, Hong-Dun; Huang, Hsiao-Ling; Hsu, Yuan-Yu; Chen, Chi-Chen; Chen, Ing-Yi; Wu, Liang-Chi; Liu, Ren-Shyan; Lin, Kang-Ping

    2003-05-01

    In clinical research, non-invasive MR perfusion imaging is capable of investigating brain perfusion phenomena via various hemodynamic measurements, such as cerebral blood volume (CBV), cerebral blood flow (CBF), and mean transit time (MTT). These hemodynamic parameters are useful in diagnosing brain disorders such as stroke, infarction and peri-infarct ischemia by further semi-quantitative analysis. However, the accuracy of quantitative analysis is usually degraded by poor signal-to-noise image quality. In this paper, we propose a hemodynamic measurement method based upon sub-band denoising and spline curve fitting processes to improve image quality and thereby obtain better hemodynamic quantitative analysis results. Ten sets of perfusion MRI data and corresponding PET images were used to validate the performance. For quantitative comparison, we evaluated the gray/white matter CBF ratio. The semi-quantitative analysis yielded a mean gray-to-white matter CBF ratio of 2.10 +/- 0.34. The ratio evaluated for brain tissues in perfusion MRI is comparable to that obtained with the PET technique, with less than 1% difference on average. Furthermore, the method features excellent noise reduction and boundary preservation in image processing, and a short hemodynamic measurement time.
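
    The semi-quantitative hemodynamic step can be sketched with the central volume principle: relative CBV is proportional to the area under the tissue concentration curve normalized by the arterial input function (AIF), and MTT = CBV/CBF. The curves below are synthetic, and the peak-based CBF is a crude surrogate; rigorous estimation deconvolves the tissue curve with the AIF. A gray/white matter CBF ratio would be the ratio of two such ROI estimates.

        import numpy as np

        def auc(y, x):
            """Trapezoidal area under the curve."""
            return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

        # Synthetic gamma-variate-shaped contrast concentration curves for a
        # tissue ROI and an arterial input function (AIF), arbitrary units.
        t = np.linspace(0.0, 60.0, 121)
        aif = (t / 8.0) ** 3 * np.exp(-t / 4.0)
        tissue = 0.3 * (t / 10.0) ** 3 * np.exp(-t / 5.0)

        cbv = auc(tissue, t) / auc(aif, t)   # relative cerebral blood volume
        cbf = tissue.max() / auc(aif, t)     # crude peak-based flow surrogate
        mtt = cbv / cbf                      # central volume principle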

  5. Atlas of computerized blood flow analysis in bone disease.

    PubMed

    Gandsman, E J; Deutsch, S D; Tyson, I B

    1983-11-01

    The role of computerized blood flow analysis in routine bone scanning is reviewed. Cases illustrating the technique include proven diagnoses of toxic synovitis, Legg-Perthes disease, arthritis, avascular necrosis of the hip, fractures, benign and malignant tumors, Paget's disease, cellulitis, osteomyelitis, and shin splints. Several examples also show the use of the technique in monitoring treatment. The use of quantitative data from the blood flow, bone uptake phase, and static images suggests specific diagnostic patterns for each of the diseases presented in this atlas. Thus, this technique enables increased accuracy in the interpretation of the radionuclide bone scan.

  6. Spectral Analysis and Experimental Modeling of Ice Accretion Roughness

    NASA Technical Reports Server (NTRS)

    Orr, D. J.; Breuer, K. S.; Torres, B. E.; Hansman, R. J., Jr.

    1996-01-01

    A self-consistent scheme for relating wind tunnel ice accretion roughness to the resulting enhancement of heat transfer is described. First, a spectral technique of quantitative analysis of early ice roughness images is reviewed. The image processing scheme uses a spectral estimation technique (SET) which extracts physically descriptive parameters by comparing scan lines from the experimentally-obtained accretion images to a prescribed test function. Analyses using this technique in both the streamwise and spanwise directions, applied to data from the NASA Lewis Icing Research Tunnel (IRT), are presented. An experimental technique is then presented for constructing physical roughness models suitable for wind tunnel testing that match the SET parameters extracted from the IRT images. The icing castings and modeled roughness are tested for enhancement of boundary layer heat transfer using infrared techniques in a "dry" wind tunnel.

  7. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2009-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer... pilot study to utilize large-scale parallel Grid computing to harness the nationwide cluster infrastructure for optimization of medical image... analysis parameters. Additionally, we investigated the use of cutting-edge data analysis/mining techniques as applied to Ultrasound, FFDM, and DCE-MRI Breast

  8. An optimized color transformation for the analysis of digital images of hematoxylin & eosin stained slides.

    PubMed

    Zarella, Mark D; Breen, David E; Plagov, Andrei; Garcia, Fernando U

    2015-01-01

    Hematoxylin and eosin (H&E) staining is ubiquitous in pathology practice and research. As digital pathology has evolved, reliance on quantitative methods that make use of H&E images has similarly expanded. For example, cell counting and nuclear morphometry rely on the accurate demarcation of nuclei from other structures and each other. One of the major obstacles to quantitative analysis of H&E images is the high degree of variability observed between different samples and different laboratories. In an effort to characterize this variability, as well as to provide a substrate that can potentially mitigate this factor in quantitative image analysis, we developed a technique to project H&E images into an optimized space more appropriate for many image analysis procedures. We used a decision tree-based support vector machine learning algorithm to classify 44 H&E stained whole slide images of resected breast tumors according to the histological structures that are present. This procedure takes an H&E image as an input and produces a classification map of the image that predicts the likelihood of a pixel belonging to any one of a set of user-defined structures (e.g., cytoplasm, stroma). By reducing these maps into their constituent pixels in color space, an optimal reference vector is obtained for each structure, which identifies the color attributes that maximally distinguish one structure from other elements in the image. We show that tissue structures can be identified using this semi-automated technique. By comparing structure centroids across different images, we obtained a quantitative depiction of H&E variability for each structure. This measurement can potentially be utilized in the laboratory to help calibrate daily staining or identify troublesome slides. Moreover, by aligning reference vectors derived from this technique, images can be transformed in a way that standardizes their color properties and makes them more amenable to image processing.
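
    A bare-bones sketch of the reference-vector idea (the per-pixel "classifier" and colors here are stand-ins; the paper uses a decision tree-based SVM on whole slide images):

      # Given per-pixel structure labels, take each structure's centroid in
      # color space; the normalized difference is a reference vector that
      # maximally separates the two structures along one image channel.
      import numpy as np

      rng = np.random.default_rng(2)
      pixels = rng.random((10000, 3))                     # RGB values in [0, 1]
      labels = (pixels[:, 0] > pixels[:, 2]).astype(int)  # stand-in classification map

      centroid_a = pixels[labels == 0].mean(axis=0)       # e.g. cytoplasm
      centroid_b = pixels[labels == 1].mean(axis=0)       # e.g. stroma
      ref = centroid_b - centroid_a
      ref /= np.linalg.norm(ref)                          # optimal reference vector

      channel = pixels @ ref   # projection emphasizing the a/b distinction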

  9. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A summary of a new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is described. Results of the analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship of concentration and fluorescence units is provided for use with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).

  10. Civil infrastructure monitoring for IVHS using optical fiber sensors

    NASA Astrophysics Data System (ADS)

    de Vries, Marten J.; Arya, Vivek; Grinder, C. R.; Murphy, Kent A.; Claus, Richard O.

    1995-01-01

    Early deployment of Intelligent Vehicle Highway Systems would necessitate the internal instrumentation of infrastructure for emergency preparedness. Existing quantitative analysis and visual analysis techniques are time consuming, cost prohibitive, and are often unreliable. Fiber optic sensors are rapidly replacing conventional instrumentation because of their small size, light weight, immunity to electromagnetic interference, and extremely high information carrying capability. In this paper research on novel optical fiber sensing techniques for health monitoring of civil infrastructure such as highways and bridges is reported. Design, fabrication, and implementation of fiber optic sensor configurations used for measurements of strain are discussed. Results from field tests conducted to demonstrate the effectiveness of fiber sensors at determining quantitative strain vector components near crack locations in bridges are presented. Emerging applications of fiber sensors for vehicle flow, vehicle speed, and weigh-in-motion measurements are also discussed.

  11. Quantitative Hyperspectral Reflectance Imaging

    PubMed Central

    Klein, Marvin E.; Aalderink, Bernard J.; Padoan, Roberto; de Bruin, Gerrit; Steemers, Ted A.G.

    2008-01-01

    Hyperspectral imaging is a non-destructive optical analysis technique that can for instance be used to obtain information from cultural heritage objects unavailable with conventional colour or multi-spectral photography. This technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation and study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). By using a wavelength tunable narrow-bandwidth light-source, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document, where different types of ink had been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects of artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms. PMID:27873831

  12. Quantitative body fluid proteomics in medicine - A focus on minimal invasiveness.

    PubMed

    Csősz, Éva; Kalló, Gergő; Márkus, Bernadett; Deák, Eszter; Csutak, Adrienne; Tőzsér, József

    2017-02-05

    Identification of new biomarkers specific for various pathological conditions is an important field in medical sciences. Body fluids have emerging potential in biomarker studies especially those which are continuously available and can be collected by non-invasive means. Changes in the protein composition of body fluids such as tears, saliva, sweat, etc. may provide information on both local and systemic conditions of medical relevance. In this review, our aim is to discuss the quantitative proteomics techniques used in biomarker studies, and to present advances in quantitative body fluid proteomics of non-invasively collectable body fluids with relevance to biomarker identification. The advantages and limitations of the widely used quantitative proteomics techniques are also presented. Based on the reviewed literature, we suggest an ideal pipeline for body fluid analyses aiming at biomarkers discoveries: starting from identification of biomarker candidates by shotgun quantitative proteomics or protein arrays, through verification of potential biomarkers by targeted mass spectrometry, to the antibody-based validation of biomarkers. The importance of body fluids as a rich source of biomarkers is discussed. Quantitative proteomics is a challenging part of proteomics applications. The body fluids collected by non-invasive means have high relevance in medicine; they are good sources for biomarkers used in establishing the diagnosis, follow up of disease progression and predicting high risk groups. The review presents the most widely used quantitative proteomics techniques in body fluid analysis and lists the potential biomarkers identified in tears, saliva, sweat, nasal mucus and urine for local and systemic diseases.

  13. Simulations of motor unit number estimation techniques

    NASA Astrophysics Data System (ADS)

    Major, Lora A.; Jones, Kelvin E.

    2005-06-01

    Motor unit number estimation (MUNE) is an electrodiagnostic procedure used to evaluate the number of motor axons connected to a muscle. All MUNE techniques rely on assumptions that must be fulfilled to produce a valid estimate. As there is no gold standard to compare the MUNE techniques against, we have developed a model of the relevant neuromuscular physiology and have used this model to simulate various MUNE techniques. The model allows for a quantitative analysis of candidate MUNE techniques that will hopefully contribute to consensus regarding a standard procedure for performing MUNE.

  14. Advances in Surface Plasmon Resonance Imaging allowing for quantitative measurement of laterally heterogeneous samples

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2012-02-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to materials on metallic surfaces for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases -- uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allow for quantitative kinetic measurements of laterally heterogeneous systems. The degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  15. A Quantitative Three-Dimensional Image Analysis Tool for Maximal Acquisition of Spatial Heterogeneity Data.

    PubMed

    Allenby, Mark C; Misener, Ruth; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2017-02-01

    Three-dimensional (3D) imaging techniques provide spatial insight into environmental and cellular interactions and are implemented in various fields, including tissue engineering, but have been restricted by limited quantification tools that misrepresent or underutilize the cellular phenomena captured. This study develops image postprocessing algorithms pairing complex Euclidean metrics with Monte Carlo simulations to quantitatively assess cell and microenvironment spatial distributions while utilizing, for the first time, the entire 3D image captured. Although current methods only analyze a central fraction of presented confocal microscopy images, the proposed algorithms can utilize 210% more cells to calculate 3D spatial distributions that can span a 23-fold longer distance. These algorithms seek to leverage the high sample cost of 3D tissue imaging techniques by extracting maximal quantitative data throughout the captured image.
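
    A small sketch of the Monte Carlo comparison idea (the geometry, cell count, and uniform null model are assumptions for illustration, not the paper's algorithm):

      # Compare observed nearest-neighbour distances in the full 3-D stack
      # against a Monte Carlo null of uniformly random positions.
      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(3)
      box = np.array([200.0, 200.0, 50.0])             # imaged volume (micrometers)
      cells = rng.random((500, 3)) * box               # observed cell centroids (toy)

      def mean_nn_distance(points):
          d, _ = cKDTree(points).query(points, k=2)    # k=2 skips the self-distance
          return d[:, 1].mean()

      observed = mean_nn_distance(cells)
      null = np.mean([mean_nn_distance(rng.random((500, 3)) * box)
                      for _ in range(100)])            # Monte Carlo reference
      clustering_index = observed / null               # < 1 suggests clustering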

  16. Analysis of artifacts suggests DGGE should not be used for quantitative diversity analysis.

    PubMed

    Neilson, Julia W; Jordan, Fiona L; Maier, Raina M

    2013-03-01

    PCR-denaturing gradient gel electrophoresis (PCR-DGGE) is widely used in microbial ecology for the analysis of comparative community structure. However, artifacts generated during PCR-DGGE of mixed template communities impede the application of this technique to quantitative analysis of community diversity. The objective of the current study was to employ an artificial bacterial community to document and analyze artifacts associated with multiband signatures and preferential template amplification and to highlight their impacts on the use of this technique for quantitative diversity analysis. Six bacterial species (three Betaproteobacteria, two Alphaproteobacteria, and one Firmicutes) were amplified individually and in combinations with primers targeting the V7/V8 region of the 16S rRNA gene. Two of the six isolates produced multiband profiles demonstrating that band number does not correlate directly with α-diversity. Analysis of the multiple bands from one of these isolates confirmed that both bands had identical sequences which lead to the hypothesis that the multiband pattern resulted from two distinct structural conformations of the same amplicon. In addition, consistent preferential amplification was demonstrated following pairwise amplifications of the six isolates. DGGE and real time PCR analysis identified primer mismatch and PCR inhibition due to 16S rDNA secondary structure as the most probable causes of preferential amplification patterns. Reproducible DGGE community profiles generated in this study confirm that PCR-DGGE provides an excellent high-throughput tool for comparative community structure analysis, but that method-specific artifacts preclude its use for accurate comparative diversity analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Metabolic Mapping: Quantitative Enzyme Cytochemistry and Histochemistry to Determine the Activity of Dehydrogenases in Cells and Tissues.

    PubMed

    Molenaar, Remco J; Khurshed, Mohammed; Hira, Vashendriya V V; Van Noorden, Cornelis J F

    2018-05-26

    Altered cellular metabolism is a hallmark of many diseases, including cancer, cardiovascular diseases and infection. The metabolic motor units of cells are enzymes and their activity is heavily regulated at many levels, including the transcriptional, mRNA stability, translational, post-translational and functional level. This complex regulation means that conventional quantitative or imaging assays, such as quantitative mRNA experiments, Western Blots and immunohistochemistry, yield incomplete information regarding the ultimate activity of enzymes, their function and/or their subcellular localization. Quantitative enzyme cytochemistry and histochemistry (i.e., metabolic mapping) show in-depth information on in situ enzymatic activity and its kinetics, function and subcellular localization in an almost true-to-nature situation. We describe a protocol to detect the activity of dehydrogenases, which are enzymes that perform redox reactions to reduce cofactors such as NAD(P)+ and FAD. Cells and tissue sections are incubated in a medium that is specific for the enzymatic activity of one dehydrogenase. Subsequently, the dehydrogenase that is the subject of investigation performs its enzymatic activity in its subcellular site. In a chemical reaction with the reaction medium, this ultimately generates blue-colored formazan at the site of the dehydrogenase's activity. The formazan's absorbance is therefore a direct measure of the dehydrogenase's activity and can be quantified using monochromatic light microscopy and image analysis. The quantitative aspect of this protocol enables researchers to draw statistical conclusions from these assays. Besides observational studies, this technique can be used for inhibition studies of specific enzymes. In this context, studies benefit from the true-to-nature advantages of metabolic mapping, giving in situ results that may be physiologically more relevant than in vitro enzyme inhibition studies. In all, metabolic mapping is an indispensable technique to study metabolism at the cellular or tissue level. The technique is easy to adopt, provides in-depth, comprehensive and integrated metabolic information and enables rapid quantitative analysis.
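
    A minimal sketch of the quantification step (the blank intensity and image below are synthetic; the absorbance-to-activity scaling is assay-specific and not specified here):

      # Convert a monochromatic transmission image of the formazan stain to
      # absorbance; mean absorbance is proportional to dehydrogenase activity.
      import numpy as np

      rng = np.random.default_rng(4)
      I = np.clip(rng.random((64, 64)), 0.05, 1.0)   # transmitted intensity image
      I0 = 1.0                                       # blank (no section) intensity

      absorbance = -np.log10(I / I0)                 # A = -log10(I / I0)
      mean_activity = absorbance.mean()              # activity over a region of interest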

  18. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented.-from Authors

  19. The Intercultural Component in Textbooks for Teaching a Service Technical Writing Course

    ERIC Educational Resources Information Center

    Matveeva, Natalia

    2007-01-01

    This research article investigates new developments in the representation of the intercultural component in textbooks for a service technical writing course. Through textual analysis, using quantitative and qualitative techniques, I report discourse analysis of 15 technical writing textbooks published during 1993-2006. The theoretical and…

  20. The National Health Educator Job Analysis 2010: Process and Outcomes

    ERIC Educational Resources Information Center

    Doyle, Eva I.; Caro, Carla M.; Lysoby, Linda; Auld, M. Elaine; Smith, Becky J.; Muenzen, Patricia M.

    2012-01-01

    The National Health Educator Job Analysis 2010 was conducted to update the competencies model for entry- and advanced-level health educators. Qualitative and quantitative methods were used. Structured interviews, focus groups, and a modified Delphi technique were implemented to engage 59 health educators from diverse work settings and experience…

  1. Beginning Learners' Development of Interactional Competence: Alignment Activity

    ERIC Educational Resources Information Center

    Tecedor, Marta

    2016-01-01

    This study examined the development of interactional competence (Hall, 1993; He & Young, 1998) by beginning learners of Spanish as indexed by their use of alignment moves. Discourse analysis techniques and quantitative data analysis were used to explore how 52 learners expressed alignment and changes in participation patterns in two sets of…

  2. Approaches to the Analysis of School Costs, an Introduction.

    ERIC Educational Resources Information Center

    Payzant, Thomas

    A review and general discussion of quantitative and qualitative techniques for the analysis of economic problems outside of education is presented to help educators discover new tools for planning, allocating, and evaluating educational resources. The pamphlet covers some major components of cost accounting, cost effectiveness, cost-benefit…

  3. Missing Value Monitoring Enhances the Robustness in Proteomics Quantitation.

    PubMed

    Matafora, Vittoria; Corno, Andrea; Ciliberto, Andrea; Bachi, Angela

    2017-04-07

    In global proteomic analysis, it is estimated that proteins span from millions to less than 100 copies per cell. The challenge of protein quantitation by classic shotgun proteomic techniques lies in the missing values for peptides belonging to low-abundance proteins, which lower run-to-run reproducibility and affect downstream statistical analysis. Here, we present a new analytical workflow, MvM (missing value monitoring), able to recover quantitation of missing values generated by shotgun analysis. In particular, we used confident data-dependent acquisition (DDA) quantitation only for proteins measured in all the runs, while we filled the missing values with data-independent acquisition analysis using the library previously generated in DDA. We analyzed cell cycle regulated proteins, as they are low abundance proteins with highly dynamic expression levels. Indeed, we found that cell cycle related proteins are the major components of the missing values-rich proteome. Using the MvM workflow, we doubled the number of robustly quantified cell cycle related proteins, and we reduced the number of missing values, achieving robust quantitation for proteins over ∼50 molecules per cell. MvM allows lower quantification variance among replicates for low abundance proteins with respect to DDA analysis, which demonstrates the potential of this novel workflow to measure low abundance, dynamically regulated proteins.
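
    A schematic of the fill-in step (toy tables and values; the actual workflow first builds the DIA library from the DDA runs, which is not reproduced here):

      # Keep DDA quantitation where it is complete across runs; fill the
      # remaining missing values from the matched DIA analysis.
      import numpy as np
      import pandas as pd

      dda = pd.DataFrame({"run1": [5.1, np.nan, 7.3], "run2": [5.0, 2.2, np.nan]},
                         index=["protA", "protB", "protC"])
      dia = pd.DataFrame({"run1": [5.2, 2.4, 7.1], "run2": [5.1, 2.3, 7.0]},
                         index=["protA", "protB", "protC"])

      confident = dda.dropna()     # proteins quantified by DDA in every run
      recovered = dda.fillna(dia)  # missing values monitored and filled from DIA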

  4. A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.

    2015-03-01

    The purpose of this study is to quantitatively assess myocardial perfusion by the first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed as able to assess haemodynamically significant coronary artery stenosis, their limitations are recognized, and there is still a need to develop new techniques. Experiments were performed upon five (5) closed-chest swine. Balloon catheters were placed into the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the colored microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated from the acquired CT images. Our study shows that HUDRCT has a good (y = 0.07245 + 0.09963x, r^2 = 0.898) correlation with MBF and FFR. In receiver operating characteristic (ROC) curve analyses, HUDRCT provides excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, indicated by an Area Under the Curve (AUC) of 0.927. HUDRCT has the potential to be developed as a useful indicator for quantitative assessment of myocardial perfusion.
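
    A toy version of the reported ROC analysis (all numbers are synthetic, and the sign convention for the score is an assumption):

      # Correlate a CT density ratio with FFR and summarize its power to
      # detect FFR-defined ischemia by the area under the ROC curve.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(5)
      ffr = np.linspace(0.55, 0.95, 40)                    # invasive reference standard
      hudr = 0.07 + 0.10 * ffr + rng.normal(0, 0.01, 40)   # toy ratio tracking FFR

      ischemic = (ffr < 0.80).astype(int)   # FFR < 0.80: significant ischemia
      auc = roc_auc_score(ischemic, -hudr)  # lower ratio -> more ischemic in this toy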

  5. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

    Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for the decision making of global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, specific core fields and institutes, and outstanding academic status and benefit. Although not considered in this study, patent data are another important indicator that should be integrated into the relevant research in the future.

  6. Extinction measurement of dense media by an optical coherence tomography technique

    NASA Astrophysics Data System (ADS)

    Ago, Tomoki; Iwai, Toshiaki; Yokota, Ryoko

    2016-10-01

    Optical coherence tomography (OCT) is progressing toward its next stage as a spectroscopic analysis technique. Spectroscopic analysis is based on the Beer-Lambert law. The absorption and scattering coefficients can be measured by the Beer-Lambert law even for dense media, because OCT detects only the light that keeps its coherence, having propagated rectilinearly and been retro-reflected from scatterers. This study concerns the quantitative verification of the Beer-Lambert law in OCT imaging.
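
    For reference, the single-scattering form of the Beer-Lambert law usually assumed in OCT, written in LaTeX (round-trip attenuation of the coherent signal from depth z; the notation is assumed, not taken from the abstract):

      % Round-trip Beer-Lambert attenuation of the detected OCT signal
      I(z) = I_0 \, e^{-2 \mu_t z}, \qquad \mu_t = \mu_a + \mu_s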

  7. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    PubMed

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  8. High speed digital holographic interferometry for hypersonic flow visualization

    NASA Astrophysics Data System (ADS)

    Hegde, G. M.; Jagdeesh, G.; Reddy, K. P. J.

    2013-06-01

    Optical imaging techniques have played a major role in understanding the flow dynamics of a variety of fluid flows, particularly in the study of hypersonic flows. Schlieren and shadowgraph techniques have been the flow diagnostic tools for the investigation of compressible flows for more than a century. However, these techniques provide only qualitative information about the flow field. Other optical techniques such as holographic interferometry and laser induced fluorescence (LIF) have been used extensively for extracting quantitative information about high-speed flows. In this paper we present the application of the digital holographic interferometry (DHI) technique, integrated with a short-duration hypersonic shock tunnel facility having a 1 ms test time, for quantitative flow visualization. The dynamics of the flow fields at hypersonic/supersonic speeds around different test models are visualized with DHI using a high-speed digital camera (0.2 million fps). These visualization results are compared with schlieren visualization and CFD simulation results. Fringe analysis is carried out to estimate the density of the flow field.

  9. Bayesian aggregation versus majority vote in the characterization of non-specific arm pain based on quantitative needle electromyography

    PubMed Central

    2010-01-01

    Background: Methods for the calculation and application of quantitative electromyographic (EMG) statistics for the characterization of EMG data detected from forearm muscles of individuals with and without pain associated with repetitive strain injury are presented. Methods: A classification procedure using a multi-stage application of Bayesian inference is presented that characterizes a set of motor unit potentials acquired using needle electromyography. The utility of this technique in characterizing EMG data obtained from both normal individuals and those presenting with symptoms of "non-specific arm pain" is explored and validated. The efficacy of the Bayesian technique is compared with simple voting methods. Results: The aggregate Bayesian classifier presented is found to perform with accuracy equivalent to that of majority voting on the test data, with an overall accuracy greater than 0.85. Theoretical foundations of the technique are discussed, and are related to the observations found. Conclusions: Aggregation of motor unit potential conditional probability distributions estimated using quantitative electromyographic analysis may be successfully used to perform electrodiagnostic characterization of "non-specific arm pain." It is expected that these techniques will also be able to be applied to other types of electrodiagnostic data. PMID:20156353
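
    A compact contrast of the two aggregation rules (synthetic per-motor-unit posteriors; the paper's multi-stage Bayesian scheme is richer than this naive-Bayes-style sketch):

      # Aggregate per-motor-unit-potential disease probabilities two ways:
      # simple majority vote versus summed log-odds (naive Bayes pooling).
      import numpy as np

      rng = np.random.default_rng(6)
      p = rng.uniform(0.3, 0.9, size=25)       # P(disease | each MUP), toy values

      majority_vote = int((p > 0.5).mean() > 0.5)

      log_odds = np.log(p / (1.0 - p)).sum()   # aggregated evidence across MUPs
      bayes_call = int(log_odds > 0.0)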

  10. Improving membrane based multiplex immunoassays for semi-quantitative detection of multiple cytokines in a single sample

    PubMed Central

    2014-01-01

    Background: Inflammatory mediators can serve as biomarkers for the monitoring of the disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the level of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified using the LI-COR® detection system (fluorescence-based) rather than chemiluminescence, and semi-quantitative outcomes were achieved by normalizing the outcomes using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results: The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity up to 17 pg/ml and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions: The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily to the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints in sample sizes and/or budget. PMID:25022797

  11. Practical no-gold-standard evaluation framework for quantitative imaging methods: application to lesion segmentation in positron emission tomography

    PubMed Central

    Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.

    2017-01-01

    Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-Fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883

  12. Quantitative analysis of PEG-functionalized colloidal gold nanoparticles using charged aerosol detection.

    PubMed

    Smith, Mackensie C; Crist, Rachael M; Clogston, Jeffrey D; McNeil, Scott E

    2015-05-01

    Surface characteristics of a nanoparticle, such as functionalization with polyethylene glycol (PEG), are critical to understand and achieve optimal biocompatibility. Routine physicochemical characterization such as UV-vis spectroscopy (for gold nanoparticles), dynamic light scattering, and zeta potential are commonly used to assess the presence of PEG. However, these techniques are merely qualitative and are not sensitive enough to distinguish differences in PEG quantity, density, or presentation. As an alternative, two methods are described here which allow for quantitative measurement of PEG on PEGylated gold nanoparticles. The first, a displacement method, utilizes dithiothreitol to displace PEG from the gold surface. The dithiothreitol-coated gold nanoparticles are separated from the mixture via centrifugation, and the excess dithiothreitol and dissociated PEG are separated through reversed-phase high-performance liquid chromatography (RP-HPLC). The second, a dissolution method, utilizes potassium cyanide to dissolve the gold nanoparticles and liberate PEG. Excess CN-, Au(CN)2-, and free PEG are separated using RP-HPLC. In both techniques, the free PEG can be quantified against a standard curve using charged aerosol detection. The displacement and dissolution methods are validated here using 2-, 5-, 10-, and 20-kDa PEGylated 30-nm colloidal gold nanoparticles. Further value in these techniques is demonstrated not only by quantitating the total PEG fraction but also by their adaptability to quantitating the free unbound PEG and the bound PEG fractions. This is an important distinction, as differences in the bound and unbound PEG fractions can affect biocompatibility, which would not be detected in techniques that only quantitate the total PEG fraction.
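
    A sketch of quantitation against the standard curve (the numbers are invented; a simple linear fit is shown for illustration, though charged aerosol detector response is often fitted on a log-log scale):

      # Fit detector response versus known PEG amounts, then invert the
      # curve to quantify PEG liberated from the nanoparticle sample.
      import numpy as np

      amount = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # micrograms of PEG standard
      area = np.array([12.0, 23.5, 49.0, 97.0, 201.0])   # CAD peak areas (toy values)

      slope, intercept = np.polyfit(amount, area, 1)     # linear standard curve
      unknown_amount = (70.0 - intercept) / slope        # sample with peak area 70.0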

  13. Molecular and Cellular Quantitative Microscopy: theoretical investigations, technological developments and applications to neurobiology

    NASA Astrophysics Data System (ADS)

    Esposito, Alessandro

    2006-05-01

    This PhD project aims at the development and evaluation of microscopy techniques for the quantitative detection of molecular interactions and cellular features. The primarily investigated techniques are Förster Resonance Energy Transfer imaging and Fluorescence Lifetime Imaging Microscopy. These techniques have the capability to quantitatively probe the biochemical environment of fluorophores. An automated microscope capable of unsupervised operation has been developed that enables the investigation of molecular and cellular properties at high throughput levels and the analysis of cellular heterogeneity. State-of-the-art Förster Resonance Energy Transfer imaging, Fluorescence Lifetime Imaging Microscopy, Confocal Laser Scanning Microscopy and the newly developed tools have been combined with cellular and molecular biology techniques for the investigation of protein-protein interactions, oligomerization and post-translational modifications of α-Synuclein and Tau, two proteins involved in Parkinson’s and Alzheimer’s disease, respectively. The high inter-disciplinarity of this project required the merging of the expertise of both the Molecular Biophysics Group at the Debye Institute - Utrecht University and the Cell Biophysics Group at the European Neuroscience Institute - Göttingen University. This project was conducted also with the support and the collaboration of the Center for the Molecular Physiology of the Brain (Göttingen), particularly with the groups associated with the Molecular Quantitative Microscopy and Parkinson’s Disease and Aggregopathies areas. This work demonstrates that molecular and cellular quantitative microscopy can be used in combination with high-throughput screening as a powerful tool for the investigation of the molecular mechanisms of complex biological phenomena like those occurring in neurodegenerative diseases.

  14. Quantified Energy Dissipation Rates in the Terrestrial Bow Shock. 1.; Analysis Techniques and Methodology

    NASA Technical Reports Server (NTRS)

    Wilson, L. B., III; Sibeck, D. G.; Breneman, A.W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

    2014-01-01

    We present a detailed outline and discussion of the analysis techniques used to compare the relevance of different energy dissipation mechanisms at collisionless shock waves. We show that the low-frequency, quasi-static fields contribute less to ohmic energy dissipation, (-j · E ) (minus current density times measured electric field), than their high-frequency counterparts. In fact, we found that high-frequency, large-amplitude (greater than 100 millivolts per meter and/or greater than 1 nanotesla) waves are ubiquitous in the transition region of collisionless shocks. We quantitatively show that their fields, through wave-particle interactions, cause enough energy dissipation to regulate the global structure of collisionless shocks. The purpose of this paper, part one of two, is to outline and describe in detail the background, analysis techniques, and theoretical motivation for our new results presented in the companion paper. The companion paper presents the results of our quantitative energy dissipation rate estimates and discusses the implications. Together, the two manuscripts present the first study quantifying the contribution that high-frequency waves provide, through wave-particle interactions, to the total energy dissipation budget of collisionless shock waves.

  15. Improving the geological interpretation of magnetic and gravity satellite anomalies

    NASA Technical Reports Server (NTRS)

    Hinze, William J.; Braile, Lawrence W.; Vonfrese, Ralph R. B.

    1987-01-01

    Quantitative analysis of the geologic component of observed satellite magnetic and gravity fields requires accurate isolation of the geologic component of the observations, theoretically sound and viable inversion techniques, and integration of collateral, constraining geologic and geophysical data. A number of significant contributions were made which make quantitative analysis more accurate. These include procedures for: screening and processing orbital data for lithospheric signals based on signal repeatability and wavelength analysis; producing accurate gridded anomaly values at constant elevations from the orbital data by three-dimensional least squares collocation; increasing the stability of equivalent point source inversion and criteria for the selection of the optimum damping parameter; enhancing inversion techniques through an iterative procedure based on the superposition theorem of potential fields; and modeling efficiently regional-scale lithospheric sources of satellite magnetic anomalies. In addition, these techniques were utilized to investigate regional anomaly sources of North and South America and India and to provide constraints to continental reconstruction. Since the inception of this research study, eleven papers were presented with associated published abstracts, three theses were completed, four papers were published or accepted for publication, and an additional manuscript was submitted for publication.

  16. Relating interesting quantitative time series patterns with text events and text features

    NASA Astrophysics Data System (ADS)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis work flow. The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.

  17. Combined use of quantitative ED-EPMA, Raman microspectrometry, and ATR-FTIR imaging techniques for the analysis of individual particles.

    PubMed

    Jung, Hae-Jin; Eom, Hyo-Jin; Kang, Hyun-Woo; Moreau, Myriam; Sobanska, Sophie; Ro, Chul-Un

    2014-08-21

    In this work, quantitative energy-dispersive electron probe X-ray microanalysis (ED-EPMA) (called low-Z particle EPMA), Raman microspectrometry (RMS), and attenuated total reflectance Fourier transform infrared spectroscopic (ATR-FTIR) imaging were applied in combination for the analysis of the same individual airborne particles for the first time. After examining individual particles of micrometer size by low-Z particle EPMA, consecutive examinations by RMS and ATR-FTIR imaging of the same individual particles were then performed. The relocation of the same particles on Al or Ag foils was successfully carried out among the three standalone instruments for several standard samples and an indoor airborne particle sample, resulting in the successful acquisition of quality spectral data from the three single-particle analytical techniques. The combined application of the three techniques to several different standard particles confirmed that those techniques provided consistent and complementary chemical composition information on the same individual particles. Further, it was clearly demonstrated that the three different types of spectral and imaging data from the same individual particles in an indoor aerosol sample provided richer information on physicochemical characteristics of the particle ensemble than that obtainable by the combined use of two single-particle analytical techniques.

  18. Quantitative strain and compositional studies of InxGa1-xAs Epilayer in a GaAs-based pHEMT device structure by TEM techniques.

    PubMed

    Sridhara Rao, Duggi V; Sankarasubramanian, Ramachandran; Muraleedharan, Kuttanellore; Mehrtens, Thorsten; Rosenauer, Andreas; Banerjee, Dipankar

    2014-08-01

    In GaAs-based pseudomorphic high-electron mobility transistor device structures, strain and composition of the InxGa1-xAs channel layer are very important as they influence the electronic properties of these devices. In this context, transmission electron microscopy techniques such as (002) dark-field imaging, high-resolution transmission electron microscopy (HRTEM) imaging, scanning transmission electron microscopy-high angle annular dark field (STEM-HAADF) imaging and selected area diffraction, are useful. A quantitative comparative study using these techniques is relevant for assessing the merits and limitations of the respective techniques. In this article, we have investigated strain and composition of the InxGa1-xAs layer with the mentioned techniques and compared the results. The HRTEM images were investigated with strain state analysis. The indium content in this layer was quantified by HAADF imaging and correlated with STEM simulations. The studies showed that the InxGa1-xAs channel layer was pseudomorphically grown leading to tetragonal strain along the [001] growth direction and that the average indium content (x) in the epilayer is ~0.12.

  19. INFRARED SPECTROSCOPY: A TOOL FOR DETERMINATION OF THE DEGREE OF CONVERSION IN DENTAL COMPOSITES

    PubMed Central

    Moraes, Luciene Gonçalves Palmeira; Rocha, Renata Sanches Ferreira; Menegazzo, Lívia Maluf; de AraÚjo, Eudes Borges; Yukimitu, Keizo; Moraes, João Carlos Silos

    2008-01-01

    Infrared spectroscopy is one of the most widely used techniques for measurement of conversion degree in dental composites. However, to obtain good quality spectra and quantitative analysis from spectral data, appropriate expertise and knowledge of the technique are mandatory. This paper presents important details to use infrared spectroscopy for determination of the conversion degree. PMID:19089207

  20. Evaluation of thresholding techniques for segmenting scaffold images in tissue engineering

    NASA Astrophysics Data System (ADS)

    Rajagopalan, Srinivasan; Yaszemski, Michael J.; Robb, Richard A.

    2004-05-01

    Tissue engineering attempts to address the ever widening gap between the demand and supply of organ and tissue transplants using natural and biomimetic scaffolds. The regeneration of specific tissues aided by synthetic materials is dependent on the structural and morphometric properties of the scaffold. These properties can be derived non-destructively using quantitative analysis of high resolution microCT scans of scaffolds. Thresholding of the scanned images into polymeric and porous phase is central to the outcome of the subsequent structural and morphometric analysis. Visual thresholding of scaffolds produced using stochastic processes is inaccurate. Depending on the algorithmic assumptions made, automatic thresholding might also be inaccurate. Hence there is a need to analyze the performance of different techniques and propose alternate ones, if needed. This paper provides a quantitative comparison of different thresholding techniques for segmenting scaffold images. The thresholding algorithms examined include those that exploit spatial information, locally adaptive characteristics, histogram entropy information, histogram shape information, and clustering of gray-level information. The performance of different techniques was evaluated using established criteria, including misclassification error, edge mismatch, relative foreground error, and region non-uniformity. Algorithms that exploit local image characteristics seem to perform much better than those using global information.
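
    One of the listed evaluation criteria in a few lines (the masks below are synthetic; misclassification error is shown because it is the simplest of the four criteria named above):

      # Misclassification error of a thresholded scaffold image against a
      # reference segmentation: fraction of pixels assigned to the wrong phase.
      import numpy as np

      rng = np.random.default_rng(7)
      truth = rng.random((128, 128)) > 0.5                       # reference mask
      image = np.where(truth, 0.7, 0.3) + rng.normal(0, 0.2, truth.shape)

      segmented = image > 0.5                                    # candidate threshold
      misclassification_error = np.mean(segmented != truth)      # 0 is perfect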

  1. Structural, chemical and physical properties of pure and La3+ doped L-Threonine acetate crystals

    NASA Astrophysics Data System (ADS)

    Senthamizhan, A.; Sambathkumar, K.; Nithiyanantham, S.; Venkatachalapathy, M.; Rajkamal, N.

    2017-12-01

    Pure and La3+ doped L-Threonine crystals were grown by slow evaporation techniques. The crystal structure was examined through X-Ray Diffraction (XRD) analysis, confirming the P212121 system. The quantitative incorporation of the dopant was analyzed by an Inductively Coupled Plasma (ICP) study. Fourier Transform Infra-Red (FTIR) and FT-Raman investigations yield the possible stretching/bonding assignments of the functional groups, and the qualitative/quantitative nature of both crystals is analyzed. The optical behavior of the crystals was studied with an Ultra Violet (UV)-Visible spectrometer. The mechanical, thermal, and decomposition studies were carried out through the Vickers hardness test, Thermogravimetric Analysis (TGA), and Differential Thermal Analysis (DTA). The Non-Linear Optical (NLO) response, measured by the Kurtz powder technique, is found to be greater than that of Potassium Dihydrogen Phosphate (KDP). The dielectric and optical absorption behaviors of both pure and La3+ doped crystals were studied and all the properties interpreted; the enhancement of these properties by the La3+ dopant is investigated.

  2. Quantitative fluorescence correlation spectroscopy on DNA in living cells

    NASA Astrophysics Data System (ADS)

    Hodges, Cameron; Kafle, Rudra P.; Meiners, Jens-Christian

    2017-02-01

    FCS is a fluorescence technique conventionally used to study the kinetics of fluorescent molecules in a dilute solution. Being a non-invasive technique, it is now drawing increasing interest for the study of more complex systems like the dynamics of DNA or proteins in living cells. Unlike an ordinary dye solution, the dynamics of macromolecules like proteins or entangled DNA in crowded environments is often slow and subdiffusive in nature. This in turn leads to longer residence times of the attached fluorophores in the excitation volume of the microscope, and artifacts from photobleaching abound that can easily obscure the signature of the molecular dynamics of interest and make quantitative analysis challenging. We discuss methods and procedures to make FCS applicable to quantitative studies of the dynamics of DNA in live prokaryotic and eukaryotic cells. The intensity autocorrelation function is computed from weighted arrival times of the photons on the detector, in a way that maximizes the information content while simultaneously correcting for the effect of photobleaching, to yield an autocorrelation function that reflects only the underlying dynamics of the sample. This autocorrelation function in turn is used to calculate the mean square displacement of the fluorophores attached to DNA. The displacement data are more amenable to further quantitative analysis than the raw correlation functions. By using a suitable integral transform of the mean square displacement, we can then determine the viscoelastic moduli of the DNA in its cellular environment. The entire analysis procedure is extensively calibrated and validated using model systems and computational simulations.
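
    A stripped-down sketch of the autocorrelation step (binned synthetic counts; the weighted photon-arrival-time estimator and the bleaching correction described above are not reproduced here):

      # Normalized intensity autocorrelation of a binned photon trace; its
      # decay encodes the mean square displacement of the labelled DNA.
      import numpy as np

      rng = np.random.default_rng(8)
      counts = rng.poisson(50, size=100_000).astype(float)   # photon counts per bin

      d = counts - counts.mean()
      lags = np.arange(1, 200)
      acf = np.array([np.mean(d[:-k] * d[k:]) for k in lags]) / counts.mean() ** 2
      # Fitting acf against lag time yields the fluorophore MSD, whose
      # integral transform gives the viscoelastic moduli of the environment.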

  3. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals.

    PubMed

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A

    2016-03-05

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.

  4. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals

    PubMed Central

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R.; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J.; Ibarra-Manzano, Mario A.

    2016-01-01

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states. PMID:26959029

  5. Multivariate analysis, mass balance techniques, and statistical tests as tools in igneous petrology: application to the Sierra de las Cruces volcanic range (Mexican Volcanic Belt).

    PubMed

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportion of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).
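
    A minimal two-end-member mass balance in closed form (the compositions are invented; real schemes use many oxides and propagate uncertainties):

      # Find the dacite fraction f minimizing
      # || c_mix - (f * c_dacite + (1 - f) * c_andesite) ||.
      import numpy as np

      c_dacite = np.array([65.0, 4.2, 2.1])      # e.g. SiO2, Na2O, K2O (wt%)
      c_andesite = np.array([58.0, 3.6, 1.4])
      c_mix = np.array([62.5, 4.0, 1.85])        # comingled lava

      d = c_dacite - c_andesite
      f = np.dot(c_mix - c_andesite, d) / np.dot(d, d)   # least-squares fraction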

  6. Texture analysis of medical images for radiotherapy applications

    PubMed Central

    Rizzo, Giovanna

    2017-01-01

    The high-throughput extraction of quantitative information from medical images, known as radiomics, has grown in interest due to the current necessity to quantitatively characterize tumour heterogeneity. In this context, texture analysis, consisting of a variety of mathematical techniques that can describe the grey-level patterns of an image, plays an important role in assessing the spatial organization of different tissues and organs. For these reasons, the potentiality of texture analysis in the context of radiotherapy has been widely investigated in several studies, especially for the prediction of the treatment response of tumour and normal tissues. Nonetheless, many different factors can affect the robustness, reproducibility and reliability of textural features, thus limiting the impact of this technique. In this review, an overview of the most recent works that have applied texture analysis in the context of radiotherapy is presented, with particular focus on the assessment of tumour and tissue response to radiations. As a preliminary, the main factors that influence feature estimation are discussed, highlighting the need for more standardized image acquisition and reconstruction protocols and more accurate methods for region-of-interest identification. Despite all these limitations, texture analysis is increasingly demonstrating its ability to improve the characterization of intratumour heterogeneity and the prediction of clinical outcome, although prospective studies and clinical trials are required to draw a more complete picture of the full potential of this technique. PMID:27885836

  7. Quantitative analysis of thoria phase in Th-U alloys using diffraction studies

    NASA Astrophysics Data System (ADS)

    Thakur, Shital; Krishna, P. S. R.; Shinde, A. B.; Kumar, Raj; Roy, S. B.

    2017-05-01

    In the present study, quantitative phase analysis of Th-U alloys in bulk form, namely Th-52 wt% U and Th-3 wt% U, has been performed on data obtained from both X-ray diffraction and neutron diffraction techniques using the Rietveld method as implemented in the FULLPROF software. Quantifying the thoria (ThO2) phase present in the bulk of the sample by X-ray diffraction is limited by surface oxidation and the low penetration of X-rays in high-Z materials. A neutron diffraction study probing the bulk of the samples is therefore presented in comparison with the X-ray diffraction study.
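
    In Rietveld quantitative phase analysis, refined scale factors convert to weight fractions through the standard Hill-Howard relation W_p = S_p(ZMV)_p / Σ_i S_i(ZMV)_i, where S is the scale factor, Z the formula units per cell, M the formula mass and V the cell volume. The sketch below applies that relation with placeholder values; the paper's refined parameters are not given in the abstract.

        # Weight fractions from Rietveld scale factors (Hill & Howard relation).
        def weight_fractions(phases):
            zmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
            total = sum(zmv.values())
            return {name: val / total for name, val in zmv.items()}

        phases = {
            # name: (scale factor, Z, formula mass, cell volume) -- placeholders
            "Th-U alloy": (1.2e-4, 4, 232.0, 210.0),
            "ThO2":       (0.3e-4, 4, 264.0, 175.0),
        }
        print(weight_fractions(phases))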

  8. Determination of trace element mineral/liquid partition coefficients in melilite and diopside by ion and electron microprobe techniques

    NASA Technical Reports Server (NTRS)

    Kuehner, S. M.; Laughlin, J. R.; Grossman, L.; Johnson, M. L.; Burnett, D. S.

    1989-01-01

    The applicability of the ion microprobe (IMP) to quantitative analysis of minor elements (Sr, Y, Zr, La, Sm, and Yb) in the major phases present in natural Ca-, Al-rich inclusions (CAIs) was investigated by comparing IMP results with those of an electron microprobe (EMP). Results on three trace-element-doped glasses indicated that precise quantitative analysis by IMP is not possible if there are large differences in SiO2 content between the standards used to derive the ion yields and the unknowns.

  9. Quantitative spectral and orientational analysis in surface sum frequency generation vibrational spectroscopy (SFG-VS)

    NASA Astrophysics Data System (ADS)

    Wang, Hong-Fei; Gan, Wei; Lu, Rong; Rao, Yi; Wu, Bao-Hua

    Sum frequency generation vibrational spectroscopy (SFG-VS) has been proven to be a uniquely effective spectroscopic technique in the investigation of molecular structure and conformations, as well as the dynamics of molecular interfaces. However, the application of SFG-VS to complex molecular interfaces has been limited by the difficulty of extracting quantitative information from SFG-VS experiments. In this review, we assess the limitations, issues, techniques and methodologies in quantitative orientational and spectral analysis with SFG-VS. Based on these assessments, we summarize recent developments in methodologies for quantitative orientational and spectral analysis in SFG-VS, and their applications to detailed analysis of SFG-VS data of various vapour/neat liquid interfaces. A rigorous formulation of the polarization null angle (PNA) method is given for accurate determination of the orientational parameter D = ⟨cos θ⟩/⟨cos³ θ⟩, and a comparison of the PNA method with the commonly used polarization intensity ratio (PIR) method is discussed. The polarization and incident angle dependencies of the SFG-VS intensity are also reviewed, in light of how experimental arrangements can be optimized to effectively extract crucial information from SFG-VS experiments. The values and models of the local field factors in the molecular layers are discussed. In order to examine the validity and limitations of the bond polarizability derivative model, the general expressions for molecular hyperpolarizability tensors and their expressions in the bond polarizability derivative model for C3v, C2v and C∞v molecular groups are given in the two appendixes. We show that the bond polarizability derivative model can quantitatively describe many aspects of the intensities observed in the SFG-VS spectra of vapour/neat liquid interfaces in different polarizations. Using polarization analysis in SFG-VS, polarization selection rules or guidelines are developed for assignment of SFG-VS spectra. Using these selection rules, SFG-VS spectra of vapour/diol and vapour/n-alcohol (n = 1-8) interfaces are assigned, and some of the ambiguity and confusion, as well as the implications for previous IR and Raman assignments, are duly discussed. The ability to assign an SFG-VS spectrum using the polarization selection rules makes SFG-VS not only an effective and useful vibrational spectroscopy technique for interface studies, but also a complementary vibrational spectroscopy method for condensed-phase studies in general. These developments put quantitative orientational and spectral analysis in SFG-VS on a more solid foundation. The formulations, concepts and issues discussed in this review are expected to find broad application in investigations of molecular interfaces in the future.
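
    As a worked illustration of the orientational parameter, the sketch below evaluates D = ⟨cos θ⟩/⟨cos³ θ⟩ numerically for an assumed Gaussian orientation distribution; the tilt angle and width are arbitrary illustration parameters, not values from the review.

        # Numerical evaluation of D for a Gaussian distribution of tilt angles,
        # with the usual sin(theta) solid-angle weighting.
        import numpy as np

        theta = np.linspace(0.0, np.pi, 2000)
        theta0, sigma = np.deg2rad(40.0), np.deg2rad(15.0)   # assumed parameters
        f = np.exp(-0.5 * ((theta - theta0) / sigma) ** 2) * np.sin(theta)

        mean_cos  = np.trapz(np.cos(theta)    * f, theta) / np.trapz(f, theta)
        mean_cos3 = np.trapz(np.cos(theta)**3 * f, theta) / np.trapz(f, theta)
        print(f"D = {mean_cos / mean_cos3:.3f}")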

  10. Gas Chromatographic Determination of Methyl Salicylate in Rubbing Alcohol: An Experiment Employing Standard Addition.

    ERIC Educational Resources Information Center

    Van Atta, Robert E.; Van Atta, R. Lewis

    1980-01-01

    Provides a gas chromatography experiment that exercises the quantitative technique of standard addition in the analysis of a minor component, methyl salicylate, in a commercial product, "wintergreen rubbing alcohol." (CS)
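
    The arithmetic of standard addition reduces to a line fit and an x-intercept, as the sketch below shows; the spike levels and GC peak areas are synthetic, since the experiment's actual responses are not given.

        # Standard addition: fit signal vs. added concentration, then the
        # magnitude of the x-intercept is the original (unspiked) concentration.
        import numpy as np

        added  = np.array([0.0, 5.0, 10.0, 15.0])     # added standard, mg/L
        signal = np.array([12.1, 19.8, 27.6, 35.2])   # GC peak areas (synthetic)

        slope, intercept = np.polyfit(added, signal, 1)
        c_unknown = intercept / slope                  # |x-intercept| of the fit
        print(f"estimated original concentration = {c_unknown:.2f} mg/L")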

  11. Urea Biosynthesis Using Liver Slices

    ERIC Educational Resources Information Center

    Teal, A. R.

    1976-01-01

    Presented is a practical scheme to enable introductory biology students to investigate the mechanism by which urea is synthesized in the liver. The tissue-slice technique is discussed, and methods for the quantitative analysis of metabolites are presented. (Author/SL)

  12. Mechanisms of Mitochondrial Defects in Gulf War Syndrome

    DTIC Science & Technology

    2014-10-01

    parameters: uncoupling ratio, net routine flux control ratio, respiratory control ratio, leak flux control ratio, phosphorylation respiratory... oxidative phosphorylation subunit) Quantitative analysis of individual mitochondrial proteins. The technique has been established and validated for muscle...Blue Native and Clear Native Analyses (non-denatured analysis of supercomplex formation and monomeric oxidative phosphorylation enzyme assembly

  13. Teaching Students with Visual Impairments in an Inclusive Educational Setting: A Case from Nepal

    ERIC Educational Resources Information Center

    Lamichhane, Kamal

    2017-01-01

    Using the data set from teachers and students and utilising both qualitative and quantitative techniques for analysis, I discuss teaching style considerations in Nepal's mainstream schools for students with visual impairments. Results of the econometric analysis show that teachers' years of schooling, teaching experience, and using blackboard were…

  14. Publication Bias in Research Synthesis: Sensitivity Analysis Using A Priori Weight Functions

    ERIC Educational Resources Information Center

    Vevea, Jack L.; Woods, Carol M.

    2005-01-01

    Publication bias, sometimes known as the "file-drawer problem" or "funnel-plot asymmetry," is common in empirical research. The authors review the implications of publication bias for quantitative research synthesis (meta-analysis) and describe existing techniques for detecting and correcting it. A new approach is proposed that is suitable for…

  15. A Re-Examination of the Education Production Function Using Individual Participant Data

    ERIC Educational Resources Information Center

    Pigott, Therese D.; Williams, Ryan T.; Polanin, Joshua R.

    2011-01-01

    The focus and purpose of this research is to examine the benefits, limitations, and implications of Individual Participant Data (IPD) meta-analysis in education. Comprehensive research reviews in education have been limited to the use of aggregated data (AD) meta- analysis, techniques based on quantitatively combining information from studies on…

  16. Determination of Sulfate by Conductometric Titration: An Undergraduate Laboratory Experiment

    ERIC Educational Resources Information Center

    Garcia, Jennifer; Schultz, Linda D.

    2016-01-01

    The classic technique for sulfate analysis in an undergraduate quantitative analysis lab involves precipitation as the barium salt with barium chloride, collection of the precipitate by gravity filtration using ashless filter paper, and removal of the filter paper by charring over a Bunsen burner. The entire process is time-consuming, hazardous,…
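
    In the conductometric alternative named in the title, the endpoint is commonly located by fitting a line to each branch of the conductance-volume curve and intersecting them; a minimal sketch with synthetic data follows (this is the generic method, not necessarily the article's exact procedure).

        # Conductance falls while Ba2+ precipitates sulfate and rises past the
        # equivalence point; intersecting the two linear branches locates it.
        import numpy as np

        vol  = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])   # mL titrant
        cond = np.array([9.0, 8.1, 7.2, 6.3, 6.0, 6.9, 7.8, 8.7])   # mS/cm (synthetic)

        m1, b1 = np.polyfit(vol[:4], cond[:4], 1)    # descending branch
        m2, b2 = np.polyfit(vol[4:], cond[4:], 1)    # ascending branch
        v_eq = (b2 - b1) / (m1 - m2)
        print(f"equivalence point at {v_eq:.2f} mL")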

  17. Teaching for Art Criticism: Incorporating Feldman's Critical Analysis Learning Model in Students' Studio Practice

    ERIC Educational Resources Information Center

    Subramaniam, Maithreyi; Hanafi, Jaffri; Putih, Abu Talib

    2016-01-01

    This study adopted 30 first year graphic design students' artwork, with critical analysis using Feldman's model of art criticism. Data were analyzed quantitatively; descriptive statistical techniques were employed. The scores were viewed in the form of mean score and frequencies to determine students' performances in their critical ability.…

  18. Application of Critical Classroom Discourse Analysis (CCDA) in Analyzing Classroom Interaction

    ERIC Educational Resources Information Center

    Sadeghi, Sima; Ketabi, Saeed; Tavakoli, Mansoor; Sadeghi, Moslem

    2012-01-01

    As an area of classroom research, Interaction Analysis developed from the need and desire to investigate the process of classroom teaching and learning in terms of action-reaction between individuals and their socio-cultural context (Biddle, 1967). However, sole reliance on quantitative techniques could be problematic, since they conceal more than…

  19. Development of scan analysis techniques employing a small computer. Final report, February 1, 1963--July 31, 1976

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhl, D.E.

    1976-08-05

    During the thirteen year duration of this contract the goal has been to develop and apply computer based analysis of radionuclide scan data so as to make available improved diagnostic information based on a knowledge of localized quantitative estimates of radionuclide concentration. Results are summarized. (CH)

  20. Phospholipid Fatty Acid Analysis: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Findlay, R. H.

    2008-12-01

    With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME) and the separation, identification and quantification of the FAME by gas chromatography. Early laboratory and field studies focused on correlating individual fatty acids to particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate and the use of solid phase extraction technology. Improvements in the field of gas chromatography also increased accessibility of the technique, and it has been widely applied to water, sediment, soil and aerosol samples. Whole cell fatty acid analysis, a related but distinct technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, the application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional group approach to the interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks. Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinants of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound-specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM, and the technique promises to be a useful tool for assigning ecological function to microbial populations.

  1. Focussed ion beam thin sample microanalysis using a field emission gun electron probe microanalyser

    NASA Astrophysics Data System (ADS)

    Kubo, Y.

    2018-01-01

    Field emission gun electron probe microanalysis (FEG-EPMA) in conjunction with wavelength-dispersive X-ray spectrometry using a low acceleration voltage (Vacc) allows elemental analysis with sub-micrometre lateral spatial resolution (SR). However, this degree of SR does not necessarily meet the requirements associated with increasingly miniaturised devices. Another challenge in performing FEG-EPMA at low Vacc is that the accuracy of quantitative analyses is adversely affected, primarily because low-energy X-ray lines such as the L- and M-lines must be employed, with the associated potential for line interference. One promising means of obtaining high SR with FEG-EPMA is to use thin samples together with high Vacc values. This mini-review covers the basic principles of thin-sample FEG-EPMA and describes an application of this technique to the analysis of optical fibres. Outstanding issues related to this technique that must be addressed are also discussed, including the potential for electron beam damage during analysis of insulating materials and the development of methods to use thin samples for quantitative analysis.

  2. Quantitative validation of an air-coupled ultrasonic probe model by Interferometric laser tomography

    NASA Astrophysics Data System (ADS)

    Revel, G. M.; Pandarese, G.; Cavuto, A.

    2012-06-01

    The present paper describes the quantitative validation of a finite element (FE) model of the ultrasound beam generated by an air-coupled non-contact ultrasound transducer. The model boundary conditions are given by vibration velocities measured by laser vibrometry on the probe membrane. The proposed validation method is based on the comparison between the simulated 3D pressure field and the pressure data measured with the interferometric laser tomography technique. The model details and the experimental techniques are described in the paper. The analysis of results shows the effectiveness of the proposed approach and the possibility to quantitatively assess and predict the generated acoustic pressure field, with maximum discrepancies on the order of 20% due to uncertainty effects. This step is important for determining the real applicability of air-coupled probes in complex problems and for the simulation of the whole inspection procedure, even when the component is still being designed, so as to virtually verify its inspectability.

  3. Application of remote sensing to monitoring and studying dispersion in ocean dumping

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.; Ohlhorst, C. W.

    1981-01-01

    Remotely sensed wide area synoptic data provides information on ocean dumping that is not readily available by other means. A qualitative approach has been used to map features, such as river plumes. Results of quantitative analyses have been used to develop maps showing quantitative distributions of one or more water quality parameters, such as suspended solids or chlorophyll a. Joint NASA/NOAA experiments have been conducted at designated dump areas in the U.S. coastal zones to determine the applicability of aircraft remote sensing systems to map plumes resulting from ocean dumping of sewage sludge and industrial wastes. A second objective is related to the evaluation of previously developed quantitative analysis techniques for studying dispersion of materials in these plumes. It was found that plumes resulting from dumping of four waste materials have distinctive spectral characteristics. The development of a technology for use in a routine monitoring system, based on remote sensing techniques, is discussed.

  4. Localization-based super-resolution imaging meets high-content screening.

    PubMed

    Beghin, Anne; Kechkar, Adel; Butler, Corey; Levet, Florian; Cabillic, Marine; Rossier, Olivier; Giannone, Gregory; Galland, Rémi; Choquet, Daniel; Sibarita, Jean-Baptiste

    2017-12-01

    Single-molecule localization microscopy techniques have proven to be essential tools for quantitatively monitoring biological processes at unprecedented spatial resolution. However, these techniques are very low throughput and are not yet compatible with fully automated, multiparametric cellular assays. This shortcoming is primarily due to the huge amount of data generated during imaging and the lack of software for automation and dedicated data mining. We describe an automated quantitative single-molecule-based super-resolution methodology that operates in standard multiwell plates and uses analysis based on high-content screening and data-mining software. The workflow is compatible with fixed- and live-cell imaging and allows extraction of quantitative data like fluorophore photophysics, protein clustering or dynamic behavior of biomolecules. We demonstrate that the method is compatible with high-content screening using 3D dSTORM and DNA-PAINT based super-resolution microscopy as well as single-particle tracking.

  5. Quantitative Detection of Pharmaceuticals Using a Combination of Paper Microfluidics and Wavelength Modulated Raman Spectroscopy

    PubMed Central

    Craig, Derek; Mazilu, Michael; Dholakia, Kishan

    2015-01-01

    Raman spectroscopy has proven to be an indispensable technique for the identification of various types of analytes due to the fingerprint vibration spectrum obtained. Paper microfluidics has also emerged as a low-cost, easy-to-fabricate and portable approach for point-of-care testing. However, due to inherent background fluorescence, combining Raman spectroscopy with paper microfluidics has to date been an unmet challenge in the absence of surface-enhanced mechanisms. We describe the first use of wavelength modulated Raman spectroscopy (WMRS) for analysis on a paper microfluidics platform. This study demonstrates the ability to suppress the background fluorescence of the paper using WMRS and the subsequent implementation of this technique for pharmaceutical analysis. The results demonstrate that it is possible to discriminate between paracetamol and ibuprofen, whilst also being able to detect the presence of each analyte quantitatively at nanomolar concentrations. PMID:25938464

  6. Principles of Metamorphic Petrology

    NASA Astrophysics Data System (ADS)

    Williams, Michael L.

    2009-05-01

    The field of metamorphic petrology has seen spectacular advances in the past decade, including new X-ray mapping techniques for characterizing metamorphic rocks and minerals, new internally consistent thermobarometers, new software for constructing and viewing phase diagrams, new methods to date metamorphic processes, and perhaps most significant, revised petrologic databases and the ability to calculate accurate phase diagrams and pseudosections. These tools and techniques provide new power and resolution for constraining pressure-temperature (P-T) histories and tectonic events. Two books have been fundamental for empowering petrologists and structural geologists during the past decade. Frank Spear's Metamorphic Phase Equilibria and Pressure-Temperature-Time Paths, published in 1993, builds on his seminal papers to provide a quantitative framework for P-T path analysis. Spear's book lays the foundation for modern quantitative metamorphic analysis. Cees Passchier and Rudolph Trouw's Microtectonics, published in 2005, with its superb photos and figures, provides the tools and the theory for interpreting deformation textures and inferring deformation processes.

  7. Quantitative Analysis, Design, and Fabrication of Biosensing and Bioprocessing Devices in Living Cells

    DTIC Science & Technology

    2015-03-10

    AFRL-OSR-VA-TR-2015-0080 Biosensing and Bioprocessing Devices in Living Cells Domitilla Del Vecchio MASSACHUSETTS INSTITUTE OF TECHNOLOGY Final...Of Biosensing And Bioprocessing Devices In Living Cells FA9550-12-1-0129 D. Del Vecchio Massachusetts Institute of Technology -- 77 Massachusetts...research is to develop quantitative techniques for the de novo design and fabrication of biosensing devices in living cells . Such devices will be entirely

  8. A projection pursuit algorithm to classify individuals using fMRI data: Application to schizophrenia.

    PubMed

    Demirci, Oguz; Clark, Vincent P; Calhoun, Vince D

    2008-02-15

    Schizophrenia is diagnosed based largely upon behavioral symptoms. Currently, no quantitative, biologically based diagnostic technique has yet been developed to identify patients with schizophrenia. Classification of individuals into patients with schizophrenia and healthy control groups based on quantitative, biologically based data is of great interest to support and refine psychiatric diagnoses. We applied a novel projection pursuit technique to various components obtained with independent component analysis (ICA) of 70 subjects' fMRI activation maps acquired during an auditory oddball task. The validity of the technique was tested with a leave-one-out method, and the detection performance varied between 80% and 90%. The findings suggest that the proposed data reduction algorithm is effective in classifying individuals into schizophrenia and healthy control groups and may eventually prove useful as a diagnostic tool.
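
    A minimal sketch of the leave-one-out validation scheme the authors describe, here with a generic scikit-learn classifier on synthetic stand-ins for the ICA-derived features (the study's actual classifier and data are not specified in the abstract):

        # Leave-one-out cross-validation: each of the 70 subjects is held out
        # once while the model is trained on the remaining 69.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        rng = np.random.default_rng(1)
        X = rng.standard_normal((70, 10))       # 70 subjects x 10 ICA features (synthetic)
        y = rng.integers(0, 2, size=70)         # patient vs. control labels (synthetic)

        scores = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                                 cv=LeaveOneOut())
        print(f"leave-one-out accuracy: {scores.mean():.2f}")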

  9. A ten-week biochemistry lab project studying wild-type and mutant bacterial alkaline phosphatase.

    PubMed

    Witherow, D Scott

    2016-11-12

    This work describes a 10-week laboratory project studying wild-type and mutant bacterial alkaline phosphatase, in which students purify, quantitate, and perform kinetic assays on wild-type and selected mutants of the enzyme. Students also perform plasmid DNA purification, digestion, and gel analysis. In addition to simply learning important techniques, students acquire novel biochemical data in their kinetic analysis of mutant enzymes. The experiments are designed to build on students' work from week to week in a way that requires them to apply quantitative analysis and reasoning skills, reinforcing traditional textbook biochemical concepts. Students are assessed through lab reports focused on journal-style writing, quantitative and conceptual question sheets, and traditional exams. © 2016 by The International Union of Biochemistry and Molecular Biology, 44(6):555-564, 2016.
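
    For the kinetic-assay portion, a common analysis is a nonlinear Michaelis-Menten fit; the sketch below uses scipy's curve_fit on synthetic substrate-velocity data (the lab's actual values and software are assumptions here).

        # Fit Vmax and Km from initial-rate data, v = Vmax*[S]/(Km + [S]).
        import numpy as np
        from scipy.optimize import curve_fit

        def mm(s, vmax, km):
            return vmax * s / (km + s)

        s = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0, 5.0])   # [S], mM (synthetic)
        v = np.array([0.9, 1.6, 2.9, 4.0, 5.0, 5.7, 6.2])     # rate, umol/min

        (vmax, km), _ = curve_fit(mm, s, v, p0=(6.0, 0.5))
        print(f"Vmax = {vmax:.2f} umol/min, Km = {km:.2f} mM")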

  10. Structures of glycans bound to receptors from saturation transfer difference (STD) NMR spectroscopy: quantitative analysis by using CORCEMA-ST.

    PubMed

    Enríquez-Navas, Pedro M; Guzzi, Cinzia; Muñoz-García, Juan C; Nieto, Pedro M; Angulo, Jesús

    2015-01-01

    Glycan-receptor interactions are of fundamental relevance for a large number of biological processes, and their kinetic properties (medium/weak binding affinities) make them well suited to study by ligand-observed NMR techniques, among which saturation transfer difference (STD) NMR spectroscopy has been shown to be a very robust and powerful approach. The quantitative analysis of the results from an STD NMR study of a glycan-receptor interaction is essential for translating the resulting spectral intensities into a 3D molecular model of the complex. This chapter describes how to carry out such a quantitative analysis by means of the Complete Relaxation and Conformational Exchange Matrix Approach for STD NMR (CORCEMA-ST), in general terms, and an example from a previous work on an antibody-glycan interaction is also shown.

  11. Quantitation of Mycotoxins Using Direct Analysis in Real Time Mass Spectrometry (DART-MS).

    PubMed

    Busman, Mark

    2018-05-01

    Ambient ionization represents a new generation of MS ion sources and is used for the rapid ionization of small molecules under ambient conditions. The combination of ambient ionization and MS allows the analysis of multiple food samples with simple or no sample treatment, or in conjunction with prevailing sample preparation methods. Two ambient ionization methods, desorption electrospray ionization (DESI) and direct analysis in real time (DART), have been adapted for food safety applications. Both ionization techniques provide unique advantages and capabilities. DART has been used for a variety of qualitative and quantitative applications. In particular, mycotoxin contamination of food and feed materials has been addressed by DART-MS. Applications to mycotoxin analysis by ambient ionization MS, and particularly DART-MS, are summarized.

  12. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  13. Quantitative structure-property relationship (correlation analysis) of phosphonic acid-based chelates in design of MRI contrast agent.

    PubMed

    Tiwari, Anjani K; Ojha, Himanshu; Kaul, Ankur; Dutta, Anupama; Srivastava, Pooja; Shukla, Gauri; Srivastava, Rakesh; Mishra, Anil K

    2009-07-01

    Nuclear magnetic resonance imaging is a very useful tool in modern medical diagnostics, especially when gadolinium(III)-based contrast agents are administered to the patient with the aim of increasing the image contrast between normal and diseased tissues. Using soft modelling techniques such as quantitative structure-activity/structure-property relationship analysis after a suitable description of their molecular structure, we have studied a series of phosphonic acids for designing new MRI contrast agents. Quantitative structure-property relationship studies with multiple linear regression analysis were applied to find correlations between different calculated molecular descriptors of the phosphonic acid-based chelating agents and their stability constants. The final quantitative structure-property relationship models for the phosphonic acid series were: Model 1, log K_ML = 5.00243(±0.7102) − 0.0263(±0.540) MR, with n = 12, |r| = 0.942, s = 0.183, F = 99.165; and Model 2, log K_ML = 5.06280(±0.3418) − 0.0252(±0.198) MR, with n = 12, |r| = 0.956, s = 0.186, F = 99.256.
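
    The reported models are one-descriptor linear regressions of log K_ML on molar refractivity (MR); a sketch of such a fit on synthetic placeholder data follows (not the paper's data set).

        # One-descriptor QSPR fit of the form log K_ML = a + b*MR.
        import numpy as np

        mr      = np.array([20.1, 25.3, 30.8, 35.2, 40.6, 45.9])   # molar refractivity
        log_kml = np.array([4.52, 4.38, 4.21, 4.13, 3.98, 3.85])   # stability constants

        b, a = np.polyfit(mr, log_kml, 1)           # slope b, intercept a
        r = np.corrcoef(mr, log_kml)[0, 1]
        print(f"log K_ML = {a:.3f} + ({b:.4f})*MR, |r| = {abs(r):.3f}")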

  14. Analysis of the differentially expressed low molecular weight peptides in human serum via an N-terminal isotope labeling technique combining nano-liquid chromatography/matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Leng, Jiapeng; Zhu, Dong; Wu, Duojiao; Zhu, Tongyu; Zhao, Ningwei; Guo, Yinlong

    2012-11-15

    Peptidomics analysis of human serum is challenging due to the low abundance of serum peptides and interference from the complex matrix. This study analyzed the differentially expressed (DE) low molecular weight peptides in human serum by integrating a DMPITC-based N-terminal isotope labeling technique with nano-liquid chromatography and matrix-assisted laser desorption/ionization mass spectrometry (nano-LC/MALDI-MS). The workflow introduced a [d6]-4,6-dimethoxypyrimidine-2-isothiocyanate (DMPITC)-labeled mixture of aliquots from the test samples as the internal standard. The spiked [d0]-DMPITC-labeled samples were separated by nano-LC and then spotted on the MALDI target. Both quantitative and qualitative studies of serum peptides were achieved based on the isotope-labeled peaks. The DMPITC labeling technique combined with nano-LC/MALDI-MS not only minimized the errors in peptide quantitation, but also allowed convenient recognition of the labeled peptides due to the 6 Da mass difference. The data showed that the entire research procedure, as well as the subsequent data analysis method, was effective, reproducible, and sensitive for the analysis of DE serum peptides. This study successfully established a research model for DE serum peptides using DMPITC-based N-terminal isotope labeling and nano-LC/MALDI-MS. Application of the DMPITC-based N-terminal labeling technique is expected to provide a promising tool for the investigation of peptides in vivo, especially for the analysis of DE peptides under different biological conditions. Copyright © 2012 John Wiley & Sons, Ltd.
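
    The 6 Da light/heavy spacing makes quantitation a peak-pairing exercise; a toy sketch with a synthetic peak list follows (the m/z values, intensities and matching tolerance are assumptions, not the study's data).

        # Pair peaks 6 Da apart (d0/d6 labels) and report the intensity ratio.
        peaks = {1204.6: 8.0e4, 1210.6: 4.1e4, 1532.8: 2.2e4, 1538.8: 2.3e4}

        DELTA, TOL = 6.0, 0.01
        for mz, light in sorted(peaks.items()):
            for mz2, heavy in peaks.items():
                if abs(mz2 - mz - DELTA) < TOL:
                    print(f"pair {mz:.1f}/{mz2:.1f}: light/heavy = {light/heavy:.2f}")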

  15. Amino acid distribution in meteorites: diagenesis, extraction methods, and standard metrics in the search for extraterrestrial biosignatures.

    PubMed

    McDonald, Gene D; Storrie-Lombardi, Michael C

    2006-02-01

    The relative abundance of the protein amino acids has been previously investigated as a potential marker for biogenicity in meteoritic samples. However, these investigations were executed without a quantitative metric to evaluate distribution variations, and they did not account for the possibility of interdisciplinary systematic error arising from inter-laboratory differences in extraction and detection techniques. Principal component analysis (PCA), hierarchical cluster analysis (HCA), and stochastic probabilistic artificial neural networks (ANNs) were used to compare the distributions for nine protein amino acids previously reported for the Murchison carbonaceous chondrite, Mars meteorites (ALH84001, Nakhla, and EETA79001), prebiotic synthesis experiments, and terrestrial biota and sediments. These techniques allowed us (1) to identify a shift in terrestrial amino acid distributions secondary to diagenesis; (2) to detect differences in terrestrial distributions that may be systematic differences between extraction and analysis techniques in biological and geological laboratories; and (3) to determine that distributions in meteoritic samples appear more similar to prebiotic chemistry samples than they do to the terrestrial unaltered or diagenetic samples. Both diagenesis and putative interdisciplinary differences in analysis complicate interpretation of meteoritic amino acid distributions. We propose that the analysis of future samples from such diverse sources as meteoritic influx, sample return missions, and in situ exploration of Mars would be less ambiguous with adoption of standardized assay techniques, systematic inclusion of assay standards, and the use of a quantitative, probabilistic metric. We present here one such metric determined by sequential feature extraction and normalization (PCA), information-driven automated exploration of classification possibilities (HCA), and prediction of classification accuracy (ANNs).
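
    A compact sketch of the PCA-plus-hierarchical-clustering pipeline applied to nine-amino-acid relative-abundance profiles (profiles are synthetic; the study's preprocessing and its ANN stage are omitted):

        # PCA for feature extraction, then Ward-linkage clustering of the scores.
        import numpy as np
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(2)
        profiles = rng.dirichlet(np.ones(9), size=12)   # 12 samples x 9 amino acids

        scores = PCA(n_components=2).fit_transform(profiles)
        clusters = fcluster(linkage(scores, method="ward"), t=3, criterion="maxclust")
        print(clusters)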

  16. Laser microprobe characterization of C species in Interplanetary Dust Particles (IDP)

    NASA Technical Reports Server (NTRS)

    Dibrozolo, F. R.; Bunch, T. E.; Chang, S.; Brownlee, D. E.

    1986-01-01

    Preliminary results of a study whose aim is the characterization of carbon (C) species in microvolumes of materials by means of laser ionization mass spectrometry (LIMS) are presented. The LIMS instrument employs a pulsed UV laser to produce nearly instantaneous vaporization and ionization of materials, followed by acceleration and time-of-flight analysis of the ions produced. LIMS provides a survey technique with nearly simultaneous acquisition of mass spectra covering the entire elemental range. The main limitation of the LIMS technique at present is its limited ability to perform quantitative analysis, due in part to insufficient knowledge of the mechanism of laser-solid interaction. However, considerable effort is now being directed at making LIMS a more quantitative technique. A variety of different C samples, both natural and man made were analyzed to establish the ability of LIMS to differentiate among the various C phases. The results of preliminary analyses performed on meteoritical and interplanetary dust samples are also presented. The C standards selected for the LIMS characterization range from essentially amorphous soot to diamond, which exhibits the highest degree of ordering.

  17. Characterization of Colloidal Quantum Dot Ligand Exchange by X-ray Photoelectron Spectroscopy

    NASA Astrophysics Data System (ADS)

    Atewologun, Ayomide; Ge, Wangyao; Stiff-Roberts, Adrienne D.

    2013-05-01

    Colloidal quantum dots (CQDs) are chemically synthesized semiconductor nanoparticles with size-dependent wavelength tunability. Chemical synthesis of CQDs involves the attachment of long organic surface ligands to prevent aggregation; however, these ligands also impede charge transport. Therefore, it is beneficial to exchange longer surface ligands for shorter ones for optoelectronic devices. Typical characterization techniques used to analyze surface ligand exchange include Fourier-transform infrared spectroscopy, x-ray diffraction, transmission electron microscopy, and nuclear magnetic resonance spectroscopy, yet these techniques do not provide a simultaneously direct, quantitative, and sensitive method for evaluating surface ligands on CQDs. In contrast, x-ray photoelectron spectroscopy (XPS) can provide nanoscale sensitivity for quantitative analysis of CQD surface ligand exchange. A unique aspect of this work is that a fingerprint is identified for shorter surface ligands by resolving the regional XPS spectrum corresponding to different types of carbon bonds. In addition, a deposition technique known as resonant infrared matrix-assisted pulsed laser evaporation is used to improve the CQD film uniformity such that stronger XPS signals are obtained, enabling more accurate analysis of the ligand exchange process.

  18. On the analysis of complex biological supply chains: From Process Systems Engineering to Quantitative Systems Pharmacology.

    PubMed

    Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P

    2017-12-05

    The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as in quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.

  19. Noninvasive characterization of the fission yeast cell cycle by monitoring dry mass with digital holographic microscopy.

    PubMed

    Rappaz, Benjamin; Cano, Elena; Colomb, Tristan; Kühn, Jonas; Depeursinge, Christian; Simanis, Viesturs; Magistretti, Pierre J; Marquet, Pierre

    2009-01-01

    Digital holographic microscopy (DHM) is an optical technique which provides phase images yielding quantitative information about cell structure and cellular dynamics. Furthermore, the quantitative phase images allow the derivation of other parameters, including dry mass production, density, and spatial distribution. We have applied DHM to study the dry mass production rate and the dry mass surface density in wild-type and mutant fission yeast cells. Our study demonstrates the applicability of DHM as a tool for label-free quantitative analysis of the cell cycle and opens the possibility for its use in high-throughput screening.
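
    Dry mass follows from the phase image through the Barer relation σ = λφ/(2πα), with α the specific refraction increment (≈0.2 µm³/pg for protein); the sketch below applies it to a synthetic phase map, with the wavelength and pixel size as assumed illustration values.

        # Dry mass surface density from a quantitative phase image.
        import numpy as np

        wavelength = 0.682        # um, assumed illumination wavelength
        alpha = 0.2               # um^3/pg, typical protein refraction increment

        phase = np.full((64, 64), 1.5)                    # rad, synthetic cell region
        sigma = wavelength * phase / (2 * np.pi * alpha)  # pg/um^2 per pixel

        pixel_area = 0.1 * 0.1                            # um^2, assumed sampling
        print(f"total dry mass = {sigma.sum() * pixel_area:.1f} pg")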

  20. Application of gas chromatography to analysis of spirit-based alcoholic beverages.

    PubMed

    Wiśniewska, Paulina; Śliwińska, Magdalena; Dymerski, Tomasz; Wardencki, Waldemar; Namieśnik, Jacek

    2015-01-01

    Spirit-based beverages are alcoholic drinks; their production processes are dependent on the type and origin of raw materials. The composition of this complex matrix is difficult to analyze, and scientists commonly choose gas chromatography techniques for this reason. With a wide selection of extraction methods and detectors it is possible to provide qualitative and quantitative analysis for many chemical compounds with various functional groups. This article describes different types of gas chromatography techniques and their most commonly used associated extraction techniques (e.g., LLE, SPME, SPE, SFE, and SBME) and detectors (MS, TOFMS, FID, ECD, NPD, AED, O or EPD). Additionally, brief characteristics of internationally popular spirit-based beverages and application of gas chromatography to the analysis of selected alcoholic drinks are presented.

  1. Thermographic Imaging of Material Loss in Boiler Water-Wall Tubing by Application of Scanning Line Source

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2000-01-01

    Localized wall thinning due to corrosion in utility boiler water-wall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proven to be very manpower and time intensive. This has resulted in a spot check approach to inspections, documenting thickness measurements over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in the inspection speed for large structures such as boiler water-walls. A theoretical basis for the technique will be presented which explains the quantitative nature of the technique. Further, a dynamic calibration system will be presented for the technique that allows the extraction of thickness information from the temperature data. Additionally, the results of applying this technology to actual water-wall tubing samples and in situ inspections will be presented.

  2. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions.

    PubMed

    Shenoy, Shailesh M

    2016-07-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity.

  3. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.

  4. Fabrication of type I collagen microcarrier using a microfluidic 3D T-junction device and its application for the quantitative analysis of cell-ECM interactions.

    PubMed

    Yoon, Junghyo; Kim, Jaehoon; Jeong, Hyo Eun; Sudo, Ryo; Park, Myung-Jin; Chung, Seok

    2016-08-26

    We present a new quantitative analysis of cell-extracellular matrix (ECM) interactions, using cell-coated ECM hydrogel microbeads (hydrobeads) made of type I collagen. The hydrobeads can carry cells in three-dimensional spheroidal form with an ECM inside, facilitating direct interaction between the cells and the ECM. The cells on hydrobeads do not develop a hypoxic core, which opens the possibility of using hydrobeads as cell microcarriers for bottom-up tissue reconstitution. This technique can utilize various types of cells, even MDA-MB-231 cells, which have weak cell-cell interactions and do not form spheroids in conventional spheroid culture methods. Morphological indices of the cell-coated hydrobeads visually present cell-ECM interactions in a quantitative manner.

  5. Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis.

    PubMed

    Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L; Hwang, Jae Youn

    2016-12-01

    We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Recently, various mobile devices such as a smartphone have emerged as healthcare tools. They have been applied for the early diagnosis of nonmalignant and malignant skin diseases. Particularly, when they are combined with an advanced optical imaging technique such as multispectral imaging and analysis, it would be beneficial for the early diagnosis of such skin diseases and for further quantitative prognosis monitoring after treatment at home. Thus, we demonstrate here the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis.

  6. Application of several physical techniques in the total analysis of a canine urinary calculus.

    PubMed

    Rodgers, A L; Mezzabotta, M; Mulder, K J; Nassimbeni, L R

    1981-06-01

    A single calculus from the bladder of a Beagle bitch has been analyzed by a multiple technique approach employing x-ray diffraction, infrared spectroscopy, scanning electron microscopy, x-ray fluorescence spectrometry, atomic absorption spectrophotometry and density gradient fractionation. The qualitative and quantitative data obtained showed excellent agreement, lending confidence to such an approach for the evaluation and understanding of stone disease.

  7. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    PubMed

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
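
    Rank agreement of the kind reported is straightforward to compute; the sketch below evaluates Kendall's tau for a quantitative ordering against one reviewer's ordering (the ranks are synthetic, not the study's data).

        # Kendall rank correlation between two orderings of the same images.
        from scipy.stats import kendalltau

        quantitative_rank = [1, 2, 3, 4, 5, 6, 7, 8]
        reviewer_rank     = [1, 3, 2, 4, 5, 7, 6, 8]

        tau, p = kendalltau(quantitative_rank, reviewer_rank)
        print(f"Kendall's tau = {tau:.2f} (p = {p:.3f})")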

  8. A multiple technique approach to the analysis of urinary calculi.

    PubMed

    Rodgers, A L; Nassimbeni, L R; Mulder, K J

    1982-01-01

    Ten urinary calculi have been qualitatively and quantitatively analysed using X-ray diffraction, infra-red, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features which often go undetected due to limitations of the particular analytical procedure being used have been identified, and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected, suggesting that the multiple technique approach might cast some doubt on the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism has been detected. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed, and the multiple technique approach has been evaluated as a whole.

  9. Quantitative impedimetric monitoring of cell migration under the stimulation of cytokine or anti-cancer drug in a microfluidic chip

    PubMed Central

    Xiao, Xia; Lei, Kin Fong; Huang, Chia-Hao

    2015-01-01

    Cell migration is a cellular response involved in various biological processes such as cancer metastasis, which is the primary cause of death for cancer patients. Quantitative investigation of the correlation between cell migration and extracellular stimulation is essential for developing effective therapeutic strategies for controlling invasive cancer cells. The conventional method of determining cell migration rate, based on comparison of successive images, may not be an objective approach. In this work, a microfluidic chip embedded with measurement electrodes has been developed to quantitatively monitor cell migration activity based on the impedimetric measurement technique. A no-damage wound was constructed using a microfluidic phenomenon, and cell migration activity under the stimulation of a cytokine and an anti-cancer drug, i.e., interleukin-6 and doxorubicin, respectively, was investigated. Impedance measurement was performed concurrently during the cell migration process. The impedance change was directly correlated with the cell migration activity; therefore, the migration rate could be calculated. In addition, a good match was found between impedance measurement and conventional imaging analysis, but the impedimetric technique provides an objective and quantitative measurement. Based on our technique, cell migration rates were calculated to be 8.5, 19.1, and 34.9 μm/h under the stimulation of cytokine at concentrations of 0 (control), 5, and 10 ng/ml. This technique has high potential to be developed into a powerful analytical platform for cancer research. PMID:26180566

  10. The simultaneous quantitation of ten amino acids in soil extracts by mass fragmentography

    NASA Technical Reports Server (NTRS)

    Pereira, W. E.; Hoyano, Y.; Reynolds, W. E.; Summons, R. E.; Duffield, A. M.

    1972-01-01

    A specific and sensitive method for the identification and simultaneous quantitation by mass fragmentography of ten of the amino acids present in soil was developed. The technique uses a computer-driven quadrupole mass spectrometer, and a commercial preparation of deuterated amino acids is used as the internal standard for quantitation. The results obtained are comparable with those from an amino acid analyzer. In the quadrupole mass spectrometer-computer system, up to 25 pre-selected ions may be monitored sequentially. This allows a maximum of 12 different amino acids (one specific ion in each of the undeuterated and deuterated amino acid spectra) to be quantitated. The method is relatively rapid (analysis time of approximately one hour) and is capable of quantitating nanogram quantities of amino acids.

  11. Analytical Chemistry and the Microchip.

    ERIC Educational Resources Information Center

    Lowry, Robert K.

    1986-01-01

    Analytical techniques used at various points in making microchips are described. They include: Fourier transform infrared spectrometry (silicon purity); optical emission spectroscopy (quantitative thin-film composition); X-ray photoelectron spectroscopy (chemical changes in thin films); wet chemistry, instrumental analysis (process chemicals);…

  12. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recent published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  13. Quantitative twoplex glycan analysis using 12C6 and 13C6 stable isotope 2-aminobenzoic acid labelling and capillary electrophoresis mass spectrometry.

    PubMed

    Váradi, Csaba; Mittermayr, Stefan; Millán-Martín, Silvia; Bones, Jonathan

    2016-12-01

    Capillary electrophoresis (CE) offers excellent efficiency and orthogonality to liquid chromatographic (LC) separations for oligosaccharide structural analysis. Combination of CE with high resolution mass spectrometry (MS) for glycan analysis remains a challenging task due to the MS incompatibility of background electrolyte buffers and additives commonly used in offline CE separations. Here, a novel method is presented for the analysis of 2-aminobenzoic acid (2-AA) labelled glycans by capillary electrophoresis coupled to mass spectrometry (CE-MS). To ensure maximum resolution and excellent precision without the requirement for excessive analysis times, CE separation conditions including the concentration and pH of the background electrolyte, the effect of applied pressure on the capillary inlet and the capillary length were evaluated. Using readily available 12C6/13C6 stable isotopologues of 2-AA, the developed method can be applied for quantitative glycan profiling in a twoplex manner based on the generation of extracted ion electropherograms (EIE) for 12C6 'light' and 13C6 'heavy' 2-AA labelled glycan isotope clusters. The twoplex quantitative CE-MS glycan analysis platform is ideally suited for comparability assessment of biopharmaceuticals, such as monoclonal antibodies, for differential glycomic analysis of clinical material for potential biomarker discovery or for quantitative microheterogeneity analysis of different glycosylation sites within a glycoprotein. Additionally, due to the low injection volume requirements of CE, subsequent LC-MS analysis of the same sample can be performed, facilitating the use of orthogonal separation techniques for structural elucidation or verification of quantitative performance.

  14. Utility of correlation techniques in gravity and magnetic interpretation

    NASA Technical Reports Server (NTRS)

    Chandler, V. W.; Koski, J. S.; Braile, L. W.; Hinze, W. J.

    1977-01-01

    Two methods of quantitative combined analysis, internal correspondence and clustering, are presented. Model studies are used to illustrate implementation and interpretation procedures of these methods, particularly internal correspondence. Analysis of the results of applying these methods to data from the midcontinent and a transcontinental profile shows that they can be useful in identifying crustal provinces, providing information on horizontal and vertical variations of physical properties over province-size zones, validating long-wavelength anomalies, and isolating geomagnetic field removal problems. Thus, these techniques are useful in considering regional data acquired by satellites.

  15. Computer-Assisted Digital Image Analysis of Plus Disease in Retinopathy of Prematurity.

    PubMed

    Kemp, Pavlina S; VanderVeen, Deborah K

    2016-01-01

    The objective of this study is to review the current state and role of computer-assisted analysis in diagnosis of plus disease in retinopathy of prematurity. Diagnosis and documentation of retinopathy of prematurity are increasingly being supplemented by digital imaging. The incorporation of computer-aided techniques has the potential to add valuable information and standardization regarding the presence of plus disease, an important criterion in deciding the necessity of treatment of vision-threatening retinopathy of prematurity. A review of literature found that several techniques have been published examining the process and role of computer aided analysis of plus disease in retinopathy of prematurity. These techniques use semiautomated image analysis techniques to evaluate retinal vascular dilation and tortuosity, using calculated parameters to evaluate presence or absence of plus disease. These values are then compared with expert consensus. The study concludes that computer-aided image analysis has the potential to use quantitative and objective criteria to act as a supplemental tool in evaluating for plus disease in the setting of retinopathy of prematurity.

  16. Quantitative analysis of cardiovascular MR images.

    PubMed

    van der Geest, R J; de Roos, A; van der Wall, E E; Reiber, J H

    1997-06-01

    The diagnosis of cardiovascular disease requires the precise assessment of both morphology and function. Nearly all aspects of cardiovascular function and flow can nowadays be quantified with fast magnetic resonance (MR) imaging techniques. Conventional and breath-hold cine MR imaging allow the precise and highly reproducible assessment of global and regional left ventricular function. During the same examination, velocity-encoded cine (VEC) MR imaging provides measurements of blood flow in the heart and great vessels. Quantitative image analysis often still relies on manual tracing of contours in the images. Reliable automated or semi-automated image analysis software would be very helpful for overcoming the limitations associated with the manual and tedious processing of the images. Recent progress in MR imaging of the coronary arteries and myocardial perfusion imaging with contrast media, along with the further development of faster imaging sequences, suggests that MR imaging could evolve into a single technique ('one stop shop') for the evaluation of many aspects of heart disease. As a result, it is very likely that the need for automated image segmentation and analysis software algorithms will further increase. In this paper, developments directed towards automated image analysis and semi-automated contour detection for cardiovascular MR imaging are presented.

  17. The Design of a Quantitative Western Blot Experiment

    PubMed Central

    Taylor, Sean C.; Posch, Anton

    2014-01-01

    Western blotting is a technique that has been in practice for more than three decades and that began as a means of detecting a protein target in a complex sample. Although there have been significant advances in both the imaging and reagent technologies to improve sensitivity, dynamic range of detection, and the applicability of multiplexed target detection, the basic technique has remained essentially unchanged. In the past, western blotting was used simply to detect a specific target protein in a complex mixture, but now journal editors and reviewers are requesting the quantitative interpretation of western blot data in terms of fold changes in protein expression between samples. The calculations are based on the differential densitometry of the associated chemiluminescent and/or fluorescent signals from the blots, and this requires a fundamental shift in the experimental methodology, acquisition, and interpretation of the data. We have recently published an updated approach to produce quantitative densitometric data from western blots (Taylor et al., 2013), and here we summarize the complete western blot workflow with a focus on sample preparation and data analysis for quantitative western blotting. PMID:24738055
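
    The core densitometric calculation is a per-lane normalization to a loading control followed by expression relative to a reference lane. A minimal sketch (band volumes are hypothetical; this illustrates the arithmetic, not the authors' full workflow):

      import numpy as np

      def fold_change(target, loading_control):
          # Normalize target-band densitometry to a loading control per lane,
          # then express each lane relative to lane 0 (the control sample).
          norm = np.asarray(target, float) / np.asarray(loading_control, float)
          return norm / norm[0]

      # Hypothetical background-subtracted band volumes from blot imaging software.
      target_signal = [1.2e6, 2.9e6, 3.4e6]    # protein of interest, 3 samples
      control_signal = [5.0e6, 5.4e6, 4.8e6]   # e.g., total protein stain
      print(fold_change(target_signal, control_signal))  # relative expression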

  18. Quantitative high-speed laryngoscopic analysis of vocal fold vibration in fatigued voice of young karaoke singers.

    PubMed

    Yiu, Edwin M-L; Wang, Gaowu; Lo, Andy C Y; Chan, Karen M-K; Ma, Estella P-M; Kong, Jiangping; Barrett, Elizabeth Ann

    2013-11-01

    The present study aimed to determine whether there are physiological differences in vocal fold vibration between nonfatigued and fatigued voices, using high-speed laryngoscopic imaging and quantitative analysis. Twenty participants aged 18 to 23 years (mean, 21.2 years; standard deviation, 1.3 years) with normal voice were recruited, and vocal fatigue was induced with an extended singing task. High-speed laryngoscopic image recordings of /i/ phonation were taken before and after the singing task. The laryngoscopic images were semiautomatically analyzed with a quantitative high-speed video processing program to extract indices related to the anteroposterior dimension (length), the transverse dimension (width), and the speed of opening and closing. A significant reduction in the glottal length-to-width ratio index was found after vocal fatigue. Physiologically, this indicated either a significantly shorter (anteroposteriorly) or a wider (transversely) glottis after vocal fatigue. The high-speed imaging technique with quantitative analysis has the potential for early identification of vocally fatigued voice. Copyright © 2013 The Voice Foundation. All rights reserved.

  19. A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers

    PubMed Central

    Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.

    2016-01-01

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β‐O‐4 linkages. Current thioacidolysis methods are low‐throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non‐chlorinated organic solvent and is tailored for higher‐throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1–2 mg of biomass per assay and has been quantified using fast‐GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day‐to‐day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715

  20. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  1. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE PAGES

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...

    2016-09-14

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  2. Web-Based Essay Critiquing System and EFL Students' Writing: A Quantitative and Qualitative Investigation

    ERIC Educational Resources Information Center

    Lee, Cynthia; Wong, Kelvin C. K.; Cheung, William K.; Lee, Fion S. L.

    2009-01-01

    The paper first describes a web-based essay critiquing system developed by the authors using latent semantic analysis (LSA), an automatic text analysis technique, to provide students with immediate feedback on content and organisation for revision whenever there is an internet connection. It reports on its effectiveness in enhancing adult EFL…

  3. Physical and Cognitive-Affective Factors Associated with Fatigue in Individuals with Fibromyalgia: A Multiple Regression Analysis

    ERIC Educational Resources Information Center

    Muller, Veronica; Brooks, Jessica; Tu, Wei-Mo; Moser, Erin; Lo, Chu-Ling; Chan, Fong

    2015-01-01

    Purpose: The main objective of this study was to determine the extent to which physical and cognitive-affective factors are associated with fibromyalgia (FM) fatigue. Method: A quantitative descriptive design using correlation techniques and multiple regression analysis. The participants consisted of 302 members of the National Fibromyalgia &…

  4. Potable water taste enhancement

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An analysis was conducted to determine the causes of and remedies for the unpalatability of potable water in manned spacecraft. Criteria and specifications for palatable water were established, and a quantitative laboratory analysis technique was developed for determining the amounts of volatile organics in good-tasting water. Prototype spacecraft water reclamation systems are evaluated in terms of the essential palatability factors.

  5. The new numerology of immunity mediated by virus-specific CD8(+) T cells.

    PubMed

    Doherty, P C

    1998-08-01

    Our understanding of virus-specific CD8(+) T cell responses is currently being revolutionized by peptide-based assay systems that allow flow cytometric analysis of effector and memory cytotoxic T lymphocyte populations. These techniques are, for the first time, putting the analysis of T-cell-mediated immunity on a quantitative basis.

  6. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    PubMed

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and are especially useful for samples that demand in vitro labeling. Due to the diversity of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator: a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging-based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.
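
    The final ratio-deduction step can be illustrated with a minimal sketch (not MilQuant's actual algorithm; the intensities are hypothetical and assumed already channel-normalized run-wide):

      import numpy as np

      # Hypothetical reporter-ion intensities for one protein after run-wide
      # channel normalization: rows are PSMs (peptide-spectrum matches),
      # columns are isobaric channels.
      psms = np.array([
          [1.0e4, 2.1e4, 0.9e4],
          [0.8e4, 1.7e4, 0.8e4],
          [1.2e4, 2.6e4, 1.1e4],
      ])

      # Deduce the protein's relative abundance: the median of PSM-level ratios
      # is a simple, outlier-robust summary used by many quantitation pipelines.
      ratios = psms[:, 1] / psms[:, 0]
      print(f"channel 1 vs channel 0: {np.median(ratios):.2f}")  # ~2.1-fold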

  7. Multiplex, quantitative cellular analysis in large tissue volumes with clearing-enhanced 3D microscopy (Ce3D)

    PubMed Central

    Li, Weizhe; Germain, Ronald N.

    2017-01-01

    Organ homeostasis, cellular differentiation, signal relay, and in situ function all depend on the spatial organization of cells in complex tissues. For this reason, comprehensive, high-resolution mapping of cell positioning, phenotypic identity, and functional state in the context of macroscale tissue structure is critical to a deeper understanding of diverse biological processes. Here we report an easy-to-use method, clearing-enhanced 3D (Ce3D), which generates excellent tissue transparency for most organs, preserves cellular morphology and protein fluorescence, and is robustly compatible with antibody-based immunolabeling. This enhanced signal quality and capacity for extensive probe multiplexing permits quantitative analysis of distinct, highly intermixed cell populations in intact Ce3D-treated tissues via 3D histo-cytometry. We use this technology to demonstrate large-volume, high-resolution microscopy of diverse cell types in lymphoid and nonlymphoid organs, as well as to perform quantitative analysis of the composition and tissue distribution of multiple cell populations in lymphoid tissues. Combined with histo-cytometry, Ce3D provides a comprehensive strategy for volumetric quantitative imaging and analysis that bridges the gap between conventional section imaging and disassociation-based techniques. PMID:28808033

  8. Quantitative analysis of sitagliptin using the (19)F-NMR method: a universal technique for fluorinated compound detection.

    PubMed

    Zhang, Fen-Fen; Jiang, Meng-Hong; Sun, Lin-Lin; Zheng, Feng; Dong, Lei; Shah, Vishva; Shen, Wen-Bin; Ding, Ya

    2015-01-07

    To expand the application scope of nuclear magnetic resonance (NMR) technology in the quantitative analysis of pharmaceutical ingredients, (19)F nuclear magnetic resonance ((19)F-NMR) spectroscopy has been employed as a simple, rapid, and reproducible approach for the detection of a fluorine-containing model drug, sitagliptin phosphate monohydrate (STG). Ciprofloxacin (Cipro) was used as the internal standard (IS). Influential factors, including the relaxation delay time (d1) and the pulse angle, impacting the accuracy and precision of the spectral data were systematically optimized. Method validation was carried out in terms of precision and intermediate precision, linearity, limit of detection (LOD) and limit of quantification (LOQ), robustness, and stability. To validate the reliability and feasibility of (19)F-NMR technology in the quantitative analysis of pharmaceutical analytes, the assay result was compared with that of (1)H-NMR. The statistical F-test and Student's t-test at the 95% confidence level indicate that there is no significant difference between these two methods. Due to the advantages of (19)F-NMR, such as higher resolution and suitability for biological samples, it can be used as a universal technology for the quantitative analysis of other fluorine-containing pharmaceuticals and analytes.
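
    Internal-standard qNMR quantitation rests on the relation C_analyte = (I_analyte / I_IS) x (N_IS / N_analyte) x C_IS, where I is the integrated signal area and N the number of equivalent nuclei behind each signal. A minimal sketch with illustrative numbers (not values from the study):

      def qnmr_concentration(i_analyte, i_is, n_analyte, n_is, c_is):
          # Internal-standard qNMR: analyte concentration from the integral ratio.
          # i_*: integrated signal areas; n_*: equivalent nuclei per signal;
          # c_is: internal-standard concentration.
          return (i_analyte / i_is) * (n_is / n_analyte) * c_is

      # Hypothetical integrals from a 19F spectrum (values for illustration only).
      c_stg = qnmr_concentration(i_analyte=2.95, i_is=1.00,
                                 n_analyte=3, n_is=1, c_is=10.0)  # mM
      print(f"STG concentration: {c_stg:.2f} mM")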

  9. Systems Biology, Neuroimaging, Neuropsychology, Neuroconnectivity and Traumatic Brain Injury

    PubMed Central

    Bigler, Erin D.

    2016-01-01

    The patient who sustains a traumatic brain injury (TBI) typically undergoes neuroimaging studies, usually in the form of computed tomography (CT) and magnetic resonance imaging (MRI). In most cases the neuroimaging findings are clinically assessed with descriptive statements that provide qualitative information about the presence/absence of visually identifiable abnormalities, though little if any of the potential information in a scan is analyzed in any quantitative manner, except in research settings. Fortunately, major advances have been made, especially during the last decade, with regard to image quantification techniques, especially those that involve automated image analysis methods. This review argues that a systems biology approach to understanding quantitative neuroimaging findings in TBI provides an appropriate framework for better utilizing the information derived from quantitative neuroimaging and its relation with neuropsychological outcome. Different image analysis methods are reviewed in an attempt to integrate quantitative neuroimaging methods with neuropsychological outcome measures and to illustrate how different neuroimaging techniques tap different aspects of TBI-related neuropathology. Likewise, how different neuropathologies may relate to neuropsychological outcome is explored by examining how damage influences brain connectivity and neural networks. Emphasis is placed on the dynamic changes that occur following TBI and how best to capture those pathologies via different neuroimaging methods. However, traditional clinical neuropsychological techniques are not well suited for interpretation based on contemporary and advanced neuroimaging methods and network analyses. Significant improvements need to be made in the cognitive and behavioral assessment of the brain-injured individual to better interface with advances in neuroimaging-based network analyses. Viewing both neuroimaging and neuropsychological processes within a systems biology perspective could represent a significant advancement for the field. PMID:27555810

  10. Biomedical application of MALDI mass spectrometry for small-molecule analysis.

    PubMed

    van Kampen, Jeroen J A; Burgers, Peter C; de Groot, Ronald; Gruters, Rob A; Luider, Theo M

    2011-01-01

    Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS) is an emerging analytical tool for the analysis of molecules with molar masses below 1,000 Da; that is, small molecules. This technique offers rapid analysis, high sensitivity, low sample consumption, a relatively high tolerance towards salts and buffers, and the possibility of storing samples on the target plate. The successful application of the technique is, however, hampered by low molecular weight (LMW) matrix-derived interference signals and by poor reproducibility of signal intensities during quantitative analyses. In this review, we focus on the biomedical application of MALDI-MS for the analysis of small molecules and discuss its favorable properties and its challenges as well as strategies to improve the performance of the technique. Furthermore, practical aspects and applications are presented. © 2010 Wiley Periodicals, Inc.

  11. Characterization of Minerals of Geochronological Interest by EPMA and Atom Probe Tomography

    NASA Astrophysics Data System (ADS)

    Snoeyenbos, D.; Jercinovic, M. J.; Reinhard, D. A.; Hombourger, C.

    2012-12-01

    Isotopic and chemical dating techniques for zircon and monazite rely on several assumptions: that initial common Pb is low to nonexistent, that the analyzed domain is chronologically homogeneous, and that any relative migration of radiogenic Pb and its parent isotopes has not exceeded the analyzed domain. Yet both zircon and monazite commonly contain significant submicron heterogeneities that may challenge these assumptions and can complicate the interpretation of chemical and isotopic data. Compositional mapping and submicron quantitative analysis by EPMA and FE-EPMA have been found to be useful techniques both for the characterization of these heterogeneities and for quantitative geochronological determinations, within the analytical limits of these techniques and the statistics of submicron sampling. Complementary to high-resolution EPMA techniques is Atom Probe Tomography (APT), wherein a specimen with dimensions of a few hundred nanometers is field evaporated atom by atom. The original position of each atom is identified, along with its atomic species and isotope. The result is a reconstruction allowing quantitative three-dimensional study of the specimen at the atomic scale, with low detection limits and high mass resolution. With the introduction of laser-induced thermal pulsing to achieve field evaporation, the technique is no longer limited to conductive specimens, making it possible to explore the compositional and isotopic structure of insulating materials at sub-nanometer resolution. Minerals of geochronological interest have been studied by an analytical method involving first compositional mapping and submicron quantitative analysis by EPMA and FE-EPMA, and subsequent use of these data to select specific sites for APT specimen extraction by FIB. Examples presented include 1) zircon from the Taconian of New England, USA, containing a fossil resorption front preserved between an unmodified igneous core and a subsequent metamorphic overgrowth, with significant redistribution of U, Th, P and Y along microfracture arrays extending into the overgrowth, and 2) Paleoproterozoic monazite in thin bands <1 μm wide along cleavage planes within much older (Neoarchean) monazite from the Boothia mainland of the Western Churchill Province, Canada.

  12. Nanoscale deformation analysis with high-resolution transmission electron microscopy and digital image correlation

    DOE PAGES

    Wang, Xueju; Pan, Zhipeng; Fan, Feifei; ...

    2015-09-10

    We present an application of the digital image correlation (DIC) method to high-resolution transmission electron microscopy (HRTEM) images for nanoscale deformation analysis. The combination of DIC and HRTEM offers both the ultrahigh spatial resolution and high displacement detection sensitivity that are not possible with other microscope-based DIC techniques. We demonstrate the accuracy and utility of the HRTEM-DIC technique through displacement and strain analysis on amorphous silicon. Two types of error sources resulting from the transmission electron microscopy (TEM) image noise and electromagnetic-lens distortions are quantitatively investigated via rigid-body translation experiments. The local and global DIC approaches are applied for the analysis of diffusion- and reaction-induced deformation fields in electrochemically lithiated amorphous silicon. As a result, the DIC technique coupled with HRTEM provides a new avenue for the deformation analysis of materials at the nanometer length scales.
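
    A rigid-body translation check of the kind described can be sketched with subpixel phase correlation (a simplified stand-in for full-field DIC, using scikit-image; the applied shift is arbitrary):

      import numpy as np
      from scipy.ndimage import shift as nd_shift
      from skimage.registration import phase_cross_correlation

      # Simulated "HRTEM" frame and a rigid-body-translated copy; in a DIC
      # validation experiment, recovered vs. applied shift quantifies error.
      rng = np.random.default_rng(0)
      ref = rng.random((256, 256))
      moved = nd_shift(ref, shift=(3.4, -1.7), order=3, mode="wrap")

      detected, error, _ = phase_cross_correlation(ref, moved, upsample_factor=100)
      # skimage returns the shift that registers the moving image onto the
      # reference, i.e. the negative of the applied translation.
      print(f"applied (3.4, -1.7); detected {tuple(np.round(-detected, 2))}")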

  13. A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis

    PubMed Central

    Padula, Matthew P.; Berry, Iain J.; O′Rourke, Matthew B.; Raymond, Benjamin B.A.; Santos, Jerran; Djordjevic, Steven P.

    2017-01-01

    Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single 'spots' in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the 'Top-Down'. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer. PMID:28387712

  14. A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis.

    PubMed

    Padula, Matthew P; Berry, Iain J; O Rourke, Matthew B; Raymond, Benjamin B A; Santos, Jerran; Djordjevic, Steven P

    2017-04-07

    Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single 'spots' in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the 'Top-Down'. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer.

  15. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment.

    PubMed

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

    This study presents the indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. The results were subjected to an analytic hierarchy process to obtain weights for the indicators and the proxy variables. These weights allow the results of indicator-level quantitative assessments to be integrated without resorting to qualitative judgments, as would otherwise be necessary in the absence of inter-indicator weights. This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for the quantitative assessment of green chemistry technologies.
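
    The analytic hierarchy process step can be sketched as extracting weights from a pairwise-comparison matrix via its principal eigenvector, with a consistency check (the matrix below is hypothetical, not the study's expert data):

      import numpy as np

      # Hypothetical pairwise-comparison matrix (Saaty 1-9 scale) for three
      # assessment indicators; entry [i, j] is indicator i's importance over j.
      A = np.array([
          [1.0, 3.0, 5.0],
          [1/3, 1.0, 3.0],
          [1/5, 1/3, 1.0],
      ])

      # AHP weights: the principal eigenvector of A, normalized to sum to 1.
      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()

      # Consistency index CI = (lambda_max - n) / (n - 1); divide by the random
      # index (RI = 0.58 for n = 3) to obtain the consistency ratio.
      n = A.shape[0]
      ci = (eigvals[k].real - n) / (n - 1)
      print(weights, f"CR = {ci / 0.58:.3f}")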

  16. Current and evolving echocardiographic techniques for the quantitative evaluation of cardiac mechanics: ASE/EAE consensus statement on methodology and indications endorsed by the Japanese Society of Echocardiography.

    PubMed

    Mor-Avi, Victor; Lang, Roberto M; Badano, Luigi P; Belohlavek, Marek; Cardim, Nuno Miguel; Derumeaux, Genevieve; Galderisi, Maurizio; Marwick, Thomas; Nagueh, Sherif F; Sengupta, Partho P; Sicari, Rosa; Smiseth, Otto A; Smulevitz, Beverly; Takeuchi, Masaaki; Thomas, James D; Vannan, Mani; Voigt, Jens-Uwe; Zamorano, Jose Luis

    2011-03-01

    Echocardiographic imaging is ideally suited for the evaluation of cardiac mechanics because of its intrinsically dynamic nature. Because for decades, echocardiography has been the only imaging modality that allows dynamic imaging of the heart, it is only natural that new, increasingly automated techniques for sophisticated analysis of cardiac mechanics have been driven by researchers and manufacturers of ultrasound imaging equipment. Several such techniques have emerged over the past decades to address the issue of reader's experience and inter-measurement variability in interpretation. Some were widely embraced by echocardiographers around the world and became part of the clinical routine, whereas others remained limited to research and exploration of new clinical applications. Two such techniques have dominated the research arena of echocardiography: (1) Doppler-based tissue velocity measurements, frequently referred to as tissue Doppler or myocardial Doppler, and (2) speckle tracking on the basis of displacement measurements. Both types of measurements lend themselves to the derivation of multiple parameters of myocardial function. The goal of this document is to focus on the currently available techniques that allow quantitative assessment of myocardial function via image-based analysis of local myocardial dynamics, including Doppler tissue imaging and speckle-tracking echocardiography, as well as integrated backscatter analysis. This document describes the current and potential clinical applications of these techniques and their strengths and weaknesses, briefly surveys a selection of the relevant published literature while highlighting normal and abnormal findings in the context of different cardiovascular pathologies, and summarizes the unresolved issues, future research priorities, and recommended indications for clinical use.

  17. Current and evolving echocardiographic techniques for the quantitative evaluation of cardiac mechanics: ASE/EAE consensus statement on methodology and indications endorsed by the Japanese Society of Echocardiography.

    PubMed

    Mor-Avi, Victor; Lang, Roberto M; Badano, Luigi P; Belohlavek, Marek; Cardim, Nuno Miguel; Derumeaux, Geneviève; Galderisi, Maurizio; Marwick, Thomas; Nagueh, Sherif F; Sengupta, Partho P; Sicari, Rosa; Smiseth, Otto A; Smulevitz, Beverly; Takeuchi, Masaaki; Thomas, James D; Vannan, Mani; Voigt, Jens-Uwe; Zamorano, José Luis

    2011-03-01

    Echocardiographic imaging is ideally suited for the evaluation of cardiac mechanics because of its intrinsically dynamic nature. Because for decades, echocardiography has been the only imaging modality that allows dynamic imaging of the heart, it is only natural that new, increasingly automated techniques for sophisticated analysis of cardiac mechanics have been driven by researchers and manufacturers of ultrasound imaging equipment. Several such techniques have emerged over the past decades to address the issue of reader's experience and inter-measurement variability in interpretation. Some were widely embraced by echocardiographers around the world and became part of the clinical routine, whereas others remained limited to research and exploration of new clinical applications. Two such techniques have dominated the research arena of echocardiography: (1) Doppler-based tissue velocity measurements, frequently referred to as tissue Doppler or myocardial Doppler, and (2) speckle tracking on the basis of displacement measurements. Both types of measurements lend themselves to the derivation of multiple parameters of myocardial function. The goal of this document is to focus on the currently available techniques that allow quantitative assessment of myocardial function via image-based analysis of local myocardial dynamics, including Doppler tissue imaging and speckle-tracking echocardiography, as well as integrated backscatter analysis. This document describes the current and potential clinical applications of these techniques and their strengths and weaknesses, briefly surveys a selection of the relevant published literature while highlighting normal and abnormal findings in the context of different cardiovascular pathologies, and summarizes the unresolved issues, future research priorities, and recommended indications for clinical use.

  18. A comprehensive evaluation of various sensitivity analysis methods: A case study with a hydrological model

    DOE PAGES

    Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...

    2014-01-01

    Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be not effective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400–600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effect and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, a minimum of 1050 samples is needed to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
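
    The elementary-effects idea behind MOAT screening can be sketched on a toy model (a simplified one-at-a-time variant on the unit hypercube, not the PSUADE implementation):

      import numpy as np

      def model(x):
          # Toy stand-in for a hydrological model: x1 dominates, x3 is inert.
          return 4.0 * x[0] + x[1] ** 2 + 0.0 * x[2]

      def moat_screening(model, dim, trajectories=20, delta=0.25, seed=1):
          # One-at-a-time screening: perturb each factor from a random base point
          # and record the elementary effect; mu* (mean absolute effect) ranks
          # factor importance.
          rng = np.random.default_rng(seed)
          effects = np.zeros((trajectories, dim))
          for t in range(trajectories):
              x = rng.uniform(0, 1 - delta, size=dim)
              y0 = model(x)
              for i in rng.permutation(dim):
                  x_step = x.copy()
                  x_step[i] += delta
                  effects[t, i] = (model(x_step) - y0) / delta
          return np.mean(np.abs(effects), axis=0)   # mu*

      print(moat_screening(model, dim=3))  # large for x1, moderate x2, ~0 for x3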

  19. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    NASA Astrophysics Data System (ADS)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered because of the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks with those perceived by pilots. This method is an ethically appealing alternative to the collision-based analysis for fast, reliable and effective safety assessment, thus possessing great potential for managing collision risks in port waters.

  20. Accuracy Enhancement of Raman Spectroscopy Using Complementary Laser-Induced Breakdown Spectroscopy (LIBS) with Geologically Mixed Samples.

    PubMed

    Choi, Soojin; Kim, Dongyoung; Yang, Junho; Yoh, Jack J

    2017-04-01

    Quantitative Raman analysis was carried out with geologically mixed samples that have various matrices. In order to compensate for the matrix effect in the Raman shift, laser-induced breakdown spectroscopy (LIBS) analysis was performed. Raman spectroscopy revealed the geological materials contained in the mixed samples. However, the analysis of a mixture containing different matrices was inaccurate due to the weak Raman signal, interference, and the strong matrix effect. On the other hand, the LIBS quantitative analysis of atomic carbon and calcium in the mixed samples showed high accuracy. In the case of the calcite and gypsum mixture, the coefficient of determination for atomic carbon using LIBS was 0.99, while that using Raman was less than 0.9. Therefore, the geological composition of the mixed samples is first obtained using Raman, and the LIBS-based quantitative analysis is then applied to the Raman outcome in order to construct highly accurate univariate calibration curves. The study also focuses on a method to overcome matrix effects through the two complementary spectroscopic techniques of Raman spectroscopy and LIBS.
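
    The univariate calibration step is ordinary least squares of signal against known concentration, with the coefficient of determination as the accuracy metric. A minimal sketch with invented calibration data:

      import numpy as np

      # Hypothetical LIBS calibration: carbon emission-line intensity vs. known
      # carbon content in mixed standards (values for illustration only).
      conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0])          # wt% C
      signal = np.array([0.02, 0.41, 0.83, 1.18, 1.62])   # line intensity, a.u.

      slope, intercept = np.polyfit(conc, signal, 1)
      pred = slope * conc + intercept
      r2 = 1 - np.sum((signal - pred) ** 2) / np.sum((signal - signal.mean()) ** 2)
      print(f"signal = {slope:.3f}*C + {intercept:.3f}, R^2 = {r2:.4f}")

      # Inverting the curve quantifies carbon in an unknown sample:
      print(f"unknown at 0.95 a.u. -> {(0.95 - intercept) / slope:.2f} wt% C")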

  1. Early Oscillation Detection for Hybrid DC/DC Converter Fault Diagnosis

    NASA Technical Reports Server (NTRS)

    Wang, Bright L.

    2011-01-01

    This paper describes a novel fault detection technique for hybrid DC/DC converter oscillation diagnosis. The technique is based on principles of feedback control loop oscillation and RF signal modulation, and is realized using signal spectral analysis. Real-circuit simulation and analytical study reveal critical factors of the oscillation and indicate significant correlations between the spectral analysis method and the gain/phase margin method. A stability diagnosis index (SDI) is developed as a quantitative measure to accurately assign a degree of stability to the DC/DC converter. This technique is capable of detecting oscillation at an early stage without interfering with the DC/DC converter's normal operation and without the limitations of probing the converter.
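
    The spectral-analysis part of such a diagnosis can be sketched as locating anomalous low-frequency peaks in the converter output spectrum (a simplified illustration with a simulated waveform; the SDI itself is the paper's own metric and is not reproduced here):

      import numpy as np
      from scipy.signal import welch, find_peaks

      # Simulated converter output: switching ripple plus a small low-frequency
      # component such as a marginally stable feedback loop might produce
      # (all values illustrative).
      fs = 1_000_000                                  # sample rate, Hz
      t = np.arange(0, 0.05, 1 / fs)
      v = (0.01 * np.sin(2 * np.pi * 200e3 * t)       # 200 kHz switching ripple
           + 0.004 * np.sin(2 * np.pi * 12e3 * t)     # 12 kHz loop oscillation
           + 0.002 * np.random.randn(t.size))         # measurement noise

      f, psd = welch(v, fs=fs, nperseg=8192)
      peaks, _ = find_peaks(psd, height=20 * np.median(psd))
      print("spectral peaks at (kHz):", np.round(f[peaks] / 1e3, 1))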

  2. Quantitative mass spectrometry: an overview

    NASA Astrophysics Data System (ADS)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  3. Comparison of intraoral scanning and conventional impression techniques using 3-dimensional superimposition.

    PubMed

    Rhee, Ye-Kyu; Huh, Yoon-Hyuk; Cho, Lee-Ra; Park, Chan-Jin

    2015-12-01

    The aim of this study was to evaluate the appropriate impression technique by analyzing the superimposition of 3D digital models, comparing the accuracy of conventional and digital impression techniques. Twenty-four patients who had no periodontitis or temporomandibular joint disease were selected for analysis. As the reference model, digital impressions were taken with a digital impression system. As test models, dual-arch and full-arch conventional impression techniques utilizing addition-type polyvinylsiloxane were used to fabricate casts, and a 3D laser scanner was used to scan the casts. Three pairs of the 25 STL datasets were imported into the inspection software. The three-dimensional differences were illustrated in a color-coded map. For three-dimensional quantitative analysis, 4 specified contact locations (the buccal and lingual cusps of the second premolar and second molar) were established. For two-dimensional quantitative analysis, sections from the buccal cusp to the lingual cusp of the second premolar and second molar were acquired along the tooth axis. In the color-coded map, the biggest difference was seen between intraoral scanning and dual-arch impression (P<.05). In the three-dimensional analysis, the biggest difference was seen between intraoral scanning and dual-arch impression and the smallest difference was seen between dual-arch and full-arch impression. The two- and three-dimensional deviations between the intraoral scanner and dual-arch impression were bigger than those between full-arch and dual-arch impression (P<.05). The second premolar showed bigger three-dimensional deviations than the second molar (P>.05).

  4. Comparison of intraoral scanning and conventional impression techniques using 3-dimensional superimposition

    PubMed Central

    Rhee, Ye-Kyu

    2015-01-01

    PURPOSE The aim of this study was to evaluate the appropriate impression technique by analyzing the superimposition of 3D digital models, comparing the accuracy of conventional and digital impression techniques. MATERIALS AND METHODS Twenty-four patients who had no periodontitis or temporomandibular joint disease were selected for analysis. As the reference model, digital impressions were taken with a digital impression system. As test models, dual-arch and full-arch conventional impression techniques utilizing addition-type polyvinylsiloxane were used to fabricate casts, and a 3D laser scanner was used to scan the casts. Three pairs of the 25 STL datasets were imported into the inspection software. The three-dimensional differences were illustrated in a color-coded map. For three-dimensional quantitative analysis, 4 specified contact locations (the buccal and lingual cusps of the second premolar and second molar) were established. For two-dimensional quantitative analysis, sections from the buccal cusp to the lingual cusp of the second premolar and second molar were acquired along the tooth axis. RESULTS In the color-coded map, the biggest difference was seen between intraoral scanning and dual-arch impression (P<.05). In the three-dimensional analysis, the biggest difference was seen between intraoral scanning and dual-arch impression and the smallest difference was seen between dual-arch and full-arch impression. CONCLUSION The two- and three-dimensional deviations between the intraoral scanner and dual-arch impression were bigger than those between full-arch and dual-arch impression (P<.05). The second premolar showed bigger three-dimensional deviations than the second molar (P>.05). PMID:26816576

  5. Radiomic analysis in prediction of Human Papilloma Virus status.

    PubMed

    Yu, Kaixian; Zhang, Youyi; Yu, Yang; Huang, Chao; Liu, Rongjie; Li, Tengfei; Yang, Liuqing; Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Zhu, Hongtu

    2017-12-01

    Human Papilloma Virus (HPV) has been associated with oropharyngeal cancer prognosis. Traditionally, HPV status has been tested through an invasive lab test. Recently, the rapid development of statistical image analysis techniques has enabled precise quantitative analysis of medical images. The quantitative analysis of Computed Tomography (CT) provides a non-invasive way to assess HPV status for oropharynx cancer patients. We designed a statistical radiomics approach that analyzes CT images to predict HPV status. Various radiomics features were extracted from CT scans and analyzed using statistical feature selection and prediction methods. Our approach ranked highest in the 2016 Medical Image Computing and Computer Assisted Intervention (MICCAI) grand challenge: Oropharynx Cancer (OPC) Radiomics Challenge, Human Papilloma Virus (HPV) Status Prediction. Further analysis of the most relevant radiomic features distinguishing HPV-positive and HPV-negative subjects suggested that HPV-positive patients usually have smaller and simpler tumors.

  6. A standardized kit for automated quantitative assessment of candidate protein biomarkers in human plasma.

    PubMed

    Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H

    2015-12-01

    An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit, the 'BAK-76', contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
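
    The underlying MRM-with-SIS-peptide calculation is single-point isotope dilution: the endogenous concentration follows from the light/heavy peak-area ratio and the known spiked SIS amount. A minimal sketch with hypothetical areas:

      # Single-point MRM quantitation with a stable isotope-labeled standard:
      # the endogenous (light) peptide concentration follows directly from the
      # light/heavy peak-area ratio and the known amount of SIS peptide spiked in.
      def sis_concentration(area_light, area_heavy, spiked_fmol_per_ul):
          return (area_light / area_heavy) * spiked_fmol_per_ul

      # Hypothetical integrated transition areas for one surrogate peptide.
      conc = sis_concentration(area_light=4.2e5, area_heavy=2.1e5,
                               spiked_fmol_per_ul=50.0)
      print(f"endogenous peptide: {conc:.1f} fmol/uL plasma digest")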

  7. Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis

    PubMed Central

    Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L.; Hwang, Jae Youn

    2016-01-01

    We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Recently, various mobile devices such as a smartphone have emerged as healthcare tools. They have been applied for the early diagnosis of nonmalignant and malignant skin diseases. Particularly, when they are combined with an advanced optical imaging technique such as multispectral imaging and analysis, it would be beneficial for the early diagnosis of such skin diseases and for further quantitative prognosis monitoring after treatment at home. Thus, we demonstrate here the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis. PMID:28018743

  8. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and have proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells of Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules remained parallel to the growth axis during pavement cell growth. In addition, we showed that immersing seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity, and treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells.

  9. Separation techniques: Chromatography

    PubMed Central

    Coskun, Ozlem

    2016-01-01

    Chromatography is an important biophysical technique that enables the separation, identification, and purification of the components of a mixture for qualitative and quantitative analysis. Proteins can be purified based on characteristics such as size and shape, total charge, hydrophobic groups present on the surface, and binding capacity with the stationary phase. Four separation techniques based on molecular characteristics and interaction type use mechanisms of ion exchange, surface adsorption, partition, and size exclusion. Other chromatography techniques are based on the stationary bed, including column, thin layer, and paper chromatography. Column chromatography is one of the most common methods of protein purification. PMID:28058406

  10. Dynamically monitoring the gene expression of dual fluorophore in the cell cycle with quantitative spectrum analysis

    NASA Astrophysics Data System (ADS)

    Lee, Ja-Yun; Wu, Tzong-Yuan; Hsu, I.-Jen

    2008-04-01

    Cloning and transcription techniques for gene-encoded fluorescent proteins have been widely used in many applications; such proteins serve as reporters of specific conditions in a series of reactions. However, it is usually difficult to monitor a specific target and the exact number of proteins involved during a process in turbid media, especially at micrometer scales. We demonstrate an alternative way to monitor cell cycle behavior and to quantitatively analyze target cells expressing green and red fluorescent proteins (GFP and RFP) during different phases of the cell cycle, by quantitatively analyzing their behavior and monitoring their spatial distribution.

  11. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt)

    PubMed Central

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view can be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures). PMID:24737994
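
    The binary-mixture mass balance can be sketched as a least-squares estimate of the end-member proportion across several elements (the compositions below are hypothetical, not the SC data):

      import numpy as np

      # Mass balance for a binary magma mixture: find the dacite fraction x that
      # best reproduces the mingled lava, C_mix ~ x*C_dacite + (1-x)*C_andesite,
      # over several major elements (wt%; values hypothetical).
      c_dac = np.array([66.0, 15.9, 4.1, 2.1])   # SiO2, Al2O3, FeO*, MgO
      c_and = np.array([58.0, 17.2, 6.8, 4.3])
      c_mix = np.array([63.5, 16.3, 5.0, 2.8])

      # Least-squares solution of (c_dac - c_and) * x = (c_mix - c_and).
      a = (c_dac - c_and).reshape(-1, 1)
      b = c_mix - c_and
      x, residuals, *_ = np.linalg.lstsq(a, b, rcond=None)
      print(f"dacitic end-member fraction: {x[0]:.2f}")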

  12. Quantitative analysis on collagen of dermatofibrosarcoma protuberans skin by second harmonic generation microscopy.

    PubMed

    Wu, Shulian; Huang, Yudian; Li, Hui; Wang, Yunxia; Zhang, Xiaoman

    2015-01-01

    Dermatofibrosarcoma protuberans (DFSP) is a skin cancer usually mistaken for other benign tumors. Incomplete DFSP resection results in tumor recurrence. Quantitative characterization of the collagen alteration in the skin tumor is essential for developing a diagnostic technique. In this study, second harmonic generation (SHG) microscopy was performed to obtain images of human DFSP skin and normal skin. Subsequently, structure and texture analysis methods were applied to determine the differences in skin texture characteristics between the two skin types, and the link between collagen alteration and the tumor was established. The results suggest that combining SHG microscopy with texture analysis methods is a feasible and effective way to describe the characteristics of skin tumors like DFSP. © Wiley Periodicals, Inc.

  13. Acoustics based assessment of respiratory diseases using GMM classification.

    PubMed

    Mayorga, P; Druzgalski, C; Morelos, R L; Gonzalez, O H; Vidales, J

    2010-01-01

    The focus of this paper is to present a method utilizing lung sounds for a quantitative assessment of patient health as it relates to respiratory disorders. To accomplish this, applicable traditional techniques from the speech processing domain were utilized to evaluate lung sounds obtained with a digital stethoscope. Traditional methods utilized in the evaluation of asthma involve auscultation and spirometry, but the utilization of more sensitive electronic stethoscopes, which are currently available, and the application of quantitative signal analysis methods offer opportunities for improved diagnosis. In particular, we propose an acoustic evaluation methodology based on Gaussian Mixture Models (GMM), which should assist in broader analysis, identification, and diagnosis of asthma based on frequency-domain analysis of wheezing and crackles.
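
    A minimal sketch of the GMM classification scheme (using scikit-learn on synthetic feature vectors; real use would extract MFCC or similar acoustic features from the stethoscope recordings):

      import numpy as np
      from sklearn.mixture import GaussianMixture

      # Train one GMM per class on acoustic feature vectors; classify a new
      # sample by which model assigns the higher log-likelihood. The features
      # here are synthetic placeholders for MFCC-style vectors.
      rng = np.random.default_rng(0)
      normal_feats = rng.normal(loc=0.0, scale=1.0, size=(200, 12))
      wheeze_feats = rng.normal(loc=1.5, scale=1.2, size=(200, 12))

      gmm_normal = GaussianMixture(n_components=4, random_state=0).fit(normal_feats)
      gmm_wheeze = GaussianMixture(n_components=4, random_state=0).fit(wheeze_feats)

      test = rng.normal(loc=1.5, scale=1.2, size=(5, 12))  # unseen "wheeze" frames
      scores = np.c_[gmm_normal.score_samples(test), gmm_wheeze.score_samples(test)]
      print(["normal" if a > b else "wheeze" for a, b in scores])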

  14. High-throughput 3D whole-brain quantitative histopathology in rodents

    PubMed Central

    Vandenberghe, Michel E.; Hérard, Anne-Sophie; Souedet, Nicolas; Sadouni, Elmahdi; Santin, Mathieu D.; Briet, Dominique; Carré, Denis; Schulz, Jocelyne; Hantraye, Philippe; Chabrier, Pierre-Etienne; Rooney, Thomas; Debeir, Thomas; Blanchard, Véronique; Pradier, Laurent; Dhenain, Marc; Delzescaux, Thierry

    2016-01-01

    Histology is the gold standard to unveil microscopic brain structures and pathological alterations in humans and animal models of disease. However, due to tedious manual interventions, quantification of histopathological markers is classically performed on a few tissue sections, thus restricting measurements to limited portions of the brain. Recently developed 3D microscopic imaging techniques have allowed in-depth study of neuroanatomy. However, quantitative methods are still lacking for whole-brain analysis of cellular and pathological markers. Here, we propose a ready-to-use, automated, and scalable method to thoroughly quantify histopathological markers in 3D in rodent whole brains. It relies on block-face photography, serial histology and 3D-HAPi (Three Dimensional Histology Analysis Pipeline), an open source image analysis software. We illustrate our method in studies involving mouse models of Alzheimer’s disease and show that it can be broadly applied to characterize animal models of brain diseases, to evaluate therapeutic interventions, to anatomically correlate cellular and pathological markers throughout the entire brain and to validate in vivo imaging techniques. PMID:26876372

  15. Studies in the History and Geography of California Languages

    ERIC Educational Resources Information Center

    Haynie, Hannah Jane

    2012-01-01

    This dissertation uses quantitative and geographic analysis techniques to examine the historical and geographical processes that have shaped California's linguistic diversity. Many questions in California historical linguistics have received diminishing attention in recent years, remaining unanswered despite their continued relevance. The studies…

  16. Pyrolysis Mass Spectrometry of Complex Organic Materials.

    ERIC Educational Resources Information Center

    Meuzelaar, Henk L. C.; And Others

    1984-01-01

    Illustrates the state of the art in pyrolysis mass spectrometry techniques through applications in: (1) structural determination and quality control of synthetic polymers; (2) quantitative analysis of polymer mixtures; (3) classification and structural characterization of fossil organic matter; and (4) nonsupervised numerical extraction of…

  17. Dual Nozzle Aerodynamic and Cooling Analysis Study.

    DTIC Science & Technology

    1981-02-27

    program and to the aerodynamic model computer program. This procedure was used to define two secondary nozzle contours for the baseline configuration...both the dual-throat and dual-expander concepts. Advanced analytical techniques were utilized to provide quantitative estimates of the bleed flow...preliminary heat transfer analysis of both concepts, and (5) engineering analysis of data from the NASA/MSFC hot-fire testing of a dual-throat

  18. Error analysis applied to several inversion techniques used for the retrieval of middle atmospheric constituents from limb-scanning MM-wave spectroscopic measurements

    NASA Technical Reports Server (NTRS)

    Puliafito, E.; Bevilacqua, R.; Olivero, J.; Degenhardt, W.

    1992-01-01

    The formal retrieval error analysis of Rodgers (1990) allows the quantitative determination of such retrieval properties as measurement error sensitivity, resolution, and inversion bias. This technique was applied to five numerical inversion techniques and two nonlinear iterative techniques used for the retrieval of middle atmospheric constituent concentrations from limb-scanning millimeter-wave spectroscopic measurements. It is found that the iterative methods have better vertical resolution, but are slightly more sensitive to measurement error than constrained matrix methods. The iterative methods converge to the exact solution, whereas two of the matrix methods under consideration have an explicit constraint, the sensitivity of the solution to the a priori profile. Tradeoffs of these retrieval characteristics are presented.
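
    As a worked illustration of these retrieval diagnostics, the sketch below computes the gain matrix, averaging kernel, noise covariance, and a priori sensitivity for a generic linear, optimal-estimation inversion; the Jacobian and covariances are random toy values, not the instrument's forward model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    m, n = 20, 10                        # measurements, retrieval levels
    K = rng.random((m, n))               # weighting-function (Jacobian) matrix
    Se = 0.01 * np.eye(m)                # measurement-error covariance
    Sa = 0.25 * np.eye(n)                # a priori covariance (the explicit constraint)

    # Gain matrix of the constrained inversion x_hat = x_a + G (y - K x_a)
    G = Sa @ K.T @ np.linalg.inv(K @ Sa @ K.T + Se)

    A = G @ K                            # averaging kernel: row widths ~ vertical resolution
    Sm = G @ Se @ G.T                    # retrieval covariance from measurement noise
    apriori_sens = np.eye(n) - A         # sensitivity of the solution to the a priori profile

    print("degrees of freedom for signal:", np.trace(A).round(2))
    print("noise-induced std per level:", np.sqrt(np.diag(Sm)).round(3))
    ```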

  19. Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials

    PubMed Central

    Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.

    2015-01-01

    Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering. PMID:26581347

  20. Simple preparation of plant epidermal tissue for laser microdissection and downstream quantitative proteome and carbohydrate analysis

    PubMed Central

    Falter, Christian; Ellinger, Dorothea; von Hülsen, Behrend; Heim, René; Voigt, Christian A.

    2015-01-01

    The outwardly directed cell wall and associated plasma membrane of epidermal cells represent the first layers of plant defense against intruding pathogens. Cell wall modifications and the formation of defense structures at sites of attempted pathogen penetration are decisive for plant defense. A precise isolation of these stress-induced structures would allow a specific analysis of regulatory mechanism and cell wall adaption. However, methods for large-scale epidermal tissue preparation from the model plant Arabidopsis thaliana, which would allow proteome and cell wall analysis of complete, laser-microdissected epidermal defense structures, have not been provided. We developed the adhesive tape – liquid cover glass technique (ACT) for simple leaf epidermis preparation from A. thaliana, which is also applicable on grass leaves. This method is compatible with subsequent staining techniques to visualize stress-related cell wall structures, which were precisely isolated from the epidermal tissue layer by laser microdissection (LM) coupled to laser pressure catapulting. We successfully demonstrated that these specific epidermal tissue samples could be used for quantitative downstream proteome and cell wall analysis. The development of the ACT for simple leaf epidermis preparation and the compatibility to LM and downstream quantitative analysis opens new possibilities in the precise examination of stress- and pathogen-related cell wall structures in epidermal cells. Because the developed tissue processing is also applicable on A. thaliana, well-established, model pathosystems that include the interaction with powdery mildews can be studied to determine principal regulatory mechanisms in plant–microbe interaction with their potential outreach into crop breeding. PMID:25870605

  1. Simple preparation of plant epidermal tissue for laser microdissection and downstream quantitative proteome and carbohydrate analysis.

    PubMed

    Falter, Christian; Ellinger, Dorothea; von Hülsen, Behrend; Heim, René; Voigt, Christian A

    2015-01-01

    The outwardly directed cell wall and associated plasma membrane of epidermal cells represent the first layers of plant defense against intruding pathogens. Cell wall modifications and the formation of defense structures at sites of attempted pathogen penetration are decisive for plant defense. A precise isolation of these stress-induced structures would allow a specific analysis of regulatory mechanism and cell wall adaption. However, methods for large-scale epidermal tissue preparation from the model plant Arabidopsis thaliana, which would allow proteome and cell wall analysis of complete, laser-microdissected epidermal defense structures, have not been provided. We developed the adhesive tape - liquid cover glass technique (ACT) for simple leaf epidermis preparation from A. thaliana, which is also applicable on grass leaves. This method is compatible with subsequent staining techniques to visualize stress-related cell wall structures, which were precisely isolated from the epidermal tissue layer by laser microdissection (LM) coupled to laser pressure catapulting. We successfully demonstrated that these specific epidermal tissue samples could be used for quantitative downstream proteome and cell wall analysis. The development of the ACT for simple leaf epidermis preparation and the compatibility to LM and downstream quantitative analysis opens new possibilities in the precise examination of stress- and pathogen-related cell wall structures in epidermal cells. Because the developed tissue processing is also applicable on A. thaliana, well-established, model pathosystems that include the interaction with powdery mildews can be studied to determine principal regulatory mechanisms in plant-microbe interaction with their potential outreach into crop breeding.

  2. Recent developments in qualitative and quantitative analysis of phytochemical constituents and their metabolites using liquid chromatography-mass spectrometry.

    PubMed

    Wu, Haifeng; Guo, Jian; Chen, Shilin; Liu, Xin; Zhou, Yan; Zhang, Xiaopo; Xu, Xudong

    2013-01-01

    Over the past few years, the applications of liquid chromatography coupled with mass spectrometry (LC-MS) in natural product analysis have grown dramatically because of the increasingly improved separation and detection capabilities of LC-MS instruments. In particular, novel high-resolution hybrid instruments linked to ultra-high-performance LC and the hyphenations of LC-MS with other separation or analytical techniques greatly aid unequivocal identification and highly sensitive quantification of natural products at trace concentrations in complex matrices. With the aim of providing an up-to-date overview of LC-MS applications in the analysis of plant-derived compounds, papers published within the latest years (2007-2012) involving qualitative and quantitative analysis of phytochemical constituents and their metabolites are summarized in the present review. After briefly describing the general characteristics of natural products analysis, the most remarkable features of LC-MS and sample preparation techniques, the present paper mainly focuses on screening and characterization of phenols (including flavonoids), alkaloids, terpenoids, steroids, coumarins, lignans, and miscellaneous compounds in respective herbs and biological samples, as well as traditional Chinese medicine (TCM) prescriptions, using tandem mass spectrometry. Chemical fingerprinting analysis using LC-MS is also described. Meanwhile, instrumental peculiarities and methodological details are accentuated. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Modal Analysis of an Aircraft Fuselage Panel using Experimental and Finite-Element Techniques

    NASA Technical Reports Server (NTRS)

    Fleming, Gary A.; Buehrle, Ralph D.; Storaasli, Olaf L.

    1998-01-01

    The application of Electro-Optic Holography (EOH) for measuring the center bay vibration modes of an aircraft fuselage panel under forced excitation is presented. The requirement of free-free panel boundary conditions made the acquisition of quantitative EOH data challenging, since large-scale rigid body motions corrupted measurements of the high frequency vibrations of interest. Image processing routines designed to minimize the effects of large-scale motions were applied to successfully resurrect quantitative EOH vibrational amplitude measurements.

  4. Quantitative Determination of Caffeine in Beverages Using a Combined SPME-GC/MS Method

    NASA Astrophysics Data System (ADS)

    Pawliszyn, Janusz; Yang, Min J.; Orton, Maureen L.

    1997-09-01

    Solid-phase microextraction (SPME) combined with gas chromatography/mass spectrometry (GC/MS) has been applied to the analysis of various caffeinated beverages. Unlike the current methods, this technique is solvent free and requires no pH adjustments. The simplicity of the SPME-GC/MS method lends itself to a good undergraduate laboratory practice. This publication describes the analytical conditions and presents the data for determination of caffeine in coffee, tea, and coke. Quantitation by isotopic dilution is also illustrated.
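
    The isotopic-dilution step lends itself to a one-line calculation; the numbers below are invented for illustration and assume an ideal (unity) response factor between caffeine and its labeled analogue.

    ```python
    # Single-point isotope-dilution quantitation: the analyte/internal-standard
    # peak-area ratio scales with the analyte/internal-standard amount ratio.
    spike_ng = 500.0        # isotopically labeled caffeine added to the sample (ng)
    area_analyte = 1.84e6   # GC/MS peak area at the unlabeled caffeine m/z
    area_labeled = 1.12e6   # GC/MS peak area at the labeled caffeine m/z
    rf = 1.0                # response factor, assumed unity for isotopologues

    caffeine_ng = spike_ng * (area_analyte / area_labeled) / rf
    print(f"caffeine = {caffeine_ng:.0f} ng")   # about 821 ng in this toy example
    ```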

  5. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques

    PubMed Central

    Rosebrock, Adrian; Caban, Jesus J.; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2014-01-01

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment, rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we compared the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute’s Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant. PMID:25722829

  6. The use of virtual environments for percentage view analysis.

    PubMed

    Schofield, Damian; Cox, Christopher J B

    2005-09-01

    It is recognised that Visual Impact Assessment (VIA), unlike many other aspects of Environmental Impact Assessments (EIA), relies less upon measurement than upon experience and judgement. Hence, it is necessary for a more structured and consistent approach towards VIA, reducing the amount of bias and subjectivity. For proposed developments, there are very few quantitative techniques for the evaluation of visibility, and these existing methods can be highly inaccurate and time consuming. Percentage view changes are one of the few quantitative techniques, and the use of computer technology can reduce the inaccuracy and the time spent evaluating the visibility of either existing or proposed developments. For over 10 years, research work undertaken by the authors at the University of Nottingham has employed Computer Graphics (CG) and Virtual Reality (VR) in civilian and industrial contexts for environmental planning, design visualisation, accident reconstruction, risk analysis, data visualisation and training simulators. This paper describes a method to quantitatively assess the visual impact of proposed developments on the landscape using CG techniques. This method allows the determination of accurate percentage view changes with the use of a computer-generated model of the environment and the application of specialist software that has been developed at the University of Nottingham. The principles are easy to understand and therefore planners, authorisation agencies and members of the public can use and understand the results. A case study is shown to demonstrate the application and the capabilities of the technology.

  7. Temporal analysis of regional wall motion from cine cardiac MRI

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Didier, Dominique; Chretien, Anne; Rosset, Antoine; Magnin, Isabelle E.; Ligier, Yves

    1996-04-01

    The purpose of this work is to develop and to evaluate an automatic analysis technique for quantitative assessment of cardiac function from cine MRI and to identify regional alterations in synchronicity based on Fourier analysis of ventricular wall motion (WM). A temporal analysis technique of left ventricular wall displacement was developed for quantitative analysis of temporal delays in wall motion and applied to gated cine 'dark blood' cardiac MRI. This imaging technique allows the user to saturate the blood both above and below the imaging slice simultaneously by using a specially designed rf presaturation pulse. The acquisition parameters are: TR equals 25 - 60 msec, TE equals 5 - 7 msec, flip angle equals 25 degrees, slice thickness equals 10 mm, 16 to 32 frames/cycle. Automatic edge detection was used to outline the ventricular cavities on all frames of a cardiac cycle. Two different segmentation techniques were applied to all studies and led to similar results. Further improvement in edge detection accuracy was achieved by temporal interpolation of individual contours on each image of the cardiac cycle. Radial analysis of the ventricular wall motion was then performed along 64 radii drawn from the center of the ventricular cavity. The first harmonic of the Fourier transform of each radial motion curve is calculated. The phase of the fundamental Fourier component is used as an index of synchrony (delay) of regional wall motion. Results are displayed in color-coded maps of regional alterations in the amplitude and synchrony of wall motion. The temporal delays measured from individual segments are evaluated through a histogram of phase distribution, where the width of the main peak is used as an index of overall synchrony of wall motion. The variability of this technique was validated in 10 normal volunteers and was used to identify regions with asynchronous WM in 15 patients with documented CAD. The standard deviation (SD) of phase distribution measured in short axis views was calculated and used to identify regions with asynchronous wall motion in patients with coronary artery disease. Results suggest that this technique is more sensitive than global functional parameters such as ejection fraction for the detection of ventricular dysfunction. Color-coded parametric display offers a more convenient way for the identification and localization of regional wall motion asynchrony. Data obtained from endocardial wall motion analysis were not significantly different from wall thickening measurements. The innovative approach of evaluating the temporal behavior of regional wall motion anomalies is expected to provide clinically relevant data about subtle alterations that cannot be detected through simple analysis of the extent (amplitude) of wall motion or myocardial thickening. Temporal analysis of regional WM abnormality from cine MRI offers an innovative and promising means for objective quantitative evaluation of subtle regional abnormalities. Color-coded parametric maps allowed a better identification and localization of regional WM asynchrony.
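
    The core of the synchrony analysis is a first-harmonic phase per radius; the sketch below reproduces that step on synthetic motion curves (one radius deliberately delayed), since the segmentation and contour-tracking stages are beyond a short example.

    ```python
    import numpy as np

    n_frames, n_radii = 32, 64
    t = np.arange(n_frames) / n_frames
    delays = np.zeros(n_radii)
    delays[10] = np.deg2rad(40)          # one segment contracts 40 degrees late
    motion = np.array([np.cos(2 * np.pi * t - d) for d in delays])  # radii x frames

    spectra = np.fft.fft(motion, axis=1)
    phase = np.angle(spectra[:, 1])      # phase of the fundamental, one per radius
    phase_deg = np.rad2deg(phase - np.median(phase))

    print("SD of phase distribution (deg):", phase_deg.std().round(1))
    print("most asynchronous radius:", int(np.argmax(np.abs(phase_deg))))
    ```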

  8. Methodological Synthesis in Quantitative L2 Research: A Review of Reviews and a Case Study of Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Plonsky, Luke; Gonulal, Talip

    2015-01-01

    Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…

  9. Consequences of Assumption Violations Revisited: A Quantitative Review of Alternatives to the One-Way Analysis of Variance "F" Test.

    ERIC Educational Resources Information Center

    Lix, Lisa M.; And Others

    1996-01-01

    Meta-analytic techniques were used to summarize the statistical robustness literature on Type I error properties of alternatives to the one-way analysis of variance "F" test. The James (1951) and Welch (1951) tests performed best under violations of the variance homogeneity assumption, although their use is not always appropriate. (SLD)

  10. Rapid Analysis of Carbohydrates in Bioprocess Samples: An Evaluation of the CarboPac SA10 for HPAE-PAD Analysis by Interlaboratory Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevcik, R. S.; Hyman, D. A.; Basumallich, L.

    2013-01-01

    A technique for carbohydrate analysis for bioprocess samples has been developed, providing reduced analysis time compared to current practice in the biofuels R&D community. The Thermo Fisher CarboPac SA10 anion-exchange column enables isocratic separation of monosaccharides, sucrose and cellobiose in approximately 7 minutes. Additionally, use of a low-volume (0.2 mL) injection valve in combination with a high-volume detection cell minimizes the extent of sample dilution required to bring sugar concentrations into the linear range of the pulsed amperometric detector (PAD). Three laboratories, representing academia, industry, and government, participated in an interlaboratory study which analyzed twenty-one opportunistic samples representing biomass pretreatment, enzymatic saccharification, and fermentation samples. The technique's robustness, linearity, and interlaboratory reproducibility were evaluated and showed excellent-to-acceptable characteristics. Additionally, quantitation by the CarboPac SA10/PAD was compared with the current practice method utilizing a HPX-87P/RID. While these two methods showed good agreement, a statistical comparison found significant quantitation differences between them, highlighting the difference between selective and universal detection modes.

  11. Two worlds collide: Image analysis methods for quantifying structural variation in cluster molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steenbergen, K. G., E-mail: kgsteen@gmail.com; Gaston, N.

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.

  12. Two worlds collide: image analysis methods for quantifying structural variation in cluster molecular dynamics.

    PubMed

    Steenbergen, K G; Gaston, N

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.
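
    A stripped-down version of both descriptors, assuming only an (n_frames, n_atoms, 3) coordinate array: PCA of the centered positions gives three shape eigenvalues per frame, and the Pearson correlation of interatomic-distance vectors compares bond patterns between frames. The trajectory here is random noise, purely for illustration.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist

    def shape_eigenvalues(xyz):
        """Eigenvalues of the position covariance for one frame (largest first)."""
        centered = xyz - xyz.mean(axis=0)
        cov = centered.T @ centered / len(xyz)
        return np.sort(np.linalg.eigvalsh(cov))[::-1]

    rng = np.random.default_rng(1)
    traj = rng.normal(size=(100, 13, 3))             # 100 frames of a 13-atom cluster
    eigs = np.array([shape_eigenvalues(f) for f in traj])
    print("mean shape eigenvalues:", eigs.mean(axis=0).round(2))
    print("frame-to-frame SD:", eigs.std(axis=0).round(2))

    # Pearson correlation between the bond (distance) patterns of two frames
    r = np.corrcoef(pdist(traj[0]), pdist(traj[1]))[0, 1]
    print("bond-pattern PCC, frames 0 vs 1:", round(r, 3))
    ```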

  13. Quantitative analysis of biological tissues using Fourier transform-second-harmonic generation imaging

    NASA Astrophysics Data System (ADS)

    Ambekar Ramachandra Rao, Raghu; Mehta, Monal R.; Toussaint, Kimani C., Jr.

    2010-02-01

    We demonstrate the use of Fourier transform-second-harmonic generation (FT-SHG) imaging of collagen fibers as a means of performing quantitative analysis of obtained images of selected spatial regions in porcine trachea, ear, and cornea. Two quantitative markers, preferred orientation and maximum spatial frequency, are proposed for differentiating structural information between various spatial regions of interest in the specimens. The ear shows consistent maximum spatial frequency and orientation, as also observed in its real-space image. However, there are observable changes in the orientation and minimum feature size of fibers in the trachea, indicating a more random organization. Finally, the analysis is applied to a 3D image stack of the cornea. It is shown that the standard deviation of the orientation is sensitive to the randomness in fiber orientation. Regions with variations in the maximum spatial frequency, but with relatively constant orientation, suggest that maximum spatial frequency is useful as an independent quantitative marker. We emphasize that FT-SHG is a simple, yet powerful, tool for extracting information from images that is not obvious in real space. This technique can be used as a quantitative biomarker to assess the structure of collagen fibers that may change due to damage from disease or physical injury.
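
    A toy version of the orientation marker: the 2D power spectrum of a striped (fiber-like) image concentrates energy along the stripe normal, so the angle of the spectral peak gives the preferred orientation. The synthetic image and stripe period are assumptions standing in for SHG data.

    ```python
    import numpy as np

    y, x = np.mgrid[0:256, 0:256]
    alpha = 0.3                                   # stripe normal at ~17 degrees
    img = np.sin(2 * np.pi * (x * np.cos(alpha) + y * np.sin(alpha)) / 16.0)

    F = np.fft.fftshift(np.abs(np.fft.fft2(img)) ** 2)
    cy, cx = np.array(F.shape) // 2
    F[cy, cx] = 0                                 # suppress the DC term
    ky, kx = np.unravel_index(np.argmax(F), F.shape)
    theta = np.degrees(np.arctan2(ky - cy, kx - cx)) % 180
    print("spectral peak angle:", round(theta, 1))            # ~17 degrees
    print("preferred fiber orientation:", round((theta + 90) % 180, 1))
    ```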

  14. 3D Image Analysis of Geomaterials using Confocal Microscopy

    NASA Astrophysics Data System (ADS)

    Mulukutla, G.; Proussevitch, A.; Sahagian, D.

    2009-05-01

    Confocal microscopy is one of the most significant advances in optical microscopy of the last century. It is widely used in biological sciences but its application to geomaterials lingers due to a number of technical problems. Potentially the technique can perform non-invasive testing on a laser illuminated sample that fluoresces using a unique optical sectioning capability that rejects out-of-focus light reaching the confocal aperture. Fluorescence in geomaterials is commonly induced using epoxy doped with a fluorochrome that is impregnated into the sample to enable discrimination of various features such as void space or material boundaries. However, for many geomaterials, this method cannot be used because they do not naturally fluoresce and because epoxy cannot be impregnated into inaccessible parts of the sample due to lack of permeability. As a result, the confocal images of most geomaterials that have not been pre-processed with extensive sample preparation techniques are of poor quality and lack the necessary image and edge contrast necessary to apply any commonly used segmentation techniques to conduct any quantitative study of its features such as vesicularity, internal structure, etc. In our present work, we are developing a methodology to conduct a quantitative 3D analysis of images of geomaterials collected using a confocal microscope with minimal amount of prior sample preparation and no addition of fluorescence. Two sample geomaterials, a volcanic melt sample and a crystal chip containing fluid inclusions are used to assess the feasibility of the method. A step-by-step process of image analysis includes application of image filtration to enhance the edges or material interfaces and is based on two segmentation techniques: geodesic active contours and region competition. Both techniques have been applied extensively to the analysis of medical MRI images to segment anatomical structures. Preliminary analysis suggests that there is distortion in the shapes of the segmented vesicles, vapor bubbles, and void spaces due to the optical measurements, so corrective actions are being explored. This will establish a practical and reliable framework for an adaptive 3D image processing technique for the analysis of geomaterials using confocal microscopy.

  15. Benefit-risk analysis: a brief review and proposed quantitative approaches.

    PubMed

    Holden, William L

    2003-01-01

    Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
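
    A schematic reading of the RV-NNT comparison with invented numbers (the paper's actual utility elicitation is more involved than this): treatment is favored when the number needed to treat falls below the number needed to harm after weighting by the patients' relative valuation of the harm.

    ```python
    arr = 0.20   # absolute risk reduction for the benefit endpoint (hypothetical)
    ari = 0.05   # absolute risk increase for the adverse event (hypothetical)
    rv = 0.8     # patients rate the harm 0.8 times as important as the benefit

    nnt = 1 / arr                 # 5 treated per additional benefit
    nnh = 1 / ari                 # 20 treated per additional harm
    rv_adjusted_nnh = nnh / rv    # preference-weighted harm threshold (schematic)

    print(f"NNT={nnt:.0f}, NNH={nnh:.0f}, RV-adjusted NNH={rv_adjusted_nnh:.0f}")
    print("treatment favored:", nnt < rv_adjusted_nnh)
    ```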

  16. Shot noise-limited Cramér-Rao bound and algorithmic sensitivity for wavelength shifting interferometry

    NASA Astrophysics Data System (ADS)

    Chen, Shichao; Zhu, Yizheng

    2017-02-01

    Sensitivity is a critical index measuring the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, which is a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results can provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
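
    The derivations themselves are beyond an abstract, but the headline scaling is easy to illustrate: under shot noise the phase variance bound falls as 1/N in detected photons, so pathlength sensitivity improves with the square root of the light budget. The unit-visibility prefactor below is schematic, not the paper's exact bound.

    ```python
    import numpy as np

    wavelength_nm = 600.0
    for n_photons in (1e3, 1e5, 1e7):
        sigma_phase = 1.0 / np.sqrt(n_photons)                  # rad, schematic CRB
        sigma_opl = sigma_phase * wavelength_nm / (2 * np.pi)   # pathlength sensitivity
        print(f"N={n_photons:.0e}: sigma_OPL ~ {sigma_opl:.3f} nm")
    ```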

  17. The diagnostic capability of laser induced fluorescence in the characterization of excised breast tissues

    NASA Astrophysics Data System (ADS)

    Galmed, A. H.; Elshemey, Wael M.

    2017-08-01

    Differentiating between normal, benign and malignant excised breast tissues is one of the major worldwide challenges that need a quantitative, fast and reliable technique in order to avoid personal errors in diagnosis. Laser induced fluorescence (LIF) is a promising technique that has been applied for the characterization of biological tissues including breast tissue. Unfortunately, only a few studies have adopted a quantitative approach that can be directly applied for breast tissue characterization. This work provides a quantitative means for such characterization via the introduction of several LIF characterization parameters and the determination of the diagnostic accuracy of each parameter in the differentiation between normal, benign and malignant excised breast tissues. Extensive analysis of 41 lyophilized breast samples using scatter diagrams, cut-off values, diagnostic indices and receiver operating characteristic (ROC) curves shows that some spectral parameters (peak height and area under the peak) are superior for characterization of normal, benign and malignant breast tissues, with high sensitivity (up to 0.91), specificity (up to 0.91) and accuracy ranking (highly accurate).

  18. Development of quantitative laser ionization mass spectrometry (LIMS). Final report, 1 Aug 87-1 Jan 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odom, R.W.

    1991-06-04

    The objective of the research was to develop quantitative microanalysis methods for dielectric thin films using the laser ionization mass spectrometry (LIMS) technique. The research involved preparation of thin (5,000 Å) films of SiO2, Al2O3, MgF2, TiO2, Cr2O3, Ta2O5, Si3N4, and ZrO2, and doping these films with ion implant impurities of 11B, 40Ca, 56Fe, 68Zn, 81Br, and 121Sb. Laser ionization mass spectrometry (LIMS), secondary ion mass spectrometry (SIMS) and Rutherford backscattering spectrometry (RBS) were performed on these films. The research demonstrated quantitative LIMS analysis down to detection levels of 10-100 ppm, and led to the development of (1) a compound thin film standards product line for the performing organization, (2) routine LIMS analytical methods, and (3) the manufacture of high speed preamplifiers for time-of-flight mass spectrometry (TOF-MS) techniques.

  19. Investigation of quartz grain surface textures by atomic force microscopy for forensic analysis.

    PubMed

    Konopinski, D I; Hudziak, S; Morgan, R M; Bull, P A; Kenyon, A J

    2012-11-30

    This paper presents a study of quartz sand grain surface textures using atomic force microscopy (AFM) to image the surface. Until now scanning electron microscopy (SEM) has provided the primary technique used in the forensic surface texture analysis of quartz sand grains as a means of establishing the provenance of the grains for forensic reconstructions. The ability to independently corroborate the grain type classifications is desirable and provides additional weight to the findings of SEM analysis of the textures of quartz grains identified in forensic soil/sediment samples. AFM offers a quantitative means of analysis that complements SEM examination, and is a non-destructive technique that requires no sample preparation prior to scanning. It therefore has great potential to be used for forensic analysis where sample preservation is highly valuable. By taking quantitative topography scans, it is possible to produce 3D representations of microscopic surface textures and diagnostic features for examination. Furthermore, various empirical measures can be obtained from analysing the topography scans, including arithmetic average roughness, root-mean-square surface roughness, skewness, kurtosis, and multiple Gaussian fits to height distributions. These empirical measures, combined with qualitative examination of the surfaces, can help to discriminate between grain types and provide independent analysis that can corroborate the morphological grain typing based on the surface textures assigned using SEM. Furthermore, the findings from this study also demonstrate that quartz sand grain surfaces exhibit a statistically self-similar fractal nature that remains unchanged across scales. This indicates the potential for a further quantitative measure that could be utilised in the discrimination of quartz grains based on their provenance for forensic investigations. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
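
    The listed empirical measures are all moments of the height distribution and can be computed directly from a topography scan; the synthetic Gaussian surface below is a stand-in for real AFM data.

    ```python
    import numpy as np
    from scipy.stats import skew, kurtosis

    rng = np.random.default_rng(42)
    heights = rng.normal(0.0, 5.0, size=(512, 512))  # nm; stand-in topography scan

    h = heights - heights.mean()
    ra = np.mean(np.abs(h))             # arithmetic average roughness
    rq = np.sqrt(np.mean(h ** 2))       # root-mean-square roughness
    print(f"Ra={ra:.2f} nm  Rq={rq:.2f} nm  "
          f"skew={skew(h.ravel()):.3f}  kurtosis={kurtosis(h.ravel()):.3f}")
    ```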

  20. Accounting for Limited Detection Efficiency and Localization Precision in Cluster Analysis in Single Molecule Localization Microscopy

    PubMed Central

    Shivanandan, Arun; Unnikrishnan, Jayakrishnan; Radenovic, Aleksandra

    2015-01-01

    Single Molecule Localization Microscopy techniques like PhotoActivated Localization Microscopy, with their sub-diffraction limit spatial resolution, have been popularly used to characterize the spatial organization of membrane proteins, by means of quantitative cluster analysis. However, such quantitative studies remain challenged by the techniques' inherent sources of error, such as a limited detection efficiency of less than 60%, due to incomplete photo-conversion, and a limited localization precision in the range of 10-30 nm, varying across the detected molecules, mainly depending on the number of photons collected from each. We provide analytical methods to estimate the effect of these errors in cluster analysis and to correct for them. These methods, based on the Ripley's L(r) - r or Pair Correlation Function popularly used by the community, can facilitate potentially breakthrough results in quantitative biology by providing a more accurate and precise quantification of protein spatial organization. PMID:25794150
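
    For reference, an uncorrected (naive, edge effects ignored) estimate of Ripley's L(r) - r on a simulated random pattern looks like the following; the paper's contribution is precisely the analytical correction of such statistics for limited detection efficiency and localization error.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(7)
    side = 1000.0                                  # nm, square field of view
    pts = rng.uniform(0, side, size=(500, 2))      # complete spatial randomness
    d = pdist(pts)
    density = len(pts) / side**2

    for r in (20, 50, 100):
        k = 2 * np.count_nonzero(d < r) / (len(pts) * density)  # naive K(r)
        print(f"r={r:4d} nm  L(r)-r = {np.sqrt(k / np.pi) - r:6.2f}")  # ~0 if random
    ```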

  1. Do Mouthwashes Really Kill Bacteria?

    ERIC Educational Resources Information Center

    Corner, Thomas R.

    1984-01-01

    Errors in determining the effectiveness of mouthwashes, disinfectants, and other household products as antibacterial agents may result from using broth cultures and/or irregularly shaped bits of filter paper. Presents procedures for a better technique and, for advanced students, two additional procedures for introducing quantitative analysis into…

  2. Combined elemental and microstructural analysis of genuine and fake copper-alloy coins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartoli, L; Agresti, J; Mascalchi, M

    2011-07-31

    Innovative noninvasive material analysis techniques are applied to determine archaeometallurgical characteristics of copper-alloy coins from Florence's National Museum of Archaeology. Three supposedly authentic Roman coins and three hypothetically fraudulent imitations are thoroughly investigated using laser-induced plasma spectroscopy and time-of-flight neutron diffraction along with 3D videomicroscopy and electron microscopy. Material analyses are aimed at collecting data allowing for objective discrimination between genuine Roman productions and late fakes. The results show that the mentioned techniques provide quantitative compositional and textural data, which are strictly related to the manufacturing processes and aging of copper alloys.

  3. Focus characterization at an X-ray free-electron laser by coherent scattering and speckle analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sikorski, Marcin; Song, Sanghoon; Schropp, Andreas

    2015-04-14

    X-ray focus optimization and characterization based on coherent scattering and quantitative speckle size measurements was demonstrated at the Linac Coherent Light Source. Its performance as a single-pulse free-electron laser beam diagnostic was tested for two typical focusing configurations. The results derived from the speckle size/shape analysis show the effectiveness of this technique in finding the focus' location, size and shape. In addition, its single-pulse compatibility enables users to capture pulse-to-pulse fluctuations in focus properties compared with other techniques that require scanning and averaging.

  4. Subspace techniques to remove artifacts from EEG: a quantitative analysis.

    PubMed

    Teixeira, A R; Tome, A M; Lang, E W; Martins da Silva, A

    2008-01-01

    In this work we discuss and apply projective subspace techniques to both multichannel and single-channel recordings. The single-channel approach is based on singular spectrum analysis (SSA) and the multichannel approach uses the extended infomax algorithm, which is implemented in the open-source toolbox EEGLAB. Both approaches will be evaluated using artificial mixtures of a set of selected EEG signals. The latter were selected visually to contain as the dominant activity one of the characteristic bands of an electroencephalogram (EEG). The evaluation is performed both in the time and frequency domains by using correlation coefficients and the coherence function, respectively.
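
    A compact SSA sketch for the single-channel case: embed the signal into a trajectory matrix, SVD it, keep the leading subspace, and average anti-diagonals back into a time series. The window length, component count, and test signal are illustrative choices, not the paper's settings.

    ```python
    import numpy as np

    def ssa_reconstruct(x, window=50, n_components=2):
        n = len(x)
        cols = n - window + 1
        X = np.column_stack([x[i:i + window] for i in range(cols)])  # trajectory matrix
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
        rec = np.zeros(n)
        counts = np.zeros(n)
        for j in range(cols):                 # Hankelize: average anti-diagonals
            rec[j:j + window] += Xr[:, j]
            counts[j:j + window] += 1
        return rec / counts

    t = np.linspace(0, 2, 512)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.default_rng(0).normal(size=512)
    alpha = ssa_reconstruct(eeg)              # retains the dominant ~10 Hz activity
    print("residual RMS:", np.sqrt(np.mean((eeg - alpha) ** 2)).round(3))
    ```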

  5. Minimizing target interference in PK immunoassays: new approaches for low-pH-sample treatment.

    PubMed

    Partridge, Michael A; Pham, John; Dziadiv, Olena; Luong, Onson; Rafique, Ashique; Sumner, Giane; Torri, Albert

    2013-08-01

    Quantitating total levels of monoclonal antibody (mAb) biotherapeutics in serum using ELISA may be hindered by soluble targets. We developed two low-pH-sample-pretreatment techniques to minimize target interference. The first procedure involves sample pretreatment at pH <3.0 before neutralization and analysis in a target capture ELISA. Careful monitoring of acidification time is required to minimize potential impact on mAb detection. The second approach involves sample dilution into mild acid (pH ∼4.5) before transferring to an anti-human capture-antibody-coated plate without neutralization. Analysis of target-drug and drug-capture antibody interactions at pH 4.5 indicated that the capture antibody binds to the drug, while the drug and the target were dissociated. Using these procedures, total biotherapeutic levels were accurately measured when soluble target was >30-fold molar excess. These techniques provide alternatives for quantitating mAb biotherapeutics in the presence of a target when standard acid-dissociation procedures are ineffective.

  6. Statistical methodology: II. Reliability and validity assessment in study design, Part B.

    PubMed

    Karras, D J

    1997-02-01

    Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but do not account for measurement biases and may yield misleading results. Techniques that measure inter-test differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires often can be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.

  7. Wavelet-based multiscale analysis of bioimpedance data measured by electric cell-substrate impedance sensing for classification of cancerous and normal cells.

    PubMed

    Das, Debanjan; Shiladitya, Kumar; Biswas, Karabi; Dutta, Pranab Kumar; Parekh, Aditya; Mandal, Mahitosh; Das, Soumen

    2015-12-01

    The paper presents a study to differentiate normal and cancerous cells using the label-free bioimpedance signal measured by electric cell-substrate impedance sensing. The real-time-measured bioimpedance data of human breast cancer cells and human epithelial normal cells capture fluctuations of impedance value due to cellular micromotions resulting from dynamic structural rearrangement of membrane protrusions under nonagitated conditions. Here, a wavelet-based multiscale quantitative analysis technique has been applied to analyze the fluctuations in bioimpedance. The study demonstrates a method to classify cancerous and normal cells from the signature of their impedance fluctuations. The fluctuations associated with cellular micromotion are quantified in terms of cellular energy, cellular power dissipation, and cellular moments. The cellular energy and power dissipation are found to be higher for cancerous cells, associated with higher micromotions in cancer cells. The initial study suggests that the proposed wavelet-based quantitative technique promises to be an effective method to analyze real-time bioimpedance signals for distinguishing cancer and normal cells.
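
    One way to realize the multiscale quantification, assuming PyWavelets is available: decompose the impedance-fluctuation series and report the energy captured at each detail level. The random-walk signal and db4 wavelet are illustrative stand-ins, not the paper's choices.

    ```python
    import numpy as np
    import pywt

    rng = np.random.default_rng(3)
    z = np.cumsum(rng.normal(size=2048)) * 1e-3   # stand-in micromotion series (ohm)
    z -= z.mean()

    coeffs = pywt.wavedec(z, "db4", level=6)      # [approx, detail_6, ..., detail_1]
    energies = [float(np.sum(c ** 2)) for c in coeffs[1:]]
    total = sum(energies)
    for lvl, e in zip(range(6, 0, -1), energies):
        print(f"detail level {lvl}: {100 * e / total:5.1f}% of fluctuation energy")
    ```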

  8. Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.

    PubMed

    Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S

    2016-04-07

    Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate whether heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs), cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices, and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance are discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regressions. We recommend utilizing the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
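
    The recommended power model is straightforward to apply: fit sigma(signal) = a * signal^b from replicate standard deviations, then weight the calibration by 1/sigma^2. All numbers below are invented; note that numpy's polyfit takes 1/sigma (not 1/sigma^2) as its weight argument.

    ```python
    import numpy as np

    conc = np.array([1, 2, 5, 10, 20, 50, 100.0])             # standard concentrations
    signal = np.array([2.1, 3.9, 10.3, 19.6, 41.0, 98.0, 203.0])
    replicate_sd = np.array([0.1, 0.15, 0.3, 0.5, 1.0, 2.3, 4.8])

    # Power model of variance: log(sd) = log(a) + b * log(signal)
    b, log_a = np.polyfit(np.log(signal), np.log(replicate_sd), 1)
    sigma = np.exp(log_a) * signal ** b

    # Weighted least-squares calibration line; polyfit expects 1/sigma weights
    slope, intercept = np.polyfit(conc, signal, 1, w=1.0 / sigma)
    print(f"variance exponent b = {b:.2f}; slope = {slope:.3f}, intercept = {intercept:.3f}")
    ```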

  9. Quantitative magnetic resonance (MR) neurography for evaluation of peripheral nerves and plexus injuries

    PubMed Central

    Barousse, Rafael; Socolovsky, Mariano; Luna, Antonio

    2017-01-01

    Traumatic conditions of peripheral nerves and plexus have been classically evaluated by morphological imaging techniques and electrophysiological tests. New magnetic resonance imaging (MRI) studies based on 3D fat-suppressed techniques are providing high accuracy for peripheral nerve injury evaluation from a qualitative point of view. However, these techniques do not provide quantitative information. Diffusion weighted imaging (DWI) and diffusion tensor imaging (DTI) are functional MRI techniques that are able to evaluate and quantify the movement of water molecules within different biological structures. These techniques have been successfully applied in other anatomical areas, especially in the assessment of central nervous system, and now are being imported, with promising results for peripheral nerve and plexus evaluation. DWI and DTI allow performing a qualitative and quantitative peripheral nerve analysis, providing valuable pathophysiological information about functional integrity of these structures. In the field of trauma and peripheral nerve or plexus injury, several derived parameters from DWI and DTI studies such as apparent diffusion coefficient (ADC) or fractional anisotropy (FA) among others, can be used as potential biomarkers of neural damage providing information about fiber organization, axonal flow or myelin integrity. A proper knowledge of physical basis of these techniques and their limitations is important for an optimal interpretation of the imaging findings and derived data. In this paper, a comprehensive review of the potential applications of DWI and DTI neurographic studies is performed with a focus on traumatic conditions, including main nerve entrapment syndromes in both peripheral nerves and brachial or lumbar plexus. PMID:28932698
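
    The quantitative parameters mentioned are simple functions of the diffusion tensor's eigenvalues; for a single voxel with plausible nerve-like values (assumed here, not taken from the paper):

    ```python
    import numpy as np

    lam = np.array([1.7e-3, 0.4e-3, 0.3e-3])   # eigenvalues, mm^2/s (hypothetical)
    md = lam.mean()                             # mean diffusivity (ADC analogue)
    fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
    print(f"MD = {md:.2e} mm^2/s, FA = {fa:.2f}")   # FA near 0.76 for these values
    ```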

  10. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first-pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi- quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles was determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were subtracted/calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, andarea under the receiver operating curve (AUC). Data was pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory analysis our results show similar diagnostic accuracy comparing anatomical (AUC 0.86(0.83-0.89)) and functional reference standards (AUC 0.88(0.84-0.90)). Only the per territory analysis sensitivity did not show significant heterogeneity. None of the groups showed signs of publication bias. The clinical value of semi-quantitative and quantitative CMR perfusion analysis remains uncertain due to extensive inter-study heterogeneity and large differences in CMR perfusion acquisition protocols, reference standards, and methods of assessment of myocardial perfusion parameters. For wide spread implementation, standardization of CMR perfusion techniques is essential. CRD42016040176 .

  11. Recommended techniques for effective maintainability. A continuous improvement initiative of the NASA Reliability and Maintainability Steering Committee

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This manual presents a series of recommended techniques that can increase overall operational effectiveness of both flight and ground based NASA systems. It provides a set of tools that minimizes risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations and operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.

  12. Putting tools in the toolbox: Development of a free, open-source toolbox for quantitative image analysis of porous media.

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.

    2014-12-01

    X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil and rock. Given that most synchrotron facilities have user programs which grant academic researchers access to facilities and x-ray imaging equipment free of charge, a key limitation or hindrance for small research groups interested in conducting x-ray imaging experiments is the financial cost associated with post-experiment data analysis. While the cost of high performance computing hardware continues to decrease, expenses associated with licensing commercial software packages for quantitative image analysis continue to increase, with current prices being as high as $24,000 USD, for a single user license. As construction of the Nation's newest synchrotron accelerator nears completion, a significant effort is being made here at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required for performing sophisticated quantitative analysis of multidimensional porous media data sets, collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite that is in development here at BNL including major design decisions, a demonstration of several test cases illustrating currently available quantitative tools for analysis and characterization of multidimensional porous media image data sets and plans for their future development.

  13. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.

    PubMed

    Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S

    2016-07-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.

  14. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by the chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MV A) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from whichmore » unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MV A techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.« less

  15. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment

    PubMed Central

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

    Objectives This study presents the indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. Methods The results collected were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. Results These weights may prove useful in avoiding having to resort to qualitative means in absence of weights between indicators when integrating the results of quantitative assessment by indicator. Conclusions This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for quantitative assessment of green chemistry technologies. PMID:26206364
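
    The AHP weighting step reduces to an eigenvector computation on the experts' pairwise-comparison matrix; the 3x3 judgments below are invented for illustration.

    ```python
    import numpy as np

    # Pairwise comparisons among three hypothetical indicators
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                 # normalized priority weights

    ci = (vals.real[k] - len(A)) / (len(A) - 1)  # consistency index
    print("weights:", w.round(3), " CI:", round(ci, 3))  # CI/RI < 0.1 acceptable
    ```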

  16. An easy and inexpensive method for quantitative analysis of endothelial damage by using vital dye staining and Adobe Photoshop software.

    PubMed

    Saad, Hisham A; Terry, Mark A; Shamie, Neda; Chen, Edwin S; Friend, Daniel F; Holiman, Jeffrey D; Stoeger, Christopher

    2008-08-01

    We developed a simple, practical, and inexpensive technique to analyze areas of endothelial cell loss and/or damage over the entire corneal area after vital dye staining by using a readily available, off-the-shelf, consumer software program, Adobe Photoshop. The purpose of this article is to convey a method of quantifying areas of cell loss and/or damage. Descemet-stripping automated endothelial keratoplasty corneal transplant surgery was performed by using 5 precut corneas on a human cadaver eye. Corneas were removed and stained with trypan blue and alizarin red S and subsequently photographed. Quantitative assessment of endothelial damage was performed by using Adobe Photoshop 7.0 software. The average difference for cell area damage for analyses performed by 1 observer twice was 1.41%. For analyses performed by 2 observers, the average difference was 1.71%. Three masked observers were 100% successful in matching the randomized stained corneas to their randomized processed Adobe images. Vital dye staining of corneal endothelial cells can be combined with Adobe Photoshop software to yield a quantitative assessment of areas of acute endothelial cell loss and/or damage. This described technique holds promise for a more consistent and accurate method to evaluate the surgical trauma to the endothelial cell layer in laboratory models. This method of quantitative analysis can probably be generalized to any area of research that involves areas that are differentiated by color or contrast.
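
    The study used Adobe Photoshop's selection tools; purely as an illustration, the same pixel-counting idea can be expressed in Python. The blue-stain threshold and the synthetic image below are assumptions, not values from the paper.

    ```python
    # Illustrative pixel counting on a synthetic RGB image; "damaged" endothelium
    # is assumed to stain blue (trypan blue). Threshold values are invented.
    import numpy as np

    rng = np.random.default_rng(1)
    img = rng.integers(0, 256, size=(512, 512, 3), dtype=np.uint8)  # stand-in photo

    r = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    b = img[..., 2].astype(int)
    damaged = (b > 150) & (b - r > 40) & (b - g > 40)   # crude blue-stain threshold

    percent_damage = 100.0 * damaged.sum() / damaged.size
    print(f"stained (damaged) area: {percent_damage:.2f}% of the analyzed area")
    ```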

  17. In vivo confocal microscopy of the cornea: New developments in image acquisition, reconstruction and analysis using the HRT-Rostock Corneal Module

    PubMed Central

    Petroll, W. Matthew; Robertson, Danielle M.

    2015-01-01

    The optical sectioning ability of confocal microscopy allows high magnification images to be obtained from different depths within a thick tissue specimen, and is thus ideally suited to the study of intact tissue in living subjects. In vivo confocal microscopy has been used in a variety of corneal research and clinical applications since its development over 25 years ago. In this article we review the latest developments in quantitative corneal imaging with the Heidelberg Retinal Tomograph with Rostock Corneal Module (HRT-RCM). We provide an overview of the unique strengths and weaknesses of the HRT-RCM. We discuss techniques for performing 3-D imaging with the HRT-RCM, including hardware and software modifications that allow full thickness confocal microscopy through focusing (CMTF) of the cornea, which can provide quantitative measurements of corneal sublayer thicknesses, stromal cell and extracellular matrix backscatter, and depth dependent changes in corneal keratocyte density. We also review current approaches for quantitative imaging of the subbasal nerve plexus, which require a combination of advanced image acquisition and analysis procedures, including wide field mapping and 3-D reconstruction of nerve structures. The development of new hardware, software, and acquisition techniques continues to expand the number of applications of the HRT-RCM for quantitative in vivo corneal imaging at the cellular level. Knowledge of these rapidly evolving strategies should benefit corneal clinicians and basic scientists alike. PMID:25998608

  18. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    NASA Astrophysics Data System (ADS)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.
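
    A minimal sketch (invented numbers, not the paper's data) of the internal-standard calibration this record describes: analyte/internal-standard response ratios are fit against known concentrations, and an unknown is back-calculated from the line.

    ```python
    # Illustrative internal-standard calibration; all values are invented.
    import numpy as np

    conc = np.array([300, 600, 1200, 1800, 2500], dtype=float)  # ng/mL eugenol
    ratio = np.array([0.31, 0.58, 1.22, 1.74, 2.49])            # analyte/IS response

    slope, intercept = np.polyfit(conc, ratio, 1)               # linear calibration
    r2 = np.corrcoef(conc, ratio)[0, 1] ** 2
    unknown = (1.05 - intercept) / slope                        # back-calculate an unknown
    print(f"R2 = {r2:.3f}, unknown = {unknown:.0f} ng/mL")
    ```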

  19. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    NASA Astrophysics Data System (ADS)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2017-12-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.

  20. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures.

    PubMed

    Boes, Kelsey S; Roberts, Michael S; Vinueza, Nelson R

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.

  1. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of both quantitative assays and qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological functions of glycoproteins and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Mass Spec Rev 34:148–165, 2015. PMID:24889823

  2. Quantitative analysis of aberrant protein glycosylation in liver cancer plasma by AAL-enrichment and MRM mass spectrometry.

    PubMed

    Ahn, Yeong Hee; Shin, Park Min; Kim, Yong-Sam; Oh, Na Ree; Ji, Eun Sun; Kim, Kwang Hoe; Lee, Yeon Jung; Kim, Sung Ho; Yoo, Jong Shin

    2013-11-07

    A lectin-coupled mass spectrometry (MS) approach was employed to quantitatively monitor aberrant protein glycosylation in liver cancer plasma. To do this, we compared the difference in the total protein abundance of a target glycoprotein between hepatocellular carcinoma (HCC) plasmas and hepatitis B virus (HBV) plasmas, as well as the difference in lectin-specific protein glycoform abundance of the target glycoprotein. Capturing the lectin-specific protein glycoforms from a plasma sample was accomplished by using a fucose-specific aleuria aurantia lectin (AAL) immobilized onto magnetic beads via a biotin-streptavidin conjugate. Following tryptic digestion of both the total plasma and its AAL-captured fraction of each HCC and HBV sample, targeted proteomic mass spectrometry was conducted quantitatively by a multiple reaction monitoring (MRM) technique. From the MRM-based analysis of the total plasmas and AAL-captured fractions, differences between HCC and HBV plasma groups in fucosylated glycoform levels of target glycoproteins were confirmed to arise from both the change in the total protein abundance of the target proteins and the change incurred by aberrant fucosylation on target glycoproteins in HCC plasma, even when no significant change occurs in the total protein abundance level. Combining the MRM-based analysis method with the lectin-capturing technique proved to be a successful means of quantitatively investigating aberrant protein glycosylation in cancer plasma samples. Additionally, it was elucidated that the differences between HCC and control groups in fucosylated biomarker candidates A1AT and FETUA mainly originated from an increase in fucosylation levels on these target glycoproteins, rather than an increase in the total protein abundance of the target glycoproteins.

  3. Multispectral colour analysis for quantitative evaluation of pseudoisochromatic color deficiency tests

    NASA Astrophysics Data System (ADS)

    Ozolinsh, Maris; Fomins, Sergejs

    2010-11-01

    Multispectral color analysis was used for spectral scanning of Ishihara and Rabkin color deficiency test book images, using tunable liquid crystal (LC) filters built into the Nuance II analyzer. Multispectral analysis retains both the spatial and the spectral content of the tests. Images were taken in the range of 420-720 nm with a 10 nm step. We calculated retinal neural activity charts taking into account cone sensitivity functions, and processed the charts to find the visibility of latent symbols in color deficiency plates using a cross-correlation technique. In this way, a quantitative measure is obtained for each diagnostic plate for three different color deficiency carrier types - protanopes, deuteranopes and tritanopes. Multispectral color analysis also allows determination of the CIE xyz color coordinates of pseudoisochromatic plate design elements and statistical analysis of these data to compare the color quality of available color deficiency test books.
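
    A toy illustration of the cross-correlation step described above, with a synthetic "activity chart" and an embedded symbol; the arrays, the symbol shape and the salience score are assumptions, not the authors' processing.

    ```python
    # Illustrative latent-symbol detection: correlate a symbol template against
    # a synthetic chart and use the peak salience as a visibility score.
    import numpy as np
    from scipy.signal import correlate2d

    rng = np.random.default_rng(2)
    chart = rng.normal(0.0, 1.0, size=(64, 64))    # stand-in "activity chart"
    template = np.zeros((9, 9))
    template[4, :] = 1.0
    template[:, 4] = 1.0                           # a cross-shaped symbol
    chart[20:29, 20:29] += 0.8 * template          # embed a faint latent symbol

    xcorr = correlate2d(chart - chart.mean(), template - template.mean(), mode="valid")
    visibility = (xcorr.max() - xcorr.mean()) / xcorr.std()   # peak salience (z-score)
    print(f"visibility score: {visibility:.1f}")
    ```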

  4. Quantitative characterization of nanoscale polycrystalline magnets with electron magnetic circular dichroism.

    PubMed

    Muto, Shunsuke; Rusz, Ján; Tatsumi, Kazuyoshi; Adam, Roman; Arai, Shigeo; Kocevski, Vancho; Oppeneer, Peter M; Bürgler, Daniel E; Schneider, Claus M

    2014-01-01

    Electron magnetic circular dichroism (EMCD) allows the quantitative, element-selective determination of spin and orbital magnetic moments, similar to its well-established X-ray counterpart, X-ray magnetic circular dichroism (XMCD). As an advantage over XMCD, EMCD measurements are made using transmission electron microscopes, which are routinely operated at sub-nanometre resolution, thereby potentially allowing nanometre magnetic characterization. However, because of the low intensity of the EMCD signal, it has not yet been possible to obtain quantitative information from EMCD signals at the nanoscale. Here we demonstrate a new approach to EMCD measurements that considerably enhances the outreach of the technique. The statistical analysis introduced here yields robust quantitative EMCD signals. Moreover, we demonstrate that quantitative magnetic information can be routinely obtained using electron beams of only a few nanometres in diameter without imposing any restriction regarding the crystalline order of the specimen.

  5. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells

    PubMed Central

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like jigsaw puzzle pieces in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept parallel orientations to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484
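
    The authors used their own original image-processing routines; as a generic illustration, fiber-like orientation can be estimated with an image structure tensor. Everything below (the test pattern and its angle) is synthetic.

    ```python
    # Illustrative orientation estimate with an image structure tensor; the test
    # pattern is a synthetic fringe whose gradient points along 30 degrees.
    import numpy as np

    x, y = np.meshgrid(np.arange(128.0), np.arange(128.0))
    theta = np.deg2rad(30.0)
    img = np.sin(0.5 * (x * np.cos(theta) + y * np.sin(theta)))

    gy, gx = np.gradient(img)                       # row (y) and column (x) gradients
    Jxx, Jxy, Jyy = (gx * gx).mean(), (gx * gy).mean(), (gy * gy).mean()
    grad_deg = np.degrees(0.5 * np.arctan2(2 * Jxy, Jxx - Jyy))
    # fibre-like structures run perpendicular to the dominant gradient direction
    print(f"gradient {grad_deg:.1f} deg, structure axis {grad_deg + 90:.1f} deg")
    ```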

  6. Detection and quantitation of benzo(a)pyrene-DNA adducts in brain and liver tissues of Beluga whales (Delphinapterus leucas) from the St. Lawrence and Mackenzie Estuaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shugart, L.R.

    1988-01-01

    It should be noted that there are few analytical techniques available for the detection and quantitation of chemical adducts in the DNA of living organisms. The reasons for this are: the analytical technique often has to accommodate the unique chemical and/or physical properties of the individual chemical or its metabolite; the percentage of total chemical that becomes bound to DNA is usually small, since most of the parent compound is detoxified and excreted; not all adducts that form between the genotoxic agent and DNA are stable or are involved in the development of subsequent deleterious events in the organism; and the amount of DNA available for analysis is often quite limited. 16 refs., 1 tab.

  7. Development of CD3 cell quantitation algorithms for renal allograft biopsy rejection assessment utilizing open source image analysis software.

    PubMed

    Moon, Andres; Smith, Geoffrey H; Kong, Jun; Rogers, Thomas E; Ellis, Carla L; Farris, Alton B Brad

    2018-02-01

    Renal allograft rejection diagnosis depends on assessment of parameters such as interstitial inflammation; however, studies have shown interobserver variability regarding interstitial inflammation assessment. Since automated image analysis quantitation can be reproducible, we devised customized analysis methods for CD3+ T-cell staining density as a measure of rejection severity and compared them with established commercial methods along with visual assessment. Renal biopsy CD3 immunohistochemistry slides (n = 45), including renal allografts with various degrees of acute cellular rejection (ACR), were scanned for whole slide images (WSIs). Inflammation was quantitated in the WSIs using pathologist visual assessment, commercial algorithms (Aperio nuclear algorithm for CD3+ cells/mm2 and Aperio positive pixel count algorithm), and customized open source algorithms developed in ImageJ with thresholding/positive pixel counting (custom CD3+%) and identification of pixels fulfilling "maxima" criteria for CD3 expression (custom CD3+ cells/mm2). Based on visual inspections of "markup" images, CD3 quantitation algorithms produced adequate accuracy. Additionally, CD3 quantitation algorithms correlated between each other and also with visual assessment in a statistically significant manner (r = 0.44 to 0.94, p = 0.003 to < 0.0001). Methods for assessing inflammation suggested a progression through the tubulointerstitial ACR grades, with statistically different results in borderline versus other ACR types, in all but the custom methods. Assessment of CD3-stained slides using various open source image analysis algorithms presents salient correlations with established methods of CD3 quantitation. These analysis techniques are promising and highly customizable, providing a form of on-slide "flow cytometry" that can facilitate additional diagnostic accuracy in tissue-based assessments.
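
    A rough sketch of a thresholding-and-labelling count in the spirit of the custom ImageJ algorithms (not the study's code); the stain image, threshold and pixel size below are assumptions.

    ```python
    # Illustrative threshold-and-label cell count; stain image, threshold and
    # pixel size are invented.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(3)
    stain = rng.random((400, 400))                  # stand-in stain intensity image
    yy, xx = np.ogrid[:400, :400]
    for cy, cx in rng.integers(20, 380, size=(25, 2)):
        stain[(yy - cy) ** 2 + (xx - cx) ** 2 < 16] = 1.5   # 25 synthetic "cells"

    mask = stain > 1.2
    labeled, n_cells = ndimage.label(mask)          # connected-component count
    um_per_px = 0.5                                 # assumed pixel size (microns)
    area_mm2 = mask.size * (um_per_px / 1000.0) ** 2
    print(f"{n_cells} cells, {n_cells / area_mm2:.0f} cells/mm2")
    ```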

  8. The quantitative surface analysis of an antioxidant additive in a lubricant oil matrix by desorption electrospray ionization mass spectrometry

    PubMed Central

    Da Costa, Caitlyn; Reynolds, James C; Whitmarsh, Samuel; Lynch, Tom; Creaser, Colin S

    2013-01-01

    RATIONALE Chemical additives are incorporated into commercial lubricant oils to modify the physical and chemical properties of the lubricant. The quantitative analysis of additives in oil-based lubricants deposited on a surface without extraction of the sample from the surface presents a challenge. The potential of desorption electrospray ionization mass spectrometry (DESI-MS) for the quantitative surface analysis of an oil additive in a complex oil lubricant matrix without sample extraction has been evaluated. METHODS The quantitative surface analysis of the antioxidant additive octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix was carried out by DESI-MS in the presence of 2-(pentyloxy)ethyl 3-(3,5-di-tert-butyl-4-hydroxyphenyl)propionate as an internal standard. A quadrupole/time-of-flight mass spectrometer fitted with an in-house modified ion source enabling non-proximal DESI-MS was used for the analyses. RESULTS An eight-point calibration curve ranging from 1 to 80 µg/spot of octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix and in the presence of the internal standard was used to determine the quantitative response of the DESI-MS method. The sensitivity and repeatability of the technique were assessed by conducting replicate analyses at each concentration. The limit of detection was determined to be 11 ng/mm2 additive on spot with relative standard deviations in the range 3–14%. CONCLUSIONS The application of DESI-MS to the direct, quantitative surface analysis of a commercial lubricant additive in a native oil lubricant matrix is demonstrated. © 2013 The Authors. Rapid Communications in Mass Spectrometry published by John Wiley & Sons, Ltd. PMID:24097398
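
    For illustration (invented data, and the study's exact LOD procedure may differ): a calibration-line fit with a conventional 3.3·s/slope limit-of-detection estimate.

    ```python
    # Illustrative calibration fit with a conventional 3.3*s/slope LOD estimate;
    # the data are invented and the study's exact procedure may differ.
    import numpy as np

    amount = np.array([1, 5, 10, 20, 40, 60, 80], dtype=float)   # ug/spot
    rng = np.random.default_rng(4)
    response = 0.12 * amount + rng.normal(0, 0.05, amount.size)  # analyte/IS signal

    slope, intercept = np.polyfit(amount, response, 1)
    resid_sd = np.std(response - (slope * amount + intercept), ddof=2)
    lod = 3.3 * resid_sd / slope
    print(f"slope = {slope:.3f}, LOD = {lod:.2f} ug/spot")
    ```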

  9. Local structure in LaMnO3 and CaMnO3 perovskites: A quantitative structural refinement of Mn K-edge XANES data

    NASA Astrophysics Data System (ADS)

    Monesi, C.; Meneghini, C.; Bardelli, F.; Benfatto, M.; Mobilio, S.; Manju, U.; Sarma, D. D.

    2005-11-01

    Hole-doped perovskites such as La1-xCaxMnO3 present special magnetic and magnetotransport properties, and it is commonly accepted that the local atomic structure around Mn ions plays a crucial role in determining these peculiar features. Therefore experimental techniques directly probing the local atomic structure, like x-ray absorption spectroscopy (XAS), have been widely exploited to understand the physics of these compounds in depth. Quantitative XAS analysis usually concerns the extended region [extended x-ray absorption fine structure (EXAFS)] of the absorption spectra. The near-edge region [x-ray absorption near-edge spectroscopy (XANES)] of XAS spectra can provide detailed complementary information on the electronic structure and local atomic topology around the absorber. However, the complexity of the XANES analysis usually prevents a quantitative understanding of the data. This work exploits the recently developed MXAN code to achieve a quantitative structural refinement of the Mn K-edge XANES of LaMnO3 and CaMnO3 compounds; they are the end compounds of the doped manganite series La1-xCaxMnO3. The results derived from the EXAFS and XANES analyses are in good agreement, demonstrating that a quantitative picture of the local structure can be obtained from XANES in these crystalline compounds. Moreover, the quantitative XANES analysis provides topological information not directly achievable from EXAFS data analysis. This work demonstrates that combining the analysis of extended and near-edge regions of Mn K-edge XAS spectra could provide a complete and accurate description of the Mn local atomic environment in these compounds.

  10. Optimizing morphology through blood cell image analysis.

    PubMed

    Merino, A; Puigví, L; Boldú, L; Alférez, S; Rodellar, J

    2018-05-01

    Morphological review of the peripheral blood smear is still a crucial diagnostic aid, as it provides relevant information related to the diagnosis and is important for the selection of additional techniques. Nevertheless, the distinctive cytological characteristics of blood cells are subjective and influenced by the reviewer's interpretation; translating subjective morphological examination into objective parameters is therefore a challenge. The use of digital microscopy systems has become widespread in clinical laboratories. As automatic analyzers have some limitations in abnormal or neoplastic cell detection, it is of interest to identify quantitative features of the morphological characteristics of different cells through digital image analysis. Three main classes of features are used: geometric, color, and texture. Geometric parameters (nucleus/cytoplasmic ratio, cellular area, nucleus perimeter, cytoplasmic profile, RBC proximity, and others) are familiar to pathologists, as they are related to the visual cell patterns. Different color spaces can be used to investigate the rich amount of information that color may offer to describe abnormal lymphoid or blast cells. Texture is related to spatial patterns of color or intensities, which can be visually detected and quantitatively represented using statistical tools. This study reviews current and new quantitative features which can contribute to optimizing morphological analysis through blood cell digital image processing techniques. © 2018 John Wiley & Sons Ltd.

  11. Application of scenario analysis and multiagent technique in land-use planning: a case study on Sanjiang wetlands.

    PubMed

    Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong

    2013-01-01

    Land-use planning has triggered debates on social and environmental values, in which two key questions are faced: first, how to view different planning simulation results instantaneously and apply them back to assist planning work interactively; and second, how to ensure that the planning simulation results are scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches to land-use planning, and to find a way to close the gap between the ability to construct computer simulation models that aid integrated land-use plan making and the demand for such models by planning professionals. The study presented a theoretical framework for land-use planning based on the integration of the scenario analysis (SA) method and multiagent system (MAS) simulation, and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. The results showed that the MAS simulation technique, which emphasizes quantitative process, effectively complemented the SA method, which emphasizes qualitative process; this realized an organic combination of qualitative and quantitative land-use planning work, and provides a new idea and method for land-use planning and the sustainable management of land resources.

  12. Application of Scenario Analysis and Multiagent Technique in Land-Use Planning: A Case Study on Sanjiang Wetlands

    PubMed Central

    Ni, Shi-Jun; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong

    2013-01-01

    Land-use planning has triggered debates on social and environmental values, in which two key questions are faced: first, how to view different planning simulation results instantaneously and apply them back to assist planning work interactively; and second, how to ensure that the planning simulation results are scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches to land-use planning, and to find a way to close the gap between the ability to construct computer simulation models that aid integrated land-use plan making and the demand for such models by planning professionals. The study presented a theoretical framework for land-use planning based on the integration of the scenario analysis (SA) method and multiagent system (MAS) simulation, and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. The results showed that the MAS simulation technique, which emphasizes quantitative process, effectively complemented the SA method, which emphasizes qualitative process; this realized an organic combination of qualitative and quantitative land-use planning work, and provides a new idea and method for land-use planning and the sustainable management of land resources. PMID:23818816

  13. Quantitative analysis of the effect of environmental-scanning electron microscopy on collagenous tissues.

    PubMed

    Lee, Woowon; Toussaint, Kimani C

    2018-05-31

    Environmental-scanning electron microscopy (ESEM) is routinely applied to various biological samples due to its ability to maintain a wet environment while imaging; moreover, the technique obviates the need for sample coating. However, little research has been carried out on electron-beam (e-beam) induced tissue damage resulting from use of the ESEM. In this paper, we use quantitative second-harmonic generation (SHG) microscopy to examine the effects of e-beam exposure from the ESEM on collagenous tissue samples prepared as either fixed, frozen, wet or dehydrated. Quantitative SHG analysis of tissues, before and after ESEM e-beam exposure in low-vacuum mode, reveals evidence of cross-linking of collagen fibers; however, no structural differences are observed in fixed tissue. Meanwhile, wet-mode ESEM appears to radically alter the structure from a regular fibrous arrangement to a more random fiber orientation. We also confirm that ESEM images of collagenous tissues show higher spatial resolution compared to SHG microscopy, but the tradeoff in collagen specificity reduces their effectiveness in quantifying collagen fiber organization. Our work provides insight into both the limitations of the ESEM for tissue imaging and the potential opportunity to use it as a complementary technique when imaging fine features in the non-collagenous regions of tissue samples.

  14. Quantitative trace analysis of complex mixtures using SABRE hyperpolarization.

    PubMed

    Eshuis, Nan; van Weerdenburg, Bram J A; Feiters, Martin C; Rutjes, Floris P J T; Wijmenga, Sybren S; Tessari, Marco

    2015-01-26

    Signal amplification by reversible exchange (SABRE) is an emerging nuclear spin hyperpolarization technique that strongly enhances NMR signals of small molecules in solution. However, such signal enhancements have never been exploited for concentration determination, as the efficiency of SABRE can strongly vary between different substrates or even between nuclear spins in the same molecule. The first application of SABRE for the quantitative analysis of a complex mixture is now reported. Despite the inherent complexity of the system under investigation, which involves thousands of competing binding equilibria, analytes at concentrations in the low micromolar range could be quantified from single-scan SABRE spectra using a standard-addition approach. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
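
    A minimal sketch of the standard-addition quantitation mentioned above, with invented values: the analyte concentration is read from the x-intercept of signal versus added standard.

    ```python
    # Illustrative standard addition with invented values: the original analyte
    # concentration is the magnitude of the x-intercept of the fitted line.
    import numpy as np

    added = np.array([0.0, 5.0, 10.0, 15.0])    # uM standard added
    signal = np.array([2.1, 4.0, 6.2, 8.1])     # enhanced NMR signal (a.u.)

    slope, intercept = np.polyfit(added, signal, 1)
    c0 = intercept / slope                      # x-intercept magnitude
    print(f"analyte concentration = {c0:.1f} uM")
    ```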

  15. Quantitative phase imaging of human red blood cells using phase-shifting white light interference microscopy with colour fringe analysis

    NASA Astrophysics Data System (ADS)

    Singh Mehta, Dalip; Srivastava, Vishal

    2012-11-01

    We report quantitative phase imaging of human red blood cells (RBCs) using phase-shifting interference microscopy. Five phase-shifted white light interferograms are recorded using a colour charge-coupled device (CCD) camera. The white light interferograms were decomposed into red, green, and blue colour components. The phase-shifted interferograms of each colour were then processed by phase-shifting analysis, and phase maps for the red, green, and blue colours were reconstructed. Wavelength-dependent refractive index profiles of RBCs were computed from a single set of white light interferograms. The present technique has great potential for non-invasive determination of refractive index variation and morphological features of cells and tissues.
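
    As a generic illustration of five-step phase-shifting analysis (a Hariharan-style algorithm; the paper's exact processing may differ), applied here to synthetic one-dimensional fringes:

    ```python
    # Illustrative five-step (Hariharan-style) phase retrieval on synthetic
    # one-dimensional fringes with pi/2 phase steps.
    import numpy as np

    phase_true = np.linspace(0, 2 * np.pi, 256)          # toy object phase
    steps = [-np.pi, -np.pi / 2, 0.0, np.pi / 2, np.pi]
    I1, I2, I3, I4, I5 = [1 + 0.8 * np.cos(phase_true + d) for d in steps]

    wrapped = np.arctan2(2 * (I2 - I4), 2 * I3 - I1 - I5)   # five-frame formula
    recovered = np.unwrap(wrapped)
    print(np.allclose(np.diff(recovered), np.diff(phase_true)))   # True
    ```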

  16. Coercivity degradation caused by inhomogeneous grain boundaries in sintered Nd-Fe-B permanent magnets

    NASA Astrophysics Data System (ADS)

    Chen, Hansheng; Yun, Fan; Qu, Jiangtao; Li, Yingfei; Cheng, Zhenxiang; Fang, Ruhao; Ye, Zhixiao; Ringer, Simon P.; Zheng, Rongkun

    2018-05-01

    Quantitative correlation between intrinsic coercivity and grain boundaries in three dimensions is critical to further improve the performance of sintered Nd-Fe-B permanent magnets. Here, we quantitatively reveal the local composition variation across and especially along grain boundaries using the powerful atomic-scale analysis technique known as atom probe tomography. We also estimate the saturation magnetization, magnetocrystalline anisotropy constant, and exchange stiffness of the grain boundaries on the basis of the experimentally determined structure and composition. Finally, using micromagnetic simulations, we quantify the intrinsic coercivity degradation caused by inhomogeneous grain boundaries. This approach can be applied to other magnetic materials for the analysis and optimization of magnetic properties.

  17. Computer simulation of schlieren images of rotationally symmetric plasma systems: a simple method.

    PubMed

    Noll, R; Haas, C R; Weikl, B; Herziger, G

    1986-03-01

    Schlieren techniques are commonly used methods for quantitative analysis of cylindrical or spherical index of refraction profiles. Many schlieren objects, however, are characterized by more complex geometries, so we have investigated the more general case of noncylindrical, rotationally symmetric distributions of index of refraction n(r,z). Assuming straight ray paths in the schlieren object we have calculated 2-D beam deviation profiles. It is shown that experimental schlieren images of the noncylindrical plasma generated by a plasma focus device can be simulated with these deviation profiles. The computer simulation allows a quantitative analysis of these schlieren images, which yields, for example, the plasma parameters, electron density, and electron density gradients.

  18. Rapid Quantitative Determination of Squalene in Shark Liver Oils by Raman and IR Spectroscopy.

    PubMed

    Hall, David W; Marshall, Susan N; Gordon, Keith C; Killeen, Daniel P

    2016-01-01

    Squalene is sourced predominantly from shark liver oils and to a lesser extent from plants such as olives. It is used for the production of surfactants, dyes, sunscreen, and cosmetics. The economic value of shark liver oil is directly related to the squalene content, which in turn is highly variable and species-dependent. Presented here is a validated gas chromatography-mass spectrometry analysis method for the quantitation of squalene in shark liver oils, with an accuracy of 99.0 %, precision of 0.23 % (standard deviation), and linearity of >0.999. The method has been used to measure the squalene concentration of 16 commercial shark liver oils. These reference squalene concentrations were related to infrared (IR) and Raman spectra of the same oils using partial least squares regression. The resultant models were suitable for the rapid quantitation of squalene in shark liver oils, with cross-validation r2 values of >0.98 and root mean square errors of validation of ≤4.3 % w/w. Independent test set validation of these models found mean absolute deviations of 4.9 and 1.0 % w/w for the IR and Raman models, respectively. Both techniques were more accurate than results obtained by an industrial refractive index analysis method, which is used for rapid, cheap quantitation of squalene in shark liver oils. In particular, the Raman partial least squares regression was suited to quantitative squalene analysis. The intense and highly characteristic Raman bands of squalene made quantitative analysis possible irrespective of the lipid matrix.

  19. Thinking big

    NASA Astrophysics Data System (ADS)

    Collins, Harry

    2008-02-01

    Physicists are often quick to discount social research based on qualitative techniques such as ethnography and "deep case studies" - where a researcher draws conclusions about a community based on immersion in the field - thinking that only quantitative research backed up by statistical analysis is sound. The balance is not so clear, however.

  20. The Case for Open Source Software: The Interactional Discourse Lab

    ERIC Educational Resources Information Center

    Choi, Seongsook

    2016-01-01

    Computational techniques and software applications for the quantitative content analysis of texts are now well established, and many qualitative data software applications enable the manipulation of input variables and the visualization of complex relations between them via interactive and informative graphical interfaces. Although advances in…

  1. Direct Allocation Costing: Informed Management Decisions in a Changing Environment.

    ERIC Educational Resources Information Center

    Mancini, Cesidio G.; Goeres, Ernest R.

    1995-01-01

    It is argued that colleges and universities can use direct allocation costing to provide quantitative information needed for decision making. This method of analysis requires institutions to modify traditional ideas of costing, looking to the private sector for examples of accurate costing techniques. (MSE)

  2. Applications of mass spectrometry for quantitative protein analysis in formalin-fixed paraffin-embedded tissues

    PubMed Central

    Steiner, Carine; Ducret, Axel; Tille, Jean-Christophe; Thomas, Marlene; McKee, Thomas A; Rubbia-Brandt, Laura A; Scherl, Alexander; Lescuyer, Pierre; Cutler, Paul

    2014-01-01

    Proteomic analysis of tissues has advanced in recent years as instruments and methodologies have evolved. The ability to retrieve peptides from formalin-fixed paraffin-embedded tissues followed by shotgun or targeted proteomic analysis is offering new opportunities in biomedical research. In particular, access to large collections of clinically annotated samples should enable the detailed analysis of pathologically relevant tissues in a manner previously considered unfeasible. In this paper, we review the current status of proteomic analysis of formalin-fixed paraffin-embedded tissues with a particular focus on targeted approaches and the potential for this technique to be used in clinical research and clinical diagnosis. We also discuss the limitations and perspectives of the technique, particularly with regard to application in clinical diagnosis and drug discovery. PMID:24339433

  3. Hydrogen measurement during steam oxidation using coupled thermogravimetric analysis and quadrupole mass spectrometry

    DOE PAGES

    Parkison, Adam J.; Nelson, Andrew Thomas

    2016-01-11

    An analytical technique is presented with the goal of measuring reaction kinetics during steam oxidation reactions for three cases in which obtaining kinetics information often requires a prohibitive amount of time and cost. The technique presented relies on coupling thermogravimetric analysis (TGA) with a quantitative hydrogen measurement technique using quadrupole mass spectrometry (QMS). The first case considered is in differentiating between the kinetics of steam oxidation reactions and those for simultaneously reacting gaseous impurities such as nitrogen or oxygen. The second case allows one to independently measure the kinetics of oxide and hydride formation for systems in which both of these reactions are known to take place during steam oxidation. The third case deals with measuring the kinetics of formation for competing volatile and non-volatile oxides during certain steam oxidation reactions. In order to meet the requirements of the coupled technique, a methodology is presented which attempts to provide quantitative measurement of hydrogen generation using QMS in the presence of an interfering fragmentation species, namely water vapor. This is achieved such that all calibrations and corrections are performed during the TGA baseline and steam oxidation programs, making system operation virtually identical to standard TGA. Benchmarking results showed a relative error in hydrogen measurement of 5.7–8.4% following the application of a correction factor. Lastly, suggestions are made for possible improvements to the presented technique so that it may be better applied to the three cases presented.

  4. Hydrogen measurement during steam oxidation using coupled thermogravimetric analysis and quadrupole mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parkison, Adam J.; Nelson, Andrew Thomas

    An analytical technique is presented with the goal of measuring reaction kinetics during steam oxidation reactions for three cases in which obtaining kinetics information often requires a prohibitive amount of time and cost. The technique presented relies on coupling thermogravimetric analysis (TGA) with a quantitative hydrogen measurement technique using quadrupole mass spectrometry (QMS). The first case considered is in differentiating between the kinetics of steam oxidation reactions and those for simultaneously reacting gaseous impurities such as nitrogen or oxygen. The second case allows one to independently measure the kinetics of oxide and hydride formation for systems in which both of these reactions are known to take place during steam oxidation. The third case deals with measuring the kinetics of formation for competing volatile and non-volatile oxides during certain steam oxidation reactions. In order to meet the requirements of the coupled technique, a methodology is presented which attempts to provide quantitative measurement of hydrogen generation using QMS in the presence of an interfering fragmentation species, namely water vapor. This is achieved such that all calibrations and corrections are performed during the TGA baseline and steam oxidation programs, making system operation virtually identical to standard TGA. Benchmarking results showed a relative error in hydrogen measurement of 5.7–8.4% following the application of a correction factor. Lastly, suggestions are made for possible improvements to the presented technique so that it may be better applied to the three cases presented.

  5. The superiority of indium ratio over blood pool subtraction in analysis of indium-111 platelet deposition on thoraco-abdominal prosthetic grafts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ripley, S.; Wakefield, T.; Spaulding, S.

    1985-05-01

    In this investigation platelet deposition in polytetrafluoroethylene (PTFE) thoraco-abdominal grafts (TAGs) was evaluated using two different semi-quantitative techniques. Ten PTFE TAGs 6 mm in diameter and 30 cm in length were inserted into 10 mongrel dogs. One, 4 and 6 weeks after graft implantation the animals were injected with autologous In-111 platelets labelled by a modified Thakur technique. Platelet imaging in grafts was performed 48 hrs after injection. Blood pool was determined by Tc99m labelled RBCs (in vivo/in vitro technique). Semi-quantitative analysis was performed by subdividing the imaged graft into three major regions and selecting a reference region from either the native aorta or common iliac artery. Excess platelet deposition was determined by two methods: 1) the ratio of In-111 counts in the graft ROIs to the reference region and 2) the percent In-111 excess using the Tc99m blood pool subtraction technique (TBPST). Animals were sacrificed 7 weeks after implantation and radioactivity in the excised grafts was determined using a well counter. A positive correlation was found to exist between the In-111 ratio percent analysis (IRPA) and direct gamma counting (DGC) for all three segments of the prosthetic graft. Correlation coefficients for the thorax, midsegment and abdominal segments were 0.80, 0.73 and 0.48 respectively. There was no correlation between TBPST and DGC. Using the IRPA technique the thrombogenicity of TAGs can be routinely assessed and is clinically applicable for patient use. TBPST should probably be limited to the extremities to avoid error due to free Tc99m counts from kidneys and ureters.

  6. On line biomonitors used as a tool for toxicity reduction evaluation of in situ groundwater remediation techniques.

    PubMed

    Küster, Eberhard; Dorusch, Falk; Vogt, Carsten; Weiss, Holger; Altenburger, Rolf

    2004-07-15

    Success of groundwater remediation is typically monitored via snapshot analysis of selected chemical substances or physical parameters. Biological parameters, i.e. ecotoxicological assays, are rarely employed. Hence the aim of the study was to develop a bioassay tool which allows on line monitoring of contaminated groundwater as well as toxicity reduction evaluation (TRE) of different remediation techniques in parallel, and which may furthermore be used as an additional process control tool to supervise remediation techniques in real time. Parallel testing of groundwater remediation techniques was accomplished for short and long time periods by using the energy-dependent luminescence of the bacterium Vibrio fischeri as the biological monitoring parameter. One data point per hour for each remediation technique was generated by an automated biomonitor. The bacteria proved to be highly sensitive to the contaminated groundwater, and the biomonitor showed a long operating time despite the highly corrosive groundwater present in Bitterfeld, Germany. The bacterial biomonitor is demonstrated to be a valuable tool for remediation success evaluation. Dose-response relationships were generated for the six quantitatively dominant groundwater contaminants (2-chlorotoluene, 1,2- and 1,4-dichlorobenzene, monochlorobenzene, ethylbenzene and benzene). The concentrations of individual volatile organic chemicals (VOCs) could not explain the observed effects in the bacteria. An expected mixture toxicity was calculated for the six components using the concept of concentration addition. The calculated EC50 for the mixture was still one order of magnitude lower than the observed EC50 of the actual groundwater. These results show that chemical analysis of the six quantitatively dominant substances alone cannot explain the effects observed with the bacteria; thus chemical analysis alone may not be an adequate tool for evaluating remediation success in terms of toxicity reduction.
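
    The concentration-addition calculation referenced above follows 1/EC50_mix = Σ p_i/EC50_i, where p_i is the fraction of component i in the mixture. A sketch with invented numbers:

    ```python
    # Illustrative concentration-addition prediction with invented fractions
    # and single-substance EC50 values.
    import numpy as np

    p = np.array([0.40, 0.25, 0.15, 0.10, 0.06, 0.04])   # mixture fractions, sum 1
    ec50 = np.array([12.0, 8.0, 20.0, 5.0, 30.0, 15.0])  # mg/L per compound

    ec50_mix = 1.0 / np.sum(p / ec50)
    print(f"predicted mixture EC50 = {ec50_mix:.2f} mg/L")
    ```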

  7. Quantitative analysis of trace metal accumulation in teeth using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Samek, O.; Beddows, D. C. S.; Telle, H. H.; Morris, G. W.; Liska, M.; Kaiser, J.

    The technique of laser ablation is receiving increasing attention for applications in dentistry, specifically for the treatment of teeth (e.g. drilling of micro-holes and plaque removal). In the process of ablation a luminous micro-plasma is normally generated which may be exploited for elemental analysis. Here we report on quantitative Laser-Induced Breakdown Spectroscopy (LIBS) analysis to study the presence of trace minerals in teeth. A selection of teeth of different age groups has been investigated, ranging from the first teeth of infants, through the second teeth of children, to adults to trace the influence of environmental factors on the accumulation of a number of elements in teeth. We found a close link between elements detected in tooth fillings and toothpastes with those present in teeth.

  8. Quantitative Aspects of Single Molecule Microscopy

    PubMed Central

    Ober, Raimund J.; Tahmasbi, Amir; Ram, Sripad; Lin, Zhiping; Ward, E. Sally

    2015-01-01

    Single molecule microscopy is a relatively new optical microscopy technique that allows the detection of individual molecules such as proteins in a cellular context. This technique has generated significant interest among biologists, biophysicists and biochemists, as it holds the promise to provide novel insights into subcellular processes and structures that otherwise cannot be gained through traditional experimental approaches. Single molecule experiments place stringent demands on experimental and algorithmic tools due to the low signal levels and the presence of significant extraneous noise sources. Consequently, this has necessitated the use of advanced statistical signal and image processing techniques for the design and analysis of single molecule experiments. In this tutorial paper, we provide an overview of single molecule microscopy from early works to current applications and challenges. Specific emphasis will be on the quantitative aspects of this imaging modality, in particular single molecule localization and resolvability, which will be discussed from an information theoretic perspective. We review the stochastic framework for image formation, different types of estimation techniques and expressions for the Fisher information matrix. We also discuss several open problems in the field that demand highly non-trivial signal processing algorithms. PMID:26167102

  9. Visualizing Ebolavirus Particles Using Single-Particle Interferometric Reflectance Imaging Sensor (SP-IRIS).

    PubMed

    Carter, Erik P; Seymour, Elif Ç; Scherr, Steven M; Daaboul, George G; Freedman, David S; Selim Ünlü, M; Connor, John H

    2017-01-01

    This chapter describes an approach for the label-free imaging and quantification of intact Ebola virus (EBOV) and EBOV viruslike particles (VLPs) using a light microscopy technique. In this technique, individual virus particles are captured onto a silicon chip that has been printed with spots of virus-specific capture antibodies. These captured virions are then detected using an optical approach called interference reflectance imaging. This approach allows for the detection of each virus particle that is captured on an antibody spot and can resolve the filamentous structure of EBOV VLPs without the need for electron microscopy. Capture of VLPs and virions can be done from a variety of sample types ranging from tissue culture medium to blood. The technique also allows automated quantitative analysis of the number of virions captured. This can be used to identify the virus concentration in an unknown sample. In addition, this technique offers the opportunity to easily image virions captured from native solutions without the need for additional labeling approaches while offering a means of assessing the range of particle sizes and morphologies in a quantitative manner.

  10. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on Fuzzy Mathematics and Grey Relational Theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method is described for a particular medical device (a C-arm X-ray machine).
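
    A compact illustration of a grey relational analysis step of the kind such an improved FMEA can use (the paper's exact formulation may differ); the severity/occurrence/detection ratings below are invented.

    ```python
    # Illustrative grey relational grade for three failure modes with invented
    # severity/occurrence/detection ratings; lower grade = further from the
    # ideal (lowest-risk) reference, i.e. higher risk.
    import numpy as np

    ratings = np.array([[7, 4, 6],
                        [5, 8, 3],
                        [9, 2, 5]], dtype=float)
    reference = ratings.min(axis=0)             # ideal lowest-risk series

    delta = np.abs(ratings - reference)
    rho = 0.5                                   # distinguishing coefficient
    coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    grade = coeff.mean(axis=1)
    print("failure modes, highest risk first:", np.argsort(grade))
    ```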

  11. Quantitative analysis of thyroid tumors vascularity: A comparison between 3-D contrast-enhanced ultrasound and 3-D Power Doppler on benign and malignant thyroid nodules.

    PubMed

    Caresio, Cristina; Caballo, Marco; Deandrea, Maurilio; Garberoglio, Roberto; Mormile, Alberto; Rossetto, Ruth; Limone, Paolo; Molinari, Filippo

    2018-05-15

    To perform a comparative quantitative analysis of Power Doppler ultrasound (PDUS) and contrast-enhanced ultrasound (CEUS) for the quantification of thyroid nodule vascularity patterns, with the goal of identifying biomarkers correlated with the malignancy of the nodule for both imaging techniques. We propose a novel method to reconstruct the vascular architecture from 3-D PDUS and CEUS images of thyroid nodules, and to automatically extract seven quantitative features related to the morphology and distribution of the vascular network. Features include three tortuosity metrics, the number of vascular trees and branches, the vascular volume density, and the main spatial vascularity pattern. Feature extraction was performed on 20 thyroid lesions (ten benign and ten malignant), for which we acquired both PDUS and CEUS images. MANOVA (multivariate analysis of variance) was used to differentiate benign and malignant lesions based on the most significant features. The analysis of the extracted features showed a significant difference between the benign and malignant nodules for both PDUS and CEUS techniques for all the features. Furthermore, by using a linear classifier on the significant features identified by the MANOVA, benign nodules could be entirely separated from the malignant ones. Our early results confirm the correlation between the morphology and distribution of blood vessels and the malignancy of the lesion, and also show (at least for the dataset used in this study) a considerable similarity between the findings of PDUS and CEUS imaging for thyroid nodule diagnosis and classification. © 2018 American Association of Physicists in Medicine.
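
    As an illustration of one common tortuosity metric of the kind listed above (the distance metric: path length over chord length), computed for a synthetic vessel centerline; this is not the authors' implementation.

    ```python
    # Illustrative distance-metric tortuosity (path length / chord length) for
    # a synthetic 3-D vessel centerline.
    import numpy as np

    t = np.linspace(0, 2 * np.pi, 200)
    path = np.stack([t, np.sin(3 * t), 0.2 * t], axis=1)   # wiggly centerline

    arc_length = np.linalg.norm(np.diff(path, axis=0), axis=1).sum()
    chord = np.linalg.norm(path[-1] - path[0])
    print(f"tortuosity = {arc_length / chord:.2f}")
    ```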

  12. Automatic phase aberration compensation for digital holographic microscopy based on deep learning background detection.

    PubMed

    Nguyen, Thanh; Bui, Vy; Lam, Van; Raub, Christopher B; Chang, Lin-Ching; Nehmetallah, George

    2017-06-26

    We propose a fully automatic technique to obtain aberration-free quantitative phase imaging in digital holographic microscopy (DHM) based on deep learning. Traditional DHM solves the phase aberration compensation problem by manually detecting the background for quantitative measurement. This is a drawback for real-time implementation and for dynamic processes such as cell migration phenomena. A recent automatic aberration compensation approach using principal component analysis (PCA) in DHM avoids human intervention regardless of the cells' motion. However, it corrects spherical/elliptical aberration only and disregards higher order aberrations. Traditional image segmentation techniques can be employed to spatially detect cell locations. Ideally, automatic image segmentation techniques make real-time measurement possible. However, existing automatic unsupervised segmentation techniques have poor performance when applied to DHM phase images because of aberrations and speckle noise. In this paper, we propose a novel method that combines a supervised deep learning technique based on a convolutional neural network (CNN) with Zernike polynomial fitting (ZPF). The deep learning CNN is implemented to perform automatic background region detection, which allows ZPF to compute the self-conjugated phase to compensate for most aberrations.
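
    A minimal sketch of the Zernike polynomial fitting (ZPF) step, assuming a background mask is already available (the CNN detection itself is omitted); the modes, mask and phase below are synthetic.

    ```python
    # Illustrative least-squares fit of four low-order Zernike modes to a
    # synthetic aberration, restricted to an assumed background mask.
    import numpy as np

    n = 128
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    r, th = np.hypot(x, y), np.arctan2(y, x)

    # piston, tilt x, tilt y, defocus
    basis = np.stack([np.ones_like(r), r * np.cos(th), r * np.sin(th),
                      2 * r ** 2 - 1], axis=-1)
    aberration = 0.3 * basis[..., 1] + 0.7 * basis[..., 3]   # synthetic phase

    mask = (r <= 1.0) & (np.hypot(x - 0.5, y - 0.5) > 0.35)  # "background" pixels
    coeffs, *_ = np.linalg.lstsq(basis[mask], aberration[mask], rcond=None)
    compensated = aberration - basis @ coeffs                # conjugate-phase correction
    print("fitted coefficients:", coeffs.round(3))
    ```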

  13. Liquid chromatographic separation of terpenoid pigments in foods and food products.

    PubMed

    Cserháti, T; Forgács, E

    2001-11-30

    The newest achievements in the use of various liquid chromatographic techniques such as adsorption and reversed-phase thin-layer chromatography and HPLC employed for the separation and quantitative determination of terpenoid-based color substances in foods and food products are reviewed. The techniques applied for the analysis of individual pigments and pigments classes are surveyed and critically evaluated. Future trends in the separation and identification of pigments in foods and food products are delineated.

  14. Laser-induced breakdown spectroscopy (LIBS) technique for the determination of the chemical composition of complex inorganic materials

    NASA Astrophysics Data System (ADS)

    Łazarek, Łukasz; Antończak, Arkadiusz J.; Wójcik, Michał R.; Kozioł, Paweł E.; Stepak, Bogusz; Abramski, Krzysztof M.

    2014-08-01

    Laser-induced breakdown spectroscopy (LIBS) is a fast, fully optical method that needs little or no sample preparation. In this technique, qualitative and quantitative analyses are comparative: the composition is generally determined from a calibration curve, namely the LIBS signal versus the concentration of the analyte. Typically, certified reference materials of known elemental composition are used to calibrate the system. Nevertheless, differences in overall composition between such reference samples and complex inorganic materials can significantly degrade accuracy, and intermediate factors such as optical absorption, surface structure and thermal conductivity can introduce further imprecision. This paper presents a calibration procedure performed with specially prepared pellets of the tested materials, whose composition was defined beforehand. We also propose post-processing methods that mitigate the matrix effects and allow a reliable and accurate analysis. The technique was applied to the determination of trace elements in industrial copper concentrates standardized by conventional flame atomic absorption spectroscopy. A series of copper flotation concentrate samples was analyzed for the content of three elements: silver, cobalt and vanadium. It has been shown that the described technique can be used for qualitative and quantitative analyses of complex inorganic materials, such as copper flotation concentrates.
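
    The calibration-curve construction named above can be sketched as a simple linear fit of line intensity versus reference concentration, inverted for unknown samples; the numbers below are invented for illustration.

```python
import numpy as np

# Reference concentrations (ppm) and integrated LIBS line intensities (a.u.);
# both columns are invented placeholders, not measured data.
conc = np.array([10.0, 25.0, 50.0, 100.0, 200.0])
signal = np.array([0.8, 2.1, 4.0, 8.3, 16.1])

slope, intercept = np.polyfit(conc, signal, 1)  # linear calibration curve

unknown_signal = 6.2
unknown_conc = (unknown_signal - intercept) / slope  # invert the curve
print(f"estimated concentration: {unknown_conc:.1f} ppm")
```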

  15. Quantitative analysis of major dibenzocyclooctane lignans in Schisandrae fructus by online TLC-DART-MS.

    PubMed

    Kim, Hye Jin; Oh, Myung Sook; Hong, Jongki; Jang, Young Pyo

    2011-01-01

    Direct analysis in real time (DART) is a powerful ionization technique for the quick and easy detection of various organic molecules without any sample preparation steps, but its lack of quantitation capability limits its wider use in phytochemical analysis. The aim was therefore to devise a new system that utilizes DART-MS as a hyphenated detector for quantitation. A total extract of Schisandra chinensis fruit was analyzed on a TLC plate, and three major lignan compounds were quantitated by three different methods, UV densitometry, TLC-DART-MS and HPLC-UV, to compare the efficiency of each. To introduce the TLC plate into the DART ion source at a constant velocity, a syringe pump was employed, and the DART-MS total ion current chromatogram was recorded for the entire plate. The concentration of each lignan compound was calculated from a calibration curve established with the standard compound. Gomisin A, gomisin N and schisandrin were well separated on a silica-coated TLC plate, and the specific ion current chromatograms were successfully acquired from the TLC-DART-MS system. For the quantitation of natural products, the TLC-DART-MS system showed better linearity and specificity than TLC densitometry, and consumed less time and solvent than the conventional HPLC method. A hyphenated system for the quantitation of phytochemicals from crude herbal drugs was thus successfully established and shown to have a powerful analytical capacity for the prompt and efficient quantitation of natural products from crude drugs. Copyright © 2010 John Wiley & Sons, Ltd.

  16. Macroscopic X-ray Powder Diffraction Scanning: Possibilities for Quantitative and Depth-Selective Parchment Analysis.

    PubMed

    Vanmeert, Frederik; De Nolf, Wout; Dik, Joris; Janssens, Koen

    2018-06-05

    At or below the surface of painted works of art, valuable information is present that provides insight into an object's past, such as the artist's technique, the creative process that was followed, its conservation history, and its current state of preservation. Various noninvasive techniques have been developed over the past two decades that can probe this information either locally (via point analysis) or on a macroscopic scale (e.g., full-field imaging and raster scanning). Recently, macroscopic X-ray powder diffraction (MA-XRPD) mapping using laboratory X-ray sources was developed; this method can visualize highly specific chemical distributions at the macroscale (dm²). In this work we demonstrate the synergy between the quantitative aspects of powder diffraction and the noninvasive scanning capability of MA-XRPD, highlighting the potential of the method to reveal new types of information. Quantitative data derived from a 15th/16th-century illuminated sheet of parchment revealed three lead white pigments with different hydrocerussite-cerussite compositions in specific pictorial elements, while quantification of impurities in the blue azurite pigment revealed two distinct azurite types: one rich in barite and one in quartz. Furthermore, on the same artifact, the depth-selective possibilities of the method, which stem from the shift of the measured diffraction peaks with respect to reference data, are highlighted, and the influence of different experimental parameters on the depth-selective analysis is briefly discussed. Promising stratigraphic information could be obtained, even though the analysis is hampered by not yet fully understood variations in the unit cell dimensions of the crystalline pigment phases.

  17. Using a normalization 3D model for automatic clinical brain quantitative analysis and evaluation

    NASA Astrophysics Data System (ADS)

    Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping

    2003-05-01

    Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain and has long been used to diagnose brain disorders through clinical quantitative analysis. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the corresponding functional PET or SPECT images. The accuracy of the quantitative analysis thus relies on that of the subjectively selected ROIs, so standardizing the analysis procedure is fundamental to improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D brain model to achieve precise quantitative analysis. In the normalization process, a mutual information registration technique was applied to realign functional medical images to standard structural medical images. The standard 3D brain model, which shows well-defined brain regions, then replaced the manual ROIs in the objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in a practical clinical evaluation. The results show that the quantitative outcomes obtained from this automated method agree with the clinical diagnosis evaluation score to within 3% error on average. In summary, the method automatically obtains precise VOI information from the well-defined standard 3D brain model, sparing the manual slice-by-slice drawing of ROIs on structural images required by the traditional procedure. That is, the method not only provides precise analysis results but also improves throughput when processing large numbers of medical images in clinical practice.
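
    The mutual information similarity measure that drives the registration step can be sketched from the joint grey-level histogram of the two images; this is a generic illustration, not the authors' implementation.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information of two equally sized images (higher = better aligned)."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()            # joint probability
    px = pxy.sum(axis=1, keepdims=True)  # marginal of image A
    py = pxy.sum(axis=0, keepdims=True)  # marginal of image B
    nz = pxy > 0                         # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```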

  18. Improvements to direct quantitative analysis of multiple microRNAs facilitating faster analysis.

    PubMed

    Ghasemi, Farhad; Wegman, David W; Kanoatov, Mirzo; Yang, Burton B; Liu, Stanley K; Yousef, George M; Krylov, Sergey N

    2013-11-05

    Studies suggest that patterns of deregulation in sets of microRNA (miRNA) can be used as cancer diagnostic and prognostic biomarkers. Establishing a "miRNA fingerprint"-based diagnostic technique requires a suitable miRNA quantitation method. The appropriate method must be direct, sensitive, capable of simultaneous analysis of multiple miRNAs, rapid, and robust. Direct quantitative analysis of multiple microRNAs (DQAMmiR) is a recently introduced capillary electrophoresis-based hybridization assay that satisfies most of these criteria. Previous implementations of the method suffered, however, from slow analysis time and required lengthy and stringent purification of hybridization probes. Here, we introduce a set of critical improvements to DQAMmiR that address these technical limitations. First, we have devised an efficient purification procedure that achieves the required purity of the hybridization probe in a fast and simple fashion. Second, we have optimized the concentrations of the DNA probe to decrease the hybridization time to 10 min. Lastly, we have demonstrated that the increased probe concentrations and decreased incubation time removed the need for masking DNA, further simplifying the method and increasing its robustness. The presented improvements bring DQAMmiR closer to use in a clinical setting.

  19. High-Precision Pinpointing of Luminescent Targets in Encoder-Assisted Scanning Microscopy Allowing High-Speed Quantitative Analysis.

    PubMed

    Zheng, Xianlin; Lu, Yiqing; Zhao, Jiangbo; Zhang, Yuhai; Ren, Wei; Liu, Deming; Lu, Jie; Piper, James A; Leif, Robert C; Liu, Xiaogang; Jin, Dayong

    2016-01-19

    Compared with routine microscopy imaging of a few analytes at a time, rapid scanning through the whole sample area of a microscope slide to locate every single target object offers many advantages in terms of simplicity, speed, throughput, and potential for robust quantitative analysis. Existing techniques that accommodate solid-phase samples incorporating individual micrometer-sized targets generally rely on digital microscopy and image analysis, with intrinsically low throughput and reliability. Here, we report an advanced on-the-fly stage scanning method to achieve high-precision target location across the whole slide. By integrating X- and Y-axis linear encoders into a motorized stage as the virtual "grids" that provide real-time positional references, we demonstrate an orthogonal scanning automated microscopy (OSAM) technique which can search a coverslip area of 50 × 24 mm² in just 5.3 min and locate individual 15 μm lanthanide luminescent microspheres with standard deviations of 1.38 and 1.75 μm in the X and Y directions. Together with an autofocus unit that compensates in real time for slide tilt along the Z-axis, we increase the luminescence detection efficiency by 35% with an improved coefficient of variation. We demonstrate the capability of advanced OSAM for robust quantification of luminescence intensities and lifetimes of a variety of micrometer-scale luminescent targets, specifically single down-shifting and upconversion microspheres, crystalline microplates, and color-barcoded microrods, as well as quantitative suspension array assays of biotinylated-DNA-functionalized upconversion nanoparticles.

  20. Automatic quantitative analysis of in-stent restenosis using FD-OCT in vivo intra-arterial imaging.

    PubMed

    Mandelias, Kostas; Tsantis, Stavros; Spiliopoulos, Stavros; Katsakiori, Paraskevi F; Karnabatidis, Dimitris; Nikiforidis, George C; Kagadis, George C

    2013-06-01

    A new segmentation technique is implemented for automatic lumen area extraction and stent strut detection in intravascular optical coherence tomography (OCT) images for the purpose of quantitative analysis of in-stent restenosis (ISR). In addition, a user-friendly graphical user interface (GUI) based on the employed algorithm is developed toward clinical use. Four clinical datasets of frequency-domain OCT scans of the human femoral artery were analyzed. First, a segmentation method based on fuzzy C-means (FCM) clustering and the wavelet transform (WT) was applied for inner luminal contour extraction. Subsequently, stent strut positions were detected by incorporating metrics derived from the local maxima of the wavelet transform into the FCM membership function. The inner lumen contour and the positions of the stent struts were extracted with high precision. Compared to manual segmentation by an expert physician, the automatic lumen contour delineation had an average overlap value of 0.917 ± 0.065 for all OCT images included in the study. The strut detection procedure achieved an overall accuracy of 93.80% and successfully identified 9.57 ± 0.5 struts per OCT image. Processing time was confined to approximately 2.5 s per OCT frame. A new, fast and robust automatic segmentation technique combining FCM and WT for lumen border extraction and strut detection in intravascular OCT images was designed and implemented. The proposed algorithm, integrated in a GUI, represents a step toward the use of automated quantitative analysis of ISR in clinical practice.
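
    The fuzzy C-means component named above can be sketched as a compact NumPy loop; this omits the wavelet-derived metrics the paper folds into the membership function and is not the authors' implementation.

```python
import numpy as np

def fcm(X, c=2, m=2.0, n_iter=100, seed=0):
    """Fuzzy C-means: X is (n_samples, n_features); returns (centers, memberships)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)  # memberships sum to 1 per sample
    for _ in range(n_iter):
        W = U ** m                     # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = d ** (-2 / (m - 1))
        U /= U.sum(axis=1, keepdims=True)  # standard FCM membership update
    return centers, U
```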

  1. Manufacturing of hybrid aluminum copper joints by electromagnetic pulse welding - Identification of quantitative process windows

    NASA Astrophysics Data System (ADS)

    Psyk, Verena; Scheffler, Christian; Linnemann, Maik; Landgrebe, Dirk

    2017-10-01

    Compared to conventional joining techniques, electromagnetic pulse welding offers important advantages, especially for dissimilar material connections such as copper-aluminum welds. However, owing to missing guidelines and tools for process design, the process has not yet been widely implemented in industrial production. To help overcome this obstacle, a combined numerical and experimental process analysis of electromagnetic pulse welding of Cu-DHP and EN AW-1050 was carried out, and the results were consolidated into a quantitative, collision-parameter-based process window.

  2. Estimation of Characteristics of Echo Envelope Using RF Echo Signal from the Liver

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Tadashi; Hachiya, Hiroyuki; Kamiyama, Naohisa; Ikeda, Kazuki; Moriyasu, Norifumi

    2001-05-01

    To realize quantitative diagnosis of liver cirrhosis, we have been analyzing the probability density function (PDF) of the echo amplitude using B-mode images. However, the B-mode image is affected by the various signal- and image-processing operations applied in the diagnostic equipment, so a detailed, quantitative analysis is very difficult. In this paper, we analyze the PDF of the echo amplitude using the RF echo signal and B-mode images of normal and cirrhotic livers, and compare the results to examine the validity of using the RF echo signal.
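
    As a generic illustration of this kind of envelope-statistics analysis (not the authors' RF pipeline), the following fits a Rayleigh probability density, the usual model for fully developed speckle from normal liver, to a synthetic amplitude sample.

```python
import numpy as np
from scipy import stats

# Synthetic envelope amplitudes; real data would come from the RF signal.
envelope = stats.rayleigh.rvs(scale=1.0, size=5000, random_state=1)

loc, scale = stats.rayleigh.fit(envelope, floc=0)  # fix the location at zero
print(f"fitted Rayleigh scale: {scale:.3f}")
# Departures from the fitted Rayleigh PDF are what flag non-Rayleigh
# (e.g. cirrhotic) tissue statistics in this kind of analysis.
```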

  3. Using the Nobel Laureates in Economics to Teach Quantitative Methods

    ERIC Educational Resources Information Center

    Becker, William E.; Greene, William H.

    2005-01-01

    The authors show how the work of Nobel Laureates in economics can enhance student understanding and bring them up to date on topics such as probability, uncertainty and decision theory, hypothesis testing, regression to the mean, instrumental variable techniques, discrete choice modeling, and time-series analysis. (Contains 2 notes.)

  4. RAMP: a computer system for mapping regional areas

    Treesearch

    Bradley B. Nickey

    1975-01-01

    Until 1972, the U.S. Forest Service's Individual Fire Reports recorded locations by the section-township-range system. These earlier fire reports, therefore, lacked congruent locations. RAMP (Regional Area Mapping Procedure) was designed to make the reports more useful for quantitative analysis. This computer-based technique converts locations expressed in...

  5. Developing and Assessing E-Learning Techniques for Teaching Forecasting

    ERIC Educational Resources Information Center

    Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian

    2014-01-01

    In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…

  6. Developing Public Education Policy through Policy-Impact Analysis.

    ERIC Educational Resources Information Center

    Hackett, E. Raymond; And Others

    A model for analyzing policy impacts is presented that will assist state-level policy makers in education. The model comprises four stages: (1) monitoring, which includes the identification of relevant trends and issues and the development of a data base; (2) forecasting, which uses quantitative and qualitative techniques developed in futures…

  7. Community College Students' Perceptions of Effective Communication in Online Learning

    ERIC Educational Resources Information Center

    Parker, Donna Alice Hill

    2012-01-01

    This quantitative research project analyzed the application of instructional communication tools and techniques used by community college students to determine how they perceive communication in their online classes. Online students from a community college participated in this study by completing an electronic survey. Data analysis revealed that…

  8. Strategies for estimating the marine geoid from altimeter data

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Kahn, W. D.; Garza-Robles, R.

    1976-01-01

    Data from a spacecraft-borne altimeter were processed to estimate the fine structure of the marine geoid. Simulation studies show that, among several competing parameterizations, the mean free-air gravity anomaly model exhibited promising geoid recovery characteristics. Quantitative measures of the orthogonality properties are investigated using covariance analysis techniques.

  9. Comparison of Quantitative PCR and Droplet Digital PCR Multiplex Assays for Two Genera of Bloom-Forming Cyanobacteria, Cylindrospermopsis and Microcystis.

    PubMed

    Te, Shu Harn; Chen, Enid Yingru; Gin, Karina Yew-Hoong

    2015-08-01

    The increasing occurrence of harmful cyanobacterial blooms, often linked to deteriorated water quality and adverse public health effects, has become a worldwide concern in recent decades. The use of molecular techniques such as real-time quantitative PCR (qPCR) has become increasingly popular in the detection and monitoring of harmful cyanobacterial species. Multiplex qPCR assays that quantify several toxigenic cyanobacterial species have been established previously; however, there is no molecular assay that detects several bloom-forming species simultaneously. Microcystis and Cylindrospermopsis are the two most commonly found genera and are known to be able to produce microcystin and cylindrospermopsin hepatotoxins. In this study, we designed primers and probes which enable quantification of these genera based on the RNA polymerase C1 gene for Cylindrospermopsis species and the c-phycocyanin beta subunit-like gene for Microcystis species. Duplex assays were developed for two molecular techniques: qPCR and droplet digital PCR (ddPCR). After optimization, both the qPCR and ddPCR assays showed high linearity and quantitative correlations for standards. Comparisons of the two techniques showed that qPCR has higher sensitivity, a wider linear dynamic range, and shorter analysis time, and is more cost-effective, making it a suitable method for initial screening. However, the ddPCR approach has lower variability and was able to handle the PCR inhibition and competitive effects found in duplex assays, thus providing more precise and accurate analysis of bloom samples. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
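
    The standard-curve evaluation underlying such qPCR comparisons can be sketched as a regression of quantification cycle on log concentration, from which the amplification efficiency follows; the Cq values below are invented.

```python
import numpy as np

log10_conc = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])  # log10 copies per reaction
cq = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4])    # measured Cq values (invented)

slope, intercept = np.polyfit(log10_conc, cq, 1)
efficiency = 10 ** (-1 / slope) - 1  # 1.0 would mean perfect doubling per cycle
print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
```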

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1990. The ACL has four technical groups: Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic analyses of solid, liquid, and gaseous samples and provides specialized analytical services. The Instrumental Analysis Group uses nuclear counting techniques in radiochemical analyses over a wide range of sample types, from low-level environmental samples to samples of high radioactivity. The Organic Analysis Group uses a number of complementary techniques to separate and to quantitatively and qualitatively analyze complex organic mixtures and compounds at the trace level, including synthetic fuels, toxic substances, fossil-fuel residues and emissions, pollutants, biologically active compounds, pesticides, and drugs. The Environmental Analysis Group performs analyses of inorganic environmental samples, hazardous waste, and coal samples.

  11. Application of phyto-indication and radiocesium indicative methods for microrelief mapping

    NASA Astrophysics Data System (ADS)

    Panidi, E.; Trofimetz, L.; Sokolova, J.

    2016-04-01

    Remote sensing technologies are widely used for the production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis; one of their broad applications is relief mapping. In the simplest case, we can identify relief structures using DEM analysis and produce a map or map series to show the relief conditions. However, traditional techniques may fail when used for mapping microrelief structures (structures below ten meters in size), where high microrelief dynamics lead to technological and conceptual difficulties. Moreover, the erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis alone. In our study, we investigate the possibilities and specific techniques for identifying erosion microrelief structures, and mapping techniques for microrelief derivatives (e.g. quantitative parameters of the microrelief). Our toolset includes analysis of the spatial redistribution of soil pollutants and phyto-indication analysis, which complement the common DEM modelling and geomorphometric analysis. We use field surveys produced at the test area, an arable territory with high erosion risk. Our main conclusion at the current stage is that the indicative methods (i.e. the radiocesium and phyto-indication methods) are effective for identifying erosion microrelief structures, although these methods still need to be formalized for convenient use.

  12. Sensitivity analysis and multidisciplinary optimization for aircraft design: Recent advances and results

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1988-01-01

    Optimization by decomposition, complex system sensitivity analysis, and a rapid growth of disciplinary sensitivity analysis are some of the recent developments that hold promise of a quantum jump in the support engineers receive from computers in the quantitative aspects of design. Review of the salient points of these techniques is given and illustrated by examples from aircraft design as a process that combines the best of human intellect and computer power to manipulate data.

  13. Advanced hyphenated chromatographic-mass spectrometry in mycotoxin determination: current status and prospects.

    PubMed

    Li, Peiwu; Zhang, Zhaowei; Hu, Xiaofeng; Zhang, Qi

    2013-01-01

    Mass spectrometric techniques are essential for advanced research in food safety and environmental monitoring. These fields are important for securing the health of humans and animals, and for ensuring environmental security. Mycotoxins, toxic secondary metabolites of filamentous fungi, are major contaminants of agricultural products, food and feed, biological samples, and the environment as a whole. Mycotoxins can cause cancers, nephritic and hepatic diseases, various hemorrhagic syndromes, and immune and neurological disorders, and mycotoxin-contaminated food and feed can provoke trade conflicts, resulting in massive economic losses. Risk assessment of mycotoxin contamination for humans and animals generally depends on clear identification and reliable quantitation in diversified matrices. Pioneering work on mycotoxin quantitation using mass spectrometry (MS) was performed in the early 1970s. Now, unambiguous confirmation and quantitation of mycotoxins can be readily achieved with a variety of hyphenated techniques that combine chromatographic separation, by liquid chromatography (LC) or gas chromatography (GC), with MS. With the advent of atmospheric pressure ionization, LC-MS has become a routine technique. Recently, the co-occurrence of multiple mycotoxins in the same sample has drawn an increasing amount of attention, so modern analyses must be able to detect and quantitate multiple mycotoxins in a single run; improvements in tandem MS techniques have been made to achieve this purpose. This review describes the advanced research that has been done regarding mycotoxin determination using hyphenated chromatographic-MS techniques, but it is not an exhaustive survey of all the literature published on this topic. The present work provides an overview of the various hyphenated chromatographic-MS-based strategies that have been applied to mycotoxin analysis, with a focus on recent developments. The use of chromatographic-MS to measure levels of mycotoxins, including aflatoxins, ochratoxins, patulin, trichothecenes, zearalenone, and fumonisins, is discussed in detail. Both free and masked mycotoxins are included in this review because of their different requirements for sample preparation. Techniques are described in terms of sample preparation, internal standards, LC/ultra-performance LC (UPLC) optimization, and applications and surveys. Several emerging hyphenated MS techniques are discussed as well, including multidimensional chromatography-MS, capillary electrophoresis-MS, and surface plasmon resonance array-MS. © 2013 Wiley Periodicals, Inc.

  14. Virtual substrate method for nanomaterials characterization

    PubMed Central

    Da, Bo; Liu, Jiangwei; Yamamoto, Mahito; Ueda, Yoshihiro; Watanabe, Kazuyuki; Cuong, Nguyen Thanh; Li, Songlin; Tsukagoshi, Kazuhito; Yoshikawa, Hideki; Iwai, Hideo; Tanuma, Shigeo; Guo, Hongxuan; Gao, Zhaoshun; Sun, Xia; Ding, Zejun

    2017-01-01

    Characterization techniques available for bulk or thin-film solid-state materials have been extended to substrate-supported nanomaterials, but generally non-quantitatively. This is because the nanomaterial signals are inevitably buried in the signals from the underlying substrate in common reflection-configuration techniques. Here, we propose a virtual substrate method, inspired by the four-point probe technique for resistance measurement as well as the chop-nod method in infrared astronomy, to characterize nanomaterials without the influence of underlying substrate signals from four interrelated measurements. By implementing this method in secondary electron (SE) microscopy, a SE spectrum (white electrons) associated with the reflectivity difference between two different substrates can be tracked and controlled. The SE spectrum is used to quantitatively investigate the covering nanomaterial based on subtle changes in the transmission of the nanomaterial with high efficiency rivalling that of conventional core-level electrons. The virtual substrate method represents a benchmark for surface analysis to provide 'free-standing' information about supported nanomaterials. PMID:28548114

  15. Surface temperature/heat transfer measurement using a quantitative phosphor thermography system

    NASA Technical Reports Server (NTRS)

    Buck, G. M.

    1991-01-01

    A relative-intensity phosphor thermography technique developed for surface heating studies in hypersonic wind tunnels is described. A direct relationship between relative emission intensity and phosphor temperature is used for quantitative surface temperature measurements in time. The technique provides global surface temperature-time histories using a 3-CCD (charge-coupled device) video camera and a digital recording system. The history of the technique's development at Langley is reviewed; the latest developments include a phosphor mixture with a greater range of temperature sensitivity and the use of castable ceramics for inexpensive test models. A method of calculating surface heat transfer from thermal image data in blowdown wind tunnels is included in an appendix, with an analysis of material thermal heat-transfer properties. Results from tests in the Langley 31-Inch Mach 10 Tunnel are presented for a ceramic orbiter configuration and a four-inch-diameter hemisphere model. Data include windward heating for bow-shock/wing-shock interactions on the orbiter wing surface, and a comparison with prediction for the hemisphere heating distribution.
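
    The appendix's heat-transfer calculation is not reproduced here, but one common way to recover surface heat flux from a temperature-time history on a semi-infinite solid is the discrete Cook-Felderman formula, sketched below under that assumption.

```python
import numpy as np

def heat_flux(T, t, rho_c_k):
    """Cook-Felderman estimate; T (K) and t (s) are 1-D arrays, rho_c_k = rho*c*k."""
    T = np.asarray(T, dtype=float)
    t = np.asarray(t, dtype=float)
    q = np.zeros_like(T)
    for n in range(1, len(T)):
        num = T[1:n + 1] - T[:n]  # stepwise temperature rises
        den = np.sqrt(t[n] - t[1:n + 1]) + np.sqrt(t[n] - t[:n])
        q[n] = 2.0 * np.sqrt(rho_c_k / np.pi) * np.sum(num / den)
    return q
```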

  16. A Chromosome-Scale Assembly of the Bactrocera cucurbitae Genome Provides Insight to the Genetic Basis of white pupae

    PubMed Central

    Sim, Sheina B.; Geib, Scott M.

    2017-01-01

    Genetic sexing strains (GSS) used in sterile insect technique (SIT) programs are textbook examples of how classical Mendelian genetics can be directly implemented in the management of agricultural insect pests. Although the foundation of traditionally developed GSS is single-locus, autosomal recessive traits, their genetic basis is largely unknown. With the advent of modern genomic techniques, the genetic basis of sexing traits in GSS can now be investigated further. This study is the first of its kind to integrate traditional genetic techniques with emerging genomics to characterize a GSS, using the tephritid fruit fly pest Bactrocera cucurbitae as a model. These techniques include whole-genome sequencing, the development of a mapping population and linkage map, and quantitative trait analysis. The experiment designed to map the genetic sexing trait in B. cucurbitae, white pupae (wp), also enabled the generation of a chromosome-scale genome assembly by integrating the linkage map with the assembly. Quantitative trait loci analysis revealed SNP loci near position 42 Mb on chromosome 3 to be tightly linked to wp. Gene annotation and synteny analysis show a near-perfect relationship between chromosomes in B. cucurbitae and Muller elements A–E in Drosophila melanogaster. This chromosome-scale genome assembly is complete, has high contiguity, was generated using minimal input DNA, and will be used to further characterize the genetic mechanisms underlying wp. Knowledge of the genetic basis of genetic sexing traits can be used to improve SIT in this species and to expand it to other economically important Diptera. PMID:28450369

  17. Determination of carbohydrates in medicinal plants--comparison between TLC, mf-MELDI-MS and GC-MS.

    PubMed

    Qureshi, Muhammad Nasimullah; Stecher, Guenther; Sultana, Tahira; Abel, Gudrun; Popp, Michael; Bonn, Guenther K

    2011-01-01

    Quality control in the pharmaceutical and phytopharmaceutical industries requires fast and reliable methods for the analysis of raw materials and final products. This study evaluates different analytical approaches in order to identify the most suitable technique for the analysis of carbohydrates in herbal drug preparations, with a specific focus on thin-layer chromatography (TLC), gas chromatography (GC), and a newly developed mass spectrometric method, matrix-free material-enhanced laser desorption/ionisation time-of-flight mass spectrometry (mf-MELDI-MS). The samples employed were standards and microwave-assisted water extracts from Quercus. TLC analysis proved the presence of mono-, di- and trisaccharides in the biological sample and hinted at the existence of an unknown carbohydrate of higher oligomerisation degree. After evaluation of different derivatisation techniques, GC-MS confirmed the TLC data for mono- to trisaccharides, additionally delivering quantitative values, although requiring a considerable amount of time; a carbohydrate of higher oligomerisation degree could not be found. The application of mf-MELDI-MS further confirmed the presence of carbohydrates up to trisaccharides, also hinting at the presence of a form of tetrasaccharide, and delivered further data about other substances present in the extract. Quantitative determination gave 1.750, 1.736 and 0.336 mg/mL for glucose, sucrose and raffinose, respectively. Evaluation of the three techniques clearly demonstrated the superior performance of mf-MELDI-MS for the qualitative analysis of complex mixtures, as targets need no modification and analysis requires only a few minutes; GC-MS, in addition, is suitable for quantitative analysis. Copyright © 2011 John Wiley & Sons, Ltd.

  18. A systematic review of the relationship factor between women and health professionals within the multivariant analysis of maternal satisfaction.

    PubMed

    Macpherson, Ignacio; Roqué-Sánchez, María V; Legget Bn, Finola O; Fuertes, Ferran; Segarra, Ignacio

    2016-10-01

    Personalised support provided to women by health professionals is one of the prime factors in attaining women's satisfaction during pregnancy and childbirth; however, the multifactorial nature of 'satisfaction' makes it difficult to assess. Statistical multivariate analysis may be an effective technique to obtain in-depth quantitative evidence of the importance of this factor and its interaction with the other factors involved, allowing us to estimate the importance of overall satisfaction in its context and to suggest actions for healthcare services. We conducted a systematic review of studies that quantitatively measure the personal relationship between women and healthcare professionals (gynecologists, obstetricians, nurses, midwives, etc.) with regard to maternity care satisfaction. The literature search focused on studies carried out between 1970 and 2014 that used multivariate analyses and included the woman-caregiver relationship as a factor in their analysis. Twenty-four studies which applied various multivariate analysis tools to different periods of maternity care (antenatal, perinatal, post partum) were selected; they included discrete scale scores and questionnaires from women with low-risk pregnancies. The "personal relationship" factor appeared under various names: care received, personalised treatment, and professional support, amongst others. The most common multivariate techniques used to assess the percentage of variance explained and the odds ratio of each factor were principal component analysis and logistic regression. The data, variables and factor analysis suggest that continuous, personalised care provided by the usual midwife and delivered within a family or a specialised setting generates the highest level of satisfaction. In addition, these factors foster the woman's psychological and physiological recovery, often surpassing clinical action (e.g. medicalization and hospital organization) and/or physiological determinants (e.g. pain, pathologies, etc.). Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Real-time PCR to determine transgene copy number and to quantitate the biolocalization of adoptively transferred cells from EGFP-transgenic mice.

    PubMed

    Joshi, Molishree; Keith Pittman, H; Haisch, Carl; Verbanac, Kathryn

    2008-09-01

    Quantitative real-time PCR (qPCR) is a sensitive technique for the detection and quantitation of specific DNA sequences. Here we describe a Taqman qPCR assay for the quantification of tissue-localized, adoptively transferred enhanced green fluorescent protein (EGFP)-transgenic cells. A standard curve constructed from serial dilutions of a plasmid containing the EGFP transgene (i) was highly reproducible, (ii) detected as few as two copies, and (iii) was included in each qPCR assay. qPCR analysis of genomic DNA was used to determine transgene copy number in several mouse strains. Fluorescence microscopy of tissue sections showed that adoptively transferred vascular endothelial cells (VEC) from EGFP-transgenic mice specifically localized to tissue with metastatic tumors in syngeneic recipients. Microscopic enumeration of VEC in liver metastases strongly correlated with qPCR analysis of identical sections (Pearson correlation 0.81), while EGFP was undetectable by qPCR in tissue from control mice. In another study, using intra-tumor EGFP-VEC delivery to subcutaneous tumors, manual cell counts and qPCR analysis of alternating sections also correlated strongly (Pearson correlation 0.82). Confocal microscopy of the subcutaneous tumor sections determined that visual fluorescent signals were frequently tissue artifacts. This qPCR methodology offers specific, objective, and rapid quantitation, uncomplicated by tissue autofluorescence, and should be readily transferable to other in vivo models to quantitate the biolocalization of transplanted cells.

  20. Quantitative refractive index distribution of single cell by combining phase-shifting interferometry and AFM imaging.

    PubMed

    Zhang, Qinnan; Zhong, Liyun; Tang, Ping; Yuan, Yingjie; Liu, Shengde; Tian, Jindong; Lu, Xiaoxu

    2017-05-31

    The cell refractive index, an intrinsic optical parameter, is closely correlated with intracellular mass and concentration. By combining optical phase-shifting interferometry (PSI) and atomic force microscopy (AFM) imaging, we constructed a label-free, non-invasive, quantitative refractive-index measurement system for single cells, in which the accurate phase map of a single cell is retrieved with the PSI technique and the cell morphology is obtained with nanoscale resolution by AFM imaging. Based on the proposed AFM/PSI system, we obtained quantitative refractive index distributions of a single red blood cell and a Jurkat cell. Further, the quantitative change of the refractive index distribution during daunorubicin (DNR)-induced Jurkat cell apoptosis was presented, and from it the content changes of intracellular biochemical components were derived. Importantly, these results were consistent with Raman spectral analysis, indicating that the proposed PSI/AFM-based refractive index system is likely to become a useful tool for the analysis of intracellular biochemical components, facilitating its application for revealing cell structure and pathological state from a new perspective.
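
    A minimal sketch of the relation underlying the combined measurement (not the authors' pipeline): with the cell height from AFM and the retrieved phase from PSI, the mean intracellular refractive index follows from phi = (2*pi/lambda)(n_cell - n_medium)h; all values below are illustrative assumptions.

```python
import numpy as np

wavelength = 632.8e-9  # m; HeNe illumination is an assumption
n_medium = 1.337       # refractive index of the culture medium (assumed)
phi = 2.1              # rad; retrieved phase at one pixel (illustrative)
h = 2.0e-6             # m; AFM-measured local cell height (illustrative)

n_cell = n_medium + phi * wavelength / (2 * np.pi * h)
print(f"local refractive index: {n_cell:.4f}")
```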

  1. Time-Resolved Fluorescent Immunochromatography of Aflatoxin B1 in Soybean Sauce: A Rapid and Sensitive Quantitative Analysis.

    PubMed

    Wang, Du; Zhang, Zhaowei; Li, Peiwu; Zhang, Qi; Zhang, Wen

    2016-07-14

    Rapid and quantitative sensing of aflatoxin B1 with high sensitivity and specificity has drawn increasing attention in studies of soybean sauce. A sensitive and rapid quantitative immunochromatographic sensing method based on time-resolved fluorescence was developed for the detection of aflatoxin B1; it combines the advantages of time-resolved fluorescent sensing and immunochromatography. The dynamic range of the competitive, portable immunoassay was 0.3-10.0 µg·kg⁻¹, with a limit of detection (LOD) of 0.1 µg·kg⁻¹ and recoveries of 87.2%-114.3%, within 10 min. The results showed good correlation (R² > 0.99) between the time-resolved fluorescent immunochromatographic strip test and high-performance liquid chromatography (HPLC). Analysis of soybean sauce samples with the strip test revealed that 64.2% of samples contained aflatoxin B1 at levels ranging from 0.31 to 12.5 µg·kg⁻¹. The strip test is a rapid, sensitive, quantitative, and cost-effective on-site screening technique in food safety analysis.

  2. In silico quantitative structure-toxicity relationship study of aromatic nitro compounds.

    PubMed

    Pasha, Farhan Ahmad; Neaz, Mohammad Morshed; Cho, Seung Joo; Ansari, Mohiuddin; Mishra, Sunil Kumar; Tiwari, Sharvan

    2009-05-01

    Small molecules often have toxicities that are a function of molecular structural features, and minor variations in those features can make a large difference in toxicity. Consequently, in silico techniques may be used to correlate molecular toxicities with structural features. For nine different sets of aromatic nitro compounds with known observed toxicities against different targets, we developed ligand-based 2D quantitative structure-toxicity relationship models using 20 selected topological descriptors. Topological descriptors have several advantages, such as conformational independence and facile, fast computation, while still yielding good results. Multiple linear regression analysis was used to correlate variations in toxicity with molecular properties. The information index on molecular size, the lopping centric index and the Kier flexibility index were identified as fundamental descriptors for different kinds of toxicity, further indicating that molecular size, branching and molecular flexibility may be particularly important factors in quantitative structure-toxicity relationship analysis. This study revealed that topological-descriptor-guided quantitative structure-toxicity relationships provide a useful, cost- and time-efficient in silico tool for describing small-molecule toxicities.
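
    The modelling step can be sketched as a multiple linear regression of toxicity on topological descriptors; descriptor values and responses below are synthetic placeholders, not the paper's data sets.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 3))  # e.g. size, centric and flexibility indices (synthetic)
true_w = np.array([0.9, -0.4, 0.3])
y = X @ true_w + rng.normal(scale=0.1, size=40)  # a pLC50-like response (synthetic)

model = LinearRegression().fit(X, y)
print("R^2:", round(model.score(X, y), 3), "coefficients:", model.coef_.round(2))
```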

  3. Quantitative phase-digital holographic microscopy: a new imaging modality to identify original cellular biomarkers of diseases

    NASA Astrophysics Data System (ADS)

    Marquet, P.; Rothenfusser, K.; Rappaz, B.; Depeursinge, C.; Jourdain, P.; Magistretti, P. J.

    2016-03-01

    Quantitative phase microscopy (QPM) has recently emerged as a powerful label-free technique in the field of living cell imaging, allowing cell structure and dynamics to be measured non-invasively with nanometric axial sensitivity. Since the phase retardation of a light wave transmitted through the observed cells, namely the quantitative phase signal (QPS), is sensitive to both cellular thickness and the intracellular refractive index related to the cellular content, its accurate analysis allows various cell parameters to be derived and specific cell processes to be monitored, which is very likely to identify new cell biomarkers. Specifically, quantitative phase-digital holographic microscopy (QP-DHM), thanks to a numerical flexibility that facilitates parallelization and automation, represents an appealing imaging modality both to identify original cellular biomarkers of diseases and to explore the underlying pathophysiological processes.

  4. Frontally eluted components procedure with thin layer chromatography as a mode of sample preparation for high performance liquid chromatography quantitation of acetaminophen in biological matrix.

    PubMed

    Klimek-Turek, A; Sikora, M; Rybicki, M; Dzido, T H

    2016-03-04

    A new concept of using thin-layer chromatography for sample preparation prior to the quantitative determination of solutes by instrumental techniques is presented. Thin-layer chromatography (TLC) is used to completely separate acetaminophen and its internal standard from the other components (the matrix) and to concentrate them in a single spot/zone at the solvent front position (after the final stage of thin-layer chromatogram development). The location of the analytes and internal standard in the solvent front zone allows their easy extraction, followed by quantitation by HPLC. The extraction of the solutes and internal standard can proceed from the whole solute frontal zone or from a part of it without loss of accuracy in the quantitative analysis. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Data processing has major impact on the outcome of quantitative label-free LC-MS analysis.

    PubMed

    Chawade, Aakash; Sandin, Marianne; Teleman, Johan; Malmström, Johan; Levander, Fredrik

    2015-02-06

    High-throughput multiplexed protein quantification using mass spectrometry is steadily increasing in popularity, with the two major techniques being data-dependent acquisition (DDA) and targeted acquisition using selected reaction monitoring (SRM). However, both techniques involve extensive data processing, which can be performed by a multitude of different software solutions. Analysis of quantitative LC-MS/MS data is mainly performed in three major steps: processing of raw data, normalization, and statistical analysis. To evaluate the impact of data processing steps, we developed two new benchmark data sets, one each for DDA and SRM, with samples consisting of a long-range dilution series of synthetic peptides spiked in a total cell protein digest. The generated data were processed by eight different software workflows and three postprocessing steps. The results show that the choice of the raw data processing software and the postprocessing steps play an important role in the final outcome. Also, the linear dynamic range of the DDA data could be extended by an order of magnitude through feature alignment and a charge state merging algorithm proposed here. Furthermore, the benchmark data sets are made publicly available for further benchmarking and software developments.
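
    As one concrete example of the normalization step the paper benchmarks (one option among several, not its prescribed method), per-sample median centring of log intensities can be sketched as follows.

```python
import numpy as np

def median_normalize(log_intensity):
    """log_intensity: (n_features, n_samples) matrix of log2 intensities."""
    sample_medians = np.nanmedian(log_intensity, axis=0)
    # Shift every sample so that all sample medians coincide.
    return log_intensity - sample_medians + np.nanmedian(sample_medians)
```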

  6. Analysis of photographic X-ray images. [S-054 telescope on Skylab

    NASA Technical Reports Server (NTRS)

    Krieger, A. S.

    1977-01-01

    Some techniques used to extract quantitative data from the information contained in photographic images produced by grazing-incidence soft X-ray optical systems are described. The discussion is focused on the analysis of the data returned by the S-054 X-Ray Spectrographic Telescope Experiment on Skylab. The parameters of the instrument and the procedures used for its calibration are described, as is the technique used to convert photographic density to focal-plane X-ray irradiance. The deconvolution of the telescope point response function from the image data is discussed, and methods of estimating the temperature, pressure, and number density of coronal plasmas are outlined.

  7. Statistical innovations in the medical device world sparked by the FDA.

    PubMed

    Campbell, Gregory; Yue, Lilly Q

    2016-01-01

    The world of medical devices while highly diverse is extremely innovative, and this facilitates the adoption of innovative statistical techniques. Statisticians in the Center for Devices and Radiological Health (CDRH) at the Food and Drug Administration (FDA) have provided leadership in implementing statistical innovations. The innovations discussed include: the incorporation of Bayesian methods in clinical trials, adaptive designs, the use and development of propensity score methodology in the design and analysis of non-randomized observational studies, the use of tipping-point analysis for missing data, techniques for diagnostic test evaluation, bridging studies for companion diagnostic tests, quantitative benefit-risk decisions, and patient preference studies.

  8. Analysing magnetism using scanning SQUID microscopy.

    PubMed

    Reith, P; Renshaw Wang, X; Hilgenkamp, H

    2017-12-01

    Scanning superconducting quantum interference device microscopy (SSM) is a scanning probe technique that images local magnetic flux, which allows for mapping of magnetic fields with high field and spatial accuracy. Many studies involving SSM have been published in the last few decades, using SSM to make qualitative statements about magnetism. However, quantitative analysis using SSM has received less attention. In this work, we discuss several aspects of interpreting SSM images and methods to improve quantitative analysis. First, we analyse the spatial resolution and how it depends on several factors. Second, we discuss the analysis of SSM scans and the information obtained from the SSM data. Using simulations, we show how signals evolve as a function of changing scan height, SQUID loop size, magnetization strength, and orientation. We also investigated 2-dimensional autocorrelation analysis to extract information about the size, shape, and symmetry of magnetic features. Finally, we provide an outlook on possible future applications and improvements.
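
    The 2-D autocorrelation analysis mentioned above can be sketched via the FFT (Wiener-Khinchin relation); this generic function takes any SSM flux image as a 2-D array and is not the authors' code.

```python
import numpy as np

def autocorrelation_2d(flux):
    """Normalized 2-D autocorrelation of an SSM flux image via the FFT."""
    f = flux - flux.mean()                  # remove the DC offset
    F = np.fft.fft2(f)
    ac = np.fft.ifft2(F * np.conj(F)).real  # Wiener-Khinchin relation
    ac = np.fft.fftshift(ac)                # put zero lag at the centre
    return ac / ac.max()                    # peak (zero lag) normalized to 1
```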

  10. Analysis of the sleep quality of elderly people using biomedical signals.

    PubMed

    Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A

    2015-01-01

    This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. A quantitative sleep quality analysis then determines significant changes in the biological signals, and any similarities between them, in a given time period. Filtering techniques such as the Fourier transform method and IIR filters process the signal and identify significant variations. Once these changes have been identified, all significant data are compared and a quantitative, statistical analysis is carried out to determine a person's level of rest. The statistical analysis showed correlations between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037) and, finally, the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.
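
    A minimal sketch of the filtering step, assuming a zero-phase Butterworth IIR band-pass; the sampling rate and cut-offs are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0  # Hz, assumed sampling rate
b, a = butter(N=4, Wn=[0.3, 8.0], btype="bandpass", fs=fs)

t = np.arange(0, 10, 1 / fs)
# A synthetic EOG-like channel: 1 Hz oscillation plus noise.
eog = np.sin(2 * np.pi * 1.0 * t) + 0.3 * np.random.default_rng(3).normal(size=t.size)
eog_filtered = filtfilt(b, a, eog)  # forward-backward filtering: no phase lag
```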

  11. Remote sensing and spectral analysis of plumes from ocean dumping in the New York Bight Apex

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1980-01-01

    The application of the remote sensing techniques of aerial photography and multispectral scanning to the qualitative and quantitative analysis of plumes from ocean dumping of waste materials is investigated in the New York Bight Apex. Plumes resulting from the dumping of acid waste and sewage sludge were observed by the Ocean Color Scanner at an altitude of 19.7 km and by the Modular Multispectral Scanner and a mapping camera at an altitude of 3.0 km. Results of the qualitative analysis of multispectral and photographic data for the mapping, location, and identification of pollution features without concurrent sea-truth measurements are presented, demonstrating the usefulness of in-scene calibration. Quantitative distributions of the suspended solids in sewage sludge released in spot and line dumps are also determined by a multiple regression analysis of multispectral and sea-truth data.

  12. Methodological Variables in the Analysis of Cell-Free DNA.

    PubMed

    Bronkhorst, Abel Jacobus; Aucamp, Janine; Pretorius, Piet J

    2016-01-01

    In recent years, cell-free DNA (cfDNA) analysis has received increasing amounts of attention as a potential non-invasive screening tool for the early detection of genetic aberrations and a wide variety of diseases, especially cancer. However, except for some prenatal tests and BEAMing, a technique used to detect mutations in various genes of cancer patients, cfDNA analysis is not yet routinely applied in clinical practice. Although some confounding biological factors inherent to the in vivo setting play a key part, it is becoming increasingly clear that this struggle is mainly due to the lack of an analytical consensus, especially as regards quantitative analyses of cfDNA. In order to use quantitative cfDNA analysis with confidence, process optimization and standardization are crucial. In this work we aim to elucidate the most confounding variables of each preanalytical step that must be considered for process optimization and equivalence of procedures.

  13. Heavy metal concentrations in soils as determined by laser-induced breakdown spectroscopy (LIBS), with special emphasis on chromium.

    PubMed

    Senesi, G S; Dell'Aglio, M; Gaudiuso, R; De Giacomo, A; Zaccone, C; De Pascale, O; Miano, T M; Capitelli, M

    2009-05-01

    Soil is unanimously considered one of the most important sinks for heavy metals released by human activities. Heavy metal analysis of natural and polluted soils is generally conducted by atomic absorption spectroscopy (AAS) or inductively coupled plasma optical emission spectroscopy (ICP-OES) on suitably obtained soil extracts. Although in recent years the emerging technique of laser-induced breakdown spectroscopy (LIBS) has been applied widely, and with increasing success, to the qualitative and quantitative analysis of a number of heavy metals in soil matrices, with relevant simplification of the conventional methodologies, the technique still requires further validation before it can be applied fully and successfully in soil analyses. The main objective of this work was to demonstrate that new developments in the LIBS technique can provide reliable qualitative and quantitative analytical evaluation of several heavy metals in soils, with special focus on chromium (Cr), and with reference to the concentrations measured by conventional ICP spectroscopy. A preliminary qualitative LIBS analysis of five soil samples and one sewage sludge sample allowed the detection of a number of elements, including Al, Ca, Cr, Cu, Fe, Mg, Mn, Pb, Si, Ti, V and Zn. Of these, quantitative analysis was also possible for Cr, Cu, Pb, V and Zn, based on the linearity of the calibration curves constructed for each heavy metal, i.e., the proportionality between the intensity of the LIBS emission peaks and the concentration of each heavy metal in the sample as measured by ICP. In particular, a triplet of emission lines could be used for the quantitative measurement of Cr. The consistency of the experiments made on the various samples was supported by the common characteristics of the laser-induced plasma (LIP), i.e., the typical linear distribution confirming the existence of local thermodynamic equilibrium (LTE), and the similar excitation temperatures and comparable electron number densities measured for all samples. An index of the anthropogenic contribution of Cr in polluted soils was calculated by comparison with a non-polluted reference soil. Thus, the intensity ratios of heavy metal emission lines can be used to detect, in a few minutes, polluted areas for which more detailed sampling and analysis may be useful.

  14. Application of the shifted excitation Raman difference spectroscopy (SERDS) to the analysis of trace amounts of methanol in red wines

    NASA Astrophysics Data System (ADS)

    Volodin, Boris; Dolgy, Sergei; Ban, Vladimir S.; Gracin, Davor; Juraić, Krunoslav; Gracin, Leo

    2014-03-01

    Shifted excitation Raman difference spectroscopy (SERDS) has proven an effective method for performing Raman analysis of fluorescent samples. The technique allows excellent signal-to-noise performance to be achieved with shorter excitation wavelengths, thus taking full advantage of the superior signal strength afforded by shorter excitation wavelengths and of the superior performance, combined with lower cost, delivered by silicon CCDs. The technique is enabled by the use of two closely spaced, fixed-wavelength laser diode sources stabilized with volume Bragg gratings (VBGs). A side-by-side comparison reveals that SERDS delivers superior signal-to-noise ratio and better detection limits in most situations, even when a longer excitation wavelength is employed to eliminate the fluorescence. We have applied the SERDS technique to the quantitative analysis of trace amounts of methanol in red wines, an important quality control task within the wine industry that is currently difficult to perform in the field. Until now, conventional Raman analysis of red wines has been impractical because of their high degree of fluorescence.
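
    A conceptual sketch of the SERDS principle (not the instrument's algorithm): two spectra acquired at slightly shifted excitation wavelengths share the same fluorescence background, so their difference cancels it and leaves derivative-shaped Raman features; all spectra below are synthetic.

```python
import numpy as np

pixels = np.arange(1024)
background = 5000 * np.exp(-((pixels - 500) / 400.0) ** 2)  # broad fluorescence

def raman_peak(centre):
    return 300 * np.exp(-((pixels - centre) / 3.0) ** 2)    # narrow Raman line

spec_1 = background + raman_peak(420)  # first excitation wavelength
spec_2 = background + raman_peak(426)  # shifted excitation moves the Raman line
serds = spec_1 - spec_2                # the fluorescence background cancels
reconstructed = np.cumsum(serds)       # crude, approximate spectrum reconstruction
```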

  15. Feasibility of free-breathing dynamic contrast-enhanced MRI of gastric cancer using a golden-angle radial stack-of-stars VIBE sequence: comparison with the conventional contrast-enhanced breath-hold 3D VIBE sequence.

    PubMed

    Li, Huan-Huan; Zhu, Hui; Yue, Lei; Fu, Yi; Grimm, Robert; Stemmer, Alto; Fu, Cai-Xia; Peng, Wei-Jun

    2018-05-01

    To investigate the feasibility and diagnostic value of a free-breathing, radial, stack-of-stars three-dimensional (3D) gradient echo (GRE) sequence ("golden angle") for dynamic contrast-enhanced (DCE) MRI of gastric cancer. Forty-three gastric cancer patients were divided into cooperative and uncooperative groups. Respiratory fluctuation was observed using an abdominal respiratory gating sensor; those who could hold their breath for more than 15 s were placed in the cooperative group and the remainder in the uncooperative group. The 3-T MRI scanning protocol included the 3D GRE sequence and a conventional breath-hold VIBE (volume-interpolated breath-hold examination) sequence, and the images were compared quantitatively and qualitatively. DCE-MRI parameters from VIBE images of normal gastric wall and malignant lesions were also compared. For uncooperative patients, 3D GRE scored higher qualitatively and had higher SNRs (signal-to-noise ratios) and CNRs (contrast-to-noise ratios) than conventional VIBE quantitatively. Although 3D GRE images scored lower on qualitative parameters than conventional VIBE for cooperative patients, they showed fewer artefacts. DCE parameters differed significantly between normal gastric wall and lesions, with higher Ve (extracellular volume) and lower Kep (reflux constant) in gastric cancer. The free-breathing, golden-angle, radial stack-of-stars 3D GRE technique is feasible for DCE-MRI of gastric cancer, and the dynamic enhanced images can be used for quantitative analysis of this malignancy. • Golden-angle radial stack-of-stars VIBE aids gastric cancer MRI diagnosis. • The 3D GRE technique is suitable for patients unable to suspend respiration. • The method scored higher in the qualitative evaluation for uncooperative patients. • The technique produced images with fewer artefacts than the conventional VIBE sequence. • Dynamic enhanced images can be used for quantitative analysis of gastric cancer.
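
    A minimal sketch of the quantitative image comparison, assuming SNR and CNR are computed from region-of-interest statistics in the usual way (ROI means divided by the standard deviation of a background noise ROI); the numbers below are invented placeholders, not the study's measurements.

        import numpy as np

        lesion_roi = np.array([312.0, 305.0, 298.0, 320.0])  # lesion signal
        wall_roi = np.array([210.0, 204.0, 215.0, 208.0])    # normal wall
        noise_sd = 14.2                                      # background ROI SD

        snr = wall_roi.mean() / noise_sd
        cnr = abs(lesion_roi.mean() - wall_roi.mean()) / noise_sd
        print(f"SNR = {snr:.1f}, CNR = {cnr:.1f}")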

  16. Verus: A Tool for Quantitative Analysis of Finite-State Real-Time Systems.

    DTIC Science & Technology

    1996-08-12

    Symbolic model checking is a technique for verifying finite-state concurrent systems that has been extended to handle real-time systems. Models with...up to 10^30 states can often be verified in minutes. In this paper, we present a new tool to analyze real-time systems, based on this technique...We have designed a language, called Verus, for the description of real-time systems. Such a description is compiled into a state-transition graph and

  17. Validation of Reference Genes in mRNA Expression Analysis Applied to the Study of Asthma.

    PubMed

    Segundo-Val, Ignacio San; Sanz-Lozano, Catalina S

    2016-01-01

    The quantitative polymerase chain reaction (qPCR) is the most widely used technique for the study of gene expression. To correct for putative experimental errors of this technique, it is necessary to normalize the expression results of the gene of interest against those obtained for reference genes. Here, we describe an example of the process of selecting reference genes; in this particular case, we select reference genes for expression studies in the peripheral blood mononuclear cells of asthmatic patients.
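
    A common way to carry out the normalization mentioned above is the 2^-ddCt (Livak) method; the short Python sketch below uses invented Ct values and assumes roughly 100% amplification efficiency for both the target and the reference gene.

        # Hypothetical Ct values for a target gene and a reference gene.
        ct_target_patient, ct_ref_patient = 27.8, 18.2   # asthmatic sample
        ct_target_control, ct_ref_control = 29.5, 18.4   # control sample

        # Normalize the target to the reference within each sample ...
        d_ct_patient = ct_target_patient - ct_ref_patient
        d_ct_control = ct_target_control - ct_ref_control

        # ... then compare samples and convert to a fold change.
        dd_ct = d_ct_patient - d_ct_control
        fold_change = 2.0 ** (-dd_ct)
        print(f"fold change vs control = {fold_change:.2f}")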

  18. [Quantitative determination of glass content in monazite glass-ceramics by IR technique].

    PubMed

    He, Yong; Zhang, Bao-min

    2003-04-01

    Monazite glass-ceramics consist of both monazite and metaphosphate glass phases. The absorption bands of the two phases do not overlap each other, and the absorption intensities of the 1275 and 616 cm-1 bands vary with the glass content. The correlation coefficient between the logarithmic absorbance ratio of the two bands and the glass content was r = 0.9975, with regression equation y = 48.356 + 25.93x. The absorbance ratio of the 952 and 616 cm-1 bands also varied with the ratio of Ce2O3/La2O3 in synthetic monazites, with r = 0.9917 and regression equation y = 0.2211 exp(0.0221x). These high correlation coefficients show that the IR technique could find new application in the quantitative analysis of glass content in phosphate glass-ceramics.
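
    Taking the first regression at face value, the glass content follows directly from the two measured band absorbances; in the sketch below, x is assumed to be the base-10 logarithm of the A(1275)/A(616) absorbance ratio and y the glass content in percent, which is an interpretation of the abstract rather than a statement of it.

        import math

        def glass_content(a_1275, a_616):
            # y = 48.356 + 25.93 * x from the reported regression, with
            # x = log10 of the absorbance ratio (assumed base and units).
            x = math.log10(a_1275 / a_616)
            return 48.356 + 25.93 * x

        print(f"glass content ~ {glass_content(0.82, 0.31):.1f} %")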

  19. New X-Ray Technique to Characterize Nanoscale Precipitates in Aged Aluminum Alloys

    NASA Astrophysics Data System (ADS)

    Sitdikov, V. D.; Murashkin, M. Yu.; Valiev, R. Z.

    2017-10-01

    This paper puts forward a new technique for the measurement of x-ray patterns, which makes it possible to identify and determine precipitates (nanoscale phases) in metallic alloys of the matrix type. The minimum detection limit of precipitates in the matrix of the base material provided by this technique is as little as 1%. The identification of precipitates in x-ray patterns and their analysis are implemented in a transmission mode with a larger radiation area, longer holding time and higher diffractometer resolution as compared to the conventional reflection mode. The presented technique has been successfully employed to identify and quantitatively describe the precipitates formed in an Al alloy of the Al-Mg-Si system as a result of artificial aging. For the first time, x-ray phase analysis has been used to identify and measure precipitates formed during artificial aging of the alloy.

  20. Novel CE-MS technique for detection of high explosives using perfluorooctanoic acid as a MEKC and mass spectrometric complexation reagent.

    PubMed

    Brensinger, Karen; Rollman, Christopher; Copper, Christine; Genzman, Ashton; Rine, Jacqueline; Lurie, Ira; Moini, Mehdi

    2016-01-01

    To address the need for forensic analysis of high explosives, a novel capillary electrophoresis-mass spectrometry (CE-MS) technique has been developed for the detection of these compounds with high resolution, sensitivity, and mass accuracy. The technique uses perfluorooctanoic acid (PFOA) both as a micellar electrokinetic chromatography (MEKC) reagent for the separation of neutral explosives and as a complexation reagent for mass spectrometric detection of PFOA-explosive complexes in negative ion mode. High explosives that formed complexes with PFOA included RDX, HMX, tetryl, and PETN; some nitroaromatics were detected as molecular ions. Detection limits in the high parts-per-billion range and linear calibration responses over two orders of magnitude were obtained. As proof of concept, the technique was applied to the quantitative analysis of high explosives in sand samples. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  1. Multidimensional Processing and Visual Rendering of Complex 3D Biomedical Images

    NASA Technical Reports Server (NTRS)

    Sams, Clarence F.

    2016-01-01

    The proposed technology uses advanced image analysis techniques to maximize the resolution and utility of medical imaging methods being used during spaceflight. We utilize COTS technology for medical imaging, but our applications require higher resolution assessment of the medical images than is routinely applied with nominal system software. By leveraging advanced data reduction and multidimensional imaging techniques utilized in analysis of Planetary Sciences and Cell Biology imaging, it is possible to significantly increase the information extracted from the onboard biomedical imaging systems. Year 1 focused on application of these techniques to the ocular images collected on ground test subjects and ISS crewmembers. Focus was on the choroidal vasculature and the structure of the optic disc. Methods allowed for increased resolution and quantitation of structural changes enabling detailed assessment of progression over time. These techniques enhance the monitoring and evaluation of crew vision issues during space flight.

  2. Ion-Exclusion Chromatography for Analyzing Organics in Water

    NASA Technical Reports Server (NTRS)

    Sauer, Richard; Rutz, Jeffrey A.; Schultz, John R.

    2006-01-01

    A liquid-chromatography technique has been developed for use in the quantitative analysis of urea (and of other nonvolatile organic compounds typically found with urea) dissolved in water. The technique involves the use of a column that contains an ion-exclusion resin; heretofore, this column has been sold for use in analyzing monosaccharides and food softeners, but not for analyzing water supplies. The technique previously used to analyze water for urea content has been high-performance liquid chromatography (HPLC), relying on hydrophobic interactions between analytes in a water sample and long-chain alkyl groups bonded to an HPLC column. The prior technique has proven inadequate because of a strong tendency toward co-elution of urea with other compounds. Co-elution often crowds the urea and other compounds into a narrow region of the chromatogram (see left part of figure), giving rise to low chromatographic resolution and misidentification of compounds. It is possible to quantitate urea or another analyte via ultraviolet- and visible-light absorbance measurements, but such measurements require dilution of the sample, causing a significant loss of sensitivity. The ion-exclusion resin used in the improved technique is sulfonated polystyrene in the calcium form. Whereas the alkyl-chain column used in the prior technique separates compounds on the basis of polarity only, the ion-exclusion-resin column separates compounds on the basis of both molecular size and electric charge. As a result, the degree of separation is increased: instead of being crowded together into a single chromatographic peak only about 1 to 2 minutes wide as in the prior technique, the chromatographic peaks of different compounds are now separated from each other and spread out over a range about 33 minutes wide (see right part of figure), and the urea peak can readily be distinguished from the other peaks. Although the analysis takes more time in the improved technique, this disadvantage is offset by two important advantages: (1) sensitivity is increased, since the minimum concentration of urea that can be measured is reduced (to between 1/5 and 1/3 of that of the prior technique) because it is not necessary to dilute the sample; and (2) the separation of peaks facilitates the identification and quantitation of the various compounds, and the resolution of the compounds other than urea makes it possible to identify them by use of mass spectrometry.

  3. The cutting edge - Micro-CT for quantitative toolmark analysis of sharp force trauma to bone.

    PubMed

    Norman, D G; Watson, D G; Burnett, B; Fenne, P M; Williams, M A

    2018-02-01

    Toolmark analysis involves examining marks created on an object to identify the likely tool responsible for creating those marks (e.g., a knife). Although a potentially powerful forensic tool, knife mark analysis is still in its infancy, and the validation of imaging techniques as well as quantitative approaches is ongoing. This study builds on previous work by simulating real-world stabbings experimentally and statistically exploring quantitative toolmark properties, such as cut mark angle captured by micro-CT imaging, to predict the knife responsible. In Experiment 1, a mechanical stab rig and two knives were used to create 14 knife cut marks on dry pig ribs. The toolmarks were laser and micro-CT scanned to allow quantitative measurement of numerous toolmark properties. The findings from Experiment 1 demonstrated that the two knives produced statistically different cut mark widths, wall angles and shapes. Experiment 2 examined knife marks created on fleshed pig torsos under conditions designed to better simulate real-world stabbings. Eight knives were used to generate 64 incision cut marks that were also micro-CT scanned. Statistical exploration of these cut marks suggested that knife type, serrated or plain, can be predicted from cut mark width and wall angle. Preliminary results further suggest that knife type can be predicted from cut mark width alone, and that knife edge thickness correlates with cut mark width. An additional 16 cut mark walls were imaged for striation marks using scanning electron microscopy, with results suggesting that this approach might not be useful for knife mark analysis. Results also indicated that observer judgements of cut mark shape were more consistent when rated from micro-CT images than from light microscopy images. The potential to combine micro-CT data, medical-grade CT data and photographs to develop highly realistic virtual models for visualisation and 3D printing is also demonstrated. This is the first study to statistically explore simulated real-world knife marks imaged by micro-CT and to demonstrate the potential of quantitative approaches in knife mark analysis. The findings and methods presented in this study are relevant to both forensic toolmark researchers and practitioners. Limitations of the experimental methodologies and imaging techniques are discussed, and further work is recommended. Copyright © 2017 Elsevier B.V. All rights reserved.
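
    As a sketch of the kind of prediction reported above, the snippet below fits a logistic classifier to cut mark width and wall angle to separate serrated from plain blades; the measurements and labels are invented, not the study's micro-CT data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Invented cut-mark features: [width (mm), wall angle (deg)].
        X = np.array([[0.30, 78.0], [0.35, 80.0], [0.28, 75.0], [0.33, 79.0],
                      [0.55, 62.0], [0.60, 65.0], [0.52, 60.0], [0.58, 64.0]])
        y = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # 0 = plain, 1 = serrated

        clf = LogisticRegression().fit(X, y)
        print(clf.predict([[0.50, 63.0]]))  # -> array([1]), likely serrated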

  4. A NOVEL TECHNIQUE FOR QUANTITATIVE ESTIMATION OF UPTAKE OF DIESEL EXHAUST PARTICLES BY LUNG CELLS

    EPA Science Inventory

    While airborne particulates like diesel exhaust particulates (DEP) exert significant toxicological effects on lungs, quantitative estimation of accumulation of DEP inside lung cells has not been reported due to a lack of an accurate and quantitative technique for this purpose. I...

  5. CLOSED-LOOP STRIPPING ANALYSIS (CLSA) OF ...

    EPA Pesticide Factsheets

    Synthetic musk compounds have been found in surface water, fish tissues, and human breast milk. Current techniques for separating these compounds from fish tissues require tedious sample clean-up procedures. A simple method for the determination of these compounds in fish tissues has been developed. Closed-loop stripping of saponified fish tissues in a 1-L Wheaton purge-and-trap vessel is used to strip compounds with high vapor pressures, such as synthetic musks, from the matrix onto a solid sorbent (Abselut Nexus). This technique is useful for screening biological tissues that contain lipids for musk compounds. Analytes are desorbed from the sorbent trap sequentially with polar and nonpolar solvents, concentrated, and directly analyzed by high-resolution gas chromatography coupled to a mass spectrometer operating in selected ion monitoring mode. In this paper, we analyzed two homogenized samples of whole fish tissues spiked with synthetic musk compounds using closed-loop stripping analysis (CLSA) and pressurized liquid extraction (PLE). The analytes were not recovered quantitatively, but the extraction yield was sufficiently reproducible for at least semi-quantitative purposes (screening). The method was less expensive to implement and required significantly less sample preparation than the PLE technique. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water,

  6. Hybrid, experimental and computational, investigation of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1996-07-01

    Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.

  7. Analysis of Mammalian Sphingolipids by Liquid Chromatography Tandem Mass Spectrometry (LC-MS/MS) and Tissue Imaging Mass Spectrometry (TIMS)

    PubMed Central

    Sullards, M. Cameron; Liu, Ying; Chen, Yanfeng; Merrill, Alfred H.

    2011-01-01

    Sphingolipids are a highly diverse category of molecules that serve not only as components of biological structures but also as regulators of numerous cell functions. Because so many of the structural features of sphingolipids give rise to their biological activity, there is a need for comprehensive or “sphingolipidomic” methods for the identification and quantitation of as many individual subspecies as possible. This review defines sphingolipids as a class, briefly discusses classical methods for their analysis, and focuses primarily on liquid chromatography tandem mass spectrometry (LC-MS/MS) and tissue imaging mass spectrometry (TIMS). Recently, a set of evolving and expanding methods has been developed and rigorously validated for the extraction, identification, separation, and quantitation of sphingolipids by LC-MS/MS. Quantitation of these biomolecules is made possible via the use of an internal standard cocktail. The compounds that can be readily analyzed are free long-chain (sphingoid) bases, sphingoid base 1-phosphates, and more complex species such as ceramides, ceramide 1-phosphates, sphingomyelins, mono- and di-hexosylceramides, sulfatides, and novel compounds such as the 1-deoxy- and 1-(deoxymethyl)-sphingoid bases and their N-acyl derivatives. These methods can be altered slightly to separate and quantitate isomeric species such as glucosyl/galactosylceramide. Because these techniques require the extraction of sphingolipids from their native environment, any information regarding their localization in histological slices is lost. Therefore, this review also describes methods for TIMS, a technique that has been shown to be a powerful tool for determining the localization of individual molecular species of sphingolipids directly from tissue slices. PMID:21749933

  8. Quantitative analysis of volatile organic compounds using ion mobility spectra and cascade correlation neural networks

    NASA Technical Reports Server (NTRS)

    Harrington, Peter DEB.; Zheng, Peng

    1995-01-01

    Ion Mobility Spectrometry (IMS) is a powerful technique for trace organic analysis in the gas phase. Quantitative measurements are difficult, however, because IMS has a limited linear range, and factors such as pressure, temperature, and humidity may affect the instrument response. Nonlinear calibration methods, such as neural networks, may therefore be ideally suited to IMS, since they are capable of modeling complex systems. Many neural networks suffer from long training times and overfitting; cascade correlation neural networks, by contrast, train at very fast rates and build their own topology, that is, the number of layers and the number of units in each layer. By controlling the decay parameter during training, reproducible and general models may be obtained.
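
    Cascade correlation implementations are not part of the common Python libraries, so the sketch below substitutes a small feed-forward network (scikit-learn's MLPRegressor) purely to illustrate nonlinear calibration of a saturating IMS-like response; the data are synthetic.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        conc = rng.uniform(0.1, 10.0, 200).reshape(-1, 1)
        # Saturating response with a little noise, mimicking a limited
        # linear range.
        signal = conc / (1.0 + 0.3 * conc) + rng.normal(0.0, 0.01, conc.shape)

        # Learn the inverse mapping from signal back to concentration.
        model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                             random_state=0).fit(signal, conc.ravel())
        print(model.predict(signal[:3]), conc[:3].ravel())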

  9. An Image Analysis Method for the Precise Selection and Quantitation of Fluorescently Labeled Cellular Constituents

    PubMed Central

    Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.

    2012-01-01

    The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method can be used to overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600

  10. Quantitative image analysis for investigating cell-matrix interactions

    NASA Astrophysics Data System (ADS)

    Burkel, Brian; Notbohm, Jacob

    2017-07-01

    The extracellular matrix provides both chemical and physical cues that control cellular processes such as migration, division, differentiation, and cancer progression. Cells can mechanically alter the matrix by applying forces that result in matrix displacements, which in turn may localize to form dense bands along which cells may migrate. To quantify the displacements, we use confocal microscopy and fluorescent labeling to acquire high-contrast images of the fibrous material. Using a technique for quantitative image analysis called digital volume correlation, we then compute the matrix displacements. Our experimental technology offers a means to quantify matrix mechanics and cell-matrix interactions. We are now using these experimental tools to modulate mechanical properties of the matrix to study cell contraction and migration.
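
    The displacement computation at the heart of digital volume correlation can be sketched as finding the shift that maximizes the cross-correlation between a reference and a deformed subvolume; the FFT-based toy example below recovers a known integer-voxel shift from synthetic data.

        import numpy as np

        rng = np.random.default_rng(1)
        ref = rng.random((32, 32, 32))                 # reference subvolume
        true_shift = (2, -1, 3)
        deformed = np.roll(ref, true_shift, axis=(0, 1, 2))

        # Circular cross-correlation via the correlation theorem.
        corr = np.fft.ifftn(np.fft.fftn(deformed) *
                            np.conj(np.fft.fftn(ref))).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)

        # Wrap peak indices into signed displacements.
        shift = [p - n if p > n // 2 else p for p, n in zip(peak, corr.shape)]
        print(shift)  # -> [2, -1, 3]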

  11. Three-dimensional characterization of pigment dispersion in dried paint films using focused ion beam-scanning electron microscopy.

    PubMed

    Lin, Jui-Ching; Heeschen, William; Reffner, John; Hook, John

    2012-04-01

    The combination of integrated focused ion beam-scanning electron microscope (FIB-SEM) serial sectioning and imaging techniques with image analysis provided quantitative characterization of three-dimensional (3D) pigment dispersion in dried paint films. The focused ion beam in a FIB-SEM dual beam system enables great control in slicing paints, and the sectioning process can be synchronized with SEM imaging, providing high-quality serial cross-section images for 3D reconstruction. Application of the Euclidean distance map and ultimate eroded points image analysis methods provides quantitative characterization of the 3D particle distribution. It is concluded that 3D measurement of binder distribution in paints is effective for characterizing the degree of pigment dispersion in dried paint films.
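
    A minimal 2D sketch of the Euclidean distance map / ultimate eroded points analysis named above, using SciPy on a toy binary image (the study applied the same idea to 3D FIB-SEM reconstructions):

        import numpy as np
        from scipy import ndimage

        # Toy binary "binder" image with two overlapping square regions.
        binder = np.zeros((40, 40), dtype=bool)
        binder[5:20, 5:20] = True
        binder[15:35, 15:35] = True

        # Euclidean distance map: distance from each binder pixel to the
        # nearest background (pigment) pixel.
        edm = ndimage.distance_transform_edt(binder)

        # Ultimate eroded points: local maxima of the distance map, which
        # mark the centers of the constituent regions.
        local_max = ndimage.maximum_filter(edm, size=3)
        uep = (edm == local_max) & binder
        print(int(uep.sum()), "candidate region centers")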

  12. Predicting ESI/MS Signal Change for Anions in Different Solvents.

    PubMed

    Kruve, Anneli; Kaupmees, Karl

    2017-05-02

    LC/ESI/MS is a technique widely used for qualitative and quantitative analysis in various fields. However, quantification is currently possible only for compounds for which standard substances are available, as the ionization efficiency of different compounds in the ESI source differs by orders of magnitude. In this paper we present an approach for quantitative LC/ESI/MS analysis without standard substances. This approach relies on accurately predicting the ionization efficiencies in the ESI source based on a model that uses physicochemical parameters of the analytes. Furthermore, the model has been made transferable between different mobile phases and instrument setups by using a suitable set of calibration compounds. The approach has been validated both in flow injection and in chromatographic mode with gradient elution.
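
    In spirit, the approach amounts to regressing (log) ionization efficiency on physicochemical descriptors measured for a calibration set and then predicting it for new analytes; the sketch below uses an ordinary linear model with invented descriptor values, not the authors' actual model or parameters.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Invented descriptors per analyte, e.g. [logP, pKa].
        descriptors = np.array([[2.1, 4.5], [0.8, 9.2], [3.4, 2.1],
                                [1.5, 6.7], [2.9, 3.3], [0.4, 10.1]])
        log_ie = np.array([3.2, 1.1, 4.0, 2.3, 3.6, 0.8])  # calibration set

        model = LinearRegression().fit(descriptors, log_ie)
        print(model.predict([[2.0, 5.0]]))  # predicted logIE, new analyte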

  13. Nondestructive evaluation of degradation in papaya fruit using intensity based algorithms

    NASA Astrophysics Data System (ADS)

    Kumari, Shubhashri; Nirala, Anil Kumar

    2018-05-01

    In the proposed work, degradation in papaya fruit has been evaluated nondestructively using the laser biospeckle technique. The biospeckle activity inside the fruit has been evaluated qualitatively and quantitatively during its progression from maturity to degradation using intensity-based algorithms. The co-occurrence matrix (COM) has been used for qualitative analysis, whereas the Inertia Moment (IM), Absolute Value Difference (AVD) and autocovariance methods have been used for quantitative analysis. The biospeckle activity was found to first increase and then decrease over the five-day study period. In addition, granulometric size distribution (GSD) has been used for the first time to evaluate papaya degradation. It is concluded that the degradation process of papaya fruit can be evaluated nondestructively using all the mentioned algorithms.
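
    One of the intensity-based algorithms named above, the inertia moment of the co-occurrence matrix, can be sketched in a few lines: build the matrix of successive gray-level transitions along each pixel's time history and weight it by the squared gray-level jump. The data below are synthetic stand-ins for recorded biospeckle frames.

        import numpy as np

        rng = np.random.default_rng(2)
        thsp = rng.integers(0, 256, size=(64, 300))  # 64 pixels x 300 frames

        # Co-occurrence matrix of consecutive intensities in time.
        com = np.zeros((256, 256))
        for row in thsp:
            for a, b in zip(row[:-1], row[1:]):
                com[a, b] += 1

        # Row-normalize and weight by (i - j)^2: larger IM means higher
        # biospeckle activity.
        com = com / np.maximum(com.sum(axis=1, keepdims=True), 1)
        i, j = np.indices(com.shape)
        im = float((com * (i - j) ** 2).sum())
        print(f"inertia moment = {im:.1f}")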

  14. Quantitative radiochemical method for determination of major sources of natural radioactivity in ores and minerals

    USGS Publications Warehouse

    Rosholt, J.N.

    1954-01-01

    When an ore sample contains radioactivity other than that attributable to the uranium series in equilibrium, a quantitative analysis of the other emitters must be made in order to determine the source of this activity. Thorium-232, radon-222, and lead-210 have been determined by isolation and subsequent activity analysis of some of their short-lived daughter products. The sulfides of bismuth and polonium are precipitated out of solutions of thorium or uranium ores, and the α-particle activity of polonium-214, polonium-212, and polonium-210 is determined by scintillation-counting techniques. Polonium-214 activity is used to determine radon-222, polonium-212 activity for thorium-232, and polonium-210 for lead-210. The development of these methods of radiochemical analysis will facilitate the rapid determination of some of the major sources of natural radioactivity.

  15. Advances in Surface Plasmon Resonance Imaging enable quantitative measurement of laterally heterogeneous coatings of nanoscale thickness

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2013-03-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to the optical properties of nanoscale coatings on thin metallic surfaces, for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases, namely uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. We first demonstrate the directionally heterogeneous nature of the SPR phenomenon using a directionally ordered sample, then show how this allows for the calculation of the average coverage of a heterogeneous sample. Finally, the degradation of cellulose microfibrils and bundles of microfibrils by cellulolytic enzymes is presented as an excellent example of the capabilities of the SPR imaging system.

  16. Analysis of ribosomal RNA stability in dead cells of wine yeast by quantitative PCR.

    PubMed

    Sunyer-Figueres, Merce; Wang, Chunxiao; Mas, Albert

    2018-04-02

    During wine production, some yeasts enter a Viable But Not Culturable (VBNC) state, which may influence the quality and stability of the final wine through remnant metabolic activity or by resuscitation. Culture-independent techniques are used for obtaining an accurate estimation of the number of live cells, and quantitative PCR could be the most accurate technique. As a marker of cell viability, rRNA was evaluated by analyzing its stability in dead cells. The species-specific stability of rRNA was tested in Saccharomyces cerevisiae, as well as in three species of non-Saccharomyces yeast (Hanseniaspora uvarum, Torulaspora delbrueckii and Starmerella bacillaris). High temperature and antimicrobial dimethyl dicarbonate (DMDC) treatments were efficient in lysing the yeast cells. rRNA gene and rRNA (as cDNA) were analyzed over 48 h after cell lysis by quantitative PCR. The results confirmed the stability of rRNA for 48 h after the cell lysis treatments. To sum up, rRNA may not be a good marker of cell viability in the wine yeasts that were tested. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.

    PubMed

    Ritz, Christian; Van der Vliet, Leana

    2009-09-01

    The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions (variance homogeneity and normality) that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions is often caused by reduced variance at the concentrations showing severe adverse effects. Transformation of the response variable alone, although commonly used with linear regression analysis, is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less desirable and less flexible analytical techniques, such as linear interpolation.
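
    A minimal sketch of the Box-Cox step on synthetic, heteroscedastic toxicity-like data, using SciPy's maximum-likelihood estimate of the transformation parameter (the paper's full workflow then proceeds to nonlinear regression on the transformed response):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        conc = np.repeat([1.0, 2.0, 4.0, 8.0, 16.0], 20)
        # Multiplicative noise: variance shrinks where the response is low,
        # as at the high-effect concentrations described in the paper.
        response = (100.0 / (1.0 + (conc / 4.0) ** 2)
                    * rng.lognormal(0.0, 0.15, conc.size))

        transformed, lam = stats.boxcox(response)
        print(f"estimated Box-Cox lambda = {lam:.2f}")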

  18. A novel quantitative analysis method of three-dimensional fluorescence spectra for vegetable oils contents in edible blend oil

    NASA Astrophysics Data System (ADS)

    Xu, Jing; Wang, Yu-Tian; Liu, Xiao-Fei

    2015-04-01

    Edible blend oil is a mixture of vegetable oils. An eligible blend oil can meet the daily need for the two essential fatty acids, providing balanced nutrition. Each vegetable oil has a different composition, so the vegetable oil contents in edible blend oil determine its nutritional components. A high-precision quantitative method to determine the vegetable oil contents in blend oil is therefore necessary to ensure balanced nutrition. The three-dimensional fluorescence technique offers high selectivity, high sensitivity, and high efficiency; efficient extraction and full use of the information in three-dimensional fluorescence spectra improve the accuracy of the measurement. A novel quantitative analysis method based on Quasi-Monte Carlo integration is proposed to improve the measurement sensitivity and reduce the random error. The partial least squares method is used to solve the nonlinear equations and avoid the effects of multicollinearity. The recovery rates of blend oil mixed from peanut oil, soybean oil and sunflower oil are calculated to verify the accuracy of the method; they are higher than those of the linear method commonly used for component concentration measurement.
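
    As a sketch of the multivariate calibration step, the snippet below applies partial least squares to flattened synthetic excitation-emission spectra of three-component blends; the "pure oil" profiles and blend fractions are randomly generated stand-ins, not the paper's data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(4)
        pure = rng.random((3, 20 * 30))            # 3 flattened pure-oil EEMs
        fractions = rng.dirichlet(np.ones(3), 40)  # 40 blends, fractions sum to 1
        spectra = fractions @ pure + rng.normal(0.0, 0.01, (40, pure.shape[1]))

        pls = PLSRegression(n_components=3).fit(spectra, fractions)
        print(np.round(pls.predict(spectra[:2]), 3))  # ~ fractions[:2]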

  19. [Sample preparation and bioanalysis in mass spectrometry].

    PubMed

    Bourgogne, Emmanuel; Wagner, Michel

    2015-01-01

    The quantitative analysis of compounds of clinical interest of low molecular weight (<1000 Da) in biological fluids is currently performed in most cases by liquid chromatography-mass spectrometry (LC-MS). Analysis of these compounds in biological fluids (plasma, urine, saliva, hair...) is a difficult task requiring sample preparation. Sample preparation is a crucial part of chemical/biological analysis and is, in a sense, considered the bottleneck of the whole analytical process. The main objectives of sample preparation are the removal of potential interferences, analyte preconcentration, and conversion (if needed) of the analyte into a form more suitable for detection or separation. Without chromatographic separation, endogenous compounds and co-eluting products may affect the performance of a quantitative mass spectrometry method. This work focuses on three distinct parts. First, quantitative bioanalysis is defined, together with the different matrices and sample preparation techniques currently used in mass spectrometry bioanalysis of small molecules of clinical interest in biological fluids. In a second step, the goals of sample preparation are described. Finally, in a third step, sample preparation strategies, carried out either directly ("dilute and shoot") or after precipitation, are discussed.

  20. Report on the analysis of common beverages spiked with gamma-hydroxybutyric acid (GHB) and gamma-butyrolactone (GBL) using NMR and the PURGE solvent-suppression technique.

    PubMed

    Lesar, Casey T; Decatur, John; Lukasiewicz, Elaan; Champeil, Elise

    2011-10-10

    In forensic evidence, the identification and quantitation of gamma-hydroxybutyric acid (GHB) in "spiked" beverages are challenging. In this report, we present the analysis of common alcoholic beverages found in clubs and bars spiked with gamma-hydroxybutyric acid (GHB) and gamma-butyrolactone (GBL). Our analysis of the spiked beverages used (1)H NMR with a water suppression method called Presaturation Utilizing Relaxation Gradients and Echoes (PURGE). The following beverages were analyzed: water, 10% ethanol in water, vodka-cranberry juice, rum and coke, gin and tonic, whisky and diet coke, white wine, red wine, and beer. The PURGE method allowed the direct identification and quantitation of both compounds in all beverages except red and white wine, where small interferences prevented accurate quantitation. Thanks to the use of a capillary internal standard, the method is fast, non-destructive and sensitive, and requires no sample preparation that could disrupt the equilibrium between GHB and GBL. Published by Elsevier Ireland Ltd.
