Science.gov

Sample records for quantitative methods results

  1. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  2. The Quantitative Methods Boot Camp: Teaching Quantitative Thinking and Computing Skills

    E-print Network

    Born, Richard

    The boot camp teaches basic programming using biological examples drawn from statistics, image processing, and data analysis, taking an integrative approach to teaching programming and quantitative reasoning for the study of biological systems.

  3. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  4. Automated Quantitative Nuclear Cardiology Methods.

    PubMed

    Motwani, Manish; Berman, Daniel S; Germano, Guido; Slomka, Piotr

    2016-02-01

    Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps, and estimate global and local measures of stress/rest perfusion, all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This article briefly reviews these techniques, highlights several challenges, and discusses the latest developments. PMID:26590779

  5. Quantitative Methods in Psychology: A Power Primer

    E-print Network

    A Power Primer. Jacob Cohen, New York University. One motivation cited is a survey of the Journal of Abnormal and Social Psychology, which found low mean power to detect medium effect sizes, together with a power review of the 1984 volume of the Journal of Abnormal Psychology conducted some 24 years after the original survey.

  6. Optimization method for quantitative calculation of clay minerals in soil

    NASA Astrophysics Data System (ADS)

    Hao, Libo; Wei, Qiaoqiao; Zhao, Yuyan; Lu, Jilong; Zhao, Xinyun

    2015-04-01

    Determination of types and amounts for clay minerals in soil are important in environmental, agricultural, and geological investigations. Many reliable methods have been established to identify clay mineral types. However, no reliable method for quantitative analysis of clay minerals has been established so far. In this study, an attempt was made to propose an optimization method for the quantitative determination of clay minerals in soil based on bulk chemical composition data. The fundamental principles and processes of the calculation are elucidated. Some samples were used for reliability verification of the method and the results prove the simplicity and efficacy of the approach.
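
    A minimal sketch of how such an optimization can be set up, assuming a hypothetical matrix of end-member mineral compositions (the paper's actual mineral set and constraints are not reproduced here): the bulk oxide analysis is expressed as a non-negative mixture of candidate clay minerals and solved by constrained least squares.

        # Sketch: estimate clay-mineral proportions from bulk oxide chemistry by
        # non-negative least squares.  The composition matrix is hypothetical.
        import numpy as np
        from scipy.optimize import nnls

        # Rows: oxides (SiO2, Al2O3, MgO); columns: candidate minerals
        # (illustrative end-member weight fractions, not reference values).
        A = np.array([
            [0.52, 0.47, 0.61],   # SiO2 in kaolinite, illite, smectite (assumed)
            [0.40, 0.33, 0.20],   # Al2O3
            [0.00, 0.03, 0.05],   # MgO
        ])
        b = np.array([0.55, 0.30, 0.02])   # measured bulk composition (assumed)

        x, _ = nnls(A, b)                  # non-negative mineral proportions
        x = x / x.sum()                    # normalise to fractions summing to 1
        print(dict(zip(["kaolinite", "illite", "smectite"], x.round(3))))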

  7. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  8. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. PMID:26456251
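
    For illustration only, independent relative uncertainty components can be combined in quadrature to give a combined relative uncertainty; the component names and values below are assumptions, not the paper's ANOVA-based estimates.

        # Sketch: combine independent relative uncertainty components (in %) in
        # quadrature.  Component names and values are illustrative only.
        import math

        components = {
            "microorganism_type": 15.0,
            "product_matrix": 20.0,
            "reading_interpretation": 18.0,
        }
        combined = math.sqrt(sum(u ** 2 for u in components.values()))
        print(f"combined relative uncertainty ~ {combined:.1f}%")   # ~30.8%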

  9. [New trends in functional medical imaging: quantitative methods].

    PubMed

    Balkay, László; Emri, Miklós; Galuska, László; Garai, Ildikó; Kis, Attila Sándor; Szűcs, Bernadett; Varga, József

    2014-12-01

    Deriving quantitative measures from the medical imaging methods is a key issue for the optimal oncologic therapy, when the anatomical abnormalities and changes of the metabolic state of the tissues need to be characterized. In order to improve the effectiveness of the therapy, the results of medical imaging procedures should be comparable after two or more consecutive scans. There are several tomographic imaging applications (CT, MRI, SPECT and PET), but in this work we will focus on the quantitative capability of PET, because this method provides the most versatile possibilities for quantifying the resulting images. PMID:25517443

  10. Quantitative laser-induced breakdown spectroscopy data using peak area step-wise regression analysis: an alternative method for interpretation of Mars science laboratory results

    SciTech Connect

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Dyar, Melinda D; Schafer, Martha W; Tucker, Jonathan M

    2008-01-01

    The ChemCam instrument on the Mars Science Laboratory (MSL) will include a laser-induced breakdown spectrometer (LIBS) to quantify major and minor elemental compositions. The traditional analytical chemistry approach to calibration curves for these data regresses a single diagnostic peak area against concentration for each element. This approach contrasts with a new multivariate method in which elemental concentrations are predicted by step-wise multiple regression analysis based on areas of a specific set of diagnostic peaks for each element. The method is tested on LIBS data from igneous and metamorphosed rocks. Between 4 and 13 partial regression coefficients are needed to describe each elemental abundance accurately (i.e., with a regression line of R^2 > 0.9995 for the relationship between predicted and measured elemental concentration) for all major and minor elements studied. Validation plots suggest that the method is limited at present by the small data set, and will work best for prediction of concentration when a wide variety of compositions and rock types has been analyzed.
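
    A minimal sketch of the general idea, forward stepwise selection of diagnostic peak areas followed by multiple linear regression, using scikit-learn on synthetic data; the peak lists, spectra and selection criteria of the actual study are not reproduced here.

        # Sketch: predict an elemental concentration from LIBS peak areas by
        # forward stepwise selection + multiple linear regression (synthetic data).
        import numpy as np
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        peak_areas = rng.random((40, 20))        # 40 spectra x 20 candidate peaks
        concentration = peak_areas[:, :4] @ np.array([2.0, 1.5, 0.7, 0.3]) \
            + 0.05 * rng.standard_normal(40)

        selector = SequentialFeatureSelector(
            LinearRegression(), n_features_to_select=4, direction="forward", cv=5)
        selector.fit(peak_areas, concentration)
        selected = np.flatnonzero(selector.get_support())

        model = LinearRegression().fit(peak_areas[:, selected], concentration)
        print("selected peaks:", selected,
              "R^2:", round(model.score(peak_areas[:, selected], concentration), 4))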

  11. Results of a European interlaboratory method validation study for the quantitative determination of lipophilic marine biotoxins in raw and cooked shellfish based on high-performance liquid chromatography-tandem mass spectrometry. Part I: collaborative study.

    PubMed

    These, Anja; Klemm, Christine; Nausch, Ingo; Uhlig, Steffen

    2011-01-01

    A European interlaboratory collaborative study was conducted to validate a method for the quantitative determination of lipophilic marine biotoxins based on high-performance liquid chromatography-tandem mass spectrometry. During this study, the diarrhetic shellfish poisoning toxins okadaic acid and dinophysistoxins 1 and 2 (including their esters), the azaspiracids 1-3, pectenotoxin-2, and the yessotoxins were investigated at concentration levels near the limit of quantification and near the legal limit. Naturally contaminated blue mussels, both raw and cooked, and spiked extracts of clams and oysters were studied, and results were obtained for 16 test samples from 16 laboratories representing eight different countries. This article summarizes the study outcome concerning key validation parameters such as specificity, linearity, limit of detection, accuracy/recovery, and precision. Further, the influence of cooking the mussels before homogenization or hydrolysis on method robustness has been evaluated. PMID:21107979

  12. Bregman methods in quantitative photoacoustic tomography

    E-print Network

    Ferguson, Thomas S.

    Bregman methods in quantitative photoacoustic tomography. Hao Gao, Hongkai Zhao and Stanley Osher. Discusses Bregman methods alongside Jacobian-based and gradient-based methods in quantitative photoacoustic tomography, an imaging modality in which the optical signal is measured after numerous scattering events.

  13. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  14. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial assaying glucose strips is presented. (MVL)

  15. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C. (Los Alamos, NM); Kuske, Cheryl R. (Los Alamos, NM); Mullen, Kenneth I. (Los Alamos, NM)

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.

  16. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful for determining the type attribute of an object because it presents the content of its constituents. QPA by the Rietveld method requires neither measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with a material analysis using diffraction program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case that was solved with the help of the Rietveld QPA method is also introduced. This method will allow forensic investigators to acquire detailed information about the material evidence, which can help direct case investigation and court proceedings. PMID:25782471

  17. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    PubMed Central

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid so far to quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. In order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were clear. Four texture parameters extracted by GLCM reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies. PMID:24744684
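
    A minimal sketch of the GLCM texture step, assuming an already preprocessed (e.g. BEMD-filtered) 8-bit image and scikit-image's co-occurrence utilities; the BEMD decomposition itself is not shown, and the image below is synthetic.

        # Sketch: extract GLCM texture parameters from a preprocessed image.
        # In the paper the IMF1 image from BEMD is analysed; here the input is synthetic.
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19

        image = (np.random.default_rng(1).random((128, 128)) * 255).astype(np.uint8)

        glcm = graycomatrix(image, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        for prop in ("contrast", "correlation", "energy", "homogeneity"):
            print(prop, float(graycoprops(glcm, prop).mean()))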

  18. A quantitative method for measuring the quality of history matches

    SciTech Connect

    Shaw, T.S.; Knapp, R.M.

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for such a purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
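
    The abstract does not spell out the three indices; as an illustration, the sketch below uses common stand-ins, Pearson correlation for shape conformity, mean error for bias, and RMSE for magnitude of deviation, to score a simulated curve against history.

        # Sketch: score a simulated curve against a historical curve with three
        # illustrative indices (stand-ins for the paper's shape/bias/magnitude tests).
        import numpy as np

        def match_quality(history, simulated):
            history = np.asarray(history, dtype=float)
            simulated = np.asarray(simulated, dtype=float)
            shape = np.corrcoef(history, simulated)[0, 1]               # shape conformity
            bias = float(np.mean(simulated - history))                  # systematic bias
            rmse = float(np.sqrt(np.mean((simulated - history) ** 2)))  # deviation magnitude
            return shape, bias, rmse

        t = np.linspace(0.0, 1.0, 50)
        print(match_quality(np.exp(-t), np.exp(-1.1 * t) + 0.02))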

  19. Liquid Chromatography-Mass Spectrometry Quantitation: Applications and Methods

    E-print Network

    Baird, Serena Nicole

    2013-12-31

    …and environmental hazards. Liquid chromatography-mass spectrometry (LC-MS) is capable of completing such tasks using a variety of quantitative methods. In Chapter 1, these methods are presented with regard to chemical warfare agent studies. External calibration...

  20. Research radiometric calibration quantitative transfer methods between internal and external

    NASA Astrophysics Data System (ADS)

    Guo, Ju Guang; Ma, Yong hui; Zhang, Guang; Yang, Zhi hui

    2015-10-01

    This paper puts forward a method for transferring radiometric calibration between the internal and external paths of an infrared radiation characteristics quantitative measurement system, and establishes a theoretical model of the corresponding radiative transfer. In engineering applications the method allows the relatively simple and effective calibration of the half optical path to replace the complex and difficult radiometric calibration of the whole optical path. It also provides an effective basis for further quantitative measurement of target radiation characteristics with ground-based infrared quantitative measurement systems.

  1. Machine Learning methods for Quantitative Radiomic Biomarkers

    PubMed Central

    Parmar, Chintan; Grossmann, Patrick; Bussink, Johan; Lambin, Philippe; Aerts, Hugo J. W. L.

    2015-01-01

    Radiomics extracts and mines large numbers of medical imaging features quantifying tumor phenotypic characteristics. Highly accurate and reliable machine-learning approaches can drive the success of radiomic applications in clinical care. In this radiomic study, fourteen feature selection methods and twelve classification methods were examined in terms of their performance and stability for predicting overall survival. A total of 440 radiomic features were extracted from pre-treatment computed tomography (CT) images of 464 lung cancer patients. To ensure the unbiased evaluation of different machine-learning methods, publicly available implementations along with reported parameter configurations were used. Furthermore, we used two independent radiomic cohorts for training (n = 310 patients) and validation (n = 154 patients). We identified that the Wilcoxon test based feature selection method WLCX (stability = 0.84 ± 0.05, AUC = 0.65 ± 0.02) and the random forest classification method RF (RSD = 3.52%, AUC = 0.66 ± 0.03) had the highest prognostic performance with high stability against data perturbation. Our variability analysis indicated that the choice of classification method is the most dominant source of performance variation (34.21% of total variance). Identification of optimal machine-learning methods for radiomic applications is a crucial step towards stable and clinically relevant radiomic biomarkers, providing a non-invasive way of quantifying and monitoring tumor-phenotypic characteristics in clinical practice. PMID:26278466
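
    A minimal sketch of the best-performing combination reported, Wilcoxon-test feature ranking followed by a random forest, on a synthetic feature matrix; it is not the authors' published pipeline or parameter configuration.

        # Sketch: Wilcoxon-based feature ranking + random forest classification on a
        # synthetic radiomic feature matrix (illustrative; not the study's pipeline).
        import numpy as np
        from scipy.stats import ranksums
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.standard_normal((200, 50))      # 200 patients x 50 radiomic features
        y = (X[:, :3].sum(axis=1) + rng.standard_normal(200) > 0).astype(int)  # outcome label

        # Rank features by Wilcoxon rank-sum p-value between the two outcome groups.
        pvals = np.array([ranksums(X[y == 0, j], X[y == 1, j]).pvalue
                          for j in range(X.shape[1])])
        top = np.argsort(pvals)[:10]

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("CV AUC:", cross_val_score(clf, X[:, top], y, cv=5, scoring="roc_auc").mean())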

  2. Quantitative Hydrocarbon Energies from the PMO Method.

    ERIC Educational Resources Information Center

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  3. Review of Quantitative Software Reliability Methods

    SciTech Connect

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of digital systems using dynamic PRA methods. These efforts, documented in NUREG/CR-6901, NUREG/CR-6942, and NUREG/CR-6985, included a functional representation of the system's software but did not explicitly address failure modes caused by software defects or by inadequate design requirements. An important identified research need is to establish a commonly accepted basis for incorporating the behavior of software into digital I&C system reliability models for use in PRAs. To address this need, BNL is exploring the inclusion of software failures into the reliability models of digital I&C systems, such that their contribution to the risk of the associated NPP can be assessed.

  4. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, F.A.

    1980-12-12

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  5. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, Frank A. (Livermore, CA)

    1982-01-01

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  6. A Quantitative Method for Machine Translation Evaluation

    E-print Network

    A Quantitative Method for Machine Translation Evaluation. Jesús Tomás, Escola Politècnica Superior de… Research in automatic translation lacks an appropriate, consistent evaluation methodology, needed both in the field of research and when a user has to choose between two or more translators.

  7. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software where the results were used to obtain corrected images. Corrected images based on the ML approach and the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality and assurance of performance during clinical use of endoscopic technologies.
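
    A minimal sketch of a local-magnification style distortion profile, assuming grid intersection points along a row through the image centre have already been detected; the actual ML algorithm in the paper is more elaborate.

        # Sketch: local magnification along one grid row, expressed as local spacing
        # relative to the spacing at the image centre (synthetic, mildly distorted points).
        import numpy as np

        def local_magnification(points, centre):
            points = np.asarray(points, dtype=float)
            spacing = np.linalg.norm(np.diff(points, axis=0), axis=1)  # local grid spacing
            radius = np.linalg.norm(points[:-1] - centre, axis=1)
            ml = spacing / spacing[np.argmin(radius)]                  # normalise to centre
            return radius, ml

        x = np.linspace(-100, 100, 21)
        pts = np.column_stack([x * (1 - 2e-5 * x ** 2), np.zeros_like(x)])  # barrel-like distortion
        radius, ml = local_magnification(pts, centre=np.zeros(2))
        print(np.round(ml, 3))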

  8. A Quantitative Assessment Method for Ascaris Eggs on Hands

    PubMed Central

    Jeandron, Aurelie; Ensink, Jeroen H. J.; Thamsborg, Stig M.; Dalsgaard, Anders; Sengupta, Mita E.

    2014-01-01

    The importance of hands in the transmission of soil-transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt] and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed to validate the hand washing method under field conditions, e.g. including people of different ages, lower levels of contamination and various levels of hand cleanliness. PMID:24802859

  9. Quantitative Methods Inquiries: Analysis and Comparison of Under-Five Child Mortality Between Rural…

    E-print Network

    de Leon, Alex R.

    Analysis and comparison of under-five child mortality between rural… International Centre for Diarrhoeal Disease Research, Bangladesh (ICDDR,B). Knowledge of the factors that affect under-five child mortality… Children from rich families and the 2nd or 3rd child have a lower risk of death compared with children from poor families and first-born children.

  10. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
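
    To make the engineering-economics metrics concrete, a small sketch computing net present value and a simple levelized cost of energy; the cash flows, discount rate and generation figures are illustrative assumptions.

        # Sketch: net present value and a simple levelized cost of energy (LCOE).
        # All cash flows, the discount rate and the energy output are illustrative.

        def npv(rate, cashflows):
            """cashflows[0] is year 0 (e.g. a negative capital cost)."""
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

        def lcoe(rate, costs, energy_kwh):
            """Discounted lifetime cost divided by discounted lifetime generation."""
            pv_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
            pv_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energy_kwh))
            return pv_cost / pv_energy

        rate = 0.06
        costs = [1_000_000] + [20_000] * 25       # capital cost, then annual O&M
        energy = [0] + [1_500_000] * 25           # kWh generated per year
        print("NPV of costs:", round(npv(rate, [-c for c in costs])))
        print("LCOE ($/kWh):", round(lcoe(rate, costs, energy), 4))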

  11. Quantitative assessment of single-cell RNA-sequencing methods

    PubMed Central

    Wu, Angela R; Neff, Norma F; Kalisky, Tomer; Dalerba, Piero; Treutlein, Barbara; Rothenberg, Michael E; Mburu, Francis M; Mantalas, Gary L; Sim, Sopheak; Clarke, Michael F; Quake, Stephen R

    2014-01-01

    Interest in single-cell whole-transcriptome analysis is growing rapidly, especially for profiling rare or heterogeneous populations of cells. We compared commercially available single-cell RNA amplification methods with both microliter and nanoliter volumes, using sequence from bulk total RNA and multiplexed quantitative PCR as benchmarks to systematically evaluate the sensitivity and accuracy of various single-cell RNA-seq approaches. We show that single-cell RNA-seq can be used to perform accurate quantitative transcriptome measurement in individual cells with a relatively small number of sequencing reads and that sequencing large numbers of single cells can recapitulate bulk transcriptome complexity. PMID:24141493

  12. Quantitative methods for analyzing cell-cell adhesion in development.

    PubMed

    Kashef, Jubin; Franz, Clemens M

    2015-05-01

    During development cell-cell adhesion is not only crucial to maintain tissue morphogenesis and homeostasis, it also activates signalling pathways important for the regulation of different cellular processes including cell survival, gene expression, collective cell migration and differentiation. Importantly, gene mutations of adhesion receptors can cause developmental disorders and different diseases. Quantitative methods to measure cell adhesion are therefore necessary to understand how cells regulate cell-cell adhesion during development and how aberrations in cell-cell adhesion contribute to disease. Different in vitro adhesion assays have been developed in the past, but not all of them are suitable to study developmentally-related cell-cell adhesion processes, which usually requires working with low numbers of primary cells. In this review, we provide an overview of different in vitro techniques to study cell-cell adhesion during development, including a semi-quantitative cell flipping assay, and quantitative single-cell methods based on atomic force microscopy (AFM)-based single-cell force spectroscopy (SCFS) or dual micropipette aspiration (DPA). Furthermore, we review applications of Förster resonance energy transfer (FRET)-based molecular tension sensors to visualize intracellular mechanical forces acting on cell adhesion sites. Finally, we describe a recently introduced method to quantitate cell-generated forces directly in living tissues based on the deformation of oil microdroplets functionalized with adhesion receptor ligands. Together, these techniques provide a comprehensive toolbox to characterize different cell-cell adhesion phenomena during development. PMID:25448695

  13. Analysis of 129I in Groundwater Samples: Direct and Quantitative Results below the Drinking Water Standard

    SciTech Connect

    Brown, Christopher F.; Geiszler, Keith N.; Lindberg, Michael J.

    2007-03-03

    Due to its long half-life (15.7 million years) and relatively unencumbered migration in subsurface environments, 129I has been recognized as a contaminant of concern at numerous federal, private, and international facilities. In order to understand the long-term risk associated with 129I at these locations, quantitative analysis of groundwater samples must be performed. However, the ability to quantitatively assess the 129I content in groundwater samples requires specialized extraction and sophisticated analytical techniques, which are complicated and not always available to the general scientific community. This paper highlights an analytical method capable of directly quantifying 129I in groundwater samples at concentrations below the MCL without the need for sample pre-concentration. Samples were analyzed on a Perkin Elmer ELAN DRC II ICP-MS after minimal dilution using O2 as the reaction gas. Analysis of continuing calibration verification standards indicated that the DRC mode could be used for quantitative analysis of 129I in samples below the drinking water standard (0.0057 ng/ml or 1 pCi/L). The low analytical detection limit of 129I analysis in the DRC mode coupled with minimal sample dilution (1.02x) resulted in a final sample limit of quantification of 0.0051 ng/ml. Subsequent analysis of three groundwater samples containing 129I resulted in fully quantitative results in the DRC mode, and spike recovery analyses performed on all three samples confirmed that the groundwater matrix did not adversely impact the analysis of 129I in the DRC mode. This analytical approach has been proven to be a cost-effective, high-throughput technique for the direct, quantitative analysis of 129I in groundwater samples at concentrations below the current MCL.

  14. Quantitative method of measuring cancer cell urokinase and metastatic potential

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  15. Analytical methods for quantitation of prenylated flavonoids from hops

    PubMed Central

    Nikolić, Dejan; van Breemen, Richard B.

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106

  16. Analytical methods for quantitation of prenylated flavonoids from hops.

    PubMed

    Nikolić, Dejan; van Breemen, Richard B

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106

  17. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. PMID:24889823

  18. Thermography as a quantitative imaging method for assessing postoperative inflammation

    PubMed Central

    Christensen, J; Matzen, LH; Vaeth, M; Schou, S; Wenzel, A

    2012-01-01

    Objective To assess differences in skin temperature between the operated and control side of the face after mandibular third molar surgery using thermography. Methods 127 patients had 1 mandibular third molar removed. Before the surgery, standardized thermograms were taken of both sides of the patient's face using a Flir ThermaCam™ E320 (Precisions Teknik AB, Halmstad, Sweden). The imaging procedure was repeated 2 days and 7 days after surgery. A region of interest including the third molar region was marked on each image. The mean temperature within each region of interest was calculated. The difference between sides and over time were assessed using paired t-tests. Results No significant difference was found between the operated side and the control side either before or 7 days after surgery (p > 0.3). The temperature of the operated side (mean: 32.39 °C, range: 28.9–35.3 °C) was higher than that of the control side (mean: 32.06 °C, range: 28.5–35.0 °C) 2 days after surgery [0.33 °C, 95% confidence interval (CI): 0.22–0.44 °C, p < 0.001]. No significant difference was found between the pre-operative and the 7-day post-operative temperature (p > 0.1). After 2 days, the operated side was not significantly different from the temperature pre-operatively (p = 0.12), whereas the control side had a lower temperature (0.57 °C, 95% CI: 0.29–0.86 °C, p < 0.001). Conclusions Thermography seems useful for quantitative assessment of inflammation between the intervention side and the control side after surgical removal of mandibular third molars. However, thermography cannot be used to assess absolute temperature changes due to normal variations in skin temperature over time. PMID:22752326
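
    The statistical comparison described, a paired t-test on mean ROI temperatures of the operated versus the control side, can be sketched as below; the temperature arrays are synthetic placeholders, not the study data.

        # Sketch: paired t-test on mean ROI skin temperatures (operated vs. control side).
        # Values are synthetic; only the sample size mirrors the study (127 patients).
        import numpy as np
        from scipy.stats import ttest_rel

        rng = np.random.default_rng(0)
        control = rng.normal(32.1, 1.2, size=127)
        operated = control + rng.normal(0.33, 0.4, size=127)   # simulated day-2 difference

        t_stat, p_value = ttest_rel(operated, control)
        diff = operated - control
        ci = diff.mean() + np.array([-1.96, 1.96]) * diff.std(ddof=1) / np.sqrt(diff.size)
        print(f"mean difference {diff.mean():.2f} C, 95% CI {np.round(ci, 2)}, p = {p_value:.2g}")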

  19. [Study on the multivariate quantitative analysis method for steel alloy elements using LIBS].

    PubMed

    Gu, Yan-hong; Li, Ying; Tian, Ye; Lu, Yuan

    2014-08-01

    Quantitative analysis of steel alloys was carried out using laser-induced breakdown spectroscopy (LIBS), taking into account the complex matrix effects in steel alloy samples. The laser-induced plasma was generated by a Q-switched Nd:YAG laser operating at 1064 nm with a pulse width of 10 ns and a repetition rate of 10 Hz. The LIBS signal was coupled to an echelle spectrometer and recorded by a highly sensitive ICCD detector. To obtain the best experimental conditions, parameters such as the detection delay, the ICCD integration gate width and the detection position relative to the sample surface were optimized. The experimental results showed that the optimum detection delay time was 1.5 μs, the optimal integration gate width was 2 μs and the best detection position was 1.5 mm below the alloy sample's surface. The samples used in the experiments were ten standard steel alloy samples and two unknown steel alloy samples. The quantitative analysis was investigated with the optimum experimental parameters. The elements Cr and Ni in the steel alloy samples were taken as the detection targets. The analysis was carried out with methods based on conventional univariate quantitative analysis, multiple linear regression and partial least squares (PLS), respectively. It turned out that the correlation coefficients of the calibration curves were not very high with the conventional univariate calibration method, and the relative errors for the two predicted samples were unsatisfactory, so the conventional univariate method cannot effectively serve quantitative analysis of multi-component, complex-matrix steel alloy samples. With the multiple linear regression method, the analysis accuracy improved. The method based on partial least squares (PLS) turned out to be the best of the three quantitative analysis methods applied. Based on PLS, the correlation coefficient of the calibration curve for Cr is 0.981 and that for Ni is 0.995. The concentrations of Cr and Ni in the two target samples were determined using the PLS calibration method, and the relative errors for the two unknown steel alloy samples are lower than 6.62% and 1.49%, respectively. The results showed that in the quantitative analysis of steel alloys, matrix effects can be effectively reduced and the quantitative analysis accuracy improved by the PLS calibration method. PMID:25508749
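
    A minimal sketch of the PLS calibration step with scikit-learn, assuming a matrix of LIBS spectra and reference concentrations; the experimental spectra and the number of latent variables used in the paper are not reproduced.

        # Sketch: PLS calibration of an element concentration from LIBS spectra
        # (synthetic data standing in for the ten standard steel samples).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(0)
        spectra = rng.random((10, 500))                      # 10 standards x 500 spectral channels
        cr_conc = 5 * spectra[:, 100] + 2 * spectra[:, 250]  # synthetic Cr reference values

        pls = PLSRegression(n_components=3)
        pred = cross_val_predict(pls, spectra, cr_conc, cv=5).ravel()
        print("cross-validated correlation coefficient:",
              round(float(np.corrcoef(cr_conc, pred)[0, 1]), 3))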

  20. [Study on the multivariate quantitative analysis method for steel alloy elements using LIBS].

    PubMed

    Gu, Yan-hong; Li, Ying; Tian, Ye; Lu, Yuan

    2014-08-01

    Quantitative analysis of steel alloys was carried out using laser-induced breakdown spectroscopy (LIBS), taking into account the complex matrix effects in steel alloy samples. The laser-induced plasma was generated by a Q-switched Nd:YAG laser operating at 1064 nm with a pulse width of 10 ns and a repetition rate of 10 Hz. The LIBS signal was coupled to an echelle spectrometer and recorded by a highly sensitive ICCD detector. To obtain the best experimental conditions, parameters such as the detection delay, the ICCD integration gate width and the detection position relative to the sample surface were optimized. The experimental results showed that the optimum detection delay time was 1.5 μs, the optimal integration gate width was 2 μs and the best detection position was 1.5 mm below the alloy sample's surface. The samples used in the experiments were ten standard steel alloy samples and two unknown steel alloy samples. The quantitative analysis was investigated with the optimum experimental parameters. The elements Cr and Ni in the steel alloy samples were taken as the detection targets. The analysis was carried out with methods based on conventional univariate quantitative analysis, multiple linear regression and partial least squares (PLS), respectively. It turned out that the correlation coefficients of the calibration curves were not very high with the conventional univariate calibration method, and the relative errors for the two predicted samples were unsatisfactory, so the conventional univariate method cannot effectively serve quantitative analysis of multi-component, complex-matrix steel alloy samples. With the multiple linear regression method, the analysis accuracy improved. The method based on partial least squares (PLS) turned out to be the best of the three quantitative analysis methods applied. Based on PLS, the correlation coefficient of the calibration curve for Cr is 0.981 and that for Ni is 0.995. The concentrations of Cr and Ni in the two target samples were determined using the PLS calibration method, and the relative errors for the two unknown steel alloy samples are lower than 6.62% and 1.49%, respectively. The results showed that in the quantitative analysis of steel alloys, matrix effects can be effectively reduced and the quantitative analysis accuracy improved by the PLS calibration method. PMID:25474970

  1. Biological characteristics of crucian by quantitative inspection method

    NASA Astrophysics Data System (ADS)

    Chu, Mengqi

    2015-04-01

    The biological characteristics of crucian carp were preliminarily investigated using a quantitative inspection method. The crucian carp (Carassius auratus; Cypriniformes, Cyprinidae) is a mainly plant-eating, omnivorous and gregarious fish that is widely distributed, with year-round production in waters across the country. Indicators measured in the experiment were used to characterize the growth and reproduction of crucian in this area. Measured data (such as scale length, scale size and annulus diameter) and the related growth functions were used to calculate the growth of crucian in any given year. Egg shape, colour and weight were used to determine maturity, and the mean egg diameter per 20 eggs and the number of eggs per 0.5 g were used to calculate the relative and absolute fecundity of the fish; the measured crucian were females at puberty. Based on the relation between scale diameter and body length, a linear relationship was obtained: y = 1.530 + 3.0649x. The data show that fecundity is closely related to age: the older the fish, the more developed the gonads and the greater the number of eggs; absolute fecundity also increases with pituitary development. Quantitative examination of the bait organisms ingested by crucian revealed its main, secondary and incidental foods, as well as its degree of preference for various bait organisms. Fecundity increases with weight gain and has species- and population-specific characteristics, while also being influenced by individual age, body length, body weight, environmental conditions (especially nutrition), breeding habits, number of spawnings and egg size. These studies of the biological characteristics of crucian provide an ecological basis for local plans for crucian feeding, breeding, stock enhancement, fishing, and resource protection and management.

  2. Implementation of a quantitative Foucault knife-edge method by means of isophotometry

    NASA Astrophysics Data System (ADS)

    Zhevlakov, A. P.; Zatsepina, M. E.; Kirillovskii, V. K.

    2014-06-01

    A detailed description of the stages of computer processing of shadowgrams in the implementation of a modern quantitative Foucault knife-edge method is presented. The map of wave-front aberrations introduced by errors of an optical surface or system is shown, along with the results of calculating the set of required image-quality characteristics.

  3. Quantitative imaging of volcanic plumes — Results, needs, and future trends

    USGS Publications Warehouse

    Platt, Ulrich; Lübcke, Peter; Kuhn, Jonas; Bobrowski, Nicole; Prata, Fred; Burton, Mike; Kern, Christoph

    2015-01-01

    Recent technology allows two-dimensional “imaging” of trace gas distributions in plumes. In contrast to older, one-dimensional remote sensing techniques, which are only capable of measuring total column densities, the new imaging methods give insight into details of transport and mixing processes as well as chemical transformation within plumes. We give an overview of gas imaging techniques already being applied at volcanoes (SO2 cameras, imaging DOAS, FT-IR imaging), present techniques where first field experiments were conducted (LED-LIDAR, tomographic mapping), and describe some techniques where only theoretical studies with application to volcanology exist (e.g. Fabry–Pérot imaging, gas correlation spectroscopy, bi-static LIDAR). Finally, we discuss current needs and future trends in imaging technology.

  4. Quantitative results of stellar evolution and pulsation theories.

    NASA Technical Reports Server (NTRS)

    Fricke, K.; Stobie, R. S.; Strittmatter, P. A.

    1971-01-01

    The discrepancy between the masses of Cepheid variables deduced from evolution theory and pulsation theory is examined. The effect of input physics on evolutionary tracks is first discussed; in particular, changes in the opacity are considered. The sensitivity of pulsation masses to opacity changes and to the ascribed values of luminosity and effective temperature are then analyzed. The Cepheid mass discrepancy is discussed in the light of the results already obtained. Other astronomical evidence, including the mass-luminosity relation for main sequence stars, the solar neutrino flux, and cluster ages are also considered in an attempt to determine the most likely source of error in the event that substantial mass loss has not occurred.

  5. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  6. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula: N = Bt x 100 / At where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples. PMID:24190861
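
    The reported formula reduces to a simple ratio; a sketch, assuming polyline areas exported from AutoCAD in consistent units:

        # Sketch: percentage of tissue surface affected by bleeding, N = Bt * 100 / At.
        def bleeding_percentage(sample_areas, bleeding_areas):
            """sample_areas: total area A per sample; bleeding_areas: list of B areas per sample."""
            a_total = sum(sample_areas)
            b_total = sum(sum(areas) for areas in bleeding_areas)
            return b_total * 100.0 / a_total

        # Two samples, areas in the same (arbitrary) AutoCAD units:
        print(bleeding_percentage([1200.0, 950.0], [[30.0, 12.5], [18.0]]))   # ~2.8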

  7. A novel benzene quantitative analysis method using miniaturized metal ionization gas sensor and non-linear bistable dynamic system.

    PubMed

    Tang, Xuxiang; Liu, Fuqi

    2015-09-01

    In this paper, a novel method for the quantitative analysis of benzene utilizing a miniaturized metal ionization gas sensor and a non-linear bistable dynamic system was investigated. An Al-plate anodic gas-ionization sensor was installed for the measurement of electrical current-voltage data. The measurement data were analyzed with the non-linear bistable dynamic system. The results demonstrated that this method achieves quantitative determination of benzene concentration. The method is promising for benzene leak detection in laboratory safety management. PMID:26218927

  8. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A summary of a new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is described. Results of the analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship between concentration and fluorescence units is provided, which may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).

  9. How Many Proteins are Missed in Quantitative Proteomics Based on MS/MS Sequencing Methods?

    PubMed Central

    Mulvey, Claire; Thur, Bettina; Crawford, Mark; Godovac-Zimmermann, Jasminka

    2014-01-01

    Current bottom-up quantitative proteomics methods based on MS/MS sequencing of peptides are shown to be strongly dependent on sample preparation. Using cytosolic proteins from MCF-7 breast cancer cells, it is shown that protein pre-fractionation based on pI and MW is more effective than pre-fractionation using only MW in increasing the number of observed proteins (947 vs. 704 proteins) and the number of spectral counts per protein. Combination of MS data from the different pre-fractionation methods results in further improvements (1238 proteins). We discuss that at present the main limitation on quantitation by MS/MS sequencing is not MS sensitivity and protein abundance, but rather extensive peptide overlap and limited MS/MS sequencing throughput, and that this favors internally calibrated methods such as SILAC, ICAT or ITRAQ over spectral counting methods in attempts to drastically improve proteome coverage of biological samples. PMID:25729266

  10. Automatic segmentation method of striatum regions in quantitative susceptibility mapping images

    NASA Astrophysics Data System (ADS)

    Murakawa, Saki; Uchiyama, Yoshikazu; Hirai, Toshinori

    2015-03-01

    Abnormal accumulation of brain iron has been detected in various neurodegenerative diseases. Quantitative susceptibility mapping (QSM) is a novel contrast mechanism in magnetic resonance (MR) imaging and enables quantitative analysis of local tissue susceptibility. Therefore, automatic segmentation tools for brain regions on QSM images would be helpful for radiologists' quantitative analysis of various neurodegenerative diseases. The purpose of this study was to develop an automatic segmentation and classification method for striatum regions on QSM images. Our image database consisted of 22 QSM images obtained from healthy volunteers. These images were acquired on a 3.0 T MR scanner. The voxel size was 0.9×0.9×2 mm and the matrix size of each slice image was 256×256 pixels. In our computerized method, a template matching technique was first used to detect the slice image containing the striatum regions. An image registration technique was subsequently employed for the classification of striatum regions, taking anatomical knowledge into account. After the image registration, the voxels in the target image that corresponded to striatum regions in the reference image were classified into three striatum regions, i.e., head of the caudate nucleus, putamen, and globus pallidus. The experimental results indicated that 100% (21/21) of the slice images containing striatum regions were detected accurately. The subjective evaluation of the classification results indicated that 20 (95.2%) of 21 showed good or adequate quality. Our computerized method would be useful for the quantitative analysis of Parkinson's disease in QSM images.
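    A Python sketch of the slice-detection step only, under the assumption that "template matching" is implemented as a whole-slice normalized correlation against a reference striatum slice; the registration and label-transfer steps are omitted, and the array names are hypothetical.

        import numpy as np

        def normalized_correlation(a, b):
            """Pearson correlation between two equally sized 2-D images."""
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            return float((a * b).mean())

        def detect_striatum_slice(volume, template):
            """volume: (n_slices, 256, 256) QSM array; template: (256, 256) reference slice."""
            scores = [normalized_correlation(s, template) for s in volume]
            return int(np.argmax(scores)), scores

        # Hypothetical use: best_idx, _ = detect_striatum_slice(qsm_volume, template_slice)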

  11. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective way to evaluate these systems is to compare their performance in the end task required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of a gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  12. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    SciTech Connect

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight into how prone a country (or set of countries) may be to inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict, both retrospectively and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not on the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempts at modeling conflict as a result of system-level interactions. This study presents the modeling efforts, built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  13. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but scientists have not truly focused their work on the quantitative performance of this technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the UHPSFC quantitative performance was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization shows a guarantee of quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even if UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, the UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). PMID:24513349

  14. Quantitative Methods for Comparing Different Polyline Stream Network Models

    SciTech Connect

    Danny L. Anderson; Daniel P. Ames; Ping Yang

    2014-04-01

    Two techniques for exploring the relative horizontal accuracy of complex linear spatial features are described, and sample source code (pseudo code) is presented for this purpose. The first technique, relative sinuosity, is presented as a measure of the complexity or detail of a polyline network in comparison to a reference network. We term the second technique longitudinal root mean squared error (LRMSE) and present it as a means for quantitatively assessing the horizontal variance between two polyline data sets representing digitized (reference) and derived stream and river networks. Both relative sinuosity and LRMSE are shown to be suitable measures of horizontal stream network accuracy for assessing quality and variation in linear features. Both techniques have been used in two recent investigations involving the extraction of hydrographic features from LiDAR elevation data. One confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes yielded better stream network delineations, based on sinuosity and LRMSE, when using LiDAR-derived DEMs. The other demonstrated a new method of delineating stream channels directly from LiDAR point clouds, without the intermediate step of deriving a DEM, showing that direct delineation from LiDAR point clouds yielded a much better match, as indicated by the LRMSE.
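    A hedged Python sketch of the two measures: the abstract does not give exact formulations, so this assumes (a) sinuosity = polyline length divided by endpoint straight-line distance, with relative sinuosity the derived-to-reference ratio, and (b) LRMSE = the root mean square of distances from the derived polyline's vertices to the nearest point on the reference polyline. Coordinates are hypothetical.

        import math

        def length(line):
            return sum(math.dist(p, q) for p, q in zip(line, line[1:]))

        def sinuosity(line):
            return length(line) / math.dist(line[0], line[-1])

        def relative_sinuosity(derived, reference):
            return sinuosity(derived) / sinuosity(reference)

        def point_to_segment(p, a, b):
            """Distance from point p to the segment a-b."""
            ax, ay = a
            bx, by = b
            px, py = p
            dx, dy = bx - ax, by - ay
            if dx == 0 and dy == 0:
                t = 0.0
            else:
                t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
            return math.dist(p, (ax + t * dx, ay + t * dy))

        def lrmse(derived, reference):
            dists = [min(point_to_segment(p, a, b) for a, b in zip(reference, reference[1:]))
                     for p in derived]
            return math.sqrt(sum(d * d for d in dists) / len(dists))

        # Hypothetical digitized (reference) vs. DEM-derived polylines:
        ref = [(0, 0), (1, 0.5), (2, 0), (3, 0.5)]
        der = [(0, 0.1), (1, 0.4), (2, 0.2), (3, 0.4)]
        print(relative_sinuosity(der, ref), lrmse(der, ref))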

  15. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    NASA Astrophysics Data System (ADS)

    Hayashi, K.

    2014-12-01

    The near-surface is the region of day-to-day human activity on the Earth and is exposed to natural phenomena which sometimes cause disasters. This presentation covers a broad spectrum of geotechnical and geohazard approaches to mitigating disasters and conserving the natural environment using geophysical methods, and emphasizes the contribution of geophysics to such issues. The presentation focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than on the theoretical details of any particular technique. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, prevention of floods triggered by torrential rain, environmental conservation, and study of the effects of global warming. Among the geophysical techniques, the active and passive surface-wave, refraction and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access and databases, are also discussed. The presentation discusses the application of geophysical methods to engineering investigations from a non-uniqueness point of view and introduces the concepts of integrated and quantitative interpretation. Most geophysical analyses are essentially non-unique, and it is very difficult to obtain unique and reliable engineering solutions from only one geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods, i.e. an integrated approach to geophysics. The result of a geophysical method is generally vague: "here is a high-velocity layer, it may be bedrock", or "this low-resistivity section may contain clayey soils". Such vague, qualitative and subjective interpretation is of little use in general engineering design work; engineers need more quantitative information. In order to apply geophysical methods to engineering design work, quantitative interpretation is therefore very important. The presentation introduces several case studies from different countries around the world (Fig. 2) from the integrated and quantitative points of view.

  16. [Application of calibration curve method and partial least squares regression analysis to quantitative analysis of nephrite samples using XRF].

    PubMed

    Liu, Song; Su, Bo-min; Li, Qing-hui; Gan, Fu-xi

    2015-01-01

    The authors tried to find a method for quantitative analysis using pXRF without solid bulk stone/jade reference samples. Twenty-four nephrite samples were selected; 17 were calibration samples and the other 7 were test samples. All the nephrite samples were analyzed quantitatively by proton-induced X-ray emission spectroscopy (PIXE). Based on the PIXE results of the calibration samples, calibration curves were created for the components/elements of interest and used to analyze the test samples quantitatively; then, qualitative spectra of all nephrite samples were obtained by pXRF. Using the PIXE results and the qualitative spectra of the calibration samples, the partial least squares (PLS) method was applied for quantitative analysis of the test samples. Finally, the results for the test samples obtained by the calibration curve method, the PLS method and PIXE were compared with each other, and the accuracy of the calibration curve method and the PLS method was estimated. The results indicate that the PLS method is a viable alternative for quantitative analysis of stone/jade samples. PMID:25993858
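    An illustrative Python sketch of the PLS step under stated assumptions: rows of X are pXRF spectra of the 17 calibration samples and y holds the corresponding PIXE concentrations for one element. The array names, shapes and the number of latent variables are hypothetical, not taken from the paper.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X_cal = rng.random((17, 1024))          # 17 calibration spectra, 1024 channels
        y_cal = rng.random(17) * 5.0            # PIXE concentrations (wt%) for one element
        X_test = rng.random((7, 1024))          # 7 test-sample spectra

        pls = PLSRegression(n_components=5)     # number of latent variables is a tunable choice
        pls.fit(X_cal, y_cal)
        y_pred = pls.predict(X_test).ravel()    # predicted concentrations for the test samples
        print(y_pred)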

  17. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    SciTech Connect

    Kiefel, Denis; Stoessel, Rainer, E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas; however, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro X-ray computed tomography (µ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this µ-XCT system is its high resolution together with 3-dimensional analysis and visualization capabilities, which make it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  18. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

    By combining with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither of these two methods can provide elastography over the full skin depth in current systems. This paper presents a feasibility study of an optical coherence elastography method which combines both vibration and SAW in order to give the quantitative mechanical properties of skin tissue over the full depth range, including epidermis, dermis and subcutaneous fat. Experiments were carried out on layered tissue-mimicking phantoms and on in vivo human forearm and palm skin. A ring actuator generated vibration while a line actuator was used to excite SAWs. A PhS-OCT system was employed to provide ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that, by the combination of the vibration and SAW methods, the full-skin bulk mechanical properties can be quantitatively measured, and the elastography can be obtained with a sensing depth from ~0 mm to ~4 mm. This method is promising for application in clinics where the quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.

  19. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    NASA Astrophysics Data System (ADS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-01

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas; however, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro X-ray computed tomography (µ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this µ-XCT system is its high resolution together with 3-dimensional analysis and visualization capabilities, which make it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  20. A quantitative method for visual phantom image quality evaluation

    NASA Astrophysics Data System (ADS)

    Chakraborty, Dev P.; Liu, Xiong; O'Shea, Michael; Toto, Lawrence C.

    2000-04-01

    This work presents an image quality evaluation technique for uniform-background, target-object phantom images. The Degradation-Comparison-Threshold (DCT) method involves degrading the image quality of a target-containing region with a blocking process and comparing the resulting image to a similarly degraded target-free region. The threshold degradation needed for 92% correct detection of the target region is the image quality measure of the target. Images of the American College of Radiology (ACR) mammography accreditation program phantom were acquired under varying x-ray conditions on a digital mammography machine. Five observers performed ACR and DCT evaluations of the images. A figure-of-merit (FOM) of an evaluation method was defined which takes into account measurement noise and the change of the measure as a function of x-ray exposure to the phantom. The FOM of the DCT method was 4.1 times that of the ACR method for the specks, 2.7 times better for the fibers and 1.4 times better for the masses. For the specks, inter-reader correlations on the same image set increased significantly, from 87% for the ACR method to 97% for the DCT method. The viewing time per target for the DCT method was 3-5 minutes. The observed greater sensitivity of the DCT method could lead to more precise Quality Control (QC) testing of digital images, which should improve the sensitivity of the QC process to genuine image quality variations. Another benefit of the method is that it can measure the image quality of high-detectability target objects, which is impractical with existing methods.

  1. 11.220 Quantitative Reasoning and Statistical Method for Planning I, Spring 2006

    E-print Network

    Zegras, P. Christopher

    This course develops logical, empirically based arguments using statistical techniques and analytic methods. It covers elementary statistics, probability, and other types of quantitative reasoning useful for description, ...

  2. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using a scanning electron microscope equipped with an energy-dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but the discrimination power of this technique is limited due to low sensitivity to trace elements and difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and Particle Induced X-ray Emission (µ-PIXE), together with Elastic Backscattering Spectrometry (EBS), is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, the GSR particles were all of the Pb, Ba, Sb type. The precision of the method is assessed. The grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here, with a confidence >99%, irrespective of the firearm or population of particles selected. PMID:23775063

  3. Quantitative electromechanical impedance method for nondestructive testing based on a piezoelectric bimorph cantilever

    NASA Astrophysics Data System (ADS)

    Fu, Ji; Tan, Chi; Li, Faxin

    2015-06-01

    The electromechanical impedance (EMI) method, which holds great promise in structural health monitoring (SHM), is usually treated as a qualitative method. In this work, we proposed a quantitative EMI method based on a piezoelectric bimorph cantilever using the sample’s local contact stiffness (LCS) as the identification parameter for nondestructive testing (NDT). Firstly, the equivalent circuit of the contact vibration system was established and the analytical relationship between the cantilever’s contact resonance frequency and the LCS was obtained. As the LCS is sensitive to typical defects such as voids and delamination, the proposed EMI method can then be used for NDT. To verify the equivalent circuit model, two piezoelectric bimorph cantilevers were fabricated and their free resonance frequencies were measured and compared with theoretical predictions. It was found that the stiff cantilever’s EMI can be well predicted by the equivalent circuit model while the soft cantilever’s cannot. Then, both cantilevers were assembled into a homemade NDT system using a three-axis motorized stage for LCS scanning. Testing results on a specimen with a prefabricated defect showed that the defect could be clearly reproduced in the LCS image, indicating the validity of the quantitative EMI method for NDT. It was found that the single-frequency mode of the EMI method can also be used for NDT, which is faster but not quantitative. Finally, several issues relating to the practical application of the NDT method were discussed. The proposed EMI-based NDT method offers a simple and rapid solution for damage evaluation in engineering structures and may also shed some light on EMI-based SHM.

  4. Quantitative evaluation of peptide-extraction methods by HPLC-triple-quad MS-MS.

    PubMed

    Du, Yan; Wu, Dapeng; Wu, Qian; Guan, Yafeng

    2015-02-01

    In this study, the efficiency of five peptide-extraction methods—acetonitrile (ACN) precipitation, ultrafiltration, C18 solid-phase extraction (SPE), dispersed SPE with mesoporous carbon CMK-3, and mesoporous silica MCM-41—was quantitatively investigated. With 28 tryptic peptides as target analytes, these methods were evaluated on the basis of recovery and reproducibility by using high-performance liquid chromatography-triple-quad tandem mass spectrometry in selected-reaction-monitoring mode. Because of the distinct extraction mechanisms of the methods, their preferences for extracting peptides of different properties were revealed to be quite different, usually depending on the pI values or hydrophobicity of peptides. When target peptides were spiked in bovine serum albumin (BSA) solution, the extraction efficiency of all the methods except ACN precipitation changed significantly. The binding of BSA with target peptides and nonspecific adsorption on adsorbents were believed to be the ways through which BSA affected the extraction behavior. When spiked in plasma, the performance of all five methods deteriorated substantially, with the number of peptides having recoveries exceeding 70% being 15 for ACN precipitation, and none for the other methods. Finally, the methods were evaluated in terms of the number of identified peptides for extraction of endogenous plasma peptides. Only ultrafiltration and CMK-3 dispersed SPE performed differently from the quantitative results with target peptides, and the wider distribution of the properties of endogenous peptides was believed to be the main reason. PMID:25542575

  5. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    SciTech Connect

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.

  6. Iterative reconstruction for quantitative computed tomography analysis of emphysema: consistent results using different tube currents

    PubMed Central

    Yamashiro, Tsuneo; Miyara, Tetsuhiro; Honda, Osamu; Tomiyama, Noriyuki; Ohno, Yoshiharu; Noma, Satoshi; Murayama, Sadayuki

    2015-01-01

    Purpose To assess the advantages of iterative reconstruction for quantitative computed tomography (CT) analysis of pulmonary emphysema. Materials and methods Twenty-two patients with pulmonary emphysema underwent chest CT imaging using identical scanners with three different tube currents: 240, 120, and 60 mA. Scan data were converted to CT images using Adaptive Iterative Dose Reduction using Three Dimensional Processing (AIDR3D) and a conventional filtered-back projection mode. Thus, six scans with and without AIDR3D were generated per patient. All other scanning and reconstruction settings were fixed. The percent low attenuation area (LAA%; < -950 Hounsfield units) and the lung density 15th percentile were automatically measured using a commercial workstation. Comparisons of LAA% and 15th percentile results between scans with and without using AIDR3D were made by Wilcoxon signed-rank tests. Associations between body weight and measurement errors among these scans were evaluated by Spearman rank correlation analysis. Results Overall, scan series without AIDR3D had higher LAA% and lower 15th percentile values than those with AIDR3D at each tube current (P<0.0001). For scan series without AIDR3D, lower tube currents resulted in higher LAA% values and lower 15th percentiles. The extent of emphysema was significantly different between each pair among scans when not using AIDR3D (LAA%, P<0.0001; 15th percentile, P<0.01), but was not significantly different between each pair among scans when using AIDR3D. On scans without using AIDR3D, measurement errors between different tube current settings were significantly correlated with patients’ body weights (P<0.05), whereas these errors between scans when using AIDR3D were insignificantly or minimally correlated with body weight. Conclusion The extent of emphysema was more consistent across different tube currents when CT scans were converted to CT images using AIDR3D than using a conventional filtered-back projection method. PMID:25709426
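    A minimal Python sketch of the two emphysema indices described above, assuming lung_hu is a 1-D array of Hounsfield-unit values for voxels inside an already segmented lung mask (in the study this measurement was done by a commercial workstation; the segmentation step is not shown, and the synthetic values are hypothetical).

        import numpy as np

        def emphysema_indices(lung_hu, threshold=-950.0):
            """Return (LAA%, 15th-percentile lung density in HU)."""
            laa_percent = 100.0 * np.mean(lung_hu < threshold)
            perc15 = np.percentile(lung_hu, 15)
            return laa_percent, perc15

        # Hypothetical example with synthetic voxel values:
        rng = np.random.default_rng(1)
        lung_hu = rng.normal(-870, 60, size=100_000)
        print(emphysema_indices(lung_hu))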

  7. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  8. Analyzing the Students' Academic Integrity using Quantitative Methods

    ERIC Educational Resources Information Center

    Teodorescu, Daniel; Andrei, Tudorel; Tusa, Erika; Herteliu, Claudiu; Stancu, Stelian

    2007-01-01

    The transition period in Romania has generated a series of important changes, including the reforming of the Romanian tertiary education. This process has been accelerated after the signing of the Bologna treaty. Important changes were recorded in many of the quantitative aspects (such as number of student enrolled, pupil-student ratio etc) as…

  9. A Method for Quantitative Mapping of Thick Oil Spills Using Imaging Spectroscopy

    E-print Network

    Clark, Roger N., and the AVIRIS Team, 2010, A method for quantitative mapping of thick oil spills using imaging spectroscopy (report front matter; table of contents truncated). Figure 1: Image of oil emulsion from the Deepwater Horizon oil spill in the Gulf of Mexico.

  10. On the quantitative method for measurement and analysis of the fine structure of Fraunhofer line profiles

    NASA Astrophysics Data System (ADS)

    Kuli-Zade, D. M.

    The methods of measurement and analysis of the fine structure of weak and moderate Fraunhofer line profiles are considered. The digital spectral materials were obtained using rapidly scanning, high-dispersion and high-resolution double monochromators. The asymmetry coefficient method, the bisector method and a new quantitative method proposed by the author are discussed. The new physical quantities of differential, integral, residual and relative asymmetry are introduced for the first time. These quantitative values permit investigation of the dependence of asymmetry on microscopic (atomic) and macroscopic (photospheric) quantities. It is shown that the integral profile asymmetries grow appreciably with increasing line equivalent width. The average effective depths of formation of the Fraunhofer lines used are determined for the photosphere of the Sun. It is shown that, with increasing effective formation depth of the lines, the integral and residual asymmetries of the line profiles noticeably decrease, in good agreement with the results on the intensity dependence of asymmetry. The above-mentioned methods are critically compared and the advantages of the author's method are shown. A computer program to calculate the line-profile asymmetry parameters has been developed.

  11. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
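    A Python sketch of the prediction sub-task only (the Genetic Algorithm data-reduction step is omitted): estimate the concentration of a component in a mixture from its Raman spectrum with k-Nearest Neighbours regression. The array names, shapes and synthetic data are hypothetical.

        import numpy as np
        from sklearn.neighbors import KNeighborsRegressor

        rng = np.random.default_rng(0)
        train_spectra = rng.random((40, 500))   # previously seen mixtures (40 spectra, 500 wavenumbers)
        train_conc = rng.random(40) * 100.0     # known concentrations (%)
        query_spectrum = rng.random((1, 500))   # unknown mixture

        knn = KNeighborsRegressor(n_neighbors=3, weights="distance")
        knn.fit(train_spectra, train_conc)
        print(knn.predict(query_spectrum))      # estimated concentration (%)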

  12. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To display the framework progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets, and partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas discrepant contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The Gray area simulated study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  13. Quantitative assessment of gene expression network module-validation methods

    PubMed Central

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To display the framework progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets, and partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas discrepant contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The Gray area simulated study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  14. Quantitative assessment of contact and non-contact lateral force calibration methods for atomic force microscopy.

    PubMed

    Tran Khac, Bien Cuong; Chung, Koo-Hyun

    2016-02-01

    Atomic Force Microscopy (AFM) has been widely used for measuring friction force at the nano-scale. However, one of the key challenges faced by AFM researchers is to calibrate an AFM system to interpret a lateral force signal as a quantifiable force. In this study, five rectangular cantilevers were used to quantitatively compare three different lateral force calibration methods to demonstrate the legitimacy and to establish confidence in the quantitative integrity of the proposed methods. The Flat-Wedge method is based on a variation of the lateral output on a surface with flat and changing slopes, the Multi-Load Pivot method is based on taking pivot measurements at several locations along the cantilever length, and the Lateral AFM Thermal-Sader method is based on determining the optical lever sensitivity from the thermal noise spectrum of the first torsional mode with a known torsional spring constant from the Sader method. The results of the calibration using the Flat-Wedge and Multi-Load Pivot methods were found to be consistent within experimental uncertainties, and the experimental uncertainties of the two methods were found to be less than 15%. However, the lateral force sensitivity determined by the Lateral AFM Thermal-Sader method was found to be 8-29% smaller than those obtained from the other two methods. This discrepancy decreased to 3-19% when the torsional mode correction factor for an ideal cantilever was used, which suggests that the torsional mode correction should be taken into account to establish confidence in Lateral AFM Thermal-Sader method. PMID:26624514

  15. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions. PMID:19216674

  16. Quantitative Laser Diffraction Method for the Assessment of Protein Subvisible Particles

    PubMed Central

    Totoki, Shinichiro; Yamamoto, Gaku; Tsumoto, Kouhei; Uchiyama, Susumu; Fukui, Kiichi

    2015-01-01

    Laser diffraction (LD) has been recognized as a method for estimating particle size distribution. Here, a recently developed quantitative LD (qLD) system, which is an LD method with extensive deconvolution analysis, was employed for the quantitative assessment of protein particle sizes, aimed especially at the quantification of 0.2-10 µm diameter subvisible particles (SVPs). The qLD accurately estimated concentration distributions for silica beads with diameters ranging from 0.2 to 10 µm that have refractive indices similar to that of protein particles. The linearity of concentration for micrometer-diameter silica beads was confirmed in the presence of a fixed concentration of submicrometer-diameter beads. Similarly, submicrometer-diameter silica beads could be quantified in the presence of micrometer-diameter beads. Subsequently, stir- and heat-stressed intravenous immunoglobulins were evaluated using the qLD, in which the experimentally determined refractive index of the protein particles was used in the deconvolution analysis. The results showed that the concentration distributions of protein particles in the SVP size range differ for the two stresses. The number concentration of the protein particles estimated using the qLD agreed well with that obtained using flow microscopy. This work demonstrates that qLD can be used for quantitative estimation of protein aggregates in the SVP size range. PMID:25449441

  17. Full quantitative phase analysis of hydrated lime using the Rietveld method

    SciTech Connect

    Lassinantti Gualtieri, Magdalena

    2012-09-15

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2-15 wt.%.

  18. An experimental method for quantitatively evaluating the elemental processes of indoor radioactive aerosol behavior.

    PubMed

    Yamazawa, H; Yamada, S; Xu, Y; Hirao, S; Moriizumi, J

    2015-11-01

    An experimental method for quantitatively evaluating the elemental processes governing the indoor behaviour of naturally occurring radioactive aerosols was proposed. This method utilises the transient response of aerosol concentrations to an artificial change in the aerosol removal rate caused by turning an air purifier on and off. It was shown that the indoor-outdoor exchange rate and the indoor deposition rate could be estimated from continuous measurements of outdoor and indoor aerosol number concentrations using the method proposed in this study. Although the scatter of the estimated parameters is relatively large, both methods gave consistent results. It was also found that the size distribution of the radioactive aerosol particles, and hence the activity median aerodynamic diameter, remained largely unaffected by the operation of the air purifier, implying the predominance of the exchange and deposition processes over other processes causing changes in the size distribution, such as size growth by coagulation and the size dependence of deposition. PMID:25935006

  19. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  20. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. An autofluorescence spectral investigation of 39 tooth samples, classified by International Caries Detection and Assessment System levels, was performed at 405 nm excitation. The major differences between the different caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. A spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of the color components was statistically computed by considering the spectral characteristics (e.g. autofluorescence, optical filter, and spectral sensitivity) of our fluorescence color imaging system. Results showed that the spectral parameter and the image component ratio follow a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine numerical grades of caries using a fluorescence imaging system is thus proposed, and it can be applied to similar imaging systems.
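    A minimal Python sketch of the grading rule quoted above: classify a tooth-surface region by its image component ratio R/(G + B) using the thresholds reported in the abstract. The input RGB values are hypothetical.

        def caries_grade(r, g, b):
            """Return a caries grade from mean R, G, B values of a region."""
            ratio = r / (g + b)
            if ratio < 0.66:
                return "sound"
            elif ratio < 1.06:
                return "early decay"
            elif ratio < 1.62:
                return "established decay"
            return "severe decay"

        print(caries_grade(120, 90, 60))   # ratio 0.8  -> "early decay"
        print(caries_grade(200, 70, 40))   # ratio ~1.8 -> "severe decay"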

  1. Quantitative Decomposition of Dynamics of Mathematical Cell Models: Method and Application to Ventricular Myocyte Models

    PubMed Central

    Shimayoshi, Takao; Cha, Chae Young; Amano, Akira

    2015-01-01

    Mathematical cell models are effective tools for understanding cellular physiological functions precisely. For detailed analysis of model dynamics, in order to investigate how much each component affects cellular behaviour, mathematical approaches are essential. This article presents a numerical analysis technique, applicable to any complicated cell model formulated as a system of ordinary differential equations, to quantitatively evaluate the contributions of the respective model components to the model dynamics in the intact situation. The technique employs a novel mathematical index for decomposed dynamics with respect to each differential variable, along with a concept named the instantaneous equilibrium point, which represents the trend of a model variable at some instant. The article also illustrates applications of the method to comprehensive myocardial cell models, providing insights into the mechanisms of action potential generation and the calcium transient. The analysis results exhibit the quantitative contributions of individual channel gating mechanisms and ion exchanger activities to membrane repolarization, and of calcium fluxes and buffers to the rise and fall of the cytosolic calcium level. These analyses quantitatively explicate the principles of the model, which leads to a better understanding of cellular dynamics. PMID:26091413
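    A toy Python illustration only, not the paper's exact index: for a membrane equation dV/dt = -(I_Na + I_K + I_leak)/Cm, the instantaneous rate of change can be decomposed term by term to show how much each current contributes at a given instant. The current values and capacitance are hypothetical.

        CM = 1.0  # membrane capacitance (uF/cm^2)

        def decompose_dvdt(currents):
            """currents: dict name -> instantaneous current (uA/cm^2).
            Returns total dV/dt and each term's fractional share of it."""
            terms = {name: -i / CM for name, i in currents.items()}
            dvdt = sum(terms.values())
            shares = {name: t / dvdt for name, t in terms.items()} if dvdt != 0 else {}
            return dvdt, shares

        dvdt, shares = decompose_dvdt({"I_Na": -150.0, "I_K": 20.0, "I_leak": 5.0})
        print(dvdt, shares)   # depolarizing phase dominated by the Na+ current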

  2. Characterization of working iron Fischer-Tropsch catalysts using quantitative diffraction methods

    NASA Astrophysics Data System (ADS)

    Mansker, Linda Denise

    This study presents the results of the ex-situ characterization of working iron Fischer-Tropsch synthesis (F-TS) catalysts, reacted for hundreds of hours at elevated pressures, using a new quantitative X-ray diffraction analytical methodology. Compositions, iron phase structures, and phase particle morphologies were determined and correlated with the observed reaction kinetics. Conclusions were drawn about the character of each catalyst in its most and least active states. The identity of the active phase(s) in the Fe F-TS catalyst has been vigorously debated for more than 45 years. The highly reduced catalyst, used to convert coal-derived syngas to hydrocarbon products, is thought to form a mixture of oxides, metal, and carbides upon pretreatment and reaction. Commonly, Soxhlet extraction is used to effect catalyst-product slurry separation; however, the extraction process could be producing irreversible changes in the catalyst, contributing to the conflicting results in the literature. X-ray diffraction does not require analyte-matrix separation before analysis, and can detect trace phases down to 300 ppm/2 nm; thus, working catalyst slurries could be characterized as-sampled. Data were quantitatively interpreted employing first-principles methods, including the Rietveld polycrystalline structure method. Pretreated catalysts and pure phases were examined experimentally and modeled to explore specific behavior under X-rays. Then, the working catalyst slurries were quantitatively characterized. Empirical quantitation factors were calculated from experimental data or single-crystal parameters, then validated using the Rietveld method results. In the most active form, after pretreatment in H2 or in CO at ambient pressure, well-preserved working catalysts contained significant amounts of Fe7C3 with trace α-Fe, once reaction had commenced at elevated pressure. Amounts of Fe3O4 were constant and small, with carbide dpavg < 15 nm. Small amounts of Fe7C3 were found in unreacted catalyst pretreated in CO at elevated pressures. In the least active form, well-preserved working catalysts contained Fe5C2 amounts >65 wt%, regardless of pretreatment gas and pressure, with all dpavg 18 nm. The ε'-Fe2.2C carbide was found to probably consist of an {Fe5C2/FexO/ε-Fe3C} mixture. The Fe5C2 carbide exhibited wide variations in diffraction pattern which could be correlated with sample handling events, changes in process conditions, or dpavg.

  3. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    SciTech Connect

    Gray, Jeffrey F.; Puri, Ashok

    2007-06-15

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hänchen effect based upon the real-world performance of optical sources. A numerical model of a non-ideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hänchen shift equations to determine beam-shift displacement characteristics, which provides quantitative estimates of finite shifts near the critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end, a line-spread Green's function is defined which can be used to determine the effective transfer function of the near-critical-angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse-square-root nature of the Goos-Hänchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomenon to attain refractive index sensitivities of the order of 10^-6, comparable with recent results reported in the literature.

  4. A bead-based method for multiplexed identification and quantitation of DNA sequences using flow cytometry.

    PubMed

    Spiro, A; Lowe, M; Brown, D

    2000-10-01

    A new multiplexed, bead-based method which utilizes nucleic acid hybridizations on the surface of microscopic polystyrene spheres to identify specific sequences in heterogeneous mixtures of DNA sequences is described. The method consists of three elements: beads (5.6 µm diameter) with oligomer capture probes attached to the surface, three fluorophores for multiplexed detection, and flow cytometry instrumentation. Two fluorophores are impregnated within each bead in varying amounts to create different bead types, each associated with a unique probe. The third fluorophore is a reporter. Following capture of fluorescent cDNA sequences from environmental samples, the beads are analyzed by flow cytometric techniques which yield a signal intensity for each capture probe proportional to the amount of target sequences in the analyte. In this study, a direct hybrid capture assay was developed and evaluated with regard to sequence discrimination and quantitation of abundances. The target sequences (628 to 728 bp in length) were obtained from the 16S/23S intergenic spacer region of microorganisms collected from polluted groundwater at the nuclear waste site in Hanford, Wash. A fluorescence standard consisting of beads with a known number of fluorescent DNA molecules on the surface was developed, and the resolution, sensitivity, and lower detection limit for measuring abundances were determined. The results were compared with those of a DNA microarray using the same sequences. The bead method exhibited far superior sequence discrimination and possesses features which facilitate accurate quantitation. PMID:11010868
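    A hedged Python sketch of the quantitation idea: the median reporter fluorescence for one bead type (one capture probe) is converted to an approximate number of bound molecules using a linear standard curve built from the fluorescence standard described above. This assumes a linear calibration; all numbers and names are hypothetical, not taken from the paper.

        import numpy as np

        # Fluorescence standard: beads carrying known numbers of fluorescent molecules
        # and the corresponding measured reporter intensities.
        std_molecules = np.array([1e3, 1e4, 1e5, 1e6])
        std_intensity = np.array([12.0, 110.0, 1080.0, 10900.0])
        slope, intercept = np.polyfit(std_intensity, std_molecules, 1)

        def molecules_per_bead(median_intensity):
            """Estimated captured target molecules per bead from reporter fluorescence."""
            return slope * median_intensity + intercept

        # Example: median reporter intensity of 550 for one probe's bead population.
        print(molecules_per_bead(550.0))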

  5. Compatibility of Qualitative and Quantitative Methods: Studying Child Sexual Abuse in America.

    ERIC Educational Resources Information Center

    Phelan, Patricia

    1987-01-01

    Illustrates how the combined use of qualitative and quantitative methods were necessary in obtaining a clearer understanding of the process of incest in American society. Argues that the exclusive use of one methodology would have obscured important information. (FMW)

  6. Lymph Explorer: A new GUI using 3D high-frequency quantitative ultrasound methods to

    E-print Network

    Illinois at Urbana-Champaign, University of

    Quantitative ultrasound (QUS) permits characterization of tissue microstructure using user- and system-independent estimates; classification performance was assessed using ROC methods. For gastrointestinal nodes ... In our studies, freshly-excised lymph ...

  7. A quantitative assessment of nuclear weapons proliferation risk utilizing probabilistic methods

    E-print Network

    Sentell, Dennis Shannon, 1971-

    2002-01-01

    A comparative quantitative assessment is made of the nuclear weapons proliferation risk between various nuclear reactor/fuel cycle concepts using a probabilistic method. The work presented details quantified proliferation ...

  8. Methods Used by Pre-Service Nigeria Certificate in Education Teachers in Solving Quantitative Problems in Chemistry

    ERIC Educational Resources Information Center

    Danjuma, Ibrahim Mohammed

    2011-01-01

    This paper reports part of the results of research on chemical problem solving behavior of pre-service teachers in Plateau and Northeastern states of Nigeria. Specifically, it examines and describes the methods used by 204 pre-service teachers in solving quantitative problems from four topics in chemistry. Namely, gas laws; electrolysis;…

  9. Methods for quantitatively determining fault slip using fault separation

    NASA Astrophysics Data System (ADS)

    Xu, S.-S.; Velasquillo-Martínez, L. G.; Grajales-Nishimura, J. M.; Murillo-Muńetón, G.; Nieto-Samaniego, A. F.

    2007-10-01

    Fault slip and fault separation are generally not equal to each other; however, they are geometrically related. The fault slip (S) is a vector with a magnitude, a direction, and a sense of movement. In this paper, a series of approaches is introduced to estimate quantitatively the magnitude and direction of the fault slip using fault separations. For the calculation, the known factors are the pitch of the slip lineations, the pitch of a marker cutoff line, and the dip separation (Smd) or the strike separation (Smh) for one marker. The two main purposes of this work are: (1) to analyze the relationship between fault slip and fault separation when the slickenside lineations of a fault are known; and (2) to estimate the slip direction when the parameters Smd or Smh and the cutoff pitch for two non-parallel markers at a place (e.g., a point) are known. We tested the approaches using an example from a mainly strike-slip fault at East Quantoxhead, United Kingdom, and another example from the Jordan Field, Ector County, Texas. We also estimated the relative errors of the apparent heave of normal faults from the Sierra de San Miguelito, central Mexico.
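    An illustrative Python sketch of the underlying geometry, not necessarily the paper's exact formulation. Working in the fault plane, with rho the pitch of the slip lineation and phi the pitch of the marker cutoff line (both measured from the fault strike; this notation is mine), a planar marker gives S = Smd*cos(phi)/sin(rho - phi) from the dip separation and S = Smh*sin(phi)/sin(rho - phi) from the strike separation. The example values are hypothetical.

        import math

        def slip_from_dip_separation(smd, rho_deg, phi_deg):
            """Slip magnitude from dip separation and the two pitch angles (degrees)."""
            rho, phi = math.radians(rho_deg), math.radians(phi_deg)
            return smd * math.cos(phi) / math.sin(rho - phi)

        def slip_from_strike_separation(smh, rho_deg, phi_deg):
            """Slip magnitude from strike separation and the two pitch angles (degrees)."""
            rho, phi = math.radians(rho_deg), math.radians(phi_deg)
            return smh * math.sin(phi) / math.sin(rho - phi)

        # Hypothetical example: lineation pitch 70 deg, cutoff pitch 30 deg, dip separation 12 m.
        print(slip_from_dip_separation(12.0, 70, 30))   # slip magnitude in metres (~16.2)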

  10. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background:  Nonrandomized studies typically cannot account for confounding from unmeasured factors.  Method:  A method is presented that exploits the recently-identified phenomenon of  “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors.  Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure.  Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results:  Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met.  Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations:  Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions:  To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward.  The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
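
    A minimal numerical sketch of the division step described above, under stated assumptions: the effect estimates, the degree of amplification and the additive outcome adjustment below are hypothetical placeholders, not the author's worked examples.

        def residual_confounding_estimate(effect_base, effect_amplified,
                                          amplification_degree, outcome_adjustment=0.0):
            """Estimate residual confounding from two nested propensity score models.

            effect_base          -- treatment effect estimate from the base model
            effect_amplified     -- estimate after adding variable(s) that strongly
                                    predict treatment exposure
            amplification_degree -- estimated degree of confounding amplification
                                    produced by the added variable(s)
            outcome_adjustment   -- correction for any association between the added
                                    variable(s) and the outcome (assumed additive here)
            """
            change = effect_amplified - effect_base - outcome_adjustment
            return change / amplification_degree

        # Hypothetical numbers: the estimate moves from 1.30 to 1.42 and the added
        # variable is estimated to amplify residual confounding by a degree of 0.4
        print(residual_confounding_estimate(1.30, 1.42, 0.4))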

  11. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages

    PubMed Central

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse, with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction while preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (by up to 2-fold) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and "naked" unicellular organisms indicates that our method could obtain the intact DNA of organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance. PMID:26218575

  12. The need for quantitative methods in syntax and semantics research

    E-print Network

    Gibson, Edward A.

    The prevalent method in syntax and semantics research involves obtaining a judgement of the acceptability of a sentence/meaning pair, typically by just the author of the paper, sometimes with feedback from colleagues. This ...

  13. Semi-quantitative method to estimate levels of Campylobacter

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Introduction: Research projects utilizing live animals and/or systems often require reliable, accurate quantification of Campylobacter following treatments. Even with marker strains, conventional quantification methods are labor- and material-intensive, requiring either serial dilutions or MPN ...

  14. Reconstruction-classification method for quantitative photoacoustic tomography

    E-print Network

    Malone, Emma; Cox, Ben T; Arridge, Simon R

    2015-01-01

    We propose a combined reconstruction-classification method for simultaneously recovering absorption and scattering in turbid media from images of absorbed optical energy. This method exploits knowledge that optical parameters are determined by a limited number of classes to iteratively improve their estimate. Numerical experiments show that the proposed approach allows for accurate recovery of absorption and scattering in 2 and 3 dimensions, and delivers superior image quality with respect to traditional reconstruction-only approaches.

  15. A method and fortran program for quantitative sampling in paleontology

    USGS Publications Warehouse

    Tipper, J.C.

    1976-01-01

    The Unit Sampling Method is a binomial sampling method applicable to the study of fauna preserved in rocks too well cemented to be disaggregated. Preliminary estimates of the probability of detecting each group in a single sampling unit can be converted to estimates of the group's volumetric abundance by means of correction curves obtained by a computer simulation technique. This paper describes the technique and gives the FORTRAN program. © 1976.
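
    As a rough sketch of how such a correction curve can be built and inverted, the code below assumes a simple binomial detection model (grains sampled independently within a unit); this is an illustrative assumption, not the published simulation.

        import numpy as np

        def detection_probability(abundance, grains_per_unit):
            """Probability of seeing at least one grain of a group in one sampling
            unit, assuming grains are drawn independently from the rock."""
            return 1.0 - (1.0 - abundance) ** grains_per_unit

        def abundance_from_detection(p_detect, grains_per_unit):
            """Numerically invert the correction curve to estimate volumetric abundance."""
            grid = np.linspace(0.0, 1.0, 10001)
            curve = detection_probability(grid, grains_per_unit)
            return np.interp(p_detect, curve, grid)

        # Hypothetical case: a group detected in 18 of 30 units, ~50 grains visible per unit
        print(abundance_from_detection(18 / 30, 50))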

  16. A general method for the quantitative assessment of mineral pigments.

    PubMed

    Zurita Ares, M C; Fernández, J M

    2016-01-01

    A general method for the estimation of mineral pigment contents in different bases has been proposed, using a single set of calibration curves (one for each pigment) calculated for a white standard base, so that elaborating calibration patterns for each base used is not necessary. The method can be used in different bases, and its validity has even been proved in strongly tinted bases. The method consists of a novel procedure that combines diffuse reflectance spectroscopy, second derivatives and the Kubelka-Munk function. This technique has proved to be at least one order of magnitude more sensitive than X-ray diffraction for colored compounds, since it allowed the determination of the pigment amount in colored samples containing 0.5 wt% of pigment that was not detected by X-ray diffraction. The method can be used to estimate the concentration of mineral pigments in a wide variety of either natural or artificial materials, since it does not require the calculation of each pigment pattern in every base. This fact could have important industrial consequences, as the proposed method would be more convenient, faster and cheaper. PMID:26695268
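
    A small sketch of the Kubelka-Munk step described above; the calibration slope and the reflectance reading are hypothetical, and the second-derivative processing used in the published procedure is omitted for brevity.

        def kubelka_munk(reflectance):
            """Kubelka-Munk remission function F(R) = (1 - R)^2 / (2R), with R the
            diffuse reflectance expressed as a fraction (0 < R <= 1)."""
            return (1.0 - reflectance) ** 2 / (2.0 * reflectance)

        def pigment_content(reflectance, slope, intercept=0.0):
            """Estimate pigment content from a linear calibration of F(R) against
            known pigment loadings in a white standard base."""
            return (kubelka_munk(reflectance) - intercept) / slope

        # Hypothetical calibration F(R) = 4.2 * content (wt%); sample reflectance 0.62
        print(pigment_content(0.62, slope=4.2))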

  17. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  18. Spatial Access Priority Mapping (SAPM) with Fishers: A Quantitative GIS Method for Participatory Planning

    PubMed Central

    Yates, Katherine L.; Schoeman, David S.

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers’ spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers’ willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process in a transparent, quantitative way. PMID:23874623

  19. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    PubMed

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process in a transparent, quantitative way. PMID:23874623

  20. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, D.A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  1. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method.

    PubMed

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high-energy fast neutrons from a radioisotope and their slowing-down by the light hydrogen atoms, is a useful technique for non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume and is not affected by temperature, pressure, pH value and color. The most common choices for a radioisotope neutron source are (252)Cf and (241)Am-Be. In this study, (252)Cf with a neutron flux of 6.3x10(6) n/s has been used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra have been obtained by using an in-house built radioisotopic neutron spectrometric system equipped with a (3)He detector and a multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of approximately 0.947 g/cc and area of 40 cm x 25 cm) was used for the determination of hydrogen content by using multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered better performance in quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system can be promising for the real-time, online monitoring of powder processes to determine the content of any type of molecule containing hydrogen nuclei. PMID:19285419
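
    A minimal sketch of a partial least-squares calibration of the kind described, using synthetic pulse-height spectra and hydrogen contents with scikit-learn; none of the numbers relate to the actual instrument.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)

        # Synthetic training set: 20 spectra of 128 channels whose amplitude scales
        # with hydrogen content, plus noise (a stand-in for measured spectra).
        h_content = rng.uniform(5.0, 15.0, size=20)          # arbitrary units
        base_shape = np.exp(-np.linspace(0.0, 5.0, 128))     # arbitrary spectral shape
        X = np.outer(h_content, base_shape) + rng.normal(0.0, 0.05, (20, 128))

        pls = PLSRegression(n_components=3)
        pls.fit(X, h_content)

        # Predict the hydrogen content of a new synthetic spectrum
        x_new = 9.0 * base_shape + rng.normal(0.0, 0.05, 128)
        print(pls.predict(x_new.reshape(1, -1)))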

  2. Proteus mirabilis biofilm - Qualitative and quantitative colorimetric methods-based evaluation

    PubMed Central

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of a number of research projects worldwide. In this study, the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients has been investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the clinics of Dr. Antoni Jurasz University Hospital No. 1 in Bydgoszcz between 2011 and 2012 was used. Biofilm formation was evaluated using two independent quantitative and qualitative methods with TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet) application. The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative TTC method, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant. PMID:25763050

  3. A new method for fast quantitative mapping of absolute water content in vivo.

    PubMed

    Neeb, H; Zilles, K; Shah, N J

    2006-07-01

    The presence of brain edema, in its various forms, is an accompanying feature of many diseased states. Although the localized occurrence of brain edema may be demonstrated with MRI, the quantitative determination of absolute water content, an aspect that could play an important role in the objective evaluation of the dynamics of brain edema and the monitoring of the efficiency of treatment, is much more demanding. We present a method for the localized and quantitative measurement of absolute water content based on the combination of two fast multi-slice and multi-time point sequences QUTE and TAPIR for mapping the T(2)* and T(1) relaxation times, respectively. Incorporation of corrections for local B(1) field miscalibrations, temperature differences between the subject and a reference probe placed in the FOV, receiver profile inhomogeneities and T(1) saturation effects are included and allow the determination of water content with anatomical resolution and a precision >98%. The method was validated in phantom studies and was applied to the localized in vivo measurement of water content in a group of normal individuals and a patient with brain tumor. The results demonstrate that in vivo measurement of regional absolute water content is possible in clinically relevant measurement times with a statistical and systematic measurement error of <2%. PMID:16650780

  4. An Improved Flow Cytometry Method For Precise Quantitation Of Natural-Killer Cell Activity

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Nehlsen-Cannarella, Sandra; Sams, Clarence

    2006-01-01

    The ability to assess NK cell cytotoxicity using flow cytometry has been previously described and can serve as a powerful tool to evaluate effector immune function in the clinical setting. Previous methods used membrane-permeable dyes to identify target cells. The use of these dyes requires great care to achieve optimal staining and results in a broad spectral emission that can make multicolor cytometry difficult. Previous methods have also used negative staining (the elimination of target cells) to identify effector cells. This makes a precise quantitation of effector NK cells impossible due to the interfering presence of T and B lymphocytes, and renders the data highly subject to the variable levels of NK cells normally found in human peripheral blood. In this study an improved version of the standard flow cytometry assay for NK activity is described that has several advantages over previous methods. Fluorescent antibody staining (CD45-FITC) is used to positively identify target cells in place of membrane-permeable dyes. Fluorescent antibody staining of target cells is less labor-intensive and more easily reproducible than membrane dyes. NK cells (true effector lymphocytes) are also positively identified by fluorescent antibody staining (CD56-PE), allowing a simultaneous absolute count assessment of both NK cells and target cells. Dead cells are identified by membrane disruption using the DNA-intercalating dye PI. Using this method, an exact NK:target ratio may be determined for each assessment, including quantitation of NK-target complexes. Back-immunoscatter gating may be used to track live vs. dead target cells via scatter properties. If desired, NK activity may then be normalized to standardized ratios for clinical comparisons between patients, making the determination of PBMC counts or NK cell percentages prior to testing unnecessary. This method provides an exact cytometric determination of NK activity that is highly reproducible and may be suitable for routine use in the clinical setting.
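
    A small sketch of the per-sample arithmetic such an assay enables once absolute counts are available; the event counts below are hypothetical.

        def nk_assay_summary(nk_count, target_count, dead_target_count):
            """Return the exact effector:target ratio and the percentage of target
            cells scored as dead (PI-positive) in a flow cytometric acquisition."""
            effector_target_ratio = nk_count / target_count
            percent_dead_targets = 100.0 * dead_target_count / target_count
            return effector_target_ratio, percent_dead_targets

        print(nk_assay_summary(nk_count=12500, target_count=5000, dead_target_count=1850))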

  5. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040
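
    For reference, the reported comparison of proportions corresponds to an ordinary chi-square test on a 2 x 2 table; the sketch below uses made-up counts, not the study's underlying data.

        from scipy.stats import chi2_contingency

        # Hypothetical counts of methodological components present vs absent
        # in mixed methods articles and in quantitative articles.
        table = [[90, 320],    # mixed methods: present, absent
                 [470, 530]]   # quantitative:  present, absent

        chi2, p, dof, expected = chi2_contingency(table, correction=False)
        print(chi2, p, dof)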

  6. Magnetic ligation method for quantitative detection of microRNAs.

    PubMed

    Liong, Monty; Im, Hyungsoon; Majmudar, Maulik D; Aguirre, Aaron D; Sebas, Matthew; Lee, Hakho; Weissleder, Ralph

    2014-07-01

    A magnetic ligation method is utilized for the detection of microRNAs among a complex biological background without polymerase chain reaction or nucleotide modification. The sandwich probes assay can be adapted to analyze a panel of microRNAs associated with cardiovascular diseases in heart tissue samples. PMID:24532323

  7. Magnetic Ligation Method for Quantitative Detection of MicroRNAs

    PubMed Central

    Liong, Monty; Im, Hyungsoon; Majmudar, Maulik D.; Aguirre, Aaron D.; Sebas, Matthew; Lee, Hakho; Weissleder, Ralph

    2014-01-01

    A magnetic ligation method is utilized for the detection of microRNAs amongst a complex biological background without polymerase chain reaction or nucleotide modification. The sandwich probes assay can be adapted to analyze a panel of microRNAs associated with cardiovascular diseases in heart tissue samples. PMID:24532323

  8. Selection methods in forage breeding: a quantitative appraisal

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Forage breeding can be extraordinarily complex because of the number of species, perenniality, mode of reproduction, mating system, and the genetic correlation for some traits evaluated in spaced plants vs. performance under cultivation. Aiming to compare eight forage breeding methods for direct sel...

  9. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1981-02-25

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules or ions.

  10. Demand for public transport services: Integrating qualitative and quantitative methods

    E-print Network

    Bierlaire, Michel

    This research is in the context of a mode choice study in Switzerland. ... The integration of the latent variables requires qualitative methods to be able to come up with an initial set ... powerful transport mode choice model at hand. The research is carried out in the context of a collaborative ...

  11. New facility design and work method for the quantitative fit testing laboratory. Master's thesis

    SciTech Connect

    Ward, G.F.

    1989-05-01

    The United States Air Force School of Aerospace Medicine (USAFSAM) tests the quantitative fit of masks which are worn by military personnel during nuclear, biological, and chemical warfare. Subjects are placed in a Dynatech-Frontier Fit Testing Chamber, salt air is fed into the chamber, and samples of air are drawn from the mask and the chamber. The ratio of salt air outside the mask to salt air inside the mask is called the quantitative fit factor. A motion-time study was conducted to evaluate the efficiency of the layout and work method presently used in the laboratory. A link analysis was done to determine equipment priorities, and the link data and design guidelines were used to develop three proposed laboratory designs. The proposals were evaluated by projecting the time and motion efficiency, and the energy expended working in each design. Also evaluated were the lengths of the equipment links for each proposal, and each proposal's adherence to design guidelines. A mock-up was built of the best design proposal, and a second motion-time study was run. Results showed that with the new laboratory and work procedures, the USAFSAM analyst could test 116 more subjects per year than are currently tested. Finally, the results of a questionnaire given to the analyst indicated that user acceptance of the work area improved with the new design.

  12. Revisiting the Quantitative-Qualitative Debate: Implications for Mixed-Methods Research

    PubMed Central

    SALE, JOANNA E. M.; LOHFELD, LYNNE H.; BRAZIL, KEVIN

    2015-01-01

    Health care research includes many studies that combine quantitative and qualitative methods. In this paper, we revisit the quantitative-qualitative debate and review the arguments for and against using mixed-methods. In addition, we discuss the implications stemming from our view, that the paradigms upon which the methods are based have a different view of reality and therefore a different view of the phenomenon under study. Because the two paradigms do not study the same phenomena, quantitative and qualitative methods cannot be combined for cross-validation or triangulation purposes. However, they can be combined for complementary purposes. Future standards for mixed-methods research should clearly reflect this recommendation. PMID:26523073

  13. The use of electromagnetic induction methods for establishing quantitative permafrost models in West Greenland

    NASA Astrophysics Data System (ADS)

    Ingeman-Nielsen, Thomas; Brandt, Inooraq

    2010-05-01

    The sedimentary settings at West Greenlandic town and infrastructural development sites are dominated by fine-grained marine deposits of late to post-glacial origin. Prior to permafrost formation, these materials were leached by percolating precipitation, resulting in depletion of salts. Present-day permafrost in these deposits is therefore very ice-rich, with ice contents approaching 50-70% vol. in some areas. Such formations are of great concern in building and construction projects in Greenland, as they lose strength and bearing capacity upon thaw. It is therefore of both technical and economic interest to develop methods to precisely investigate and determine parameters such as ice content and depth to bedrock in these areas. In terms of geophysical methods for near-surface investigations, traditional methods such as Electrical Resistivity Tomography (ERT) and Refraction Seismics (RS) have generally been applied with success. The Georadar method usually fails due to very limited penetration depth in the fine-grained materials, and Electromagnetic Induction (EMI) methods are seldom applicable for quantitative interpretation due to the very high resistivities causing low induced currents and thus small secondary fields. Nevertheless, in some areas of Greenland the marine sequence was exposed relatively late, and as a result the sediments may not be completely leached of salts. In such cases, layers with pore water salinity approaching that of sea water may be present below an upper layer of very ice-rich permafrost. The saline pore water causes a freezing-point depression which results in technically unfrozen sediments at permafrost temperatures around -3 °C. Traditional ERT and VES measurements are severely affected by equivalency problems in these settings, practically prohibiting reasonable quantitative interpretation without constraining information. Such prior information may of course be obtained from boreholes, but equipment capable of drilling permafrozen sediments is generally not available in Greenland, and mobilization costs are therefore considerable, thus limiting the use of geotechnical borings to larger infrastructure and construction projects. To overcome these problems, we have tested the use of shallow Transient ElectroMagnetic (TEM) measurements to provide constraints in terms of depth to and resistivity of the conductive saline layer. We have tested such a setup at two field sites in the Ilulissat area (mid-west Greenland), one with available borehole information (site A), the second without (site C). VES and TEM soundings were collected at each site and the respective data sets subsequently inverted using a mutually constrained inversion scheme. At site A, the TEM measurements (20 x 20 m square loop, in-loop configuration) show substantial and repeatable negative amplitude segments, and therefore it has not presently been possible to provide a quantitative interpretation for this location. Negative segments are typically a sign of Induced Polarization or cultural effects. Forward modeling based on inversion of the VES data constrained with borehole information has indicated that IP effects could indeed be the cause of the observed anomaly, although such effects are not normally expected in permafrost or saline deposits. Data from site C have shown that jointly inverting the TEM and VES measurements does provide well-determined estimates for all layer parameters except the thickness of the active layer and the resistivity of the bedrock.
The active layer thickness may be easily probed to provide prior information on this parameter, and the bedrock resistivity is of limited interest in technical applications. Although no confirming borehole information is available at this site, these results indicate that joint or mutually constrained inversion of TEM and VES data is feasible and that this setup may provide a fast and cost effective method for establishing quantitative interpretations of permafrost structure in partly saline conditions.

  14. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the actual basic data of the gas pipelines and the precision requirements of the risk assessment. PMID:21402442

  15. Application of bias correction methods to improve the accuracy of quantitative radar rainfall in Korea

    NASA Astrophysics Data System (ADS)

    Lee, J.-K.; Kim, J.-H.; Suk, M.-K.

    2015-11-01

    There are many potential sources of bias in the radar rainfall estimation process. This study classified the biases from the rainfall estimation process into the reflectivity measurement bias and the rainfall estimation bias of the Quantitative Precipitation Estimation (QPE) model, and applied bias correction methods to improve the accuracy of the Radar-AWS Rainrate (RAR) calculation system operated by the Korea Meteorological Administration (KMA). For the Z bias correction, which addresses the reflectivity biases that occur when measuring rainfall, this study utilized a bias correction algorithm whose concept is that the reflectivity of the target single-pol radars is corrected based on a reference dual-pol radar that has itself been corrected for hardware and software bias. This study then dealt with two post-processing methods, the Mean Field Bias Correction (MFBC) method and the Local Gauge Correction (LGC) method, to correct the rainfall estimation bias of the QPE model. The Z bias and rainfall estimation bias correction methods were applied to the RAR system. The accuracy of the RAR system was improved after correcting the Z bias. Among the rainfall types, the accuracy for the Changma front and local torrential cases was slightly improved even without the Z bias correction, whereas the accuracy for the typhoon cases in particular became worse than the existing results. As a result of the rainfall estimation bias correction, the Z bias_LGC was especially superior to the MFBC method because different rainfall biases were applied to each grid rainfall amount in the LGC method. Among the rainfall types, the results of the Z bias_LGC showed that the rainfall estimates for all types were more accurate than with the Z bias correction alone and, especially, the outcomes in the typhoon cases were vastly superior to the others.
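
    A minimal sketch of a mean field bias correction of the kind mentioned above, with hypothetical gauge and radar values; the operational RAR implementation is considerably more involved.

        import numpy as np

        def mean_field_bias_correction(radar_field, radar_at_gauges, gauge_values):
            """Scale a gridded radar rainfall field by the ratio of gauge totals to
            radar totals accumulated over the gauge locations (the mean field bias)."""
            bias = np.sum(gauge_values) / np.sum(radar_at_gauges)
            return bias * radar_field

        radar_field = np.array([[2.0, 3.5], [1.2, 0.8]])   # mm, gridded radar estimate
        radar_at_gauges = np.array([2.1, 3.4, 1.0])        # radar pixels at gauge sites
        gauge_values = np.array([2.6, 4.1, 1.4])           # co-located gauge readings
        print(mean_field_bias_correction(radar_field, radar_at_gauges, gauge_values))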

  16. Comparative Application of PLS and PCR Methods to Simultaneous Quantitative Estimation and Simultaneous Dissolution Test of Zidovudine - Lamivudine Tablets.

    PubMed

    Üstünda?, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli

    2015-01-01

    In the development strategies for new drug products and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indicator for the quantitative estimation of the efficiency and biopharmaceutical characteristics of drug substances. This compels scientists in the related fields to develop very powerful analytical methods to obtain more reliable, precise and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were developed for the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly encourage us to use them for the quality control, routine analysis and dissolution testing of marketed tablets containing the ZID and LAM drugs. PMID:26085428

  17. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    PubMed

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  18. Quantitative interferometric microscopic flow cytometer with expanded principal component analysis method

    NASA Astrophysics Data System (ADS)

    Wang, Shouyu; Jin, Ying; Yan, Keding; Xue, Liang; Liu, Fei; Li, Zhenhua

    2014-11-01

    Quantitative interferometric microscopy is used in biological and medical fields, and a wealth of applications have been proposed for detecting different kinds of biological samples. Here, we develop a phase-detecting cytometer based on quantitative interferometric microscopy with an expanded principal component analysis phase retrieval method to obtain phase distributions of red blood cells with a spatial resolution of ~1.5 μm. Since the expanded principal component analysis method is a time-domain phase retrieval algorithm, it avoids the disadvantages of traditional frequency-domain algorithms. Additionally, the phase retrieval method realizes high-speed phase imaging from multiple microscopic interferograms captured by a CCD camera while the biological cells are scanned across the field of view. We believe this method can be a powerful tool for quantitatively measuring the phase distributions of different biological samples in biological and medical fields.

  19. [Stakeholder participation in priority setting - a consideration of the normative status of quantitative and qualitative methods].

    PubMed

    Friedrich, Daniel R; Stumpf, Sabine; Alber, Kathrin

    2012-01-01

    Priority setting in medicine is generally regarded as an appropriate means for preparing just allocation of medical resources. By involving the general public or affected stakeholders in priority setting, advocates hope to legitimise this process and increase the acceptability of future decisions on resource allocation. Here, we differentiate between two ideal-typical methods of stakeholder involvement: 1) qualitative and 2) quantitative ones. We argue that the level of information of participants is important to the quality of the outcome of participatory events. Qualitative methods aim at fostering deliberative discussions among well-informed stakeholders. By contrast, quantitative methods usually do not have the capacity to ensure or, at least, control the level of information that participants use to guide their decisions. Hence, we conclude that in the context of priority setting qualitative and especially deliberative methods are preferable to quantitative approaches. PMID:22857728

  20. Immunochemical methods for quantitation of vitamin B6. Technical report

    SciTech Connect

    Brandon, D.L.; Corse, J.W.

    1981-09-30

    A procedure is described which proposes schemes for determining the total of all B6 vitamins in acid-hydrolyzed samples utilizing a radio-immunoassay (RIA) or an enzyme-immunoassay (EIA). Sample preparation is similar for both RIA and EIA. Two specific antibodies (antipyridoxine and antipyridoxamine) are employed to determine pyridoxine and pyridoxamine; a portion of the sample is reduced with sodium borohydride, and pyridoxal is determined by the difference in pyridoxine before and after reduction. The results indicate that two procedures have been developed which are selective for pyridoxamine (the fluorescent enzyme immunoassay and the spin immunoassay) and one assay which is equally sensitive to pyridoxine and pyridoxamine (the radio-immunoassay).

  1. New methods for quantitative and qualitative facial studies: an overview.

    PubMed

    Thomas, I T; Hintz, R J; Frias, J L

    1989-01-01

    The clinical study of birth defects has traditionally followed the Gestalt approach, with a trend, in recent years, toward more objective delineation. Data collection, however, has been largely restricted to measurements from X-rays and anthropometry. In other fields, new techniques are being applied that capitalize on the use of modern computer technology. One such technique is that of remote sensing, of which photogrammetry is a branch. Cartographers, surveyors and engineers, using specially designed cameras, have applied geometrical techniques to locate points on an object precisely. These techniques, in their long-range application, have become part of our industrial technology and have assumed great importance with the development of satellite-borne surveillance systems. The close-range application of similar techniques has the potential for extremely accurate clinical measurement. We are currently evaluating the application of remote sensing to facial measurement using three conventional 35 mm still cameras. The subject is photographed in front of a carefully measured grid, and digitization is then carried out on the 35-mm slides: specific landmarks on the cranioface are identified, along with points on the background grid and the four corners of the slide frame, and are registered as x-y coordinates by a digitizer. These coordinates are then converted into precise locations in object space. The technique is capable of producing measurements to within 1/100th of an inch. We suggest that remote sensing methods such as this may well be of great value in the study of congenital malformations. PMID:2677039

  2. A practical and sensitive method of quantitating lymphangiogenesis in vivo.

    PubMed

    Majumder, Mousumi; Xin, Xiping; Lala, Peeyush K

    2013-07-01

    To address the inadequacy of current assays, we developed a directed in vivo lymphangiogenesis assay (DIVLA) by modifying an established directed in vivo angiogenesis assay. Silicon tubes (angioreactors) were implanted in the dorsal flanks of nude mice. Tubes contained either growth factor-reduced basement membrane extract (BME) alone (negative control), BME containing vascular endothelial growth factor (VEGF)-D (positive control for lymphangiogenesis) or FGF-2/VEGF-A (positive control for angiogenesis), the high VEGF-D-expressing breast cancer cell line MDA-MB-468LN (468LN), or VEGF-D-silenced 468LN. Lymphangiogenesis was detected superficially with Evans Blue dye tracing and measured in the cellular contents of angioreactors by multiple approaches: lymphatic vessel endothelial hyaluronan receptor-1 (Lyve1) protein (immunofluorescence) and mRNA (qPCR) expression, and a visual scoring of lymphatic vs blood capillaries with dual Lyve1 (or Prox-1 or podoplanin)/Cd31 immunostaining in cryosections. Lymphangiogenesis was absent with BME, high with VEGF-D or VEGF-D-producing 468LN cells and low with VEGF-D-silenced 468LN. Angiogenesis was absent with BME, high with FGF-2/VEGF-A, moderate with 468LN or VEGF-D and low with VEGF-D-silenced 468LN. The method was reproduced in a syngeneic murine C3L5 tumor model in C3H/HeJ mice with dual Lyve1/Cd31 immunostaining. Thus, DIVLA presents a practical and sensitive assay of lymphangiogenesis, validated with multiple approaches and markers. It is highly suited to identifying pro- and anti-lymphangiogenic agents, as well as shared or distinct mechanisms regulating lymphangiogenesis vs angiogenesis, and is widely applicable to research in vascular/tumor biology. PMID:23711825

  3. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    ERIC Educational Resources Information Center

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…
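
    A small example of the kind of Monte Carlo exercise such a course might use, here simulating the sampling distribution of a sample mean from a skewed population; purely illustrative.

        import numpy as np

        rng = np.random.default_rng(42)

        # Repeatedly draw samples of size 30 from a skewed population and record the
        # sample mean, to show how the sampling distribution concentrates around the
        # population mean (2.0) with standard error roughly 2.0 / sqrt(30).
        means = [rng.exponential(scale=2.0, size=30).mean() for _ in range(5000)]

        print(np.mean(means), np.std(means))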

  4. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  5. Comparison of Overlap Methods for Quantitatively Synthesizing Single-Subject Data

    ERIC Educational Resources Information Center

    Wolery, Mark; Busick, Matthew; Reichow, Brian; Barton, Erin E.

    2010-01-01

    Four overlap methods for quantitatively synthesizing single-subject data were compared to visual analysts' judgments. The overlap methods were percentage of nonoverlapping data, pairwise data overlap squared, percentage of data exceeding the median, and percentage of data exceeding a median trend. Visual analysts made judgments about 160 A-B data…
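
    As an illustration, the first of these metrics, percentage of nonoverlapping data (PND), can be computed as below, assuming an intervention expected to increase the target behavior; the data points are hypothetical.

        def percentage_nonoverlapping_data(phase_a, phase_b):
            """Percentage of phase B points exceeding the highest phase A point
            (for interventions expected to increase the target behavior)."""
            ceiling = max(phase_a)
            exceeding = sum(1 for b in phase_b if b > ceiling)
            return 100.0 * exceeding / len(phase_b)

        baseline = [3, 5, 4, 6, 5]
        treatment = [7, 6, 9, 8, 10, 9]
        print(percentage_nonoverlapping_data(baseline, treatment))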

  6. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    ERIC Educational Resources Information Center

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a particular…

  7. Student Performance in a Quantitative Methods Course under Online and Face-to-Face Delivery

    ERIC Educational Resources Information Center

    Verhoeven, Penny; Wakeling, Victor

    2011-01-01

    In a study conducted at a large public university, the authors assessed, for an upper-division quantitative methods business core course, the impact of delivery method (online versus face-toface) on the success rate (percentage of enrolled students earning a grade of A, B, or C in the course). The success rate of the 161 online students was 55.3%,…

  8. Measuring access to medicines: a review of quantitative methods used in household surveys

    PubMed Central

    2010-01-01

    Background Medicine access is an important goal of medicine policy; however the evaluation of medicine access is a subject under conceptual and methodological development. The aim of this study was to describe quantitative methodologies to measure medicine access on household level, access expressed as paid or unpaid medicine acquisition. Methods Searches were carried out in electronic databases and health institutional sites; within references from retrieved papers and by contacting authors. Results Nine papers were located. The methodologies of the studies presented differences in the recall period, recruitment of subjects and medicine access characterization. Conclusions The standardization of medicine access indicators and the definition of appropriate recall periods are required to evaluate different medicines and access dimensions, improving studies comparison. Besides, specific keywords must be established to allow future literature reviews about this topic. PMID:20509960

  9. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    PubMed

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The employed method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability precision were determined. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has been proven to be efficient for extraction and determination of azoxystrobin residues in green beans and peas. PMID:25842334

  10. Comparison of Concentration Methods for Quantitative Detection of Sewage-Associated Viral Markers in Environmental Waters

    PubMed Central

    Harwood, V. J.; Gyawali, P.; Sidhu, J. P. S.; Toze, S.

    2015-01-01

    Pathogenic human viruses cause over half of gastroenteritis cases associated with recreational water use worldwide. They are relatively difficult to concentrate from environmental waters due to typically low concentrations and their small size. Although rapid enumeration of viruses by quantitative PCR (qPCR) has the potential to greatly improve water quality analysis and risk assessment, the upstream steps of capturing and recovering viruses from environmental water sources, along with removing PCR inhibitors from extracted nucleic acids, remain formidable barriers to routine use. Here, we compared the efficiency of virus recovery for three rapid methods of concentrating two microbial source tracking (MST) viral markers, human adenoviruses (HAdVs) and polyomaviruses (HPyVs), from one-liter tap water and river water samples on HA membranes (90 mm in diameter). Samples were spiked with raw sewage, and viral adsorption to membranes was promoted by acidification (method A) or addition of MgCl2 (methods B and C). Viral nucleic acid was extracted directly from membranes (method A), or viruses were eluted with NaOH and concentrated by centrifugal ultrafiltration (methods B and C). No inhibition of qPCR was observed for samples processed by method A, but inhibition occurred in river samples processed by B and C. Recovery efficiencies of HAdVs and HPyVs were ≥10-fold greater for method A (31 to 78%) than for methods B and C (2.4 to 12%). Further analysis of membranes from method B revealed that the majority of viruses were not eluted from the membrane, resulting in poor recovery. The modification of the originally published method A to include a larger diameter membrane and a nucleic acid extraction kit that could accommodate the membrane resulted in a rapid virus concentration method with good recovery and lack of inhibitory compounds. The frequently used strategy of viral adsorption with added cations (Mg2+) and elution with acid was inefficient and more prone to inhibition, and will result in underestimation of the prevalence and concentrations of HAdVs and HPyVs markers in environmental waters. PMID:25576614

  11. Intracranial aneurysm segmentation in 3D CT angiography: method and quantitative validation

    NASA Astrophysics Data System (ADS)

    Firouzian, Azadeh; Manniesing, R.; Flach, Z. H.; Risselada, R.; van Kooten, F.; Sturkenboom, M. C. J. M.; van der Lugt, A.; Niessen, W. J.

    2010-03-01

    Accurately quantifying aneurysm shape parameters is of clinical importance, as it is an important factor in choosing the right treatment modality (i.e. coiling or clipping), in predicting rupture risk and operative risk and for pre-surgical planning. The first step in aneurysm quantification is to segment it from other structures that are present in the image. As manual segmentation is a tedious procedure and prone to inter- and intra-observer variability, there is a need for an automated method which is accurate and reproducible. In this paper a novel semi-automated method for segmenting aneurysms in Computed Tomography Angiography (CTA) data based on Geodesic Active Contours is presented and quantitatively evaluated. Three different image features are used to steer the level set to the boundary of the aneurysm, namely intensity, gradient magnitude and variance in intensity. The method requires minimum user interaction, i.e. clicking a single seed point inside the aneurysm which is used to estimate the vessel intensity distribution and to initialize the level set. The results show that the developed method is reproducible, and performs in the range of interobserver variability in terms of accuracy.

  12. Quantitative analysis of eugenol in clove extract by a validated HPLC method.

    PubMed

    Yun, So-Mi; Lee, Myoung-Heon; Lee, Kwang-Jick; Ku, Hyun-Ok; Son, Seong-Wan; Joo, Yi-Seok

    2010-01-01

    Clove (Eugenia caryophyllata) is a well-known medicinal plant used for diarrhea, digestive disorders, or in antiseptics in Korea. Eugenol is the main active ingredient of clove and has been chosen as a marker compound for the chemical evaluation or QC of clove. This paper reports the development and validation of an HPLC-diode array detection (DAD) method for the determination of eugenol in clove. HPLC separation was accomplished on an XTerra RP18 column (250 x 4.6 mm id, 5 microm) with an isocratic mobile phase of 60% methanol and DAD at 280 nm. Calibration graphs were linear with very good correlation coefficients (r2 > 0.9999) from 12.5 to 1000 ng/mL. The LOD was 0.81 ng/mL and the LOQ was 2.47 ng/mL. The method showed good intraday precision (%RSD 0.08-0.27%) and interday precision (%RSD 0.32-1.19%). The method was applied to the analysis of eugenol from clove cultivated in various countries (Indonesia, Singapore, and China). Quantitative analysis of the 15 clove samples showed that the content of eugenol varied significantly, ranging from 163 to 1049 ppb. Based on the results of this study, the HPLC method for the determination of eugenol is accurate enough to evaluate the quality and safety of clove. PMID:21313806

  13. Simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and dead reckoning

    NASA Astrophysics Data System (ADS)

    Davey, Neil S.; Godil, Haris

    2013-05-01

    This article presents a comparative study between a well-known SLAM (Simultaneous Localization and Mapping) algorithm, called Gmapping, and a standard Dead-Reckoning algorithm; the study is based on experimental results of both approaches by using a commercial skid-based turning robot, P3DX. Five main base-case scenarios are conducted to evaluate and test the effectiveness of both algorithms. The results show that SLAM outperformed the Dead Reckoning in terms of map-making accuracy in all scenarios but one, since SLAM did not work well in a rapidly changing environment. Although the main conclusion about the excellence of SLAM is not surprising, the presented test method is valuable to professionals working in this area of mobile robots, as it is highly practical, and provides solid and valuable results. The novelty of this study lies in its simplicity. The simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and Dead Reckoning and some applications using autonomous robots are being patented by the authors in U.S. Patent Application Nos. 13/400,726 and 13/584,862.
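
    For context, a dead-reckoning pose update of the kind the baseline relies on can be sketched as simple planar odometry integration; this is not the implementation used in the study.

        import math

        def dead_reckoning_step(x, y, theta, distance, delta_theta):
            """Advance a planar robot pose using odometry increments: move `distance`
            along the current heading, then rotate by `delta_theta`."""
            x_new = x + distance * math.cos(theta)
            y_new = y + distance * math.sin(theta)
            theta_new = theta + delta_theta
            return x_new, y_new, theta_new

        # Hypothetical odometry log: (distance travelled in m, heading change in rad)
        pose = (0.0, 0.0, 0.0)
        for step in [(0.5, 0.0), (0.5, math.pi / 2), (0.3, 0.0)]:
            pose = dead_reckoning_step(*pose, *step)
        print(pose)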

  14. Rapid method for protein quantitation by Bradford assay after elimination of the interference of polysorbate 80.

    PubMed

    Cheng, Yongfeng; Wei, Haiming; Sun, Rui; Tian, Zhigang; Zheng, Xiaodong

    2016-02-01

    Bradford assay is one of the most common methods for measuring protein concentrations. However, some pharmaceutical excipients, such as detergents, interfere with the Bradford assay even at low concentrations. Protein precipitation can be used to overcome sample incompatibility with protein quantitation, but the protein recovery rate after acetone precipitation is only about 70%. In this study, we found that sucrose not only increased the rate of protein recovery after 1 h of acetone precipitation, but also did not interfere with the Bradford assay. We therefore developed a method for rapid protein quantitation in protein drugs even when they contain interfering substances. PMID:26545323

  15. Comparative assessment of fluorescent transgene methods for quantitative imaging in human cells.

    PubMed

    Mahen, Robert; Koch, Birgit; Wachsmuth, Malte; Politi, Antonio Z; Perez-Gonzalez, Alexis; Mergenthaler, Julia; Cai, Yin; Ellenberg, Jan

    2014-11-01

    Fluorescence tagging of proteins is a widely used tool to study protein function and dynamics in live cells. However, the extent to which different mammalian transgene methods faithfully report on the properties of endogenous proteins has not been studied comparatively. Here we use quantitative live-cell imaging and single-molecule spectroscopy to analyze how different transgene systems affect imaging of the functional properties of the mitotic kinase Aurora B. We show that the transgene method fundamentally influences level and variability of expression and can severely compromise the ability to report on endogenous binding and localization parameters, providing a guide for quantitative imaging studies in mammalian cells. PMID:25232003

  16. Precision of dehydroascorbic acid quantitation with the use of the subtraction method--validation of HPLC-DAD method for determination of total vitamin C in food.

    PubMed

    Mazurek, Artur; Jamroz, Jerzy

    2015-04-15

    In food analysis, a method for determination of vitamin C should enable measuring of the total content of ascorbic acid (AA) and dehydroascorbic acid (DHAA), because both chemical forms exhibit biological activity. The aim of the work was to confirm the applicability of the HPLC-DAD method for analysis of the total content of vitamin C (TC) and of ascorbic acid in various types of food by determination of validation parameters such as selectivity, precision, accuracy, linearity and limits of detection and quantitation. The results showed that the method applied for determination of TC and AA was selective, linear and precise. Precision of DHAA determination by the subtraction method was also evaluated. It was revealed that the results of DHAA determination obtained by the subtraction method were not precise, which follows directly from the assumption of this method and the principles of uncertainty propagation. The proposed chromatographic method is recommended for routine determination of total vitamin C in various foods. PMID:25466057
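
    The imprecision of the subtraction approach noted above follows from standard uncertainty propagation; a brief sketch of the textbook relations (not taken from the paper):

```latex
% DHAA is obtained by difference, so its uncertainty combines both measurements:
\[
  c_{\mathrm{DHAA}} = c_{\mathrm{TC}} - c_{\mathrm{AA}}, \qquad
  u(c_{\mathrm{DHAA}}) = \sqrt{u(c_{\mathrm{TC}})^{2} + u(c_{\mathrm{AA}})^{2}},
\]
\[
  \frac{u(c_{\mathrm{DHAA}})}{c_{\mathrm{DHAA}}}
  = \frac{\sqrt{u(c_{\mathrm{TC}})^{2} + u(c_{\mathrm{AA}})^{2}}}{c_{\mathrm{TC}} - c_{\mathrm{AA}}}.
\]
% When DHAA is a small fraction of total vitamin C, the denominator is small and the
% relative uncertainty of the difference is large even if TC and AA are measured precisely.
```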

  17. Quantitative evaluation of linear and nonlinear methods characterizing interdependencies between brain signals.

    PubMed

    Ansari-Asl, Karim; Senhadji, Lotfi; Bellanger, Jean-Jacques; Wendling, Fabrice

    2006-09-01

    Brain functional connectivity can be characterized by the temporal evolution of correlation between signals recorded from spatially-distributed regions. It is aimed at explaining how different brain areas interact within networks involved during normal (as in cognitive tasks) or pathological (as in epilepsy) situations. Numerous techniques were introduced for assessing this connectivity. Recently, some efforts were made to compare methods performances but mainly qualitatively and for a special application. In this paper, we go further and propose a comprehensive comparison of different classes of methods (linear and nonlinear regressions, phase synchronization, and generalized synchronization) based on various simulation models. For this purpose, quantitative criteria are used: in addition to mean square error under null hypothesis (independence between two signals) and mean variance computed over all values of coupling degree in each model, we provide a criterion for comparing performances. Results show that the performances of the compared methods are highly dependent on the hypothesis regarding the underlying model for the generation of the signals. Moreover, none of them outperforms the others in all cases and the performance hierarchy is model dependent. PMID:17025676

  18. Quantitative evaluation of linear and nonlinear methods characterizing interdependencies between brain signals

    NASA Astrophysics Data System (ADS)

    Ansari-Asl, Karim; Senhadji, Lotfi; Bellanger, Jean-Jacques; Wendling, Fabrice

    2006-09-01

    Brain functional connectivity can be characterized by the temporal evolution of correlation between signals recorded from spatially-distributed regions. It is aimed at explaining how different brain areas interact within networks involved during normal (as in cognitive tasks) or pathological (as in epilepsy) situations. Numerous techniques were introduced for assessing this connectivity. Recently, some efforts were made to compare methods performances but mainly qualitatively and for a special application. In this paper, we go further and propose a comprehensive comparison of different classes of methods (linear and nonlinear regressions, phase synchronization, and generalized synchronization) based on various simulation models. For this purpose, quantitative criteria are used: in addition to mean square error under null hypothesis (independence between two signals) and mean variance computed over all values of coupling degree in each model, we provide a criterion for comparing performances. Results show that the performances of the compared methods are highly dependent on the hypothesis regarding the underlying model for the generation of the signals. Moreover, none of them outperforms the others in all cases and the performance hierarchy is model dependent.

  19. Laser-induced Breakdown spectroscopy quantitative analysis method via adaptive analytical line selection and relevance vector machine regression model

    NASA Astrophysics Data System (ADS)

    Yang, Jianhong; Yi, Cancan; Xu, Jinwu; Ma, Xianghong

    2015-05-01

    A new LIBS quantitative analysis method based on analytical line adaptive selection and a Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines which will be used as input variables of the regression model are determined adaptively according to the samples for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given together with a confidence interval derived from the predictive probability distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples have been carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness compared with methods based on partial least squares regression, artificial neural network and standard support vector machine.
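
    A rough sketch of the regression stage is given below. Since scikit-learn does not ship a relevance vector machine, the closely related sparse Bayesian ARD regression is used as a stand-in; the feature matrix and concentrations are synthetic, not the paper's data.

```python
# Sparse Bayesian regression of Cr concentration from LIBS analytical-line intensities.
# ARDRegression is an RVM-like stand-in (assumption, not the authors' exact model).
import numpy as np
from sklearn.linear_model import ARDRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.lognormal(mean=2.0, sigma=0.5, size=(23, 8))            # intensities of 8 selected lines
y = 0.05 * X[:, 0] + 0.02 * X[:, 3] + rng.normal(0, 0.05, 23)   # synthetic Cr concentrations (wt%)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = ARDRegression()
model.fit(X_tr, y_tr)

# Predictions come with a standard deviation, giving a confidence interval
# analogous to the probabilistic output described for the RVM.
y_pred, y_std = model.predict(X_te, return_std=True)
for p, s, t in zip(y_pred, y_std, y_te):
    print(f"predicted {p:.3f} +/- {1.96 * s:.3f} wt%  (true {t:.3f})")
```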

  20. A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.

    PubMed

    Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B

    2015-12-01

    A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra high performance (UP)LC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with the quantitation method that involves protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries. PMID:26510530

  1. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  2. Overcoming Methods Anxiety: Qualitative First, Quantitative Next, Frequent Feedback along the Way

    ERIC Educational Resources Information Center

    Bernstein, Jeffrey L.; Allen, Brooke Thomas

    2013-01-01

    Political Science research methods courses face two problems. First is what to cover, as there are too many techniques to explore in any one course. Second is dealing with student anxiety around quantitative material. We explore a novel way to approach these issues. Our students began by writing a qualitative paper. They followed with a term…

  3. The Use of Quantitative Methods as an Aid to Decision Making in Educational Administration.

    ERIC Educational Resources Information Center

    Alkin, Marvin C.

    Three quantitative methods are outlined, with suggestions for application to particular problem areas of educational administration: (1) The Leontief input-output analysis, incorporating a "transaction table" for displaying relationships between economic outputs and inputs, mainly applicable to budget analysis and planning; (2) linear programing,…

  4. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  5. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  6. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  7. Qualitative and Quantitative Research Methods: Old Wine in New Bottles? On Understanding and Interpreting Educational Phenomena

    ERIC Educational Resources Information Center

    Smeyers, Paul

    2008-01-01

    Generally educational research is grounded in the empirical traditions of the social sciences (commonly called quantitative and qualitative methods) and is as such distinguished from other forms of scholarship such as theoretical, conceptual or methodological essays, critiques of research traditions and practices and those studies grounded in the…

  8. A new method for quantitative analysis of multiple sclerosis using MR images

    E-print Network

    A new method for quantitative analysis of multiple sclerosis (MS) using MR images is presented, based on an automatic self-adaptive image segmentation algorithm. Keywords: analysis, multiple sclerosis, brain atrophy. (Dongqing Chen et al.)

  9. Improved GC/MS method for quantitation of n-Alkanes in plant and fecal material

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A gas chromatography-mass spectrometry (GC/MS) method for the quantitation of n-alkanes (carbon backbones ranging from 21 to 36 carbon atoms) in forage and fecal samples has been developed. Automated solid-liquid extraction using elevated temperature and pressure minimized extraction time to 30 min...

  10. Mapcurves: a quantitative method

    E-print Network

    Hoffman, Forrest M.

    William W. Hargrove, Forrest M. Hoffman, Paul F. Hessburg. Mapcurves: a quantitative method. Science and Mathematics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831, USA. E-mail: forrest@fs.fed.us. J Geograph Syst (2006), DOI 10.1007/s10109-006-0025-x. Keywords: Ecoregion …

  11. The quantitative assessment of the pre- and postoperative craniosynostosis using the methods of image analysis.

    PubMed

    Fabijańska, Anna; Węgliński, Tomasz

    2015-12-01

    This paper considers the problem of CT-based quantitative assessment of craniosynostosis before and after surgery. First, a fast and efficient brain segmentation approach is proposed. The algorithm is robust to discontinuities of the skull, so it can be applied in both pre- and postoperative cases. Additionally, image processing and analysis algorithms are proposed for describing the disease based on CT scans. The proposed algorithms automate the determination of the standard linear indices used for assessment of craniosynostosis (i.e. the cephalic index CI and the head circumference HC) and allow for planar and volumetric analyses which have not been reported so far. Results of applying the introduced methods to sample craniosynostotic cases before and after surgery are presented and discussed. The results show that the proposed brain segmentation algorithm achieves high accuracy when applied both pre- and postoperatively, while the introduced planar and volumetric indices for describing the disease may be helpful in distinguishing between types of the disease. PMID:26143078
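
    The cephalic index mentioned above is a simple ratio; as a rough illustration (generic definition with made-up measurements, not the authors' implementation):

```python
# Cephalic index from maximum head width and length, e.g. measured on an axial CT slice.
def cephalic_index(max_width_mm: float, max_length_mm: float) -> float:
    """CI = (maximum head width / maximum head length) * 100."""
    return 100.0 * max_width_mm / max_length_mm

# Hypothetical pre- and post-operative measurements in millimetres.
print(round(cephalic_index(118.0, 160.0), 1))  # ~73.8, a long and narrow skull shape
print(round(cephalic_index(132.0, 158.0), 1))  # ~83.5, a shorter and wider skull shape
```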

  12. Result Demonstration: A Method That Works 

    E-print Network

    Boleman, Chris; Dromgoole, Darrell A.

    2007-05-24

    The result demonstration is one of the most effective ways to transfer research-based knowledge to agricultural producers or to any audience. This publication explains the factors affecting a learner's decision to adopt an innovation and the five...

  13. Development of a quantitative method to measure vision in children with chronic cortical visual impairment.

    PubMed Central

    Good, W V

    2001-01-01

    PURPOSE: Cortical visual impairment (CVI) is the most common cause of bilateral vision impairment in children in Western countries. Better quantitative tools for measuring vision are needed to assess these children, to allow measurement of their visual deficit, and to monitor their response to treatment and rehabilitation. The author performed a series of experiments to assess the use of the sweep visual evoked potential (VEP) as a quantitative tool for measuring vision in CVI. METHODS: The first experiment was a reliability measure (test/retest) of VEP grating acuity thresholds of 23 children with CVI. To validate the VEP procedure, VEP grating acuity was compared to a clinical measure of vision, the Huo scale, and to a psychophysical measure of vision, the Teller Acuity Card procedure. Finally, the sweep VEP was tested as a tool for defining optimal luminance conditions for grating acuity in 13 children with CVI, by measuring grating thresholds under 2 different luminance conditions: 50 and 100 candela per square meter (cd/m2). RESULTS: Retest thresholds were similar to original thresholds (r2 = 0.662; P = .003, 1-tailed t test). Grating VEP measures correlate significantly with the clinical index (r2 = 0.63; P = .00004). Teller acuity measurements are also similar to VEP measures in children (r2 = 0.64; P = .0005) but show lower acuities compared to the VEP for children with particularly low vision. Finally, 3 of 13 children tested under 2 background luminance conditions showed paradoxical improvement in grating threshold with dimmer luminance. CONCLUSIONS: The sweep VEP tool is a reliable and valid means for measuring grating acuity in children with CVI. The tool also shows promise as a means of determining the optimal visual environment for children with CVI. PMID:11797314

  14. Deep neural nets as a method for quantitative structure-activity relationships.

    PubMed

    Ma, Junshui; Sheridan, Robert P; Liaw, Andy; Dahl, George E; Svetnik, Vladimir

    2015-02-23

    Neural networks were widely used for quantitative structure-activity relationships (QSAR) in the 1990s. Because of various practical issues (e.g., slow on large problems, difficult to train, prone to overfitting, etc.), they were superseded by more robust methods like support vector machine (SVM) and random forest (RF), which arose in the early 2000s. The last 10 years has witnessed a revival of neural networks in the machine learning community thanks to new methods for preventing overfitting, more efficient training algorithms, and advancements in computer hardware. In particular, deep neural nets (DNNs), i.e. neural nets with more than one hidden layer, have found great successes in many applications, such as computer vision and natural language processing. Here we show that DNNs can routinely make better prospective predictions than RF on a set of large diverse QSAR data sets that are taken from Merck's drug discovery effort. The number of adjustable parameters needed for DNNs is fairly large, but our results show that it is not necessary to optimize them for individual data sets, and a single set of recommended parameters can achieve better performance than RF for most of the data sets we studied. The usefulness of the parameters is demonstrated on additional data sets not used in the calibration. Although training DNNs is still computationally intensive, using graphical processing units (GPUs) can make this issue manageable. PMID:25635324
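
    As a toy illustration of the model class discussed (a net with more than one hidden layer, compared against a random forest), a minimal scikit-learn sketch is shown below; the descriptors and activities are synthetic, and this is not Merck's pipeline or descriptor set.

```python
# Toy QSAR-style regression: descriptor matrix -> activity, 2-hidden-layer net vs. random forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 50))                                   # 50 made-up descriptors
y = X[:, 0] - 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=2000)    # synthetic activity

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

dnn = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0).fit(X_tr, y_tr)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("DNN R2:", round(r2_score(y_te, dnn.predict(X_te)), 3))
print("RF  R2:", round(r2_score(y_te, rf.predict(X_te)), 3))
```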

  15. Methylation methods for the quantitative analysis of conjugated linoleic acid (CLA) isomers in various lipid samples.

    PubMed

    Park, Sook J; Park, Cherl W; Kim, Seck J; Kim, Jung K; Kim, Young R; Park, Kyung A; Kim, Jeong O; Ha, Yeong L

    2002-02-27

    Precise methylation methods for various chemical forms of conjugated linoleic acid (CLA), which minimize the formation of t,t isomers and allylmethoxy derivatives (AMD) with the completion of methylation, were developed using a 50 mg lipid sample, 3 mL of 1.0 N H(2)SO(4)/methanol, and/or 3 mL of 20% tetramethylguanidine (TMG)/methanol solution(s). Free CLA (FCLA) was methylated with 1.0 N H(2)SO(4)/methanol (55 degrees C, 5 min). CLA esterified in safflower oil (CLA-SO) was methylated with 20% TMG/methanol (100 degrees C, 5 min), whereas CLA esterified in phospholipid (CLA-PL) was methylated with 20% TMG/methanol (100 degrees C, 10 min), followed by an additional reaction with 1.0 N H(2)SO(4)/methanol (55 degrees C, 5 min). Similarly, CLA esterified in egg yolk lipid (CLA-EYL) was methylated by base hydrolysis, followed by reaction with 1.0 N H(2)SO(4)/methanol (55 degrees C, 5 min). These results suggest that for the quantitative analysis of CLA in lipid samples by GC, proper methylation methods should be chosen on the basis of the chemical forms of CLA in samples. PMID:11853469

  16. An Evaluation of Quantitative Methods of Determining the Degree of Melting Experienced by a Chondrule

    NASA Technical Reports Server (NTRS)

    Nettles, J. W.; Lofgren, G. E.; Carlson, W. D.; McSween, H. Y., Jr.

    2004-01-01

    Many workers have considered the degree to which partial melting occurred in chondrules they have studied, and this has led to attempts to find reliable methods of determining the degree of melting. At least two quantitative methods have been used in the literature: a convolution index (CVI), which is a ratio of the perimeter of the chondrule as seen in thin section divided by the perimeter of a circle with the same area as the chondrule, and nominal grain size (NGS), which is the inverse square root of the number density of olivines and pyroxenes in a chondrule (again, as seen in thin section). We have evaluated both nominal grain size and convolution index as melting indicators. Nominal grain size was measured on the results of a set of dynamic crystallization experiments previously described, where aliquots of LEW97008(L3.4) were heated to peak temperatures of 1250, 1350, 1370, and 1450 C, representing varying degrees of partial melting of the starting material. Nominal grain size numbers should correlate with peak temperature (and therefore degree of partial melting) if it is a good melting indicator. The convolution index is not directly testable with these experiments because the experiments do not actually create chondrules (and therefore they have no outline on which to measure a CVI). Thus we had no means to directly test how well the CVI predicted different degrees of melting. Therefore, we discuss the use of the CVI measurement and support the discussion with X-ray Computed Tomography (CT) data.
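
    Both indices follow directly from the definitions quoted above; a short sketch with illustrative thin-section numbers (not the authors' measurements):

```python
import math

def convolution_index(perimeter: float, area: float) -> float:
    """CVI = chondrule perimeter / perimeter of a circle of equal area = P / (2*sqrt(pi*A))."""
    return perimeter / (2.0 * math.sqrt(math.pi * area))

def nominal_grain_size(n_grains: int, area: float) -> float:
    """NGS = inverse square root of the grain number density = sqrt(area / n_grains)."""
    return math.sqrt(area / n_grains)

# Hypothetical measurements from a thin section (mm and mm^2).
print(round(convolution_index(perimeter=4.2, area=1.1), 2))  # > 1 means a more convoluted outline
print(round(nominal_grain_size(n_grains=85, area=1.1), 3))   # mm; coarser texture -> larger NGS
```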

  17. Raman spectroscopy provides a rapid, non-invasive method for quantitation of starch in live, unicellular microalgae.

    PubMed

    Ji, Yuetong; He, Yuehui; Cui, Yanbin; Wang, Tingting; Wang, Yun; Li, Yuanguang; Huang, Wei E; Xu, Jian

    2014-12-01

    Conventional methods for quantitation of starch content in cells generally involve starch extraction steps and are usually labor intensive, thus a rapid and non-invasive method will be valuable. Using the starch-producing unicellular microalga Chlamydomonas reinhardtii as a model, we employed a customized Raman spectrometer to capture the Raman spectra of individual single cells under distinct culture conditions and along various growth stages. The results revealed a nearly linear correlation (R(2) = 0.9893) between the signal intensity at 478 cm(-1) and the starch content of the cells. We validated the specific correlation by showing that the starch-associated Raman peaks were eliminated in a mutant strain where the AGPase (ADP-glucose pyrophosphorylase) gene was disrupted and consequentially the biosynthesis of starch blocked. Furthermore, the method was validated in an industrial algal strain of Chlorella pyrenoidosa. This is the first demonstration of starch quantitation in individual live cells. Compared to existing cellular starch quantitation methods, this single-cell Raman spectra-based approach is rapid, label-free, non-invasive, culture-independent, low-cost, and potentially able to simultaneously track multiple metabolites in individual live cells, therefore should enable many new applications. PMID:24906189

  18. Hepatitis C Virus RNA Real-Time Quantitative RT-PCR Method Based on a New Primer Design Strategy.

    PubMed

    Chen, Lida; Li, Wenli; Zhang, Kuo; Zhang, Rui; Lu, Tian; Hao, Mingju; Jia, Tingting; Sun, Yu; Lin, Guigao; Wang, Lunan; Li, Jinming

    2016-01-01

    Viral nucleic acids are unstable when improperly collected, handled, and stored, resulting in decreased sensitivity of currently available commercial quantitative nucleic acid testing kits. Using known unstable hepatitis C virus RNA, we developed a quantitative RT-PCR method based on a new primer design strategy to reduce the impact of nucleic acid instability on nucleic acid testing. The performance of the method was evaluated for linearity, limit of detection, precision, specificity, and agreement with commercial hepatitis C virus assays. Its clinical application was compared to that of two commercial kits: Cobas AmpliPrep/Cobas TaqMan (CAP/CTM) and Kehua. The quantitative RT-PCR method delivered a good performance, with a linearity of R(2) = 0.99, a total limit of detection (genotypes 1 to 6) of 42.6 IU/mL (95% CI, 32.84 to 67.76 IU/mL), a CV of 1.06% to 3.34%, a specificity of 100%, and a high concordance with the CAP/CTM assay (R(2) = 0.97), with a mean ± SD value of -0.06 ± 1.96 log IU/mL (range, -0.38 to 0.25 log IU/mL). The method was superior to commercial assays in detecting unstable hepatitis C virus RNA (P < 0.05). This quantitative RT-PCR method can effectively eliminate the influence of RNA instability on nucleic acid testing. The principle of the primer design strategy may be applied to the detection of other RNA or DNA viruses. PMID:26612712

  19. Studies of crack dynamics in clay soil I. Experimental methods, results, and morphological quantification

    E-print Network

    Hoffmann, Heiko

    Geometric measures are presented which provide a quantitative description of crack patterns at the soil surface, including Minkowski functions. Additionally, we measured the distribution of angles within the crack network

  20. Laboratory and field validation of a Cry1Ab protein quantitation method for water.

    PubMed

    Strain, Katherine E; Whiting, Sara A; Lydy, Michael J

    2014-10-01

    The widespread planting of crops expressing insecticidal proteins derived from the soil bacterium Bacillus thuringiensis (Bt) has given rise to concerns regarding potential exposure to non-target species. These proteins are released from the plant throughout the growing season into soil and surface runoff and may enter adjacent waterways as runoff, erosion, aerial deposition of particulates, or plant debris. It is crucial to be able to accurately quantify Bt protein concentrations in the environment to aid in risk analyses and decision making. Enzyme-linked immunosorbent assay (ELISA) is commonly used for quantitation of Bt proteins in the environment; however, there are no published methods detailing and validating the extraction and quantitation of Bt proteins in water. The objective of the current study was to optimize the extraction of a Bt protein, Cry1Ab, from three water matrices and validate the ELISA method for specificity, precision, accuracy, stability, and sensitivity. Recovery of the Cry1Ab protein was matrix-dependent and ranged from 40 to 88% in the validated matrices, with an overall method detection limit of 2.1 ng/L. Precision among two plates and within a single plate was confirmed with a coefficient of variation less than 20%. The ELISA method was verified in field and laboratory samples, demonstrating the utility of the validated method. The implementation of a validated extraction and quantitation protocol adds consistency and reliability to field-collected data regarding transgenic products. PMID:25059137

  1. Multi-Window Classical Least Squares Multivariate Calibration Methods for Quantitative ICP-AES Analyses

    SciTech Connect

    Chambers, William B.; Haaland, David M.; Keenan, Michael R.; Melgaard, David K.

    1999-10-01

    The advent of inductively coupled plasma-atomic emission spectrometers (ICP-AES) equipped with charge-coupled-device (CCD) detector arrays allows the application of multivariate calibration methods to the quantitative analysis of spectral data. We have applied classical least squares (CLS) methods to the analysis of a variety of samples containing up to 12 elements plus an internal standard. The elements included in the calibration models were Ag, Al, As, Au, Cd, Cr, Cu, Fe, Ni, Pb, Pd, and Se. By performing the CLS analysis separately in each of 46 spectral windows and by pooling the CLS concentration results for each element in all windows in a statistically efficient manner, we have been able to significantly improve the accuracy and precision of the ICP-AES analyses relative to the univariate and single-window multivariate methods supplied with the spectrometer. This new multi-window CLS (MWCLS) approach simplifies the analyses by providing a single concentration determination for each element from all spectral windows. Thus, the analyst does not have to perform the tedious task of reviewing the results from each window in an attempt to decide the correct value among discrepant analyses in one or more windows for each element. Furthermore, it is not necessary to construct a spectral correction model for each window prior to calibration and analysis: When one or more interfering elements was present, the new MWCLS method was able to reduce prediction errors for a selected analyte by more than 2 orders of magnitude compared to the worst case single-window multivariate and univariate predictions. The MWCLS detection limits in the presence of multiple interferences are 15 ng/g (i.e., 15 ppb) or better for each element. In addition, errors with the new method are only slightly inflated when only a single target element is included in the calibration (i.e., knowledge of all other elements is excluded during calibration). The MWCLS method is found to be vastly superior to partial least squares (PLS) in this case of limited numbers of calibration samples.
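
    The core CLS step in each spectral window amounts to a linear least-squares fit of measured spectra to pure-component responses; a minimal numpy sketch under that reading (synthetic data, not the instrument software or the authors' pooling scheme):

```python
# Classical least squares in one spectral window: A = C @ K (spectra = concentrations x pure spectra).
import numpy as np

rng = np.random.default_rng(2)
n_cal, n_elem, n_chan = 20, 3, 120              # calibration samples, elements, CCD channels in window

C_cal = rng.uniform(0.1, 10.0, size=(n_cal, n_elem))            # known concentrations
K_true = rng.uniform(0.0, 1.0, size=(n_elem, n_chan))           # "pure" spectral responses
A_cal = C_cal @ K_true + rng.normal(0, 0.01, (n_cal, n_chan))   # measured calibration spectra

# Calibration: estimate the pure-component spectra K from the known concentrations.
K_hat, *_ = np.linalg.lstsq(C_cal, A_cal, rcond=None)

# Prediction: estimate concentrations of an unknown sample from its spectrum in this window.
a_unknown = np.array([2.0, 0.5, 7.0]) @ K_true
c_hat, *_ = np.linalg.lstsq(K_hat.T, a_unknown, rcond=None)
print("window estimate:", np.round(c_hat, 2))
# A multi-window scheme would repeat this per window and then pool the per-window
# estimates for each element (e.g. by inverse-variance weighting) into a single result.
```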

  2. Quantitative proteomics: assessing the spectrum of in-gel protein detection methods

    PubMed Central

    Gauci, Victoria J.; Wright, Elise P.

    2010-01-01

    Proteomics research relies heavily on visualization methods for detection of proteins separated by polyacrylamide gel electrophoresis. Commonly used staining approaches involve colorimetric dyes such as Coomassie Brilliant Blue, fluorescent dyes including Sypro Ruby, newly developed reactive fluorophores, as well as a plethora of others. The most desired characteristic in selecting one stain over another is sensitivity, but this is far from the only important parameter. This review evaluates protein detection methods in terms of their quantitative attributes, including limit of detection (i.e., sensitivity), linear dynamic range, inter-protein variability, capacity for spot detection after 2D gel electrophoresis, and compatibility with subsequent mass spectrometric analyses. Unfortunately, many of these quantitative criteria are not routinely or consistently addressed by most of the studies published to date. We would urge more rigorous routine characterization of stains and detection methodologies as a critical approach to systematically improving these critically important tools for quantitative proteomics. In addition, substantial improvements in detection technology, particularly over the last decade or so, emphasize the need to consider renewed characterization of existing stains; the quantitative stains we need, or at least the chemistries required for their future development, may well already exist. PMID:21686332

  3. Development of a quantitative diagnostic method of estrogen receptor expression levels by immunohistochemistry using organic fluorescent material-assembled nanoparticles

    SciTech Connect

    Gonda, Kohsuke; Miyashita, Minoru; Watanabe, Mika; Takahashi, Yayoi; Goda, Hideki; Okada, Hisatake; Nakano, Yasushi; Tada, Hiroshi; Amari, Masakazu; Ohuchi, Noriaki; Department of Surgical Oncology, Graduate School of Medicine, Tohoku University, Seiryo-machi, Aoba-ku, Sendai 980-8574

    2012-09-28

    Highlights: • Organic fluorescent material-assembled nanoparticles for IHC were prepared. • New nanoparticle fluorescent intensity was 10.2-fold greater than Qdot655. • Nanoparticle staining analyzed a wide range of ER expression levels in tissue. • Nanoparticle staining enhanced the quantitative sensitivity for ER diagnosis. -- Abstract: The detection of estrogen receptors (ERs) by immunohistochemistry (IHC) using 3,3'-diaminobenzidine (DAB) is slightly weak as a prognostic marker, but it is essential to the application of endocrine therapy, such as antiestrogen tamoxifen-based therapy. IHC using DAB is a poor quantitative method because horseradish peroxidase (HRP) activity depends on reaction time, temperature and substrate concentration. However, IHC using fluorescent material provides an effective method to quantitatively use IHC because the signal intensity is proportional to the intensity of the photon excitation energy. However, the high level of autofluorescence has impeded the development of quantitative IHC using fluorescence. We developed organic fluorescent material (tetramethylrhodamine)-assembled nanoparticles for IHC. Tissue autofluorescence is comparable to the fluorescence intensity of quantum dots, which are the most representative fluorescent nanoparticles. The fluorescent intensity of our novel nanoparticles was 10.2-fold greater than quantum dots, and they did not bind non-specifically to breast cancer tissues due to the polyethylene glycol chain that coated their surfaces. Therefore, the fluorescent intensity of our nanoparticles significantly exceeded autofluorescence, which produced a significantly higher signal-to-noise ratio on IHC-imaged cancer tissues than previous methods. Moreover, immunostaining data from our nanoparticle fluorescent IHC and IHC with DAB were compared in the same region of adjacent tissue sections to quantitatively examine the two methods. The results demonstrated that our nanoparticle staining analyzed a wide range of ER expression levels with higher accuracy and quantitative sensitivity than DAB staining. This enhancement in the diagnostic accuracy and sensitivity for ERs using our immunostaining method will improve the prediction of responses to therapies that target ERs and progesterone receptors that are induced by a downstream ER signal.

  4. Comparison of reconstruction methods and quantitative accuracy in Siemens Inveon PET scanner

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su; Kang, Joo Hyun; Moo Lim, Sang

    2015-04-01

    PET reconstruction is key to the quantification of PET data. To our knowledge, no comparative study of reconstruction methods has been performed to date. In this study, we compared reconstruction methods with various filters in terms of their spatial resolution, non-uniformities (NU), recovery coefficients (RCs), and spillover ratios (SORs). In addition, the linearity between measured and true radioactivity concentrations was also assessed. A Siemens Inveon PET scanner was used in this study. Spatial resolution was measured according to the NEMA standard by using a 1 mm3 sized 18F point source. Image quality was assessed in terms of NU, RC and SOR. To measure the effect of reconstruction algorithms and filters, data were reconstructed using FBP, the 3D reprojection algorithm (3DRP), ordered subset expectation maximization 2D (OSEM 2D), and maximum a posteriori (MAP) with various filters or smoothing factors (β). To assess the linearity of reconstructed radioactivity, an image quality phantom filled with 18F was used with FBP, OSEM and MAP (β = 1.5 and 5 × 10^-5). The highest achievable volumetric resolution was 2.31 mm3 and the highest RCs were obtained when OSEM 2D was used. SOR was 4.87% for air and 3.97% for water when OSEM 2D reconstruction was used. The measured radioactivity of the reconstructed image was proportional to the injected radioactivity below 16 MBq/ml when the FBP or OSEM 2D reconstruction methods were used. By contrast, when the MAP reconstruction method was used, the activity of the reconstructed image increased proportionally, regardless of the amount of injected radioactivity. When OSEM 2D or FBP were used, the measured radioactivity concentration was reduced by 53% compared with the true injected radioactivity for radioactivity above 16 MBq/ml. The OSEM 2D reconstruction method provides the highest achievable volumetric resolution and the highest RC among all the tested methods and yields a linear relation between the measured and true concentrations for radioactivity below 16 MBq/ml. Our data collectively show that the OSEM 2D reconstruction method provides quantitatively accurate reconstructed PET data.

  5. Apparatus and method for quantitatively evaluating total fissile and total fertile nuclide content in samples

    DOEpatents

    Caldwell, John T. (Los Alamos, NM); Kunz, Walter E. (Santa Fe, NM); Cates, Michael R. (Oak Ridge, TN); Franks, Larry A. (Santa Barbara, CA)

    1985-01-01

    Simultaneous photon and neutron interrogation of samples for the quantitative determination of total fissile nuclide and total fertile nuclide material present is made possible by the use of an electron accelerator. Prompt and delayed neutrons produced from resulting induced fissions are counted using a single detection system and allow the resolution of the contributions from each interrogating flux leading in turn to the quantitative determination sought. Detection limits for 239Pu are estimated to be about 3 mg using prompt fission neutrons and about 6 mg using delayed neutrons.

  6. A New Quantitative Method for the Non-Invasive Documentation of Morphological Damage in Paintings Using RTI Surface Normals

    PubMed Central

    Manfredi, Marcello; Bearman, Greg; Williamson, Greg; Kronkright, Dale; Doehne, Eric; Jacobs, Megan; Marengo, Emilio

    2014-01-01

    In this paper we propose a reliable surface imaging method for the non-invasive detection of morphological changes in paintings. Usually, the evaluation and quantification of changes and defects results mostly from an optical and subjective assessment, through the comparison of the previous and subsequent state of conservation and by means of condition reports. Using quantitative Reflectance Transformation Imaging (RTI) we obtain detailed information on the geometry and morphology of the painting surface with a fast, precise and non-invasive method. Accurate and quantitative measurements of deterioration were acquired after the painting experienced artificial damage. Morphological changes were documented using normal vector images while the intensity map succeeded in highlighting, quantifying and describing the physical changes. We estimate that the technique can detect a morphological damage slightly smaller than 0.3 mm, which would be difficult to detect with the eye, considering the painting size. This non-invasive tool could be very useful, for example, to examine paintings and artwork before they travel on loan or during a restoration. The method lends itself to automated analysis of large images and datasets. Quantitative RTI thus eases the transition of extending human vision into the realm of measuring change over time. PMID:25010699

  7. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo).

    PubMed

    Li, Yi; Kim, Jong-Joo

    2015-07-01

    The efficiency of genome-wide association analysis (GWAS) depends on the power of detection for quantitative trait loci (QTL) and the precision of QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle, Hanwoo: a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA) and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). Also, the genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at a false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model such that the top five windows, each of which comprised 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods showed high concordance, located at 64.1 to 64.9 Mb on Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb on BTA14 for CWT, 0.5 to 1.5 Mb on BTA6 for BFT and 26.3 to 33.4 Mb on BTA29 for BFT. Several candidate genes (i.e. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions. PMID:26104396

  8. A method for operative quantitative interpretation of multispectral images of biological tissues

    NASA Astrophysics Data System (ADS)

    Lisenko, S. A.; Kugeiko, M. M.

    2013-10-01

    A method for operative retrieval of spatial distributions of biophysical parameters of a biological tissue by using a multispectral image of it has been developed. The method is based on multiple regressions between linearly independent components of the diffuse reflection spectrum of the tissue and unknown parameters. Possibilities of the method are illustrated by an example of determining biophysical parameters of the skin (concentrations of melanin, hemoglobin and bilirubin, blood oxygenation, and scattering coefficient of the tissue). Examples of quantitative interpretation of the experimental data are presented.

  9. Radial period extraction method employing frequency measurement for quantitative collimation testing

    NASA Astrophysics Data System (ADS)

    Li, Sikun; Wang, Xiangzhao

    2016-01-01

    A radial period extraction method employing frequency measurement is proposed for quantitative collimation testing using spiral gratings. The radial period of the difference-frequency fringe is treated as a measure of the collimation condition. A frequency measurement technique based on the wavelet transform and a statistical approach is presented to extract the radial period directly from the amplitude-transmittance spiral fringe. A basic constraint for setting the parameters of the wavelet is introduced. Strict mathematical demonstration is given. The method outperforms methods employing phase measurement in terms of precision, stability and noise immunity.

  10. a Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis.

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

    Available from UMI in association with The British Library. Quantitative crystal structure analyses have been carried out on small molecule crystals using synchrotron radiation and the Laue method. A variety of single crystal structure determinations and associated refinements are used and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C10H9ON4Br.H2O) has been determined, first using monochromatic Mo Kα radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements showed an R-factor of 4.97 and 14.0% for the Mo Kα and Laue data respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide, (C10H11ClFNO), has been determined using the same crystal throughout for X-ray monochromatic analyses (Mo Kα and Cu Kα) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably. The R-factors (on F) were 6.23, 6.45 and 8.19% for the Mo Kα, Cu Kα and Laue data sets respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenyl-prop-2-en-1-one, (C25H20N2O2), has been determined using the synchrotron Laue method. The results compare very well with Mo Kα monochromatic data. The R-factors (on F) were 4.60 and 5.29% for the Mo Kα and Laue analysis respectively. The Laue method is assessed in locating the 20 hydrogen atoms in this structure. The structure analysis of the benzil compound ((C6H5CO)2) is carried out using the synchrotron Laue method, firstly at room temperature and secondly at low temperature (-114 °C). The structure shows an R-factor (on F) of 13.06% and 6.85% for each data set respectively. The synchrotron Laue method was used to collect data for ergocalciferol (vitamin D2). The same crystal was also used to record oscillation data with the synchrotron radiation monochromatic beam. A new molecular structure of (Dinitrato-(N,N'-dimethylethylenediamine)copper(II)) has been determined using Mo Kα radiation on a four-circle diffractometer. The refinement resulted in an R-factor (on F) of 4.06%.

  11. A Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

    Quantitative crystal structure analyses have been carried out on small molecule crystals using synchrotron radiation and the Laue method. A variety of single crystal structure determinations and associated refinements are used and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C10H9ON4Br·H2O) has been determined, first using monochromatic Mo Kα radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements showed an R-factor of 4.97 and 14.0% for the Mo Kα and Laue data respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide, (C10H11ClFNO), has been determined using the same crystal throughout for X-ray monochromatic analyses (Mo Kα and Cu Kα) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably. The R-factors (on F) were 6.23, 6.45 and 8.19% for the Mo Kα, Cu Kα and Laue data sets respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenyl-prop-2-en-1-one, (C25H20N2O2), has been determined using the synchrotron Laue method. The results compare very well with Mo Kα monochromatic data. The R-factors (on F) were 4.60 and 5.29% for the Mo Kα and Laue analyses respectively. The Laue method is assessed in locating the 20 hydrogen atoms in this structure. The structure analyses of the benzil compound ((C6H5CO)2) are carried out using the synchrotron Laue method, firstly at room temperature and secondly at low temperature. The structure shows an R-factor (on F) of 13.06% and 6.85% for each data set respectively. The synchrotron Laue method was used to collect data for ergocalciferol (vitamin D2). The same crystal was also used to record oscillation data with the synchrotron radiation monochromatic beam. A new molecular structure of (Dinitrato-(N,N'-dimethylethylenediamine)copper(II)) has been determined using Mo Kα radiation on a four circle diffractometer. The refinement resulted in an R-factor (on F) of 4.06%.

  12. Evaluation of a High Intensity Focused Ultrasound-Immobilized Trypsin Digestion and 18 O-Labeling Method for Quantitative Proteomics

    SciTech Connect

    Lopez-Ferrer, Daniel; Hixson, Kim K.; Smallwood, Heather S.; Squier, Thomas C.; Petritis, Konstantinos; Smith, Richard D.

    2009-08-01

    A new method that uses immobilized trypsin concomitant with ultrasonic irradiation results in ultra-rapid digestion and thorough 18O labeling for quantitative protein comparisons. The reproducible and highly efficient method provided effective digestions in <1 min and minimized the amount of enzyme required compared to traditional methods. This method was demonstrated for digestion of both simple and complex protein mixtures, including bovine serum albumin, a global proteome extract from bacteria Shewanella oneidensis, and mouse plasma, as well as for the labeling of complex protein mixtures, which validated the application of this method for differential proteomic measurements. This approach is simple, reproducible, cost effective, and rapid, and thus well-suited for automation.

  13. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    ERIC Educational Resources Information Center

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  14. Quantitative Analysis Method of Output Loss due to Restriction for Grid-connected PV Systems

    NASA Astrophysics Data System (ADS)

    Ueda, Yuzuru; Oozeki, Takashi; Kurokawa, Kosuke; Itou, Takamitsu; Kitamura, Kiyoyuki; Miyamoto, Yusuke; Yokota, Masaharu; Sugihara, Hiroyuki

    The voltage of a power distribution line increases due to reverse power flow from grid-connected PV systems. In the case of high-density grid connection, the voltage increase is larger than for a stand-alone grid-connected system. To prevent overvoltage of the distribution line, a PV system's output is restricted when the line voltage approaches the upper limit of the control range. Because of this interaction, the output loss is larger in the high-density case. This research developed a quantitative analysis method for PV system output and losses to clarify the behavior of grid-connected PV systems. All measured data are classified into loss factors using 1-minute averages of 1-second data instead of the typical 1-hour averages. The operating point on the I-V curve is estimated to quantify the loss due to output restriction, using module temperature, array output voltage, array output current and solar irradiance. As a result, the loss due to output restriction is successfully quantified and the behavior of output restriction is clarified.

  15. Localization and Quantitation of Chloroplast Enzymes and Light-Harvesting Components Using Immunocytochemical Methods 12

    PubMed Central

    Mustardy, Laszlo; Cunningham, Francis X.; Gantt, Elisabeth

    1990-01-01

    Seven chloroplast proteins were localized in Porphyridium cruentum (ATCC 50161) by immunolabeling with colloidal gold on electron microscope sections of log phase cells grown under red, green, and white light. Ribulose bisphosphate carboxylase labeling occurred almost exclusively in the pyrenoid. The major apoproteins of photosystem I (56-64 kD) occurred mostly over the stromal thylakoid region and also appeared over the thylakoids passing through the pyrenoid. Labeling for photosystem II core components (D2 and a 45 kD Chl-binding protein), for phycobilisomes (allophycocyanin, and a 91 kD Lcm linker) and for ATP synthase (β subunit) was predominantly present in the thylakoid region but not in the pyrenoid region of the chloroplast. Red light cells had increased labeling per thylakoid length for polypeptides of photosystem II and of phycobilisomes, while photosystem I density decreased, compared to white light cells. Conversely, green light cells had a decreased density of photosystem II and phycobilisome polypeptides, while photosystem I density changed little compared with white light cells. A comparison of the immunogold labeling results with data from spectroscopic methods and from rocket immunoelectrophoresis indicates that it can provide a quantitative measure of the relative amounts of protein components as well as their localization in specific organellar compartments. PMID:16667706

  16. Localization and quantitation of chloroplast enzymes and light-harvesting components using immunocytochemical methods

    SciTech Connect

    Mustardy, L.; Cunningham, F.X., Jr.; Gantt, E.

    1990-09-01

    Seven chloroplast proteins were localized in Porphyridium cruentum (ATCC 50161) by immunolabeling with colloidal gold on electron microscope sections of log phase cells grown under red, green, and white light. Ribulose bisphosphate carboxylase labeling occurred almost exclusively in the pyrenoid. The major apoproteins of photosystem I (56-64 kD) occurred mostly over the stromal thylakoid region and also appeared over the thylakoids passing through the pyrenoid. Labeling for photosystem II core components (D2 and a 45 kD Chl-binding protein), for phycobilisomes (allophycocyanin, and a 91 kD LCM linker) and for ATP synthase (β subunit) was predominantly present in the thylakoid region but not in the pyrenoid region of the chloroplast. Red light cells had increased labeling per thylakoid length for polypeptides of photosystem II and of phycobilisomes, while photosystem I density decreased, compared to white light cells. Conversely, green light cells had a decreased density of photosystem II and phycobilisome polypeptides, while photosystem I density changed little compared with white light cells. A comparison of the immunogold labeling results with data from spectroscopic methods and from rocket immunoelectrophoresis indicates that it can provide a quantitative measure of the relative amounts of protein components as well as their localization in specific organellar compartments.

  17. Qualitative and quantitative methods to determine miscibility in amorphous drug-polymer systems.

    PubMed

    Meng, Fan; Dave, Vivek; Chauhan, Harsh

    2015-09-18

    Amorphous drug-polymer systems or amorphous solid dispersions are commonly used in the pharmaceutical industry to enhance the solubility of compounds with poor aqueous solubility. The degree of miscibility between drug and polymer is important both for solubility enhancement and for the formation of a physically stable amorphous system. Calculation of solubility parameters, computational data mining, Tg measurements by DSC and Raman mapping are established traditional methods used to qualitatively detect drug-polymer miscibility. Calculation of the Flory-Huggins interaction parameter, computational analysis of X-Ray Diffraction (XRD) data, solid state Nuclear Magnetic Resonance (NMR) spectroscopy and Atomic Force Microscopy (AFM) have been recently developed to quantitatively determine the miscibility in amorphous drug-polymer systems. This brief review introduces and compiles these qualitative and quantitative methods employed in the evaluation of drug-polymer miscibility. Combination of these techniques can provide deeper insights into the true miscibility of drug-polymer systems. PMID:26006307

  18. Quantitative methods for genome-scale analysis of in situ hybridization and correlation with microarray data

    PubMed Central

    Lee, Chang-Kyu; Sunkin, Susan M; Kuan, Chihchau; Thompson, Carol L; Pathak, Sayan; Ng, Lydia; Lau, Chris; Fischer, Shanna; Mortrud, Marty; Slaughterbeck, Cliff; Jones, Allan; Lein, Ed; Hawrylycz, Michael

    2008-01-01

    With the emergence of genome-wide colorimetric in situ hybridization (ISH) data sets such as the Allen Brain Atlas, it is important to understand the relationship between this gene expression modality and those derived from more quantitatively based technologies. This study introduces a novel method for standardized relative quantification of colorimetric ISH signal that enables a large-scale cross-platform expression level comparison of ISH with two publicly available microarray brain data sources. PMID:18234097

  19. 29Si NMR sensitivity enhancement methods for the quantitative study of organosilicate hydrolysis and condensation

    E-print Network

    Sahai, Nita

    29Si NMR sensitivity enhancement methods for the quantitative study of organosilicate hydrolysis and condensation ... conditions for efficient silica production [1-7]. The process involves the hydrolysis and subsequent condensation reactions Si-O-H + Si-O-H → Si-O-Si + H2O (2) and Si-O-H + Si-O-R → Si-O-Si + ROH (3). Thorough characterization of hydrolysis ...

  20. A method for estimating and removing streaking artifacts in quantitative susceptibility mapping

    PubMed Central

    Li, Wei; Wang, Nian; Yu, Fang; Han, Hui; Cao, Wei; Romero, Rebecca; Tantiwongkosi, Bundhit; Duong, Timothy Q.; Liu, Chunlei

    2015-01-01

    Quantitative susceptibility mapping (QSM) is a novel MRI method for quantifying tissue magnetic property. In the brain, it reflects the molecular composition and microstructure of the local tissue. However, susceptibility maps reconstructed from single-orientation data still suffer from streaking artifacts which obscure structural details and small lesions. We propose and have developed a general method for estimating streaking artifacts and subtracting them from susceptibility maps. Specifically, this method uses a sparse linear equation and least-squares (LSQR)-algorithm-based method to derive an initial estimation of magnetic susceptibility, a fast quantitative susceptibility mapping method to estimate the susceptibility boundaries, and an iterative approach to estimate the susceptibility artifact from ill-conditioned k-space regions only. With a fixed set of parameters for the initial susceptibility estimation and subsequent streaking artifact estimation and removal, the method provides an unbiased estimate of tissue susceptibility with negligible streaking artifacts, as compared to multi-orientation QSM reconstruction. This method allows for improved delineation of white matter lesions in patients with multiple sclerosis and small structures of the human brain with excellent anatomical details. The proposed methodology can be extended to other existing QSM algorithms. PMID:25536496
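
    As a rough illustration of the LSQR-based initial estimate mentioned in the abstract, the sketch below inverts the k-space dipole model with SciPy's lsqr. It is a toy of the dipole-inversion step only (the streak estimation, removal, masking and boundary handling are not reproduced); the function names, voxel-size handling and iteration limit are illustrative assumptions, and the field map is assumed to be background-corrected with B0 along z.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0)):
    # k-space unit dipole response, D(k) = 1/3 - kz^2/|k|^2, with B0 along z
    freqs = [np.fft.fftfreq(n, d=d) for n, d in zip(shape, voxel_size)]
    kx, ky, kz = np.meshgrid(*freqs, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(divide="ignore", invalid="ignore"):
        D = 1.0 / 3.0 - kz**2 / k2
    D[k2 == 0] = 0.0              # ill-conditioned k-space origin
    return D

def lsqr_susceptibility(field_map, voxel_size=(1.0, 1.0, 1.0), iter_lim=30):
    """Initial LSQR estimate of susceptibility from a background-removed field map."""
    D = dipole_kernel(field_map.shape, voxel_size)
    n = field_map.size

    def apply_dipole(x):
        x = x.reshape(field_map.shape)
        return np.real(np.fft.ifftn(D * np.fft.fftn(x))).ravel()

    # D is real and even, so the forward operator is self-adjoint
    A = LinearOperator((n, n), matvec=apply_dipole, rmatvec=apply_dipole)
    chi = lsqr(A, field_map.ravel(), iter_lim=iter_lim)[0]
    return chi.reshape(field_map.shape)
```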

  1. A method of quantitative risk assessment for transmission pipeline carrying natural gas.

    PubMed

    Jo, Young-Do; Ahn, Bum Jong

    2005-08-31

    Regulatory authorities in many countries are moving away from prescriptive approaches for keeping natural gas pipelines safe. As an alternative, risk management based on a quantitative assessment is being considered to improve the level of safety. This paper focuses on the development of a simplified method for the quantitative risk assessment for natural gas pipelines and introduces parameters of fatal length and cumulative fatal length. The fatal length is defined as the integrated fatality along the pipeline associated with hypothetical accidents. The cumulative fatal length is defined as the section of pipeline in which an accident leads to N or more fatalities. These parameters can be estimated easily by using the information of pipeline geometry and population density of a Geographic Information Systems (GIS). To demonstrate the proposed method, individual and societal risks for a sample pipeline have been estimated from the historical data of European Gas Pipeline Incident Data Group and BG Transco. With currently acceptable criteria taken into account for individual risk, the minimum proximity of the pipeline to occupied buildings is approximately proportional to the square root of the operating pressure of the pipeline. The proposed method of quantitative risk assessment may be useful for risk management during the planning and building stages of a new pipeline, and modification of a buried pipeline. PMID:15913887
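
    A loose numerical reading of the fatal-length idea (not the authors' exact formulation) is to integrate a fatality-probability profile for one hypothetical accident along the pipeline; the hazard model and numbers below are placeholders.

```python
import numpy as np

def fatal_length(x, fatality_prob):
    """Integrate the fatality probability of one hypothetical accident over
    position x along the pipeline; the result has units of length."""
    return np.trapz(fatality_prob, x)

# toy example: a Gaussian lethality profile around an accident at x = 0
x = np.linspace(-500.0, 500.0, 2001)          # metres along the pipeline
fatality_prob = np.exp(-(x / 120.0) ** 2)     # hypothetical hazard model
print(f"fatal length ~ {fatal_length(x, fatality_prob):.1f} m")
```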

  2. Using quantitative and qualitative data in health services research – what happens when mixed method findings conflict? [ISRCTN61522618]

    PubMed Central

    Moffatt, Suzanne; White, Martin; Mackintosh, Joan; Howel, Denise

    2006-01-01

    Background In this methodological paper we document the interpretation of a mixed methods study and outline an approach to dealing with apparent discrepancies between qualitative and quantitative research data in a pilot study evaluating whether welfare rights advice has an impact on health and social outcomes among a population aged 60 and over. Methods Quantitative and qualitative data were collected contemporaneously. Quantitative data were collected from 126 men and women aged over 60 within a randomised controlled trial. Participants received a full welfare benefits assessment which successfully identified additional financial and non-financial resources for 60% of them. A range of demographic, health and social outcome measures were assessed at baseline, 6, 12 and 24 month follow up. Qualitative data were collected from a sub-sample of 25 participants purposively selected to take part in individual interviews to examine the perceived impact of welfare rights advice. Results Separate analysis of the quantitative and qualitative data revealed discrepant findings. The quantitative data showed little evidence of significant differences of a size that would be of practical or clinical interest, suggesting that the intervention had no impact on these outcome measures. The qualitative data suggested wide-ranging impacts, indicating that the intervention had a positive effect. Six ways of further exploring these data were considered: (i) treating the methods as fundamentally different; (ii) exploring the methodological rigour of each component; (iii) exploring dataset comparability; (iv) collecting further data and making further comparisons; (v) exploring the process of the intervention; and (vi) exploring whether the outcomes of the two components match. Conclusion The study demonstrates how using mixed methods can lead to different and sometimes conflicting accounts and, using this six step approach, how such discrepancies can be harnessed to interrogate each dataset more fully. Not only does this enhance the robustness of the study, it may lead to different conclusions from those that would have been drawn through relying on one method alone and demonstrates the value of collecting both types of data within a single study. More widespread use of mixed methods in trials of complex interventions is likely to enhance the overall quality of the evidence base. PMID:16524479

  3. Improved methods for capture, extraction, and quantitative assay of environmental DNA from Asian bigheaded carp (Hypophthalmichthys spp.).

    PubMed

    Turner, Cameron R; Miller, Derryl J; Coyne, Kathryn J; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207

  4. Improved Methods for Capture, Extraction, and Quantitative Assay of Environmental DNA from Asian Bigheaded Carp (Hypophthalmichthys spp.)

    PubMed Central

    Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207

  5. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    NASA Astrophysics Data System (ADS)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and overwhelming associated healthcare costs. In recent works, we have developed a multi-spectral, frequency domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over pure ultrasonic or optical methods as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of the ultrasound propagation. These characteristics include both the Speed of Sound (SOS) and Broadband Ultrasonic Attenuation (BUA). To test the method's quantitative predictions, we have constructed a combined ultrasound and photoacoustic setup. Here, we present a dual-modality system and experimentally compare the two methods on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into the bone structure and functional status.

  6. Quantitative Evaluation of the Total Magnetic Moments of Colloidal Magnetic Nanoparticles: A Kinetics-based Method.

    PubMed

    Liu, Haiyi; Sun, Jianfei; Wang, Haoyao; Wang, Peng; Song, Lina; Li, Yang; Chen, Bo; Zhang, Yu; Gu, Ning

    2015-06-01

    A kinetics-based method is proposed to quantitatively characterize the collective magnetization of colloidal magnetic nanoparticles. The method is based on the relationship between the magnetic force on a colloidal droplet and the movement of the droplet under a gradient magnetic field. Through computational analysis of the kinetic parameters, such as displacement, velocity, and acceleration, the magnetization of colloidal magnetic nanoparticles can be calculated. In our experiments, the values measured by using our method exhibited a better linear correlation with magnetothermal heating than those obtained by using a vibrating sample magnetometer and magnetic balance. This finding indicates that this method may be more suitable for evaluating the collective magnetism of colloidal magnetic nanoparticles under low magnetic fields than the commonly used methods. Accurate evaluation of the magnetic properties of colloidal nanoparticles is of great importance for the standardization of magnetic nanomaterials and for their practical application in biomedicine. PMID:25943076
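
    One plausible way to turn the kinetic parameters into a collective moment is a simple force balance (magnetic force = inertia + Stokes drag) for a spherical droplet in a known field gradient; the force model, symbols and units below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def moment_from_trajectory(t, x, droplet_mass, droplet_radius, viscosity, grad_B):
    """Estimate the droplet's magnetic moment (A*m^2) from its trajectory x(t)
    under a gradient field, assuming F_mag = m*a + 6*pi*eta*R*v (Stokes drag)
    and F_mag = moment * dB/dx for a moment aligned with the field."""
    v = np.gradient(x, t)      # velocity
    a = np.gradient(v, t)      # acceleration
    f_mag = droplet_mass * a + 6.0 * np.pi * viscosity * droplet_radius * v
    return float(np.mean(f_mag / grad_B))
```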

  7. Quantitative methods for measuring DNA flexibility in vitro and in vivo.

    PubMed

    Peters, Justin P; Becker, Nicole A; Rueter, Emily M; Bajzer, Zeljko; Kahn, Jason D; Maher, L James

    2011-01-01

    The double-helical DNA biopolymer is particularly resistant to bending and twisting deformations. This property has important implications for DNA folding in vitro and for the packaging and function of DNA in living cells. Among the outstanding questions in the field of DNA biophysics are the underlying origin of DNA stiffness and the mechanisms by which DNA stiffness is overcome within cells. Exploring these questions requires experimental methods to quantitatively measure DNA bending and twisting stiffness both in vitro and in vivo. Here, we discuss two classical approaches: T4 DNA ligase-mediated DNA cyclization kinetics and lac repressor-mediated DNA looping in Escherichia coli. We review the theoretical basis for these techniques and how each can be applied to quantitate biophysical parameters that describe the DNA polymer. We then show how we have modified these methods and applied them to quantitate how apparent DNA physical properties are altered in vitro and in vivo by sequence-nonspecific architectural DNA-binding proteins such as the E. coli HU protein and eukaryotic HMGB proteins. PMID:21195233

  8. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Zakharov, S.M.

    1997-01-01

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation. © 1997 American Institute of Physics.

  9. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Zakharov, Sergei M.

    1997-01-10

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation.

  10. Three-dimensional quantitative analysis of adhesive remnants and enamel loss resulting from debonding orthodontic molar tubes

    PubMed Central

    2014-01-01

    Aims Presenting a new method for direct, quantitative analysis of enamel surface. Measurement of adhesive remnants and enamel loss resulting from debonding molar tubes. Material and methods Buccal surfaces of fifteen extracted human molars were directly scanned with an optic blue-light 3D scanner to the nearest 2 µm. After 20 s of etching, molar tubes were bonded and, after 24 h of storage in 0.9% saline, debonded. Then 3D scanning was repeated. Superimposition and comparison were then performed, and shape alterations of the entire objects were analyzed using specialized computer software. Residual adhesive heights as well as enamel loss depths have been obtained for the entire buccal surfaces. Residual adhesive volume and enamel loss volume have been calculated for every tooth. Results The maximum height of adhesive remaining on the enamel surface was 0.76 mm and the volume on particular teeth ranged from 0.047 mm3 to 4.16 mm3. The median adhesive remnant volume was 0.988 mm3. Mean depths of enamel loss for particular teeth ranged from 0.0076 mm to 0.0416 mm. The highest maximum depth of enamel loss was 0.207 mm. Median volume of enamel loss was 0.104 mm3 and maximum volume was 1.484 mm3. Conclusions Blue-light 3D scanning is able to provide direct precise scans of the enamel surface, which can be superimposed in order to calculate shape alterations. Debonding molar tubes leaves a certain amount of adhesive remnants on the enamel; however, the interface fracture pattern varies for particular teeth and areas of enamel loss are present as well. PMID:25208969

  11. Viscoelasticity in mantle convection

    E-print Network

    Cerveny, Vlastislav

    Viscoelasticity in mantle convection. Charles University in Prague, 14th October 2015. Presentation outline: Motivation; Method; Testing (elastic slab); Results (1) Instabilities; Results (2); Future work.

  12. Results are visually appealing, but not dramatically superior to older methods such as filtering and dividing. Good convergence properties.

    E-print Network

    Willsky, Alan S.

    ... and quantitative results demonstrating the effectiveness of the proposed method in producing debiased and denoised MR images. Previous work: most previous methods assume that ... are piecewise constant; the earliest work, in the mid-80s, includes Haselgrove and Prammer (1986) and Lufkin et al. (1986).

  13. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    NASA Technical Reports Server (NTRS)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.
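
    A minimal PLS1 sketch in the spirit of the comparison above, using scikit-learn on synthetic stand-in spectra; the component count, array sizes and cross-validation setup are illustrative choices rather than the study's actual pipeline.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_channels = 120, 2048            # stand-ins for LIBS spectra
y = rng.uniform(0.0, 50.0, n_samples)        # known wt% of one element (PLS1)
X = np.outer(y, rng.random(n_channels)) + rng.normal(0.0, 0.5, (n_samples, n_channels))

pls = PLSRegression(n_components=8)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()   # cross-validated predictions
rmsep = np.sqrt(np.mean((y_cv - y) ** 2))
print(f"cross-validated RMSEP = {rmsep:.2f} wt%")
```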

  14. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. PMID:26243267

  15. Simultaneous quantitation and validation of method for the quality evaluation of Eucommiae cortex by HPLC/UV.

    PubMed

    Zhao, Bing Tian; Jeong, Su Yang; Kim, Tae In; Seo, Eun Kyoung; Min, Byung Sun; Son, Jong Keun; Woo, Mi Hee

    2015-12-01

    A new HPLC/UV method has been developed for the simultaneous quantitative determination of four major components in Eucommiae cortex, namely geniposidic acid (1), geniposide (2), pinoresinol di-O-β-D-glucopyranoside (3), and liriodendrin (4). Simultaneous separations of these four components were achieved on a J'sphere ODS C18 column (250 × 4.6 mm, 4 µm). The elution was done using water with 0.1% phosphoric acid (A) and acetonitrile with 0.1% phosphoric acid (B) in a two-step elution of the mobile phase at a flow rate of 1.0 mL/min and a wavelength of 230 nm. The method was validated for linearity, recovery, precision, accuracy, stability and robustness. All calibration curves showed good linear regression (r² > 0.999) within the test ranges. This method showed good recovery and reproducibility for the quantification of these four components in 85 species of Eucommiae cortex. The intra-day and inter-day precisions were lower than 0.53% (as a relative standard deviation, RSD) and accuracies were between 93.00 and 106.28% for all standards. The results indicate that the established HPLC/UV method is suitable for quantitation and quality evaluation of Eucommiae cortex. PMID:26216707

  16. Development of a HPLC Method for the Quantitative Determination of Capsaicin in Collagen Sponge

    PubMed Central

    Guo, Chun-Lian; Chen, Hong-Ying; Cui, Bi-Ling; Chen, Yu-Huan; Zhou, Yan-Fang; Peng, Xin-Sheng; Wang, Qin

    2015-01-01

    Controlling the concentration of drugs in pharmaceutical products is essential to patients' safety. In this study, a simple and sensitive HPLC method is developed to quantitatively analyze capsaicin in collagen sponge. The capsaicin from sponge was extracted for 30 min with an ultrasonic wave extraction technique and methanol was used as solvent. The chromatographic method was performed by using an isocratic system composed of acetonitrile-water (70:30) with a flow rate of 1 mL/min and the detection wavelength was at 280 nm. Capsaicin can be successfully separated with good linearity (the regression equation is A = 9.7182C + 0.8547; R² = 1.0) and perfect recovery (99.72%). The mean capsaicin concentration in collagen sponge was 49.32 mg/g (RSD = 1.30%; n = 3). In conclusion, the ultrasonic wave extraction method is simple and the extracting efficiency is high. The HPLC assay has excellent sensitivity and specificity and is a convenient method for capsaicin detection in collagen sponge. This paper is the first to discuss the quantitative analysis of capsaicin in collagen sponge. PMID:26612986
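
    Using the calibration reported in the abstract (A = 9.7182C + 0.8547), back-calculating a concentration from a measured peak area is a one-liner; the example peak area is hypothetical, and any dilution or sponge-mass correction is omitted.

```python
SLOPE, INTERCEPT = 9.7182, 0.8547     # A = 9.7182*C + 0.8547, from the abstract

def capsaicin_concentration(peak_area):
    """Invert the linear calibration to get the concentration in the injected solution."""
    return (peak_area - INTERCEPT) / SLOPE

print(capsaicin_concentration(50.0))  # hypothetical peak area
```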

  17. Quantitative Detection Method of Hydroxyapatite Nanoparticles Based on Eu(3+) Fluorescent Labeling in Vitro and in Vivo.

    PubMed

    Xie, Yunfei; Perera, Thalagalage Shalika Harshani; Li, Fang; Han, Yingchao; Yin, Meizhen

    2015-11-01

    One major challenge for the application of hydroxyapatite nanoparticles (nHAP) in nanomedicine is the lack of a quantitative detection method. Herein, we developed a quantitative detection method for nHAP based on Eu(3+) fluorescent labeling via a simple chemical coprecipitation method. The trace amount of nHAP in cells and tissues can be quantitatively detected on the basis of the fluorescent quantitative determination of Eu(3+) ions in the nHAP crystal lattice. The lowest concentration of Eu(3+) ions that can be quantitatively detected is 0.5 nM using DELFIA enhancement solution. This methodology is broadly applicable to studying the tissue distribution and metabolism of nHAP in vivo. PMID:26495748

  18. "Do I Need Research Skills in Working Life?": University Students' Motivation and Difficulties in Quantitative Methods Courses

    ERIC Educational Resources Information Center

    Murtonen, Mari; Olkinuora, Erkki; Tynjala, Paivi; Lehtinen, Erno

    2008-01-01

    This study explored university students' views of whether they will need research skills in their future work in relation to their approaches to learning, situational orientations on a learning situation of quantitative methods, and difficulties experienced in quantitative research courses. Education and psychology students in both Finland (N =…

  19. An IMS-IMS threshold method for semi-quantitative determination of activation barriers: Interconversion of proline cis ⇌ trans forms in bradykinin

    E-print Network

    Clemmer, David E.

    Collisional activation of selected conformations by multidimensional ion mobility spectrometry (IMS-IMS) ... semi-quantitative activation energies for interconversion of different structures of the nonapeptide bradykinin (BK) ...

  20. A Dilute-and-Shoot LC-MS Method for Quantitating Opioids in Oral Fluid.

    PubMed

    Enders, Jeffrey R; McIntire, Gregory L

    2015-10-01

    Opioid testing represents a dominant share of the market in pain management clinical testing facilities. Testing of this drug class in oral fluid (OF) has begun to rise in popularity. OF analysis has traditionally required extensive clean-up protocols and sample concentration, which can be avoided. This work highlights the use of a fast, 'dilute-and-shoot' method that performs no considerable sample manipulation. A quantitative method for the determination of eight common opioids and associated metabolites (codeine, morphine, hydrocodone, hydromorphone, norhydrocodone, oxycodone, noroxycodone and oxymorphone) in OF is described herein. OF sample is diluted 10-fold in methanol/water and then analyzed using an Agilent chromatographic stack coupled with an AB SCIEX 4500. The method has a 2.2-min LC gradient and a cycle time of 2.9 min. In contrast to most published methods of this particular type, this method uses no sample clean-up or concentration and has a considerably faster LC gradient, making it ideal for very high-throughput laboratories. Importantly, the method requires only 100 µL of sample and is diluted 10-fold prior to injection to help with instrument viability. Baseline separation of all isobaric opioids listed above was achieved on a phenyl-hexyl column. The validated calibration range for this method is 2.5-1,000 ng/mL. This 'dilute-and-shoot' method removes the unnecessary, costly and time-consuming extraction steps found in traditional methods and still surpasses all analytical requirements. PMID:26378142

  1. Broad-spectrum detection and quantitation methods of Soil-borne cereal mosaic virus isolates.

    PubMed

    Vaïanopoulos, Céline; Legrève, Anne; Moreau, Virginie; Bragard, Claude

    2009-08-01

    A broad-spectrum reverse transcription-polymerase chain reaction (RT-PCR) protocol was developed for detecting Soil-borne cereal mosaic virus (SBCMV) isolates, responsible for mosaic diseases in Europe, using primers targeting the highly conserved 3'-untranslated region of RNA-1 and RNA-2 of SBCMV. The 3'-end region is a privileged target for the detection of a wide range of isolates because of its sequence conservation, its tRNA-like structure, its major role in viral replication and the signal amplification afforded by the presence of numerous genomic and subgenomic RNAs. The primers were also designed for virus quantitation using real-time RT-PCR with SYBR-Green chemistry. No cross-reaction with Wheat spindle streak mosaic virus, frequently associated with SBCMV, was observed. The use of RT-PCR and real-time quantitative RT-PCR allowed a more sensitive detection and quantitation of SBCMV than was possible with ELISA. The methods enabled European isolates of SBCMV from Belgium, France, Germany, Italy and the UK to be detected and quantified. Real-time RT-PCR represents a new tool for comparing soil inoculum potential as well as cultivar resistance to SBCMV. PMID:19490978
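
    The abstract does not spell out the quantitation arithmetic, but SYBR-Green real-time RT-PCR is commonly quantified against a standard curve of Cq versus log10 copy number; the sketch below shows that generic calculation with hypothetical dilution-series data.

```python
import numpy as np

def standard_curve(log10_copies, cq):
    """Fit Cq = slope*log10(copies) + intercept; return the amplification efficiency
    and a function converting an unknown Cq into a copy number."""
    slope, intercept = np.polyfit(log10_copies, cq, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0          # 1.0 means 100 % efficient
    quantify = lambda cq_unknown: 10.0 ** ((cq_unknown - intercept) / slope)
    return efficiency, quantify

# hypothetical dilution series
eff, quantify = standard_curve(np.array([3, 4, 5, 6, 7]),
                               np.array([31.2, 27.9, 24.5, 21.1, 17.8]))
print(f"efficiency ~ {eff:.2f}, unknown at Cq 26.0 ~ {quantify(26.0):.2e} copies")
```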

  2. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Atencio, J.D.

    1982-03-31

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux, additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether permanent low-level burial is appropriate for the waste sample.

  3. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, John T. (Los Alamos, NM); Kunz, Walter E. (Santa Fe, NM); Atencio, James D. (Los Alamos, NM)

    1984-01-01

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux, additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.

  4. Investigation of a diffuse optical measurements-assisted quantitative photoacoustic tomographic method in reflection geometry

    PubMed Central

    Xu, Chen; Kumavor, Patrick D.; Aguirre, Andres; Zhu, Quing

    2012-01-01

    Photoacoustic tomography provides the distribution of absorbed optical energy density, which is the product of optical absorption coefficient and optical fluence distribution. We report the experimental investigation of a novel fitting procedure that quantitatively determines the optical absorption coefficient of chromophores. The experimental setup consisted of a hybrid system of a 64-channel photoacoustic imaging system with a frequency-domain diffuse optical measurement system. The fitting procedure included a complete photoacoustic forward model and an analytical solution of a target chromophore using the diffusion approximation. The fitting procedure combines the information from the photoacoustic image and the background information from the diffuse optical measurements to minimize the difference between the photoacoustic measurements and the forward-model data and recover the target absorption coefficient quantitatively. 1-cm-cube phantom absorbers of high and low contrasts were imaged at depths of up to 3.0 cm. The fitted absorption coefficient results were at least 80% of their true values. The sensitivities of this fitting procedure to target location, target radius, and background optical properties were also investigated. We found that this fitting procedure was most sensitive to the accurate determination of the target radius and depth. A blood sample in a thin tube of radius 0.58 mm, simulating a blood vessel, was also studied. The photoacoustic images and fitted absorption coefficients are presented. These results demonstrate the clinical potential of this fitting procedure to quantitatively characterize small lesions in breast imaging. PMID:22734743

  5. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR are its accurate function with external calibration, the lack of any requirement for identical reference materials, a high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
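
    For reference, the arithmetic behind a typical qHNMR quantitation (written here for an internal calibrant; external calibration follows the same proportionality) can be captured in one function; the symbol names are mine, not the review's.

```python
def qhnmr_mass(I_a, N_a, M_a, I_cal, N_cal, M_cal, m_cal, purity_cal=1.0):
    """Mass of analyte a from signal integrals I, proton counts N, molar masses M,
    and the weighed mass and purity of the calibrant:
    m_a = (I_a/I_cal) * (N_cal/N_a) * (M_a/M_cal) * m_cal * P_cal."""
    return (I_a / I_cal) * (N_cal / N_a) * (M_a / M_cal) * m_cal * purity_cal
```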

  6. Quantitative chromosome map of the polyploid Saccharum spontaneum by multicolor fluorescence in situ hybridization and imaging methods.

    PubMed

    Ha, S; Moore, P H; Heinz, D; Kato, S; Ohmido, N; Fukui, K

    1999-04-01

    Somatic chromosomes of a wild relative of sugarcane (Saccharum spontaneum L.) anther culture-derived clone (AP 85-361, 2n = 32) were identified and characterized by computer-aided imaging technology and molecular cytological methods. The presence of four satellite chromosomes and four nearly identical chromosome sets suggests that the clone is a tetrahaploid with the basic number x = 8. A quantitative chromosome map, or idiogram, was developed using image analysis of the condensation pattern (CP) at the prometaphase stage of somatic chromosomes. The 45S and 5S ribosomal RNA gene (rDNA) loci were simultaneously visualized by multi-color fluorescence in situ hybridization (McFISH) and precisely localized to the regions of 3p3.1 and 6q1.3 on the idiogram. The simultaneous visualization of two sets of four ribosomal RNA genes confirms tetraploidy of this clone. This conclusion is consistent with results of molecular marker mapping. The quantitative chromosome map produced will become the foundation for genome analyses based on chromosome identity and structure. Previously impossible identification of small chromosomes and untestable hypotheses about the polyploid nature of plants can now be settled with these two approaches of quantitative karyotyping and FISH. PMID:10380803

  7. Exploring the use of storytelling in quantitative research fields using a multiple case study method

    NASA Astrophysics Data System (ADS)

    Matthews, Lori N. Hamlet

    The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work and my personal experience with the subject matter was also used as a source of data according to the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight about why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes that emerged from this study and were consistent with the existing literature include: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts and topics for further research are presented as well.

  8. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, L.J.; Cremers, D.A.

    1982-09-07

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids are described. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is shown. Significant shortening of analysis time is achieved relative to the usual chemical techniques of analysis.

  9. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, Leon J. (Los Alamos, NM); Cremers, David A. (Los Alamos, NM)

    1985-01-01

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is demonstrated. Significant shortening of analysis time is achieved relative to the usual chemical techniques of analysis.

  10. A quantitative solid-state Raman spectroscopic method for control of fungicides.

    PubMed

    Ivanova, Bojidarka; Spiteller, Michael

    2012-07-21

    A new analytical procedure using solid-state Raman spectroscopy within the THz region has been developed and validated for the quantitative determination of mixtures of different conformations of trifloxystrobin (EE, EZ, ZE and ZZ), tebuconazole (1), and propiconazole (2), as an effective method for fungicide product quality monitoring and control programmes. The obtained quantities were controlled independently by the validated hybrid HPLC electrospray ionization (ESI) tandem mass spectrometric (MS) and matrix-assisted laser desorption/ionization (MALDI) MS methods in the condensed phase. The quantitative dependences were obtained on the twenty binary mixtures of the analytes and were further tested on three trade fungicide products, containing mixtures of trifloxystrobin-tebuconazole and trifloxystrobin-propiconazole, as an emulsifiable concentrate or water-soluble granules of the active ingredients. The present methods provided sufficient sensitivity as reflected by the metrologic quantities, evaluating the concentration limit of detection (LOD) and quantification (LOQ), linear limit (LL), measurement accuracy and precision, true quantity value, trueness of measurement and more. PMID:22679621
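
    The LOD and LOQ figures of merit mentioned above are conventionally estimated from a calibration as 3.3·σ/S and 10·σ/S (ICH-style); the generic sketch below illustrates that convention and is not tied to the paper's specific data.

```python
def lod_loq(sigma_blank, slope):
    """ICH-style detection and quantification limits from the calibration slope
    and the standard deviation of the blank (or intercept) response."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

lod, loq = lod_loq(sigma_blank=0.02, slope=1.5)   # hypothetical numbers
print(lod, loq)
```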

  11. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities were identified by LC-MS in the atorvastatin calcium drug substance. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, and by physicochemical studies conducted using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for the quantitative HPLC determination. These impurities were detected by a newly developed gradient reverse-phase high performance liquid chromatographic (HPLC) method. The system suitability of HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions to demonstrate the power of the newly developed HPLC method.

  12. A quantitative autoradiographic method for the measurement of local rates of brain protein synthesis

    SciTech Connect

    Dwyer, B.E.; Donatoni, P.; Wasterlain, C.G.

    1982-05-01

    We have developed a new method for measuring local rates of brain protein synthesis in vivo. It combines the intraperitoneal injection of a large dose of low-specific-activity amino acid with quantitative autoradiography. This method has several advantages: 1) It is ideally suited for young or small animals or where immobilizing an animal is undesirable. 2) The amino acid injection "floods" amino acid pools so that errors in estimating precursor specific activity, which is especially important in pathological conditions, are minimized. 3) The method provides for the use of a radioautographic internal standard in which valine incorporation is measured directly. Internal standards from experimental animals correct for tissue protein content and self-absorption of radiation in tissue sections, which could vary under experimental conditions.

  13. A method for estimating the effective number of loci affecting a quantitative character.

    PubMed

    Slatkin, Montgomery

    2013-11-01

    A likelihood method is introduced that jointly estimates the number of loci and the additive effect of alleles that account for the genetic variance of a normally distributed quantitative character in a randomly mating population. The method assumes that measurements of the character are available from one or both parents and an arbitrary number of full siblings. The method uses the fact, first recognized by Karl Pearson in 1904, that the variance of a character among offspring depends on both the parental phenotypes and on the number of loci. Simulations show that the method performs well provided that data from a sufficient number of families (on the order of thousands) are available. This method assumes that the loci are in Hardy-Weinberg and linkage equilibrium but does not assume anything about the linkage relationships. It performs equally well if all loci are on the same non-recombining chromosome provided they are in linkage equilibrium. The method can be adapted to take account of loci already identified as being associated with the character of interest. In that case, the method estimates the number of loci not already known to affect the character. The method applied to measurements of crown-rump length in 281 family trios in a captive colony of African green monkeys (Chlorocebus aethiops sabaeus) estimates the number of loci to be 112 and the additive effect to be 0.26 cm. A parametric bootstrap analysis shows that a rough confidence interval has a lower bound of 14 loci. PMID:23973416
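
    Pearson's underlying observation, that the spread of a character among full sibs carries information about the number of loci, can be illustrated with a small Mendelian simulation; this toy model (equal additive effects, allele frequency 0.5, no environmental variance by default) only demonstrates the principle and is not the paper's likelihood machinery.

```python
import numpy as np
rng = np.random.default_rng(1)

def fullsib_variance(n_loci, effect, n_sibs=1000, p=0.5, env_sd=0.0):
    """Phenotypic variance among full sibs of one random parental pair under an
    additive model: each sib inherits one allele per locus from each parent."""
    mother = (rng.random((2, n_loci)) < p).astype(float)   # two parental haplotypes
    father = (rng.random((2, n_loci)) < p).astype(float)
    cols = np.arange(n_loci)
    from_mom = mother[rng.integers(0, 2, (n_sibs, n_loci)), cols]
    from_dad = father[rng.integers(0, 2, (n_sibs, n_loci)), cols]
    z = effect * (from_mom + from_dad).sum(axis=1) + rng.normal(0.0, env_sd, n_sibs)
    return z.var()

# same expected genetic variance, few large-effect vs many small-effect loci:
# the family-to-family spread of sib variances shrinks as the locus number grows
for n, a in [(10, 1.0), (1000, 0.1)]:
    spread = np.std([fullsib_variance(n, a) for _ in range(100)])
    print(n, round(float(spread), 3))
```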

  14. Voxel Spread Function (VSF) Method for Correction of Magnetic Field Inhomogeneity Effects in Quantitative Gradient-Echo-Based MRI

    PubMed Central

    Yablonskiy, Dmitriy A; Sukstanskii, Alexander L; Luo, Jie; Wang, Xiaoqi

    2012-01-01

    Purpose Macroscopic magnetic field inhomogeneities adversely affect different aspects of MRI images. In quantitative MRI, when the goal is to quantify biological tissue parameters, they bias and often corrupt such measurements. The goal of this paper is to develop a method for correction of macroscopic field inhomogeneities that can be applied to a variety of quantitative gradient-echo-based MRI techniques. Methods We have re-analyzed a basic theory of gradient echo (GE) MRI signal formation in the presence of background field inhomogeneities and derived equations that allow for correction of magnetic field inhomogeneity effects based on the phase and magnitude of GE data. We verified our theory by mapping R2* relaxation rate in computer simulated, phantom, and in vivo human data collected with multi-GE sequences. Results The proposed technique takes into account voxel spread function (VSF) effects and allowed obtaining R2* maps virtually free from artifacts for all simulated, phantom and in vivo data except for the edge areas with very steep field gradients. Conclusion The VSF method, allowing quantification of tissue-specific R2*-related tissue properties, has the potential to breed new MRI biomarkers serving as surrogates for tissue biological properties similar to the R1 and R2 relaxation rate constants widely used in clinical and research MRI. PMID:23233445
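
    For context, the quantity being corrected is the R2* obtained from a mono-exponential fit to multi-gradient-echo magnitudes; the sketch below shows only that baseline fit on synthetic data, while the VSF correction itself requires the phase-derived field model and is not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_r2star(te_s, magnitude):
    """Voxel-wise mono-exponential fit S(TE) = S0 * exp(-R2* * TE)."""
    model = lambda te, s0, r2s: s0 * np.exp(-r2s * te)
    (s0, r2s), _ = curve_fit(model, te_s, magnitude, p0=(magnitude[0], 20.0))
    return r2s

te = np.array([0.004, 0.010, 0.016, 0.022, 0.028])   # echo times in seconds
sig = 1000.0 * np.exp(-30.0 * te) + np.random.default_rng(2).normal(0.0, 5.0, te.size)
print(f"R2* ~ {fit_r2star(te, sig):.1f} 1/s")
```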

  15. Continuously growing rodent molars result from a predictable quantitative evolutionary change over 50 million years.

    PubMed

    Tapaltsyan, Vagan; Eronen, Jussi T; Lawing, A Michelle; Sharir, Amnon; Janis, Christine; Jernvall, Jukka; Klein, Ophir D

    2015-05-01

    The fossil record is widely informative about evolution, but fossils are not systematically used to study the evolution of stem-cell-driven renewal. Here, we examined evolution of the continuous growth (hypselodonty) of rodent molar teeth, which is fuelled by the presence of dental stem cells. We studied occurrences of 3,500 North American rodent fossils, ranging from 50 million years ago (mya) to 2 mya. We examined changes in molar height to determine whether evolution of hypselodonty shows distinct patterns in the fossil record, and we found that hypselodont taxa emerged through intermediate forms of increasing crown height. Next, we designed a Markov simulation model, which replicated molar height increases throughout the Cenozoic and, moreover, evolution of hypselodonty. Thus, by extension, the retention of the adult stem cell niche appears to be a predictable quantitative rather than a stochastic qualitative process. Our analyses predict that hypselodonty will eventually become the dominant phenotype. PMID:25921530

  16. Methods for quantitative detection of antibody-induced complement activation on red blood cells.

    PubMed

    Meulenbroek, Elisabeth M; Wouters, Diana; Zeerleder, Sacha

    2014-01-01

    Antibodies against red blood cells (RBCs) can lead to complement activation resulting in an accelerated clearance via complement receptors in the liver (extravascular hemolysis) or leading to intravascular lysis of RBCs. Alloantibodies (e.g. ABO) or autoantibodies to RBC antigens (as seen in autoimmune hemolytic anemia, AIHA) leading to complement activation are potentially harmful and can be - especially when leading to intravascular lysis - fatal(1). Currently, complement activation due to (auto)-antibodies on RBCs is assessed in vitro by using the Coombs test reflecting complement deposition on RBC or by a nonquantitative hemolytic assay reflecting RBC lysis(1-4). However, to assess the efficacy of complement inhibitors, it is mandatory to have quantitative techniques. Here we describe two such techniques. First, an assay to detect C3 and C4 deposition on red blood cells that is induced by antibodies in patient serum is presented. For this, FACS analysis is used with fluorescently labeled anti-C3 or anti-C4 antibodies. Next, a quantitative hemolytic assay is described. In this assay, complement-mediated hemolysis induced by patient serum is measured making use of spectrophotometric detection of the released hemoglobin. Both of these assays are very reproducible and quantitative, facilitating studies of antibody-induced complement activation. PMID:24514151

  17. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy.

    PubMed

    Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine

    2015-01-01

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy to learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer based algorithms for quantitating capillaroscopy data.(1) This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique. PMID:26554744

  18. A systematic study on the influencing parameters and improvement of quantitative analysis of multi-component with single marker method using notoginseng as research subject.

    PubMed

    Wang, Chao-Qun; Jia, Xiu-Hong; Zhu, Shu; Komatsu, Katsuko; Wang, Xuan; Cai, Shao-Qing

    2015-03-01

    A new quantitative analysis of multi-component with single marker (QAMS) method for 11 saponins (ginsenosides Rg1, Rb1, Rg2, Rh1, Rf, Re and Rd; notoginsenosides R1, R4, Fa and K) in notoginseng was established, in which 6 of these saponins were individually used as internal reference substances to investigate the influences of chemical structure, concentrations of the quantitative components, and purities of the standard substances on the accuracy of the QAMS method. The results showed that the concentration of the analyte in the sample solution was the major influencing parameter, whereas the other parameters had minimal influence on the accuracy of the QAMS method. A new method for calculating the relative correction factors by linear regression (the linear regression method) was established, which was shown to decrease the deviations of the QAMS method from the external standard method from 1.20%±0.02% - 23.29%±3.23% to 0.10%±0.09% - 8.84%±2.85% in comparison with the previous method. The differences between the external standard method and the QAMS method using relative correction factors calculated by the linear regression method were below 5% in the quantitative determination of Rg1, Re, R1, Rd and Fa in 24 notoginseng samples and of Rb1 in 21 notoginseng samples, and mostly below 10% in the quantitative determination of Rf, Rg2, R4 and N-K (the differences for these 4 constituents were bigger because their contents were lower) in all 24 notoginseng samples. The results indicated that the contents assayed by the new QAMS method could be considered as accurate as those assayed by the external standard method. In addition, a method for determining the applicable concentration ranges of the quantitative components assayed by the QAMS method was established for the first time, which ensures its high accuracy and can be applied to QAMS methods for other TCMs. The present study demonstrated the practicability of the QAMS method for the quantitative analysis of multiple components and the quality control of TCMs and TCM prescriptions. PMID:25618711
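
    A schematic of the QAMS arithmetic with regression-based relative correction factors; the factor definition below (a ratio of calibration slopes) follows one common convention and the variable names are mine, so treat it as a sketch rather than the paper's exact formulas.

```python
import numpy as np

def relative_correction_factor(conc_s, area_s, conc_k, area_k):
    """RCF of analyte k relative to the single marker s, taken here as the ratio of
    the slopes of linear calibrations area = slope * conc (conventions vary)."""
    slope_s = np.polyfit(conc_s, area_s, 1)[0]
    slope_k = np.polyfit(conc_k, area_k, 1)[0]
    return slope_k / slope_s

def qams_concentration(area_k, rcf_k, area_marker, conc_marker):
    """Concentration of analyte k in a sample using only the marker standard:
    the marker's run-time response factor is area_marker / conc_marker."""
    return area_k / (rcf_k * area_marker / conc_marker)
```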

  19. A quantitative method for evaluating numerical simulation accuracy of time-transient Lamb wave propagation with its applications to selecting appropriate element size and time step.

    PubMed

    Wan, Xiang; Xu, Guanghua; Zhang, Qing; Tse, Peter W; Tan, Haihui

    2016-01-01

    The Lamb wave technique has been widely used in non-destructive evaluation (NDE) and structural health monitoring (SHM). However, due to the multi-mode characteristics and dispersive nature, Lamb wave propagation behavior is much more complex than that of bulk waves. Numerous numerical simulations of Lamb wave propagation have been conducted to study its physical principles. However, few quantitative studies on evaluating the accuracy of these numerical simulations have been reported. In this paper, a method based on cross correlation analysis for quantitatively evaluating the simulation accuracy of time-transient Lamb wave propagation is proposed. Two kinds of error, affecting the position and shape accuracies respectively, are first identified. Consequently, two quantitative indices, i.e., the GVE (group velocity error) and MACCC (maximum absolute value of cross correlation coefficient), derived from cross correlation analysis between a simulated signal and a reference waveform, are proposed to assess the position and shape errors of the simulated signal. In this way, the simulation accuracy on the position and shape is quantitatively evaluated. In order to apply this proposed method to select appropriate element size and time step, a specialized 2D-FEM program combined with the proposed method is developed. Then, the proper element size considering different element types and the proper time step considering different time integration schemes are selected. These results show that the proposed method is feasible and effective, and can be used as an efficient tool for quantitatively evaluating and verifying the simulation accuracy of time-transient Lamb wave propagation. PMID:26315506
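
    The two indices can be computed directly from a simulated signal and a reference waveform. This sketch assumes both are sampled on the same time base and takes the group velocity from the envelope-peak arrival time, which is one reasonable reading of the GVE rather than necessarily the paper's exact definition.

```python
import numpy as np
from scipy.signal import hilbert

def maccc(simulated, reference):
    """Maximum absolute value of the normalized cross-correlation coefficient."""
    s = (simulated - simulated.mean()) / (simulated.std() * simulated.size)
    r = (reference - reference.mean()) / reference.std()
    return float(np.max(np.abs(np.correlate(s, r, mode="full"))))

def group_velocity_error(t, simulated, reference, distance):
    """Relative group-velocity error from envelope-peak arrival times."""
    t_sim = t[np.argmax(np.abs(hilbert(simulated)))]
    t_ref = t[np.argmax(np.abs(hilbert(reference)))]
    v_sim, v_ref = distance / t_sim, distance / t_ref
    return abs(v_sim - v_ref) / v_ref
```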

  20. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Pavlov, Konstantin A.

    1997-01-10

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing.

  1. Method of applying internal standard to dried matrix spot samples for use in quantitative bioanalysis.

    PubMed

    Abu-Rabie, Paul; Denniff, Philip; Spooner, Neil; Brynjolffssen, Jan; Galluzzo, Paul; Sanders, Giles

    2011-11-15

    A novel technique is presented that addresses the issue of how to apply internal standard (IS) to dried matrix spot (DMS) samples in a way that allows the IS to integrate with the sample prior to extraction. The TouchSpray, a piezoelectric spray system from The Technology Partnership (TTP), was used to apply methanol containing IS to dried blood spot (DBS) samples. It is demonstrated that this method of IS application has the potential to work in practice for the quantitative determination of circulating exposures of pharmaceuticals in toxicokinetic and pharmacokinetic studies. Three different methods of IS application were compared: addition of IS to control blood prior to DBS sample preparation (control 1), incorporation into the extraction solvent (control 2), and the novel use of TouchSpray technology (test). It is demonstrated that there was no significant difference in the accuracy and precision data obtained with these three techniques using both manual extraction and direct elution. PMID:21972889

  2. A Simple Liquid Chromatography Tandem Mass Spectrometry Method for Quantitation of Plasma Busulfan.

    PubMed

    Deng, Shuang; Kiscoan, Michael; Frazee, Clint; Abdel-Rahman, Susan; Dalal, Jignesh; Garg, Uttam

    2016-01-01

    Busulfan is an alkylating agent widely used in the ablation of bone marrow cells before hematopoietic stem cell transplant. Due to large intraindividual and interindividual variations and a narrow therapeutic window, therapeutic drug monitoring of busulfan is warranted. A quick and reliable HPLC-MS/MS method was developed for the assay of plasma busulfan. HPLC involved a C18 column, and MS/MS was used in electrospray ionization (ESI) positive mode. Quantitation and identification of busulfan were made using multiple reaction monitoring (MRM) transitions. Isotopically labeled busulfan-d8 was used as the internal standard. The method is linear from 50 to 2500 ng/mL and has within-run and between-run imprecision of <10%. PMID:26660176

  3. Methods for a quantitative evaluation of odd-even staggering effects

    E-print Network

    Alessandro Olmi; Silvia Piantelli

    2015-06-08

    Odd-even effects, also known as "staggering" effects, are a common feature observed in the yield distributions of fragments produced in different types of nuclear reactions. We review old methods, and propose new ones, for a quantitative estimation of these effects as a function of the proton or neutron number of the reaction products. All methods are compared on the basis of Monte Carlo simulations. We find that some are not well suited for the task, the most reliable ones being those based either on a non-linear fit with a properly oscillating function or on a third (or fourth) finite difference approach. In any case, high statistics are of paramount importance to avoid spurious structures appearing merely because of statistical fluctuations in the data and of strong correlations among the yields of neighboring fragments.
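
    As a rough illustration of the third-finite-difference approach mentioned above, the sketch below computes a staggering estimate from the logarithm of a yield distribution. The 1/8 normalization and the sign convention follow common usage in the staggering literature and may differ from the paper's exact definitions; the input arrays are assumed to hold consecutive charges and their yields.

        # Minimal sketch of a third-finite-difference odd-even staggering estimator.
        import numpy as np

        def third_difference_staggering(Z, Y):
            """Z: consecutive integer charges; Y: corresponding yields (> 0)."""
            lnY = np.log(np.asarray(Y, dtype=float))
            delta = []
            for i in range(len(Z) - 3):
                d3 = lnY[i + 3] - 3 * lnY[i + 2] + 3 * lnY[i + 1] - lnY[i]
                delta.append(0.125 * (-1) ** (Z[i] + 1) * d3)   # evaluated at Z[i] + 3/2
            return np.array(delta)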

  4. Standard test method for quantitative determination of americium 241 in plutonium by Gamma-Ray spectrometry

    E-print Network

    American Society for Testing and Materials. Philadelphia

    1994-01-01

    1.1 This test method covers the quantitative determination of americium 241 by gamma-ray spectrometry in plutonium nitrate solution samples that do not contain significant amounts of radioactive fission products or other high specific activity gamma-ray emitters. 1.2 This test method can be used to determine the americium 241 in samples of plutonium metal, oxide and other solid forms, when the solid is appropriately sampled and dissolved. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  5. A QUANTITATIVE, THREE-DIMENSIONAL METHOD FOR ANALYZING ROTATIONAL MOVEMENT FROM SINGLE-VIEW MOVIES

    PubMed

    Berg

    1994-06-01

    The study of animal movement is an important aspect of functional morphological research. The three-dimensional movements of (parts of) animals are usually recorded on two-dimensional film frames. For a quantitative analysis, the real movements should be reconstructed from their projections. If movements occur in one plane, their projection is distorted only if this plane is not parallel to the film plane. Provided that the parallel orientation of the movement with respect to the film plane is checked accurately, a two-dimensional method of analysis (ignoring projection errors) can be justified for quantitative analysis of planar movements. Films of movements of skeletal elements of the fish head have generally been analyzed with the two-dimensional method (e.g. Sibbing, 1982; Hoogenboezem et al. 1990; Westneat, 1990; Claes and de Vree, 1991), which is justifiable for planar movements. Unfortunately, the movements of the head bones of fish are often strongly non-planar, e.g. the movement of the pharyngeal jaws and the gill arches. The two-dimensional method is inappropriate for studying such complex movements (Sibbing, 1982; Hoogenboezem et al. 1990). For a qualitative description of movement patterns, the conditions for the use of the two-dimensional method may be somewhat relaxed. When two (or more) views of a movement are recorded simultaneously, the three-dimensional movements can readily be reconstructed using two two-dimensional images (e.g. Zarnack, 1972; Nachtigall, 1983; van Leeuwen, 1984; Drost and van den Boogaart, 1986). However, because of technical (and budget) limitations, simultaneous views of a movement cannot always be shot. In this paper, a method is presented for reconstructing the three-dimensional orientation and rotational movement of structures using single-view films and for calculating rotation in an object-bound frame. Ellington (1984) presented a similar method for determining three-dimensional wing movements from single-view films of flying insects. Ellington's method is based upon the bilateral symmetry of the wing movements. The present method does not depend on symmetry and can be applied to a variety of kinematic investigations. It eliminates a systematic error: the projection error. The measuring error is not discussed; it is the same in the two-dimensional and three-dimensional method of analysis. PMID:9317811

  6. Development of a rapid method for the quantitative determination of deoxynivalenol using Quenchbody.

    PubMed

    Yoshinari, Tomoya; Ohashi, Hiroyuki; Abe, Ryoji; Kaigome, Rena; Ohkawa, Hideo; Sugita-Konishi, Yoshiko

    2015-08-12

    Quenchbody (Q-body) is a novel fluorescent biosensor based on the antigen-dependent removal of a quenching effect on a fluorophore attached to antibody domains. In order to develop a method using Q-body for the quantitative determination of deoxynivalenol (DON), a trichothecene mycotoxin produced by some Fusarium species, anti-DON Q-body was synthesized from the sequence information of a monoclonal antibody specific to DON. When the purified anti-DON Q-body was mixed with DON, a dose-dependent increase in the fluorescence intensity was observed and the detection range was between 0.0003 and 3 mg L(-1). The coefficients of variation were 7.9% at 0.003 mg L(-1), 5.0% at 0.03 mg L(-1) and 13.7% at 0.3 mg L(-1), respectively. The limit of detection was 0.006 mg L(-1) for DON in wheat. The Q-body showed an antigen-dependent fluorescence enhancement even in the presence of wheat extracts. To validate the analytical method using Q-body, a spike-and-recovery experiment was performed using four spiked wheat samples. The recoveries were in the range of 94.9-100.2%. The concentrations of DON in twenty-one naturally contaminated wheat samples were quantitated by the Q-body method, LC-MS/MS and an immunochromatographic assay kit. The LC-MS/MS analysis showed that the levels of DON contamination in the samples were between 0.001 and 2.68 mg kg(-1). The concentrations of DON quantitated by LC-MS/MS were more strongly correlated with those using the Q-body method (R(2) = 0.9760) than the immunochromatographic assay kit (R(2) = 0.8824). These data indicate that the Q-body system for the determination of DON in wheat samples was successfully developed and Q-body is expected to have a range of applications in the field of food safety. PMID:26320967

  7. A convenient method for the quantitative determination of elemental sulfur in coal by HPLC analysis of perchloroethylene extracts

    USGS Publications Warehouse

    Buchanan, D.H.; Coombs, K.J.; Murphy, P.M.; Chaven, C.

    1993-01-01

    A convenient method for the quantitative determination of elemental sulfur in coal is described. Elemental sulfur is extracted from the coal with hot perchloroethylene (PCE) (tetrachloroethene, C2Cl4) and quantitatively determined by HPLC analysis on a C18 reverse-phase column using UV detection. Calibration solutions were prepared from sublimed sulfur. Results of quantitative HPLC analyses agreed with those of a chemical/spectroscopic analysis. The HPLC method was found to be linear over the concentration range of 6 × 10⁻⁴ to 2 × 10⁻² g/L. The lower detection limit was 4 × 10⁻⁴ g/L, which for a coal sample of 20 g is equivalent to 0.0006% by weight of coal. Since elemental sulfur is known to react slowly with hydrocarbons at the temperature of boiling PCE, standard solutions of sulfur in PCE were heated with coals from the Argonne Premium Coal Sample program. Pseudo-first-order uptake of sulfur by the coals was observed over several weeks of heating. For the Illinois No. 6 premium coal, the rate constant for sulfur uptake was 9.7 × 10⁻⁷ s⁻¹, too small for retrograde reactions between solubilized sulfur and coal to cause a significant loss in elemental sulfur isolated during the analytical extraction. No elemental sulfur was produced when the following pure compounds were heated to reflux in PCE for up to 1 week: benzyl sulfide, octyl sulfide, thiane, thiophene, benzothiophene, dibenzothiophene, sulfuric acid, or ferrous sulfate. A slurry of mineral pyrite in PCE contained elemental sulfur which increased in concentration with heating time. © 1993 American Chemical Society.
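
    A quick back-of-the-envelope check of the "too small to matter" claim: with the reported pseudo-first-order rate constant, the fraction of solubilized sulfur consumed during an extraction lasting about one hour (an assumed duration, not stated in the abstract) is well below 1%.

        # Back-of-the-envelope check; the 1 h extraction time is an assumption.
        import math

        k = 9.7e-7            # s^-1, pseudo-first-order rate constant (Illinois No. 6 coal)
        t = 3600.0            # s, assumed extraction time
        fraction_lost = 1.0 - math.exp(-k * t)
        print(f"fraction of solubilized sulfur lost: {fraction_lost:.2%}")   # ~0.35%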

  8. [Application of interval selection methods in quantitative analysis of multicomponent mixtures by terahertz time-domain spectroscopy].

    PubMed

    Chen, Tao; Li, Zhi; Mo, Wei; Hu, Fang-rong

    2014-12-01

    Interval selection methods combined with terahertz time-domain spectroscopy (THz-TDS) technique were used to perform quantitative analysis of component concentrations in multicomponent mixtures. The THz spectra of 100 quaternary pharmaceutical mixtures composed of lactose monohydrate, acetaminophen, microcrystalline cellulose and soluble starch were measured using THz-TDS system. Four spectral interval selection methods, including iPLS, mwPLS, siPLS and biPLS, were employed to select spectral intervals of THz absorbance spectra of multicomponent mixtures and correlate THz absorbance spectra with the concentrations of lactose monohydrate. The mwPLS method yielded the most accurate result as compared with the other three interval selection methods and full-spectrum PLS. The optimal mwPLS model was obtained with lower root mean square error of cross-validation (RMSECV) of 0.9803, lower root mean square error of prediction (RMSEP) of 1.1141, higher correlation coefficient for calibration (Rc) of 0.9960, and higher correlation coefficient for prediction (Rp) of 0.9951. Experimental results demonstrate that spectral interval selection combined with THz-TDS could be successfully applied as an accurate and rapid method to determine component concentrations in multicomponent mixtures. PMID:25881416
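
    As an illustration of the moving-window idea behind mwPLS, the sketch below scans a fixed-width window across the spectra and keeps the interval with the lowest cross-validated error. The window width, number of latent variables and fold count are arbitrary choices for the example, not the settings used in the paper.

        # Hedged sketch of a moving-window PLS (mwPLS) interval search.
        # X: (n_samples, n_points) THz absorbance spectra; y: component content.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        def moving_window_pls(X, y, width=30, n_components=5, cv=5):
            best_rmsecv, best_window = np.inf, None
            for start in range(X.shape[1] - width + 1):
                Xw = X[:, start:start + width]
                pls = PLSRegression(n_components=min(n_components, width))
                y_cv = cross_val_predict(pls, Xw, y, cv=cv).ravel()
                rmsecv = float(np.sqrt(np.mean((np.asarray(y) - y_cv) ** 2)))
                if rmsecv < best_rmsecv:
                    best_rmsecv, best_window = rmsecv, (start, start + width)
            return best_window, best_rmsecv   # selected spectral interval and its RMSECV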

  9. Quantitative ultrasound method for assessing stress-strain properties and the cross-sectional area of Achilles tendon

    NASA Astrophysics Data System (ADS)

    Du, Yi-Chun; Chen, Yung-Fu; Li, Chien-Ming; Lin, Chia-Hung; Yang, Chia-En; Wu, Jian-Xing; Chen, Tainsong

    2013-12-01

    The Achilles tendon is one of the most commonly injured tendons in the human body, with a variety of causes such as trauma, overuse and degeneration. Rupture and tendinosis are relatively common for this strong tendon. Stress-strain properties and shape change are important biomechanical properties of the tendon for assessing surgical repair or healing progress. Currently, there are rather limited non-invasive methods available for precisely quantifying the in vivo biomechanical properties of tendons. The aim of this study was to apply quantitative ultrasound (QUS) methods, including ultrasonic attenuation and speed of sound (SOS), to investigate porcine tendons under different stress-strain conditions. In order to find a reliable method to evaluate the change of tendon shape, ultrasound measurement was also utilized for measuring tendon thickness and compared with the change in tendon cross-sectional area under different stresses. A total of 15 porcine tendons of hind trotters were examined. The test results show that the attenuation and broadband ultrasound attenuation decreased, and the SOS increased by a smaller magnitude, as the uniaxial stress-strain loading on the tendons increased. Furthermore, the tendon thickness measured with the ultrasound method was significantly correlated with tendon cross-sectional area (Pearson coefficient = 0.86). These results also indicate that QUS attenuation and ultrasonic thickness measurement are reliable and potentially useful parameters for assessing the biomechanical properties of tendons. Further investigations are needed to warrant the application of the proposed method in a clinical setting.

  10. From "weight of evidence" to quantitative data integration using multicriteria decision analysis and Bayesian methods.

    PubMed

    Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas

    2015-01-01

    "Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multi criteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE towards a quantitative direction that includes Bayesian and multi criteria decision analysis. PMID:25592482

  11. Evaluation of residual antibacterial potency in antibiotic production wastewater using a real-time quantitative method.

    PubMed

    Zhang, Hong; Zhang, Yu; Yang, Min; Liu, Miaomiao

    2015-11-01

    While antibiotic pollution has attracted considerable attention due to its potential in promoting the dissemination of antibiotic resistance genes in the environment, the antibiotic activity of antibiotic-related substances has been neglected, which may underestimate the environmental impacts of antibiotic wastewater discharge. In this study, a real-time quantitative approach was established to evaluate the residual antibacterial potency of antibiotics and related substances in antibiotic production wastewater (APW) by comparing the growth of a standard bacterial strain (Staphylococcus aureus) in tested water samples with a standard reference substance (e.g. oxytetracycline). Antibiotic equivalent quantity (EQ) was used to express antibacterial potency, which made it possible to assess the contribution of each compound to the antibiotic activity in APW. The real-time quantitative method showed better repeatability (Relative Standard Deviation, RSD 1.08%) compared with the conventional fixed growth time method (RSD 5.62-11.29%). Its quantification limits ranged from 0.20 to 24.00 µg L(-1), depending on the antibiotic. We applied the developed method to analyze the residual potency of water samples from four APW treatment systems, and confirmed a significant contribution from antibiotic transformation products to potent antibacterial activity. Specifically, neospiramycin, a major transformation product of spiramycin, was found to contribute 13.15-22.89% of residual potency in spiramycin production wastewater. In addition, some unknown related substances with antimicrobial activity were indicated in the effluent. This developed approach will be effective for the management of antibacterial potency discharge from antibiotic wastewater and other waste streams. PMID:26395288

  12. Characterization of a method for quantitating food consumption for mutation assays in Drosophila

    SciTech Connect

    Thompson, E.D.; Reeder, B.A.; Bruce, R.D. )

    1991-01-01

    Quantitation of food consumption is necessary when determining mutation responses to multiple chemical exposures in the sex-linked recessive lethal assay in Drosophila. One method proposed for quantitating food consumption by Drosophila is to measure the incorporation of 14C-leucine into the flies during the feeding period. Three sources of variation in the technique of Thompson and Reeder have been identified and characterized. First, the amount of food consumed by individual flies differed by almost 30% in a 24 hr feeding period. Second, the variability from vial to vial (each containing multiple flies) was around 15%. Finally, the amount of food consumed in identical feeding experiments performed over the course of 1 year varied nearly 2-fold. The use of chemical consumption values in place of exposure levels provided a better means of expressing the combined mutagenic response. In addition, the kinetics of food consumption over a 3 day feeding period for exposures to cyclophosphamide which produce lethality were compared to non-lethal exposures. Extensive characterization of lethality induced by exposures to cyclophosphamide demonstrate that the lethality is most likely due to starvation, not chemical toxicity.

  13. Qualitative and quantitative characterization of protein-phosphoinositide interactions with liposome-based methods

    PubMed Central

    Busse, Ricarda A.; Scacioc, Andreea; Hernandez, Javier M.; Krick, Roswitha; Stephan, Milena; Janshoff, Andreas; Thumm, Michael; Kühnel, Karin

    2013-01-01

    We characterized phosphoinositide binding of the S. cerevisiae PROPPIN Hsv2 qualitatively with density flotation assays and quantitatively through isothermal titration calorimetry (ITC) measurements using liposomes. We discuss the design of these experiments and show with liposome flotation assays that Hsv2 binds with high specificity to both PtdIns3P and PtdIns(3,5)P2. We propose liposome flotation assays as a more accurate alternative to the commonly used PIP strips for the characterization of phosphoinositide-binding specificities of proteins. We further quantitatively characterized PtdIns3P binding of Hsv2 with ITC measurements and determined a dissociation constant of 0.67 µM and a stoichiometry of 2:1 for PtdIns3P binding to Hsv2. PtdIns3P is crucial for the biogenesis of autophagosomes and their precursors. Besides the PROPPINs there are other PtdIns3P binding proteins with a link to autophagy, which includes the FYVE-domain containing proteins ZFYVE1/DFCP1 and WDFY3/ALFY and the PX-domain containing proteins Atg20 and Snx4/Atg24. The methods described could be useful tools for the characterization of these and other phosphoinositide-binding proteins. PMID:23445924

  14. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method

    PubMed Central

    Yang, Ganglong; Xu, Zhipeng; Lu, Wei; Li, Xiang; Sun, Chengwen; Guo, Jia; Xue, Peng; Guan, Feng

    2015-01-01

    The best way to increase patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low grade nonmuscle invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography LTQ Orbitrap mass spectrometry. Among 3721 unique identified and annotated proteins in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer. PMID:26230496

  15. Depth determination for shallow teleseismic earthquakes Methods and results

    SciTech Connect

    Stein, S.; Wiens, D.A.

    1986-11-01

    Contemporary methods used to determine depths of moderate-sized shallow teleseismic earthquakes are described. These include techniques based on surface wave spectra, and methods which estimate focal depth from the waveforms of body waves. The advantages of different methods and their limitations are discussed, and significant results for plate tectonics, obtained in the last five years by the application of these methods, are presented. 119 references.

  16. Evaluation of a rapid, quantitative real-time PCR method for enumeration of pathogenic Candida cells in water

    USGS Publications Warehouse

    Brinkman, Nichole E.; Haugland, Richard A.; Wymer, Larry J.; Byappanahalli, Muruleedhara N.; Whitman, Richard L.; Vesper, Stephen J.

    2003-01-01

    Quantitative PCR (QPCR) technology, incorporating fluorigenic 5' nuclease (TaqMan) chemistry, was utilized for the specific detection and quantification of six pathogenic species of Candida (C. albicans, C. tropicalis, C. krusei, C. parapsilosis, C. glabrata and C. lusitaniae) in water. Known numbers of target cells were added to distilled and tap water samples, filtered, and disrupted directly on the membranes for recovery of DNA for QPCR analysis. The assay's sensitivities were between one and three cells per filter. The accuracy of the cell estimates was between 50 and 200% of their true value (95% confidence level). In similar tests with surface water samples, the presence of PCR inhibitory compounds necessitated further purification and/or dilution of the DNA extracts, with resultant reductions in sensitivity but generally not in quantitative accuracy. Analyses of a series of freshwater samples collected from a recreational beach showed positive correlations between the QPCR results and colony counts of the corresponding target species. Positive correlations were also seen between the cell quantities of the target Candida species detected in these analyses and colony counts of Enterococcus organisms. With a combined sample processing and analysis time of less than 4 h, this method shows great promise as a tool for rapidly assessing potential exposures to waterborne pathogenic Candida species from drinking and recreational waters and may have applications in the detection of fecal pollution.
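
    The quantification step in this kind of TaqMan assay typically rests on a standard curve relating cycle threshold (Ct) to log cell number. The sketch below shows that calibration-and-inversion step with placeholder Ct and cell values, not data from the study.

        # Illustrative standard-curve quantification (placeholder values):
        # calibrate Ct against log10(cell number), then invert for unknowns.
        import numpy as np

        log_cells = np.log10([1, 10, 100, 1000, 10000])     # known spiked cell numbers
        ct = np.array([36.8, 33.5, 30.1, 26.7, 23.4])       # measured cycle thresholds

        slope, intercept = np.polyfit(log_cells, ct, 1)     # Ct = slope*log10(N) + intercept

        def cells_from_ct(ct_sample):
            return 10 ** ((ct_sample - intercept) / slope)

        print(f"estimated cells on filter: {cells_from_ct(31.2):.0f}")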

  17. Polymorphism in nimodipine raw materials: development and validation of a quantitative method through differential scanning calorimetry.

    PubMed

    Riekes, Manoela Klüppel; Pereira, Rafael Nicolay; Rauber, Gabriela Schneider; Cuffini, Silvia Lucia; de Campos, Carlos Eduardo Maduro; Silva, Marcos Antonio Segatto; Stulzer, Hellen Karine

    2012-11-01

    Due to the physical-chemical and therapeutic impacts of polymorphism, its monitoring in raw materials is necessary. The purpose of this study was to develop and validate a quantitative method to determine the polymorphic content of nimodipine (NMP) raw materials based on differential scanning calorimetry (DSC). The polymorphs required for the development of the method were characterized through DSC, X-ray powder diffraction (XRPD) and Raman spectroscopy and their polymorphic identity was confirmed. The developed method was found to be linear, robust, precise, accurate and specific. Three different samples obtained from distinct suppliers (NMP 1, NMP 2 and NMP 3) were firstly characterized through XRPD and DSC as polymorphic mixtures. The determination of their polymorphic identity revealed that all samples presented the Modification I (Mod I) or metastable form in greatest proportion. Since the commercial polymorph is Mod I, the polymorphic characteristic of the samples analyzed needs to be investigated. Thus, the proposed method provides a useful tool for the monitoring of the polymorphic content of NMP raw materials. PMID:22795312

  18. Isotonic Regression Based-Method in Quantitative High-Throughput Screenings for Genotoxicity

    PubMed Central

    Fujii, Yosuke; Narita, Takeo; Tice, Raymond Richard; Takeda, Shunich

    2015-01-01

    Quantitative high-throughput screenings (qHTSs) for genotoxicity are conducted as part of comprehensive toxicology screening projects. The most widely used method is to compare the dose-response data of a wild-type and DNA repair gene knockout mutants, using model-fitting to the Hill equation (HE). However, this method performs poorly when the observed viability does not fit the equation well, as frequently happens in qHTS. More capable methods must be developed for qHTS where large data variations are unavoidable. In this study, we applied an isotonic regression (IR) method and compared its performance with HE under multiple data conditions. When dose-response data were suitable to draw HE curves with upper and lower asymptotes and experimental random errors were small, HE was better than IR, but when random errors were big, there was no difference between HE and IR. However, when the drawn curves did not have two asymptotes, IR showed better performance (p < 0.05, exact paired Wilcoxon test) with higher specificity (65% in HE vs. 96% in IR). In summary, IR performed similarly to HE when dose-response data were optimal, whereas IR clearly performed better in suboptimal conditions. These findings indicate that IR would be useful in qHTS for comparing dose-response data.
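
    The two dose-response treatments compared in the study can be sketched as follows. The Hill parameterization, the example doses and the viability values are illustrative assumptions, not the authors' data or code.

        # Illustrative comparison of Hill-equation fitting and isotonic regression
        # for a qHTS dose-response curve (placeholder data).
        import numpy as np
        from scipy.optimize import curve_fit
        from sklearn.isotonic import IsotonicRegression

        def hill(logc, bottom, top, log_ac50, slope):
            return bottom + (top - bottom) / (1.0 + 10 ** ((logc - log_ac50) * slope))

        logc = np.array([-9.0, -8.0, -7.0, -6.0, -5.0, -4.0])   # log10 molar dose
        viab = np.array([1.02, 0.98, 0.95, 0.70, 0.35, 0.20])   # normalized viability

        popt, _ = curve_fit(hill, logc, viab, p0=[0.2, 1.0, -6.0, 1.0], maxfev=10000)
        hill_fit = hill(logc, *popt)

        # Isotonic regression assumes only a monotone (here non-increasing) response.
        iso_fit = IsotonicRegression(increasing=False).fit_transform(logc, viab)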

  19. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    NASA Astrophysics Data System (ADS)

    Ryan, C. G.; Laird, J. S.; Fisher, L. A.; Kirkham, R.; Moorhead, G. F.

    2015-11-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.

  20. A simple, quantitative method using alginate gel to determine rat colonic tumor volume in vivo.

    PubMed

    Irving, Amy A; Young, Lindsay B; Pleiman, Jennifer K; Konrath, Michael J; Marzella, Blake; Nonte, Michael; Cacciatore, Justin; Ford, Madeline R; Clipson, Linda; Amos-Landgraf, James M; Dove, William F

    2014-04-01

    Many studies of the response of colonic tumors to therapeutics use tumor multiplicity as the endpoint to determine the effectiveness of the agent. These studies can be greatly enhanced by accurate measurements of tumor volume. Here we present a quantitative method to easily and accurately determine colonic tumor volume. This approach uses a biocompatible alginate to create a negative mold of a tumor-bearing colon; this mold is then used to make positive casts of dental stone that replicate the shape of each original tumor. The weight of the dental stone cast correlates highly with the weight of the dissected tumors. After refinement of the technique, overall error in tumor volume was 16.9% ± 7.9% and includes error from both the alginate and dental stone procedures. Because this technique is limited to molding of tumors in the colon, we utilized the Apc(Pirc/+) rat, which has a propensity for developing colonic tumors that reflect the location of the majority of human intestinal tumors. We have successfully used the described method to determine tumor volumes ranging from 4 to 196 mm³. Alginate molding combined with dental stone casting is a facile method for determining tumor volume in vivo without costly equipment or knowledge of analytic software. This broadly accessible method creates the opportunity to objectively study colonic tumors over time in living animals in conjunction with other experiments and without transferring animals from the facility where they are maintained. PMID:24674588

  1. Quantitative radiochemical method for determination of major sources of natural radioactivity in ores and minerals

    USGS Publications Warehouse

    Rosholt, J.N., Jr.

    1954-01-01

    When an ore sample contains radioactivity other than that attributable to the uranium series in equilibrium, a quantitative analysis of the other emitters must be made in order to determine the source of this activity. Thorium-232, radon-222, and lead-210 have been determined by isolation and subsequent activity analysis of some of their short-lived daughter products. The sulfides of bismuth and polonium are precipitated out of solutions of thorium or uranium ores, and the α-particle activity of polonium-214, polonium-212, and polonium-210 is determined by scintillation-counting techniques. Polonium-214 activity is used to determine radon-222, polonium-212 activity for thorium-232, and polonium-210 for lead-210. The development of these methods of radiochemical analysis will facilitate the rapid determination of some of the major sources of natural radioactivity.

  2. A quantitative electroencephalographic method for xenobiotic screening in the canine model.

    PubMed

    Jones, R D; Greufe, N P

    1994-08-01

    A method, using quantitative electroencephalography (qEEG), was developed for screening xenobiotics in conjunction with neurological examinations, for defining toxicodynamic profiles of certain drugs and chemicals in the dog. The standard 10-channel montage was used to evaluate normotonic, auditory, visual, and somatosensory cortical activity. Compressed spectral analysis and Fourier Transformation determined spectral edge frequencies, distribution of the total, fractional and absolute powers of delta, theta, alpha, and beta frequencies as parameters. The alpha 2-agonist, xylazine, was used to detect treatment-related differences, threshold effect levels, and qEEG-target parameters. An increase in theta and alpha activity, and a shift to lower spectral edge frequency were noted. Visual stimulation was the least sensitive test condition in detecting significant changes in measured parameters. Data derived by qEEG may make a reliable contribution to the physiologic interpretation, along with biochemical, clinical, and pathological data collected during a regulatory study. PMID:7949380
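
    Two of the qEEG parameters named above, fractional band powers and the spectral edge frequency, can be computed from a single channel roughly as follows. The band limits, the 95% edge definition and the Welch settings are common choices assumed for the example, not necessarily those used in the study.

        # Hedged sketch of fractional band powers and a 95% spectral edge frequency
        # for one EEG channel, using Welch's power spectral density estimate.
        import numpy as np
        from scipy.signal import welch

        def qeeg_parameters(x, fs):
            """x: one EEG channel (1-D array); fs: sampling rate in Hz."""
            f, pxx = welch(x, fs=fs, nperseg=int(2 * fs))
            bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
            total = pxx[(f >= 0.5) & (f < 30)].sum()
            fractional = {name: pxx[(f >= lo) & (f < hi)].sum() / total
                          for name, (lo, hi) in bands.items()}
            cum = np.cumsum(pxx) / pxx.sum()
            sef95 = f[np.searchsorted(cum, 0.95)]     # 95% spectral edge frequency
            return fractional, sef95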

  3. Cerenkov radiation imaging as a method for quantitative measurements of beta particles in a microfluidic chip

    PubMed Central

    Cho, Jennifer S; Taschereau, Richard; Olma, Sebastian; Liu, Kan; Chen, Yi-Chun; Shen, Clifton K-F; van Dam, R Michael; Chatziioannou, Arion F.

    2009-01-01

    It has been observed that microfluidic chips used for synthesizing 18F-labeled compounds demonstrate visible light emission without nearby scintillators or fluorescent materials. The origin of the light was investigated and found to be consistent with the emission characteristics from Cerenkov radiation. Since 18F decays through the emission of high-energy positrons, the energy threshold for beta particles, i.e., electrons or positrons, to generate Cerenkov radiation was calculated for water and polydimethylsiloxane (PDMS), the most commonly used polymer-based material for microfluidic chips. Beta particles emitted from 18F have a continuous energy spectrum, with a maximum energy that exceeds this energy threshold for both water and PDMS. In addition, the spectral characteristics of the emitted light from 18F in distilled water were also measured, yielding a broad distribution from 300 nm to 700 nm, with higher intensity at shorter wavelengths. A photograph of the 18F solution showed a bluish-white light emitted from the solution, further suggesting Cerenkov radiation. In this study, the feasibility of using this Cerenkov light emission as a method for quantitative measurements of the radioactivity within the microfluidic chip in situ was evaluated. A detector previously developed for imaging microfluidic platforms was used. The detector consisted of a charge coupled device (CCD) optically coupled to a lens. The system spatial resolution, minimum detectable activity and dynamic range were evaluated. In addition, a calibration of Cerenkov signal versus activity concentration in the microfluidic chip was determined. This novel method of Cerenkov radiation measurements will provide researchers with a simple yet robust quantitative imaging tool for microfluidic applications utilizing beta particles. PMID:19847018
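
    The energy-threshold argument in the abstract can be reproduced with a short calculation: a beta particle radiates Cerenkov light when its speed exceeds c/n. The refractive indices used below (water ~1.33, PDMS ~1.43) are typical assumed values, not numbers quoted from the paper.

        # Worked check of the Cerenkov kinetic-energy threshold for beta particles.
        import math

        M_E_C2 = 0.511  # electron rest energy, MeV

        def cerenkov_threshold_mev(n):
            gamma_thr = 1.0 / math.sqrt(1.0 - 1.0 / n ** 2)   # threshold Lorentz factor (beta = 1/n)
            return M_E_C2 * (gamma_thr - 1.0)

        for name, n in [("water", 1.33), ("PDMS", 1.43)]:
            print(f"{name}: {cerenkov_threshold_mev(n) * 1000:.0f} keV")
        # The 18F beta+ endpoint energy (~634 keV) exceeds both thresholds,
        # consistent with the observed light emission.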

  4. Development of an Analytical Method for Quantitative Determination of Atmospheric Particles By Laap-TOF Instrument

    NASA Astrophysics Data System (ADS)

    Gemayel, R.; Temime-Roussel, B.; Hellebust, S.; Gligorovski, S.; Wortham, H.

    2014-12-01

    A comprehensive understanding of the chemical composition of atmospheric particles is of paramount importance in order to understand their impact on health and climate. Hence, there is an imperative need for the development of appropriate analytical methods for on-line, time-resolved measurements of atmospheric particles. Laser Ablation Aerosol Particle Time of Flight Mass Spectrometry (LAAP-TOF-MS) allows real-time qualitative analysis of nanoparticles of differing composition and size. LAAP-TOF-MS is aimed at on-line and continuous measurements of atmospheric particles with a fast time resolution on the order of milliseconds. The system uses a 193 nm excimer laser for particle ablation/ionization and a 403 nm scattering laser for sizing (and single particle detection/triggering). The charged ions are then extracted into a bi-polar Time-of-Flight mass spectrometer. Here we present an analytical methodology for quantitative determination of the composition and size distribution of particles by the LAAP-TOF instrument. We developed and validated an analytical methodology for this high time resolution instrument by comparison with conventional analysis systems with lower time resolution (electron microscopy, optical counters…), with the final aim of rendering the methodology quantitative. This was performed with the aid of other instruments for on-line and off-line measurement, such as a Scanning Mobility Particle Sizer and electron microscopy. Validation of the analytical method was performed under laboratory conditions by detection and identification of the targeted main particle types such as SiO2, CeO2, and TiO2.

  5. A novel method for quantitative determination of tea polysaccharide by resonance light scattering

    NASA Astrophysics Data System (ADS)

    Wei, Xinlin; Xi, Xionggang; Wu, Muxia; Wang, Yuanfeng

    2011-09-01

    A new method for the determination of tea polysaccharide (TPS) in green tea (Camellia sinensis) leaves has been developed. The method was based on the enhancement of resonance light scattering (RLS) of TPS in the presence of a cetylpyridinium chloride (CPC)-NaOH system. Under the optimum conditions, the RLS intensity of CPC was greatly enhanced by adding TPS. The maximum peak of the enhanced RLS spectra was located at 484.02 nm. The enhanced RLS intensity was proportional to the concentration of TPS in the range of 2.0-20 µg/ml. Measurements of standard compounds showed that the new method and the phenol-sulfuric acid method give equivalent results. The recoveries of the two methods were 96.39-103.7% (novel method) and 100.15-103.65% (phenol-sulfuric acid method), respectively. However, the two methods differed to some extent. The new method offered a limit of detection (LOD) of 0.047 µg/ml, whereas the phenol-sulfuric acid method gives a LOD of 1.54 µg/ml. Interference experiments demonstrated that the new method has high selectivity and is more suitable for the determination of TPS than the phenol-sulfuric acid method. Stability tests showed that the new method has good stability. Moreover, the proposed method offers the advantages of easy operation, rapidity and practicability, suggesting that it can be satisfactorily applied to the determination of TPS in green tea.

  6. A novel method for quantitative determination of tea polysaccharide by resonance light scattering.

    PubMed

    Wei, Xinlin; Xi, Xionggang; Wu, Muxia; Wang, Yuanfeng

    2011-09-01

    A new method for the determination of tea polysaccharide (TPS) in green tea (Camellia sinensis) leaves has been developed. The method was based on the enhancement of resonance light scattering (RLS) of TPS in the presence of a cetylpyridinium chloride (CPC)-NaOH system. Under the optimum conditions, the RLS intensity of CPC was greatly enhanced by adding TPS. The maximum peak of the enhanced RLS spectra was located at 484.02 nm. The enhanced RLS intensity was proportional to the concentration of TPS in the range of 2.0-20 µg/ml. Measurements of standard compounds showed that the new method and the phenol-sulfuric acid method give equivalent results. The recoveries of the two methods were 96.39-103.7% (novel method) and 100.15-103.65% (phenol-sulfuric acid method), respectively. However, the two methods differed to some extent. The new method offered a limit of detection (LOD) of 0.047 µg/ml, whereas the phenol-sulfuric acid method gives a LOD of 1.54 µg/ml. Interference experiments demonstrated that the new method has high selectivity and is more suitable for the determination of TPS than the phenol-sulfuric acid method. Stability tests showed that the new method has good stability. Moreover, the proposed method offers the advantages of easy operation, rapidity and practicability, suggesting that it can be satisfactorily applied to the determination of TPS in green tea. PMID:21571584

  7. A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime

    PubMed Central

    Fitterer, Jessica L.; Nelson, Trisalyn A.

    2015-01-01

    Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers, we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and to identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the predominant modelling choice (n = 78), though many variations existed depending on the data. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks). PMID:26418016

  8. Quantitative Analysis of the Lamellarity of Giant Liposomes Prepared by the Inverted Emulsion Method

    PubMed Central

    Chiba, Masataka; Miyazaki, Makito; Ishiwata, Shin’ichi

    2014-01-01

    The inverted emulsion method is used to prepare giant liposomes by pushing water-in-oil droplets through the oil/water interface into an aqueous medium. Due to the high encapsulation efficiency of proteins under physiological conditions and the simplicity of the protocol, it has been widely used to prepare various cell models. However, the lamellarity of liposomes prepared by this method has not been evaluated quantitatively. Here, we prepared liposomes that were partially stained with a fluorescent dye, and analyzed their fluorescence intensity under an epifluorescence microscope. The fluorescence intensities of the membranes of individual liposomes were plotted against their diameter. The plots showed discrete distributions, which were classified into several groups. The group with the lowest fluorescence intensity was determined to be unilamellar by monitoring the exchangeability of the inner and the outer solutions of the liposomes in the presence of the pore-forming toxin α-hemolysin. Increasing the lipid concentration dissolved in oil increased the number of liposomes ~100 times. However, almost all the liposomes were unilamellar even at saturating lipid concentrations. We also investigated the effects of lipid composition and liposome content, such as highly concentrated actin filaments and Xenopus egg extracts, on the lamellarity of the liposomes. Remarkably, over 90% of the liposomes were unilamellar under all conditions examined. We conclude that the inverted emulsion method can be used to efficiently prepare giant unilamellar liposomes and is useful for designing cell models. PMID:25028876

  9. Quantitatively estimating defects in graphene devices using discharge current analysis method

    PubMed Central

    Jung, Ukjin; Lee, Young Gon; Kang, Chang Goo; Lee, Sangchul; Kim, Jin Ju; Hwang, Hyeon June; Lim, Sung Kwan; Ham, Moon-Ho; Lee, Byoung Hun

    2014-01-01

    Defects of graphene are the most important concern for the successful applications of graphene since they affect device performance significantly. However, once the graphene is integrated in the device structures, the quality of graphene and the surrounding environment can only be assessed using indirect information such as hysteresis, mobility and drive current. Here we develop a discharge current analysis method to measure the quality of graphene integrated in a field effect transistor structure by analyzing the discharge current, and examine its validity using various device structures. The density of charging sites affecting the performance of a graphene field effect transistor, obtained using the discharge current analysis method, was on the order of 10¹⁴/cm², which closely correlates with the intensity ratio of the D to G bands in Raman spectroscopy. The graphene FETs fabricated on poly(ethylene naphthalate) (PEN) are found to have a lower density of charging sites than those on a SiO2/Si substrate, mainly due to reduced interfacial interaction between the graphene and the PEN. This method can be an indispensable means to improve the stability of devices using graphene, as it provides an accurate and quantitative way to define the quality of graphene after device fabrication. PMID:24811431

  10. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Pavlov, K.A.

    1997-01-01

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing. © 1997 American Institute of Physics.

  11. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...

  12. Determination of Calcium in Cereal with Flame Atomic Absorption Spectroscopy: An Experiment for a Quantitative Methods of Analysis Course

    ERIC Educational Resources Information Center

    Bazzi, Ali; Kreuz, Bette; Fischer, Jeffrey

    2004-01-01

    An experiment for determination of calcium in cereal using two-increment standard addition method in conjunction with flame atomic absorption spectroscopy (FAAS) is demonstrated. The experiment is intended to introduce students to the principles of atomic absorption spectroscopy giving them hands on experience using quantitative methods of…

  13. Monochloramine disinfection kinetics of Nitrosomonas europaea by propidium monoazide quantitative PCR and Live/Dead BacLight Methods

    EPA Science Inventory

    Monochloramine disinfection kinetics were determined for the pure culture ammonia-oxidizing bacterium Nitrosomonas europaea (ATCC 19718) by two culture independent methods: (1) LIVE/DEAD® BacLight™ (LD) and (2) propidium monoazide quantitative PCR (PMA-qPCR). Both methods were f...

  14. Understanding Variation in Treatment Effects in Education Impact Evaluations: An Overview of Quantitative Methods. NCEE 2014-4017

    ERIC Educational Resources Information Center

    Schochet, Peter Z.; Puma, Mike; Deke, John

    2014-01-01

    This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…

  15. Fourier transform infrared least-squares methods for the quantitative analysis of multicomponent mixtures of airborne vapors of industrial hygiene concern

    SciTech Connect

    Li-shi, Y.; Levine, S.P.

    1989-04-01

    Air monitoring methods suitable for use in the workplace, though accurate for monitoring individual compounds or classes of compounds, cannot be used to monitor several compounds or classes of compounds simultaneously. In the past few years, Fourier transform infrared (FT-IR) spectroscopy has been investigated for use as a method for multicomponent quantitative analysis. This work focuses on quantitative analysis of six mixtures in ambient air. The concentration ranges of the two- to six-component mixtures are from 50 ppm to 100 ppb. The selection of the optimal least-squares fit (LSF) method, the choice of background reference file, and the choice of quantitative peak windows were evaluated in this effort. The quantitative results of the six mixtures were accurate at the 50, 10, and 1 ppm levels. There were some components for which the analysis was also accurate at the 0.1 ppm level. The data indicate that the LSF program could be used to quantify strongly overlapping multicomponent mixtures. The results support the conclusion that the FT-IR spectrometer is appropriate for the direct quantification of multicomponent mixtures of many airborne gases and vapors of industrial hygiene concern.
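
    The core of a least-squares quantitation of this kind can be sketched as a classical least-squares problem: the mixture absorbance over the chosen peak windows is modeled as a linear combination of unit-concentration reference spectra plus a baseline term. The array shapes and the simple constant-baseline handling below are illustrative assumptions, not the paper's LSF program.

        # Hedged classical least-squares sketch for multicomponent IR quantitation.
        import numpy as np

        def lsf_concentrations(mix_spectrum, ref_spectra):
            """mix_spectrum: (n_points,) absorbance of the mixture;
            ref_spectra: (n_points, n_components) unit-concentration reference absorbances."""
            # design matrix: reference spectra plus a constant baseline column
            K = np.column_stack([ref_spectra, np.ones(len(mix_spectrum))])
            coeffs, *_ = np.linalg.lstsq(K, mix_spectrum, rcond=None)
            return coeffs[:-1]   # estimated concentrations (last coefficient is the baseline)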

  16. A new quantitative method to evaluate the in vitro bioactivity of melt and sol-gel-derived silicate glasses.

    PubMed

    Arcos, D; Greenspan, D C; Vallet-Regí, M

    2003-06-01

    Two melt-derived glasses (45S5 and 60S) and four sol-gel glasses (58S, 68S, 77S, and 91S) have been synthesized. The activation energy for the silicon release was determined, and a very close correlation was observed between this value and published results of the bioactive behavior of the glasses. This relationship can be explained in terms of the influence of chemical composition, textural properties, and structural density on the silanol group formation and silicon dissolution. These measurements provide a quantitative method to evaluate the in vitro bioactivity of SiO(2)-based glasses. Preliminary studies suggest an activation energy gap (Ea) of 0.35-0.5 eV as a boundary between bioactive and nonbioactive glasses. PMID:12746881
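
    An activation energy of the kind reported here is typically extracted from an Arrhenius plot of dissolution rate versus temperature. The sketch below shows that fit with placeholder rate values (chosen only so the result lands near the quoted 0.35-0.5 eV range); the temperatures and rates are not data from the paper.

        # Hedged Arrhenius fit for an activation energy in eV; values are placeholders.
        import numpy as np

        KB_EV = 8.617e-5  # Boltzmann constant, eV/K

        T = np.array([310.0, 330.0, 350.0, 370.0])          # K, assumed test temperatures
        k = np.array([1.2e-6, 3.0e-6, 6.8e-6, 1.4e-5])      # silicon release rate, arbitrary units

        slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)   # ln k = ln A - Ea/(kB*T)
        Ea = -slope * KB_EV
        print(f"Ea ~ {Ea:.2f} eV")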

  17. Comparison of concentration methods for rapid detection of hookworm ova in wastewater matrices using quantitative PCR.

    PubMed

    Gyawali, P; Ahmed, W; Jagals, P; Sidhu, J P S; Toze, S

    2015-12-01

    Hookworm infection accounts for around 700 million infections worldwide, especially in developing nations, due to the increased use of wastewater for crop production. The effective recovery of hookworm ova from wastewater matrices is difficult due to their low concentrations and heterogeneous distribution. In this study, we compared the recovery rates of (i) four rapid hookworm ova concentration methods for municipal wastewater, and (ii) two concentration methods for sludge samples. Ancylostoma caninum ova were used as a surrogate for the human hookworms (Ancylostoma duodenale and Necator americanus). Known concentrations of A. caninum ova were seeded into wastewater (treated and raw) and sludge samples collected from two wastewater treatment plants (WWTPs) in Brisbane and Perth, Australia. The A. caninum ova were concentrated from treated and raw wastewater samples using centrifugation (Method A), hollow fiber ultrafiltration (HFUF) (Method B), filtration (Method C) and flotation (Method D). For sludge samples, flotation (Method E) and direct DNA extraction (Method F) were used. Among the four methods tested, the filtration method (Method C) was able to consistently recover higher concentrations of A. caninum ova from treated wastewater (39-50%) and raw wastewater (7.1-12%) samples collected from both WWTPs. The remaining methods (Methods A, B and D) yielded variable recovery rates ranging from 0.2 to 40% for treated and raw wastewater samples. The recovery rates for sludge samples were poor (0.02-4.7%), although Method F (direct DNA extraction) provided a 1-2 orders of magnitude higher recovery rate than Method E (flotation). Based on our results it can be concluded that the recovery rates of hookworm ova from wastewater matrices, especially sludge samples, can be poor and highly variable. Therefore, the choice of concentration method is vital for the sensitive detection of hookworm ova in wastewater matrices. PMID:26358269

  18. 4D Seismic Monitoring at the Ketzin Pilot Site during five years of storage - Results and Quantitative Assessment

    NASA Astrophysics Data System (ADS)

    Lüth, Stefan; Ivanova, Alexandra; Ivandic, Monika; Götz, Julia

    2015-04-01

    The Ketzin pilot site for geological CO2 storage was operative between June 2008 and August 2013. In this period, 67 kt of CO2 were injected (Martens et al., this conference). Repeated 3D seismic monitoring surveys were performed before and during CO2 injection. A third repeat survey, providing data from the post-injection phase, is currently being prepared for the autumn of 2015. The large-scale 3D surface seismic measurements have been complemented by other geophysical and geochemical monitoring methods, among which are high-resolution seismic surface-downhole observations. These observations concentrate on the reservoir area in the vicinity of the injection well and provide high-resolution images as well as data for petrophysical quantification of the CO2 distribution in the reservoir. The Ketzin pilot site is a saline aquifer site in an onshore environment, which poses specific challenges for reliable monitoring of the injected CO2. Although much effort was made to keep acquisition conditions as identical as possible, a high degree of repeatability noise was observed, mainly due to varying weather conditions and to variations in the acquisition geometries for logistical reasons. Nevertheless, time-lapse processing succeeded in generating 3D time-lapse data sets which could be interpreted in terms of storage-related amplitude variations in the depth range of the storage reservoir. The time-lapse seismic data, pulsed-neutron-gamma logging results (saturation), and petrophysical core measurements were interpreted together in order to estimate the amount of injected carbon dioxide imaged by the seismic repeat data. For the first repeat survey, the mass estimate summed to 20.5 kt, approximately 7% less than what had been injected by then. For the second repeat survey, the mass estimate was approximately 10-15% less than what had been injected. The deviations may be explained by several sources of uncertainty, and by partial dissolution of the injected CO2, which reduces the amount of free gas detectable by seismic time-lapse observations. These quantitative assessment studies have shown that conformity between injected and estimated CO2 quantities can only be established with some degree of uncertainty, which needs to be quantified for a realistic assessment of conformity.

  19. The Case Study of an F/OSS Virtualization Platform Deployment and Quantitative Results

    NASA Astrophysics Data System (ADS)

    Stathopoulos, Panagiotis; Soumplis, Alexandros; Houssos, Nikos

    In this paper we present practical experiences and results from the deployment of an F/OSS virtualization platform. EKT’s (NDC) core IT infrastructure was transformed into a virtualized one, using exclusively F/OSS, under severe budget and timing constraints. The migration was initiated in order to better cope with EKT’s service requirements while at the same time accommodating the need for in-house development of a large-scale open access infrastructure. The benefits derived from this migration were not only generic virtualization benefits, such as quantifiably reduced power consumption and cost reduction through consolidation, but also benefits specific to F/OSS virtualization.

  20. Development and validation of an event-specific quantitative PCR method for genetically modified maize MIR162.

    PubMed

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2014-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize event, MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500), for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases were less than 25% and the RSDr values were less than 20% at all evaluated concentrations. These results suggested that the limit of quantitation of the method was 0.5%, and that the developed method would thus be suitable for practical analyses for the detection and quantification of MIR162. PMID:25743383
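
    A minimal sketch of the ratio-based quantification step this abstract refers to, assuming the commonly used formula in which GM content is the event-to-endogenous copy-number ratio divided by the conversion factor Cf; the copy numbers below are illustrative placeholders, not data from the study.

```python
def gmo_percent(event_copies: float, endogenous_copies: float, cf: float) -> float:
    """GM content (%) from event-specific and endogenous (taxon-specific) qPCR copy
    numbers, using the instrument-specific conversion factor Cf. Assumes the standard
    ratio-based formula used in event-specific quantitative PCR methods."""
    return (event_copies / endogenous_copies) / cf * 100.0

# Illustrative values only (Cf for the ABI7900 as reported above):
print(gmo_percent(event_copies=1.2e3, endogenous_copies=3.4e5, cf=0.697))
```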

  1. Design and Performance Considerations for the Quantitative Measurement of HEU Residues Resulting from 99Mo Production

    SciTech Connect

    McElroy, Robert Dennis; Chapman, Jeffrey Allen; Bogard, James S; Belian, Anthony P

    2011-01-01

    Molybdenum-99 is produced by the irradiation of high-enriched uranium (HEU) resulting in the accumulation of large quantities of HEU residues. In general, these residues are not recycled but are either disposed of or stored in containers with surface exposure rates as high as 100 R/h. The 235U content of these waste containers must be quantified for both accountability and waste disposal purposes. The challenges of quantifying such difficult-to-assay materials are discussed, along with performance estimates for each of several potential assay options. In particular, the design and performance of a High Activity Active Well Coincidence Counting (HA-AWCC) system designed and built specifically for these irradiated HEU waste materials are presented.

  2. Perspectives of Speech-Language Pathologists on the Use of Telepractice in Schools: Quantitative Survey Results

    PubMed Central

    Tucker, Janice K.

    2012-01-01

    This research surveyed 170 school-based speech-language pathologists (SLPs) in one northeastern state, with only 1.8% reporting telepractice use in school settings. These results were consistent with two ASHA surveys (2002; 2011) that reported limited use of telepractice for school-based speech-language pathology. In the present study, willingness to use telepractice was inversely related to age, perhaps because younger members of the profession are more accustomed to using technology. Overall, respondents were concerned about the validity of assessments administered via telepractice; whether clinicians can adequately establish rapport with clients via telepractice; and whether therapy conducted via telepractice can be as effective as in-person speech-language therapy. Most respondents indicated the need to establish procedures and guidelines for school-based telepractice programs. PMID:25945204

  3. Quantitative Equation-of-State Results from Isentropic Compression Experiments to Multimegabar Pressures

    NASA Astrophysics Data System (ADS)

    Davis, Jean-Paul

    2005-07-01

    Isentropic ramp-wave loading of condensed matter has long been hailed as a possible experimental technique for obtaining accurate equation-of-state (EOS) data in the solid phase at relatively low temperatures and multimegabar pressures. In this range of pressure, isothermal diamond-anvil techniques have limited accuracy due to reliance on theoretical EOS of calibration standards, so accurate isentropic compression data would help immensely in constraining EOS models. An isentropic compression technique developed using the Z Machine at Sandia as a magnetic drive has been extended to the multimegabar regime by recent advances in current-pulse shaping. Diagnostics typically consist of time-resolved velocity interferometry to monitor the back surfaces of samples having different thickness but subjected to the same magnetic loading. Extraction of a stress-density curve from such data requires that the experiment be designed to avoid coupling (during the time of interest) between the back surface and the joule-heated region where the stress wave is generated. Uncertainty in the result at multimegabar pressure is dominated by uncertainty in the transit time, or difference in arrival times for a particular velocity, between samples of different thickness. After a brief discussion of experiment design issues, a detailed analysis of data on aluminum to 240 GPa will be presented, followed by some results from recent experiments on other materials. * Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000
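
    As a rough illustration of how a stress-density path follows from transit-time differences between samples of different thickness, the sketch below applies textbook Lagrangian ramp-wave analysis. It assumes the interferometry records have already been converted to in-situ particle velocity and ignores the window, edge, and magnetohydrodynamic corrections the real analysis requires; all names and units are illustrative.

```python
import numpy as np

def lagrangian_ramp_analysis(u, t_thin, t_thick, dh, rho0):
    """Textbook Lagrangian analysis of ramp-wave transit times.

    u       : grid of in-situ particle velocity (km/s), increasing
    t_thin  : arrival time of each u at the thin sample (microseconds)
    t_thick : arrival time of each u at the thick sample (microseconds)
    dh      : thickness difference between the samples (mm)
    rho0    : initial density (g/cm^3)

    Returns Lagrangian wave speed cL(u) in km/s, longitudinal stress in GPa
    (sigma = rho0 * integral of cL du) and density in g/cm^3
    (rho = rho0 / (1 - integral of du/cL)) along the compression path.
    """
    u = np.asarray(u, dtype=float)
    cL = dh / (np.asarray(t_thick, dtype=float) - np.asarray(t_thin, dtype=float))
    du = np.diff(u)
    stress = rho0 * np.concatenate(([0.0], np.cumsum(0.5 * (cL[1:] + cL[:-1]) * du)))
    strain = np.concatenate(([0.0], np.cumsum(0.5 * (1.0 / cL[1:] + 1.0 / cL[:-1]) * du)))
    density = rho0 / (1.0 - strain)
    return cL, stress, density
```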

  4. Comparison of the scanning linear estimator (SLE) and ROI methods for quantitative SPECT imaging

    NASA Astrophysics Data System (ADS)

    Könik, Arda; Kupinski, Meredith; Hendrik Pretorius, P.; King, Michael A.; Barrett, Harrison H.

    2015-08-01

    In quantitative emission tomography, tumor activity is typically estimated from calculations on a region of interest (ROI) identified in the reconstructed slices. In these calculations, unpredictable bias arising from the null functions of the imaging system affects ROI estimates. The magnitude of this bias depends upon the tumor size and location. In prior work it has been shown that the scanning linear estimator (SLE), which operates on the raw projection data, is an unbiased estimator of activity when the size and location of the tumor are known. In this work, we performed analytic simulation of SPECT imaging with a parallel-hole medium-energy collimator. Distance-dependent system spatial resolution and non-uniform attenuation were included in the imaging simulation. We compared the task of activity estimation by the ROI and SLE methods for a range of tumor sizes (diameter: 1-3 cm) and activities (contrast ratio: 1-10) added to uniform and non-uniform liver backgrounds. Using the correct value for the tumor shape and location is an idealized approximation to how task estimation would occur clinically. Thus we determined how perturbing this idealized prior knowledge impacted the performance of both techniques. To implement the SLE for the non-uniform background, we used a novel iterative algorithm for pre-whitening stationary noise within a compact region. Estimation task performance was compared using the ensemble mean-squared error (EMSE) as the criterion. The SLE method performed substantially better than the ROI method (i.e. EMSE(SLE) was 23-174 times lower) when the background was uniform and the tumor location and size were known accurately. The variance of the SLE increased when a non-uniform liver texture was introduced, but EMSE(SLE) continued to be 5-20 times lower than that of the ROI method. In summary, SLE outperformed ROI under almost all conditions that we tested.
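
    The contrast between the two estimators can be sketched roughly as below: an ROI estimate sums reconstructed voxels, while a linear estimator of the SLE type operates on the raw projections with a known signal template and noise statistics. This is a simplified, hedged illustration; the actual SLE also scans over candidate tumor locations and sizes and uses the iterative pre-whitening described above, both omitted here.

```python
import numpy as np

def roi_estimate(recon_image, roi_mask, voxel_volume=1.0):
    """Conventional estimate: sum of reconstructed voxel values inside the ROI."""
    return float(recon_image[roi_mask].sum()) * voxel_volume

def linear_estimate(g, s, b_mean, k_inv):
    """Linear activity estimate from raw projection data g, given a signal template s
    (projection of a unit-activity tumor at a fixed, known location and size), the mean
    background b_mean, and the inverse noise covariance k_inv. The scanning linear
    estimator additionally maximizes such estimates over candidate locations."""
    return float((s @ k_inv @ (g - b_mean)) / (s @ k_inv @ s))
```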

  5. Infrared Fluorescence for Vascular Barrier Breach In Vivo – A Novel Method for Quantitation of Albumin Efflux

    PubMed Central

    von Drygalski, Annette; Furla-Freguia, Christian; Mosnier, Laurent O.; Yegneswaran, Subramanian; Ruf, Wolfram; Griffin, John H.

    2013-01-01

    Vascular hyperpermeability contributes to morbidity in inflammation. Current methodologies for in vivo assessment of permeability based on extravasation of Evans Blue (EB)-bound albumin are cumbersome and often lack sensitivity. We developed a novel infrared fluorescence (IRF) methodology for measurement of EB-albumin extravasation to quantify vascular permeability in murine models. Vascular permeability induced by endotoxemia was examined for all solid organs, brain, skin and peritoneum by IRF and the traditional absorbance-based measurement of EB in tissue extracts. Organ IRF increased linearly with increasing concentrations of i.v. EB (2.5-25 mg/kg). Tissue IRF was more sensitive for EB accumulation compared to the absorbance-based method. Accordingly, differences in vascular permeability and organ EB accumulation between lipopolysaccharide-treated and saline-treated mice were often significant when analyzed by IRF-based detection but not by absorbance-based detection. EB was detected in all 353 organs analyzed with IRF but only in 67% (239/353) of organs analyzed by absorbance-based methodology, demonstrating improved sensitivity of EB detection in organs with IRF. In contrast, EB in plasma after EB administration was readily measured by both methods with high correlation between the two methods (n=116, r2=0.86). Quantitation of organ-specific EB-IRF differences due to endotoxin was optimal when IRF was compared between mice matched for weight, gender, and age, and with appropriate corrections for organ weight and EB plasma concentrations. Notably, EB-IRF methodology leaves organs intact for subsequent histopathology. In summary, EB-IRF is a novel, highly sensitive, rapid, and convenient method for the relative quantification of EB in intact organs of treatment versus control mice. PMID:23052565

  6. Quantitative Results from Shockless Compression Experiments on Solids to Multi-Megabar Pressure

    NASA Astrophysics Data System (ADS)

    Davis, Jean-Paul; Brown, Justin; Knudson, Marcus; Lemke, Raymond

    2015-03-01

    Quasi-isentropic, shockless ramp-wave experiments promise accurate equation-of-state (EOS) data in the solid phase at relatively low temperatures and multi-megabar pressures. In this range of pressure, isothermal diamond-anvil techniques have limited pressure accuracy due to reliance on theoretical EOS of calibration standards, thus accurate quasi-isentropic compression data would help immensely in constraining EOS models. Multi-megabar shockless compression experiments using the Z Machine at Sandia as a magnetic drive with stripline targets continue to be performed on a number of solids. New developments will be presented in the design and analysis of these experiments, including topics such as 2-D and magneto-hydrodynamic (MHD) effects and the use of LiF windows. Results will be presented for tantalum and/or gold metals, with comparisons to independently developed EOS. * Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  7. Semi-quantitative characterisation of ambient ultrafine aerosols resulting from emissions of coal fired power stations.

    PubMed

    Hinkley, J T; Bridgman, H A; Buhre, B J P; Gupta, R P; Nelson, P F; Wall, T F

    2008-02-25

    Emissions from coal fired power stations are known to be a significant anthropogenic source of fine atmospheric particles, both through direct primary emissions and through secondary formation of sulfate and nitrate from emissions of gaseous precursors. However, there is relatively little information available in the literature regarding the contribution these emissions make to the ambient aerosol, particularly in the ultrafine size range. In this study, the contribution of emissions to particles smaller than 0.3 μm in the ambient aerosol was examined at a sampling site 7 km from two large Australian coal fired power stations equipped with fabric filters. A novel approach was employed using conditional sampling based on sulfur dioxide (SO2) as an indicator species, and a relatively new sampler, the TSI Nanometer Aerosol Sampler. Samples were collected on transmission electron microscope (TEM) grids and examined using a combination of TEM imaging and energy dispersive X-ray (EDX) analysis for qualitative chemical analysis. The ultrafine aerosol in low-SO2 conditions was dominated by diesel soot from vehicle emissions, while significant quantities of particles that were unstable under the electron beam were observed in the high-SO2 samples. The behaviour of these particles was consistent with literature accounts of sulfate and nitrate species, believed to have been derived from precursor emissions from the power stations. A significant carbon peak was noted in the residues from the evaporated particles, suggesting that some secondary organic aerosol formation may also have been catalysed by these acid seed particles. No primary particulate material was observed in the sub-0.3 μm fraction. The results of this study indicate that the contribution of species more commonly associated with gas-to-particle conversion may be more significant than expected, even close to the source. PMID:18054995

  8. Novel Method for Automated Analysis of Retinal Images: Results in Subjects with Hypertensive Retinopathy and CADASIL.

    PubMed

    Cavallari, Michele; Stamile, Claudio; Umeton, Renato; Calimeri, Francesco; Orzi, Francesco

    2015-01-01

    Morphological analysis of the retinal vessels by fundoscopy provides noninvasive means for detecting and staging systemic microvascular damage. However, full exploitation of fundoscopy in clinical settings is limited by the paucity of quantitative, objective information obtainable through the observer-driven evaluations currently employed in routine practice. Here, we report on the development of a semiautomated, computer-based method to assess retinal vessel morphology. The method allows simultaneous and operator-independent quantitative assessment of arteriole-to-venule ratio, tortuosity index, and mean fractal dimension. The method was implemented in two conditions known to be associated with retinal vessel changes: hypertensive retinopathy and Cerebral Autosomal Dominant Arteriopathy with Subcortical Infarcts and Leukoencephalopathy (CADASIL). The results showed that our approach is effective in detecting and quantifying the retinal vessel abnormalities. Arteriole-to-venule ratio, tortuosity index, and mean fractal dimension were altered in the subjects with hypertensive retinopathy or CADASIL with respect to age- and gender-matched controls. The interrater reliability was excellent for all three indices (intraclass correlation coefficient ≥ 85%). The method represents a simple and highly reproducible means for discriminating pathological conditions characterized by morphological changes of the retinal vessels. The advantages of our method include simultaneous and operator-independent assessment of different parameters and improved reliability of the measurements. PMID:26167496

  10. A method for quantitative analysis of clump thickness in cervical cytology slides.

    PubMed

    Fan, Yilun; Bradley, Andrew P

    2016-01-01

    Knowledge of the spatial distribution and thickness of cytology specimens is critical to the development of digital slide acquisition techniques that minimise both scan times and image file size. In this paper, we evaluate a novel method to achieve this goal utilising an exhaustive high-resolution scan, an over-complete wavelet transform across multi-focal planes and a clump segmentation of all cellular materials on the slide. The method is demonstrated with a quantitative analysis of ten normal, but difficult to scan, Pap-stained Thin-prep cervical cytology slides. We show that with this method the top and bottom of the specimen can be estimated to an accuracy of 1 μm in 88% and 97% of the fields of view respectively. Overall, cellular material can be over 30 μm thick and the distribution of cells is skewed towards the cover-slip (top of the slide). However, the median clump thickness is 10 μm and only 31% of clumps contain more than three nuclei. Therefore, by finding a focal map of the specimen the number of 1 μm spaced focal planes that are required to be scanned to acquire 95% of the in-focus material can be reduced from 25.4 to 21.4 on average. In addition, we show that by considering the thickness of the specimen, an improved focal map can be produced which further reduces the required number of 1 μm spaced focal planes to 18.6. This has the potential to reduce scan times and raw image data by over 25%. PMID:26477005

  11. Application of stability-indicating HPTLC method for quantitative determination of metadoxine in pharmaceutical dosage form.

    PubMed

    Kaul, Neeraj; Agrawal, Himani; Patil, Bharat; Kakad, Abhijit; Dhaneshwar, S R

    2005-04-01

    A sensitive, selective, precise and stability-indicating high-performance thin-layer chromatographic method for analysis of metadoxine both as a bulk drug and in formulations was developed and validated. The method employed TLC aluminium plates precoated with silica gel 60F-254 as the stationary phase. The solvent system consisted of acetone-chloroform-methanol-ammonia (7.0:4.0:3.0:1.2, v/v/v/v). Densitometric analysis of metadoxine was carried out in the absorbance mode at 315 nm. This system was found to give compact spots for metadoxine (Rf value of 0.45 ± 0.02 for six replicates). Metadoxine was subjected to acid, alkali and neutral hydrolysis, oxidation, dry and wet heat treatment, and photo and UV degradation. The drug undergoes degradation under all stress conditions. The degraded products were well resolved from the pure drug, with significantly different Rf values. The method was validated for linearity, precision, robustness, LOD, LOQ, specificity and accuracy. Linearity was found to be in the range of 100-1500 ng/spot with a significantly high correlation coefficient (r2 = 0.9997 ± 1.02). The linear regression analysis data for the calibration plots showed a good linear relationship, with r2 = 0.9999 ± 0.58 in the working concentration range of 200-700 ng/spot. The mean values of slope and intercept were 0.11 ± 0.04 and 18.73 ± 1.89, respectively. The limits of detection and quantitation were 50 and 100 ng/spot, respectively. Statistical analysis proves that the method is repeatable and specific for the estimation of the drug. As the method could effectively separate the drug from its degradation products, it can be employed as a stability-indicating one. Moreover, the proposed HPTLC method was used to investigate the kinetics of the acid and base degradation processes: Arrhenius plots were constructed and activation energies were calculated for the acid and base degradation processes, respectively. PMID:15848212
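
    The kinetics step mentioned at the end of the abstract can be illustrated with a generic Arrhenius fit: regress ln k against 1/T and take the activation energy from the slope. This is a hedged sketch with placeholder rate constants, not values from the paper.

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

def activation_energy_kj_per_mol(temps_kelvin, rate_constants):
    """Activation energy from an Arrhenius plot: fit ln k = ln A - Ea/(R*T),
    so the slope of ln k versus 1/T equals -Ea/R."""
    slope, _intercept = np.polyfit(1.0 / np.asarray(temps_kelvin, dtype=float),
                                   np.log(np.asarray(rate_constants, dtype=float)), 1)
    return -slope * R / 1000.0

# Placeholder first-order degradation rate constants (1/min) at three temperatures:
print(activation_energy_kj_per_mol([313.0, 323.0, 333.0], [1.2e-4, 2.9e-4, 6.5e-4]))
```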

  12. Semi-quantitative determination of the modes of occurrence of elements in coal: Results from an International Round Robin Project

    SciTech Connect

    Willett, J.C.; Finkelman, R.B.; Mroczkowski, S.J.; Palmer, C.A.; Kolker, A.

    1999-07-01

    Quantifying the modes of occurrence of elements in coal is necessary for the development of models to predict an element's behavior during in-ground leaching, weathering, coal cleaning, and combustion. Anticipating the behavior of the trace elements is necessary for evaluating the environmental and human health impacts, technological impacts, and economic byproduct potential of coal use. To achieve the goal of quantifying element modes of occurrence, an international round robin project was initiated. Four bituminous coal samples (from the United States, England, Australia and Canada) were distributed to participating laboratories (9 labs from 5 countries) for analysis. Preliminary results indicate that there is good agreement among six laboratories for the chemical analyses. Using selective leaching, quantitative electron microprobe analyses, and semi-quantitative X-ray diffraction, the authors found that many elements have similar modes of occurrence in all four samples. For example, at least 75% of the Al, K, and Li and about 50% of Be, Sc, V, and Cr are leached by HF. Because HF dissolves silicates, the authors infer that these elements reside in the clays. As, Hg, Cu, Zn, Cd, and Pb are leached primarily by HCl and HNO3, indicating that they are associated with mono-sulfides (such as sphalerite and galena) and di-sulfides (pyrite). Leaching results indicate that small amounts of these metals may be associated with clays and organics. Iron behaves differently among the samples, likely due to different proportions of iron in sulfide, carbonate, and silicate phases. Results from the other laboratories (using selective leaching and density separations) appear to be consistent with these results.

  13. Effect of platform, reference material, and quantification model on enumeration of Enterococcus by quantitative PCR methods

    EPA Science Inventory

    Quantitative polymerase chain reaction (qPCR) is increasingly being used for the quantitative detection of fecal indicator bacteria in beach water. QPCR allows for same-day health warnings, and its application is being considered as an option for recreational water quality testi...

  14. Quantitative determination of zopiclone and its impurity by four different spectrophotometric methods

    NASA Astrophysics Data System (ADS)

    Abdelrahman, Maha M.; Naguib, Ibrahim A.; El Ghobashy, Mohamed R.; Ali, Nesma A.

    2015-02-01

    Four simple, sensitive and selective spectrophotometric methods are presented for determination of Zopiclone (ZPC) and its impurity, one of its degradation products, namely 2-amino-5-chloropyridine (ACP). Method A is dual-wavelength spectrophotometry, in which two wavelengths (252 and 301 nm for ZPC, and 238 and 261 nm for ACP) were selected for each component in such a way that the difference in absorbance is zero for the other component. Method B is an isoabsorptive ratio method, combining the isoabsorptive point (259.8 nm) in the ratio spectrum, using ACP as a divisor, with the ratio difference for a single-step determination of both components. Method C is a third-derivative (D3) spectrophotometric method which allows determination of ZPC at 283.6 nm and ACP at 251.6 nm without interference from each other. Method D is based on measuring the peak amplitude of the first derivative of the ratio spectra (DD1) at 263.2 nm for ZPC and 252 nm for ACP. The suggested methods were validated according to ICH guidelines and can be applied for routine analysis in quality control laboratories. Statistical analysis of the results obtained from the proposed methods and those obtained from the reported method revealed high accuracy and good precision.

  15. Multiplexed LC-MS/MS method for the simultaneous quantitation of three novel hepatitis C antivirals, daclatasvir, asunaprevir, and beclabuvir in human plasma.

    PubMed

    Jiang, Hao; Kandoussi, Hamza; Zeng, Jianing; Wang, Jian; Demers, Roger; Eley, Timothy; He, Bing; Burrell, Richard; Easter, John; Kadiyala, Pathanjali; Pursley, Janice; Cojocaru, Laura; Baker, Chanda; Ryan, John; Aubry, Anne-Françoise; Arnold, Mark E

    2015-03-25

    Dual or triple combination regimens of novel hepatitis C direct-acting antivirals (DAA; daclatasvir, asunaprevir, or beclabuvir) provide high sustained virological response rates and a reduced frequency of resistance compared to clinical monotherapy. To support pharmacokinetic (PK) assessments in clinical studies, a multiplexed liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the simultaneous quantitation of daclatasvir, asunaprevir, beclabuvir (BMS-791325) and its active metabolite (BMS-794712) in human plasma was developed and validated. Human plasma samples were extracted with methyl-t-butyl ether followed by LC-MS/MS analysis, which was conducted in multiple reaction monitoring (MRM) mode. The lower limits of quantitation (LLOQ) were 1 ng/mL for daclatasvir, asunaprevir, and BMS-794712, and 2 ng/mL for beclabuvir. Intra-run precision (≤4.5% CV), inter-run precision (≤2.9% CV), and accuracy (±5.3% deviation) based on different concentration levels (low, geometric mean, mid and high) of the quality control samples (QCs) provided evidence of the method's accuracy and precision. Selectivity and matrix effect on LC-MS/MS detection, stability in plasma, and potential interference of coadministered drugs (ribavirin and interferon) were all evaluated and the results were acceptable. Method reproducibility was demonstrated by the reanalysis of a portion of study samples. The cross-validation results for QCs demonstrated the equivalency between this method and two single-analyte methods previously validated for quantitation of daclatasvir in human plasma. This approach of using a multiplexed LC-MS/MS method for the simultaneous quantitation of three DAAs is time- and cost-effective, and can maintain good data quality in sample analysis. PMID:25676854

  16. Toxicity of ionic liquids: database and prediction via quantitative structure-activity relationship method.

    PubMed

    Zhao, Yongsheng; Zhao, Jihong; Huang, Ying; Zhou, Qing; Zhang, Xiangping; Zhang, Suojiang

    2014-08-15

    A comprehensive database on the toxicity of ionic liquids (ILs) is established. The database includes over 4000 pieces of data. Based on the database, the relationship between an IL's structure and its toxicity has been analyzed qualitatively. Furthermore, quantitative structure-activity relationship (QSAR) models are constructed to predict the toxicities (EC50 values) of various ILs toward the leukemia rat cell line IPC-81. Four parameters selected by the heuristic method (HM) are used to perform the multiple linear regression (MLR) and support vector machine (SVM) studies. The squared correlation coefficients (R2) and root mean square errors (RMSE) of the training sets for the two QSAR models are 0.918 and 0.959, and 0.258 and 0.179, respectively. For the test sets, the prediction R2 and RMSE are 0.892 and 0.329 for the MLR model, and 0.958 and 0.234 for the SVM model, respectively. The nonlinear model developed with the SVM algorithm clearly outperformed MLR, which indicates that the SVM model is more reliable for predicting the toxicity of ILs. This study also shows that increasing the relative number of O atoms in a molecule leads to a decrease in the toxicity of ILs. PMID:24996150
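
    A hedged sketch of the modelling workflow described above (multiple linear regression versus support-vector regression on molecular descriptors, scored by R2 and RMSE on a held-out set). The descriptor matrix here is random placeholder data, not the curated IL database, and the model settings are assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Placeholder data: 4 descriptors (standing in for those picked by the heuristic
# method) and a synthetic log EC50 response with a mildly nonlinear component.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = X @ np.array([0.8, -0.5, 0.3, 0.1]) + 0.3 * np.tanh(X[:, 0] * X[:, 1]) \
    + rng.normal(scale=0.2, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

for name, model in [("MLR", LinearRegression()), ("SVM", SVR(kernel="rbf", C=10.0))]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = float(np.sqrt(mean_squared_error(y_te, pred)))
    print(f"{name}: R2 = {r2_score(y_te, pred):.3f}, RMSE = {rmse:.3f}")
```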

  17. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, Robert V. (Tijeras, NM)

    1993-01-01

    The present invention is a quantitative method for measuring the total heat flux, and of deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infra-red sensing devices.

  18. A simple method for the quantitative microextraction of polychlorinated biphenyls from soils and sediments.

    SciTech Connect

    Szostek, B.; Tinklenberg, J. A.; Aldstadt, J. H., III; Environmental Research

    1999-01-01

    We demonstrate the quantitative extraction of polychlorinated biphenyls (PCBs) from environmental solids by using a microscale adaptation of pressurized fluid extraction (μPFE). The stainless steel extraction cells are filled with a solid sample and solvent and are heated at elevated temperature. After cooling the cell to room temperature, we determined PCBs in the extract by direct injection into a gas chromatograph with an electron capture detection system. This extraction method was tested on a set of PCB-spiked solid matrices and on a PCB-contaminated river sediment (NIST SRM 1939). Recoveries were measured for eight PCB congeners spiked into two soil types with hexane extraction at 100 °C (81.9 ± 5.4% to 112.5 ± 10.1%). The extraction of SRM 1939 with hexane at 300 °C provided significantly higher recoveries for several representative PCB congeners than reported for a duplicate 16-hour Soxhlet extraction with a mixture of organic solvents (acetone/hexane).

  19. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, R.V.

    1993-03-16

    The present invention is a quantitative method for measuring the total heat flux, and of deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infrared sensing devices.

  20. A model independent method for quantitative estimation of $SU(3)$ flavor symmetry breaking using Dalitz plot

    E-print Network

    Dibyakrupa Sahoo; Rahul Sinha; N. G. Deshpande

    2015-02-25

    The light hadron states are satisfactorily described in the quark model using $SU(3)$ flavor symmetry. If the $SU(3)$ flavor symmetry relating the light hadrons were exact, one would have an exchange symmetry between these hadrons arising from the exchange of the up, down and strange quarks. This aspect of $SU(3)$ symmetry is used extensively to relate many decay modes of heavy quarks. However, the nature of the effects of $SU(3)$ breaking in such decays is not well understood and hence a reliable estimate of $SU(3)$ breaking effects is missing. In this work we propose a new method to quantitatively estimate the extent of flavor symmetry breaking and to better understand the nature of such breaking using the Dalitz plot. We study the three non-commuting $SU(2)$ symmetries (subsumed in $SU(3)$ flavor symmetry): isospin (or $T$-spin), $U$-spin and $V$-spin, using the Dalitz plots of some three-body meson decays. We look at the Dalitz plot distributions of decays in which pairs of the final three particles are related by two distinct $SU(2)$ symmetries. We show that such decay modes have characteristic distributions that enable the measurement of the violation of each of the three $SU(2)$ symmetries via Dalitz plot asymmetries in a single decay mode. Experimental estimates of these easily measurable asymmetries would help in better understanding the weak decays of heavy mesons into both two and three light mesons.
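
    A simplified illustration of one such observable: a population asymmetry of Dalitz-plot events about the line where the two invariant-mass-squared variables of an exchange-related meson pair are equal. This is a hedged sketch only; the asymmetries actually defined in the paper may differ.

```python
import numpy as np

def exchange_asymmetry(m2_12, m2_13):
    """Population asymmetry of Dalitz-plot events about the line m^2_12 = m^2_13,
    for two final-state mesons related by an SU(2) subgroup (T-, U- or V-spin).
    Exact symmetry would give a value consistent with zero; a significant nonzero
    value signals symmetry breaking. Simplified observable, for illustration only."""
    m2_12 = np.asarray(m2_12, dtype=float)
    m2_13 = np.asarray(m2_13, dtype=float)
    n_above = np.count_nonzero(m2_12 > m2_13)
    n_below = np.count_nonzero(m2_12 < m2_13)
    return (n_above - n_below) / (n_above + n_below)
```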

  1. Challenges of Interdisciplinary Research: Reconciling Qualitative and Quantitative Methods for Understanding Human-Landscape Systems

    NASA Astrophysics Data System (ADS)

    Lach, Denise

    2014-01-01

    While interdisciplinary research is increasingly practiced as a way to transcend the limitations of individual disciplines, our concepts and methods are primarily rooted in the disciplines that shape the way we think about the world and how we conduct research. While natural and social scientists may share a general understanding of how science is conducted, disciplinary differences in methodologies quickly emerge during interdisciplinary research efforts. This paper briefly reviews the different philosophical underpinnings of quantitative and qualitative methodological approaches and introduces the idea that a pragmatic, realistic approach may allow natural and social scientists to work together productively. While realism assumes that there is a reality that exists independently of our perceptions, the work of scientists is to explore the mechanisms by which actions cause meaningful outcomes and the conditions under which the mechanisms can act. Our task as interdisciplinary researchers is to use the insights of our disciplines in the context of the problem to co-produce an explanation for the variables of interest. Research on qualities necessary for successful interdisciplinary researchers is also discussed, along with recent efforts by funding agencies and academia to increase capacities for interdisciplinary research.

  2. Toward a quantitative account of pitch distribution in spontaneous narrative: Method and validation

    PubMed Central

    Matteson, Samuel E.; Streit Olness, Gloria; Caplow, Nancy J.

    2013-01-01

    Pitch is well-known both to animate human discourse and to convey meaning in communication. The study of the statistical population distributions of pitch in discourse will undoubtedly benefit from methodological improvements. The current investigation examines a method that parameterizes pitch in discourse as musical pitch interval H measured in units of cents and that disaggregates the sequence of peak word-pitches using tools employed in time-series analysis and digital signal processing. The investigators test the proposed methodology by its application to distributions in pitch interval of the peak word-pitch (collectively called the discourse gamut) that occur in simulated and actual spontaneous emotive narratives obtained from 17 middle-aged African-American adults. The analysis, in rigorous tests, not only faithfully reproduced simulated distributions imbedded in realistic time series that drift and include pitch breaks, but the protocol also reveals that the empirical distributions exhibit a common hidden structure when normalized to a slowly varying mode (called the gamut root) of their respective probability density functions. Quantitative differences between narratives reveal the speakers' relative propensity for the use of pitch levels corresponding to elevated degrees of a discourse gamut (the “e-la”) superimposed upon a continuum that conforms systematically to an asymmetric Laplace distribution. PMID:23654400
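
    The distributional claim in the last sentence can be illustrated by fitting an asymmetric Laplace distribution to root-normalized pitch intervals. The sketch below uses SciPy's laplace_asymmetric (available from SciPy 1.6 onward) and synthetic placeholder data rather than the narratives analysed in the study.

```python
import numpy as np
from scipy.stats import laplace_asymmetric

def fit_pitch_distribution(pitch_intervals_cents):
    """Maximum-likelihood fit of an asymmetric Laplace distribution to peak
    word-pitch intervals (in cents) measured relative to the gamut root.
    Returns the shape (kappa), location and scale parameters."""
    return laplace_asymmetric.fit(np.asarray(pitch_intervals_cents, dtype=float))

# Synthetic data drawn from an asymmetric Laplace, then re-fitted as a sanity check:
sample = laplace_asymmetric.rvs(0.7, loc=0.0, scale=150.0, size=2000, random_state=0)
print(fit_pitch_distribution(sample))
```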

  3. Development of Screening Method for an Frail Elderly by Measurement Quantitative Lower Limb Muscular Strength

    NASA Astrophysics Data System (ADS)

    Yamashita, Kazuhiko; Iwakami, Yumi; Imaizumi, Kazuya; Sato, Mitsuru; Nakajima, Sawako; Ino, Shuichi; Kawasumi, Masashi; Ifukube, Tohru

    Falling is one of the most serious problems for the elderly. The aim of this study was to develop a screening method for identifying factors that increase the risk of falling among the elderly, particularly with regard to lower limb muscular strength. Subjects were 48 elderly volunteers, including 25 classed as healthy and 23 classed as frail. All subjects underwent measurement of lower limb muscular strength via toe gap force and measurement of muscle strength of the hip joint adductor via knee gap force. In the frail group, toe gap force of the right foot was 20% lower than in the healthy group, toe gap force of the left foot was 23% lower, and knee gap force was 20% lower. Furthermore, we found that combining left toe gap force and knee gap force gave the highest odds ratio (6.05), with 82.6% sensitivity and 56.0% specificity, when the toe gap force cutoff was 24 N and the knee gap force cutoff was 100 N. Thus, lower limb muscular strength can be used for simple and efficient screening, and approaches to prevent falls can be based on quantitative data such as lower limb muscular strength.
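
    A hedged sketch of how the reported sensitivity, specificity, and odds ratio follow from applying the two cutoffs to individual measurements. The rule that flags a subject when both forces fall below their cutoffs is an assumption (the abstract only states the cutoffs), and the function and variable names are illustrative.

```python
import numpy as np

def screening_metrics(left_toe_force_n, knee_force_n, is_frail,
                      toe_cutoff=24.0, knee_cutoff=100.0):
    """Sensitivity, specificity and odds ratio of a screen that flags a subject
    when left toe gap force and knee gap force are both below the cutoffs (N).
    The AND combination rule is an assumption, not stated in the abstract."""
    toe = np.asarray(left_toe_force_n, dtype=float)
    knee = np.asarray(knee_force_n, dtype=float)
    frail = np.asarray(is_frail, dtype=bool)
    flagged = (toe < toe_cutoff) & (knee < knee_cutoff)
    tp = np.count_nonzero(flagged & frail)
    fn = np.count_nonzero(~flagged & frail)
    fp = np.count_nonzero(flagged & ~frail)
    tn = np.count_nonzero(~flagged & ~frail)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    odds_ratio = (tp * tn) / (fp * fn) if fp and fn else float("inf")
    return sensitivity, specificity, odds_ratio
```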

  4. Spectral simulation methods for enhancing qualitative and quantitative analyses based on infrared spectroscopy and quantitative calibration methods for passive infrared remote sensing of volatile organic compounds

    NASA Astrophysics Data System (ADS)

    Sulub, Yusuf Ismail

    Infrared spectroscopy (IR) has over the years found a myriad of applications, including passive environmental remote sensing of toxic pollutants and the development of a blood glucose sensor. In this dissertation, the capabilities of both these applications are further enhanced with data analysis strategies employing digital signal processing and novel simulation approaches. Both quantitative and qualitative determinations of volatile organic compounds are investigated in the passive IR remote sensing research described in this dissertation. In the quantitative work, partial least-squares (PLS) regression analysis is used to generate multivariate calibration models for passive Fourier transform IR remote sensing measurements of open-air generated vapors of ethanol in the presence of methanol as an interfering species. A step-wise co-addition scheme coupled with a digital filtering approach is used to attenuate the effects of variation in optical path length or plume width. For the qualitative study, an IR imaging line scanner is used to acquire remote sensing data in both spatial and spectral domains. This technology is capable of not only identifying but also locating the sample under investigation. Successful implementation of this methodology is hampered by the huge costs incurred in conducting these experiments and the impracticality of acquiring large amounts of representative training data. To address this problem, a novel simulation approach is developed that generates training data based on synthetic analyte-active and measured analyte-inactive data. Subsequently, automated pattern classifiers are generated using piecewise linear discriminant analysis to predict the presence of the analyte signature in measured imaging data acquired in remote sensing applications. Near-infrared glucose determinations based on the 5000-4000 cm-1 region are the focus of the research in the latter part of this dissertation. A six-component aqueous matrix of glucose in the presence of five other interferent species, all spanning physiological levels, is analyzed quantitatively. Multivariate PLS regression analysis, in conjunction with samples designated as a calibration set, is used to formulate models for predicting glucose concentrations. Variations in the instrumental response caused by drift and environmental factors are observed to degrade the performance of these models. As a remedy, a model updating approach based on spectral simulation is developed that is highly successful in eliminating the adverse effects of non-chemical variations.
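
    A hedged sketch of the PLS calibration step used in the glucose work: fit a PLS model to calibration spectra and report a cross-validated prediction error. The spectra below are random placeholders with a weak synthetic glucose signature, and the number of latent variables is an arbitrary choice, not the dissertation's.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Placeholder "spectra": 60 samples x 400 channels standing in for the 5000-4000 cm-1
# region; glucose concentrations in mg/dL. Real data would come from the instrument
# and the six-component aqueous matrix described above.
rng = np.random.default_rng(2)
glucose = rng.uniform(50.0, 400.0, size=60)
spectra = rng.normal(scale=0.01, size=(60, 400)) \
    + np.outer(glucose, np.linspace(0.0, 1e-4, 400))  # weak synthetic glucose signature

pls = PLSRegression(n_components=8)
pred = cross_val_predict(pls, spectra, glucose, cv=6).ravel()
rmsecv = float(np.sqrt(np.mean((pred - glucose) ** 2)))
print(f"RMSECV = {rmsecv:.1f} mg/dL")
```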

  5. Quantitation of the main constituents of vanilla by reverse phase HPLC and ultra-high-pressure-liquid-chromatography with UV detection: method validation and performance comparison.

    PubMed

    Cicchetti, Esmeralda; Chaintreau, Alain

    2009-09-01

    Vanilla's main constituents, i.e., vanillin, para-hydroxybenzaldehyde, and their corresponding acids, can be easily quantified by RP LC with UV detection and external calibration. This paper describes two methods that were developed using HPLC and ultra-high-pressure LC (UHPLC), respectively, and validated according to the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). Both methods were highly specific, exhibited good linearity with high precision, and achieved good accuracy of quantitative results. The UHPLC method was more sensitive, five times faster, and gave better peak resolution than the HPLC alternative. PMID:19714659
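
    A minimal sketch of external calibration as used here: fit a straight line to peak area versus standard concentration and invert it for the samples. The standard concentrations and areas below are illustrative, not data from the paper.

```python
import numpy as np

def external_calibration(standard_conc, standard_area, sample_area):
    """Quantify an analyte (e.g. vanillin) by external calibration: fit peak area
    versus standard concentration with a straight line, then invert the line for
    the sample peak areas."""
    slope, intercept = np.polyfit(np.asarray(standard_conc, dtype=float),
                                  np.asarray(standard_area, dtype=float), 1)
    return (np.asarray(sample_area, dtype=float) - intercept) / slope

# Illustrative standards (mg/L) and peak areas (arbitrary units):
print(external_calibration([5, 10, 20, 40], [51, 99, 202, 405], [150.0]))
```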

  6. Comparison of the Multiple-sample means with composite sample results for fecal indicator bacteria by quantitative PCR and culture

    EPA Science Inventory

    Few studies have addressed the efficacy of composite sampling for measurement of indicator bacteria by QPCR. In this study, composite results were compared to single sample results for culture- and QPCR-based water quality monitoring. Composite results for both methods ...

  7. Errors in quantitative T1rho imaging and the correction methods

    PubMed Central

    2015-01-01

    The spin-lattice relaxation time constant in the rotating frame (T1rho) is useful for assessing the properties of the macromolecular environment inside tissue. Quantification of T1rho has shown promise in various clinical applications. However, T1rho imaging is prone to image artifacts and quantification errors, which remains one of the greatest challenges to adopting this technique in routine clinical practice. The conventional continuous-wave spin-lock is susceptible to B1 radiofrequency (RF) and B0 field inhomogeneity, which appears as banding artifacts in acquired images. A number of methods have been reported that modify the T1rho preparation RF pulse cluster to mitigate this effect. Adiabatic RF pulses can also be used for spin-lock with insensitivity to both B1 RF and B0 field inhomogeneity. Another source of quantification error in T1rho imaging is signal evolution during imaging data acquisition. Care is needed to confirm that such error does not take place when a specific pulse sequence is used for imaging data acquisition. A further source of T1rho quantification error is insufficient signal-to-noise ratio (SNR), which is common among various quantitative imaging approaches. Measurement of T1rho within an ROI can mitigate this issue, but at the cost of reduced resolution. Noise-corrected methods have been reported to address this issue in pixel-wise quantification. For certain tissue types, T1rho quantification can be confounded by the magic angle effect and the presence of multiple tissue components. Review of these confounding factors arising from inherent tissue properties is not included in this article. PMID:26435922
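
    For context, T1rho values are commonly obtained by fitting a mono-exponential decay across images acquired at several spin-lock times; the sketch below shows that fitting step only, under the assumption of mono-exponential decay and without the noise-floor correction the review recommends at low SNR. The spin-lock times and signals are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_t1rho(spin_lock_times_ms, signal):
    """ROI- or pixel-wise T1rho quantification by fitting the mono-exponential model
    S(TSL) = S0 * exp(-TSL / T1rho) to spin-lock prepared image intensities.
    Rician noise-floor correction is deliberately omitted from this minimal sketch."""
    tsl = np.asarray(spin_lock_times_ms, dtype=float)
    sig = np.asarray(signal, dtype=float)
    model = lambda t, s0, t1rho: s0 * np.exp(-t / t1rho)
    (s0, t1rho), _ = curve_fit(model, tsl, sig, p0=[float(sig.max()), 50.0])
    return s0, t1rho

# Illustrative spin-lock times (ms) and signal intensities:
print(fit_t1rho([1, 10, 20, 40, 60], [1000, 890, 800, 640, 520]))
```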

  8. Effects of drying methods on qualitative and quantitative properties of essential oil of two basil landraces.

    PubMed

    Ghasemi Pirbalouti, Abdollah; Mahdad, Elahe; Craker, Lyle

    2013-12-01

    Sweet basil, a plant that is extensively cultivated in some countries, is used to enhance the flavour of salads, sauces, pasta and confectioneries as both a fresh and dried herb. To determine the effect of drying methods on qualitative and quantitative characteristics of the plant and essential oil of basil, two landraces, Purple and Green, were dried in sunlight, shade, mechanical ovens at 40 °C and 60 °C, a microwave oven at 500 W and by freeze-drying. For comparison, the essential oils of all samples were extracted by hydrodistillation and analyzed using GC and GC-MS. The highest essential oil yields (v/w on dry weight basis) were obtained from shade-dried tissue in both landraces followed by the freeze-dried sample of the purple landrace and the fresh sample of green landrace. Increasing the drying temperature significantly decreased the essential oil content of all samples. Significant changes in the chemical profile of the essential oils from each of the landrace were associated with the drying method, including the loss of most monoterpene hydrocarbons, as compared with fresh samples. No significant differences occurred among several constituents in the extracted essential oils, including methyl chavicol (estragole), the major compound in the oil of both landraces, whether the plants were dried in the shade or sun, oven at 40 °C or freeze-dried, as compared with a fresh sample. The percentage methyl chavicol in the oil, however, decreased significantly when the plant material was dried in the oven at 60 °C or microwaved. In addition, linalool, the second major compound in the purple landrace, and geranial and neral, major compounds in the green landrace, decreased significantly when the plant tissue was dried in the oven at 60 °C or microwaved. PMID:23870979

  9. Initial Results of an MDO Method Evaluation Study

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley MDO method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. In the first phase of the study, three MDO methods were implemented in the SIGHT framework and used to solve a set of ten relatively simple problems. In this paper, we comment on the general considerations for conducting method evaluation studies and report some initial results obtained to date. In particular, although the results are not conclusive because of the small initial test set, we discuss other formulations, optimality conditions, and the sensitivity of solutions to various perturbations. Optimization algorithms are used to solve a particular MDO formulation. It is then appropriate to speak of local convergence rates and of global convergence properties of an optimization algorithm applied to a specific formulation. An analogous distinction exists in the field of partial differential equations. On the one hand, equations are analyzed in terms of regularity, well-posedness, and the existence and uniqueness of solutions. On the other, one considers numerous algorithms for solving differential equations. The area of MDO methods studies MDO formulations combined with optimization algorithms, although at times the distinction is blurred. It is important to …

  10. Quantitative Single-letter Sequencing: a method for simultaneously monitoring numerous known allelic variants in single DNA samples

    PubMed Central

    Monsion, Baptiste; Duborjal, Hervé; Blanc, Stéphane

    2008-01-01

    Background: Pathogens such as fungi, bacteria and especially viruses are highly variable even within an individual host, intensifying the difficulty of distinguishing and accurately quantifying numerous allelic variants co-existing in a single nucleic acid sample. The majority of currently available techniques are based on real-time PCR or primer extension and often require multiplexing adjustments that impose a practical limitation on the number of alleles that can be monitored simultaneously at a single locus. Results: Here, we describe a novel method that allows the simultaneous quantification of numerous allelic variants in a single reaction tube and without multiplexing. Quantitative Single-letter Sequencing (QSS) begins with a single PCR amplification step using a pair of primers flanking the polymorphic region of interest. Next, PCR products are submitted to single-letter sequencing with a fluorescently-labelled primer located upstream of the polymorphic region. The resulting monochromatic electropherogram shows numerous specific diagnostic peaks, attributable to specific variants, signifying their presence/absence in the DNA sample. Moreover, peak fluorescence can be quantified and used to estimate the frequency of the corresponding variant in the DNA population. Using engineered allelic markers in the genome of Cauliflower mosaic virus, we reliably monitored six different viral genotypes in DNA extracted from infected plants. Evaluation of the intrinsic variance of this method, as applied to both artificial plasmid DNA mixes and viral genome populations, demonstrates that QSS is a robust and reliable method of detection and quantification for variants with a relative frequency of between 0.05 and 1. Conclusion: This simple method is easily transferable to many other biological systems and questions, including those involving high-throughput analysis, and can be performed in any laboratory since it does not require specialized equipment. PMID:18291029
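
    A minimal sketch of the frequency-estimation step: each variant's relative frequency is taken as its diagnostic peak fluorescence divided by the total. Equal labelling and detection efficiency across peaks is assumed here; in practice, calibration against the plasmid mixes described above would correct per-peak biases. Peak values are illustrative.

```python
def variant_frequencies(peak_fluorescence):
    """Relative frequency of each allelic variant, estimated from the fluorescence of
    its diagnostic peak in the monochromatic electropherogram. Assumes comparable
    labelling and detection efficiency across peaks."""
    total = float(sum(peak_fluorescence.values()))
    return {variant: value / total for variant, value in peak_fluorescence.items()}

# Illustrative peak heights (arbitrary units) for six engineered genotypes:
print(variant_frequencies({"G1": 120, "G2": 340, "G3": 80, "G4": 15, "G5": 230, "G6": 95}))
```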

  11. "What about People Our Age?" Applying Qualitative and Quantitative Methods to Uncover How Political Ads Alienate College Students

    ERIC Educational Resources Information Center

    Parmelee, John H.; Perkins, Stephynie C.; Sayre, Judith J.

    2007-01-01

    This study uses a sequential transformative mixed methods research design to explain how political advertising fails to engage college students. Qualitative focus groups examined how college students interpret the value of political advertising to them, and a quantitative manifest content analysis concerning ad framing of more than 100 ads from…

  12. Teaching Integrative Physiology Using the Quantitative Circulatory Physiology Model and Case Discussion Method: Evaluation of the Learning Experience

    ERIC Educational Resources Information Center

    Rodriguez-Barbero, A.; Lopez-Novoa, J. M.

    2008-01-01

    One of the problems that we have found when teaching human physiology in a Spanish medical school is that the degree of understanding by the students of the integration between organs and systems is rather poor. We attempted to remedy this problem by using a case discussion method together with the Quantitative Circulatory Physiology (QCP)…

  13. DEVELOPMENT OF A RAPID, QUANTITATIVE METHOD FOR THE DETECTION OF INFECTIVE COXSACKIE AND ECHO VIRUSES IN DRINKING WATER

    EPA Science Inventory

    The objectives of this research are to improve on the current analytical methods for quantitative detection of infective coxsackie and echo viruses in drinking water. The specific objectives of this research are to: (1) Improve the sensitivity and specificity of IMS-PCR for in...

  14. Vision-Only Aircraft Flight Control Methods and Test Results

    E-print Network

    Johnson, Eric N.

    to determine the position and attitude of the vehicle. This is in contrast to what is found in nature. Human … an array of sensors whose output is used to estimate the vehicle's attitude, velocity and position …

  15. Results of two new methods for aeroacoustics benchmark problems

    NASA Technical Reports Server (NTRS)

    Huynh, H. T.

    1995-01-01

    Two new methods for the numerical solution of conservation laws (the Euler equations in particular) are presented: a uniformly second-order accurate upwind scheme and a third-order accurate centered scheme. Results of these schemes are shown for problems 1, 2, and 5 of this workshop's benchmark problems.

  16. Evaluating Multiple Prevention Programs: Methods, Results, and Lessons Learned

    ERIC Educational Resources Information Center

    Adler-Baeder, Francesca; Kerpelman, Jennifer; Griffin, Melody M.; Schramm, David G.

    2010-01-01

    Extension faculty and agents/educators are increasingly collaborating with local and state agencies to provide and evaluate multiple, distinct programs, yet there is limited information about measuring outcomes and combining results across similar program types. This article explicates the methods and outcomes of a state-level evaluation of…

  17. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.

  18. Establishment of quantitative PCR methods for the quantification of geosmin-producing potential and Anabaena sp. in freshwater systems.

    PubMed

    Su, Ming; Gaget, Virginie; Giglio, Steven; Burch, Michael; An, Wei; Yang, Min

    2013-06-15

    Geosmin has often been associated with off-flavor problems in drinking water, with Anabaena sp. as the major producer. Rapid on-site detection of geosmin producers as well as geosmin is important for a timely management response to potential off-flavor events. In this study, quantitative polymerase chain reaction (qPCR) methods were developed to detect the levels of Anabaena sp. and geosmin, respectively, by designing two PCR primer sets to quantify the rpoC1 gene (ARG) and the geosmin synthase gene (GSG) of Anabaena sp. in freshwater systems. The ARG density determined by the qPCR assay is highly related to the microscopic cell count (r2 = 0.726, p < 0.001), and the limit of detection (LOD) and limit of quantification (LOQ) of the qPCR method were 0.02 pg and 0.2 pg of DNA, respectively. At the same time, the relationship between geosmin concentrations measured by gas chromatography-mass spectrometry (GC-MS) and GSG copies was also established (r2 = 0.742, p < 0.001), with similar LOD and LOQ values. Using the two qPCR protocols, we succeeded in measuring different levels of ARG and GSG copies in different freshwater systems with high-incidence environmental substrata and diverse ecological conditions, showing that the methods developed could be applied for environmental monitoring. Moreover, compared to the microscopic count and GC-MS analytical methods, the qPCR methods can reduce the time-to-results from several days to a few hours and require considerably less traditional algal identification and taxonomic expertise. PMID:23622984
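
    A minimal sketch of how copy numbers are typically recovered from such qPCR assays via a plasmid standard curve (Cq regressed on log10 copies, then inverted for unknowns, with efficiency reported from the slope). The dilution series and Cq values below are illustrative placeholders, not data from the study.

```python
import numpy as np

def fit_standard_curve(log10_copies, cq):
    """Fit Cq = slope * log10(copies) + intercept from a plasmid dilution series and
    report the amplification efficiency E = 10**(-1/slope) - 1."""
    slope, intercept = np.polyfit(np.asarray(log10_copies, dtype=float),
                                  np.asarray(cq, dtype=float), 1)
    return slope, intercept, 10.0 ** (-1.0 / slope) - 1.0

def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve to estimate rpoC1 (ARG) or geosmin synthase (GSG)
    gene copies in an unknown sample."""
    return 10.0 ** ((np.asarray(cq, dtype=float) - intercept) / slope)

# Illustrative dilution series (10^2-10^7 copies) and two unknown samples:
slope, intercept, eff = fit_standard_curve([2, 3, 4, 5, 6, 7],
                                           [33.1, 29.8, 26.4, 23.0, 19.7, 16.3])
print(round(eff, 2), copies_from_cq([27.5, 31.2], slope, intercept))
```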

  19. The development of processing methods for a quantitative histological investigation of rat hearts 

    E-print Network

    Jetton, Emily Hope

    2004-11-15

    In order to understand the mechanical functions of the cardiac muscle it is important to first understand the microstructure of the tissue. Young et al. (1998) realized that quantitative three-dimensional information about the ventricular...

  20. Spatial-resolution optimization of 3D high-frequency quantitative ultrasound methods to

    E-print Network

    Illinois at Urbana-Champaign, University of

    …on estimate bias and variance was investigated using a database of 101 lymph nodes of colorectal… Index terms: high-frequency ultrasound, quantitative ultrasound, lymph node, metastasis.

  1. Structured decision making as a method for linking quantitative decision support to community fundamental objectives

    EPA Science Inventory

    Decision support intended to improve ecosystem sustainability requires that we link stakeholder priorities directly to quantitative tools and measures of desired outcomes. Actions taken at the community level can have large impacts on production and delivery of ecosystem service...

  2. Using Active Learning to Teach Concepts and Methods in Quantitative Biology.

    PubMed

    Waldrop, Lindsay D; Adolph, Stephen C; Diniz Behn, Cecilia G; Braley, Emily; Drew, Joshua A; Full, Robert J; Gross, Louis J; Jungck, John A; Kohler, Brynja; Prairie, Jennifer C; Shtylla, Blerta; Miller, Laura A

    2015-11-01

    This article provides a summary of the ideas discussed at the 2015 Annual Meeting of the Society for Integrative and Comparative Biology society-wide symposium on Leading Students and Faculty to Quantitative Biology through Active Learning. It also includes a brief review of the recent advancements in incorporating active learning approaches into quantitative biology classrooms. We begin with an overview of recent literature that shows that active learning can improve students' outcomes in Science, Technology, Engineering and Math Education disciplines. We then discuss how this approach can be particularly useful when teaching topics in quantitative biology. Next, we describe some of the recent initiatives to develop hands-on activities in quantitative biology at both the graduate and the undergraduate levels. Throughout the article we provide resources for educators who wish to integrate active learning and technology into their classrooms. PMID:26269460

  3. Fast-Neutron Hodoscope at TREAT: Methods for Quantitative Determination of Fuel Dispersal

    SciTech Connect

    De Volci, A.; Fink, C. L.; Marsh, G. E.; Rhodes, E. A.; Stanford, G. S.

    1980-01-01

    Fuel-motion surveillance using the fast-neutron hodoscope in TREAT experiments has advanced from an initial role of providing time/location/velocity data to that of offering quantitative mass results. The material and radiation surroundings of the test section contribute to intrinsic and instrumental effects upon hodoscope detectors that require detailed corrections. Depending upon the experiment, count rate compensation is usually required for deadtime, power level, nonlinear response, efficiency, background, and detector calibration. Depending on their magnitude and amenability to analytical and empirical treatment, systematic corrections may be needed for self-shielding, self-multiplication, self-attenuation, flux depression, and other effects. Current verified hodoscope response (for 1- to 7-pin fuel bundles) may be parametrically characterized under optimum conditions by 1-ms time resolution; 0.25-mm lateral and 5-mm axial-motion displacement resolution; and 50-mg single-pin mass resolution. The experimental and theoretical foundation for this performance is given, with particular emphasis on the geometrical response function and the statistical limits of fuel-motion resolution. Comparisons are made with alternative diagnostic systems.

  4. Application of bias correction methods to improve the accuracy of quantitative radar rainfall in Korea

    NASA Astrophysics Data System (ADS)

    Lee, J.-K.; Kim, J.-H.; Suk, M.-K.

    2015-04-01

    There are many potential sources of bias in the radar rainfall estimation process. This study classified them into reflectivity (Z) measurement bias and QPE model bias, and applied bias correction methods to improve the accuracy of the Radar-AWS Rainrate (RAR) calculation system operated by the Korea Meteorological Administration (KMA). For the Z bias, the reflectivity of the target single-polarization radars is corrected against a reference dual-polarization radar whose own hardware and software biases have already been corrected. Two post-processing methods, the Mean Field Bias Correction (MFBC) and the Local Gauge Correction (LGC), were then used to correct the rainfall bias. Both the Z bias and rainfall-bias corrections were applied to the RAR system, and its accuracy improved after the Z bias correction. By rainfall type, the Changma front and local torrential cases were slightly improved even without the Z bias correction, while the typhoon cases in particular became less accurate than the existing results. After the rainfall-bias correction, the RAR system with Z bias_LGC clearly outperformed the MFBC method, because the LGC method applies a different rainfall bias to each grid rainfall amount. Across rainfall types, Z bias_LGC produced more accurate estimates than the Z bias correction alone, and the improvement was greatest for the typhoon cases.
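
    A minimal sketch of the mean field bias idea discussed above, a single gauge/radar ratio applied to the whole radar field in contrast to the spatially varying factors of the LGC, is given below. The function and variable names, and the use of collocated gauge/radar accumulations, are illustrative assumptions rather than the KMA RAR implementation.

    ```python
    import numpy as np

    def mean_field_bias(gauge_mm, radar_mm, eps=1e-6):
        """Single multiplicative bias factor from collocated gauge/radar rainfall pairs.

        gauge_mm, radar_mm : 1-D arrays of accumulated rainfall (mm) at gauge sites.
        Returns the factor B such that corrected_radar = B * radar.
        """
        gauge_mm = np.asarray(gauge_mm, dtype=float)
        radar_mm = np.asarray(radar_mm, dtype=float)
        return gauge_mm.sum() / max(radar_mm.sum(), eps)

    # Illustrative use: radar underestimates rainfall by roughly 30% at five gauge sites.
    gauges = [12.0, 8.5, 20.1, 3.2, 15.7]
    radar = [8.9, 6.0, 14.3, 2.1, 11.0]
    B = mean_field_bias(gauges, radar)
    corrected_field = B * np.array(radar)   # the same B would be applied to every grid cell
    print(f"MFB factor = {B:.2f}")
    ```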

  5. Statistical uncertainty and its propagation in the analysis of quantitative polymerase chain reaction data: comparison of methods.

    PubMed

    Tellinghuisen, Joel; Spiess, Andrej-Nikolai

    2014-11-01

    Most methods for analyzing real-time quantitative polymerase chain reaction (qPCR) data for single experiments estimate the hypothetical cycle 0 signal y0 by first estimating the quantification cycle (Cq) and amplification efficiency (E) from least-squares fits of fluorescence intensity data for cycles near the onset of the growth phase. The resulting y0 values are statistically equivalent to the corresponding Cq if and only if E is taken to be error free. But uncertainty in E usually dominates the total uncertainty in y0, making the latter much degraded in precision compared with Cq. Bias in E can be an even greater source of error in y0. So-called mechanistic models achieve higher precision in estimating y0 by tacitly assuming E=2 in the baseline region and so are subject to this bias error. When used in calibration, the mechanistic y0 is statistically comparable to Cq from the other methods. When a signal threshold yq is used to define Cq, best estimation precision is obtained by setting yq near the maximum signal in the range of fitted cycles, in conflict with common practice in the y0 estimation algorithms. PMID:24991688
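
    The relationship underlying the comparison above can be written out: if the fluorescence grows as y(c) = y0·E^c, then a threshold yq crossed at cycle Cq implies y0 = yq / E^Cq, so a small relative error in E is amplified by the exponent Cq. The sketch below illustrates that back-calculation; the threshold, Cq, and efficiencies are invented for illustration and are not taken from the paper.

    ```python
    def y0_from_cq(yq, cq, efficiency):
        """Back-calculate the hypothetical cycle-0 signal from y(c) = y0 * E**c."""
        return yq / efficiency ** cq

    # Illustrative values only: threshold signal, quantification cycle, and a range of E.
    yq, cq = 0.2, 22.0
    for E in (1.90, 1.95, 2.00):
        print(f"E = {E:.2f} -> y0 = {y0_from_cq(yq, cq, E):.3e}")
    # A 5% spread in E changes y0 by roughly a factor of 3 here, while Cq itself is unchanged.
    ```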

  6. Quantitative nondestructive in-service evaluation of stay cables of cable-stayed bridges: methods and practical experience

    NASA Astrophysics Data System (ADS)

    Weischedel, Herbert R.; Hoehle, Hans-Werner

    1995-05-01

    Stay cables of cable-stayed bridges have corrosion protection systems that can be elaborate. For example, such a system may simply consist of one or several coats of paint, or--more complex--of plastic pipes that are wrapped with tape and filled with grout. Frequently, these corrosion protection systems prevent visual inspections. Therefore, alternative nondestructive examination methods are called for. For example, modern dual-function electromagnetic (EM) instruments allow the simultaneous detection of external and internal localized flaws (such as external and internal broken wires and corrosion pitting) and the measurement of loss of metallic cross-sectional area (typically caused by external or internal corrosion or wear). Initially developed for mining and skiing applications, these instruments have been successfully used for the inspection of stays of cable-stayed bridges, and for the inspection of guys of smoke stacks, flare stacks, broadcast towers, suspended roofs, etc. As a rule, guys and bridge cables are not subjected to wear and bending stresses. However, their safety can be compromised by corrosion caused by the failure of corrosion protection systems. Furthermore, live loads and wind forces create intermittent tensile stresses that can cause fatigue breaks of wires. This paper discusses the use of dual-function EM instruments for the detection and the nondestructive quantitative evaluation of cable deterioration. It explains the underlying principles. Experiences with this method together with field inspection results will be presented.

  7. Quantitative Pleistocene calcareous nannofossil biostratigraphy: preliminary results from the IODP Site U1385 (Exp 339), the Shackleton Site

    NASA Astrophysics Data System (ADS)

    Balestra, B.; Flores, J. A.; Acton, G.; Alvarez Zarikian, C. A.; Grunert, P.; Hernandez-Molina, F. J.; Hodell, D. A.; Li, B.; Richter, C.; Sanchez Goni, M.; Sierro, F. J.; Singh, A.; Stow, D. A.; Voelker, A.; Xuan, C.

    2013-12-01

    In order to explore the effects of Mediterranean Outflow Water (MOW) on North Atlantic circulation and climate, Integrated Ocean Drilling Program (IODP) Expedition 339 (Mediterranean Outflow) cored a series of sites in the Gulf of Cadiz slope and off West Iberia (North East Atlantic). Site U1385 (37°48'N, 10°10′W, 3146 m water depth) was selected and drilled in the lower slope of the Portuguese margin, at a location close to the so-called Shackleton Site MD95-2042 (in honor of the late Sir Nicholas Shackleton), to provide a marine reference section of Pleistocene millennial-scale climate variability. Three holes were cored at Site U1385 using the Advanced Piston Corer (APC) to a depth of ~151 meters below seafloor in order to recover a continuous stratigraphic record covering the past 1.4 Ma. Here we present preliminary results of the succession of standard and alternative calcareous nannofossil events. Our quantitative study based on calcareous nannofossils shows well-preserved and abundant assemblages throughout the core. Most conventional Pleistocene events were recognized. Moreover, our quantitative investigations provide further data on the stratigraphic distribution of some species and groups, such as the large Emiliania huxleyi (>4 μm), the small Gephyrocapsa group, and Reticulofenestra cisnerosii. A preliminary calibration of the calcareous nannofossil events with the paleomagnetic and astronomical signal, estimated by comparison with geophysical and logging parameters is also presented. *IODP Expedition 339 Scientists: Bahr, A., Ducassou. E., Flood, R., Furota, S., Jimenez-Espejo, F., Kim, J. K., Krissek, L., Kuroda, J., Llave, E., Lofi, J., Lourens, L., Miller, M., Nanayama, F., Nishida, N., Roque, C., Sloss, C., Takashimizu, Y., Tzanova, A., Williams, T.

  8. Possibilities and Limitations for Quantitative Precipitation Forecasts Using Nowcasting Methods with Infrared Geosynchronous Satellite Imagery.

    NASA Astrophysics Data System (ADS)

    Grose, Andrew M. E.; Smith, Eric A.; Chung, Hyo-Sang; Ou, Mi-Lim; Sohn, Byung-Ju; Turk, F. Joseph

    2002-07-01

    A rainfall nowcasting system is developed that identifies locations of raining clouds on consecutive infrared geosynchronous satellite images while predicting the movement of the rain cells for up to 10 h using cloud-motion-based winds. As part of the analysis, the strengths and weaknesses of various kinds of cloud wind filtering schemes and both steady and nonsteady storm advection techniques as forecast operators for quantitative precipitation forecasting are evaluated. The first part of the study addresses the development of a probability matching method (PMM) between histograms of equivalent blackbody temperatures (EBBTs) and Special Sensor Microwave Imager (SSM/I)-derived rain rates (RRs), which enables estimating RRs from instantaneous infrared imagery and allows for RR forecasts from the predicted EBBT fields. The second part of the study addresses the development and testing of the nowcasting system built upon the PMM capability and analyzes its success according to various skill score metrics. Key processes involved in the nowcasting system include the retrieved cloud-motion wind field, the filtered cloud-motion wind field, and the forecasting of a future rain field by storm advection and EBBT tendencies. These processes allow for the short-term forecasting of cloud and rain locations and of rain intensity, using PMM-based RRs from different datasets of infrared Geostationary Meteorological Satellite (GMS) and Geostationary Operational Environmental Satellite (GOES) imagery. For this study, three convective rain sequences from the Caribbean basin, the Amazon basin, and the Korean peninsula are analyzed. The final part of the study addresses the decay of forecast accuracy with time (i.e., the point at which the asymptotic limit on forecast skill is reached). This analysis indicates that the nowcasting system can produce useful rainfall forecast information out to approximately 6 h.
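
    The probability matching step described above pairs the cold end of the brightness-temperature histogram with the heavy end of the rain-rate histogram, so that P(EBBT ≤ T) = P(RR ≥ R) defines a lookup from temperature to rain rate. A minimal sketch of that idea follows; the synthetic histograms are fabricated, and the rain/no-rain screening applied in practice is omitted.

    ```python
    import numpy as np

    def build_pmm_lookup(ebbt_sample, rr_sample, n=101):
        """Probability matching: the q-th coldest EBBT quantile is paired with the
        q-th heaviest rain-rate quantile, so colder cloud tops map to heavier rain."""
        q = np.linspace(0.0, 1.0, n)
        t_q = np.quantile(ebbt_sample, q)          # increasing temperatures
        r_q = np.quantile(rr_sample, 1.0 - q)      # matching rates: P(RR >= r) = q
        return lambda ebbt: np.interp(ebbt, t_q, r_q)

    # Fabricated histograms standing in for collocated IR and SSM/I observations.
    rng = np.random.default_rng(0)
    ebbt = rng.normal(230.0, 15.0, 10_000)         # equivalent blackbody temperatures (K)
    rr = rng.gamma(0.5, 4.0, 10_000)               # rain rates (mm/h)
    lookup = build_pmm_lookup(ebbt, rr)
    print(lookup(205.0), lookup(250.0))            # cold tops -> heavy rain, warm tops -> near zero
    ```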

  9. A robust multiple-locus method for quantitative trait locus analysis of non-normally distributed multiple traits.

    PubMed

    Li, Z; Möttönen, J; Sillanpää, M J

    2015-12-01

    Linear regression-based quantitative trait loci/association mapping methods such as least squares commonly assume normality of residuals. In genetic studies of plants or animals, some quantitative traits may not follow a normal distribution because the data include outlying observations or data that are collected from multiple sources, and in such cases the normal regression methods may lose some statistical power to detect quantitative trait loci. In this work, we propose a robust multiple-locus regression approach for analyzing multiple quantitative traits without the normality assumption. In our method, the objective function is least absolute deviation (LAD), which corresponds to the assumption of multivariate Laplace distributed residual errors. This distribution has heavier tails than the normal distribution. In addition, we adopt a group LASSO penalty to produce shrinkage estimation of the marker effects and to describe the genetic correlation among phenotypes. Our LAD-LASSO approach is less sensitive to outliers and is more appropriate for the analysis of data with skewed phenotype distributions. Another application of our robust approach is to the missing-phenotype problem in multiple-trait analysis, where the missing phenotype items can simply be filled with extreme values and treated as outliers. The efficiency of the LAD-LASSO approach is illustrated on both simulated and real data sets. PMID:26174023
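
    Stated compactly, the approach combines an L1 loss over residuals with a group penalty over each marker's vector of effects across traits. With n individuals, p markers, and m traits (notation chosen here for illustration, not copied from the paper), an objective of this form is

    \hat{B} = \arg\min_{B} \sum_{i=1}^{n}\sum_{k=1}^{m} \Bigl| y_{ik} - \sum_{j=1}^{p} x_{ij}\,\beta_{jk} \Bigr| + \lambda \sum_{j=1}^{p} \sqrt{\sum_{k=1}^{m} \beta_{jk}^{2}},

    where the first term is a least absolute deviation loss (for a multivariate Laplace error model the inner sum over traits may instead be replaced by the Euclidean norm of each individual's residual vector), and the second term is the group-LASSO penalty that shrinks the whole vector of effects of marker j toward zero across all traits.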

  10. International ring trial for the validation of an event-specific Golden Rice 2 quantitative real-time polymerase chain reaction method.

    PubMed

    Jacchia, Sara; Nardini, Elena; Bassani, Niccolò; Savini, Christian; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco

    2015-05-27

    This article describes the international validation of the quantitative real-time polymerase chain reaction (PCR) detection method for Golden Rice 2. The method consists of a taxon-specific assay amplifying a fragment of the rice Phospholipase D α2 gene, and an event-specific assay designed on the 3' junction between transgenic insert and plant DNA. We validated the two assays independently, with absolute quantification, and in combination, with relative quantification, on DNA samples prepared in haploid genome equivalents. We assessed trueness, precision, efficiency, and linearity of the two assays, and the results demonstrate that both the assays independently assessed and the entire method fulfill European and international requirements for methods for genetically modified organism (GMO) testing, within the dynamic range tested. The homogeneity of the results of the collaborative trial between Europe and Asia is a good indicator of the robustness of the method. PMID:25946377

  11. An adapted mindfulness-based stress reduction program for elders in a continuing care retirement community: quantitative and qualitative results from a pilot randomized controlled trial.

    PubMed

    Moss, Aleezé S; Reibel, Diane K; Greeson, Jeffrey M; Thapar, Anjali; Bubb, Rebecca; Salmon, Jacqueline; Newberg, Andrew B

    2015-06-01

    The purpose of this study was to test the feasibility and effectiveness of an adapted 8-week Mindfulness-Based Stress Reduction (MBSR) program for elders in a continuing care community. This mixed-methods study used both quantitative and qualitative measures. A randomized waitlist control design was used for the quantitative aspect of the study. Thirty-nine elderly participants (mean age 82 years) were randomized to MBSR (n = 20) or a waitlist control group (n = 19). Both groups completed pre-post measures of health-related quality of life, acceptance and psychological flexibility, facets of mindfulness, self-compassion, and psychological distress. A subset of MBSR participants completed qualitative interviews. MBSR participants showed significantly greater improvement in acceptance and psychological flexibility and in role limitations due to physical health. In the qualitative interviews, MBSR participants reported increased awareness, less judgment, and greater self-compassion. Study results demonstrate the feasibility and potential effectiveness of an adapted MBSR program in promoting mind-body health for elders. PMID:25492049

  12. Quantitative methods in fractography; Proceedings of the Symposium on Evaluation and Techniques in Fractography, Atlanta, GA, Nov. 10, 1988

    SciTech Connect

    Strauss, B.M.; Putatunda, S.K.

    1990-01-01

    Papers are presented on the application of quantitative fractography and computed tomography to fracture processes in materials, the relationships between fractographic features and material toughness, the quantitative analysis of fracture surfaces using fractals, and the analysis and interpretation of aircraft component defects by means of quantitative fractography. Also discussed are the characteristics of hydrogen-assisted cracking measured by the holding-load and fractographic method, a fractographic study of isolated cleavage regions in nuclear pressure vessel steels and their weld metals, a fractographic and metallographic study of the initiation of brittle fracture in weldments, cracking mechanisms for mean stress/strain low-cycle multiaxial fatigue loadings, and corrosion fatigue crack arrest in Al alloys.

  13. Quantitative Analysis in the General Chemistry Laboratory: Training Students to Analyze Individual Results in the Context of Collective Data

    ERIC Educational Resources Information Center

    Ling, Chris D.; Bridgeman, Adam J.

    2011-01-01

    Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…

  14. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in U.S. undergraduate and graduate systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
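
    As a rough illustration of the kind of Monte Carlo exercise described above, uncertain process drivers can be propagated through a simple index model to obtain a distribution of predicted scores. The drivers, weights, and distributions below are invented for illustration and are not the ACSI model used in the dissertation.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Hypothetical drivers of a satisfaction index, each scored 0-100 with some spread.
    perceived_quality = rng.normal(78, 5, n)
    perceived_value = rng.normal(72, 7, n)
    expectations = rng.normal(80, 4, n)

    # Hypothetical linear index with weights summing to 1 (illustrative only).
    index = 0.5 * perceived_quality + 0.3 * perceived_value + 0.2 * expectations

    print(f"predicted index: mean = {index.mean():.1f}, "
          f"5th-95th percentile = {np.percentile(index, 5):.1f}-{np.percentile(index, 95):.1f}")
    print(f"P(index >= 75) = {(index >= 75).mean():.2f}")
    ```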

  15. Analysis of Perfluorinated Chemicals in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A fast, rigorous method was developed to maximize the extraction efficacy for ten perfluorocarboxylic acids and perfluorooctanesulfonate from wastewater-treatment sludge and to quantitate using liquid chromatography, tandem-mass spectrometry (LC/MS/MS). First, organic solvents w...

  16. Pressure ulcer prevention algorithm content validation: a mixed-methods, quantitative study.

    PubMed

    van Rijswijk, Lia; Beitz, Janice M

    2015-04-01

    Translating pressure ulcer prevention (PUP) evidence-based recommendations into practice remains challenging for a variety of reasons, including the perceived quality, validity, and usability of the research or the guideline itself. Following the development and face validation testing of an evidence-based PUP algorithm, additional stakeholder input and testing were needed. Using convenience sampling methods, wound care experts attending a national wound care conference and a regional wound ostomy continence nursing (WOCN) conference and/or graduates of a WOCN program were invited to participate in an Institutional Review Board-approved, mixed-methods quantitative survey with qualitative components to examine algorithm content validity. After participants provided written informed consent, demographic variables were collected and participants were asked to comment on and rate the relevance and appropriateness of each of the 26 algorithm decision points/steps using standard content validation study procedures. All responses were anonymous. Descriptive summary statistics, mean relevance/appropriateness scores, and the content validity index (CVI) were calculated. Qualitative comments were transcribed and thematically analyzed. Of the 553 wound care experts invited, 79 (average age 52.9 years, SD 10.1; range 23-73) consented to participate and completed the study (a response rate of 14%). Most (67, 85%) were female, registered (49, 62%) or advanced practice (12, 15%) nurses, and had > 10 years of health care experience (88, 92%). Other health disciplines included medical doctors, physical therapists, nurse practitioners, and certified nurse specialists. Almost all had received formal wound care education (75, 95%). On a Likert-type scale of 1 (not relevant/appropriate) to 4 (very relevant and appropriate), the average score for the entire algorithm/all decision points (N = 1,912) was 3.72 with an overall CVI of 0.94 (out of 1). The only decision point/step recommendation with a CVI below 0.70 was the recommendation to provide medical-grade sheepskin for patients at high risk for friction/shear. Many positive and substantive suggestions for minor modifications including color, flow, and algorithm orientation were received. The high overall and individual item rating scores and CVI further support the validity and appropriateness of the PUP algorithm with the addition of the minor modifications. The generic recommendations facilitate individualization, and future research should focus on construct validation testing. PMID:25853377

  17. Quantitative measurement of high intensity focused ultrasound pressure field by optical phase contrast method applying non-continuous phase unwrapping algorithm

    NASA Astrophysics Data System (ADS)

    Syahid, Mohd; Oyama, Seiji; Yasuda, Jun; Yoshizawa, Shin; Umemura, Shin-ichiro

    2015-07-01

    A fast and accurate ultrasound pressure field measurement is necessary for the progress of ultrasound applications in medicine. In general, a hydrophone is used to measure the ultrasound field, which takes a long measurement time and may disturb the field itself. Hence, we proposed a new approach based on an optical technique, the phase contrast method, to overcome these drawbacks of the hydrophone method. The proposed method makes use of the spatial DC spectrum formed in the focal plane to measure the modulated optical phase induced by ultrasound propagation in water. In this study, we take into account the decreased intensity of the DC spectrum at high ultrasound intensity to increase the measurement accuracy of the modulated optical phase. We then apply a non-continuous phase unwrapping algorithm to unwrap the modulated optical phase at high ultrasound intensity. From the unwrapped result, we evaluate the quantitative accuracy of the proposed method.

  18. Test Results for Entry Guidance Methods for Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Jones, Robert E.

    2003-01-01

    There are a number of approaches to advanced guidance and control (AG&C) that have the potential for achieving the goals of significantly increasing reusable launch vehicle (RLV) safety and reliability, and reducing the cost. This paper examines some approaches to entry guidance. An effort called Integration and Testing of Advanced Guidance and Control Technologies (ITAGCT) has recently completed a rigorous testing phase where these algorithms faced high-fidelity vehicle models and were required to perform a variety of representative tests. The algorithm developers spent substantial effort improving the algorithm performance in the testing. This paper lists the test cases used to demonstrate that the desired results are achieved, shows an automated test scoring method that greatly reduces the evaluation effort required, and displays results of the tests. Results show a significant improvement over previous guidance approaches. The two best-scoring algorithm approaches show roughly equivalent results and are ready to be applied to future reusable vehicle concepts.

  19. Reference Genes Selection for Quantitative Real-Time PCR Using RankAggreg Method in Different Tissues of Capra hircus

    PubMed Central

    Najafpanah, Mohammad Javad; Sadeghi, Mostafa; Bakhtiarizadeh, Mohammad Reza

    2013-01-01

    Identification of reference genes with stable levels of gene expression is an important prerequisite for obtaining reliable results in the analysis of gene expression data using quantitative real-time PCR (RT-qPCR). Since the underlying assumption for reference genes is that they are expressed at the same level in all sample types, in this study we evaluated the expression stability of nine of the most commonly used endogenous controls (GAPDH, ACTB, 18S rRNA, RPS18, HSP-90, ALAS, HMBS, ACAC, and B2M) in four different tissues of the domestic goat, Capra hircus, including liver, visceral fat, subcutaneous fat, and longissimus muscle, across different experimental treatments (a standard diet prepared using the NRC computer software as control, and the same diet plus one mg chromium/day). We used six different software programs for ranking the reference genes and found that the individual rankings of the genes differed among them. Additionally, there was a significant difference in the ranking patterns of the studied genes among different tissues. A rank aggregation method was applied to combine the ranking lists of the six programs into a consensus ranking. Our results revealed that HSP-90 was nearly always among the two most stable genes in all studied tissues. Therefore, it is recommended for accurate normalization of RT-qPCR data in goats, while GAPDH, ACTB, and RPS18 showed the most varied expression and should be avoided as reference genes. PMID:24358246
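
    Rank aggregation of this kind can be approximated very simply by a Borda-style mean rank across programs; the RankAggreg package uses more sophisticated optimization (cross-entropy or genetic search over a distance criterion), but the sketch below conveys the idea. The gene names come from the abstract, while the rankings and program labels are invented for illustration.

    ```python
    # Consensus ranking by mean rank (Borda count) across several stability programs.
    # Illustrative ranks only; the study combined the lists of six programs.
    rankings = {
        "programA": ["HSP-90", "HMBS", "RPS18", "ACTB", "GAPDH"],
        "programB": ["HMBS", "HSP-90", "ACTB", "GAPDH", "RPS18"],
        "programC": ["HSP-90", "RPS18", "HMBS", "GAPDH", "ACTB"],
    }

    genes = sorted({g for order in rankings.values() for g in order})
    mean_rank = {
        g: sum(order.index(g) + 1 for order in rankings.values()) / len(rankings)
        for g in genes
    }
    consensus = sorted(genes, key=mean_rank.get)
    print(consensus)   # most stable gene first
    ```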

  20. Novel X-ray phase-contrast tomography method for quantitative studies of heat induced structural changes in meat.

    PubMed

    Miklos, Rikke; Nielsen, Mikkel Schou; Einarsdóttir, Hildur; Feidenhans'l, Robert; Lametsch, René

    2015-02-01

    The objective of this study was to evaluate the use of X-ray phase-contrast tomography combined with 3D image segmentation to investigate heat-induced structural changes in meat. The measurements were performed at the Swiss synchrotron radiation light source using a grating interferometric setup. The non-destructive method allowed the same sample to be measured before and after cooking. Heat denaturation resulted in a 36% decrease in the volume of the muscle fibers, while solubilization of the connective tissues increased the volume from 8.4% to 24.9%. The cooking loss was quantified and separated into a water phase and a gel phase formed by the sarcoplasmic proteins in the exudate. The results show that X-ray phase-contrast tomography offers unique possibilities for studying both the meat structure and the different meat components, such as water, fat, connective tissue, and myofibrils, in a qualitative and quantitative manner without prior sample preparation such as isolation of single muscle components, calibration, or histology. PMID:25460128

  1. High speed moire based phase retrieval method for quantitative phase imaging of thin objects without phase unwrapping or aberration compensation

    NASA Astrophysics Data System (ADS)

    Wang, Shouyu; Yan, Keding; Xue, Liang

    2016-01-01

    Phase retrieval, composed of phase extraction and unwrapping, is of great significance in different settings, such as fringe-projection-based profilometry, quantitative interferometric microscopy, and moire detection. Compared with phase extraction, phase unwrapping accounts for most of the computation time in phase retrieval, and it becomes an obstacle to real-time measurements. In order to increase the calculation efficiency of phase retrieval as well as simplify its procedures, a high-speed moire-based phase retrieval method is proposed here which is capable of calculating quantitative phase distributions without phase unwrapping or aberration compensation. We demonstrate the capability of the presented phase retrieval method by both theoretical analysis and experiments. It is believed that the proposed method will be useful in real-time phase observations and measurements.

  2. Fast chiral chromatographic method development and validation for the quantitation of eszopiclone in human plasma using LC/MS/MS.

    PubMed

    Meng, Min; Rohde, Lisa; Cápka, Vladimír; Carter, Spencer J; Bennett, Patrick K

    2010-12-01

    Traditional chiral chromatographic separation method development is time consuming even for an experienced chromatographer. This paper describes the application of the ACD Lab computer software to facilitate the development of a chiral separation for the quantitation of eszopiclone using LC-MS/MS technology. Assisted by the ACD/Chrom Manager and LC Simulator software, the optimal chiral chromatographic development was completed within hours. Baseline chiral separation was achieved with a total cycle time of 3 min. For sample extraction method development, a Waters Oasis Sorbent Selection Plate containing four different sorbents was utilized. Optimal conditions were determined using a single plate under various load, wash, and elution conditions. This was followed by a GLP validation which demonstrated excellent intra- and inter-day accuracy and precision for the quantitation of eszopiclone in human plasma over the 1.00-100 ng/mL range using LC/MS/MS technology. This method was used to support multiple clinical bioequivalence studies. PMID:20650591

  3. Ultrahigh-performance liquid chromatographic-tandem mass spectrometric multimycotoxin method for quantitating 26 mycotoxins in maize silage.

    PubMed

    Van Pamel, Els; Verbeken, Annemieke; Vlaemynck, Geertrui; De Boever, Johan; Daeseleire, Els

    2011-09-28

    A multianalyte method was developed to identify and quantitate 26 mycotoxins simultaneously in maize silage by means of ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS). The extraction and cleanup procedure consists of two extraction steps followed by purification on a Waters Oasis HLB column. The method developed was validated with the requirements of Commission Decision 2002/657/EC taken into account. The limit of detection and quantitation ranges were 5-348 and 11-695 ng/g, respectively. Apparent recovery varied between 61 and 116%, whereas repeatability and reproducibility were within the ranges of 3-45 and 5-49%, respectively. The method developed was successfully applied for maize silage samples taken at the cutting surface and 1 m behind that surface. Mainly Fusarium toxins (beauvericin, deoxynivalenol, enniatins, fumonisins, fusaric acid, and zearalenone) were detected, but postharvest toxins such as mycophenolic acid and roquefortine C were identified as well. PMID:21888373

  4. Method performance and multi-laboratory assessment of a normal phase HPLC/FLD method for the quantitation of flavanols and procyanidins in cocoa and chocolate containing samples

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The quantitative parameters and method performance for a normal-phase HPLC separation of flavanols and procyanidins in chocolate and cocoa-containing food products were optimized and assessed. The chromatographic separation based on degree of polymerization (DP) was achieved on a diol stationary ph...

  5. Apparatus and method for quantitatively evaluating total fissile and total fertile nuclide content in samples. [Patent application

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Cates, M.R.; Franks, L.A.

    1982-07-07

    Simultaneous photon and neutron interrogation of samples for the quantitative determination of total fissile nuclide and total fertile nuclide material present is made possible by the use of an electron accelerator. Prompt and delayed neutrons produced from resulting induced fission are counted using a single detection system and allow the resolution of the contributions from each interrogating flux leading in turn to the quantitative determination sought. Detection limits for ²³⁹Pu are estimated to be about 3 mg using prompt fission neutrons and about 6 mg using delayed neutrons.

  6. Infrared contrast data analysis method for quantitative measurement and monitoring in flash infrared thermography

    NASA Astrophysics Data System (ADS)

    Koshti, Ajay M.

    2015-04-01

    The paper provides information on a new infrared (IR) image contrast data post-processing method that involves converting raw data to normalized contrast versus time evolutions from the flash infrared thermography inspection video data. Thermal measurement features such as peak contrast, peak contrast time, persistence time, and persistence energy are calculated from the contrast evolutions. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat bottom holes in a test plate of the subject material. The measurement features are used to monitor growth of anomalies and to characterize the void-like anomalies. The method was developed to monitor and analyze void-like anomalies in reinforced carbon-carbon (RCC) materials used on the wing leading edge of the NASA Space Shuttle Orbiters, but the method is equally applicable to other materials. The thermal measurement features relate to the anomaly characteristics such as depth and size. Calibration of the contrast is used to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat bottom hole (EFBH) from the calibration data. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH diameter are compared with actual widths to evaluate utility of IR Contrast method. Some thermal measurements relate to gap thickness of the delaminations. Results of IR Contrast method on RCC hardware are provided. Keywords: normalized contrast, flash infrared thermography.
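
    The core of this kind of post-processing is turning each pixel's raw response-versus-time data into a normalized contrast evolution relative to a sound reference region, then reading simple features (peak contrast, time of peak) off that curve. The normalization below is one common generic form and only an assumption about how such a measure can be built; the paper defines its own contrast and calibration procedure.

    ```python
    import numpy as np

    def normalized_contrast(pixel_ts, reference_ts, pre_flash=5):
        """Contrast evolution of one pixel relative to a defect-free reference region,
        normalized by the peak temperature rise of the reference (one common convention)."""
        p0 = pixel_ts[:pre_flash].mean()          # pre-flash baseline of the pixel
        r0 = reference_ts[:pre_flash].mean()      # pre-flash baseline of the reference
        rise_pix = pixel_ts - p0
        rise_ref = reference_ts - r0
        return (rise_pix - rise_ref) / max(rise_ref.max(), 1e-9)

    def contrast_features(contrast, frame_rate_hz):
        """Peak contrast and the frame time (s) at which it occurs."""
        k = int(np.argmax(contrast))
        return float(contrast[k]), k / frame_rate_hz

    # Synthetic example: flash at frame 5, reference cools exponentially, and the
    # 'defect' pixel shows an extra transient bump peaking near frame 40.
    t = np.arange(200, dtype=float)
    ref = 100.0 + 50.0 * np.exp(-(t - 5) / 20.0) * (t >= 5)
    pix = ref + 5.0 * np.exp(-((t - 40.0) / 15.0) ** 2)
    c = normalized_contrast(pix, ref, pre_flash=5)
    print(contrast_features(c, frame_rate_hz=60.0))
    ```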

  7. Development and application of a quantitative method for determination of flavonoids in orange peel: Influence of sample pretreatment on composition.

    PubMed

    Molina-Calle, María; Priego-Capote, Feliciano; Luque de Castro, María D

    2015-11-01

    Peel, a part of the citrus rich in compounds with high-added value, constitutes the bulk of the waste generated in citrus juice industries. Flavonoids are a class of these high-added value compounds characterized by their bioactivity. In this research, a method for analysis of flavonoids, based on LC-MS/MS by using a triple quadrupole detector, has been developed and applied to the quantitative analysis of 16 flavonoids in extracts obtained by maceration of citrus peel. The parameters involved in the ionization and fragmentation of the target analytes were optimized to develop a selected reaction monitoring (SRM) method, which reported detection and quantitation limits ranging from 0.005 to 5 ng/mL and from 0.01 to 10 ng/mL, respectively. The raw materials for flavonoids extraction were fresh, oven-dried and lyophilized peel of 8 different orange varieties, and the proposed quantitation method was applied to the analysis of the obtained extracts. Evaluation of the two methods of water removal showed that lyophilization preserves the concentration of the flavonoids, while oven-dried peel presented a decrease of glycosylated flavonoids and an increase of aglycone forms. PMID:26452832

  8. [Molecular epidemiology of infectious diseases: analytical methods and results interpretation].

    PubMed

    Sammarco, M L; Ripabelli, G; Tamburro, M

    2014-01-01

    Molecular typing and fingerprinting of microbial pathogens represent an essential tool for the epidemiological surveillance, outbreak detection and control of infectious diseases. Indeed, epidemiological investigation without genotyping data may not provide comprehensive information to allow the most appropriate interventions; despite this consideration, some barriers still hamper the routine application and interpretation of molecular typing data. In this paper, the most important methods currently used for the characterization of pathogenic microorganisms, for microbial source tracking, and for the identification of clonal relationships among different isolates are described according to their principles, advantages and limitations. Criteria for their evaluation and guidelines for the correct interpretation of results are also proposed. Molecular typing methods can be grouped into four categories based on different methodological principles, which include the characterization of restriction sites in genomic or plasmid DNA; the amplification of specific genetic targets; restriction enzyme digestion followed by amplification; and sequence analysis. Although the development and the extensive use of molecular typing systems have greatly improved the understanding of infectious disease epidemiology, the rapid diversification, partial evaluation and lack of comparative data on the methods have raised significant questions about the selection of the most appropriate typing method, as well as difficulties arising from the lack of consensus on how results should be interpreted and on the nomenclature used. Several criteria should be considered in order to evaluate the intrinsic performance and practical advantages of a typing system. However, none of the available genotyping methods fully meets all these requirements. Therefore, the combined use of different approaches may lead to a more precise characterization and discrimination of isolates than a single method, especially if used in a hierarchical manner. The interpretation of molecular results differs according to the typing system's characteristics: for example, in restriction fragment-based analysis the divergences or similarity percentages among profiles are evaluated, whilst in amplification-based approaches the differences in the number and intensity of bands are analyzed. Moreover, a correct interpretation of molecular results depends significantly on other critical factors, such as the comprehension of the typing system and data quality, the microbial diversity, and the epidemiological context in which the method is used. The analysis of PFGE data, considered the "gold standard", is based on differences in the number and position of band patterns, although recent recommendations are now available from the Centers for Disease Control and Prevention (CDC) for a more accurate interpretation, which also include the evaluation of gel quality, the genetic diversity of the microorganism, and the time and geographical scale of an epidemic event. Future advances in molecular typing technologies will provide rapid methodological improvements, such as a greater degree of automation, better resolution, higher throughput, and a greater availability of dedicated bioinformatics tools. These factors will all contribute to an increasing application of genotyping methods to better understand the epidemiology of infectious diseases, and to implement, along with strengthened international and interdisciplinary partnerships, more effective control and prevention strategies for Public Health improvements. PMID:24452182

  9. A processing method and results of meteor shower radar observations

    NASA Technical Reports Server (NTRS)

    Belkovich, O. I.; Suleimanov, N. I.; Tokhtasjev, V. S.

    1987-01-01

    Studies of meteor showers permit the solving of some principal problems of meteor astronomy: to obtain the structure of a stream in cross section and along its orbits; to retrace the evolution of particle orbits of the stream taking into account gravitational and nongravitational forces and to discover the orbital elements of its parent body; to find out the total mass of solid particles ejected from the parent body taking into account physical and chemical evolution of meteor bodies; and to use meteor streams as natural probes for investigation of the average characteristics of the meteor complex in the solar system. A simple and effective method of determining the flux density and mass exponent parameter was worked out. This method and its results are discussed.

  10. Methods and preliminary measurement results of liquid Li wettability

    SciTech Connect

    Zuo, G. Z.; Hu, J. S.; Ren, J.; Sun, Z.; Yang, Q. X.; Li, J. G.; Zakharov, L. E.; Mansfield, D. K.

    2014-02-15

    A test of lithium wettability was performed in high vacuum (< 3 × 10⁻⁴ Pa). High magnification images of Li droplets on stainless steel substrates were produced and processed using the MATLAB® program to obtain clear image edge points. In contrast to the more standard “θ/2” or polynomial fitting methods, ellipse fitting of the complete Li droplet shape resulted in reliable contact angle measurements over a wide range of contact angles. Using the ellipse fitting method, it was observed that the contact angle of a liquid Li droplet on a stainless steel substrate gradually decreased with increasing substrate temperature. The critical wetting temperature of liquid Li on stainless steel was observed to be about 290 °C.

  11. Photographic Reading Center of the Idiopathic Intracranial Hypertension Treatment Trial (IIHTT): Methods and Baseline Results

    PubMed Central

    Fischer, William S.; Wall, Michael; McDermott, Michael P.; Kupersmith, Mark J.; Feldon, Steven E.

    2015-01-01

    Purpose. To describe the methods used by the Photographic Reading Center (PRC) of the Idiopathic Intracranial Hypertension Treatment Trial (IIHTT) and to report baseline assessments of papilledema severity in participants. Methods. Stereoscopic digital images centered on the optic disc and the macula were collected using certified personnel and photographic equipment. Certification of the camera system included standardization and calibration using a model eye. Lay readers assessed disc photos of all eyes using the Frisén grade and performed quantitative measurements of papilledema. Frisén grades by PRC were compared with site investigator clinical grades. Spearman rank correlations were used to quantify associations among disc features and selected clinical variables. Results. Frisén grades according to the PRC and the site investigators' grades matched exactly in 48% of the study eyes and 42% of the fellow eyes and within one grade in 94% of the study eyes and 92% of the fellow eyes. Frisén grade was strongly correlated (r > 0.65, P < 0.0001) with quantitative measures of disc area. Cerebrospinal fluid pressure was weakly associated with Frisén grade and disc area determinations (r ≤ 0.31). Neither Frisén grade nor any fundus feature was associated with perimetric mean deviation. Conclusions. In a prospective clinical trial, lay readers agreed reasonably well with physicians in assessing Frisén grade. Standardization of camera systems enhanced consistency of photographic quality across study sites. Images were affected more by sensors with poor dynamic range than by poor resolution. Frisén grade is highly correlated with quantitative assessment of disc area. (ClinicalTrials.gov number, NCT01003639.) PMID:26024112

  12. Quantitative Analysis of Organic Compounds: A Simple and Rapid Method for Use in Schools

    ERIC Educational Resources Information Center

    Schmidt, Hans-Jurgen

    1973-01-01

    Describes the procedure for making a quantitative analysis of organic compounds suitable for secondary school chemistry classes. Using the Schoniger procedure, the organic compound, such as PVC, is decomposed in a conical flask with oxygen. The products are absorbed in a suitable liquid and analyzed by titration. (JR)

  13. FRE 385/585 Quantitative Methods for Business and Resource Management

    E-print Network

    …will concentrate on frequently used quantitative and decision-making models that include decision analysis… Text: selected chapters from Spreadsheet Modeling and Decision Analysis (6th edition) by Ragsdale. Early schedule topics include an introduction to the course and model building (Chapter 1), followed by decision analysis using Excel and pivot tables.

  14. Methods for Evidence-Based Practice: Quantitative Synthesis of Single-Subject Designs

    ERIC Educational Resources Information Center

    Shadish, William R.; Rindskopf, David M.

    2007-01-01

    Good quantitative evidence does not require large, aggregate group designs. The authors describe ground-breaking work in managing the conceptual and practical demands in developing meta-analytic strategies for single subject designs in an effort to add to evidence-based practice. (Contains 2 figures.)

  15. Simple, rapid, and inexpensive cleanup method for quantitation of aflatoxins in important agricultural products by HPLC

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A simple, fast and cheap chemical cleanup procedure for low-level quantitative determination of aflatoxins in major economically important agricultural commodities using HPLC has been developed. Aflatoxins were extracted from a ground sample with methanol-water (80:20, v/v), and after a cleanup step...

  16. Development of a rapid method fluorescent biosensor to quantitate bioavailable methionine

    E-print Network

    Froelich, Clifford Anthony

    2003-01-01

    and to quantitate available amino acid concentrations in feeds. Naturally found in the gastrointestinal microflora, E. coli was chosen as the foundation of an attempt to develop a microbial-based bioassay. An auxotroph was used to accurately indicate the existence...

  17. Hemostatic assessment, treatment strategies, and hematology consultation in massive postpartum hemorrhage: results of a quantitative survey of obstetrician-gynecologists

    PubMed Central

    James, Andra H; Cooper, David L; Paidas, Michael J

    2015-01-01

    Objective To assess potential diagnostic and practice barriers to successful management of massive postpartum hemorrhage (PPH), emphasizing recognition and management of contributing coagulation disorders. Study design A quantitative survey was conducted to assess practice patterns of US obstetrician-gynecologists in managing massive PPH, including assessment of coagulation. Results Nearly all (98%) of the 50 obstetrician-gynecologists participating in the survey reported having encountered at least one patient with “massive” PPH in the past 5 years. Approximately half (52%) reported having previously discovered an underlying bleeding disorder in a patient with PPH, with disseminated intravascular coagulation (88%, n=23/26) being identified more often than von Willebrand disease (73%, n=19/26). All reported having used methylergonovine and packed red blood cells in managing massive PPH, while 90% reported performing a hysterectomy. A drop in blood pressure and ongoing visible bleeding were the most commonly accepted indications for rechecking a “stat” complete blood count and coagulation studies, respectively, in patients with PPH; however, 4% of respondents reported that they would not routinely order coagulation studies. Forty-two percent reported having never consulted a hematologist for massive PPH. Conclusion The survey findings highlight potential areas for improved practice in managing massive PPH, including earlier and more consistent assessment, monitoring of coagulation studies, and consultation with a hematologist. PMID:26604829

  18. Factors affecting broadband ultrasound attenuation results of the calcaneus using a gel-coupled quantitative ultrasound scanning system.

    PubMed

    Cheng, S; Fan, B; Wang, L; Fuerst, T; Lian, M; Njeh, C; He, Y; Kern, M; Lappin, M; Tylavsky, F; Casal, D; Harris, S; Genant, H K

    1999-01-01

    This study aimed to assess the factors that may influence the distribution and description of broadband ultrasound attenuation (BUA) and to identify specific criteria for diagnostic consideration when collecting BUA reference data. Two hundred Caucasian women (aged 20-79 years) without a history of atraumatic fractures or medicines known to affect bone metabolism were selected for this study. Medical and menstrual history, medication usage, family history of osteoporosis (FHO), physical activity, activities of daily living (ADL), dietary calcium intake, as well as smoking and alcohol consumption were obtained. Broadband ultrasound attenuation (BUA, dB/MHz) was determined in the right foot using a new gel-coupled ultrasound system. BUA was significantly associated with age (p<0.001), body weight (p<0.001), level of physical activity (p = 0.024) and dietary calcium intake (p = 0.023). Smoking, alcohol and coffee consumption and ADL were not associated with BUA (p>0.05). There were no differences in BUA (p>0.05) between those women who reported taking medications or had diseases (known to not affect bone metabolism), were using contraceptives, taking vitamin/mineral supplements and/or had traumatic fractures and their counterparts who did not report these characteristics. Premenopausal women with a FHO had significantly lower BUA values compared with those without a FHO (p = 0.013). When those participants with a FHO were removed from the sample, the peak BUA value was 1.1-4.4% higher and the variability (SD) was reduced by about 3.3-9.3% depending on which age range was used to define the peak BUA value. Consequently, an additional 4.5% of the population were classified as having a T-score <-2. Our results suggest that the impact on BUA of risk factors such as a FHO, body weight, physical activity and dietary calcium intake is similar to that on bone mineral density obtained by dual-energy X-ray absorptiometry (DXA), and thus provides further information on the comparability of quantitative ultrasound and DXA for assessment of risk of fracture. The criteria for calculating the T-score need further study to determine whether young adults with FHO should be included and what cutoff age range should be used in collecting peak values of quantitative ultrasound parameters. PMID:10663351
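
    The classification quoted above rests on the usual T-score construction: a measurement expressed in standard deviation units relative to the mean of a young-adult reference group, so the choice of who enters that reference group (for example, whether FHO subjects are excluded) shifts how many people fall below a fixed cutoff such as -2. A minimal sketch with invented reference statistics and population values:

    ```python
    import numpy as np

    def t_score(value, young_mean, young_sd):
        """Standard T-score: SD units below/above the young-adult reference mean."""
        return (value - young_mean) / young_sd

    # Invented numbers to illustrate the effect of excluding FHO subjects from the
    # reference group: a higher peak mean and smaller SD push more people below T = -2.
    rng = np.random.default_rng(1)
    population_bua = rng.normal(100.0, 18.0, 10_000)   # hypothetical BUA values (dB/MHz)

    for label, (mu, sd) in {"all young adults": (105.0, 16.0),
                            "FHO excluded": (108.0, 15.0)}.items():
        frac = (t_score(population_bua, mu, sd) < -2).mean()
        print(f"{label}: {100 * frac:.1f}% with T-score < -2")
    ```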

  19. Outflow forces of low-mass embedded objects in Ophiuchus: a quantitative comparison of analysis methods

    NASA Astrophysics Data System (ADS)

    van der Marel, N.; Kristensen, L. E.; Visser, R.; Mottram, J. C.; Yıldız, U. A.; van Dishoeck, E. F.

    2013-08-01

    Context. The outflow force of molecular bipolar outflows is a key parameter in theories of young stellar feedback on their surroundings. The focus of many outflow studies is the correlation between the outflow force, bolometric luminosity, and envelope mass. However, it is difficult to combine the results of different studies in large evolutionary plots over many orders of magnitude due to the range of data quality, analysis methods, and corrections for observational effects, such as opacity and inclination. Aims: We aim to determine the outflow force for a sample of low-luminosity embedded sources. We quantify the influence of the analysis method and the assumptions entering the calculation of the outflow force. Methods: We used the James Clerk Maxwell Telescope to map 12CO J = 3-2 over 2' × 2' regions around 16 Class I sources of a well-defined sample in Ophiuchus at 15″ resolution. The outflow force was then calculated using seven different methods differing, e.g., in the use of intensity-weighted emission and correction factors for inclination. Two well-studied outflows (HH 46 and NGC 1333 IRAS4A) are added to the sample and included in the comparison. Results: The results from the analysis methods differ from each other by up to a factor of 6, whereas observational properties and choices in the analysis procedure affect the outflow force by up to a factor of 4. Subtraction of cloud emission and integrating over the remaining profile increases the outflow force at most by a factor of 4 compared to line wing integration. For the sample of Class I objects, bipolar outflows are detected around 13 sources including 5 new detections, where the three nondetections are confused by nearby outflows from other sources. New outflow structures without a clear powering source are discovered at the corners of some of the maps. Conclusions: When combining outflow forces from different studies, a scatter by up to a factor of 5 can be expected. Although the true outflow force remains unknown, the separation method (separate calculation of dynamical time and momentum) is least affected by the uncertain observational parameters. The correlations between outflow force, bolometric luminosity, and envelope mass are further confirmed down to low-luminosity sources. Appendices are available in electronic form at http://www.aanda.org
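
    All of the methods compared above ultimately estimate a momentum injection rate. In a separation-style calculation, the outflow momentum and the dynamical time are obtained independently, roughly F_CO ≈ M_out·<v>/t_dyn with t_dyn = R_lobe/v_max. The sketch below only illustrates that bookkeeping with invented numbers; it omits the channel-by-channel summation and the opacity and inclination corrections discussed in the paper.

    ```python
    # Separation-style estimate of an outflow force: momentum and dynamical time
    # computed independently. All input values are invented for illustration.
    KM = 1.0e3          # m per km
    YR_S = 3.156e7      # s per yr
    PC_M = 3.086e16     # m per pc

    m_out_msun = 5.0e-3     # outflow gas mass from the 12CO line wings (Msun)
    v_mean_kms = 5.0        # intensity-weighted mean outflow velocity (km/s)
    v_max_kms = 15.0        # maximum detected outflow velocity (km/s)
    r_lobe_pc = 0.05        # projected lobe length (pc)

    t_dyn_yr = (r_lobe_pc * PC_M) / (v_max_kms * KM) / YR_S
    force = m_out_msun * v_mean_kms / t_dyn_yr      # in Msun km s^-1 yr^-1

    print(f"t_dyn ~ {t_dyn_yr:.0f} yr, F_CO ~ {force:.1e} Msun km/s/yr")
    ```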

  20. Searching for seafloor massive sulfides: a quantitative review of high-resolution methods in deep sea sonar bathymetry for mining applications

    NASA Astrophysics Data System (ADS)

    Mitchley, Michael; Sears, Michael

    2014-06-01

    Seafloor massive sulphides are deep sea mineral deposits currently being examined as a potential mining resource. Conventional sonar bathymetry products gathered by sea surface platforms do not achieve adequate spatial resolution to detect these resources. High-resolution beamforming methods (such as multiple signal classification and estimation of signal parameters via rotational invariance techniques) improve the resolution of sonar bathymetry. We perform a quantitative review of these high-resolution methods using a novel simulator, showing results in the absence of platform motion for a single ping cycle. It was found that high-resolution methods achieved greater bathymetric accuracy and higher resolution than conventional beamforming and that these methods may be adequate for this style of marine exploration. These methods were also robust in the presence of unwanted persistent signals and low signal to noise ratios.
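
    Multiple signal classification (MUSIC), one of the high-resolution beamformers named above, estimates arrival angles from the noise subspace of the sensor covariance matrix. The sketch below is a generic implementation for a uniform linear array, not the authors' simulator; the array spacing and scan grid are placeholders.

```python
import numpy as np

def music_spectrum(snapshots, n_sources, d_over_lambda=0.5, angles_deg=None):
    """MUSIC pseudospectrum for a uniform linear array.

    snapshots : (n_sensors, n_snapshots) complex array of sensor data
    n_sources : assumed number of incident wavefronts
    """
    if angles_deg is None:
        angles_deg = np.linspace(-90, 90, 721)
    n_sensors = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]      # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)                         # eigenvalues in ascending order
    En = eigvecs[:, : n_sensors - n_sources]                     # noise subspace
    k = np.arange(n_sensors)
    spectrum = []
    for theta in np.deg2rad(angles_deg):
        a = np.exp(2j * np.pi * d_over_lambda * k * np.sin(theta))   # steering vector
        spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)  # peaks at source angles
    return np.asarray(angles_deg), np.asarray(spectrum)
```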

  1. A Simple ERP Method for Quantitative Analysis of Cognitive Workload in Myoelectric Prosthesis Control and Human-Machine Interaction

    PubMed Central

    Deeny, Sean; Chicoine, Caitlin; Hargrove, Levi; Parrish, Todd; Jayaraman, Arun

    2014-01-01

    Common goals in the development of human-machine interface (HMI) technology are to reduce cognitive workload and increase function. However, objective and quantitative outcome measures assessing cognitive workload have not been standardized for HMI research. The present study examines the efficacy of a simple event-related potential (ERP) measure of cortical effort during myoelectric control of a virtual limb for use as an outcome tool. Participants trained and tested on two methods of control, direct control (DC) and pattern recognition control (PRC), while electroencephalographic (EEG) activity was recorded. Eighteen healthy participants with intact limbs were tested using DC and PRC under three conditions: passive viewing, easy, and hard. Novel auditory probes were presented at random intervals during testing, and significant task-difficulty effects were observed in the P200, P300, and a late positive potential (LPP), supporting the efficacy of ERPs as a cognitive workload measure in HMI tasks. LPP amplitude distinguished DC from PRC in the hard condition with higher amplitude in PRC, consistent with lower cognitive workload in PRC relative to DC for complex movements. Participants completed trials faster in the easy condition using DC relative to PRC, but completed trials more slowly using DC relative to PRC in the hard condition. The results provide promising support for ERPs as an outcome measure for cognitive workload in HMI research such as prosthetics, exoskeletons, and other assistive devices, and can be used to evaluate and guide new technologies for more intuitive HMI control. PMID:25402345

  2. Quantitative Evaluation of E1 Endoglucanase Recovery from Tobacco Leaves Using the Vacuum Infiltration-Centrifugation Method

    PubMed Central

    Kingsbury, Nathaniel J.; McDonald, Karen A.

    2014-01-01

    As a production platform for recombinant proteins, plant leaf tissue has many advantages, but commercialization of this technology has been hindered by high recovery and purification costs. Vacuum infiltration-centrifugation (VI-C) is a technique to obtain extracellularly-targeted products from the apoplast wash fluid (AWF). Because of its selective recovery of secreted proteins without homogenizing the whole tissue, VI-C can potentially reduce downstream production costs. Lab scale experiments were conducted to quantitatively evaluate the VI-C method and compared to homogenization techniques in terms of product purity, concentration, and other desirable characteristics. From agroinfiltrated Nicotiana benthamiana leaves, up to 81% of a truncated version of E1 endoglucanase from Acidothermus cellulolyticus was recovered with VI-C versus homogenate extraction, and average purity and concentration increases of 4.2-fold and 3.1-fold, respectively, were observed. Formulas were developed to predict recovery yields of secreted protein obtained by performing multiple rounds of VI-C on the same leaf tissue. From this, it was determined that three rounds of VI-C recovered 97% of the total active recombinant protein accessible to the VI-C procedure. The results suggest that AWF recovery is an efficient process that could reduce downstream processing steps and costs for plant-made recombinant proteins. PMID:24971334
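
    The multi-round recovery formulas mentioned above are not reproduced in the abstract. As an illustration only, a simple model in which each VI-C round extracts a fixed fraction of whatever accessible protein remains behaves consistently with the reported figure of 97% after three rounds; the per-round fraction used here is a made-up value, and the authors' actual formula may differ.

```python
def cumulative_recovery(per_round_fraction, n_rounds):
    """Cumulative fraction recovered after n rounds, assuming each round
    extracts a fixed fraction of the remaining accessible protein."""
    return 1.0 - (1.0 - per_round_fraction) ** n_rounds

# Example: if a single round captured ~0.69 of the accessible pool,
# three rounds would recover about 0.97, in line with the reported 97%.
print(round(cumulative_recovery(0.69, 3), 3))
```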

  3. Simplified and rapid method for extraction of ergosterol from natural samples and detection with quantitative and semi-quantitative methods using thin-layer chromatography.

    PubMed

    Larsen, Thomas; Axelsen, Jørgen; Weber Ravn, Helle

    2004-02-13

    A new and simplified method for extraction of ergosterol (ergosta-5,7,22-trien-3beta-ol) from fungi in soil and litter was developed using pre-soaking extraction and paraffin oil for recovery. Recoveries of ergosterol were in the range of 94-100% depending on the solvent to oil ratio. Extraction efficiencies equal to heat-assisted extraction treatments were obtained with pre-soaking extraction. Ergosterol was detected with thin-layer chromatography (TLC) using fluorodensitometry with a quantification limit of 8 ng. Using visual evaluation of images of TLC plates photographed in UV-light the quantification limit was 16 ng. PMID:14763758

  4. An accurate method of extracting fat droplets in liver images for quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2015-03-01

    Steatosis in liver pathological tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring the automatic and accurate classification of HCC images, because the existence of many fat droplets is likely to create errors in quantifying the morphological features used in the process. In this study, we propose a method that can automatically detect and exclude regions with many fat droplets by using the feature values of colors, shapes, and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies the effective characteristics that contribute to accurate detection.

  5. Examining the Role of Numeracy in College STEM Courses: Results from the Quantitative Reasoning for College Science (QuaRCS) Assessment Instrument

    NASA Astrophysics Data System (ADS)

    Follette, Katherine B.; McCarthy, Donald W.; Dokter, Erin F.; Buxner, Sanlyn; Prather, Edward E.

    2016-01-01

    Is quantitative literacy a prerequisite for science literacy? Can students become discerning voters, savvy consumers and educated citizens without it? Should college science courses for nonmajors be focused on "science appreciation", or should they engage students in the messy quantitative realities of modern science? We will present results from the recently developed and validated Quantitative Reasoning for College Science (QuaRCS) Assessment, which probes both quantitative reasoning skills and attitudes toward mathematics. Based on data from nearly two thousand students enrolled in nineteen general education science courses, we show that students in these courses did not demonstrate significant skill or attitude improvements over the course of a single semester, but find encouraging evidence for longer term trends.

  6. P35-M Quantitative Measurement of Gene Expression from Formalin-Fixed Tissue Providing Identical Results as Fresh Tissue: Biomarker Validation using Clinical Samples

    PubMed Central

    Seligmann, B.; Rimsza, L.; Martel, R.; Sabalos, C.; Robin, R.; Botros, I.; Rounseville, M.; LeBlanc, M.; Unger, J.; M, T.; Grogan, T.

    2007-01-01

    The measurement of gene expression from formalin-fixed paraffin-embedded (FFPE) tissue has proven to be problematic. Consequently, archives of FFPE samples remain unexploited in the quest to elucidate and validate the molecular mechanisms of diseases, cellular processes, and drug activity/safety. We validated the measurement of gene expression from FFPE tissue by a new multiplexed assay, the quantitative nuclease protection assay (qNPA). qNPA measures the RNA cross-linked to tissue without its having to be solubilized, plus the soluble RNA pool—i.e., the total RNA contained in the FFPE sample. This is likely one explanation of qNPA success where methods that measure only soluble RNA have failed. Cross-linked RNA is the major pool in FFPE samples, and the fraction it makes up varies from sample to sample, presumably due to differences in fixation time or sample age. Consistent with the observation that qNPA measures the total RNA in fixed tissue, and the fact that it measures the total RNA in fresh samples, is the result that identical quantitative levels of gene expression are measured from fresh fixed and 18-y-old FFPE tissue and from matched fresh or fixed tissue. The expression level for a set of low to moderately expressed genes from fresh vs. FFPE tissue correlated with an R2 = 0.99, slope = 1. Gene expression measurements in FFPE tissue provided average CVs <10%. A retrospective study using clinical diffuse large B-cell lymphoma samples was carried out, validating prognostic biomarkers of disease, disease subtype, and survival. The levels of gene expression measured by qNPA correlated with protein product levels measured by IHC. These results validate that qNPA provides a high-quality gene expression assay of FFPE tissue, enabling research and clinical assays not previously possible.

  7. Effects of processing delay, temperature, and transport tube type on results of quantitative bacterial culture of canine urine.

    PubMed

    Patterson, Carly A; Bishop, Micah A; Pack, Julie D; Cook, Audrey K; Lawhon, Sara D

    2016-01-15

    OBJECTIVE To determine the impact of processing delay, temperature, and transport tube type on results of quantitative bacterial culture (QBC) of canine urine. DESIGN Diagnostic test evaluation. SAMPLE 60 mL of pooled urine from 4 dogs, divided into six 10-mL aliquots. PROCEDURES Urine aliquots were spiked with bacteria from 1 of 6 independent Escherichia coli cultures to achieve a target bacterial concentration of 10^5 CFUs/mL. One milliliter from each aliquot was transferred into 5 silicone-coated clot tubes (SCTs) and 5 urine transport tubes (UTTs). Samples were stored at 4°C (39°F) and 25°C (77°F) for 0, 8, and 24 hours, and then standard QBCs were performed. RESULTS Median bacterial concentration for urine samples stored in a UTT for 24 hours at 4°C was lower than that for samples stored in an SCT under the same conditions. Conversely, a substantial decrease in median bacterial concentration was identified for samples stored for 24 hours in an SCT at 25°C, compared with the median concentration for samples stored in a UTT under the same conditions. Median bacterial concentration in samples stored in an SCT at 25°C for 24 hours (275 CFUs/mL) was less than the cutoff typically used to define clinically important bacteriuria by use of urine samples obtained via cystocentesis (ie, > 1,000 CFUs/mL). CONCLUSIONS AND CLINICAL RELEVANCE Canine urine samples submitted for immediate QBC should be transported in plain sterile tubes such as SCTs. When prolonged (24-hour) storage at room temperature is anticipated, urine samples should be transported in UTTs. PMID:26720084

  8. Quantitative Analysis and Efficiency Study of PSD Methods for a LaBr3:Ce Detector

    E-print Network

    Ming Zeng; Jirong Cang; Zhi Zeng; Xiaoguang Yue; Jianping Cheng; Yinong Liu; Junli Li

    2015-08-17

    The LaBr3:Ce scintillator has been widely studied for nuclear spectroscopy because of its optimal energy resolution. Algorithm parameters and discrimination efficiency are calculated for each method. Moreover, for the CCM, the correlation between the CCM feature value distribution and the total charge (energy) is studied, and a fitting equation for the correlation is inferred and experimentally verified. Using the equations, an energy-dependent threshold can be chosen to optimize the discrimination efficiency. Additionally, the experimental results show a potential application in low-activity high-energy γ measurement by suppressing the alpha background.
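
    The charge comparison method (CCM) referred to above reduces each digitized pulse to the ratio of a delayed (tail) charge integral to the total charge integral. A minimal sketch follows; the gate positions are placeholder parameters, and an energy-dependent cut on this feature, as described in the abstract, would then be applied per total-charge bin.

```python
import numpy as np

def ccm_feature(pulse, t0, total_gate, tail_start):
    """Charge-comparison PSD feature for one baseline-subtracted digitized pulse.

    pulse      : array of waveform samples
    t0         : index of the pulse onset
    total_gate : number of samples in the full integration window
    tail_start : offset (in samples) from t0 where the tail window begins
    """
    total = np.sum(pulse[t0 : t0 + total_gate])
    tail = np.sum(pulse[t0 + tail_start : t0 + total_gate])
    return tail / total if total > 0 else np.nan
```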

  9. Numerical method for optimal stopping of piecewise deterministic Markov processes

    E-print Network

    De Saporta, Benoîte

    Outline: (1) piecewise deterministic Markov processes (definition and example); (2) optimal stopping; (3) numerical method; (4) numerical results. Applications arise in management science and economics; examples include queuing systems, investment planning, and stochastic scheduling.

  10. Preliminary Results from a Mercury Dry Deposition Measurement Methods Intercomparison

    NASA Astrophysics Data System (ADS)

    Marsik, F. J.; Brooks, S.; Gustin, M. S.; Holsen, T.; Landis, M.; Prestbo, E. M.; Poissant, L.

    2009-12-01

    Over the past fifteen years, a number of intensive field campaigns and measurement networks have provided valuable information on the estimated rates of mercury wet deposition to sensitive ecosystems throughout the world. In contrast, the ability to place bounds on the rates of mercury dry deposition has been hampered by the relative lack of direct measurements of this process. Recently, a number of researchers have performed measurements of mercury dry deposition using a variety of direct and indirect measurement techniques. While these studies have provided important information regarding the potential rates of mercury dry deposition to natural surfaces, little is known about the comparability of the results utilizing these different measurement approaches. During the month of August 2008, a mercury dry deposition measurement methods comparison was conducted in Ann Arbor, Michigan over a nine-day period. Seven research groups participated in the study, with the following measurement approaches: water, cation exchange membrane, chemically treated filter and turf surrogate surfaces; and several micrometeorological modeling methods. Continuous monitoring was conducted for ambient meteorological conditions and elemental, oxidized and particulate mercury concentrations. Preliminary results suggest that study-average mercury dry deposition estimates ranged from 0.17 to 0.59 ng/m2/hour for the group of pure-water surrogate surfaces, the cation exchange membrane and a micrometeorological flux gradient approach. The turf surrogate surface, BrCl spiked-water surface and a gold-coated quartz fiber filter surface resulted in significantly higher mercury dry deposition estimates, with the latter two approaches having been designed to measure total mercury dry deposition. Given that the turf surrogate surface and the cation exchange membrane samplers were designed for long-term deployment (up to one week), these methods were deployed for an additional series of four one-week periods. The turf surrogate surface again resulted in a significantly greater estimate of mercury dry deposition (1.59 ng/m2/hour) than that obtained using the cation exchange membrane (0.19 ng/m2/hour). When the turf surrogate surface estimate was adjusted for total surface area, as opposed to its footprint area, the deposition estimate (0.17 ng/m2/hour) was more consistent with that obtained from the cation exchange membrane.

  11. Quantitative methods for three-dimensional comparison and petrographic description of chondrites

    SciTech Connect

    Friedrich, J.M.

    2008-10-20

    X-ray computed tomography can be used to generate three-dimensional (3D) volumetric representations of chondritic meteorites. One of the challenges of using collected X-ray tomographic data is the extraction of useful data for 3D petrographic analysis or description. Here, I examine computer-aided quantitative 3D texture metrics that can be used for the classification of chondritic meteorites. These quantitative techniques are extremely useful for discriminating between chondritic materials, but yield little information on the 3D morphology of chondrite components. To investigate the morphology of chondrite minerals such as Fe(Ni) metal and related sulfides, the homology descriptors known as Betti numbers are examined. Both methodologies are illustrated with theoretical discussion and examples. Betti numbers may be valuable for examining the nature of metal-silicate structural changes within chondrites with increasing degrees of metamorphism.

  12. A field method for making a quantitative estimate of altered tuff in sandstone

    USGS Publications Warehouse

    Cadigan, R.A.

    1954-01-01

    The use of benzidine to identify altered tuff in sandstone is practical for field or field laboratory studies associated with stratigraphic correlations, mineral deposit investigations, or paleogeographic interpretations. The method is based on the ability of saturated benzidine (C12H12N2) solution to produce a blue stain on montmorillonite-bearing tuff grains. The method is substantiated by the results of microscopic, X-ray spectrometer, and spectrographic tests which lead to the conclusion that: (1) the benzidine stain test differentiates grains of different composition, (2) the white or gray grains which are stained a uniform blue color are fragments of altered tuff, and (3) white or gray grains which stain in a few small spots are probably silicified tuff. The amount of sand grains taken from a hand specimen or an outcrop which will be held by a penny is spread out on a nonabsorbent white surface and soaked with benzidine for 5 minutes. The approximate number of blue grains and the average grain size are used in a chart to determine a reference number which measures the relative order of abundance. The chart, based on a volume relationship, corrects for the variation in the number of grains in the sample as the grain size varies. Practical use of the method depends on a knowledge of several precautionary measures as well as an understanding of the limitations of benzidine staining tests.

  13. Staged Moduli: A Quantitative Method to Analyze the Complete Compressive Stress-Strain Response for Thermally Damaged Rock

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Xu, Jinyu; Liu, Shi

    2015-07-01

    Ultrasonic measurements and destructive testing were combined to examine sandstone specimens taken from an underground construction site in Mount Taibai of the Qinling Mountains, central China. Staged moduli were defined for the four stages of uniaxial compression of sandstone after exposure to temperatures from 25 to 1,000 °C, allowing the complete stress-strain curves to be studied quantitatively. Thermal damage of sandstone after different high temperatures was analyzed using a thermal damage factor (TDF) defined from the modulus of the compact stage. A temperature-sensitivity coefficient (TSC) was proposed to describe how sensitive the TDF is to temperature as the temperature level varies. The results suggest that the compression of thermally damaged sandstone shows pronounced staged behavior. The strain of the compact stage increases significantly and nearly linearly as temperature rises; for temperatures above 400 °C, the ratio of compaction strain to peak strain exceeds 50 percent. The four staged moduli change with temperature in widely different ways, and the modulus of the compact stage correlates strongly with longitudinal wave velocity. A TDF defined by wave velocity overlooks changes in density and Poisson's ratio; the definition based on the modulus of the compact stage avoids this defect and is more accurate. Within the range of 25-200 °C, the TSC is largest, so the thermal damage of sandstone is most sensitive to temperature in this interval. These results offer guidance for rock engineering in high-temperature environments.
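
    The abstract does not spell out the TDF and TSC formulas. The sketch below shows one plausible reading, in which the TDF is one minus the ratio of the heated compact-stage modulus to its room-temperature value and the TSC is the change of the TDF per degree over a temperature interval; both definitions are assumptions made here for illustration only.

```python
def thermal_damage_factor(E_compact_T, E_compact_ref):
    """Assumed form of the TDF from the compact-stage modulus: D(T) = 1 - E(T) / E(25 C)."""
    return 1.0 - E_compact_T / E_compact_ref

def temperature_sensitivity(D_high, D_low, T_high, T_low):
    """Assumed form of the TSC: average change of the TDF per degree over [T_low, T_high]."""
    return (D_high - D_low) / (T_high - T_low)
```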

  14. Mode-Stirred Method Implementation for HIRF Susceptibility Testing and Results Comparison with Anechoic Method

    NASA Technical Reports Server (NTRS)

    Nguyen, Truong X.; Ely, Jay J.; Koppen, Sandra V.

    2001-01-01

    This paper describes the implementation of the mode-stirred method for susceptibility testing according to the current DO-160D standard. Test results on an Engine Data Processor obtained with the implemented procedure are presented and compared with standard anechoic test results. The comparison shows experimentally that the susceptibility thresholds found with the mode-stirred method are consistently higher than the anechoic ones. This is consistent with a recent statistical analysis by NIST finding that the current calibration procedure overstates field strength by a fixed amount. Once the test results are adjusted for this value, the agreement with the anechoic results is excellent. The results also show that the test method has excellent chamber-to-chamber repeatability. Several areas for improvement to the current procedure are also identified and implemented.

  15. Ten Years of LibQual: A Study of Qualitative and Quantitative Survey Results at the University of Mississippi 2001-2010

    ERIC Educational Resources Information Center

    Greenwood, Judy T.; Watson, Alex P.; Dennis, Melissa

    2011-01-01

    This article analyzes quantitative adequacy gap scores and coded qualitative comments from LibQual surveys at the University of Mississippi from 2001 to 2010, looking for relationships between library policy changes and LibQual results and any other trends that emerged. Analysis found no relationship between changes in policy and survey results

  16. Development of a Univariate Membrane-Based Mid-Infrared Method for Protein Quantitation and Total Lipid Content Analysis of Biological Samples

    PubMed Central

    Cappione, Amedeo; Lento, Joseph; Chernokalskaya, Elena

    2014-01-01

    Biological samples present a range of complexities from homogeneous purified protein to multicomponent mixtures. Accurate qualification of such samples is paramount to downstream applications. We describe the development of an MIR spectroscopy-based analytical method offering simultaneous protein quantitation (0.25–5 mg/mL) and analysis of total lipid or detergent species, as well as the identification of other biomolecules present in biological samples. The method utilizes a hydrophilic PTFE membrane engineered for presentation of aqueous samples in a dried format compatible with fast infrared analysis. Unlike classical quantification techniques, the reported method is amino acid sequence independent and thus applicable to complex samples of unknown composition. By comparison to existing platforms, this MIR-based method enables direct quantification using minimal sample volume (2 µL); it is well-suited where repeat access and limited sample size are critical parameters. Further, accurate results can be derived without specialized training or knowledge of IR spectroscopy. Overall, the simplified application and analysis system provides a more cost-effective alternative to high-throughput IR systems for research laboratories with minimal throughput demands. In summary, the MIR-based system provides a viable alternative to current protein quantitation methods; it also uniquely offers simultaneous qualification of other components, notably lipids and detergents. PMID:25371845

  17. Trojan Horse Method: recent results in nuclear astrophysics

    NASA Astrophysics Data System (ADS)

    Spitaleri, C.; Lamia, L.; Gimenez Del Santo, M.; Burjan, V.; Carlin, N.; Li, Chengbo; Cherubini, S.; Crucilla, V.; Gulino, M.; Hons, Z.; Kroha, V.; Irgaziev, B.; La Cognata, M.; Mrazek, J.; Mukhamedzhanov, M.; Munhoz, M. G.; Palmerini, S.; Pizzone, R. G.; Puglia, M. R.; Rapisarda, G. G.; Romano, S.; Sergi, L.; Zhou, Shu-Hua; Somorjai, E.; Souza, F. A.; Tabacaru, G.; Szanto de Toledo, A.; Tumino, A.; Wen, Qungang; Wakabayashi, Y.; Yamaguchi, H.

    2015-07-01

    Accurate knowledge of thermonuclear reaction rates is important in understanding the energy generation, the neutrino luminosity and the synthesis of elements in stars. The physical conditions under which the majority of astrophysical reactions proceed in stellar environments make it difficult or impossible to measure them under the same conditions in the laboratory. That is why different indirect techniques are being used along with direct measurements. The Trojan Horse Method (THM) is introduced as an independent technique to obtain the bare nucleus astrophysical S(E)-factor. As examples, the results of recent applications of the THM to the 2H(11B, α0 8Be)n and 2H(10B, α0 7Be)n reactions are presented.

  18. [Colony blot method for detection of legionellas--results of a comparative study].

    PubMed

    Obst, U

    1996-11-01

    In this short communication, a new commercially available immunoassay for the quantitative detection of Legionellae after cultivation was compared with the conventional method recommended by ISO in a study shared by 6 laboratories. After a training phase for the laboratory personnel, very good agreement between the immunological and the conventional method was observed in testing 310 water samples. The colony blot assay for quantitative identification of Legionella spec. is a rapid and specific method and can be recommended for quantification of Legionella spec. in water samples. PMID:9409910

  19. Quantitative evaluation of an image registration method for a NIPAM gel dosimeter

    NASA Astrophysics Data System (ADS)

    Chang, Yuan-Jen; Yao, Chun-Hsu; Wu, Jay; Hsieh, Bor-Tsung; Tsang, Yuk-Wah; Chen, Chin-Hsing

    2015-06-01

    One of the problems in obtaining quality results is image registration when a gel dosimeter is used in conjunction with optical computed tomography (CT). This study proposes a passive alignment mechanism to obtain a precisely measured dose map. A holder plate with two pin-hole pairs is placed on the gel container cap. These two pin-hole pairs attach the gel container to the vertical shaft and can be precisely aligned with the rotation center of the vertical shaft at any time. Accordingly, a better reconstructed image quality is obtained. After obtaining a precisely measured dose map, the scale invariant feature transform (SIFT)-flow algorithm is utilized as an image registration method to align the treatment plan software (TPS) image with the measured dose map image. The results show that the gamma pass rate for the single-field irradiation increases from 83.39% to 94.03% when the algorithm is applied. And the gamma pass rate for the five-field irradiation treatment plan increases from 87.36% to 94.34%. The translation, scaling, and rotation occurring in the dose map image constructed using an optical CT scanner are also aligned with those in the TPS image using the SIFT-flow algorithm. Accordingly, improved gamma comparison results and a higher gamma pass rate are obtained.
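
    Gamma pass rates like those quoted above come from a gamma-index comparison between the measured dose map and the TPS map. The brute-force sketch below shows a global 2D gamma evaluation; the 3%/3 mm tolerances and the low-dose threshold are placeholders, since the study's exact criteria are not stated in the abstract.

```python
import numpy as np

def gamma_pass_rate(dose_eval, dose_ref, pixel_mm, dose_tol=0.03, dist_tol_mm=3.0, threshold=0.1):
    """Global gamma pass rate (percent) between two same-shape 2D dose maps, brute force."""
    ny, nx = dose_ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    d_max = dose_ref.max()
    passed, evaluated = 0, 0
    for iy in range(ny):
        for ix in range(nx):
            if dose_ref[iy, ix] < threshold * d_max:
                continue  # skip low-dose region
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * pixel_mm ** 2
            ddose2 = (dose_eval - dose_ref[iy, ix]) ** 2
            gamma2 = dist2 / dist_tol_mm ** 2 + ddose2 / (dose_tol * d_max) ** 2
            evaluated += 1
            passed += gamma2.min() <= 1.0  # point passes if minimum gamma <= 1
    return 100.0 * passed / evaluated if evaluated else np.nan
```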

  20. Comparison of the Diagnostic Performance of Four Quantitative Myocardial Perfusion Estimation Methods Used in Cardiac MR Imaging: CE-MARC Substudy

    PubMed Central

    Magee, Derek R.; Sourbron, Steven P.; Plein, Sven; Greenwood, John P.; Radjenovic, Aleksandra

    2015-01-01

    Purpose To compare the diagnostic performance of four tracer kinetic analysis methods to quantify myocardial perfusion from magnetic resonance (MR) imaging cardiac perfusion data sets in terms of their ability to lead to the diagnosis of myocardial ischemia. Materials and Methods The study was approved by the regional ethics committee, and all patients gave written consent. A representative sample of 50 patients with suspected ischemic heart disease was retrospectively selected from the Clinical Evaluation of Magnetic Resonance Imaging in Coronary Heart Disease trial data set. Quantitative myocardial blood flow (MBF) was estimated from rest and adenosine stress MR imaging perfusion data sets by using four established methods. A matching diagnosis of both an inducible defect as assessed with single photon emission computed tomography and a luminal stenosis of 70% or more as assessed with quantitative x-ray angiography was used as the reference standard for the presence of myocardial ischemia. Diagnostic performance was evaluated with receiver operating characteristic (ROC) curve analysis for each method, with stress MBF and myocardial perfusion reserve (MPR) serving as continuous measures. Results Area under the ROC curve with stress MBF and MPR as the outcome measures, respectively, was 0.86 and 0.92 for the Fermi model, 0.85 and 0.87 for the uptake model, 0.85 and 0.80 for the one-compartment model, and 0.87 and 0.87 for model-independent deconvolution. There was no significant difference between any of the models or between MBF and MPR, except that the Fermi model outperformed the one-compartment model if MPR was used as the outcome measure (P = .02). Conclusion Diagnostic performance of quantitative myocardial perfusion estimates is not affected by the tracer kinetic analysis method used. © RSNA, 2014 Online supplemental material is available for this article. PMID:25521666
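
    The two continuous outcome measures used in the ROC analysis are stress MBF itself and the myocardial perfusion reserve, the ratio of stress to rest MBF. The sketch below uses hypothetical per-patient values; the scikit-learn call is standard, and everything else is made up for illustration.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def myocardial_perfusion_reserve(mbf_stress, mbf_rest):
    """MPR is the ratio of stress to rest myocardial blood flow."""
    return np.asarray(mbf_stress) / np.asarray(mbf_rest)

# Hypothetical per-patient values (1 = ischemia by the reference standard)
ischemia = np.array([1, 0, 1, 0, 0, 1])
mbf_stress = np.array([1.1, 2.8, 1.3, 3.0, 2.5, 1.0])   # mL/min/g
mbf_rest = np.array([0.9, 1.0, 1.0, 1.1, 0.9, 0.8])
mpr = myocardial_perfusion_reserve(mbf_stress, mbf_rest)
# Lower flow indicates ischemia, so negate the measures before computing the AUC.
print(roc_auc_score(ischemia, -mbf_stress), roc_auc_score(ischemia, -mpr))
```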

  1. Current and emerging quantitative magnetic resonance imaging methods for assessing and predicting the response of breast cancer to neoadjuvant therapy

    PubMed Central

    Abramson, Richard G; Arlinghaus, Lori R; Weis, Jared A; Li, Xia; Dula, Adrienne N; Chekmenev, Eduard Y; Smith, Seth A; Miga, Michael I; Abramson, Vandana G; Yankeelov, Thomas E

    2012-01-01

    Reliable early assessment of breast cancer response to neoadjuvant therapy (NAT) would provide considerable benefit to patient care and ongoing research efforts, and demand for accurate and noninvasive early-response biomarkers is likely to increase. Response assessment techniques derived from quantitative magnetic resonance imaging (MRI) hold great potential for integration into treatment algorithms and clinical trials. Quantitative MRI techniques already available for assessing breast cancer response to neoadjuvant therapy include lesion size measurement, dynamic contrast-enhanced MRI, diffusion-weighted MRI, and proton magnetic resonance spectroscopy. Emerging yet promising techniques include magnetization transfer MRI, chemical exchange saturation transfer MRI, magnetic resonance elastography, and hyperpolarized MR. Translating and incorporating these techniques into the clinical setting will require close attention to statistical validation methods, standardization and reproducibility of technique, and scanning protocol design. PMID:23154619

  2. Potential of multivariate quantitative methods for delineation and visualization of ecoregions.

    PubMed

    Hargrove, William W; Hoffman, Forrest M

    2004-01-01

    Multivariate clustering based on fine spatial resolution maps of elevation, temperature, precipitation, soil characteristics, and solar inputs has been used at several specified levels of division to produce a spectrum of quantitative ecoregion maps for the conterminous United States. The coarse ecoregion divisions accurately capture intuitively-understood regional environmental differences, whereas the finer divisions highlight local condition gradients, ecotones, and clines. Such statistically generated ecoregions can be produced based on user-selected continuous variables, allowing customized regions to be delineated for any specific problem. By creating an objective ecoregion classification, the ecoregion concept is removed from the limitations of human subjectivity, making possible a new array of ecologically useful derivative products. A red-green-blue visualization based on principal components analysis of ecoregion centroids indicates with color the relative combination of environmental conditions found within each ecoregion. Multiple geographic areas can be classified into a single common set of quantitative ecoregions to provide a basis for comparison, or maps of a single area through time can be classified to portray climatic or environmental changes geographically in terms of current conditions. Quantified representativeness can characterize borders between ecoregions as gradual, sharp, or of changing character along their length. Similarity of any ecoregion to all other ecoregions can be quantified and displayed as a "representativeness" map. The representativeness of an existing spatial array of sample locations or study sites can be mapped relative to a set of quantitative ecoregions, suggesting locations for additional samples or sites. In addition, the shape of Hutchinsonian niches in environment space can be defined if a multivariate range map of species occurrence is available. PMID:15883870
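
    A generic sketch of the workflow described, multivariate clustering of per-cell environmental variables followed by a principal-components red-green-blue coloring of the cluster centroids, is given below. The scikit-learn calls are standard; the cluster count and variable set are placeholders rather than the authors' configuration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def quantitative_ecoregions(env_table, n_regions=50, random_state=0):
    """env_table: (n_cells, n_variables) array of elevation, temperature,
    precipitation, soil, and solar variables for each map cell."""
    X = StandardScaler().fit_transform(env_table)
    km = KMeans(n_clusters=n_regions, n_init=10, random_state=random_state).fit(X)
    labels = km.labels_                      # ecoregion id of each map cell
    # First three principal components of the centroids, rescaled to 0-1, give an
    # RGB color expressing each ecoregion's combination of environmental conditions.
    pcs = PCA(n_components=3).fit_transform(km.cluster_centers_)
    rgb = (pcs - pcs.min(axis=0)) / (pcs.max(axis=0) - pcs.min(axis=0))
    return labels, rgb
```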

  3. A quantitative method for measuring the adherence of group B streptococci to murine peritoneal exudate macrophages.

    PubMed

    Sloan, A R; Pistole, T G

    1992-10-01

    We have developed a solid phase, direct binding, enzyme-linked immunosorbent assay (ELISA) to detect and quantify the adherence of group B streptococci to murine macrophages. The assay correlated well with direct microscopic quantification of adherence. As few as 3.8 × 10^4 bacteria/assay well or less than one bacterium per macrophage could be detected. This assay is both quantitative and selective, and is readily adaptable for multiple sample analysis. It provides a valuable alternative to visual detection of bacterial adherence. PMID:1401955

  4. A versatile, quantitative analytical method for pharmaceutical relevant lipids in drug delivery systems.

    PubMed

    Jeschek, Dominik; Lhota, Gabriele; Wallner, Jakob; Vorauer-Uhl, Karola

    2016-02-01

    Over the past few years, liposomal formulations as drug carrier systems have markedly advanced in pharmaceutical research and development. Therefore, analytical methods to characterize liposome-based formulations are required. One particular issue in liposome analysis is the imbalance of lipid ratios within the vesicle formulations and the detectability of degradation products such as lysophospholipids and fatty acids caused by hydrolysis, especially in low molar ranges. Here, a highly sensitive and selective reversed-phase high-performance liquid chromatography (rp-HPLC) method is described by the combination of an organic solvent/trifluoroacetic acid (TFA) triggered gradient and the application of an evaporative light scattering detector (ELSD). Gain setting adjustments of the ELSD were applied to obtain an optimal detection profile of the analyzed substances. This optimization provides simultaneous separation and quantification of 16 components, including different phosphatidylcholines, phosphatidylglycerols and their degradation products, as well as cholesterol. Parameters such as limit of detection (LOD) and limit of quantification (LOQ) were determined for each of the components and ranged from 0.25-1.00 µg/mL (LOD) and 0.50-2.50 µg/mL (LOQ), respectively. The intra-day precision for all analytes is less than 3% (RSD) and inter-day precision is about 8%. The applicability of the method was verified by analyzing two different liposome formulations consisting of DSPC:DPPC:DSPG:Chol (35:35:20:10) and DSPC:DPPC:DSPG (38:38:24). For degradation studies, both formulations were stored at 4°C and at ambient temperature. Additionally, forced degradation experiments were performed to determine hydrolysis mass balances. A total recovery of 96-102% for phospholipid compounds was found. Analytical data revealed that the sensitivity, selectivity, accuracy, and resolution are appropriate for the detection and quantification of phospholipids and their hydrolysis products. These results as well as additional preliminary analyses of other relevant components used in liposomal formulations indicate that the developed method is suitable for the development, characterization, and stability testing of liposomal based biopharmaceuticals. PMID:26641705
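
    The abstract reports LOD and LOQ values but not how they were derived. One common ICH-style estimate, from the residual standard deviation and slope of a calibration line, is sketched below as an illustrative possibility; it is not necessarily the authors' procedure.

```python
import numpy as np

def lod_loq_from_calibration(conc, response):
    """Estimate LOD and LOQ from a linear calibration curve (ICH Q2-style 3.3*sigma/S and 10*sigma/S)."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = np.asarray(response) - (slope * np.asarray(conc) + intercept)
    sigma = residuals.std(ddof=2)            # residual standard deviation of the fit
    return 3.3 * sigma / slope, 10.0 * sigma / slope
```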

  5. QUANTITATIVE METHODS FOR RESERVOIR CHARACTERIZATION AND IMPROVED RECOVERY: APPLICATION TO HEAVY OIL SANDS

    SciTech Connect

    James W. Castle; Fred J. Molz; Ronald W. Falta; Cynthia L. Dinwiddie; Scott E. Brame; Robert A. Bridges

    2002-10-30

    Improved prediction of interwell reservoir heterogeneity has the potential to increase productivity and to reduce recovery cost for California's heavy oil sands, which contain approximately 2.3 billion barrels of remaining reserves in the Temblor Formation and in other formations of the San Joaquin Valley. This investigation involves application of advanced analytical property-distribution methods conditioned to continuous outcrop control for improved reservoir characterization and simulation, particularly in heavy oil sands. The investigation was performed in collaboration with Chevron Production Company U.S.A. as an industrial partner, and incorporates data from the Temblor Formation in Chevron's West Coalinga Field. Observations of lateral variability and vertical sequences in Temblor Formation outcrops have led to a better understanding of reservoir geology in West Coalinga Field. Based on the characteristics of stratigraphic bounding surfaces in the outcrops, these surfaces were identified in the subsurface using cores and logs. The bounding surfaces were mapped and then used as reference horizons in the reservoir modeling. Facies groups and facies tracts were recognized from outcrops and cores of the Temblor Formation and were applied to defining the stratigraphic framework and facies architecture for building 3D geological models. The following facies tracts were recognized: incised valley, estuarine, tide- to wave-dominated shoreline, diatomite, and subtidal. A new minipermeameter probe, which has important advantages over previous methods of measuring outcrop permeability, was developed during this project. The device, which measures permeability at the distal end of a small drillhole, avoids surface weathering effects and provides a superior seal compared with previous methods for measuring outcrop permeability. The new probe was used successfully for obtaining a high-quality permeability data set from an outcrop in southern Utah. Results obtained from analyzing the fractal structure of permeability data collected from the southern Utah outcrop and from core permeability data provided by Chevron from West Coalinga Field were used in distributing permeability values in 3D reservoir models. Spectral analyses and the Double Trace Moment method (Lavallee et al., 1991) were used to analyze the scaling and multifractality of permeability data from cores from West Coalinga Field. T2VOC, which is a numerical flow simulator capable of modeling multiphase, multi-component, nonisothermal flow, was used to model steam injection and oil production for a portion of section 36D in West Coalinga Field. The layer structure and permeability distributions of different models, including facies group, facies tract, and fractal permeability models, were incorporated into the numerical flow simulator. The injection and production histories of wells in the study area were modeled, including shutdowns and the occasional conversion of production wells to steam injection wells. The framework provided by facies groups gives a more realistic representation of the reservoir conditions than facies tracts, as revealed by a comparison of the history matching for oil production. Permeability distributions obtained using the fractal results predict the high degree of heterogeneity within the reservoir sands of West Coalinga Field. The modeling results indicate that predictions of oil production are strongly influenced by the geologic framework and by the boundary conditions.
The permeability data collected from the southern Utah outcrop support a new concept for representing natural heterogeneity, called the fractal/facies concept. This hypothesis is one of the few potentially simplifying concepts to emerge from recent studies of geological heterogeneity. Further investigation of this concept should be done to more fully apply fractal analysis to reservoir modeling and simulation. Additional outcrop permeability data sets and further analysis of the data from distinct facies will be needed in order to fully develop

  6. A colony multiplex quantitative PCR-Based 3S3DBC method and variations of it for screening DNA libraries.

    PubMed

    An, Yang; Toyoda, Atsushi; Zhao, Chen; Fujiyama, Asao; Agata, Kiyokazu

    2015-01-01

    A DNA library is a collection of DNA fragments cloned into vectors and stored individually in host cells, and is a valuable resource for molecular cloning, gene physical mapping, and genome sequencing projects. To take the best advantage of a DNA library, a good screening method is needed. After describing pooling strategies and issues that should be considered in DNA library screening, here we report an efficient colony multiplex quantitative PCR-based 3-step, 3-dimension, and binary-code (3S3DBC) method we used to screen genes from a planarian genomic DNA fosmid library. This method requires only 3 rounds of PCR reactions and only around 6 hours to distinguish one or more desired clones from a large DNA library. According to the particular situations in different research labs, this method can be further modified and simplified to suit their requirements. PMID:25646755

  8. Comparison of two solid-phase microextraction methods for the quantitative analysis of VOCs in indoor air.

    PubMed

    Larroque, Virginie; Desauziers, Valérie; Mocho, Pierre

    2006-11-01

    Competitive adsorption on adsorptive solid-phase microextraction (SPME) fibres implies careful determination of operating conditions for reliable quantitative analysis of VOCs in indoor air. With this objective, two analytical approaches, involving non-equilibrium and equilibrium extraction, were compared. The average detection limit obtained for GC-MS analysis of nine VOCs by the equilibrium method is 0.2 microg m(-3), compared with 1.9 microg m(-3) with the non-equilibrium method. The effect of the relative humidity of the air on the calibration plots was studied, and shown to affect acetone adsorption only. Hence, the concentrations that can be accurately determined are up to 9 micromol m(-3). The methods were then applied to indoor air containing different concentrations of VOCs. The non-equilibrium method, involving short extraction time, can be used for detection of pollution peaks whereas equilibrium extraction is preferable for measurement of sub-microg m(-3) ground concentration levels. PMID:16955258

  9. HPLC Method for Simultaneous Quantitative Detection of Quercetin and Curcuminoids in Traditional Chinese Medicines

    PubMed Central

    Ang, Lee Fung; Yam, Mun Fei; Fung, Yvonne Tan Tze; Kiang, Peh Kok; Darwin, Yusrida

    2014-01-01

    Objectives: Quercetin and curcuminoids are important bioactive compounds found in many herbs. Previously reported high performance liquid chromatography ultraviolet (HPLC-UV) methods for the detection of quercetin and curcuminoids have several disadvantages, including unsatisfactory separation times and lack of validation according to the standard guidelines of the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use. Methods: A rapid, specific, reversed phase HPLC-UV method with an isocratic elution of acetonitrile and 2% v/v acetic acid (40% : 60% v/v) (pH 2.6) at a flow rate of 1.3 mL/min, a column temperature of 35°C, and ultraviolet (UV) detection at 370 nm was developed. The method was validated and applied to the quantification of different types of commercially available Chinese medicine extracts, pills and tablets. Results: The method allowed simultaneous determination of quercetin, bisdemethoxycurcumin, demethoxycurcumin and curcumin in the concentration ranges of 0.00488–200 µg/mL, 0.625–320 µg/mL, 0.07813–320 µg/mL and 0.03906–320 µg/mL, respectively. The limits of detection and quantification, respectively, were 0.00488 and 0.03906 µg/mL for quercetin, 0.62500 and 2.50000 µg/mL for bisdemethoxycurcumin, 0.07813 and 0.31250 µg/mL for demethoxycurcumin, and 0.03906 and 0.07813 µg/mL for curcumin. The percent relative intra-day standard deviation (% RSD) values were 0.432–0.806 µg/mL, 0.576–0.723 µg/mL, 0.635–0.752 µg/mL and 0.655–0.732 µg/mL for quercetin, bisdemethoxycurcumin, demethoxycurcumin and curcumin, respectively, and those for inter-day precision were 0.323–0.968 µg/mL, 0.805–0.854 µg/mL, 0.078–0.844 µg/mL and 0.275–0.829 µg/mL, respectively. The intra-day accuracies were 99.589%–100.821%, 98.588%–101.084%, 9.289%–100.88%, and 98.292%–101.022% for quercetin, bisdemethoxycurcumin, demethoxycurcumin and curcumin, respectively, and the inter-day accuracies were 99.665%–103.06%, 97.669%–103.513%, 99.569%–103.617%, and 97.929%–103.606%, respectively. Conclusion: The method was found to be simple, accurate and precise and is recommended for routine quality control analysis of commercial Chinese medicine products containing the four flavonoids as their principal components in the extracts. PMID:25780718

  10. A novel quantitative analysis method of three-dimensional fluorescence spectra for vegetable oils contents in edible blend oil

    NASA Astrophysics Data System (ADS)

    Xu, Jing; Wang, Yu-Tian; Liu, Xiao-Fei

    2015-04-01

    Edible blend oil is a mixture of vegetable oils. A properly formulated blend oil can meet the daily human need for the two essential fatty acids and thereby achieve balanced nutrition. Each vegetable oil has a different composition, so the vegetable oil contents in edible blend oil determine its nutritional components. A high-precision quantitative analysis method for detecting the vegetable oil contents in blend oil is therefore necessary to ensure balanced nutrition. Three-dimensional fluorescence spectroscopy offers high selectivity, high sensitivity, and high efficiency. Efficient extraction and full use of the information in three-dimensional fluorescence spectra improve the accuracy of the measurement. A novel quantitative analysis method based on quasi-Monte Carlo integration is proposed to improve the measurement sensitivity and reduce the random error. The partial least squares method is used to solve the nonlinear equations and avoid the effect of multicollinearity. The recovery rates for blend oil mixed from peanut oil, soybean oil, and sunflower oil are calculated to verify the accuracy of the method; they are higher than those of the linear method commonly used for component concentration measurement.
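
    A minimal sketch of the partial least squares step described above, with each three-dimensional fluorescence spectrum unfolded into a vector before regression, is shown below. The component count and array shapes are assumptions, and the quasi-Monte Carlo integration step is not included.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def fit_blend_oil_model(eem_stack, oil_fractions, n_components=5):
    """eem_stack     : (n_samples, n_excitation, n_emission) fluorescence data
    oil_fractions : (n_samples, n_oils) known vegetable-oil contents."""
    X = eem_stack.reshape(eem_stack.shape[0], -1)   # unfold each EEM into a row vector
    pls = PLSRegression(n_components=n_components).fit(X, oil_fractions)
    return pls

# Prediction for a new blend sample: pls.predict(new_eem.reshape(1, -1))
```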

  11. A qualitative and quantitative laser-based computer-aided flow visualization method. M.S. Thesis, 1992 Final Report

    NASA Technical Reports Server (NTRS)

    Canacci, Victor A.; Braun, M. Jack

    1994-01-01

    The experimental approach presented here offers a nonintrusive, qualitative and quantitative evaluation of full field flow patterns applicable in various geometries in a variety of fluids. This Full Flow Field Tracking (FFFT) Particle Image Velocimetry (PIV) technique, by means of particle tracers illuminated by a laser light sheet, offers an alternative to Laser Doppler Velocimetry (LDV), and intrusive systems such as Hot Wire/Film Anemometry. The method makes obtainable the flow patterns, and allows quantitative determination of the velocities, accelerations, and mass flows of an entire flow field. The method uses a computer based digitizing system attached through an imaging board to a low luminosity camera. A customized optical train allows the system to become a long distance microscope (LDM), allowing magnifications of areas of interest ranging up to 100 times. Presented in addition to the method itself, are studies in which the flow patterns and velocities were observed and evaluated in three distinct geometries, with three different working fluids. The first study involved pressure and flow analysis of a brush seal in oil. The next application involved studying the velocity and flow patterns in a cowl lip cooling passage of an air breathing aircraft engine using water as the working fluid. Finally, the method was extended to a study in air to examine the flows in a staggered pin arrangement located on one side of a branched duct.
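
    The velocity estimates in particle image velocimetry come from the displacement of particle patterns between successive frames. A generic sketch of the standard FFT cross-correlation estimate for one pair of interrogation windows is shown below; it is not the thesis' exact implementation, and the conversion to velocity depends on the image scale and frame interval.

```python
import numpy as np

def piv_displacement(window_a, window_b):
    """Pixel displacement of window_b relative to window_a via FFT cross-correlation."""
    a = window_a - window_a.mean()
    b = window_b - window_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real  # circular cross-correlation
    corr = np.fft.fftshift(corr)
    peak = np.unravel_index(np.argmax(corr), corr.shape)              # correlation peak location
    center = np.array(corr.shape) // 2
    dy, dx = np.array(peak) - center
    return dx, dy   # pixels; velocity = displacement * image_scale / frame_interval
```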

  12. Cross-talk of expression quantitative trait loci w... [Hypertension. 2007 Dec;50(6):1126-33]

    E-print Network

    Abraham, Nader G.

    Cross-talk of expression quantitative trait loci w... Hypertension. 2007 Dec;50(6):1126-33. Epub 2007 Oct 15.

  13. Transcending the Quantitative-Qualitative Divide with Mixed Methods Research: A Multidimensional Framework for Understanding Congruence and Completeness in the Study of Values

    ERIC Educational Resources Information Center

    McLafferty, Charles L., Jr.; Slate, John R.; Onwuegbuzie, Anthony J.

    2010-01-01

    Quantitative research dominates published literature in the helping professions. Mixed methods research, which integrates quantitative and qualitative methodologies, has received a lukewarm reception. The authors address the iterative separation that infuses theory, praxis, philosophy, methodology, training, and public perception and propose a…

  14. On Quantitizing

    ERIC Educational Resources Information Center

    Sandelowski, Margarete; Voils, Corrine I.; Knafl, George

    2009-01-01

    "Quantitizing", commonly understood to refer to the numerical translation, transformation, or conversion of qualitative data, has become a staple of mixed methods research. Typically glossed are the foundational assumptions, judgments, and compromises involved in converting disparate data sets into each other and whether such conversions advance…

  15. HUNTING THE COOLEST DWARFS: METHODS AND EARLY RESULTS

    SciTech Connect

    Schneider, A.; Song, Inseok; Melis, Carl; Zuckerman, B. E-mail: song@physast.uga.edu E-mail: ben@astro.ucla.edu

    2011-12-20

    We present the methods and first results of a survey of nearby high proper motion main-sequence stars to probe for cool companions with the Gemini camera at Lick Observatory. This survey uses a sample of old (age > 2 Gyr) stars as targets to probe for companions down to temperatures of 500 K. Multi-epoch observations allow us to discriminate comoving companions from background objects. So far, our survey has successfully rediscovered the wide T8.5 companion to GJ 1263 and has discovered a companion to the nearby M0V star GJ 660.1. The companion to GJ 660.1 (GJ 660.1B) is ~4 mag fainter than its host star in the J-band and is located at a projected separation of ~120 AU. Known trigonometric parallax and Two Micron All Sky Survey magnitudes for the GJ 660.1 system indicate a spectral type for the companion of M9 ± 2.

  16. HUMAN FECAL SOURCE IDENTIFICATION: REAL-TIME QUANTITATIVE PCR METHOD STANDARDIZATION - abstract

    EPA Science Inventory

    Method standardization or the formal development of a protocol that establishes uniform performance benchmarks and practices is necessary for widespread adoption of a fecal source identification approach. Standardization of a human-associated fecal identification method has been...

  18. Development and validation of a fast and sensitive bioanalytical method for the quantitative determination of glucocorticoids-quantitative measurement of dexamethasone in rabbit ocular matrices by liquid chromatography tandem mass spectrometry

    PubMed Central

    Earla, Ravinder; Boddu, Sai HS.; Cholkar, Kishore; Hariharan, Sudharshan; Jwala, Jwala; Mitra, Ashim K.

    2010-01-01

    A sensitive, selective, accurate and robust LC-MS/MS method was developed and validated for the quantitative determination of glucocorticoids in rabbit ocular tissues. Samples were processed by a simple liquid-liquid extraction procedure. Chromatographic separation was performed on a Phenomenex reversed phase C18 Gemini column (50 × 4.6 mm i.d.) with an isocratic mobile phase composed of 30% acetonitrile in water containing 0.1% formic acid, at a flow rate of 0.2 mL/min. Dexamethasone (DEX), prednisolone (PD) and hydrocortisone (HD) were detected as proton adducts at m/z 393.20→355.30, 361.30→147.20 and 363.20→121.0 in multiple reaction monitoring (MRM) positive mode, respectively. Finally, 50 µL of 0.1% novel DEX mixed micellar formulation was topically administered to a rabbit eye and concentrations were measured. The method was validated over a linear concentration range of 2.7–617.6 ng/mL. The lower limit of quantitation (LLOQ) for DEX and PD was 2.7 and 11.0 ng/mL, respectively. The resulting method demonstrated intra- and inter-day precision within 13.3% and 11.1% and accuracy within 19.3% and 12.5% for DEX and PD, respectively. Both analytes were found to be stable throughout freeze–thaw cycles and during bench top and postoperative stability studies (r2 > 0.999). DEX concentrations in various ocular tissue samples, i.e., aqueous humor, cornea, iris ciliary body, sclera and retina choroid, were found to be 344.0, 1050.07, 529.6, 103.9 and 48.5 ng/mg protein, respectively. Absorption of DEX after topical administration from a novel aqueous mixed micellar formulation achieved therapeutic concentration levels in the posterior segment of the rabbit eye. PMID:20172680

  19. A quantitative structure-property relationship of gas chromatographic/mass spectrometric retention data of 85 volatile organic compounds as air pollutant materials by multivariate methods

    PubMed Central

    2012-01-01

    A quantitative structure-property relationship (QSPR) study is presented for the prediction of retention times of volatile organic compounds. Various kinds of molecular descriptors were calculated to represent the molecular structure of the compounds. Retention times were modeled as a function of the theoretically derived descriptors by multiple linear regression (MLR) and an artificial neural network (ANN). Stepwise regression was used to select the variables that give the best-fitted models. After variable selection, the ANN and MLR methods were used with leave-one-out cross-validation to build the regression models. The predictions are in very good agreement with the experimental values. MLR, as the linear regression method, shows good ability to predict the retention times of the prediction set. This provides a new and effective method for predicting the chromatographic retention index of volatile organic compounds. PMID:22594439
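
    A minimal sketch of the MLR-with-leave-one-out step described above, using standard scikit-learn calls; descriptor calculation and stepwise variable selection are not included, and the Q² definition shown is the usual cross-validated coefficient of determination.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

def loo_q2(descriptors, retention_times):
    """Leave-one-out cross-validated predictions and Q^2 for an MLR QSPR model."""
    model = LinearRegression()
    pred = cross_val_predict(model, descriptors, retention_times, cv=LeaveOneOut())
    y = np.asarray(retention_times)
    q2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    return pred, q2
```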

  20. Quantitative spectrally resolved intraoperative fluorescence imaging for neurosurgical guidance in brain tumor surgery: pre-clinical and clinical results

    NASA Astrophysics Data System (ADS)

    Valdés, Pablo A.; Jacobs, Valerie L.; Leblond, Frederic; Wilson, Brian C.; Paulsen, Keith D.; Roberts, David W.

    2014-03-01

    Fluorescence guidance is a useful adjunct to maximize brain tumor resection, but current commercial systems are limited by subjective assessment of fluorescence, low sensitivity and non-spectrally-resolved detection. We present a quantitative, spectrally-resolved system integrated onto a commercial neurosurgical microscope that performs spectrally-resolved detection and corrects for the effects of tissue optical absorption and scattering on the detected fluorescence signal to image the true fluorophore concentration. Pre-clinical studies in rodent glioma models using multiple fluorophores (PpIX, fluorescein) and clinical studies demonstrate improved residual tumor tissue detection. This quantitative, spectrally-resolved technique opens the door to simultaneous image-guided surgery of multiple fluorophores in the visible and near infrared.

  1. Longitudinal, intermodality registration of quantitative breast PET and MRI data acquired before and during neoadjuvant chemotherapy: Preliminary results

    SciTech Connect

    Atuegwu, Nkiruka C.; Williams, Jason M.; Li, Xia; Arlinghaus, Lori R.; Abramson, Richard G.; Department of Radiology and Radiological Sciences, Vanderbilt University Medical Center, Nashville, Tennessee 37232-2675; Vanderbilt Ingram Cancer Center, Vanderbilt University Medical Center, Nashville, Tennessee 37232-6838 ; Chakravarthy, A. Bapsi; Abramson, Vandana G.; Yankeelov, Thomas E.

    2014-05-15

    Purpose: The authors propose a method whereby serially acquired DCE-MRI, DW-MRI, and FDG-PET breast data sets can be spatially and temporally coregistered to enable the comparison of changes in parameter maps at the voxel level. Methods: First, the authors aligned the PET and MR images at each time point rigidly and nonrigidly. To register the MR images longitudinally, the authors extended a nonrigid registration algorithm by including a tumor volume-preserving constraint in the cost function. After the PET images were aligned to the MR images at each time point, the authors then used the transformation obtained from the longitudinal registration of the MRI volumes to register the PET images longitudinally. The authors tested this approach on ten breast cancer patients by calculating a modified Dice similarity of tumor size between the PET and MR images as well as the bending energy and changes in the tumor volume after the application of the registration algorithm. Results: The median of the modified Dice in the registered PET and DCE-MRI data was 0.92. For the longitudinal registration, the median tumor volume change was −0.03% for the constrained algorithm, compared to −32.16% for the unconstrained registration algorithms (p = 8 × 10⁻⁶). The medians of the bending energy were 0.0092 and 0.0001 for the unconstrained and constrained algorithms, respectively (p = 2.84 × 10⁻⁷). Conclusions: The results indicate that the proposed method can accurately spatially align DCE-MRI, DW-MRI, and FDG-PET breast images acquired at different time points during therapy while preventing the tumor from being substantially distorted or compressed.
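
    For reference, the ordinary (unmodified) Dice similarity between two binary tumor masks can be computed as below. The paper's modified Dice and volume-preserving constraint are not reproduced here, so this is only an illustrative sketch with hypothetical masks.

        import numpy as np

        def dice(mask_a, mask_b):
            """Dice similarity coefficient between two boolean volumes of equal shape."""
            a = np.asarray(mask_a, dtype=bool)
            b = np.asarray(mask_b, dtype=bool)
            overlap = np.logical_and(a, b).sum()
            return 2.0 * overlap / (a.sum() + b.sum())

        # Hypothetical 3D tumor masks (e.g., segmented from PET and DCE-MRI after registration)
        pet_mask = np.zeros((10, 10, 10), dtype=bool); pet_mask[2:7, 2:7, 2:7] = True
        mri_mask = np.zeros((10, 10, 10), dtype=bool); mri_mask[3:8, 2:7, 2:7] = True
        print(f"Dice = {dice(pet_mask, mri_mask):.2f}")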

  2. Validation of Quantitative HPLC Method for Bacosides in KeenMind.

    PubMed

    Dowell, Ashley; Davidson, George; Ghosh, Dilip

    2015-01-01

    Brahmi (Bacopa monnieri) has been used by Ayurvedic medical practitioners in India for almost 3000 years. The pharmacological properties of Bacopa monnieri have been studied extensively, and its activities are attributed mainly to the presence of characteristic saponins called "bacosides." Bacosides are a complex mixture of structurally closely related compounds, glycosides of either jujubogenin or pseudojujubogenin. The popularity of herbal medicines and increasing clinical evidence to support associated health claims require standardisation of the phytochemical actives contained in these products. However, unlike allopathic medicines, which typically contain a single active compound, herbal medicines are typically complex mixtures of various phytochemicals. The assay for bacosides in the British Pharmacopoeia monograph for Bacopa monnieri illustrates that only a subset of the bacosides present is included in the calculation of total bacosides. This results in calculated bacoside values significantly lower than those attained for the same material using more inclusive techniques such as UV spectroscopy. This study illustrates some of the problems encountered when applying chemical analysis for standardisation of herbal medicines, particularly in relation to new method development and validation of bacosides from KeenMind. PMID:26448776

  3. Validation of Quantitative HPLC Method for Bacosides in KeenMind

    PubMed Central

    Dowell, Ashley; Davidson, George; Ghosh, Dilip

    2015-01-01

    Brahmi (Bacopa monnieri) has been used by Ayurvedic medical practitioners in India for almost 3000 years. The pharmacological properties of Bacopa monnieri have been studied extensively, and its activities are attributed mainly to the presence of characteristic saponins called “bacosides.” Bacosides are a complex mixture of structurally closely related compounds, glycosides of either jujubogenin or pseudojujubogenin. The popularity of herbal medicines and increasing clinical evidence to support associated health claims require standardisation of the phytochemical actives contained in these products. However, unlike allopathic medicines, which typically contain a single active compound, herbal medicines are typically complex mixtures of various phytochemicals. The assay for bacosides in the British Pharmacopoeia monograph for Bacopa monnieri illustrates that only a subset of the bacosides present is included in the calculation of total bacosides. This results in calculated bacoside values significantly lower than those attained for the same material using more inclusive techniques such as UV spectroscopy. This study illustrates some of the problems encountered when applying chemical analysis for standardisation of herbal medicines, particularly in relation to new method development and validation of bacosides from KeenMind. PMID:26448776

  4. A new quantitative method for the rapid evaluation of buildings against earthquakes

    SciTech Connect

    Mahmoodzadeh, Amir; Mazaheri, Mohammad Mehdi

    2008-07-08

    At the present time there exist numerous weak buildings which are not able to withstand earthquakes. At the same time, both private and public developers are trying to use scientific methods to prioritize and allocate budgets for reinforcing the above-mentioned structures, because financial resources and time are limited. In recent years the procedure of seismic assessment before rehabilitation of vulnerable buildings has been implemented in many countries. It therefore seems logical to reinforce the existing procedures with the mass of available data about the effects of earthquakes on buildings. The main idea is derived from FMEA (Failure Mode and Effect Analysis) in quality management, where the main procedure is to recognize the failure, the causes, and the priority of each cause and failure. By specifying the causes and effects that lead to a certain shortcoming in structural behavior during earthquakes, an inventory is developed and each building is rated through a yes-or-no procedure. In this way, the rating of the structure is based on standard forms which, along with relative weights, are developed in this study. The resulting rapid-assessment criteria indicate whether the structure is to be demolished, has high, medium or low vulnerability, or is invulnerable.

  5. Quantitation of Synthetic Cannabinoids in Plant Materials Using High Performance Liquid Chromatography with UV Detection (Validated Method).

    PubMed

    Ciolino, Laura A

    2015-09-01

    Plant-based products laced with synthetic cannabinoids have become popular substances of abuse over the last decade. Quantitative analysis of the synthetic cannabinoid content in the laced materials is necessary for health hazard assessments addressing overall exposure and toxicity when the products are smoked. A validated, broadly applicable HPLC-UV method for the determination of synthetic cannabinoids in plant materials is presented, using acetonitrile extraction and separation on a commercial phenylhexyl stationary phase. UV detection provides excellent sensitivity, with limits of quantitation (LOQs) less than 10 µg/g for many cannabinoids. The method was validated for several structural classes (dibenzopyrans, cyclohexylphenols, naphthoylindoles, benzoylindoles, phenylacetylindoles, tetramethylcyclopropylindoles) based on spike recovery experiments in multiple plant materials over a wide range of cannabinoid contents (0.1-81 mg/g). Average recovery across 32 cannabinoids was 94% for marshmallow leaf, 95% for damiana leaf, and 92% for mullein leaf. The method was applied to a series of case-related products with determined amounts ranging from 0.2 to >100 mg/g. PMID:26175160

  6. Real-Time PCR-Based Quantitation Method for the Genetically Modified Soybean Line GTS 40-3-2.

    PubMed

    Kitta, Kazumi; Takabatake, Reona; Mano, Junichi

    2016-01-01

    This chapter describes a real-time PCR-based method for quantitation of the relative amount of genetically modified (GM) soybean line GTS 40-3-2 [Roundup Ready(®) soybean (RRS)] contained in a batch. The method targets a taxon-specific soybean gene (lectin gene, Le1) and the specific DNA construct junction region between the Petunia hybrida chloroplast transit peptide sequence and the Agrobacterium 5-enolpyruvylshikimate-3-phosphate synthase gene (epsps) sequence present in GTS 40-3-2. The method employs plasmid pMulSL2 as a reference material in order to quantify the relative amount of GTS 40-3-2 in soybean samples using a conversion factor (Cf) equal to the ratio of the RRS-specific DNA to the taxon-specific DNA in representative genuine GTS 40-3-2 seeds. PMID:26614294
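
    A minimal sketch of the final calculation implied above, assuming the real-time PCR Cq values have already been converted to copy numbers via the pMulSL2 plasmid calibration curves; the function name and the numbers below are illustrative placeholders, not values from the protocol.

        def gm_percentage(construct_copies, taxon_copies, cf):
            """Relative GTS 40-3-2 amount (%) from construct-specific and taxon-specific (Le1) copy numbers.

            cf is the conversion factor: the construct/taxon ratio measured in
            representative genuine GTS 40-3-2 seeds.
            """
            return 100.0 * (construct_copies / taxon_copies) / cf

        # Hypothetical copy numbers interpolated from plasmid standard curves
        print(f"GM content = {gm_percentage(1200.0, 50000.0, cf=0.41):.2f} %")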

  7. How does carbon dioxide permeate cell membranes? A discussion of concepts, results and methods

    PubMed Central

    Endeward, Volker; Al-Samir, Samer; Itel, Fabian; Gros, Gerolf

    2013-01-01

    We review briefly how the thinking about the permeation of gases, especially CO2, across cell and artificial lipid membranes has evolved during the last 100 years. We then describe how the recent finding of a drastic effect of cholesterol on CO2 permeability of both biological and artificial membranes fundamentally alters the long-standing idea that CO2—as well as other gases—permeates all membranes with great ease. This requires revision of the widely accepted paradigm that membranes never offer a serious diffusion resistance to CO2 or other gases. Earlier observations of “CO2-impermeable membranes” can now be explained by the high cholesterol content of some membranes. Thus, cholesterol is a membrane component that nature can use to adapt membrane CO2 permeability to the functional needs of the cell. Since cholesterol serves many other cellular functions, it cannot be reduced indefinitely. We show, however, that cells that possess a high metabolic rate and/or a high rate of O2 and CO2 exchange, do require very high CO2 permeabilities that may not be achievable merely by reduction of membrane cholesterol. The article then discusses the alternative possibility of raising the CO2 permeability of a membrane by incorporating protein CO2 channels. The highly controversial issue of gas and CO2 channels is systematically and critically reviewed. It is concluded that a majority of the results considered to be reliable, is in favor of the concept of existence and functional relevance of protein gas channels. The effect of intracellular carbonic anhydrase, which has recently been proposed as an alternative mechanism to a membrane CO2 channel, is analysed quantitatively and the idea considered untenable. After a brief review of the knowledge on permeation of O2 and NO through membranes, we present a summary of the 18O method used to measure the CO2 permeability of membranes and discuss quantitatively critical questions that may be addressed to this method. PMID:24409149

  8. Real time quantitative PCR as a method to evaluate simian virus 40 removal during pharmaceutical protein purification.

    PubMed

    Shi, L; Norling, L A; Lau, A S; Krejci, S; Laney, A J; Xu, Y

    1999-09-01

    Continuous cell lines used for pharmaceutical protein manufacturing have the potential to be contaminated by viruses. To ensure the safety of pharmaceutical proteins derived from continuous cell lines, validation of the ability of the manufacturing process to clear potential contaminating viruses is required for product registration. In this paper, a real time quantitative PCR method has been applied to the evaluation of simian virus 40 (SV40) removal during chromatography and filtration procedures. This method takes advantage of the 5'-3' exonuclease activity of Taq DNA polymerase and utilizes the PRISM 7700 sequence detection system of PE Applied Biosystems for automated SV40 DNA quantification through a dual-labeled fluorogenic probe. This method provides accurate and reproducible quantification of SV40 DNA. The SV40 clearance during chromatography and filtration procedures determined by this method is highly comparable with that determined by the cell-based infectivity assay. This method offers significant advantages over cell-based infectivity assays, such as higher sensitivity, greater reliability, higher sample throughput and lower cost. This method can be potentially used to evaluate the clearance of all model viruses during chromatography and filtration procedures. This method can be used to substitute cell-based infectivity assays for process validation of viral removal procedures and the availability of this method should greatly facilitate and reduce the cost of viral clearance evaluations required for new biologic product development. PMID:10652180
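
    As an illustration of how clearance across a purification step is usually expressed from such quantitative data, the log10 reduction value can be computed as follows; the viral loads are hypothetical and the paper's exact calculation is not reproduced.

        import math

        def log_reduction(load_in, load_out):
            """log10 reduction value for a step, from total viral load before and after."""
            return math.log10(load_in) - math.log10(load_out)

        # Hypothetical SV40 genome loads (copies) in the load and product pool of a column step
        print(f"LRV = {log_reduction(1.0e10, 2.5e5):.1f} log10")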

  9. Methods for the quantitative comparison of molecular estimates of clade age and the fossil record.

    PubMed

    Clarke, Julia A; Boyd, Clint A

    2015-01-01

    Approaches quantifying the relative congruence, or incongruence, of molecular divergence estimates and the fossil record have been limited. Previously proposed methods are largely node specific, assessing incongruence at particular nodes for which both fossil data and molecular divergence estimates are available. These existing metrics, and other methods that quantify incongruence across topologies including entirely extinct clades, have so far not taken into account uncertainty surrounding both the divergence estimates and the ages of fossils. They have also treated molecular divergence estimates younger than previously assessed fossil minimum estimates of clade age as if they were the same as cases in which they were older. However, these cases are not the same. Recovered divergence dates younger than compared oldest known occurrences require prior hypotheses regarding the phylogenetic position of the compared fossil record and standard assumptions about the relative timing of morphological and molecular change to be incorrect. Older molecular dates, by contrast, are consistent with an incomplete fossil record and do not require prior assessments of the fossil record to be unreliable in some way. Here, we compare previous approaches and introduce two new descriptive metrics. Both metrics explicitly incorporate information on uncertainty by utilizing the 95% confidence intervals on estimated divergence dates and data on stratigraphic uncertainty concerning the age of the compared fossils. Metric scores are maximized when these ranges are overlapping. MDI (minimum divergence incongruence) discriminates between situations where molecular estimates are younger or older than known fossils reporting both absolute fit values and a number score for incompatible nodes. DIG range (divergence implied gap range) allows quantification of the minimum increase in implied missing fossil record induced by enforcing a given set of molecular-based estimates. These metrics are used together to describe the relationship between time trees and a set of fossil data, which we recommend be phylogenetically vetted and referred on the basis of apomorphy. Differences from previously proposed metrics and the utility of MDI and DIG range are illustrated in three empirical case studies from angiosperms, ostracods, and birds. These case studies also illustrate the ways in which MDI and DIG range may be used to assess time trees resultant from analyses varying in calibration regime, divergence dating approach or molecular sequence data analyzed. PMID:25281846

  10. A new quantitative evaluation method of spiral drawing for patients with Parkinson’s disease based on a polar coordinate system with varying origin

    NASA Astrophysics Data System (ADS)

    Wang, Min; Wang, Bei; Zou, Junzhong; Nakamura, Masatoshi

    2012-09-01

    Parkinson's disease (PD) is a common disease of the central nervous system among the elderly, and its complex symptoms pose challenges for clinical diagnosis. In this paper, a new method based on a polar coordinate system with varying origin was proposed in order to quantitatively evaluate the performance of patients with Parkinson's disease in spiral drawing tasks, since this method can assess spiral drawing movement ability before and after deep brain stimulation (DBS). Three normal subjects and twelve PD patients participated in the spiral drawing experiment, and the hand movements of the patients before and after DBS were recorded by a digitizing tablet. The variation of origin, radius, angle and other characteristics of the hand movements were evaluated by introducing a set of parameters for feature extraction. The results showed that the proposed polar coordinate system performed well in the quantitative evaluation of spiral drawing. The proposed method therefore overcomes the limitation of fixed-origin data processing for diagnosis and evaluation and, combined with the extraction and analysis of characteristic parameters, has clinical significance in measuring the effectiveness of surgery or treatment for PD patients.

  11. An improved method for quantitative determination of urinary porphyrins by use of second-derivative spectroscopy.

    PubMed

    van de Giessen, A W; van Wijk, E M

    1990-09-01

    An improved assay for quantification of urinary porphyrins by use of second-derivative spectroscopy is described. A new method for calculation of the porphyrin concentration is developed and the whole procedure is computerized. Acidified urine samples can be assayed within a few minutes by using this method. Precision and recoveries for both uro- and coproporphyrin are good. The method is presented as a very fast and accurate assay for the screening and quantification of urinary porphyrins. PMID:2290079
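
    A minimal sketch of the second-derivative step, using a Savitzky-Golay filter as one common smoothing and differentiation choice; the paper's exact calculation procedure is not reproduced, and the absorbance spectrum below is synthetic.

        import numpy as np
        from scipy.signal import savgol_filter

        # Synthetic absorbance spectrum around the coproporphyrin Soret region (~400 nm)
        wavelengths = np.arange(380.0, 430.0, 0.5)
        absorbance = 0.8 * np.exp(-((wavelengths - 401.0) / 6.0) ** 2) + 0.05  # peak + flat background

        # Second derivative of absorbance; the band position is read at the derivative minimum
        d2 = savgol_filter(absorbance, window_length=11, polyorder=3, deriv=2,
                           delta=wavelengths[1] - wavelengths[0])
        print(f"second-derivative minimum at {wavelengths[np.argmin(d2)]:.1f} nm")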

  12. Annual Logging Symposium, May 14-18, 2011 QUANTITATIVE METHOD FOR ESTIMATING TOTAL ORGANIC

    E-print Network

    Torres-Verdín, Carlos

    a technical challenge to the petroleum industry. Conventional log interpretation methods are not applicable to shales. Accurate mineralogy evaluation in addition to geological knowledge can improve the evaluation

  13. Quantitative Determination of Mithramycin in Human Plasma by a Novel, Sensitive ultra-HPLC-MS/MS Method for Clinical Pharmacokinetic Application

    PubMed Central

    Roth, Jeffrey; Peer, Cody J.; Widemann, Brigitte; Cole, Diane E.; Ershler, Rachel; Helman, Lee; Schrump, David; Figg, William D.

    2014-01-01

    Mithramycin is a neoplastic antibiotic synthesized by various Streptomyces bacteria. It is under investigation as a chemotherapeutic treatment for a wide variety of cancers. Ongoing and forthcoming clinical trials will require pharmacokinetic analysis of mithramycin in humans, both to see if target concentrations are achieved and to optimize dosing and correlate outcomes (response/toxicity) with pharmacokinetics. Two published methods for mithramycin quantitation exist, but both are immunoassays that lack current bioanalytical standards of selectivity and sensitivity. To provide an upgraded and more widely applicable assay, a UPLC-MS/MS method for quantitation of mithramycin in human plasma was developed. Solid phase extraction allowed for excellent recoveries (>90%) necessary for high throughput analyses on sensitive instrumentation. However, a ~55% reduction in analyte signal was observed as a result of plasma matrix effects. Mithramycin and the internal standard chromomycin were separated on a Waters Acquity BEH C18 column (2.1 × 50 mm, 1.7 µm) and detected using electrospray ionization operated in the negative mode at mass transitions m/z 1083.5→268.9 and 1181.5→269.0, respectively, on an AB Sciex QTrap 5500. The assay range was 0.5–500 ng/mL and proved to be linear (r2 > 0.996), accurate (≤10% deviation), and precise (CV < 15%). Mithramycin was stable in plasma at room temperature for 24 hours, as well as through three freeze–thaw cycles. This method was subsequently used to quantitate mithramycin plasma concentrations from patients enrolled on two clinical trials at the NCI. PMID:25247492

  14. Quantitative determination of mithramycin in human plasma by a novel, sensitive ultra-HPLC-MS/MS method for clinical pharmacokinetic application.

    PubMed

    Roth, Jeffrey; Peer, Cody J; Widemann, Brigitte; Cole, Diane E; Ershler, Rachel; Helman, Lee; Schrump, David; Figg, William D

    2014-11-01

    Mithramycin is a neoplastic antibiotic synthesized by various Streptomyces bacteria. It is under investigation as a chemotherapeutic treatment for a wide variety of cancers. Ongoing and forthcoming clinical trials will require pharmacokinetic analysis of mithramycin in humans, both to see if target concentrations are achieved and to optimize dosing and correlate outcomes (response/toxicity) with pharmacokinetics. Two published methods for mithramycin quantitation exist, but both are immunoassays that lack current bioanalytical standards of selectivity and sensitivity. To provide an upgraded and more widely applicable assay, a UPLC-MS/MS method for quantitation of mithramycin in human plasma was developed. Solid-phase extraction allowed for excellent recoveries (>90%) necessary for high throughput analyses on sensitive instrumentation. However, a ~55% reduction in analyte signal was observed as a result of plasma matrix effects. Mithramycin and the internal standard chromomycin were separated on a Waters Acquity BEH C18 column (2.1 × 50 mm, 1.7 µm) and detected using electrospray ionization operated in the negative mode at mass transitions m/z 1083.5→268.9 and 1181.5→269.0, respectively, on an AB Sciex QTrap 5500. The assay range was 0.5-500 ng/mL and proved to be linear (r(2) > 0.996), accurate (≤10% deviation), and precise (CV < 15%). Mithramycin was stable in plasma at room temperature for 24 h, as well as through three freeze-thaw cycles. This method was subsequently used to quantitate mithramycin plasma concentrations from patients enrolled on two clinical trials at the NCI. PMID:25247492

  15. Qualitative and quantitative determination of human biomarkers by laser photoacoustic spectroscopy methods

    NASA Astrophysics Data System (ADS)

    Popa, C.; Bratu, A. M.; Matei, C.; Cernat, R.; Popescu, A.; Dumitras, D. C.

    2011-07-01

    The hypothesis that blood, urine and other body fluids and tissues can be sampled and analyzed to produce clinical information for disease diagnosis or therapy monitoring is the basis of modern clinical diagnosis and medical practice. The analysis of breath air has major advantages because it is a non-invasive method, presents minimal risk to the personnel collecting the samples, and can be sampled often. Breath air samples from the human subjects were collected using aluminized bags from QuinTron and analyzed using the laser photoacoustic spectroscopy (LPAS) technique. LPAS is used to detect traces of ethylene in breath air resulting from lipid peroxidation in lung epithelium following radiotherapy, and also traces of ammonia from patients subjected to hemodialysis for treatment of renal failure. In the case of patients affected by cancer and treated by external radiotherapy, all measurements were done at the 10P(14) CO2 laser line, where the ethylene absorption coefficient has the largest value (30.4 cm-1 atm-1), whereas for patients affected by renal failure and treated by standard dialysis, all measurements were performed at the 9R(30) CO2 laser line, where the ammonia absorption coefficient has the maximum value of 57 cm-1 atm-1. The levels of ethylene and ammonia in exhaled air, from patients with cancer and renal failure, respectively, were measured and compared with the breath air contents of healthy humans. Human gas biomarkers were measured at sub-ppb (parts per billion) concentration sensitivities. It has been demonstrated that the LPAS technique will play an important role in the future of exhaled breath air analysis. The key attributes of this technique are sensitivity, selectivity, fast and real-time response, as well as its simplicity.

  16. D-glucose, D-galactose, and D-lactose non-enzyme quantitative and qualitative analysis method based on Cu foam electrode.

    PubMed

    Jiaojiao, Jin; Yangyang, Ge; Gangying, Zheng; Yanping, Cai; Wei, Liu; Guohua, Hui

    2015-05-15

    Here, a non-enzymatic quantitative and qualitative analysis method for D-glucose, D-galactose, and D-lactose using a Cu foam electrode has been investigated. Porous Cu foam material was prepared by an electrodeposition strategy and used as the working electrode. Cyclic voltammetry (CV) explained the sweetener electro-oxidation process occurring on the Cu foam electrode. Amperometric i-t scanning results demonstrated that the Cu foam electrode responded rapidly to D-glucose, D-galactose, and D-lactose in a linear concentration range between 0.18 mM and 3.47 mM, with sensitivities of 1.79 mA cm-2 mM-1, 0.57 mA cm-2 mM-1, and 0.64 mA cm-2 mM-1, respectively. The limit of detection (LOD) was 9.30 µM, 29.40 µM, and 26 µM, respectively (S/N = 3). The sweetener species was determined from the noise intensities at which the eigen peaks of the stochastic resonance (SR) signal-to-noise ratio (SNR) were located. Interference experiment results demonstrated that the Cu foam electrode responded selectively to the sweeteners against interfering chemicals. The proposed method provides a promising way for non-enzymatic quantitative and qualitative analysis of sweeteners. PMID:25577110
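
    A minimal sketch of how sensitivity and LOD can be derived from an amperometric calibration, assuming LOD = 3 × (baseline noise) / slope as implied by the S/N = 3 criterion; the calibration points and noise value below are illustrative, not the study's data.

        import numpy as np

        # Hypothetical calibration: added D-glucose concentration (mM) vs. current density (mA cm-2)
        conc = np.array([0.18, 0.5, 1.0, 2.0, 3.47])
        current = np.array([0.33, 0.90, 1.80, 3.55, 6.20])

        slope, intercept = np.polyfit(conc, current, 1)   # sensitivity = slope (mA cm-2 mM-1)
        baseline_noise = 0.0055                           # hypothetical std. dev. of blank current (mA cm-2)
        lod = 3.0 * baseline_noise / slope                # in mM, from S/N = 3

        print(f"sensitivity = {slope:.2f} mA cm-2 mM-1, LOD = {lod * 1e3:.1f} uM")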

  17. QUANTITATIVE TOXICOLOGIC PATHOLOGY-METHODS AND INTERPRETATION' SESSION AT THE JOINT MEETING OF SOCIETY OF TOXICOLOGIC PATHOLOGISTS AND THE INTERNATIONAL FEDERATION OF SOCIETIES OF TOXICOLOGIC PATHOLOGISTS

    EPA Science Inventory

    Report of the 'Quantitative Toxicologic Pathology - Methods and Interpretation' session at the Joint meeting of Society of Toxicologic Pathologists and the International Federation of Societies of Toxicologic Pathologists, Orlando, Florida, USA, June 24-28, 2001. Douglas C. Wolf,...

  18. A RAPID METHOD FOR THE EXTRACTION OF FUNGAL DNA FROM ENVIRONMENTAL SAMPLES: EVALUATION IN THE QUANTITATIVE ANALYSIS OF MEMNONIELLA ECHINATA CONIDIA USING REAL TIME DETECTION OF PCR PRODUCTS

    EPA Science Inventory

    New technologies are creating the potential for using nucleic acid sequence detection to perform routine microbiological analyses of environmental samples. Our laboratory has recently reported on the development of a method for the quantitative detection of Stachybotrys chartarum...

  19. A Fuzzy-Based Fusion Method of Multimodal Sensor-Based Measurements for the Quantitative Evaluation of Eye Fatigue on 3D Displays

    PubMed Central

    Bang, Jae Won; Choi, Jong-Suk; Heo, Hwan; Park, Kang Ryoung

    2015-01-01

    With the rapid increase of 3-dimensional (3D) content, considerable research related to the 3D human factor has been undertaken for quantitatively evaluating visual discomfort, including eye fatigue and dizziness, caused by viewing 3D content. Various modalities such as electroencephalograms (EEGs), biomedical signals, and eye responses have been investigated. However, the majority of the previous research has analyzed each modality separately to measure user eye fatigue. This cannot guarantee the credibility of the resulting eye fatigue evaluations. Therefore, we propose a new method for quantitatively evaluating eye fatigue related to 3D content by combining multimodal measurements. This research is novel for the following four reasons: first, for the evaluation of eye fatigue with high credibility on 3D displays, a fuzzy-based fusion method (FBFM) is proposed based on the multimodalities of EEG signals, eye blinking rate (BR), facial temperature (FT), and subjective evaluation (SE); second, to measure a more accurate variation of eye fatigue (before and after watching a 3D display), we obtain the quality scores of EEG signals, eye BR, FT and SE; third, for combining the values of the four modalities we obtain the optimal weights of the EEG signals BR, FT and SE using a fuzzy system based on quality scores; fourth, the quantitative level of the variation of eye fatigue is finally obtained using the weighted sum of the values measured by the four modalities. Experimental results confirm that the effectiveness of the proposed FBFM is greater than other conventional multimodal measurements. Moreover, the credibility of the variations of the eye fatigue using the FBFM before and after watching the 3D display is proven using a t-test and descriptive statistical analysis using effect size. PMID:25961382
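
    The fuzzy system itself is not reproduced here, but the final fusion step, a weighted sum of the per-modality fatigue variations with weights derived from quality scores, can be sketched as below; the modality values and weights are placeholders, not results from the study.

        def fused_fatigue(variations, weights):
            """Weighted sum of per-modality eye-fatigue variations; weights are renormalized to sum to 1."""
            total_w = sum(weights.values())
            return sum(variations[m] * weights[m] for m in variations) / total_w

        # Hypothetical normalized before/after variations and quality-derived weights
        variations = {"EEG": 0.62, "BR": 0.48, "FT": 0.30, "SE": 0.75}
        weights    = {"EEG": 0.35, "BR": 0.25, "FT": 0.15, "SE": 0.25}
        print(f"fused eye-fatigue variation = {fused_fatigue(variations, weights):.2f}")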

  20. A Fuzzy-Based Fusion Method of Multimodal Sensor-Based Measurements for the Quantitative Evaluation of Eye Fatigue on 3D Displays.

    PubMed

    Bang, Jae Won; Choi, Jong-Suk; Heo, Hwan; Park, Kang Ryoung

    2015-01-01

    With the rapid increase of 3-dimensional (3D) content, considerable research related to the 3D human factor has been undertaken for quantitatively evaluating visual discomfort, including eye fatigue and dizziness, caused by viewing 3D content. Various modalities such as electroencephalograms (EEGs), biomedical signals, and eye responses have been investigated. However, the majority of the previous research has analyzed each modality separately to measure user eye fatigue. This cannot guarantee the credibility of the resulting eye fatigue evaluations. Therefore, we propose a new method for quantitatively evaluating eye fatigue related to 3D content by combining multimodal measurements. This research is novel for the following four reasons: first, for the evaluation of eye fatigue with high credibility on 3D displays, a fuzzy-based fusion method (FBFM) is proposed based on the multimodalities of EEG signals, eye blinking rate (BR), facial temperature (FT), and subjective evaluation (SE); second, to measure a more accurate variation of eye fatigue (before and after watching a 3D display), we obtain the quality scores of EEG signals, eye BR, FT and SE; third, for combining the values of the four modalities we obtain the optimal weights of the EEG signals BR, FT and SE using a fuzzy system based on quality scores; fourth, the quantitative level of the variation of eye fatigue is finally obtained using the weighted sum of the values measured by the four modalities. Experimental results confirm that the effectiveness of the proposed FBFM is greater than other conventional multimodal measurements. Moreover, the credibility of the variations of the eye fatigue using the FBFM before and after watching the 3D display is proven using a t-test and descriptive statistical analysis using effect size. PMID:25961382

  1. Quantitative Comparison of Three Standardization Methods Using a One-Way ANOVA for Multiple Mean Comparisons

    ERIC Educational Resources Information Center

    Barrows, Russell D.

    2007-01-01

    A one-way ANOVA experiment is performed to determine whether or not the three standardization methods are statistically different in determining the concentration of the three paraffin analytes. The laboratory exercise asks students to combine the three methods in a single analytical procedure of their own design to determine the concentration of…
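
    A minimal sketch of the underlying statistical test, assuming three groups of concentration results, one per standardization method; the values are placeholders, not data from the exercise.

        from scipy import stats

        # Hypothetical analyte concentrations (ppm) obtained with three standardization methods
        external_std = [10.2, 10.5, 9.9, 10.3]
        internal_std = [10.4, 10.6, 10.5, 10.7]
        standard_add = [10.1, 10.0, 10.3, 10.2]

        f_stat, p_value = stats.f_oneway(external_std, internal_std, standard_add)
        print(f"F = {f_stat:.2f}, p = {p_value:.3f}")   # p < 0.05 suggests the methods differ statistically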

  2. An average enumeration method of hyperspectral imaging data for quantitative evaluation of medical device surface contamination

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We propose a quantification method called Mapped Average Principal Component Analysis Score (MAPS) to enumerate the contamination coverage on common medical device surfaces. The method was adapted from conventional Principal Component Analysis (PCA) on non-overlapped regions on a full frame hyperspe...

  3. IDENTIFICATION OF TOXICANTS IN WHOLE MARINE SEDIMENTS: METHODS AND RESULTS

    EPA Science Inventory

    Identification of stressors in aquatic systems is critical to sound assessment and management of our nation's waterways. Information from stressor identification can be useful in designing effective sediment remediation methods, assessing options for sediment disposal, allowing m...

  4. Real time PCR method for simultaneous detection, quantitation and differentiation of capripoxviruses.

    PubMed

    Lamien, Charles Euloge; Lelenta, Mamadou; Goger, Wilfried; Silber, Roland; Tuppurainen, Eeva; Matijevic, Mirta; Luckins, Antony George; Diallo, Adama

    2011-01-01

    The genus Capripoxvirus (CaPV) comprises three members namely, sheep poxvirus (SPPV), goat poxvirus (GTPV) and lumpy skin disease virus (LSDV) affecting sheep, goats and cattle, respectively. CaPV infections produce similar symptoms in sheep and goats, and the three viruses cannot be distinguished serologically. Since there are conflicting opinions regarding the host specificity of CaPVs, particularly for goatpox and sheeppox viruses, the development of rapid genotyping tools will facilitate more accurate disease diagnosis and surveillance for better management of capripox outbreaks. This paper describes a species-specific, real time polymerase chain reaction (PCR), based on unique molecular markers that were found in the G-protein-coupled chemokine receptor (GPCR) gene sequences of CaPVs, that uses dual hybridization probes for their simultaneous detection, quantitation and genotyping. The assay can differentiate between CaPV strains based on differences in the melting point temperature (Tm) obtained after fluorescence melting curve analysis (FMCA). It is highly sensitive and presents low intra- and inter-run variation. This real time PCR assay will make a significant contribution to CaPV diagnosis and to the better understanding of the epidemiology of CaPVs by enabling rapid genotyping and gene-based classification of viral strains and unequivocal identification of isolates. PMID:21029751

  5. Quantitative interpretation of fossil pollen spectra: Dissimilarity coefficients and the method of modern analogs

    NASA Astrophysics Data System (ADS)

    Overpeck, J. T.; Webb, T.; Prentice, I. C.

    1985-01-01

    Dissimilarity coefficients measure the difference between multivariate samples and provide a quantitative aid to the identification of modern analogs for fossil pollen samples. The responses of eight coefficients to differences among modern pollen samples from eastern North America were tested. These coefficients represent three different classes: (1) unweighted coefficients that are most strongly influenced by large-valued pollen types, (2) equal-weight coefficients that weight all pollen types equally but can be too sensitive to variations among rare types, and (3) signal-to-noise coefficients that are intermediate in their weighting of pollen types. The studies with modern pollen allowed definition of critical values for each coefficient, which, when not exceeded, indicate that two pollen samples originate from the same vegetation region. Dissimilarity coefficients were used to compare modern and fossil pollen samples, and modern samples were found that were so similar to the fossil samples that most of three late Quaternary pollen diagrams could be "reconstructed" by substituting modern samples for fossil samples. When the coefficients indicated that the fossil spectra had no modern analogs, the reconstructed diagrams did not match all aspects of the originals. No modern analogs existed for samples from before 9300 yr B.P. at Kirchner Marsh, Minnesota, and from before 11,000 yr B.P. at Wintergreen Lake, Michigan, but modern analogs existed for almost all Holocene samples from these two sites and Brandreth Bog, New York.
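
    The eight coefficients themselves are not reproduced here. As an illustrative sketch, one dissimilarity coefficient widely used with pollen percentage data, the squared chord distance, and a simple closest-analog search might look like the following; the pollen spectra are hypothetical proportions.

        import numpy as np

        def squared_chord_distance(p, q):
            """Squared chord distance between two pollen spectra given as proportions."""
            p, q = np.asarray(p, float), np.asarray(q, float)
            return np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

        # Hypothetical spectra (proportions of the same pollen types, each summing to 1)
        fossil = [0.40, 0.30, 0.20, 0.10]
        modern = {"site_A": [0.42, 0.28, 0.22, 0.08],
                  "site_B": [0.10, 0.15, 0.25, 0.50]}

        best = min(modern, key=lambda s: squared_chord_distance(fossil, modern[s]))
        d = squared_chord_distance(fossil, modern[best])
        print(f"closest modern analog: {best} (SCD = {d:.3f})")   # compare d to a critical value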

  6. Quantitative characterizations of phasic structure developments by local measurement methods in two-phase flow

    SciTech Connect

    Eberle, C.S.; Leung, W.H.; Wu, Q.; Ueno, T.; Ishii, M.

    1995-06-01

    An experimental study on the internal structure of two-phase flow was carried out in a 25.4 mm ID pipe. The local void fraction and interfacial area concentration were measured by a double-sensor probe. The flow structure development was visualized by measuring the radial distribution of these two parameters at three axial locations (L/D = 12, 62, and 112). A more detailed study of the fully developed flow structure was conducted at L/D = 120. The interfacial structures were measured by the double- and four-sensor probes. A bubbly-to-slug transition region was defined according to the local data. The area-averaged void fraction measurements were given by a gamma densitometer. Other parameters, such as the Taylor bubble film thickness, bubble length and slug unit length in slug flow, were measured by a film probe. The redundant measurements were made to calibrate the local probe measurements. The quantitative representation of the phasic structure can then be used for modeling.

  7. Development of a Quantitative SRM-Based Proteomics Method to Study Iron Metabolism of Synechocystis sp. PCC 6803.

    PubMed

    Vuorijoki, Linda; Isojärvi, Janne; Kallio, Pauli; Kouvonen, Petri; Aro, Eva-Mari; Corthals, Garry L; Jones, Patrik R; Muth-Pawlak, Dorota

    2016-01-01

    The cyanobacterium Synechocystis sp. PCC 6803 (S. 6803) is a well-established model species in oxygenic photosynthesis research and a potential host for biotechnological applications. Despite recent advances in genome sequencing and microarray techniques applied in systems biology, quantitative proteomics approaches with corresponding accuracy and depth are scarce for S. 6803. In this study, we developed a protocol to screen changes in the expression of 106 proteins representing central metabolic pathways in S. 6803 with a targeted mass spectrometry method, selected reaction monitoring (SRM). We evaluated the response to both short- and long-term iron deprivation. The experimental setup enabled the relative quantification of 96 proteins, with 87 and 92 proteins showing adjusted p-values <0.01 under short- and long-term iron deficiency, respectively. The high sensitivity of the SRM method for S. 6803 was demonstrated by providing quantitative data for altogether 64 proteins that previously could not be detected with the classical data-dependent MS approach under similar conditions. This highlights the effectiveness of SRM for quantification and extends the analytical capability to low-abundance proteins in unfractionated samples of S. 6803. The SRM assays and other generated information are now publicly available via PASSEL and Panorama. PMID:26652789

  8. A Validated Reverse Phase HPLC Analytical Method for Quantitation of Glycoalkaloids in Solanum lycocarpum and Its Extracts

    PubMed Central

    Tiossi, Renata Fabiane Jorge; Miranda, Mariza Abreu; de Sousa, Joăo Paulo Barreto; Praça, Fabíola Silva Garcia; Bentley, Maria Vitória Lopes Badra; McChesney, James Dewey; Bastos, Jairo Kenupp

    2012-01-01

    Solanum lycocarpum (Solanaceae) is native to the Brazilian Cerrado. Fruits of this species contain the glycoalkaloids solasonine (SN) and solamargine (SM), which display antiparasitic and anticancer properties. A method has been developed for the extraction and HPLC-UV analysis of the SN and SM in different parts of S. lycocarpum, mainly comprising ripe and unripe fruits, leaf, and stem. This analytical method was validated and gave good detection response with linearity over a dynamic range of 0.77–1000.00 µg mL-1 and recovery in the range of 80.92–91.71%, allowing a reliable quantitation of the target compounds. Unripe fruits displayed higher concentrations of glycoalkaloids (1.04% ± 0.01 of SN and 0.69% ± 0.00 of SM) than the ripe fruits (0.83% ± 0.02 of SN and 0.60% ± 0.01 of SM). Quantitation of glycoalkaloids in the alkaloidic extract gave 45.09% ± 1.14 of SN and 44.37% ± 0.60 of SM, respectively. PMID:22567576

  9. REGIONAL VULNERABILITY ASSESSMENT OF THE MID-ATLANTIC REGION: EVALUATION OF INTEGRATION METHODS AND ASSESSMENTS RESULTS

    EPA Science Inventory

    This report describes methods for quantitative regional assessment developed by the Regional Vulnerability Assessment (ReVA) program. The goal of ReVA is to develop regional-scale assessments of the magnitude, extent, distribution, and uncertainty of current and anticipated envir...

  10. A simple method for quantitating the propensity for calcium oxalate crystallization in urine

    NASA Technical Reports Server (NTRS)

    Wabner, C. L.; Pak, C. Y.

    1991-01-01

    To assess the propensity for spontaneous crystallization of calcium oxalate in urine, the permissible increment in oxalate is calculated. The previous method required visual observation of crystallization with the addition of oxalate, which demanded a large volume of urine and sacrificed accuracy in defining differences between small incremental changes of added oxalate. Therefore, this method has been miniaturized, and spontaneous crystallization is detected from the depletion of radioactive oxalate. The new "micro" method demonstrated a marked decrease (p < 0.001) in the permissible increment in oxalate in the urine of stone formers versus normal subjects. Moreover, crystallization inhibitors added to urine, in vitro (heparin or diphosphonate) or in vivo (potassium citrate administration), substantially increased the permissible increment in oxalate. Thus, the "micro" method has proven reliable and accurate in discriminating stone-forming from control urine and in distinguishing changes in inhibitory activity.

  11. Novel quantitative methods for characterization of chemical induced functional alteration in developing neuronal cultures

    EPA Science Inventory

    ABSTRACT BODY: Thousands of chemicals lack adequate testing for adverse effects on nervous system development, stimulating research into alternative methods to screen chemicals for potential developmental neurotoxicity. Microelectrode arrays (MEA) collect action potential spiking...

  12. Evolution of quantitative methods for the study and management of avian populations: on the importance of individual contributions

    USGS Publications Warehouse

    Nichols, J.D.

    2004-01-01

    Evolution of quantitative methods for the study and management of avian populations: on the importance of individual contributions. The EURING meetings and the scientists who have attended them have contributed substantially to the growth of knowledge in the field of estimating parameters of animal populations. The contributions of David R. Anderson to process modeling, parameter estimation and decision analysis are briefly reviewed. Metrics are considered for assessing individual contributions to a field of inquiry, and it is concluded that Anderson's contributions have been substantial. Important characteristics of Anderson and his career are the ability to identify and focus on important topics, the premium placed on dissemination of new methods to prospective users, the ability to assemble teams of complementary researchers, and the innovation and vision that characterized so much of his work. The paper concludes with a list of interesting current research topics for consideration by EURING participants.

  13. Methods of quantitative and qualitative analysis of bird migration with a tracking radar

    NASA Technical Reports Server (NTRS)

    Bruderer, B.; Steidinger, P.

    1972-01-01

    Methods of analyzing bird migration by using tracking radar are discussed. The procedure for assessing the rate of bird passage is described. Three topics are presented concerning the grouping of nocturnal migrants, the velocity of migratory flight, and identification of species by radar echoes. The height and volume of migration under different weather conditions are examined. The methods for studying the directions of migration and the correlation between winds and the height and direction of migrating birds are presented.

  14. Beam-modulation methods in quantitative and flow-visualization holographic interferometry

    NASA Technical Reports Server (NTRS)

    Decker, Arthur J.

    1986-01-01

    Heterodyne holographic interferometry and time-average holography with a frequency shifted reference beam are discussed. Both methods will be used for the measurement and visualization of internal transonic flows where the target facility is a flutter cascade. The background and experimental requirements for both methods are reviewed. Measurements using heterodyne holographic interferometry are presented. The performance of the laser required for time-average holography of time-varying transonic flows is discussed.

  15. The Value of Methodical Management: Optimizing Science Results

    NASA Astrophysics Data System (ADS)

    Saby, Linnea

    2016-01-01

    As science progresses, making new discoveries in radio astronomy becomes increasingly complex. Instrumentation must be incredibly fine-tuned and well-understood, scientists must consider the skills and schedules of large research teams, and inter-organizational projects sometimes require coordination between observatories around the globe. Structured and methodical management allows scientists to work more effectively in this environment and leads to optimal science output. This report outlines the principles of methodical project management in general, and describes how those principles are applied at the National Radio Astronomy Observatory (NRAO) in Charlottesville, Virginia.

  16. A quantitative evaluation method of flood risks in low-lying areas associated with increase of heavy rainfall in Japan

    NASA Astrophysics Data System (ADS)

    Minakawa, H.; Masumoto, T.

    2012-12-01

    An increase in flood risk, especially in low-lying areas, is predicted as a consequence of global climate change or other causes. Immediate measures such as strengthening of drainage capacity are needed to minimize the damage caused by more-frequent flooding. Typically, drainage pump capacities in paddy areas are planned using the results of drainage analysis with a design rainfall (e.g. the 3-day rainfall amount with a 10-year return period). However, the result depends on the hyetograph of the input rainfall even if the total rainfall amount is the same, so the flood risk may differ with the rainfall pattern. It is therefore important to assume various patterns of heavy rainfall for flood risk assessment. On the other hand, a rainfall synthesis simulation is useful for generating many patterns of rainfall data for flood studies. We previously proposed a rainfall simulation method, called the diurnal rainfall pattern generator, which can generate short-time-step rainfall and its internal patterns. This study discusses a quantitative evaluation method for detecting the relationship between flood damage risk and heavy rainfall scale by using the diurnal rainfall pattern generator. In addition, we estimated flood damage with a focus on rice yield. Our study area was the Kaga three-lagoon basin in Ishikawa Prefecture, Japan. There are two lagoons in the study area, and the low-lying paddy areas extend over about 4,000 ha in the lower reaches of the basin. First, we developed a drainage analysis model that incorporates kinematic and diffusive runoff models for calculating water levels in channels and paddies. Next, the heavy rainfall data for drainage analysis were generated. Here, the 3-day rainfall amounts for nine return periods (2-, 3-, 5-, 8-, 10-, 15-, 50-, 100-, and 200-year) were derived, and three hundred hyetograph patterns were generated for each rainfall amount by using the diurnal rainfall pattern generator. Finally, all data were input to the drainage model to estimate the flood risk; hence, the resulting data include the influence of different internal rainfall patterns on the flood risk. Simultaneously, we estimated economic losses in paddies by using a yield loss curve showing the relation between the duration of submergence in paddies and the reduction in rice yield. In particular, we focused on the frequency distribution of peak water levels exceeding the allowable flood levels at the lagoons as the flood occurrence risk in this area. The results showed that the risk increases with rainfall amount, and we obtained a curve relating rainfall amount to flood occurrence risk. Using this curve, we can easily estimate this risk for any rainfall amount or climate change scenario. Furthermore, the average duration of inundation deeper than 30 cm and the resulting decrease in rice yield were estimated for paddies. The results indicate that paddies in low-lying areas are particularly vulnerable to an increase in heavy rainfall. Mitigation measures such as revision of drainage planning and/or changing design standards for the capacity of drainage pumps would be necessary in the future.
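
    A minimal sketch of the risk statistic described above, the fraction of simulated hyetographs whose peak lagoon water level exceeds the allowable flood level, under the assumption that the drainage model has already produced one peak level per hyetograph; the numbers below are synthetic placeholders.

        import numpy as np

        def flood_occurrence_risk(peak_levels, allowable_level):
            """Fraction of simulated events whose peak water level exceeds the allowable level."""
            peaks = np.asarray(peak_levels, dtype=float)
            return float(np.mean(peaks > allowable_level))

        # Synthetic peak lagoon water levels (m) from 300 hyetographs of one return period
        rng = np.random.default_rng(42)
        peaks = rng.normal(loc=1.8, scale=0.25, size=300)
        print(f"flood occurrence risk = {flood_occurrence_risk(peaks, allowable_level=2.1):.2%}")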

  17. Evaluation of Quantitative Exposure Assessment Method for Nanomaterials in Mixed Dust Environments: Application in Tire Manufacturing Facilities.

    PubMed

    Kreider, Marisa L; Cyrs, William D; Tosiano, Melissa A; Panko, Julie M

    2015-11-01

    Current recommendations for nanomaterial-specific exposure assessment require adaptation in order to be applied to complicated manufacturing settings, where a variety of particle types may contribute to the potential exposure. The purpose of this work was to evaluate a method that would allow for exposure assessment of nanostructured materials by chemical composition and size in a mixed dust setting, using carbon black (CB) and amorphous silica (AS) from tire manufacturing as an example. This method combined air sampling with a low pressure cascade impactor with analysis of elemental composition by size to quantitatively assess potential exposures in the workplace. This method was first pilot-tested in one tire manufacturing facility; air samples were collected with a Dekati Low Pressure Impactor (DLPI) during mixing where either CB or AS were used as the primary filler. Air samples were analyzed via scanning transmission electron microscopy (STEM) coupled with energy dispersive spectroscopy (EDS) to identify what fraction of particles were CB, AS, or 'other'. From this pilot study, it was determined that ~95% of all nanoscale particles were identified as CB or AS. Subsequent samples were collected with the Dekati Electrical Low Pressure Impactor (ELPI) at two tire manufacturing facilities and analyzed using the same methodology to quantify exposure to these materials. This analysis confirmed that CB and AS were the predominant nanoscale particle types in the mixing area at both facilities. Air concentrations of CB and AS ranged from ~8900 to 77600 and 400 to 22200 particles cm(-3), respectively. This method offers the potential to provide quantitative estimates of worker exposure to nanoparticles of specific materials in a mixed dust environment. With pending development of occupational exposure limits for nanomaterials, this methodology will allow occupational health and safety practitioners to estimate worker exposures to specific materials, even in scenarios where many particle types are present. PMID:26209596

  18. What Identifies a Counseling Psychologist: Method or Results?

    ERIC Educational Resources Information Center

    Krumboltz, John D.; Peltier, Bruce

    1977-01-01

    The author discusses the inaccuracy and handicap of the Bob Hartley image of the counseling psychologist. He discusses expanding this image along several dimensions: flexible use of time, use of different settings, greater use of groups, a wider variety of methods and helpers, counselor initiative, team cooperation, and specialization. (Author/JEL)

  19. INFLUENCE OF SEDIMENT EXTRACT FRACTIONATION METHODS ON BIOASSAY RESULTS

    EPA Science Inventory

    Four bioassays [Microtox(tm), Mutatox(tm), sister chromatid exchange (SCE), and metabolic cooperation] were used to analyze marine sediment extracts fractionated by two different methods: silica gel column chromatography and acid-base fractionation. Results indicated that a sedime...

  20. Application of NDE methods to green ceramics: initial results

    SciTech Connect

    Kupperman, D.S.; Karplus, H.B.; Poeppel, R.B.; Ellingson, W.A.; Berger, H.; Robbins, C.; Fuller, E.

    1984-03-01

    This paper describes a preliminary investigation to assess the effectiveness of microradiography, ultrasonic methods, nuclear magnetic resonance, and neutron radiography for the nondestructive evaluation of green (unfired) ceramics. The objective is to obtain useful information on defects, cracking, delaminations, agglomerates, inclusions, regions of high porosity, and anisotropy.

  1. Coupling finite element and spectral methods: First results

    NASA Technical Reports Server (NTRS)

    Bernardi, Christine; Debit, Naima; Maday, Yvon

    1987-01-01

    A Poisson equation on a rectangular domain is solved by coupling two methods: the domain is divided into two squares, a finite element approximation is used on the first square and a spectral discretization is used on the second one. Two kinds of matching conditions on the interface are presented and compared. In both cases, error estimates are proved.
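
    For orientation, the model problem and one standard pair of interface requirements (continuity of the solution and of its normal flux across the interface) can be written as below; the two specific matching variants presented in the paper are not reproduced here, and this is only a generic sketch.

        \begin{aligned}
          -\Delta u &= f \quad \text{in } \Omega = \Omega_1 \cup \Omega_2, \qquad u = 0 \ \text{on } \partial\Omega,\\
          u_1 &= u_2 \quad \text{on } \Gamma = \partial\Omega_1 \cap \partial\Omega_2,\\
          \partial u_1 / \partial n &= \partial u_2 / \partial n \quad \text{on } \Gamma,
        \end{aligned}

    with u_1 approximated by finite elements on the first square \Omega_1 and u_2 by a spectral expansion on the second square \Omega_2, and the interface conditions imposed either strongly or weakly depending on the coupling variant.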

  2. A simplified method for the quantitative determination of urinary coproporphyrin in lead workers

    PubMed Central

    Soulsby, Joan; Smith, R. L.

    1974-01-01

    Soulsby, Joan and Smith, R. L. (1974). British Journal of Industrial Medicine, 31, 72-74. A simplified method of estimating urinary coproporphyrin is described, based on the method of Rimington (1971). Coproporphyrin and coproporphyrinogen are extracted into ether from acidified urine; the ether is then shaken with a solution of iodine in hydrochloric acid to oxidize any coproporphyrinogen to coproporphyrin and to extract the coproporphyrin. The solution is examined spectrophotometrically for coproporphyrin at the peak of the Soret band and at wavelengths on either side to correct for any impurities present. A comparison with the method of Rimington (1971) in 94 urine samples with coproporphyrin levels up to 2·5 mg/l showed good agreement. Correlation coefficient (r) = +0·986. One hundred estimations can be carried out in five hours. PMID:4821413

  3. Maillard reaction products in bread: A novel semi-quantitative method for evaluating melanoidins in bread.

    PubMed

    Helou, Cynthia; Jacolot, Philippe; Niquet-Léridon, Céline; Gadonna-Widehem, Pascale; Tessier, Frédéric J

    2016-01-01

    The aim of this study was to test the methods currently in use and to develop a new protocol for the evaluation of melanoidins in bread. Markers of the early and advanced stages of the Maillard reaction were also followed in the crumb and the crust of bread throughout baking, and in a crust model system. The crumb of the bread contained N(ε)-fructoselysine and N(ε)-carboxymethyllysine, but at levels 7 and 5 times lower than the crust, respectively. 5-Hydroxymethylfurfural was detected only in the crust and its model system. The available methods for the semi-quantification of melanoidins were found to be unsuitable for their analysis in bread. Our new method, based on size exclusion chromatography and fluorescence, measures soluble fluorescent melanoidins in bread. These melanoidin macromolecules (1.7-5.6 kDa) were detected intact in both the crust and the model system. They appear to contribute to the dietary fibre in bread. PMID:26213055

  4. Development of a rapid, reliable and quantitative method--"SPOTi" for testing antifungal efficacy.

    PubMed

    Rizi, Khalida; Murdan, Sudaxshina; Danquah, Cynthia A; Faull, Jane; Bhakta, Sanjib

    2015-10-01

    A reference method for the antimicrobial susceptibility testing of common fungal pathogens, such as dermatophytes, is currently lacking. In this study, we report the successful adaptation of the solid agar-based spot culture growth inhibition assay (SPOTi) for dermatophytes, currently being used as a gold standard in the anti-tubercular drug discovery field. The fungal-SPOTi assay correlated with the disc-diffusion method and was validated using mycelial plugs. We propose the fungal-SPOTi as a high-throughput alternative to the disc-diffusion and broth micro-dilution anti-fungal assays to screen novel anti-fungals. PMID:26183763

  5. Quantitative phase analysis of Mg:ZrO{sub 2} nanoparticles by Rietveld refinement method

    SciTech Connect

    Balaji, V.; Senthilkumaran, S.; Thangadurai, P.

    2014-04-24

    The aim was to quantify the structural phases of nanocrystalline ZrO{sub 2} doped with Mg ions at varying concentrations (3, 5, 10, 15 and 20%) and annealed at different temperatures. Magnesia-doped zirconia was prepared by a chemical co-precipitation method and annealed up to 1000°C. The monoclinic and tetragonal phases present in Mg:ZrO{sub 2} were quantified using Rietveld refinement analysis of the X-ray diffraction data and compared with the Direct method based on peak intensity calculations. The tetragonal phase was dominant in the 600°C-annealed Mg:ZrO{sub 2} for all Mg concentrations.
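
    The "Direct method based on peak intensity calculations" mentioned above is commonly implemented, for monoclinic/tetragonal zirconia, as an intensity ratio of characteristic reflections (a Garvie-Nicholson-style estimate); whether this is exactly the variant used by the authors is an assumption. A short Python sketch:

        # Monoclinic phase fraction from integrated peak intensities of the
        # monoclinic (111) and (-111) reflections and the tetragonal (101)
        # reflection; the intensities below are illustrative numbers only.

        def monoclinic_fraction(i_m_111, i_m_minus111, i_t_101):
            return (i_m_111 + i_m_minus111) / (i_m_111 + i_m_minus111 + i_t_101)

        print(round(monoclinic_fraction(120.0, 95.0, 800.0), 2))   # ~0.21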

  6. Messages that increase women’s intentions to abstain from alcohol during pregnancy: results from quantitative testing of advertising concepts

    PubMed Central

    2014-01-01

    Background Public awareness-raising campaigns targeting alcohol use during pregnancy are an important part of preventing prenatal alcohol exposure and Fetal Alcohol Spectrum Disorder. Despite this, there is little evidence on what specific elements contribute to campaign message effectiveness. This research evaluated three different advertising concepts addressing alcohol and pregnancy: a threat appeal, a positive appeal promoting a self-efficacy message, and a concept that combined the two appeals. The primary aim was to determine the effectiveness of these concepts in increasing women’s intentions to abstain from alcohol during pregnancy. Methods Women of childbearing age and pregnant women residing in Perth, Western Australia participated in a computer-based questionnaire where they viewed either a control or one of the three experimental concepts. Following exposure, participants’ intentions to abstain from and reduce alcohol intake during pregnancy were measured. Other measures assessed included perceived main message, message diagnostics, and potential to promote defensive responses or unintended consequences. Results The concepts containing a threat appeal were significantly more effective at increasing women’s intentions to abstain from alcohol during pregnancy than the self-efficacy message and the control. The concept that combined threat and self-efficacy is recommended for development as part of a mass-media campaign as it has good persuasive potential, provides a balance of positive and negative emotional responses, and is unlikely to result in defensive or unintended consequences. Conclusions This study provides important insights into the components that enhance the persuasiveness and effectiveness of messages aimed at preventing prenatal alcohol exposure. The recommended concept has good potential for use in a future campaign aimed at promoting women’s intentions to abstain from alcohol during pregnancy. PMID:24410764

  7. Comparison of a multipoint identity-by-descent method with parametric multipoint linkage analysis for mapping quantitative traits.

    PubMed Central

    Goldgar, D E; Oniki, R S

    1992-01-01

    We previously developed a method of partitioning genetic variance of a quantitative trait to loci in specific chromosomal regions. In this paper, we compare this method--multipoint IBD (identical by descent) method (MIM)--with parametric multipoint linkage analysis (MLINK). A simulation study was performed comparing the methods for the major-locus, mixed, and two-locus models. The criterion for comparisons between MIM and MLINK was the average lod score from multiple replicates of simulated data sets. The effects of gene frequency, dominance, model misspecification, marker spacing, and marker informativeness were also considered in a smaller set of simulations. Within the context of the models examined, the MIM approach was found to be comparable in power with parametric multipoint linkage analysis when (a) parental data are unknown, (b) the effect of the major locus is small and there is additional genetic variation, or (c) the parameters of the major-locus model are misspecified. The performance of the MIM method relative to MLINK was markedly lower when the allele frequency at the trait locus was .2 versus .5, particularly for the case when parental data were assumed to be known. Dominance at the trait major locus, as well as marker spacing and heterozygosity, did not appear to have a large effect on the ELOD comparisons. PMID:1539596

  8. Groundwater vulnerability and pollution risk assessment of porous aquifers to nitrate: Modifying the DRASTIC method using quantitative parameters

    NASA Astrophysics Data System (ADS)

    Kazakis, Nerantzis; Voudouris, Konstantinos S.

    2015-06-01

    In the present study the DRASTIC method was modified to estimate vulnerability and pollution risk of porous aquifers to nitrate. The qualitative parameters of aquifer type, soil and impact of the vadose zone were replaced with the quantitative parameters of aquifer thickness, nitrogen losses from soil and hydraulic resistance. Nitrogen losses from soil were estimated based on climatic, soil and topographic data using indices produced by the GLEAMS model. Additionally, the class range of each parameter and the final index were modified using nitrate concentration correlation with four grading methods (natural breaks, equal interval, quantile and geometrical intervals). For this reason, seventy-seven (77) groundwater samples were collected and analyzed for nitrate. Land uses were added to estimate the pollution risk to nitrates. The two new methods, DRASTIC-PA and DRASTIC-PAN, were then applied in the porous aquifer of Anthemountas basin together with the initial versions of DRASTIC and the LOSN-PN index. The two modified methods displayed the highest correlations with nitrate concentrations. The two new methods provided higher discretisation of the vulnerability and pollution risk, whereas the high variance of the (ANOVA) F statistic confirmed the increase of the average concentrations of NO3-, increasing from low to high between the vulnerability and pollution risk classes. The importance of the parameters of hydraulic resistance of the vadose zone, aquifer thickness and land use was confirmed by single-parameter sensitivity analysis.
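
    DRASTIC-type indices are weighted sums of parameter ratings, so modifying the method amounts to changing which parameters are rated and how. The sketch below shows only the generic weighted-sum form; the parameter names, weights and ratings are placeholders, not the calibrated values of DRASTIC-PA or DRASTIC-PAN.

        # Generic DRASTIC-style vulnerability index: sum of rating x weight.
        # Parameter names, ratings and weights are illustrative assumptions.

        def vulnerability_index(ratings, weights):
            assert ratings.keys() == weights.keys()
            return sum(ratings[k] * weights[k] for k in ratings)

        ratings = {"depth_to_water": 7, "recharge": 6, "aquifer_thickness": 5,
                   "nitrogen_losses": 8, "hydraulic_resistance": 4,
                   "topography": 9, "conductivity": 6}
        weights = {"depth_to_water": 5, "recharge": 4, "aquifer_thickness": 3,
                   "nitrogen_losses": 5, "hydraulic_resistance": 4,
                   "topography": 1, "conductivity": 3}
        print(vulnerability_index(ratings, weights))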

  9. Viscous wing theory development. Volume 1: Analysis, method and results

    NASA Technical Reports Server (NTRS)

    Chow, R. R.; Melnik, R. E.; Marconi, F.; Steinhoff, J.

    1986-01-01

    Viscous transonic flows at large Reynolds numbers over 3-D wings were analyzed using a zonal viscid-inviscid interaction approach. A new numerical AFZ scheme was developed in conjunction with the finite volume formulation for the solution of the inviscid full-potential equation. A special far-field asymptotic boundary condition was developed and a second-order artificial viscosity included for an improved inviscid solution methodology. The integral method was used for the laminar/turbulent boundary layer and 3-D viscous wake calculation. The interaction calculation included the coupling conditions of the source flux due to the wing surface boundary layer, the flux jump due to the viscous wake, and the wake curvature effect. A method was also devised incorporating the 2-D trailing edge strong interaction solution for the normal pressure correction near the trailing edge region. A fully automated computer program was developed to perform the proposed method with one scalar version to be used on an IBM-3081 and two vectorized versions on Cray-1 and Cyber-205 computers.

  10. Quantifying viruses and bacteria in wastewater—Results, interpretation methods, and quality control

    USGS Publications Warehouse

    Francy, Donna S.; Stelzer, Erin A.; Bushon, Rebecca N.; Brady, Amie M.G.; Mailot, Brian E.; Spencer, Susan K.; Borchardt, Mark A.; Elber, Ashley G.; Riddell, Kimberly R.; Gellner, Terry M.

    2011-01-01

    Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes small enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bacterial indicators Escherichia coli (E. coli) and fecal coliforms are the required microbial measures of effluents for wastewater-discharge permits. Information is needed on the effectiveness of MBRs in removing human enteric viruses from wastewaters, particularly as compared to conventional wastewater treatment before and after disinfection. A total of 73 regular and 28 quality-control (QC) samples were collected at three MBR and two conventional wastewater plants in Ohio during 23 regular and 3 QC sampling trips in 2008-10. Samples were collected at various stages in the treatment processes and analyzed for bacterial indicators E. coli, fecal coliforms, and enterococci by membrane filtration; somatic and F-specific coliphage by the single agar layer (SAL) method; adenovirus, enterovirus, norovirus GI and GII, rotavirus, and hepatitis A virus by molecular methods; and viruses by cell culture. While addressing the main objective of the study-comparing removal of viruses and bacterial indicators in MBR and conventional plants-it was realized that work was needed to identify data analysis and quantification methods for interpreting enteric virus and QC data. Therefore, methods for quantifying viruses, qualifying results, and applying QC data to interpretations are described in this report. During each regular sampling trip, samples were collected (1) before conventional or MBR treatment (post-preliminary), (2) after secondary or MBR treatment (post-secondary or post-MBR), (3) after tertiary treatment (one conventional plant only), and (4) after disinfection (post-disinfection). Glass-wool fiber filtration was used to concentrate enteric viruses from large volumes, and small volume grab samples were collected for direct-plating analyses for bacterial indicators and coliphage. After filtration, the viruses were eluted from the filter and further concentrated. The final concentrated sample volume (FCSV) was used for enteric virus analysis by use of two methods-cell culture and a molecular method, polymerase chain reaction (PCR). Quantitative PCR (qPCR) for DNA viruses and quantitative reverse-transcriptase PCR (qRT-PCR) for RNA viruses were used in this study. To support data interpretations, the assay limit of detection (ALOD) was set for each virus assay and used to determine sample reporting limits (SRLs). For qPCR and qRT-PCR the ALOD was an estimated value because it was not established according to established method detection limit procedures. The SRLs were different for each sample because effective sample volumes (the volume of the original sample that was actually used in each analysis) were different for each sample. Effective sample volumes were much less than the original sample volumes because of reductions from processing steps and (or) from when dilutions were made to minimize the effects from PCR-inhibiting substances. Codes were used to further qualify the virus data and indicate the level of uncertainty associated with each measurement. Quality-control samples were used to support data interpretations. 
Field and laboratory blanks for bacteria, coliphage, and enteric viruses were all below detection, indicating that it was unlikely that samples were contaminated from equipment or processing procedures. The absolute value log differences (AVLDs) between concurrent replicate pairs were calculated to identify the variability associated with each measurement. For bacterial indicators and coliphage, the AVLD results indicated that concentrations <10 colony-forming units or plaque-forming units per 100 mL can differ between replicates by as much as 1 log, whereas higher concentrations can differ by as much as 0.3 log. The AVLD results for viruses indicated that differences between replicates can be as great as 1.2 log...
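
    The absolute value log difference (AVLD) used above for replicate pairs is a simple calculation; a small Python sketch with made-up replicate concentrations:

        import math

        def avld(conc_a, conc_b):
            """Absolute value of the log10 difference between a replicate pair."""
            return abs(math.log10(conc_a) - math.log10(conc_b))

        # Illustrative replicates of 8 and 55 CFU/100 mL differ by ~0.84 log.
        print(round(avld(8, 55), 2))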

  11. Quantitative Method To Determine Sporicidal Decontamination of Building Surfaces by Gaseous Fumigants, and Issues Related to Laboratory-Scale Studies?

    PubMed Central

    Rastogi, Vipin K.; Wallace, Lalena; Smith, Lisa S.; Ryan, Shawn P.; Martin, Blair

    2009-01-01

    Chlorine dioxide gas and vaporous hydrogen peroxide sterilant have been used in the cleanup of building interiors contaminated with spores of Bacillus anthracis. A systematic study, in collaboration with the U.S. Environmental Protection Agency, was jointly undertaken by the U.S. Army-Edgewood Chemical Biological Center to determine the sporicidal efficacies of these two fumigants on six building structural materials: carpet, ceiling tile, unpainted cinder block, painted I-beam steel, painted wallboard, and unpainted pinewood. Critical issues related to high-throughput sample processing and spore recovery from porous and nonporous surfaces included (i) the extraction of spores from complex building materials, (ii) the effects of titer challenge levels on fumigant efficacy, and (iii) the impact of bioburden inclusion on spore recovery from surfaces and spore inactivation. Small pieces (1.3 by 1.3 cm for carpet, ceiling tile, wallboard, I-beam steel, and pinewood and 2.5 by 1.3 cm for cinder block) of the materials were inoculated with an aliquot of 50 µl containing the target number (1 × 10(6), 1 × 10(7), or 1 × 10(8)) of avirulent spores of B. anthracis NNR1Δ1. The aliquot was dried overnight in a biosafety cabinet, and the spores were extracted by a combination of a 10-min sonication and a 2-min vortexing using 0.5% buffered peptone water as the recovery medium. No statistically significant drop in the kill efficacies of the fumigants was observed when the spore challenge level was increased from 6 log units to 8 log units, even though a general trend toward inhibition of fumigant efficacy was evident. The organic burden (0 to 5%) in the spore inoculum resulted in a statistically significant drop in spore recovery (at the 2 or 5% level). The effect on spore killing was a function of the organic bioburden amount and the material type. In summary, a high-throughput quantitative method was developed for determining the efficacies of fumigants, and the spore recoveries from five porous materials and one nonporous material ranged between 20 and 80%. PMID:19346341

  12. An integrated methodology for quantitative assessment of proliferation resistance of advanced nuclear systems using probabilistic methods

    E-print Network

    Ham, Hyeongpil

    2005-01-01

    Proliferation is the result of a competition between the proliferating country (proliferator) and the party resisting the proliferation effort (safeguarder). An integrated evaluation methodology to evaluate proliferation ...

  13. "Inject-Mix-React-Separate-and-Quantitate" (IMReSQ) Method for Screening Enzyme Inhibitors

    E-print Network

    Krylov, Sergey

    are considered attractive therapeutic targets and their inhibitors are potential drug candidates. In this proof-of-principle work, we applied the method to study inhibition of the recently cloned protein farnesyltransferase (FT) from the parasite Entamoeba histolytica (Eh); this enzyme is a potential therapeutic target

  14. An Automatic Image Based Single Dilution Method for End Point Titre Quantitation of Antinuclear

    E-print Network

    Sanderson, Conrad

    Antibodies Tests using HEp-2 Cells. Arnold Wiliem, Peter Hobson, Rodney F. Minchin and Brian C. Lovell. The indirect immunofluorescence (IIF) test on human epithelial (HEp-2) cells has been the gold standard and hallmark method for identifying the presence of antinuclear antibodies

  15. A novel 3D absorption correction method for quantitative EDX-STEM tomography.

    PubMed

    Burdet, Pierre; Saghi, Z; Filippin, A N; Borrás, A; Midgley, P A

    2016-01-01

    This paper presents a novel 3D method to correct for absorption in energy dispersive X-ray (EDX) microanalysis of heterogeneous samples of unknown structure and composition. By using STEM-based tomography coupled with EDX, an initial 3D reconstruction is used to extract the location of generated X-rays as well as the X-ray path through the sample to the surface. The absorption correction needed to retrieve the generated X-ray intensity is then calculated voxel-by-voxel estimating the different compositions encountered by the X-ray. The method is applied to a core/shell nanowire containing carbon and oxygen, two elements generating highly absorbed low energy X-rays. Absorption is shown to cause major reconstruction artefacts, in the form of an incomplete recovery of the oxide and an erroneous presence of carbon in the shell. By applying the correction method, these artefacts are greatly reduced. The accuracy of the method is assessed using reference X-ray lines with low absorption. PMID:26484792
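
    The voxel-by-voxel correction weights the X-rays generated in each voxel by the absorption accumulated along the straight path to the sample surface. A schematic Python sketch of that path sum; the mass-attenuation coefficients, densities and voxel size are illustrative assumptions, not values from the paper.

        import numpy as np

        def path_absorption_factor(path_materials, voxel_cm, mu_rho, rho):
            """exp(-sum of (mu/rho) * rho * length) over the voxels crossed
            by the X-ray on its way to the surface."""
            optical_depth = sum(mu_rho[m] * rho[m] * voxel_cm for m in path_materials)
            return np.exp(-optical_depth)

        # Assumed coefficients for a strongly absorbed low-energy line.
        mu_rho = {"oxide_core": 4.5e3, "carbon_shell": 2.3e3}   # cm^2/g
        rho = {"oxide_core": 5.6, "carbon_shell": 1.9}          # g/cm^3
        print(path_absorption_factor(["oxide_core", "carbon_shell", "carbon_shell"],
                                     voxel_cm=5e-7, mu_rho=mu_rho, rho=rho))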

  16. Quantitation of brinzolamide in dried blood spots by a novel LC-QTOF-MS/MS method.

    PubMed

    Foivas, Anargyros; Malenović, Anđelija; Kostić, Nađa; Božić, Marija; Knežević, Miroslav; Loukas, Yannis L; Dotsikas, Yannis

    2016-02-01

    In the current study, a rapid and sensitive LC-QTOF-MS/MS method for the determination of brinzolamide in dried blood spots (DBS) was developed and validated. This novel sample collection, storage and transfer technique was suitable for analyzing a drug with high distribution into red blood cells and negligible plasma levels. The method included an isocratic mobile phase consisting of methanol and 10 mM ammonium formate (90:10, v/v) and detection in positive electrospray mode (ESI+). The flow rate was adjusted to 0.350 mL/min, yielding a retention time of 1.7 min for both brinzolamide and the internal standard (IS) rabeprazole on a Cyano analytical column. The validation of the proposed method over the concentration range 0.500-20.0 µg/mL was performed in compliance with EMEA and FDA guidelines, assessing all major performance characteristics. Inter- and intra-assay precisions were less than 14%, while inter- and intra-assay accuracies varied from 92.2 to 111%. No matrix effect was observed and the mean brinzolamide extraction recovery was 93.5%. The method was successfully applied to real DBS samples from patients in steady state condition, receiving brinzolamide ophthalmic suspension 1% (w/v) for several months. Initial concentrations were corrected for the hematocrit effect using an image-processing algorithm written in Matlab. PMID:26669612

  17. Quantitative evaluation of the sensitivity of library-based Raman spectral correlation methods.

    PubMed

    Rodriguez, Jason D; Westenberger, Benjamin J; Buhse, Lucinda F; Kauffman, John F

    2011-06-01

    Library-based Raman spectral correlation methods are widely used in surveillance applications in multiple areas including the pharmaceutical industry, where Raman spectroscopy is commonly used in verification screening of incoming raw materials. While these spectral correlation methods are rapid and require little or no sample preparation, their sensitivity to the presence of contaminants has not been adequately evaluated. This is particularly important when dealing with pharmaceutical excipients, which are susceptible to economically motivated adulteration by substances having similar physical/chemical/spectroscopic properties. We report a novel approach to evaluating the sensitivity of library-based Raman spectral correlation methods to contaminants in binary systems using a hit-quality index model. We examine three excipient/contaminant systems, glycerin/diethylene glycol, propylene glycol/diethylene glycol, and lactose/melamine and find that the sensitivity to contaminant for each system is 18%, 32%, and 4%, respectively. These levels are well-correlated to the minimum contaminant composition that can be detected by both verification and identification methods. Our studies indicate that the most important factor that determines the sensitivity of a spectral correlation measurement to the presence of contaminant is the relative Raman scattering cross section of the contaminant. PMID:21548558
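
    A hit-quality index is often defined as the squared, normalized dot product between sample and library spectra; scanning the contaminant fraction in a simulated binary mixture then shows where the index drops below a verification threshold. The exact HQI definition, threshold and synthetic spectra below are assumptions for illustration, not the paper's data.

        import numpy as np

        def hqi(sample, library):
            """Squared normalized dot product between two spectra (one common
            definition of the hit-quality index; the paper's exact form is assumed)."""
            num = np.dot(sample, library) ** 2
            den = np.dot(sample, sample) * np.dot(library, library)
            return num / den

        def detectable_fraction(excipient, contaminant, threshold=0.95):
            """Smallest contaminant fraction at which the mixture's HQI against
            the pure-excipient library spectrum falls below the threshold."""
            for f in np.linspace(0.0, 1.0, 1001):
                mix = (1 - f) * excipient + f * contaminant
                if hqi(mix, excipient) < threshold:
                    return f
            return None

        # Synthetic Gaussian "spectra" purely for illustration.
        x = np.linspace(0.0, 1.0, 500)
        excipient = np.exp(-((x - 0.3) / 0.02) ** 2)
        contaminant = np.exp(-((x - 0.7) / 0.02) ** 2)
        print(detectable_fraction(excipient, contaminant))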

  18. Quantitation of DNA sequences in environmental PCR products by a multiplexed, bead-based method.

    PubMed

    Spiro, Alexander; Lowe, Mary

    2002-02-01

    A first application of a multiplexed, bead-based method is described for determining the abundances of target sequences in an environmental PCR product. Target sequences as little as 0.3% of the total amount of DNA can be quantified. Tests were conducted on 16S ribosomal DNA sequences from microorganisms collected from contaminated groundwater. PMID:11823255

  19. Methods and results of boundary layer measurements on a glider

    NASA Technical Reports Server (NTRS)

    Nes, W. V.

    1978-01-01

    Boundary layer measurements were carried out on a glider under natural conditions. Two effects are investigated: the effect of inconstancy of the development of static pressure within the boundary layer and the effect of the negative pressure difference in a sublaminar boundary layer. The results obtained by means of an ion probe in parallel connection confirm those results obtained by means of a pressure probe. Additional effects which have occurred during these measurements are briefly dealt with.

  20. Phase development in normal and ultra high performance cementitious systems by quantitative X-ray analysis and thermoanalytical methods

    SciTech Connect

    Korpa, A.; Kowald, T.; Trettin, R.

    2009-02-15

    Quantitative X-ray diffraction (QXRD) and thermogravimetry (TG) methods are used to determine the phase development up to 28 days of hydration in normal and ultra high performance cementitious systems (UHPC) that do not contain aggregate. The phase development in the ultra high performance cementitious formulation is quantitatively and kinetically different from that in the normal concrete formulation. This is related to the different components employed and their associated reactions. For both formulations the most remarkable changes in phase contents are recorded between the first and second hydration day and up to the seventh day. After the seventh day fewer changes in phase content are measured. Because the water available is insufficient for complete hydration, a considerable amount of cement remains unhydrated in the UHPC formulation. The portlandite present in the UHPC specimen gives evidence of incomplete pozzolanic reactions even after 28 days of hydration, whereas the absence of calcite indicates insignificant carbonation in this specimen.

  1. Method of fabricating nested shells and resulting product

    DOEpatents

    Henderson, Timothy M. (Ann Arbor, MI); Kool, Lawrence B. (Ann Arbor, MI)

    1982-01-01

    A multiple shell structure and a method of manufacturing such structure wherein a hollow glass microsphere is surface treated in an organosilane solution so as to render the shell outer surface hydrophobic. The surface treated glass shell is then suspended in the oil phase of an oil-aqueous phase dispersion. The oil phase includes an organic film-forming monomer, a polymerization initiator and a blowing agent. A polymeric film forms at each phase boundary of the dispersion and is then expanded in a blowing operation so as to form an outer homogeneously integral monocellular substantially spherical thermoplastic shell encapsulating an inner glass shell of lesser diameter.

  2. Band-limited Green's Functions for Quantitative Evaluation of Acoustic Emission Using the Finite Element Method

    NASA Technical Reports Server (NTRS)

    Leser, William P.; Yuan, Fuh-Gwo; Leser, William P.

    2013-01-01

    A method of numerically estimating dynamic Green's functions using the finite element method is proposed. These Green's functions are accurate in a limited frequency range dependent on the mesh size used to generate them. This range can often match or exceed the frequency sensitivity of the traditional acoustic emission sensors. An algorithm is also developed to characterize an acoustic emission source by obtaining information about its strength and temporal dependence. This information can then be used to reproduce the source in a finite element model for further analysis. Numerical examples are presented that demonstrate the ability of the band-limited Green's functions approach to determine the moment tensor coefficients of several reference signals to within seven percent, as well as accurately reproduce the source-time function.
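
    Once band-limited Green's functions are in hand, estimating moment tensor coefficients from a recorded signal is, in the simplest view, a linear least-squares problem. A schematic Python sketch with synthetic stand-ins for the Green's functions and data (this is not the paper's algorithm for recovering the source-time function):

        import numpy as np

        # Recorded data modelled as a linear combination of per-component
        # Green's-function responses; the coefficients are the least-squares fit.
        rng = np.random.default_rng(0)
        n_samples, n_components = 2048, 6                      # time samples, tensor terms
        G = rng.standard_normal((n_samples, n_components))     # stand-in Green's functions
        m_true = np.array([1.0, 0.2, -0.5, 0.0, 0.3, 0.8])     # "true" coefficients
        d = G @ m_true + 0.01 * rng.standard_normal(n_samples) # synthetic recording

        m_est, *_ = np.linalg.lstsq(G, d, rcond=None)
        print(np.round(m_est, 3))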

  3. Quantitative Methods for Reservoir Characterization and Improved Recovery: Application to Heavy Oil Sands

    SciTech Connect

    Castle, James W.; Molz, Fred J.

    2003-02-07

    Improved prediction of interwell reservoir heterogeneity is needed to increase productivity and to reduce recovery cost for California's heavy oil sands, which contain approximately 2.3 billion barrels of remaining reserves in the Temblor Formation and in other formations of the San Joaquin Valley. This investigation involved application of advanced analytical property-distribution methods conditioned to continuous outcrop control for improved reservoir characterization and simulation.

  4. Quantitative Methods for Reservoir Characterization and Improved Recovery: Application to Heavy Oil Sands

    SciTech Connect

    Castle, James W.; Molz, Fred J.; Brame, Scott; Current, Caitlin J.

    2003-02-07

    Improved prediction of interwell reservoir heterogeneity was needed to increase productivity and to reduce recovery cost for California's heavy oil sands, which contain approximately 2.3 billion barrels of remaining reserves in the Temblor Formation and in other formations of the San Joaquin Valley. This investigation involved application of advanced analytical property-distribution methods conditioned to continuous outcrop control for improved reservoir characterization and simulation.

  5. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  6. Method for beam hardening correction in quantitative computed X-ray tomography

    NASA Technical Reports Server (NTRS)

    Yan, Chye Hwang (Inventor); Whalen, Robert T. (Inventor); Napel, Sandy (Inventor)

    2001-01-01

    Each voxel is assumed to contain exactly two distinct materials, and the volume fraction of each material is calculated iteratively. The method requires that the spectrum of the X-ray beam be known, and that the attenuation spectra of the materials in the object be known and monotonically decreasing with increasing X-ray photon energy. A volume fraction is then estimated for the voxel, and the transmitted spectrum is recalculated iteratively.
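
    A minimal sketch of the two-material idea under the stated assumptions: with a known beam spectrum and two known, monotonically decreasing attenuation spectra, the fraction along a path can be found by bisection so that the simulated polychromatic attenuation matches the measurement. The spectra and coefficients are illustrative; this is not the patented algorithm itself.

        import numpy as np

        E = np.linspace(20.0, 120.0, 101)            # keV grid
        S = np.exp(-((E - 60.0) / 30.0) ** 2)        # assumed beam spectrum
        S /= S.sum()
        mu1 = 1.0 * (E / 60.0) ** -2.8               # monotonically decreasing mu(E)
        mu2 = 0.2 * (E / 60.0) ** -3.0

        def attenuation(f, L=1.0):
            """Polychromatic attenuation for a mix of materials 1 and 2."""
            mu = f * mu1 + (1 - f) * mu2
            return -np.log(np.sum(S * np.exp(-mu * L)))

        def solve_fraction(measured, tol=1e-6):
            """Bisection for the volume fraction of material 1."""
            lo, hi = 0.0, 1.0
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if attenuation(mid) < measured:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        print(round(solve_fraction(attenuation(0.37)), 4))   # recovers ~0.37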

  7. Simple method for the quantitative examination of extra column band broadening in microchromatographic systems.

    PubMed

    Beisler, Amy T; Schaefer, Kathleen E; Weber, Stephen G

    2003-02-01

    In recent years capillary chromatography has gained popularity for trace analyses. Most often UV or electrochemical detection is employed because the small peak volumes make post-column derivatization challenging. We have developed a simple method based on flow injection for determining contributions to peak broadening from post-column reactors. The only requirement for application of our methodology is that diffusion be in the Taylor regime so that radial concentration gradients are relaxed enabling mixing purely by diffusion. PMID:12597631
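
    Extra-column band broadening is conventionally quantified through the additivity of peak variances: the reactor's contribution is the difference between the variance of a flow-injection peak measured with the reactor in line and one measured without it. A sketch under that assumption, using synthetic Gaussian peaks:

        import numpy as np

        def peak_variance(t, signal):
            """Second central moment of a detector peak (time units squared)."""
            w = signal / signal.sum()
            mean = (t * w).sum()
            return ((t - mean) ** 2 * w).sum()

        def reactor_variance(t, with_reactor, without_reactor):
            """Variance contributed by the post-column reactor, by difference."""
            return peak_variance(t, with_reactor) - peak_variance(t, without_reactor)

        # Synthetic Gaussian peaks purely for illustration (sigma 2.0 s vs 1.5 s).
        t = np.linspace(0.0, 60.0, 6001)
        without = np.exp(-((t - 30.0) / 1.5) ** 2 / 2.0)
        with_r = np.exp(-((t - 32.0) / 2.0) ** 2 / 2.0)
        print(round(reactor_variance(t, with_r, without), 2))   # ~4.0 - 2.25 = 1.75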

  8. Quantitative method of determining beryllium or a compound thereof in a sample

    DOEpatents

    McCleskey, T. Mark (Los Alamos, NM); Ehler, Deborah S. (Los Alamos, NM); John, Kevin D. (Santa Fe, NM); Burrell, Anthony K. (Los Alamos, NM); Collis, Gavin E. (Los Alamos, NM); Minogue, Edel M. (Los Alamos, NM); Warner, Benjamin P. (Los Alamos, NM)

    2010-08-24

    A method of determining beryllium or a beryllium compound thereof in a sample, includes providing a sample suspected of comprising beryllium or a compound thereof, extracting beryllium or a compound thereof from the sample by dissolving in a solution, adding a fluorescent indicator to the solution to thereby bind any beryllium or a compound thereof to the fluorescent indicator, and determining the presence or amount of any beryllium or a compound thereof in the sample by measuring fluorescence.

  9. Quantitative method of determining beryllium or a compound thereof in a sample

    DOEpatents

    McCleskey, T. Mark; Ehler, Deborah S.; John, Kevin D.; Burrell, Anthony K.; Collis, Gavin E.; Minogue, Edel M.; Warner, Benjamin P.

    2006-10-31

    A method of determining beryllium or a beryllium compound thereof in a sample, includes providing a sample suspected of comprising beryllium or a compound thereof, extracting beryllium or a compound thereof from the sample by dissolving in a solution, adding a fluorescent indicator to the solution to thereby bind any beryllium or a compound thereof to the fluorescent indicator, and determining the presence or amount of any beryllium or a compound thereof in the sample by measuring fluorescence.

  10. A quantitative method to evaluate microbial electrolysis cell effectiveness for energy recovery

    E-print Network

    for treatment was 2.17 kW h/kg COD for industrial wastewater and 2.59 kW h/kg COD for domestic wastewater, was the easiest and most direct method to optimize MEC performance for industrial wastewater treatment. A pre... the industrial wastewater (1839 ± 57 mg/L), but treatment was achieved in significantly less time (70 h versus

  11. A Comparative Evaluation of Stress-Strain and Acoustic Emission Methods for Quantitative Damage Assessments of Brittle Rock

    NASA Astrophysics Data System (ADS)

    Kim, Jin-Seop; Lee, Kyung-Soo; Cho, Won-Jin; Choi, Heui-Joo; Cho, Gye-Chun

    2015-03-01

    The purpose of this study is to identify the crack initiation and damage stress thresholds of granite from the Korea Atomic Energy Research Institute's Underground Research Tunnel (KURT). From this, a quantitative damage evolution was inferred using various methods, including the crack volumetric strain, b value, the damage parameter from the moment tensor, and the acoustic emission (AE) energy. Uniaxial compression tests were conducted, during which both the stress-strain and AE activity were recorded simultaneously. The crack initiation threshold was found at a stress level of 0.42-0.53 σc, and the crack damage threshold was identified at 0.62-0.84 σc. The normalized integrity of KURT granite was inferred at each stress level from the damage parameter by assuming that the damage is accumulated beyond the crack initiation stress threshold. The maximum deviation between the crack volumetric strain and the AE method was 16.0 %, which was noted at a stress level of 0.84 σc. The damage parameters of KURT granite derived from a mechanically measured stress-strain relationship (crack volumetric strain) were successfully related and compared to those derived from physically detected acoustic emission waves. From a comprehensive comparison of damage identification and quantification methods, it was finally suggested that damage estimations using the AE energy method are preferred from the perspectives of practical field applicability and the reliability of the obtained damage values.
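
    Crack volumetric strain in uniaxial compression is usually obtained by subtracting the elastic volumetric strain from the measured volumetric strain; the standard relation below is a reasonable reading of the "crack volumetric strain" method cited above, with Young's modulus and Poisson's ratio as assumed inputs (the numbers are not KURT granite data).

        def crack_volumetric_strain(eps_axial, eps_lateral, sigma_axial, E, nu):
            """Measured volumetric strain minus the elastic part (uniaxial loading).
            eps_* are strains; sigma_axial is in the same stress units as E."""
            eps_v = eps_axial + 2.0 * eps_lateral
            eps_v_elastic = (1.0 - 2.0 * nu) * sigma_axial / E
            return eps_v - eps_v_elastic

        # Illustrative numbers only.
        print(crack_volumetric_strain(eps_axial=2.0e-3, eps_lateral=-0.6e-3,
                                      sigma_axial=60e6, E=50e9, nu=0.25))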

  12. A new quantitative method for evaluating freezing of gait and dual-attention task deficits in Parkinson's disease.

    PubMed

    Chomiak, Taylor; Pereira, Fernando Vieira; Meyer, Nicole; de Bruin, Natalie; Derwent, Lorelei; Luan, Kailie; Cihal, Alexandra; Brown, Lesley A; Hu, Bin

    2015-11-01

    People with Parkinson's disease (PD) can exhibit disabling gait symptoms such as freezing of gait, especially when distracted by a secondary task. Quantitative measurement methods for this type of cognitive-motor abnormality, however, remain poorly developed. Here we examined whether stepping-in-place (SIP) with a concurrent mental task (e.g., subtraction) can be used as a simple method for evaluating cognitive-motor deficits in PD. We used a 4th generation iPod Touch sensor system to capture hip flexion data and obtain step height (SH) measurements (z axis). The accuracy of the method was compared to and validated by kinematic video analysis software. We found a general trend of reduced SH for PD subjects relative to controls under all conditions. However, the SH of PD freezers was significantly worse than that of PD non-freezers and controls during concurrent serial 7 subtraction and SIP tasking. During serial 7 subtraction, SH was significantly associated with whether or not a PD patient was a self-reported freezer, even when controlling for disease severity. Given that this SIP-based dual-task paradigm is not limited by space requirements and can be quantified using a mobile tracking device that delivers specifically designed auditory task instructions, the method reported here may be used to standardize clinical assessment of cognitive-motor deficits under a variety of dual-task conditions in PD. PMID:26206604

  13. An Automated and Quantitative Method to Evaluate Progression of Striatal Pathology in Huntington’s Disease Transgenic Mice

    PubMed Central

    Egorova, Polina; Bezprozvanny, Ilya

    2015-01-01

    Huntington’s disease (HD) is a progressive neurodegenerative disorder caused by a polyglutamine expansion in the Huntingtin protein which results in the selective degeneration of striatal medium spiny neurons (MSN). A number of genetic mouse models have been developed to model the HD phenotype. Most of these models display impaired performance in motor coordination assays and a variety of neuropathological abnormalities. Quantitative neuropathological assessment in these mice requires application of stereological techniques and is very labor-intensive and time-consuming. Here, we report the development of a novel paradigm that simplifies and accelerates quantitative evaluation of striatal atrophy in HD mice. To achieve this goal, we crossed YAC128 HD transgenic mice with Rgs9-EGFP mice. In Rgs9-EGFP mice the EGFP transgene is expressed selectively in MSN neurons at high levels. Using a high resolution fluorescence laser scanning imager, we have been able to precisely measure striatal area and intensity of EGFP expression in coronal slices from these mice at 2 months, 4 months and 9 months of age. Using this approach, we demonstrated a significant reduction in striatal volume in YAC128 mice at 4 months and 9 months of age when compared to wild type littermates. We evaluated behavioral performance of these mice at 2 months, 4 months and 6 months of age and demonstrated significant impairment of YAC128 mice in the beam walk assay at 4 months and 6 months of age. This new mouse model and the quantitative neuropathological scoring paradigm may simplify and accelerate discovery of novel neuroprotective agents for HD. PMID:25575955

  14. Historical Sulfur Dioxide Emissions 1850-2000: Methods and Results

    SciTech Connect

    Smith, Steven J.; Andres, Robert; Conception, Elvira; Lurz, Joshua

    2004-01-25

    A global, self-consistent estimate of sulfur dioxide emissions over the last one and a half centuries was produced using a combination of bottom-up and best-available inventory methods, including all anthropogenic sources. We find that global sulfur dioxide emissions peaked about 1980 and have generally declined since that time. Emissions were extrapolated to a 1{sup o} x 1{sup o} grid for the time period 1850-2000 at annual resolution with two emission height levels and by season. Emissions in the recent past are somewhat higher in this new work as compared with some comprehensive estimates. This difference is largely due to our use of emission factors that vary with time to account for sulfur removals from fossil fuels and industrial smelting processes.
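
    Bottom-up estimates of this kind multiply activity (fuel use) by sulfur content and by a time-varying factor for sulfur removed or retained, then convert sulfur mass to SO2. A Python sketch with assumed numbers only, not values from the inventory:

        # Bottom-up SO2 estimate for one fuel in one year.
        SO2_PER_S = 64.066 / 32.065   # molar mass ratio, ~2.0

        def so2_emissions_kt(fuel_use_kt, sulfur_fraction, removal_fraction):
            return fuel_use_kt * sulfur_fraction * (1.0 - removal_fraction) * SO2_PER_S

        # e.g. 50,000 kt of coal at 1.2% S with 30% of the sulfur removed/retained:
        print(round(so2_emissions_kt(50_000, 0.012, 0.30), 1))   # ~839.2 kt SO2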

  15. Quantitative Methods for Reservoir Characterization and Improved Recovery: Application to Heavy Oil Sands

    SciTech Connect

    Castle, James W.; Molz, Fred W.; Bridges, Robert A.; Dinwiddie, Cynthia L.; Lorinovich, Caitlin J.; Lu, Silong

    2003-02-07

    This project involved application of advanced analytical property-distribution methods conditioned to continuous outcrop control for improved reservoir characterization and simulation. The investigation was performed in collaboration with Chevron Production Company U.S.A. as an industrial partner, and incorporates data from the Temblor Formation in Chevron's West Coalinga Field, California. Improved prediction of interwell reservoir heterogeneity was needed to increase productivity and to reduce recovery cost for California's heavy oil sands, which contained approximately 2.3 billion barrels of remaining reserves in the Temblor Formation and in other formations of the San Joaquin Valley.

  16. A Recurrence Method for Generalizing Known Scientific Results

    E-print Network

    Florentin Smarandache

    2010-03-26

    A great number of articles widen a known scientific result $P(a)$ (such as: a theorem, an inequality, or a math/physics/chemistry etc. proposition or formula) by a simple recurrence procedure and using, in the proof, the proposition $P(a)$ itself. We present, as examples, the generalizations of Hölder's inequality, of Minkowski's inequality, of Tchebychev's inequality, and of the Theorem of Menelaus respectively.
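
    As one concrete instance of the recurrence idea, the two-function Hölder inequality $P(2)$ extends by induction to $n$ functions. A standard statement (given here for orientation, not quoted from the paper) is

        $$\int \prod_{i=1}^{n} |f_i| \, d\mu \;\le\; \prod_{i=1}^{n} \|f_i\|_{p_i},
        \qquad \sum_{i=1}^{n} \frac{1}{p_i} = 1,\quad p_i \ge 1,$$

    where the inductive step applies $P(2)$ to $f_1$ and $\prod_{i=2}^{n}|f_i|$ with exponents $p_1$ and $p_1/(p_1-1)$, then invokes the induction hypothesis on the remaining $n-1$ functions.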

  17. An exploratory method to detect tephras from quantitative XRD scans: Examples from Iceland and east Greenland marine sediments

    USGS Publications Warehouse

    Andrews, John T.; Eberl, D.D.; Kristjansdottir, G.B.

    2006-01-01

    Tephras, mainly from Iceland, are becoming increasingly important in interpreting leads and lags in the Holocene climate system across NW Europe. Here we demonstrate that Quantitative Phase Analysis of X-ray diffractograms of the 150 µm fraction can detect volcanic glass, and we identify these same peaks in XRD scans; two of these correlate geochemically and chronologically with Hekla 1104 and Hekla 3. At a distal site to the WNW of Iceland, on the East Greenland margin (core MD99-2317), the weight% of volcanic glass reaches values of 11% at about the time of the Saksunarvatn tephra. The XRD method identifies the presence of volcanic glass but not its elemental composition; hence it will assist in focusing attention on specific sections of sediment cores for subsequent geochemical fingerprinting of tephras. © 2006 SAGE Publications.

  18. An ultrasensitive method for quantitating circulating tumor DNA with broad patient coverage

    PubMed Central

    Newman, Aaron M.; Bratman, Scott V.; To, Jacqueline; Wynne, Jacob F.; Eclov, Neville C. W.; Modlin, Leslie A.; Liu, Chih Long; Neal, Joel W.; Wakelee, Heather A.; Merritt, Robert E.; Shrager, Joseph B.; Loo, Billy W.

    2013-01-01

    Circulating tumor DNA (ctDNA) represents a promising biomarker for noninvasive assessment of cancer burden, but existing methods have insufficient sensitivity or patient coverage for broad clinical applicability. Here we introduce CAncer Personalized Profiling by deep Sequencing (CAPP-Seq), an economical and ultrasensitive method for quantifying ctDNA. We implemented CAPP-Seq for non-small cell lung cancer (NSCLC) with a design covering multiple classes of somatic alterations that identified mutations in >95% of tumors. We detected ctDNA in 100% of stage II–IV and 50% of stage I NSCLC patients, with 96% specificity for mutant allele fractions down to ~0.02%. Levels of ctDNA significantly correlated with tumor volume, distinguished between residual disease and treatment-related imaging changes, and provided earlier response assessment than radiographic approaches. Finally, we explored biopsy-free tumor screening and genotyping with CAPP-Seq. We envision that CAPP-Seq could be routinely applied clinically to detect and monitor diverse malignancies, thus facilitating personalized cancer therapy. PMID:24705333

  19. Determining concentrations of 2-bromoallyl alcohol and dibromopropene in ground water using quantitative methods

    USGS Publications Warehouse

    Panshin, Sandra Y.

    1997-01-01

    A method for determining levels of 2-bromoallyl alcohol and 2,3-dibromopropene from ground-water samples using liquid/liquid extraction followed by gas chromatography/mass spectrometry is described. Analytes were extracted from the water using three aliquots of dichloromethane. The aliquots were combined and reduced in volume by rotary evaporation followed by evaporation using a nitrogen stream. The extracts were analyzed by capillary-column gas chromatography/mass spectrometry in the full-scan mode. Estimated method detection limits were 30 nanograms per liter for 2-bromoallyl alcohol and 10 nanograms per liter for 2,3-dibromopropene. Recoveries were determined by spiking three matrices at two concentration levels (0.540 and 5.40 micrograms per liter for 2-bromoallyl alcohol; and 0.534 and 5.34 micrograms per liter for dibromopropene). For seven replicates of each matrix at the high concentration level, the mean percent recoveries ranged from 43.9 to 64.9 percent for 2-bromoallyl alcohol, and from 87.5 to 99.3 percent for dibromopropene. At the low concentration level, the mean percent recoveries ranged from 43.8 to 95.2 percent for 2-bromoallyl alcohol, and from 71.3 to 84.9 percent for dibromopropene.

  20. Quantitative analysis of trace bulk oxygen in silicon wafers using an inert gas fusion method.

    PubMed

    Uchihara, Hiroshi; Ikeda, Masahiko; Nakahara, Taketoshi

    2003-11-01

    This paper describes a method for removing oxide film from the surface of silicon wafers using an inert gas fusion impulse furnace and precise determination of the bulk oxygen within the wafer. A silicon wafer was cut to about 0.35 g (6 x 13 x 2 mm) and dropped into a graphite crucible. The sample was then heated for 40 s at 1300 degrees C. The wafer's oxide film was reduced by carbon and removed as carbon monoxide. The treated silicon sample was removed from the graphite crucible and mounted again in the holder of the oxygen analyzer. The graphite crucible was then heated to 2100 degrees C. The treated silicon sample was dropped into the heated graphite crucible and the trace bulk oxygen in the wafer was measured using the inert gas fusion infrared absorption method. The relative standard deviations of the oxygen in silicon wafer samples with the surface oxide film removed were determined to be 0.8% for 9.8 x 10(17) atoms/cm3, and 2.7% for 13.0 x 10(17) atoms/cm3. PMID:14640456

  1. A method for quantitative mapping of thick oil spills using imaging spectroscopy

    USGS Publications Warehouse

    Clark, Roger N.; Swayze, Gregg A.; Leifer, Ira; Livo, K. Eric; Kokaly, Raymond F.; Hoefen, Todd; Lundeen, Sarah; Eastwood, Michael; Green, Robert O.; Pearson, Neil; Sarture, Charles; McCubbin, Ian; Roberts, Dar; Bradley, Eliza; Steele, Denis; Ryan, Thomas; Dominguez, Roseanne; The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) Team

    2010-01-01

    In response to the Deepwater Horizon oil spill in the Gulf of Mexico, a method of near-infrared imaging spectroscopic analysis was developed to map the locations of thick oil floating on water. Specifically, this method can be used to derive, in each image pixel, the oil-to-water ratio in oil emulsions, the sub-pixel areal fraction, and its thicknesses and volume within the limits of light penetration into the oil (up to a few millimeters). The method uses the shape of near-infrared (NIR) absorption features and the variations in the spectral continuum due to organic compounds found in oil to identify different oil chemistries, including its weathering state and thickness. The method is insensitive to complicating conditions such as moderate aerosol scattering and reflectance level changes from other conditions, including moderate sun glint. Data for this analysis were collected by the NASA Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) instrument, which was flown over the oil spill on May 17, 2010. Because of the large extent of the spill, AVIRIS flight lines could cover only a portion of the spill on this relatively calm, nearly cloud-free day. Derived lower limits for oil volumes within the top few millimeters of the ocean surface directly probed with the near-infrared light detected in the AVIRIS scenes were 19,000 (conservative assumptions) to 34,000 (aggressive assumptions) barrels of oil. AVIRIS covered about 30 percent of the core spill area, which consisted of emulsion plumes and oil sheens. Areas of oil sheen but lacking oil emulsion plumes outside of the core spill were not evaluated for oil volume in this study. If the core spill areas not covered by flight lines contained similar amounts of oil and oil-water emulsions, then extrapolation to the entire core spill area defined by a MODIS (Terra) image collected on the same day indicates a minimum of 66,000 to 120,000 barrels of oil was floating on the surface. These estimates are preliminary and subject to revision pending further analysis. Based on laboratory measurements, near-infrared (NIR) photons penetrate only a few millimeters into oil-water emulsions. As such, the oil volumes derived with this method are lower limits. Further, the detection is only of thick surface oil and does not include sheens, underwater oil, or oil that had already washed onto beaches and wetlands, been burned, or evaporated as of May 17. Because NIR light penetration within emulsions is limited, and having made field observations that oil emulsions sometimes exceeded 20 millimeters in thickness, we estimate that the volume of oil, including oil thicker than can be probed in the AVIRIS imagery, is possibly as high as 150,000 barrels in the AVIRIS scenes. When this value is projected to the entire spill, it gives a volume of about 500,000 barrels for thick oil remaining on the sea surface as of May 17. AVIRIS data cannot be used to confirm this higher volume, and additional field work including more in-situ measurements of oil thickness would be required to confirm this higher oil volume. Both the directly detected minimum range of oil volume, and the higher possible volume projection for oil thicker than can be probed with NIR spectroscopy, imply a significantly higher total volume of oil relative to that implied by the early NOAA (National Oceanic and Atmospheric Administration) estimate of 5,000 barrels per day reported on their Web site.
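
    The extrapolation from direct detections to the full core spill area is essentially area scaling. The Python sketch below divides the directly probed volumes by the flight-line coverage fraction; it lands near, but not exactly on, the 66,000 to 120,000 barrel range above, because the published figures used the MODIS-defined core area rather than a flat 30 percent scaling.

        def extrapolate_volume(detected_barrels, coverage_fraction):
            """Scale a volume detected over part of the spill to the whole core
            area, assuming similar oil loading per unit area elsewhere."""
            return detected_barrels / coverage_fraction

        for label, detected in [("conservative", 19_000), ("aggressive", 34_000)]:
            print(label, round(extrapolate_volume(detected, 0.30)))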

  2. Quantitative J correlation methods for the accurate measurement of 13C'-13Calpha dipolar couplings in proteins.

    PubMed

    Jaroniec, Christopher P; Ulmer, Tobias S; Bax, Ad

    2004-10-01

    Methods are described for the precise and accurate measurement of one-bond dipolar (13)C'-(13)C(alpha) couplings in weakly aligned proteins. The experiments are based on the principle of quantitative J correlation, where (1)J(C'C(alpha)) (or (1)J(C'C(alpha)) + (1)D(C'C(alpha))) is measured from the relative intensity of two interleaved 3D TROSY-HN(CO)CA or 3D TROSY-HNCO spectra recorded with dephasing intervals of zero (reference spectrum) and approximately 3/(2(1)J(C'C(alpha))) (attenuated spectrum). In analogy to other quantitative J correlation techniques, the random error in the measured (1)J(C'C(alpha)) value is inversely proportional to the signal-to-noise ratio in the reference spectrum. It is shown that for weakly aligned proteins, with the magnitude of the alignment tensor D(a)(NH) < or = 10-15 Hz, the systematic errors are typically negligible. The methods are demonstrated for the third IgG-binding domain of protein G (GB3) and alpha-synuclein in complex with a detergent micelle, where errors in (1)D(C'C(alpha)) of less than 0.1 Hz and ca. 0.2 Hz, respectively, are estimated. Remarkably, the dipolar couplings determined for GB3 are in even better agreement with the recently refined 1.1-angstrom X-ray structure than the input (13)C'-(13)C(alpha) couplings used for the refinement. PMID:15666562

  3. First results from the Very Small Array -- I. Observational methods

    E-print Network

    Robert A. Watson; Pedro Carreira; Kieran Cleary; Rod D. Davies; Richard J. Davis; Clive Dickinson; Keith Grainge; Carlos M. Gutierrez; Michael P. Hobson; Michael E. Jones; Rudiger Kneissl; Anthony Lasenby; Klaus Maisinger; Guy G. Pooley; Rafael Rebolo; Jose Alberto Rubino-Martin; Ben Rusholme; Richard D. E. Saunders; Richard Savage; Paul F. Scott; Anze Slosar; Pedro J. Sosa Molina; Angela C. Taylor; David Titterington; Elizabeth Waldram; Althea Wilkinson

    2003-03-04

    The Very Small Array (VSA) is a synthesis telescope designed to image faint structures in the cosmic microwave background on degree and sub-degree angular scales. The VSA has key differences from other CMB interferometers with the result that different systematic errors are expected. We have tested the operation of the VSA with a variety of blank-field and calibrator observations and cross-checked its calibration scale against independent measurements. We find that systematic effects can be suppressed below the thermal noise level in long observations; the overall calibration accuracy of the flux density scale is 3.5 percent and is limited by the external absolute calibration scale.

  4. Light scattering by Prorocentrum micans : a new method and results

    SciTech Connect

    Lofftus, K.D.; Quinby-Hunt, M.S.; Hunt, A.J. ); Livolant, F.; Maestre, M. )

    1992-05-20

    Striking light-scattering behavior was observed from a marine dinoflagellate, {ital Prorocentrum} {ital micans}. Measurements of the angular dependence of the 16 Mueller matrix elements were performed on single cells with a polarization-modulation nephelometer by using a new method for cell immobilization. First the dinoflagellate cells were immobilized in a transparent silica gel containing alcohol, and then a second liquid was diffused into the gel to match the index of refraction of the gel network, thereby producing a transparent support medium that scatters less than one tenth the amount of light scattered by a single cell at 90{degree}. Measurements of scattering by a single cell revealed that all 16 matrix elements were significantly nonzero and different from each other. All matrix elements have an extremely rich, reproducible structure that is highly dependent on cell orientation. The matrix elements symmetrically across the diagonal were not equivalent. Striking features of the measurements are the large peak values of S{sub 13}, S{sub 14}, and other off-diagonal block elements. We believe that this is the first report of such scattering signals by single, suspended marine microorganisms.

  5. Metabolic scaling in animals: methods, empirical results, and theoretical explanations.

    PubMed

    White, Craig R; Kearney, Michael R

    2014-01-01

    Life on earth spans a size range of around 21 orders of magnitude across species and can span a range of more than 6 orders of magnitude within species of animal. The effect of size on physiology is, therefore, enormous and is typically expressed by how physiological phenomena scale with mass(b). When b ≠ 1 a trait does not vary in direct proportion to mass and is said to scale allometrically. The study of allometric scaling goes back to at least the time of Galileo Galilei, and published scaling relationships are now available for hundreds of traits. Here, the methods of scaling analysis are reviewed, using examples for a range of traits with an emphasis on those related to metabolism in animals. Where necessary, new relationships have been generated from published data using modern phylogenetically informed techniques. During recent decades one of the most controversial scaling relationships has been that between metabolic rate and body mass and a number of explanations have been proposed for the scaling of this trait. Examples of these mechanistic explanations for metabolic scaling are reviewed, and suggestions made for comparing between them. Finally, the conceptual links between metabolic scaling and ecological patterns are examined, emphasizing the distinction between (1) the hypothesis that size- and temperature-dependent variation among species and individuals in metabolic rate influences ecological processes at levels of organization from individuals to the biosphere and (2) mechanistic explanations for metabolic rate that may explain the size- and temperature-dependence of this trait. PMID:24692144
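
    The scaling exponent b is conventionally estimated by least squares on log-transformed data (with phylogenetically informed methods refining that fit). A minimal Python sketch on synthetic data, purely to show the arithmetic:

        import numpy as np

        # Ordinary least-squares fit of log(rate) = log(a) + b*log(mass).
        # Synthetic data with a "true" exponent of 0.75, for illustration only.
        rng = np.random.default_rng(1)
        mass = np.logspace(-3, 3, 60)                        # kg, spanning 6 orders
        rate = 2.0 * mass ** 0.75 * np.exp(0.05 * rng.standard_normal(mass.size))

        b, log_a = np.polyfit(np.log(mass), np.log(rate), 1)
        print(round(b, 3), round(np.exp(log_a), 3))          # ~0.75 and ~2.0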

  6. Quantitative radiochemical methods for determination of the sources of natural radioactivity

    USGS Publications Warehouse

    Rosholt, J.N., Jr.

    1957-01-01

    Study of the state of equilibrium of any natural radioactive source requires determination of several key nuclides or groups of nuclides to find their contribution to the total amount of radioactivity. Alpha activity measured by scintillation counting is used for determination of protactinium-231, thorium-232, thorium-230, and radium-226. The chemical procedures for the separations of the specific elements are described, as well as the measurement techniques used to determine the abundances of the individual isotopes. To correct for deviations in the ore standards, an independent means of evaluating the efficiencies of the individual separations and measurements is used. The development of these methods of radiochemical analysis facilitates detailed investigation of the major sources of natural radioactivity.

  7. Mass Spectrometry Applications for the Identification and Quantitation of Biomarkers Resulting from Human Exposure to Chemical Warfare Agents

    NASA Astrophysics Data System (ADS)

    Smith, J. Richard; Capacio, Benedict R.

    In recent years, a number of analytical methods using biomedical samples such as blood and urine have been developed for the verification of exposure to chemical warfare agents. The majority of methods utilize gas or liquid chromatography in conjunction with mass spectrometry. In a small number of cases of suspected human exposure to chemical warfare agents, biomedical specimens have been made available for testing. This chapter provides an overview of biomarkers that have been verified in human biomedical samples, details of the exposure incidents, the methods utilized for analysis, and the biomarker concentration levels determined in the blood and/or urine.

  8. The JCMT Gould Belt Survey: a quantitative comparison between SCUBA-2 data reduction methods

    NASA Astrophysics Data System (ADS)

    Mairs, S.; Johnstone, D.; Kirk, H.; Graves, S.; Buckle, J.; Beaulieu, S. F.; Berry, D. S.; Broekhoven-Fiene, H.; Currie, M. J.; Fich, M.; Hatchell, J.; Jenness, T.; Mottram, J. C.; Nutter, D.; Pattle, K.; Pineda, J. E.; Salji, C.; Francesco, J. Di; Hogerheijde, M. R.; Ward-Thompson, D.; JCMT Gould Belt survey Team

    2015-12-01

    Performing ground-based submillimetre observations is a difficult task as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and time variation in weather and instrument stability. Removing these features and other artefacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and The Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reduction both use the same software (STARLINK) but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region and the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves more in-depth physical analyses of star-forming regions. Using the GBS LR1 method, we find that compact sources are recovered well, even at a peak brightness of only three times the noise, whereas the reconstruction of larger objects requires much care when drawing boundaries around the expected astronomical signal in the data reduction process. Incorrect boundaries can lead to false structure identification or it can cause structure to be missed. In the JCMT LR1 reduction, the extent of the true structure of objects larger than a point source is never fully recovered.

  9. A quantitative radioluminographic imaging method for evaluating lateral diffusion rates in skin.

    PubMed

    Rush, Allison K; Miller, Matthew A; Smith, Edward D; Kasting, Gerald B

    2015-10-28

    A method is presented for measuring the lateral diffusion coefficients of exogenously applied compounds on excised skin. The method involves sequential high resolution imaging of the spatial distribution of β-radiation associated with [(14)C]-labeled compounds to monitor the development of the concentration profile on the skin surface. It is exemplified by measurements made on three radiolabeled test compounds--caffeine, testosterone, and zinc pyrithione (ZnPT)--administered as solutions. Lateral diffusivity is expected to be an important determinant of the topical bioavailability of ZnPT, which is characteristically administered as a fine suspension and must reach microorganisms in molecular form to exert biocidal activity. Application of the test compounds at levels below and above their estimated saturation doses in the upper stratum corneum allows one to distinguish between diffusion-limited and dissolution rate-limited kinetics. The effective lateral diffusivities of the two chemically stable reference compounds, caffeine and testosterone, were (1-4) × 10(-9) cm(2)/s and (3-9) × 10(-9) cm(2)/s, respectively. Lateral transport of [(14)C] associated with ZnPT was formulation-dependent, with effective diffusivities of (1-2) × 10(-9) cm(2)/s in water and (3-9) × 10(-9) cm(2)/s in a 1% body wash solution. These differences are thought to be related to molecular speciation and/or the presence of a residual surfactant phase on the skin surface. All values were greater than those estimated for the transverse diffusivities of these compounds in stratum corneum by factors ranging from 250 to over 2000. Facile lateral transport on skin, combined with a low transdermal permeation rate, may thus be seen to be a key factor in the safe and effective use of ZnPT as a topical antimicrobial agent. PMID:26241749
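
    As an illustration of how an effective lateral diffusivity can be extracted from a broadening surface concentration profile, the following minimal Python sketch assumes the profile spreads as a two-dimensional Gaussian whose variance grows as sigma^2 = 4Dt; all numbers are hypothetical and are not taken from the record above.

```python
import numpy as np

def effective_lateral_diffusivity(sigma_cm, t_s):
    """Effective 2-D lateral diffusivity from the Gaussian spread of a surface
    concentration profile: sigma^2 = 4*D*t  =>  D = sigma^2 / (4*t)."""
    return sigma_cm ** 2 / (4.0 * t_s)

# Hypothetical example: a profile that has broadened to sigma = 200 um (0.02 cm)
# after 24 h of lateral spreading on the skin surface.
D = effective_lateral_diffusivity(sigma_cm=0.02, t_s=24 * 3600)
print(f"D_eff ~ {D:.1e} cm^2/s")   # ~1.2e-9 cm^2/s, the order of magnitude reported above
```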

  10. A Simultaneous Metabolic Profiling and Quantitative Multimetabolite Metabolomic Method for Human Plasma Using Gas-Chromatography Tandem Mass Spectrometry.

    PubMed

    Savolainen, Otto I; Sandberg, Ann-Sofie; Ross, Alastair B

    2016-01-01

    For the first time it is possible to simultaneously collect targeted and nontargeted metabolomics data from plasma based on GC with high scan speed tandem mass spectrometry (GC-MS/MS). To address the challenge of getting broad metabolome coverage while quantifying known biomarker compounds in high-throughput GC-MS metabolomics, we developed a novel GC-MS/MS metabolomics method using a high scan speed (20 000 Da/second) GC-MS/MS that enables simultaneous data acquisition of both nontargeted full scan and targeted quantitative tandem mass spectrometry data. The combination of these two approaches has hitherto not been demonstrated in metabolomics. This method allows reproducible quantification of at least 37 metabolites using multiple reaction monitoring (MRM) and full mass spectral scan-based detection of 601 reproducible metabolic features from human plasma. The method showed good linearity over normal concentrations in plasma (0.06-343 to 0.86-4800 µM depending on the metabolite) and good intra- and interbatch precision (0.9-16.6 and 2.6-29.6% relative standard deviation). Based on the parameters determined for this method, targeted quantification using MRM can be expanded to cover at least 508 metabolites while still collecting full scan data. The new simultaneous targeted and nontargeted metabolomics method enables more sensitive and accurate detection of predetermined metabolites and biomarkers of interest, while still allowing detection and identification of unknown metabolites. This is the first validated GC-MS/MS metabolomics method with simultaneous full scan and MRM data collection, and clearly demonstrates the utility of GC-MS/MS with high scanning rates for complex analyses. PMID:26615962
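
    The intra- and inter-batch precision figures quoted above are percent relative standard deviations; a minimal sketch of how such values are computed from replicate quantitation results is given below (the replicate concentrations are hypothetical).

```python
import numpy as np

def rsd(values):
    """Percent relative standard deviation (sample SD / mean * 100)."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical replicate concentrations (uM) of one metabolite,
# measured in three batches of four injections each.
batches = [
    [10.2, 9.8, 10.1, 10.4],
    [9.5, 9.9, 10.0, 9.7],
    [10.6, 10.3, 10.8, 10.5],
]

intra = [rsd(b) for b in batches]            # within-batch precision
inter = rsd([np.mean(b) for b in batches])   # between-batch precision (of the batch means)
print("intra-batch %RSD:", [round(x, 1) for x in intra])
print("inter-batch %RSD:", round(inter, 1))
```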

  11. Short Time Region Sparkover Characteristics of Compressed N2 and CO2 using Square Impulse and their Quantitative Estimation Method

    NASA Astrophysics Data System (ADS)

    Shinkai, Hiroyuki; Goshima, Hisashi; Yashima, Masafumi

    SF6 is used as the main insulation gas for gas-insulated switchgear (GIS), but it has recently become a gas to be restricted because of its greenhouse effect. Up to now, we have studied the insulation characteristics of compressed N2 and CO2 as possible SF6-alternative gases. As GIS are subjected to very fast transient voltages due to incoming lightning surges or disconnector switching operations, it is necessary to clarify the sparkover voltage-time (V-t) characteristics in the short time region. In this report, we describe the V-t curves ranging from 30 ns to 10 µs for high pressure (0.6 and 1.0 MPa) N2 and CO2, obtained by applying square impulse voltages. We next studied the V-t curves for standard lightning and oscillating impulses under the same experimental conditions. Based on these results, we investigated the possibility of quantitatively estimating the V-t curves for these waveforms by applying the “equal-area criterion”. The minimum sparkover voltage estimated by the criterion agreed very well with the measured characteristic in all conditions studied, and thus it has become clear that quantitative estimation of the sparkover characteristics is possible for high pressure N2 and CO2.
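
    The equal-area criterion referred to above states that sparkover occurs once the time integral of the applied voltage in excess of a static sparkover level V0 reaches a constant area. A minimal numerical sketch is shown below; the waveform parameters, V0 and the area constant are illustrative assumptions, not values from the report.

```python
import numpy as np

def time_to_sparkover(v_of_t, t, v0, area_criterion):
    """Equal-area criterion: sparkover occurs when the integral of max(V(t) - V0, 0) dt
    first reaches the constant 'area_criterion' (in V*s). Returns the time, or None."""
    excess = np.clip(v_of_t - v0, 0.0, None)
    area = np.concatenate(([0.0], np.cumsum(0.5 * (excess[1:] + excess[:-1]) * np.diff(t))))
    idx = np.argmax(area >= area_criterion)
    return t[idx] if area[idx] >= area_criterion else None

# Hypothetical 1.2/50 us lightning impulse peaking near 600 kV.
t = np.linspace(0.0, 10e-6, 20001)
v = 1.037 * 600e3 * (np.exp(-t / 68.2e-6) - np.exp(-t / 0.405e-6))
print(time_to_sparkover(v, t, v0=450e3, area_criterion=0.2))   # seconds, or None if no sparkover
```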

  12. Method for accurate quantitation of background tissue optical properties in the presence of emission from a strong fluorescence marker

    NASA Astrophysics Data System (ADS)

    Bravo, Jaime; Davis, Scott C.; Roberts, David W.; Paulsen, Keith D.; Kanick, Stephen C.

    2015-03-01

    Quantification of targeted fluorescence markers during neurosurgery has the potential to improve and standardize surgical distinction between normal and cancerous tissues. However, quantitative analysis of marker fluorescence is complicated by tissue background absorption and scattering properties. Correction algorithms that transform raw fluorescence intensity into quantitative units, independent of absorption and scattering, require a paired measurement of localized white light reflectance to provide estimates of the optical properties. This study focuses on the unique problem of developing a spectral analysis algorithm to extract tissue absorption and scattering properties from white light spectra that contain contributions from both elastically scattered photons and fluorescence emission from a strong fluorophore (i.e. fluorescein). A fiber-optic reflectance device was used to perform measurements in a small set of optical phantoms, constructed with Intralipid (1% lipid), whole blood (1% volume fraction) and fluorescein (0.16-10 µg/mL). Results show that the novel spectral analysis algorithm yields accurate estimates of tissue parameters independent of fluorescein concentration, with relative errors of blood volume fraction, blood oxygenation fraction (BOF), and the reduced scattering coefficient (at 521 nm) of <7%, <1%, and <22%, respectively. These data represent a first step towards quantification of fluorescein in tissue in vivo.
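
    The central idea above is to separate elastically scattered light from fluorescein emission when estimating tissue optical properties. The sketch below is a heavily simplified linear surrogate for that analysis: it unmixes a measured white-light spectrum into an assumed scattering baseline, a blood-absorption signature and a fluorescein emission band by least squares (real analyses model absorption nonlinearly; all basis shapes and weights here are hypothetical).

```python
import numpy as np

# Assumed basis spectra on a wavelength grid (nm); shapes are illustrative only.
wl = np.linspace(450, 720, 271)
scatter = (wl / 521.0) ** -1.0                                 # smooth scattering baseline
blood = np.exp(-0.5 * ((wl - 545) / 15) ** 2) + 0.8 * np.exp(-0.5 * ((wl - 575) / 15) ** 2)
fluor = np.exp(-0.5 * ((wl - 521) / 20) ** 2)                  # fluorescein emission near 521 nm

A = np.column_stack([scatter, -blood, fluor])                  # absorption lowers the signal
true_weights = np.array([1.0, 0.15, 0.3])
measured = A @ true_weights + 0.005 * np.random.default_rng(0).normal(size=wl.size)

weights, *_ = np.linalg.lstsq(A, measured, rcond=None)
print("recovered weights (scattering, blood, fluorescein):", np.round(weights, 3))
```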

  13. Assessing Internet energy intensity: A review of methods and results

    SciTech Connect

    Coroama, Vlad C.; Hilty, Lorenz M.; Empa, Swiss Federal Laboratories for Materials Science and Technology, Lerchenfeldstr. 5, 9014 St. Gallen; Centre for Sustainable Communications, KTH Royal Institute of Technology, Lindstedtsvägen 5, 100 44 Stockholm

    2014-02-15

    Assessing the average energy intensity of Internet transmissions is a complex task that has been a controversial subject of discussion. Estimates published over the last decade diverge by up to four orders of magnitude — from 0.0064 kilowatt-hours per gigabyte (kWh/GB) to 136 kWh/GB. This article presents a review of the methodological approaches used so far in such assessments: i) top–down analyses based on estimates of the overall Internet energy consumption and the overall Internet traffic, whereby average energy intensity is calculated by dividing energy by traffic for a given period of time, ii) model-based approaches that model all components needed to sustain an amount of Internet traffic, and iii) bottom–up approaches based on case studies and generalization of the results. Our analysis of the existing studies shows that the large spread of results is mainly caused by two factors: a) the year of reference of the analysis, which has significant influence due to efficiency gains in electronic equipment, and b) whether end devices such as personal computers or servers are included within the system boundary or not. For an overall assessment of the energy needed to perform a specific task involving the Internet, it is necessary to account for the types of end devices needed for the task, while the energy needed for data transmission can be added based on a generic estimate of Internet energy intensity for a given year. Separating the Internet as a data transmission system from the end devices leads to more accurate models and to results that are more informative for decision makers, because end devices and the networking equipment of the Internet usually belong to different spheres of control. -- Highlights: • Assessments of the energy intensity of the Internet differ by a factor of 20,000. • We review top–down, model-based, and bottom–up estimates from literature. • Main divergence factors are the year studied and the inclusion of end devices. • We argue against extending the Internet system boundary beyond data transmission. • Decision-makers need data that differentiates between end devices and transmission.
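
    The top-down approach described in item i) reduces to a simple division, and the review's second divergence factor (the system boundary) can be made explicit in the same calculation. The sketch below uses purely illustrative placeholder numbers, not figures from the article.

```python
# Top-down estimate: average intensity = annual Internet energy / annual traffic.
network_energy_twh = 200.0      # transmission networks and data centres (hypothetical)
end_device_energy_twh = 400.0   # PCs, servers and other end devices (hypothetical)
traffic_eb = 1000.0             # annual IP traffic in exabytes (hypothetical)

traffic_gb = traffic_eb * 1e9   # 1 EB = 1e9 GB; 1 TWh = 1e9 kWh
without_devices = network_energy_twh * 1e9 / traffic_gb
with_devices = (network_energy_twh + end_device_energy_twh) * 1e9 / traffic_gb

print(f"intensity, transmission only : {without_devices:.2f} kWh/GB")
print(f"intensity, incl. end devices : {with_devices:.2f} kWh/GB")
```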

  14. Effects of the Forecasting Methods, Precipitation Character, and Satellite Resolution on the Predictability of Short-Term Quantitative Precipitation Nowcasting (QPN) from a Geostationary Satellite

    PubMed Central

    Liu, Yu; Xi, Du-Gang; Li, Zhao-Liang; Ji, Wei

    2015-01-01

    Prediction of short-term quantitative precipitation nowcasting (QPN) from consecutive geostationary satellite images has important implications for hydro-meteorological modeling and forecasting. However, systematic analysis of the predictability of QPN is limited. The objective of this study is to evaluate the effects of the forecasting method, precipitation character, and satellite resolution on the predictability of QPN, using images from the Chinese geostationary meteorological satellite Fengyun-2F (FY-2F), which covered all intensive observation periods since its launch, amounting to only about 10 days in total. In the first step, three methods were compared to evaluate QPN performance: a pixel-based QPN using the maximum correlation method (PMC); the Horn-Schunck optical-flow scheme (PHS); and the Pyramid Lucas-Kanade Optical Flow method (PPLK), which is newly proposed here. Subsequently, the effect of the precipitation system was assessed using 2338 images from 8 precipitation periods. Then, the resolution dependence was demonstrated by analyzing the QPN at six spatial resolutions (ranging from 0.1 to 0.6). The results show that the PPLK improves the predictability of QPN, performing better than the other methods. The predictability of QPN is largely determined by the precipitation system, and a coarse satellite spatial resolution reduces the predictability of QPN. PMID:26447470
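
    All three nowcasting methods compared above follow the same extrapolation logic: estimate the motion between consecutive satellite images, then advect the latest field forward. The sketch below is a crude single-vector stand-in (global phase correlation plus advection) rather than any of the PMC, PHS or PPLK schemes; the rain fields are synthetic.

```python
import numpy as np

def estimate_motion(prev_img, curr_img):
    """Global displacement (dy, dx) from phase correlation between two fields."""
    f = np.fft.fft2(curr_img) * np.conj(np.fft.fft2(prev_img))
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    ny, nx = corr.shape
    return (dy - ny if dy > ny // 2 else dy), (dx - nx if dx > nx // 2 else dx)

def nowcast(curr_img, motion, steps=1):
    """Advect the latest image by the estimated motion to forecast 'steps' ahead."""
    dy, dx = motion
    return np.roll(curr_img, (dy * steps, dx * steps), axis=(0, 1))

# Synthetic pair of rain-rate fields: a cell moving 2 pixels down and 3 pixels right.
y, x = np.mgrid[0:64, 0:64]
prev = np.exp(-((y - 30) ** 2 + (x - 30) ** 2) / 50.0)
curr = np.exp(-((y - 32) ** 2 + (x - 33) ** 2) / 50.0)
motion = estimate_motion(prev, curr)
forecast = nowcast(curr, motion, steps=1)
print("estimated motion (dy, dx):", motion)   # expected (2, 3)
```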

  15. Patient and carer satisfaction with 'hospital at home': quantitative and qualitative results from a randomised controlled trial.

    PubMed Central

    Wilson, Andrew; Wynn, Alison; Parker, Hilda

    2002-01-01

    BACKGROUND: 'Hospital At Home' schemes are set to increase in the United Kingdom (UK) in response to the NHS Plan. To date, little detailed work has been done on the acceptability of these schemes to patients and their carers. AIM: To compare Hospital at Home patient and carer satisfaction with hospital care. DESIGN OF STUDY: Pragmatic randomised controlled trial. SETTING: Consecutive patients assessed as suitable for the Leicester Hospital at Home scheme were randomised to Hospital at Home or one of three acute hospitals in the city. METHOD: Patient satisfaction was assessed two weeks after randomisation, or at discharge if later, using a six-item questionnaire. Patients' and carers' views of the services were assessed by semistructured interviews. RESULTS: One hundred and two patients were randomised to Hospital at Home and 97 to hospital. Forty-eight (47%) patients in the Hospital at Home arm and 35 (36%) in the hospital arm completed the satisfaction questionnaire, representing 96% and 85% of those eligible, respectively. Total scores were significantly higher in the Hospital at Home group (median = 15) than in the hospital group (median = 12) (P < 0.001, Mann-Whitney U-test). Responses to all six questions favoured Hospital at Home, with all but one of these differences being statistically significant. In the Hospital at Home group, 24 patients and 18 of their carers were interviewed; in the hospital group, 18 patients and seven of their carers were interviewed. Themes emerging from these interviews were that patients appreciated the more personal care and better communication offered by Hospital at Home and placed great value on staying at home, which was seen to be therapeutic. Patients largely felt safe in Hospital at Home, although some would have felt safer in hospital. Some patients and carers felt that better medical care would have been provided in hospital. Carers felt that the workload imposed by Hospital at Home was no greater than by hospital admission and that the relief from care duties at home would be counterbalanced by the added strain of hospital visiting. CONCLUSIONS: Patient satisfaction was greater with Hospital at Home than with hospital. Reasons included a more personal style of care and a feeling that staying at home was therapeutic. Carers did not feel that Hospital at Home imposed an extra workload. PMID:11791829
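
    The group comparison reported above uses the Mann-Whitney U-test on total satisfaction scores; a minimal sketch of that test on hypothetical six-item totals (not the trial's data) is shown below.

```python
from scipy.stats import mannwhitneyu

# Hypothetical six-item satisfaction totals for the two arms (not the trial's raw data).
hospital_at_home = [15, 14, 16, 15, 13, 17, 15, 16, 14, 15]
hospital         = [12, 11, 13, 12, 14, 10, 12, 13, 11, 12]

u_stat, p_value = mannwhitneyu(hospital_at_home, hospital, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")
```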

  16. Quantitative real-time polymerase chain reaction is an alternative method for the detection of HER-2 amplification in formalin-fixed paraffin-embedded breast cancer samples

    PubMed Central

    Pu, Tianjie; Guo, Peng; Qiu, Yan; Chen, Shinan; Yang, Libo; Sun, Linyong; Ye, Feng; Bu, Hong

    2015-01-01

    Fluorescence in situ hybridization (FISH) and immunohistochemistry (IHC) are the most common methods used to quantify HER-2 gene and protein levels, respectively, in human breast cancer. However, because of poor sample quality, some samples cannot be subjected to a FISH assay. We evaluated 71 formalin-fixed paraffin-embedded (FFPE) breast carcinoma specimens by quantitative real-time polymerase chain reaction (qPCR), IHC, and FISH. We also performed qPCR and FISH assays on delayed formalin-fixed (DDF) samples. The qPCR results were in complete concordance with the results of IHC and FISH. In the DDF samples, the HER-2 fluorescence signal appeared to decay after a fixation delay of 1 h, whereas the qPCR method still worked well for delays of up to 12 hours. Our results indicate that qPCR is clearly superior to FISH in cases that were not fixed within a reasonable amount of time, and that qPCR can serve as an alternative method for HER-2 amplification assays in breast cancer.

  17. Contribution of Quantitative Methods of Estimating Mortality Dynamics to Explaining Mechanisms of Aging.

    PubMed

    Shilovsky, G A; Putyatina, T S; Markov, A V; Skulachev, V P

    2015-12-01

    Accumulation of various types of unrepaired damage of the genome because of increasing production of reactive oxygen species and decreasing efficiency of the antioxidant defense system and repair systems can cause age-related diseases and emergence of phenotypic signs of senescence. This should lead to increasing vulnerability and to mortality monotonically increasing with age independently of the position of the species on the evolutionary tree. In this light, the survival, mortality, and fertility curves for 45 animal and plant species and one alga published by the Max Planck Institute for Demographic Research (Germany/Denmark) are of special interest (Jones, O. R., et al. (2014) Nature, 505, 169-173). We divided all species treated in that study into four groups according to the ratio of mortality at the terminal age (which corresponds to 5% survival) and average mortality during the entire studied period. For animals of group IV (long-lived and senescent), including humans, the Jones method makes it possible to trace mortality during the entire life cycle. The same applies to short-lived animals (e.g. nematodes or the tundra vole), whether they display the Gompertz type of senescence or not. However, in long-lived species with a less pronounced increase in mortality with age (e.g. the freshwater crocodile, hermit crab, or Scots pine), as well as in animals of average lifespan that reach the terminal age before they have had enough time to become senescent, the Jones method is capable of characterizing only a small part of the life cycle and does not allow judging how senescence manifests itself at late stages of the life cycle. Thus, it is known that old trees display signs of biological senescence rather clearly, although Jones et al. consider them non-senescent organisms because less than 5% of sexually mature individuals survive to display the first manifestations of these characters. We have concluded that the classification proposed by Jones et al. makes it possible to approximately divide animals and plants only by their levels of the Gompertz type of senescence (i.e. actuarial senescence), whereas susceptibility to biological senescence can be estimated only when principally different models are applied. PMID:26638679
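
    The grouping described above is based on the ratio of mortality at the terminal age (the age at which survivorship falls to 5%) to the average mortality over the observed period. The sketch below computes that ratio for a hypothetical Gompertz mortality curve; the cut-off values used to define the four groups are not reproduced here.

```python
import numpy as np

def terminal_to_mean_mortality_ratio(age, mortality):
    """Ratio of mortality at the 'terminal age' (5% survivorship) to the mean
    mortality over the observed age range."""
    hazard_integral = np.concatenate(
        ([0.0], np.cumsum(0.5 * (mortality[1:] + mortality[:-1]) * np.diff(age))))
    survivorship = np.exp(-hazard_integral)
    terminal_idx = np.argmax(survivorship <= 0.05)
    if survivorship[terminal_idx] > 0.05:          # 5% survival never reached
        terminal_idx = len(age) - 1
    return mortality[terminal_idx] / np.mean(mortality[: terminal_idx + 1])

# Hypothetical Gompertz mortality, mu(x) = a * exp(b * x), for a strongly senescent species.
age = np.linspace(0, 100, 1001)
mu = 1e-3 * np.exp(0.08 * age)
print(round(terminal_to_mean_mortality_ratio(age, mu), 1))
```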

  18. A rapid LC-MS/MS method for quantitation of eszopiclone in human plasma: application to a human pharmacokinetic study.

    PubMed

    Hotha, Kishore Kumar; Vijaya Bharathi, D; Jagadeesh, B; Ravindranath, L K; Jaya Veera, K N; Venkateswarulu, V

    2012-02-01

    A highly reproducible, specific and cost-effective LC-MS/MS method was developed for the estimation of eszopiclone (ESZ) in 50 µL of human plasma using paroxetine as an internal standard (IS). The API-4000 LC-MS/MS was operated under the multiple reaction-monitoring mode using the electrospray ionization technique. A simple liquid-liquid extraction process was used to extract ESZ and IS from human plasma. The total run time was 1.5 min and the elution of ESZ and IS occurred at 0.90 min; this was achieved with a mobile phase consisting of 0.1% formic acid-methanol (15:85, v/v) at a flow rate of 0.50 mL/min on a Discover C(18) (50 × 4.6 mm, 5 µm) column. The developed method was validated in human plasma with a lower limit of quantitation of 0.1 ng/mL for ESZ. A linear response function was established for the range of concentrations 0.10-120 ng/mL (r > 0.998) for ESZ. The intra- and inter-day precision values for ESZ were acceptable as per FDA guidelines. Eszopiclone was stable in the battery of stability studies, viz. bench-top, autosampler and freeze-thaw cycles. The developed assay method was applied to an oral bioequivalence study in humans. PMID:21618564

  19. High-throughput multiplex quantitative polymerase chain reaction method for Giardia lamblia and Cryptosporidium species detection in stool samples.

    PubMed

    Nurminen, Noora; Juuti, Rosa; Oikarinen, Sami; Fan, Yue-Mei; Lehto, Kirsi-Maarit; Mangani, Charles; Maleta, Kenneth; Ashorn, Per; Hyöty, Heikki

    2015-06-01

    Giardia lamblia and Cryptosporidium species belong to a complex group of pathogens that cause diseases hampering development and socioeconomic improvements in the developing countries. Both pathogens are recognized as significant causes of diarrhea and nutritional disorders. However, further studies are needed to clarify the role of parasitic infections, especially asymptomatic infections in malnutrition and stunting. We developed a high-throughput multiplex quantitative polymerase chain reaction (qPCR) method for G. lamblia and Cryptosporidium spp. detection in stool samples. The sensitivity and specificity of the method were ensured by analyzing confirmed positive samples acquired from diagnostics laboratories and participating in an external quality control round. Its capability to detect asymptomatic G. lamblia and Cryptosporidium spp. infections was confirmed by analyzing stool samples collected from 44 asymptomatic 6-month-old infants living in an endemic region in Malawi. Of these, five samples were found to be positive for G. lamblia and two for Cryptosporidium spp. In conclusion, the developed method is suitable for large-scale studies evaluating the occurrence of G. lamblia and Cryptosporidium spp. in endemic regions and for clinical diagnostics of these infections. PMID:25918202

  20. An improved LC-MS/MS method for quantitation of indapamide in whole blood: application for a bioequivalence study.

    PubMed

    Pinto, Guilherme Araújo; Pastre, Kátia Isabel Fercondini; Bellorio, Karini Bruno; de Souza Teixeira, Leonardo; de Souza, Weidson Carlo; de Abreu, Fernanda Crunivel; de Santana E Silva Cardoso, Fabiana Fernandes; Pianetti, Gerson Antônio; César, Isabela Costa

    2014-09-01

    An improved LC-MS/MS method for the quantitation of indapamide in human whole blood was developed and validated. Indapamide-d3 was used as internal standard (IS) and liquid-liquid extraction was employed for sample preparation. LC separation was performed on a Synergi Polar RP column (50 × 4.6 mm i.d.; 4 µm) with a mobile phase composed of methanol and 5 mM aqueous ammonium acetate containing 1 mM formic acid (60:40), at a flow rate of 1 mL/min. The run time was 3.0 min and the injection volume was 20 µL. Mass spectrometric detection was performed using an electrospray ion source in negative ionization mode, using the transitions m/z 364.0 → m/z 188.9 and m/z 367.0 → m/z 188.9 for indapamide and IS, respectively. The calibration curve was constructed over the range 0.25-50 ng/mL. The method was precise and accurate, and provided recovery rates >80% for indapamide and IS. The method was applied to determine blood concentrations of indapamide in a bioequivalence study with two sustained-release tablet formulations. The 90% confidence interval for the geometric mean ratio for maximum concentration was 95.78% and for the area under the concentration-time curve it was 97.91%. The tested indapamide tablets (Eurofarma Laboratórios S.A.) were bioequivalent to Natrilix®, according to the rate and extent of absorption. PMID:24752891
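
    The bioequivalence conclusion above rests on 90% confidence intervals for geometric mean ratios of log-transformed Cmax and AUC. A minimal sketch using a simple paired analysis on hypothetical data follows; a full analysis would use an ANOVA appropriate to the crossover design.

```python
import numpy as np
from scipy import stats

# Hypothetical paired Cmax values (test vs. reference formulation), same subjects.
cmax_test = np.array([18.2, 22.5, 19.8, 25.1, 21.0, 17.6, 23.3, 20.4])
cmax_ref  = np.array([19.0, 23.8, 20.5, 24.2, 22.1, 18.9, 24.0, 21.5])

diff = np.log(cmax_test) - np.log(cmax_ref)
n = diff.size
se = diff.std(ddof=1) / np.sqrt(n)
t90 = stats.t.ppf(0.95, df=n - 1)                 # two-sided 90% CI
gmr = np.exp(diff.mean())
ci = np.exp(diff.mean() - t90 * se), np.exp(diff.mean() + t90 * se)
print(f"GMR = {100 * gmr:.2f}%, 90% CI = ({100 * ci[0]:.2f}%, {100 * ci[1]:.2f}%)")
```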

  1. Optimization of a quantitative PCR based method for plasmid copy number determination in human cell lines.

    PubMed

    Fliedl, Lukas; Kast, Florian; Grillari, Johannes; Wieser, Matthias; Grillari-Voglauer, Regina

    2015-12-25

    Transient gene expression (TGE) is an essential tool for the production of recombinant proteins, especially in the early drug discovery and development phases of biopharmaceuticals. The need for fast production of sufficient recombinant protein for initial tests has increased dramatically with the growing number of potential novel pharmaceutical targets being identified. One of the critical factors for transient transfection is plasmid copy number (PCN), for which we here provide an optimized qPCR-based protocol. Using this protocol, we show the loss of PCN during a typical batch process of HEK293 cells after transfection, from 606,000 to 4560 copies per cell within 5 days. Finally, two novel human kidney cell lines, RS and RPTEC/TERT1, were compared to HEK293 and proved competitive in terms of PCN and specific productivity. In conclusion, since the trafficking and degradation of plasmid DNA are not yet fully understood, improved methods for the analysis of PCN may contribute to the design of specific and more stable plasmids for high-yield transient gene expression systems. PMID:25796475
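
    One common way to express plasmid copy number from qPCR data is to convert the Ct values of the plasmid target and of a single-copy genomic reference gene into absolute copies via standard curves and take their ratio. The sketch below illustrates that calculation with hypothetical curve parameters and Ct values; it is not the specific protocol optimized in the record above.

```python
def copies_from_ct(ct, slope, intercept):
    """Absolute copy number from a qPCR standard curve:
    Ct = intercept + slope * log10(copies)  =>  copies = 10**((Ct - intercept) / slope)."""
    return 10.0 ** ((ct - intercept) / slope)

# Hypothetical standard-curve parameters (slope ~ -3.32 corresponds to ~100% efficiency).
plasmid_copies = copies_from_ct(ct=16.8, slope=-3.32, intercept=38.0)
genome_copies = copies_from_ct(ct=24.5, slope=-3.36, intercept=37.2)   # single-copy reference gene

pcn = plasmid_copies / genome_copies   # plasmid copies per haploid genome equivalent
print(f"plasmid copy number ~ {pcn:.0f} copies/cell")
```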

  2. Advanced Mass Spectrometric Methods for the Rapid and Quantitative Characterization of Proteomes

    DOE PAGESBeta

    Smith, Richard D.

    2002-01-01

    Progress is reviewed towards the development of a global strategy that aims to extend the sensitivity, dynamic range, comprehensiveness and throughput of proteomic measurements based upon the use of high performance separations and mass spectrometry. The approach uses high accuracy mass measurements from Fourier transform ion cyclotron resonance mass spectrometry (FTICR) to validate peptide ‘accurate mass tags’ (AMTs) produced by global protein enzymatic digestions for a specific organism, tissue or cell type from ‘potential mass tags’ tentatively identified using conventional tandem mass spectrometry (MS/MS). This provides the basis for subsequent measurements without the need for MS/MS. High resolution capillary liquid chromatography separations combined with high sensitivity, high resolution, accurate FTICR measurements are shown to be capable of characterizing peptide mixtures of more than 10^5 components. The strategy has been initially demonstrated using the microorganisms Saccharomyces cerevisiae and Deinococcus radiodurans. Advantages of the approach include the high confidence of protein identification, its broad proteome coverage, high sensitivity, and the capability for stable-isotope labeling methods for precise relative protein abundance measurements. Abbreviations: LC, liquid chromatography; FTICR, Fourier transform ion cyclotron resonance; AMT, accurate mass tag; PMT, potential mass tag; MMA, mass measurement accuracy; MS, mass spectrometry; MS/MS, tandem mass spectrometry; ppm, parts per million.
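
    The accurate mass tag strategy above hinges on mass measurement accuracy expressed in parts per million; the conversion is a one-liner, shown here with hypothetical m/z values.

```python
def ppm_error(measured_mz, theoretical_mz):
    """Mass measurement accuracy in parts per million."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Hypothetical peptide: theoretical monoisotopic m/z vs. an FTICR measurement.
print(round(ppm_error(1045.5642, 1045.5630), 2))   # ~1.15 ppm
```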

  3. The JCMT Gould Belt Survey: A Quantitative Comparison Between SCUBA-2 Data Reduction Methods

    E-print Network

    Mairs, S; Kirk, H; Graves, S; Buckle, J; Beaulieu, S F; Berry, D S; Broekhoven-Fiene, H; Currie, M J; Fich, M; Hatchell, J; Jenness, T; Mottram, J C; Nutter, D; Pattle, K; Pineda, J E; Salji, C; Di Francesco, J; Hogerheijde, M R; Ward-Thompson, D

    2015-01-01

    Performing ground-based submillimetre observations is a difficult task as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and time variation in weather and instrument stability. Removing these features and other artifacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and The Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reduction both use the same software, Starlink, but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region and the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves more in-depth ...

  4. COSMIC EVOLUTION OF DUST IN GALAXIES: METHODS AND PRELIMINARY RESULTS

    SciTech Connect

    Bekki, Kenji

    2015-02-01

    We investigate the redshift (z) evolution of dust mass and abundance, their dependences on initial conditions of galaxy formation, and physical correlations between dust, gas, and stellar contents at different z based on our original chemodynamical simulations of galaxy formation with dust growth and destruction. In this preliminary investigation, we first determine the reasonable ranges of the most important two parameters for dust evolution, i.e., the timescales of dust growth and destruction, by comparing the observed and simulated dust mass and abundances and molecular hydrogen (H2) content of the Galaxy. We then investigate the z-evolution of dust-to-gas ratios (D), H2 gas fraction (f_H2), and gas-phase chemical abundances (e.g., A_O = 12 + log(O/H)) in the simulated disk and dwarf galaxies. The principal results are as follows. Both D and f_H2 can rapidly increase during the early dissipative formation of galactic disks (z ≳ 2-3), and the z-evolution of these depends on initial mass densities, spin parameters, and masses of galaxies. The observed A_O-D relation can be qualitatively reproduced, but the simulated dispersion of D at a given A_O is smaller. The simulated galaxies with larger total dust masses show larger H2 and stellar masses and higher f_H2. Disk galaxies show negative radial gradients of D and the gradients are steeper for more massive galaxies. The observed evolution of dust masses and dust-to-stellar-mass ratios between z = 0 and 0.4 cannot be reproduced so well by the simulated disks. Very extended dusty gaseous halos can be formed during hierarchical buildup of disk galaxies. Dust-to-metal ratios (i.e., dust-depletion levels) are different within a single galaxy and between different galaxies at different z.

  5. Source amplitudes of volcano-seismic signals determined by the amplitude source location method as a quantitative measure of event size

    NASA Astrophysics Data System (ADS)

    Kumagai, Hiroyuki; Lacson, Rudy; Maeda, Yuta; Figueroa, Melquiades S.; Yamashina, Tadashi; Ruiz, Mario; Palacios, Pablo; Ortiz, Hugo; Yepes, Hugo

    2013-05-01

    The amplitude source location (ASL) method, which uses high-frequency amplitudes under the assumption of isotropic S-wave radiation, has been shown to be useful for locating the sources of various types of volcano-seismic signals. We tested the ASL method by using synthetic seismograms and examined the source amplitudes determined by this method for various types of volcano-seismic signals observed at different volcanoes. Our synthetic tests indicated that, although ASL results are not strongly influenced by velocity structure and noise, they do depend on site amplification factors at individual stations. We first applied the ASL method to volcano-tectonic (VT) earthquakes at Taal volcano, Philippines. Our ASL results for the largest VT earthquake showed that a frequency range of 7-12 Hz and a Q value of 50 were appropriate for the source location determination. Using these values, we systematically estimated source locations and amplitudes of VT earthquakes at Taal. We next applied the ASL method to long-period events at Cotopaxi volcano and to explosions at Tungurahua volcano in Ecuador. We proposed a practical approach to minimize the effects of site amplifications among different volcano seismic networks, and compared the source amplitudes of these various volcano-seismic events with their seismic magnitudes. We found a proportional relation between seismic magnitude and the logarithm of the source amplitude. The ASL method can be used to determine source locations of small events for which onset measurements are difficult, and thus can estimate the sizes of events over a wider range of sizes compared with conventional hypocenter determination approaches. Previously, there has been no parameter widely used to quantify the sources of volcano-seismic signals. This study showed that the source amplitude determined by the ASL method may be a useful quantitative measure of volcano-seismic event size.
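
    In the ASL method summarized above, station amplitudes are modelled as A_i = A0 * exp(-B r_i) / r_i with B = pi*f/(Q*beta), and the source location and amplitude are found by a grid search. The sketch below implements that search for a synthetic, noise-free network; the S-wave velocity, station geometry and amplitudes are assumed for illustration, while f = 10 Hz and Q = 50 echo the values quoted in the record.

```python
import numpy as np

def asl_locate(station_xyz, amplitudes, grid_xyz, freq=10.0, q=50.0, beta=2000.0):
    """Amplitude source location: for each trial node predict A_i = A0*exp(-B*r_i)/r_i
    with B = pi*f/(Q*beta), solve A0 by least squares and keep the best-fitting node."""
    b = np.pi * freq / (q * beta)
    best = (None, None, np.inf)
    for node in grid_xyz:
        r = np.linalg.norm(station_xyz - node, axis=1)
        g = np.exp(-b * r) / r                        # attenuation and geometrical spreading
        a0 = np.dot(g, amplitudes) / np.dot(g, g)     # least-squares source amplitude
        res = np.linalg.norm(amplitudes - a0 * g) / np.linalg.norm(amplitudes)
        if res < best[2]:
            best = (node, a0, res)
    return best

# Hypothetical network of 5 stations (metres) and site-corrected amplitudes.
stations = np.array([[0, 0, 0], [4000, 0, 0], [0, 4000, 0],
                     [4000, 4000, 0], [2000, 2000, 0]], dtype=float)
true_src = np.array([2500.0, 1500.0, -2000.0])
b_true = np.pi * 10.0 / (50.0 * 2000.0)
r_true = np.linalg.norm(stations - true_src, axis=1)
obs = 5e-3 * np.exp(-b_true * r_true) / r_true        # synthetic observations

grid = np.array([[x, y, z] for x in range(0, 4001, 500)
                 for y in range(0, 4001, 500) for z in range(-4000, 0, 500)], dtype=float)
loc, a0, _ = asl_locate(stations, obs, grid)
print("located source:", loc, " source amplitude:", a0)
```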

  6. A compressed sensing method with analytical results for lidar feature classification

    NASA Astrophysics Data System (ADS)

    Allen, Josef D.; Yuan, Jiangbo; Liu, Xiuwen; Rahmes, Mark

    2011-04-01

    We present an innovative way to autonomously classify LiDAR points into bare earth, building, vegetation, and other categories. One desirable product of LiDAR data is the automatic classification of the points in the scene. Our algorithm automatically classifies scene points using Compressed Sensing Methods via Orthogonal Matching Pursuit algorithms utilizing a generalized K-Means clustering algorithm to extract buildings and foliage from a Digital Surface Model (DSM). This technology reduces manual editing while being cost-effective for large-scale automated global scene modeling. Quantitative analyses are provided using Receiver Operating Characteristic (ROC) curves to show Probability of Detection and False Alarm for buildings vs. vegetation classification. Histograms are shown with sample size metrics. Our inpainting algorithms then fill the voids where buildings and vegetation were removed, utilizing Computational Fluid Dynamics (CFD) techniques and Partial Differential Equations (PDE) to create an accurate Digital Terrain Model (DTM) [6]. Inpainting preserves building height contour consistency and edge sharpness of identified inpainted regions. Qualitative results illustrate other benefits such as Terrain Inpainting's unique ability to minimize or eliminate undesirable terrain data artifacts.
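
    The sparse-coding step behind the classification described above is Orthogonal Matching Pursuit; a minimal generic implementation is sketched below with a random dictionary, which is only a stand-in for the learned K-Means/K-SVD dictionary used on LiDAR features.

```python
import numpy as np

def omp(dictionary, signal, n_nonzero):
    """Orthogonal Matching Pursuit: greedily pick the atom most correlated with the
    residual, then re-fit all selected atoms by least squares."""
    residual = signal.copy()
    support = []
    coeffs = np.array([])
    for _ in range(n_nonzero):
        idx = int(np.argmax(np.abs(dictionary.T @ residual)))
        if idx not in support:
            support.append(idx)
        coeffs, *_ = np.linalg.lstsq(dictionary[:, support], signal, rcond=None)
        residual = signal - dictionary[:, support] @ coeffs
    code = np.zeros(dictionary.shape[1])
    code[support] = coeffs
    return code

# Hypothetical overcomplete dictionary (20-dim features, 50 atoms) and a 3-sparse signal.
rng = np.random.default_rng(1)
D = rng.normal(size=(20, 50))
D /= np.linalg.norm(D, axis=0)
true_code = np.zeros(50)
true_code[[3, 17, 42]] = [1.5, -2.0, 0.7]
y = D @ true_code
print(np.nonzero(omp(D, y, n_nonzero=3))[0])   # typically recovers atoms [3, 17, 42]
```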

  7. A COMPRESSED SENSING METHOD WITH ANALYTICAL RESULTS FOR LIDAR FEATURE CLASSIFICATION

    SciTech Connect

    Allen, Josef D

    2011-01-01

    We present an innovative way to autonomously classify LiDAR points into bare earth, building, vegetation, and other categories. One desirable product of LiDAR data is the automatic classification of the points in the scene. Our algorithm automatically classifies scene points using Compressed Sensing Methods via Orthogonal Matching Pursuit algorithms utilizing a generalized K-Means clustering algorithm to extract buildings and foliage from a Digital Surface Model (DSM). This technology reduces manual editing while being cost-effective for large-scale automated global scene modeling. Quantitative analyses are provided using Receiver Operating Characteristic (ROC) curves to show Probability of Detection and False Alarm for buildings vs. vegetation classification. Histograms are shown with sample size metrics. Our inpainting algorithms then fill the voids where buildings and vegetation were removed, utilizing Computational Fluid Dynamics (CFD) techniques and Partial Differential Equations (PDE) to create an accurate Digital Terrain Model (DTM) [6]. Inpainting preserves building height contour consistency and edge sharpness of identified inpainted regions. Qualitative results illustrate other benefits such as Terrain Inpainting's unique ability to minimize or eliminate undesirable terrain data artifacts. Keywords: Compressed Sensing, Sparsity, Data Dictionary, LiDAR, ROC, K-Means, Clustering, K-SVD, Orthogonal Matching Pursuit

  8. Quantitative and qualitative greywater characterization in Greek households and investigation of their treatment using physicochemical methods.

    PubMed

    Antonopoulou, Georgia; Kirkou, Amalia; Stasinakis, Athanasios S

    2013-06-01

    Data on the quantity of greywater produced in Greek households were collected from two different cities, while samples from different residences were taken for characterization of greywater quality. Laboratory experiments were also performed to investigate the use of coagulation for COD and TSS removal from two different types of greywater, while a combined treatment consisting of coagulation, sand filtration and adsorption on granular activated carbon (GAC) was applied to achieve adequate quality for greywater reuse. According to the results, average greywater production in Greek residences was 82.6 ± 49.3 L per inhabitant and day, while the major sources were shower/bathtub and laundry, contributing 41% and 26%, respectively. On the other hand, blackwater production was estimated at 59.4 ± 29.6 L per inhabitant and day. Greywater produced in the shower/bathtub and hand basin had similar quality characteristics, while kitchen sink greywater was more contaminated, presenting lower pH values and higher concentrations of TSS and total COD. Coagulation experiments with FeCl3 and Al2(SO4)3 showed that process efficiency differed significantly according to the type of greywater and the coagulant used. The highest removal efficiency (COD: 81%; TSS: 79%) was achieved for greywater that did not contain wastewater from the laundry and for an Al2(SO4)3·14H2O dosage of 800 mg L(-1). The application of coagulation, sand filtration and GAC adsorption resulted in average concentrations of COD and TSS equal to 28 ± 11 and 11 ± 3 mg L(-1), respectively, in treated greywater. PMID:23563256

  9. Performance of two quantitative PCR methods for microbial source tracking of human sewage and implications for microbial risk assessment in recreational waters

    EPA Science Inventory

    Before new, rapid quantitative PCR (qPCR) methods for recreational water quality assessment and microbial source tracking (MST) can be useful in a regulatory context, an understanding of the ability of the method to detect a DNA target (marker) when the contaminant source has been...

  10. A method for direct, semi-quantitative analysis of gas phase samples using gas chromatography-inductively coupled plasma-mass spectrometry

    SciTech Connect

    Carter, Kimberly E.; Gerdes, Kirk

    2013-07-01

    A new and complete GC–ICP-MS method is described for direct analysis of trace metals in a gas phase process stream. The proposed method is derived from standard analytical procedures developed for ICP-MS, which are regularly exercised in standard ICP-MS laboratories. In order to implement the method, a series of empirical factors were generated to calibrate detector response with respect to a known concentration of an internal standard analyte. Calibrated responses are ultimately used to determine the concentration of metal analytes in a gas stream using a semi-quantitative algorithm. The method was verified using a traditional gas injection from a GC sampling valve and a standard gas mixture containing either a 1 ppm Xe + Kr mix with helium balance or 100 ppm Xe with helium balance. Data collected for Xe and Kr gas analytes revealed that agreement of 6–20% with the actual concentration can be expected for various experimental conditions. To demonstrate the method using a relevant “unknown” gas mixture, experiments were performed for continuous 4 and 7 hour periods using a Hg-containing sample gas that was co-introduced into the GC sample loop with the xenon gas standard. System performance and detector response to the dilute concentration of the internal standard were pre-determined, which allowed semi-quantitative evaluation of the analyte. The calculated analyte concentrations varied during the course of the 4 hour experiment, particularly during the first hour of the analysis, where the actual Hg concentration was underpredicted by up to 72%. Calculated concentrations improved to within 30–60% for data collected after the first hour of the experiment. Similar results were seen during the 7 hour test, with the deviation from the actual concentration being 11–81% during the first hour and then decreasing for the remaining period. The method detection limit (MDL) was determined for mercury by injecting the sample gas into the system following a period of equilibration. The MDL for Hg was calculated as 6.8 µg·m⁻³. This work describes the first complete GC–ICP-MS method to directly analyze gas phase samples, and detailed sample calculations and comparisons to conventional ICP-MS methods are provided.
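
    The semi-quantitative step described above amounts to scaling the analyte signal by the internal-standard signal and an empirically determined relative response factor. A minimal sketch with hypothetical signals and factor is given below.

```python
def semi_quant_concentration(analyte_counts, istd_counts, istd_conc_ppm, response_factor):
    """Semi-quantitative estimate: analyte concentration inferred from the internal-standard
    signal and an empirically calibrated relative response factor."""
    return (analyte_counts / istd_counts) * istd_conc_ppm / response_factor

# Hypothetical signals: Hg analyte vs. a 1 ppm Xe internal standard, with a relative
# response factor assumed to have been calibrated beforehand for this instrument.
hg_ppm = semi_quant_concentration(analyte_counts=5.2e4, istd_counts=8.0e5,
                                  istd_conc_ppm=1.0, response_factor=0.4)
print(f"estimated Hg concentration ~ {hg_ppm:.3f} ppm")
```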

  11. Quantitative assessment of alkali-reactive aggregate mineral content through XRD using polished sections as a supplementary tool to RILEM AAR-1 (petrographic method)

    SciTech Connect

    Castro, Nelia; Sorensen, Bjorn E.; Broekmans, Maarten A.T.M.

    2012-11-15

    The mineral content of 5 aggregate samples from 4 different countries, including reactive and non-reactive aggregate types, was assessed quantitatively by X-ray diffraction (XRD) using polished sections. Additionally, electron probe microanalyzer (EPMA) mapping and cathodoluminescence (CL) were used to characterize the opal-CT identified in one of the aggregate samples. Critical review of results from polished sections against traditionally powdered specimen has demonstrated that for fine-grained rocks without preferred orientation the assessment of mineral content by XRD using polished sections may represent an advantage over traditional powder specimens. Comparison of data on mineral content and silica speciation with expansion data from PARTNER project confirmed that the presence of opal-CT plays an important role in the reactivity of one of the studied aggregates. Used as a complementary tool to RILEM AAR-1, the methodology suggested in this paper has the potential to improve the strength of the petrographic method.

  12. Streaming visualisation of quantitative mass spectrometry data based on a novel raw signal decomposition method

    PubMed Central

    Zhang, Yan; Bhamber, Ranjeet; Riba-Garcia, Isabel; Liao, Hanqing; Unwin, Richard D; Dowsey, Andrew W

    2015-01-01

    As data rates rise, there is a danger that informatics for high-throughput LC-MS becomes more opaque and inaccessible to practitioners. It is therefore critical that efficient visualisation tools are available to facilitate quality control, verification, validation, interpretation, and sharing of raw MS data and the results of MS analyses. Currently, MS data is stored as contiguous spectra. Recall of individual spectra is quick but panoramas, zooming and panning across whole datasets necessitates processing/memory overheads impractical for interactive use. Moreover, visualisation is challenging if significant quantification data is missing due to data-dependent acquisition of MS/MS spectra. In order to tackle these issues, we leverage our seaMass technique for novel signal decomposition. LC-MS data is modelled as a 2D surface through selection of a sparse set of weighted B-spline basis functions from an over-complete dictionary. By ordering and spatially partitioning the weights with an R-tree data model, efficient streaming visualisations are achieved. In this paper, we describe the core MS1 visualisation engine and overlay of MS/MS annotations. This enables the mass spectrometrist to quickly inspect whole runs for ionisation/chromatographic issues, MS/MS precursors for coverage problems, or putative biomarkers for interferences, for example. The open-source software is available from http://seamass.net/viz/. PMID:25663356

  13. Sensitive detection and quantitation of EZH2 expression in cancer cell by an electrochemiluminescent method

    NASA Astrophysics Data System (ADS)

    Li, Qiang; Zhou, Xiaoming

    2009-08-01

    The polycomb group protein enhancer of zeste homolog 2 (EZH2), which regulates the cell cycle and functions as a transcriptional repressor, is overexpressed in several human cancers. Therefore it can be a molecular marker for detection of cancer progression and metastasis. Here, an electrochemiluminescence (ECL) assay was developed to detect and quantify the amount of EZH2 mRNA expression in cancer cells. Total mRNA was reverse transcribed into cDNA. The cDNA was amplified using a forward and reverse primer pair which were labeled with biotin and tris(2,2'-bipyridine)ruthenium(II) (TBR) on the 5' end, respectively. The amplification product was captured on streptavidin-coated magnetic beads and then separated using a magnetic field. The TBR labels were reacted with the most efficient coreactant, TPA, on the electrode. Photons were produced and detected by a custom-built ECL system. The housekeeping gene hydroxymethylbilane synthase (HMBS) was used as an approximate reference to quantify the amount of EZH2 mRNA expression, whose primer pairs were labeled the same as EZH2. Results indicated that the EZH2 mRNA was overexpressed in MCF-7 cells relative to normal blood cells. This assay is specific and sensitive and could be used for the clinical diagnosis and prognosis of cancer.

  14. A relative quantitative method to detect OCT4A gene expression by exon-junction primer and locked nucleic acid-modified probe*

    PubMed Central

    Ren, Jian-jun; Meng, Xing-kai

    2011-01-01

    Objective: OCT4A has been known to play a critical role in the maintenance of pluripotency of embryonic stem cells. Recent research has shown that OCT4A is also expressed in some tumor cell lines and tissues. This study aimed to develop a real-time reverse transcriptase polymerase chain reaction (RT-PCR) assay for relative quantitative detection of OCT4A mRNA and discrimination from OCT4B, pseudogene, and genomic contaminations. Methods: A locked nucleic acid (LNA)-modified probe was designed to discern the single base difference 352A/C to identify OCT4A mRNA. An exon-junction primer was designed to avoid false positives caused by genomic contaminations. In addition, the housekeeping gene glyceraldehyde-3-phosphate dehydrogenase (GAPDH) was measured in parallel to normalize the differences between samples and operations. Results: Experiments showed that the newly established RT-PCR assay amplified the OCT4A mRNA selectively; OCT4A analogues gave negative signals. Cell lines nTERA-2 and HepG2 showed positive results in OCT4A expression, while for HeLa and 293 cell lines, as well as primary peripheral blood mononuclear cells (PBMCs), OCT4A expression was negative. Additionally, the relative quantity of OCT4A mRNA was calculated by the cycle threshold (Ct) method with housekeeping gene normalization. Conclusions: This technique proved to be effective for relative quantitation of OCT4A mRNA with high specificity. PMID:21265047
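
    The cycle threshold calculation with housekeeping-gene normalization mentioned in the Results corresponds to the comparative Ct (2^-ddCt) approach; a minimal sketch with hypothetical Ct values follows.

```python
def relative_expression(ct_target_sample, ct_ref_sample, ct_target_calibrator, ct_ref_calibrator):
    """Comparative Ct (2^-ddCt) relative quantification, normalized to a housekeeping
    gene and expressed relative to a calibrator sample."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_calibrator = ct_target_calibrator - ct_ref_calibrator
    return 2.0 ** -(d_ct_sample - d_ct_calibrator)

# Hypothetical Ct values: OCT4A vs. GAPDH in one cell line relative to a calibrator line.
fold = relative_expression(ct_target_sample=24.1, ct_ref_sample=17.9,
                           ct_target_calibrator=28.6, ct_ref_calibrator=18.2)
print(f"relative OCT4A expression: {fold:.1f}-fold")
```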

  15. Efficiency of peracetic acid in inactivating bacteria, viruses, and spores in water determined with ATP bioluminescence, quantitative PCR, and culture-based methods.

    PubMed

    Park, Eunyoung; Lee, Cheonghoon; Bisesi, Michael; Lee, Jiyoung

    2014-03-01

    The disinfection efficiency of peracetic acid (PAA) was investigated on three microbial types using three different methods (filtration-based ATP (adenosine-triphosphate) bioluminescence, quantitative polymerase chain reaction (qPCR), culture-based method). Fecal indicator bacteria (Enterococcus faecium), virus indicator (male-specific (F(+)) coliphages (coliphages)), and protozoa disinfection surrogate (Bacillus subtilis spores (spores)) were tested. The mode of action for spore disinfection was visualized using scanning electron microscopy. The results indicated that PAA concentrations of 5 ppm (contact time: 5 min), 50 ppm (10 min), and 3,000 ppm (5 min) were needed to achieve 3-log reduction of E. faecium, coliphages, and spores, respectively. Scanning electron microscopy observation showed that PAA targets the external layers of spores. The lower reduction rates of tested microbes measured with qPCR suggest that qPCR may overestimate the surviving microbes. Collectively, PAA showed broad disinfection efficiency (susceptibility: E. faecium > coliphages > spores). For E. faecium and spores, ATP bioluminescence was substantially faster (≤5 min) than the culture-based method (>24 h) and qPCR (2-3 h). This study suggests PAA as an effective alternative to inactivate broad types of microbial contaminants in water. Together with the use of rapid detection methods, this approach can be useful for urgent situations when a timely response is needed for ensuring water quality. PMID:24642428
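
    The 3-log reduction targets quoted above are computed directly from before/after counts; a one-line sketch with hypothetical plate counts is shown below.

```python
import math

def log10_reduction(n_initial, n_final):
    """Log10 reduction achieved by a disinfectant: log10(N0 / N)."""
    return math.log10(n_initial / n_final)

# Hypothetical plate counts (CFU/mL) before and after a 5 min exposure to 5 ppm PAA.
print(round(log10_reduction(2.0e6, 1.8e3), 2))   # ~3-log reduction
```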

  16. Application of a rapid and efficient quantitative analysis method for traditional Chinese medicines: the case study of quality assessment of Salvia miltiorrhiza Bunge.

    PubMed

    Jing, Wen-Guang; Zhang, Jun; Zhang, Li-Yan; Wang, Dong-Zhe; Wang, Yue-Sheng; Liu, An

    2013-01-01

    A reference extractive, containing multiple active known compounds, has been considered to be an alternative to individual reference standards. However, in the Chinese Pharmacopoeia (ChP) the great majority of reference extractives have been primarily used for qualitative identification by thin-layer chromatography (TLC) and few studies on the applicability of reference extractives for quantitative analysis have been presented. Using Salvia miltiorrhiza Bunge as an example in this paper, we first present a preliminary discussion on the feasibility and applicability of reference extractives for the quantitative analysis of TCMs. The reference extractive of S. miltiorrhiza Bunge, comprised of three pharmacological marker compounds, namely cryptotanshinone, tanshinone I and tanshinone IIA, was prepared from purchased Salvia miltiorrhiza Bunge by extraction with acetone under reflux, followed by silica gel column chromatography with stepwise elution with petroleum ether-ethyl acetate (25:1, v/v, 4.5 BV) to remove the non-target components and chloroform-methanol (10:1, v/v; 3 BV) to yield a crude reference extractive solution. After concentration, the solution was further purified by preparative reversed-phase HPLC on a C18 column with isocratic elution with 77% methanol aqueous solution to yield the total reference extractive of S. miltiorrhiza Bunge. Thereafter, the reference extractive was applied to the quality assessment of S. miltiorrhiza Bunge using high-performance liquid chromatography (HPLC) coupled with diode array detection (DAD). The validation of the method, including linearity, sensitivity, repeatability, stability and recovery testing, indicated that this method was valid, reliable and sensitive, with good reproducibility. The developed method was successfully applied to quantify seven batches of samples collected from different regions in China and the results were also similar to those obtained using reference standards, with relative standard deviation (RSD) <3%. Preparation of a reference extractive of S. miltiorrhiza Bunge was significantly less expensive and time consuming than preparation of a corresponding reference standard. Quantitative analysis using a reference extractive was shown to be simple, low-cost, time-saving and practical, with high sensitivity and good stability; and is, therefore, a strong alternative to the use of reference standards. PMID:23765231

  17. The breaking load method - Results and statistical modification from the ASTM interlaboratory test program

    NASA Technical Reports Server (NTRS)

    Colvin, E. L.; Emptage, M. R.

    1992-01-01

    The breaking load test provides quantitative stress corrosion cracking data by determining the residual strength of tension specimens that have been exposed to corrosive environments. Eight laboratories have participated in a cooperative test program under the auspices of ASTM Committee G-1 to evaluate the new test method. All eight laboratories were able to distinguish between three tempers of aluminum alloy 7075. The statistical analysis procedures that were used in the test program do not work well in all situations. An alternative procedure using Box-Cox transformations shows a great deal of promise. An ASTM standard method has been drafted which incorporates the Box-Cox procedure.
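
    A minimal sketch of the Box-Cox step referred to above is given below, applied to hypothetical residual-strength data (not results from the ASTM program); scipy estimates the transformation parameter by maximum likelihood.

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

# Hypothetical residual strengths (ksi) from breaking-load specimens of one
# temper/exposure condition.
strengths = np.array([62.1, 60.5, 58.9, 61.7, 55.2, 59.8, 63.0, 57.4, 60.1, 56.8])

transformed, lam = stats.boxcox(strengths)       # maximum-likelihood lambda
mean_back = inv_boxcox(transformed.mean(), lam)  # summary reported on the original scale
print(f"lambda = {lam:.2f}, back-transformed mean = {mean_back:.1f} ksi")
```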

  18. Leveraging Random Number Generation for Mastery of Learning in Teaching Quantitative Research Courses via an E-Learning Method

    ERIC Educational Resources Information Center

    Boonsathorn, Wasita; Charoen, Danuvasin; Dryver, Arthur L.

    2014-01-01

    E-Learning brings access to a powerful but often overlooked teaching tool: random number generation. Using random number generation, a practically infinite number of quantitative problem-solution sets can be created. In addition, within the e-learning context, in the spirit of the mastery of learning, it is possible to assign online quantitative
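
    A minimal sketch of the idea, as it might be used in such a course, is shown below: each random seed yields a different but statistically equivalent problem together with its answer key (the problem template is illustrative, not taken from the article).

```python
import numpy as np

def make_problem(seed):
    """Generate one randomized descriptive-statistics problem and its answer key."""
    rng = np.random.default_rng(seed)
    data = rng.integers(50, 100, size=8)
    question = f"Compute the sample mean and sample SD of: {data.tolist()}"
    answer = {"mean": round(float(data.mean()), 2),
              "sd": round(float(data.std(ddof=1)), 2)}
    return question, answer

# Each student (or each attempt) gets a different but equivalent problem.
for seed in range(3):
    q, a = make_problem(seed)
    print(q, "->", a)
```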

  19. Development and validation of sensitive LC/MS/MS method for quantitative bioanalysis of levonorgestrel in rat plasma and application to pharmacokinetics study.

    PubMed

    Ananthula, Suryatheja; Janagam, Dileep R; Jamalapuram, Seshulatha; Johnson, James R; Mandrell, Timothy D; Lowe, Tao L

    2015-10-15

    A rapid, sensitive, selective and accurate LC/MS/MS method was developed for the quantitative determination of levonorgestrel (LNG) in rat plasma and further validated for specificity, linearity, accuracy, precision, sensitivity, matrix effect, recovery efficiency and stability. A liquid-liquid extraction procedure using a hexane:ethyl acetate mixture at an 80:20 v:v ratio was employed to efficiently extract LNG from rat plasma. A reversed-phase Luna C18(2) column (50 × 2.0 mm i.d., 3 µm) installed on an AB SCIEX Triple Quad™ 4500 LC/MS/MS system was used to perform the chromatographic separation. LNG was identified within 2 min with high specificity. A linear calibration curve was constructed over the 0.5-50 ng·mL(-1) concentration range. The developed method was validated for intra-day and inter-day accuracy and precision, whose values fell within acceptable limits. The matrix effect was found to be minimal. Recovery efficiency at three quality control (QC) concentrations, 0.5 (low), 5 (medium) and 50 (high) ng·mL(-1), was found to be >90%. The stability of LNG at various stages of the experiment, including storage, extraction and analysis, was evaluated using QC samples, and the results showed that LNG was stable under all conditions. This validated method was successfully used to study the pharmacokinetics of LNG in rats after SubQ injection, demonstrating its applicability in relevant preclinical studies. PMID:26409262
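
    The pharmacokinetic application mentioned above typically reduces to non-compartmental parameters derived from the concentration-time profile; a minimal sketch with a hypothetical profile is given below.

```python
import numpy as np

def pk_parameters(t_h, conc_ng_ml):
    """Non-compartmental summary: Cmax, Tmax and AUC(0-t) by the linear trapezoidal rule."""
    t = np.asarray(t_h, dtype=float)
    c = np.asarray(conc_ng_ml, dtype=float)
    return c.max(), t[np.argmax(c)], np.trapz(c, t)

# Hypothetical plasma profile after a subcutaneous levonorgestrel dose.
times = [0, 0.5, 1, 2, 4, 8, 12, 24]
conc = [0.0, 6.2, 11.5, 14.8, 12.1, 7.4, 4.9, 1.8]
cmax, tmax, auc = pk_parameters(times, conc)
print(f"Cmax = {cmax} ng/mL at Tmax = {tmax} h, AUC(0-24 h) = {auc:.1f} ng*h/mL")
```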

  20. Quantitative characterization of the protein contents of the exocrine pancreatic acinar cell by soft x-ray microscopy and advanced digital imaging methods

    SciTech Connect

    Loo Jr., Billy W.

    2000-06-09

    The study of the exocrine pancreatic acinar cell has been central to the development of models of many cellular processes, especially of protein transport and secretion. Traditional methods used to examine this system have provided a wealth of qualitative information from which mechanistic models have been inferred. However they have lacked the ability to make quantitative measurements, particularly of the distribution of protein in the cell, information critical for grounding of models in terms of magnitude and relative significance. This dissertation describes the development and application of new tools that were used to measure the protein content of the major intracellular compartments in the acinar cell, particularly the zymogen granule. Soft x-ray microscopy permits image formation with high resolution and contrast determined by the underlying protein content of tissue rather than staining avidity. A sample preparation method compatible with x-ray microscopy was developed and its properties evaluated. Automatic computerized methods were developed to acquire, calibrate, and analyze large volumes of x-ray microscopic images of exocrine pancreatic tissue sections. Statistics were compiled on the protein density of several organelles, and on the protein density, size, and spatial distribution of tens of thousands of zymogen granules. The results of these measurements, and how they compare to predictions of different models of protein transport, are discussed.