Science.gov

Sample records for quantitative methods results

  1. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  2. Quantitative fuel motion determination with the CABRI fast neutron hodoscope; Evaluation methods and results

    SciTech Connect

    Baumung, K.; Augier, G.

    1991-12-01

    The fast neutron hodoscope installed at the CABRI reactor in Cadarache, France, is employed to provide quantitative fuel motion data during experiments in which single liquid-metal fast breeder reactor test pins are subjected to simulated accident conditions. Instrument design and performance are reviewed, the methods for the quantitative evaluation are presented, and error sources are discussed. The most important findings are the axial expansion as a function of time, phenomena related to pin failure (such as time, location, pin failure mode, and fuel mass ejected after failure), and linear fuel mass distributions with a 2-cm axial resolution. In this paper the hodoscope results of the CABRI-1 program are summarized.

  3. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  4. Quantitative imaging methods in osteoporosis

    PubMed Central

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M. Carola

    2016-01-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS), information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research. PMID:28090446

  5. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS), information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  6. Qualitative versus quantitative methods in psychiatric research.

    PubMed

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, qualitative research is considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of the research, qualitative methods can be employed in questionnaire design, diagnostic criteria establishment, and feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be confirmed statistically with quantitative methods. The tendency to combine qualitative and quantitative methods as complementary approaches has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  7. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  8. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from Midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents.

    PubMed

    Sivaganesan, Mano; Siefring, Shawn; Varma, Manju; Haugland, Richard A

    2014-06-01

    Enterococci target sequence density estimates from analyses of diluted river water DNA extracts by EPA Methods 1611 and 1609 and estimates with lower detection limits from undiluted DNA extracts by Method 1609 were indistinguishable. These methods should be equally suitable for comparison with U.S. EPA 2012 Recreational Water Quality Criteria values.

  9. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
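
    The abstract above reduces the measurement to simple arithmetic: each measured voltage difference is divided by the known separation of the antenna pair. A minimal sketch of that calculation in Python, with hypothetical positions and readings (none of the numbers come from the patent), might look like this:

    ```python
    import numpy as np

    # Hypothetical antenna positions (m) and measured potentials (V); illustrative only.
    positions = np.array([0.00, 0.05, 0.10, 0.15, 0.20])
    voltages = np.array([1.00, 0.93, 0.85, 0.78, 0.70])

    # Voltage difference for each adjacent antenna pair divided by the known separation
    # gives a local estimate of the field component along the array (V/m); the sign
    # convention E = -dV/dx can be applied if a direction is needed.
    separations = np.diff(positions)
    field_estimates = np.diff(voltages) / separations

    for x0, x1, e in zip(positions[:-1], positions[1:], field_estimates):
        print(f"field between {x0:.2f} m and {x1:.2f} m: {abs(e):.1f} V/m")
    ```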

  10. Automated Quantitative Nuclear Cardiology Methods

    PubMed Central

    Motwani, Manish; Berman, Daniel S.; Germano, Guido; Slomka, Piotr J.

    2016-01-01

    Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps and estimate global and local measures of stress/rest perfusion – all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This chapter briefly reviews these techniques, highlights several challenges and discusses the latest developments. PMID:26590779

  11. Quantitative methods in classical perturbation theory.

    NASA Astrophysics Data System (ADS)

    Giorgilli, A.

    Poincaré proved that the series commonly used in celestial mechanics are typically non-convergent, although their usefulness is generally evident. Recent work in perturbation theory has clarified this observation of Poincaré, showing that the series of perturbation theory, although non-convergent in general, nevertheless furnish valuable approximations to the true orbits over very long times, which in some practical cases can be comparable with the age of the universe. The aim of the author's paper is to introduce the quantitative methods of perturbation theory that allow one to obtain such powerful results.

  12. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  13. Quantitative Methods in Psychology: Inevitable and Useless

    PubMed Central

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic, which is based on Aristotelian thinking, and associative-quantitative, which is based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause–effect relationships between the events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199

  14. Quantitative Methods for Software Selection and Evaluation

    DTIC Science & Technology

    2006-09-01

    When performing a "buy" analysis and selecting a product as part of a software acquisition strategy, most organizations will consider primarily...

  15. A new HPLC method for azithromycin quantitation.

    PubMed

    Zubata, Patricia; Ceresole, Rita; Rosasco, Maria Ana; Pizzorno, Maria Teresa

    2002-02-01

    A simple liquid chromatographic method was developed for the estimation of azithromycin raw material and in pharmaceutical forms. The sample was chromatographed on a reverse phase C18 column and eluants monitored at a wavelength of 215 nm. The method was accurate, precise and sufficiently selective. It is applicable for its quantitation, stability and dissolution tests.

  16. From themes to hypotheses: following up with quantitative methods.

    PubMed

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field.

  17. Quantitative laser-induced breakdown spectroscopy data using peak area step-wise regression analysis: an alternative method for interpretation of Mars science laboratory results

    SciTech Connect

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Dyar, Melinda D; Schafer, Martha W; Tucker, Jonathan M

    2008-01-01

    The ChemCam instrument on the Mars Science Laboratory (MSL) will include a laser-induced breakdown spectrometer (LIBS) to quantify major and minor elemental compositions. The traditional analytical chemistry approach to calibration curves for these data regresses a single diagnostic peak area against concentration for each element. This approach contrasts with a new multivariate method in which elemental concentrations are predicted by step-wise multiple regression analysis based on areas of a specific set of diagnostic peaks for each element. The method is tested on LIBS data from igneous and metamorphosed rocks. Between 4 and 13 partial regression coefficients are needed to describe each elemental abundance accurately (i.e., with a regression line of R² > 0.9995 for the relationship between predicted and measured elemental concentration) for all major and minor elements studied. Validation plots suggest that the method is limited at present by the small data set, and will work best for prediction of concentration when a wide variety of compositions and rock types has been analyzed.
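
    As a rough, synthetic-data illustration of the step-wise multiple regression idea described above (regressing elemental concentration on a set of diagnostic peak areas and adding peaks only while they improve the fit), the following sketch uses invented peak areas and an arbitrary stopping rule; it is not the ChemCam calibration code:

    ```python
    import numpy as np

    # Synthetic "LIBS" data: 60 samples, 10 candidate peak areas, concentration driven by 3 peaks.
    rng = np.random.default_rng(0)
    peak_areas = rng.random((60, 10))
    true_coefs = np.array([3.0, 0.0, 1.5, 0.0, 0.0, 0.8, 0.0, 0.0, 0.0, 0.0])
    concentration = peak_areas @ true_coefs + 0.05 * rng.standard_normal(60)

    def r_squared(X, y):
        """R^2 of an ordinary least-squares fit of y on X (with an intercept)."""
        A = np.column_stack([np.ones(len(y)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

    # Forward step-wise selection: add the peak that most improves R^2 until the gain is tiny.
    selected, remaining, best_r2 = [], list(range(peak_areas.shape[1])), 0.0
    while remaining:
        r2, j = max((r_squared(peak_areas[:, selected + [j]], concentration), j) for j in remaining)
        if r2 - best_r2 < 1e-4:
            break
        selected.append(j)
        remaining.remove(j)
        best_r2 = r2

    print("selected peaks:", sorted(selected), "R^2 =", round(best_r2, 5))
    ```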

  18. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  19. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    PubMed Central

    Braisted, John C; Kuntumalla, Srilatha; Vogel, Christine; Marcotte, Edward M; Rodrigues, Alan R; Wang, Rong; Huang, Shih-Ting; Ferlanti, Erik S; Saeed, Alexander I; Fleischmann, Robert D; Peterson, Scott N; Pieper, Rembert

    2008-01-01

    Background Mass spectrometry (MS) based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS) data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection resulting in each peptide having a different detection probability. Lu et al. (2007) described a modified spectral counting technique, Absolute Protein Expression (APEX), which improves on basic spectral counting methods by including a correction factor for each protein (called Oi value) that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi). This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a utility to merge multiple
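
    The APEX calculation referenced above (Lu et al. 2007) corrects each protein's observed spectral count by its expected count Oi and normalizes across the sample; this is a minimal sketch with invented counts, Oi values, and scaling constant, not output from the APEX tool itself:

    ```python
    # Invented spectral counts and Oi values for three hypothetical proteins.
    spectral_counts = {"protA": 120, "protB": 45, "protC": 300}
    oi_values = {"protA": 8.0, "protB": 2.5, "protC": 20.0}   # expected detectable peptides per molecule

    total_molecules = 1_000_000   # assumed scaling constant C (total protein molecules per cell)

    # APEX abundance: (n_i / Oi_i) normalized over all proteins, scaled by C.
    corrected = {p: spectral_counts[p] / oi_values[p] for p in spectral_counts}
    norm = sum(corrected.values())
    apex = {p: corrected[p] / norm * total_molecules for p in corrected}

    for protein, abundance in apex.items():
        print(f"{protein}: ~{abundance:,.0f} molecules per cell")
    ```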

  20. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  1. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods of qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it might not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  2. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. An analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial counts, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate count methods.
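
    The abstract does not spell out how the uncertainty components were combined; a common metrology convention (GUM-style) is to add independent relative components in quadrature, sketched below with invented component values:

    ```python
    import math

    # Invented relative uncertainty components; the paper's actual uncertainty budget is not reproduced.
    components = {
        "type of microorganism": 0.20,
        "pharmaceutical product": 0.15,
        "reading/interpreting errors": 0.18,
    }

    # Combine independent relative uncertainties in quadrature.
    combined = math.sqrt(sum(u ** 2 for u in components.values()))
    print(f"combined relative uncertainty ~ {combined:.0%}")   # about 31%, below the 35% cited above
    ```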

  3. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  4. Extended Rearrangement Inequalities and Applications to Some Quantitative Stability Results

    NASA Astrophysics Data System (ADS)

    Lemou, Mohammed

    2016-12-01

    In this paper, we prove a new functional inequality of Hardy-Littlewood type for generalized rearrangements of functions. We then show how this inequality provides quantitative stability results for steady states of evolution systems that essentially preserve the rearrangements and some suitable energy functional, under minimal regularity assumptions on the perturbations. In particular, this inequality yields a quantitative stability result for a large class of steady state solutions to the Vlasov-Poisson systems; more precisely, we derive a quantitative control of the L¹ norm of the perturbation by the relative Hamiltonian (the energy functional) and rearrangements. A general nonlinear stability result was obtained by Lemou et al. (Invent Math 187:145-194, 2012) in the gravitational context; however, the proof relied in a crucial way on compactness arguments which, by construction, provide no quantitative control of the perturbation. Our functional inequality is also applied to the context of 2D-Euler systems and provides quantitative stability results for a large class of steady states of this system in a natural energy space.

  5. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadowmaps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the quite unexplored problem of the wavefront's gradient estimation from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the very first time that a complete mathematically grounded treatment of this optical phenomenon is presented, complemented by an image-processing algorithm which allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device just through its associated foucaultgrams.
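
    The final step described above, integrating the estimated partial derivatives with a Fourier-based algorithm to recover the surface, can be illustrated with a generic least-squares Fourier integrator (Frankot-Chellappa style); this is a sketch of the general technique, not necessarily the exact algorithm used in the paper:

    ```python
    import numpy as np

    def frankot_chellappa(p, q):
        """Least-squares Fourier integration of a gradient field (p = dz/dx, q = dz/dy).
        Generic technique; not claimed to be the paper's exact implementation."""
        rows, cols = p.shape
        u = 2 * np.pi * np.fft.fftfreq(cols)   # angular frequency along x (per pixel)
        v = 2 * np.pi * np.fft.fftfreq(rows)   # angular frequency along y (per pixel)
        U, V = np.meshgrid(u, v)
        denom = U**2 + V**2
        denom[0, 0] = 1.0                      # avoid division by zero at the DC term
        Z = (-1j * U * np.fft.fft2(p) - 1j * V * np.fft.fft2(q)) / denom
        Z[0, 0] = 0.0                          # the mean height is arbitrary
        return np.real(np.fft.ifft2(Z))

    # Self-check on a smooth periodic test surface.
    y, x = np.mgrid[0:128, 0:128] / 128.0
    z_true = np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)
    p, q = np.gradient(z_true, axis=1), np.gradient(z_true, axis=0)
    z_rec = frankot_chellappa(p, q)
    print("max error:", np.max(np.abs((z_rec - z_rec.mean()) - (z_true - z_true.mean()))))
    ```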

  6. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial assaying glucose strips is presented. (MVL)

  7. [Progress in stable isotope labeled quantitative proteomics methods].

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods which have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, which support the fast development in biology research. In this work, we discuss the progress in the stable isotope labeling methods for quantitative proteomics including relative and absolute quantitative proteomics, and then give our opinions on the outlook of proteome quantification methods.

  8. Quantitative MR imaging in fracture dating--Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

    For exact age determinations of bone fractures in a forensic context (e.g. in cases of child abuse) improved knowledge of the time course of the healing process and use of non-invasive modern imaging technology is of high importance. To date, fracture dating is based on radiographic methods by determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined for this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11 ♂: 6; median age 34 ± 15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or in the collar bone. Both, qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area by defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7 ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895 ± 607 ms), which decreased over time to a value of 1094 ± 182 ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115 ± 80 ms) and decreased to 73 ± 33 ms within 21 days after the fracture event. After that time point, no

  9. A general radiochemical-color method for quantitation of immunoblots.

    PubMed

    Esmaeli-Azad, B; Feinstein, S C

    1991-12-01

    Quantitative interpretation of protein immunoblotting procedures is hampered by a variety of technical liabilities inherent in the use of photographic and densitometric methods. In this paper, we present a novel, simple, and generally applicable alternative procedure to acquire quantitative data from immunoblots. Our strategy employs both the standard alkaline phosphatase color reaction and radiolabelled Protein A. The color reaction is used to localize the polypeptide of interest after transfer to a solid support. The colored bands are then excised and the radioactivity in the colocalized Protein A is quantitated in a gamma counter. In addition to avoiding the problems associated with photographic and densitometric procedures, our assay also overcomes common problems associated with variable gel lane width and individual band distortion. The resulting data is linear over a range of at least 50-fold (10-500 ng of specific protein, for the example used in this study) and is highly reproducible.

  10. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA-complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.

  11. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    In light of the fact that stress vulnerability assessment in groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of Beijing groundwater pollution risk. The results demonstrated that the hazards of the representative contaminants greatly depended on the different research emphases. There were also differences between the ranking of the three representative contaminants' hazards and their corresponding properties, suggesting that the subjective tendency of the research emphasis had a decisive impact on the calculation results. In addition, using rank ordering to normalize the three properties and unify the quantified results would amplify or attenuate the relative property characteristics of the different representative contaminants.
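
    The analytic hierarchy process step mentioned above derives weights from a pairwise comparison matrix via its principal eigenvector; the 3x3 matrix comparing the three contaminant properties below is purely hypothetical:

    ```python
    import numpy as np

    # Hypothetical pairwise comparison of the three contaminant properties (Saaty 1-9 scale).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    # AHP weights: normalized principal eigenvector of the comparison matrix.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Consistency ratio uses the random index RI ~ 0.58 for a 3x3 matrix.
    ci = (eigvals[k].real - len(A)) / (len(A) - 1)
    print("weights:", np.round(weights, 3), "consistency ratio:", round(ci / 0.58, 3))
    ```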

  12. Novel method for ANA quantitation using IIF imaging system.

    PubMed

    Peng, Xiaodong; Tang, Jiangtao; Wu, Yongkang; Yang, Bin; Hu, Jing

    2014-02-01

    A variety of antinuclear antibodies (ANAs) are found in the serum of patients with autoimmune diseases. The detection of abnormal ANA titers is a critical criterion for the diagnosis of systemic lupus erythematosus (SLE) and other connective tissue diseases. Indirect immunofluorescence assay (IIF) on HEp-2 cells is the gold standard method to determine the presence of ANA and therefore provides information about the localization of autoantigens that is useful for diagnosis. However, its utility has been limited in prognosis and monitoring of disease activity due to the lack of standardization in performing the technique, subjectivity in interpreting the results and the fact that it is only semi-quantitative. On the other hand, ELISA for the detection of ANA can quantitate ANA but cannot provide further information about the localization of the autoantigens. It would be ideal to integrate both the quantitative and qualitative methods. To address this issue, this study was conducted to quantitatively detect ANAs using an IIF imaging analysis system. Serum samples from ANA-positive patients (including speckled, homogeneous, nuclear mixture and cytoplasmic mixture patterns) and ANA-negative patients were tested for ANA titers by classical IIF and analyzed with an imaging system: the image of each sample was acquired by the digital imaging system and the green fluorescence intensity was quantified with the Image-Pro Plus software. A good correlation was found between the two methods, and the correlation coefficients (R²) of the various ANA patterns were 0.942 (speckled), 0.942 (homogeneous), 0.923 (nuclear mixture) and 0.760 (cytoplasmic mixture), respectively. The fluorescence density was linearly correlated with the log of the ANA titers in the various ANA patterns (R²>0.95). Moreover, the novel ANA quantitation method showed good reproducibility (F=0.091, p>0.05), with the mean±SD and CV% of the positive and negative quality controls equal to 126.4±9.6 and 7.6%, 10.4±1.25 and 12

  13. Informatics Methods to Enable Sharing of Quantitative Imaging Research Data

    PubMed Central

    Levy, Mia A.; Freymann, John B.; Kirby, Justin S.; Fedorov, Andriy; Fennessy, Fiona M.; Eschrich, Steven A.; Berglund, Anders E.; Fenstermacher, David A.; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L.; Brown, Bartley J.; Braun, Terry A.; Dekker, Andre; Roelofs, Erik; Mountz, James M.; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-01-01

    Introduction The National Cancer Institute (NCI) Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable sharing data and to promote reuse of quantitative imaging data in the community. Methods We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. Results There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are addressing to enable members to share research images and image meta-data across the network. Conclusions As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. PMID:22770688

  14. An improved quantitative analysis method for plant cortical microtubules.

    PubMed

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has so far been paid to the quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was utilized to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges while reducing noise, and the geometrical characteristics of the texture were obvious. Four texture parameters extracted by GLCM perfectly reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies references for other similar studies.
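
    The GLCM texture step can be sketched with scikit-image (version 0.19 or later for these function names); the BEMD preprocessing and IMF1 selection are omitted, the input is random noise standing in for a microtubule image, and the four properties shown are common GLCM parameters that may differ from the paper's exact set:

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    # Random noise standing in for an IMF1 microtubule image; the BEMD step is omitted here.
    rng = np.random.default_rng(1)
    image = (rng.random((128, 128)) * 255).astype(np.uint8)

    # Grey-level co-occurrence matrix at distance 1 for two directions, then four common properties.
    glcm = graycomatrix(image, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    for prop in ("contrast", "correlation", "energy", "homogeneity"):
        print(prop, float(graycoprops(glcm, prop).mean()))
    ```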

  15. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful to determine the type attribute of the object because it could present the content of the constituents. QPA by Rietveld method requires neither measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by Rietveld QPA method. The Rietveld refinement was accomplished with a material analysis using diffraction program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case which was broken successfully with the help of Rietveld QPA method was also introduced. This method will allow forensic investigators to acquire detailed information of the material evidence, which could point out the direction for case detection and court proceedings.
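
    Rietveld quantitative phase analysis commonly converts refined scale factors into weight fractions with the Hill-Howard relation W_p = S_p(ZMV)_p / sum_i S_i(ZMV)_i; the sketch below uses placeholder scale factors and ZMV products rather than the real refined values for potassium nitrate and sulfur:

    ```python
    # Placeholder refined scale factors S and Z*M*V products; real values would come from the
    # Rietveld refinement and the phases' crystal structures, not from this sketch.
    phases = {
        "potassium nitrate": {"S": 1.2e-4, "ZMV": 1.3e5},
        "sulfur":            {"S": 0.8e-4, "ZMV": 1.1e6},
    }

    # Weight fraction of each phase from the Hill-Howard relation.
    total = sum(p["S"] * p["ZMV"] for p in phases.values())
    for name, p in phases.items():
        print(f"{name}: {100.0 * p['S'] * p['ZMV'] / total:.1f} wt%")
    ```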

  16. A Quantitative Method for Microtubule Analysis in Fluorescence Images.

    PubMed

    Lan, Xiaodong; Li, Lingfei; Hu, Jiongyu; Zhang, Qiong; Dang, Yongming; Huang, Yuesheng

    2015-12-01

    Microtubule analysis is of significant value for a better understanding of normal and pathological cellular processes. Although immunofluorescence microscopic techniques have proven useful in the study of microtubules, comparative results commonly rely on a descriptive and subjective visual analysis. We developed an objective and quantitative method based on image processing and analysis of fluorescently labeled microtubular patterns in cultured cells. We used a multi-parameter approach by analyzing four quantifiable characteristics to compose our quantitative feature set. Then we interpreted specific changes in the parameters and revealed the contribution of each feature set using principal component analysis. In addition, we verified that different treatment groups could be clearly discriminated using principal components of the multi-parameter model. High predictive accuracy of four commonly used multi-classification methods confirmed our method. These results demonstrated the effectiveness and efficiency of our method in the analysis of microtubules in fluorescence images. Application of the analytical methods presented here provides information concerning the organization and modification of microtubules, and could aid in the further understanding of structural and functional aspects of microtubules under normal and pathological conditions.

  17. A quantitative method for measuring the quality of history matches

    SciTech Connect

    Shaw, T.S.; Knapp, R.M.

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for such a purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test insures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to that of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
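
    The three indices are named but not defined in the abstract; as illustrative stand-ins, a correlation coefficient can test shape conformity, a mean signed error can flag bias, and an RMSE can measure magnitude of deviation, as in this sketch on synthetic production histories:

    ```python
    import numpy as np

    # Synthetic historical and simulated production series; the paper's exact statistics
    # are not reproduced here, these are generic stand-ins for the three kinds of indices.
    historical = np.array([100, 120, 135, 150, 160, 158, 150, 140.0])
    simulated = np.array([98, 118, 140, 155, 162, 160, 148, 137.0])

    shape = np.corrcoef(historical, simulated)[0, 1]          # shape conformity (correlation)
    bias = np.mean(simulated - historical)                    # systematic over/under-prediction
    rmse = np.sqrt(np.mean((simulated - historical) ** 2))    # magnitude of deviation

    print(f"shape conformity r = {shape:.3f}, bias = {bias:+.2f}, RMSE = {rmse:.2f}")
    ```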

  18. A rapid chemiluminescent method for quantitation of human DNA.

    PubMed Central

    Walsh, P S; Varlaro, J; Reynolds, R

    1992-01-01

    A sensitive and simple method for the quantitation of human DNA is described. This method is based on probe hybridization to a human alpha satellite locus, D17Z1. The biotinylated probe is hybridized to sample DNA immobilized on nylon membrane. The subsequent binding of streptavidin-horseradish peroxidase to the bound probe allows for chemiluminescent detection using a luminol-based reagent and X-ray film. Less than 150 pg of human DNA can easily be detected with a 15 minute exposure. The entire procedure can be performed in 1.5 hours. Microgram quantities of nonhuman DNA have been tested and the results indicate very high specificity for human DNA. The data on film can be scanned into a computer and a commercially available program can be used to create a standard curve where DNA quantity is plotted against the mean density of each slot blot signal. The methods described can also be applied to the very sensitive determination of quantity and quality (size) of DNA on Southern blots. The high sensitivity of this quantitation method requires the consumption of only a fraction of sample for analysis. Determination of DNA quantity is necessary for RFLP and many PCR-based tests where optimal results are obtained only with a relatively narrow range of DNA quantities. The specificity of this quantitation method for human DNA will be useful for the analysis of samples that may also contain bacterial or other non-human DNA, for example forensic evidence samples, ancient DNA samples, or clinical samples. PMID:1408822

  19. A rapid chemiluminescent method for quantitation of human DNA.

    PubMed

    Walsh, P S; Varlaro, J; Reynolds, R

    1992-10-11

    A sensitive and simple method for the quantitation of human DNA is described. This method is based on probe hybridization to a human alpha satellite locus, D17Z1. The biotinylated probe is hybridized to sample DNA immobilized on nylon membrane. The subsequent binding of streptavidin-horseradish peroxidase to the bound probe allows for chemiluminescent detection using a luminol-based reagent and X-ray film. Less than 150 pg of human DNA can easily be detected with a 15 minute exposure. The entire procedure can be performed in 1.5 hours. Microgram quantities of nonhuman DNA have been tested and the results indicate very high specificity for human DNA. The data on film can be scanned into a computer and a commercially available program can be used to create a standard curve where DNA quantity is plotted against the mean density of each slot blot signal. The methods described can also be applied to the very sensitive determination of quantity and quality (size) of DNA on Southern blots. The high sensitivity of this quantitation method requires the consumption of only a fraction of sample for analysis. Determination of DNA quantity is necessary for RFLP and many PCR-based tests where optimal results are obtained only with a relatively narrow range of DNA quantities. The specificity of this quantitation method for human DNA will be useful for the analysis of samples that may also contain bacterial or other non-human DNA, for example forensic evidence samples, ancient DNA samples, or clinical samples.
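
    The standard-curve step described in the two records above (slot-blot signal density plotted against DNA quantity) can be sketched as a linear fit against log quantity; the densities and standards below are invented:

    ```python
    import numpy as np

    # Invented slot-blot signal densities for known human DNA standards; the real assay reads
    # these densities from scanned X-ray film.
    known_ng = np.array([0.15, 0.5, 1.5, 5.0, 15.0])     # DNA standards (ng)
    density = np.array([12.0, 25.0, 41.0, 58.0, 74.0])   # mean slot-blot signal density

    # Fit density as a linear function of log10(quantity), then invert for an unknown sample.
    slope, intercept = np.polyfit(np.log10(known_ng), density, 1)
    unknown_density = 50.0
    estimated_ng = 10 ** ((unknown_density - intercept) / slope)
    print(f"estimated DNA in unknown sample: {estimated_ng:.2f} ng")
    ```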

  20. A quantitative method for optimized placement of continuous air monitors.

    PubMed

    Whicker, Jeffrey J; Rodgers, John C; Moxley, John S

    2003-11-01

    Alarming continuous air monitors (CAMs) are a critical component for worker protection in facilities that handle large amounts of hazardous materials. In nuclear facilities, continuous air monitors alarm when levels of airborne radioactive materials exceed alarm thresholds, thus prompting workers to exit the room to reduce inhalation exposures. To maintain a high level of worker protection, continuous air monitors are required to detect radioactive aerosol clouds quickly and with good sensitivity. This requires that there are sufficient numbers of continuous air monitors in a room and that they are well positioned. Yet there are no published methodologies to quantitatively determine the optimal number and placement of continuous air monitors in a room. The goal of this study was to develop and test an approach to quantitatively determine optimal number and placement of continuous air monitors in a room. The method we have developed uses tracer aerosol releases (to simulate accidental releases) and the measurement of the temporal and spatial aspects of the dispersion of the tracer aerosol through the room. The aerosol dispersion data is then analyzed to optimize continuous air monitor utilization based on simulated worker exposure. This method was tested in a room within a Department of Energy operated plutonium facility at the Savannah River Site in South Carolina, U.S. Results from this study show that the value of quantitative airflow and aerosol dispersion studies is significant and that worker protection can be significantly improved while balancing the costs associated with CAM programs.

  1. Quantitative results for square gradient models of fluids

    NASA Astrophysics Data System (ADS)

    Kong, Ling-Ti; Vriesinga, Dan; Denniston, Colin

    2011-03-01

    Square gradient models for fluids are extensively used because they are believed to provide a good qualitative understanding of the essential physics. However, unlike elasticity theory for solids, there are few quantitative results for specific (as opposed to generic) fluids. Indeed, the only numerical values of the square gradient coefficients for specific fluids have been inferred from attempts to match macroscopic properties such as surface tensions rather than from direct measurement. We employ all-atom molecular dynamics, using the TIP3P and OPLS force fields, to directly measure the coefficients of the density gradient expansion for several real fluids. For all liquids measured, including water, we find that the square gradient coefficient is negative, suggesting the need for some regularization of a model including only the square gradient, but only at wavelengths comparable to the intermolecular separation. The implications for liquid-gas interfaces are also examined. Remarkably, the square gradient model is found to give a reasonably accurate description of density fluctuations in the liquid state down to wavelengths close to atomic size.

  2. Advancing the study of violence against women using mixed methods: integrating qualitative methods into a quantitative research program.

    PubMed

    Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol

    2011-02-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.

  3. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    PubMed Central

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  4. Machine Learning methods for Quantitative Radiomic Biomarkers

    PubMed Central

    Parmar, Chintan; Grossmann, Patrick; Bussink, Johan; Lambin, Philippe; Aerts, Hugo J. W. L.

    2015-01-01

    Radiomics extracts and mines a large number of medical imaging features quantifying tumor phenotypic characteristics. Highly accurate and reliable machine-learning approaches can drive the success of radiomic applications in clinical care. In this radiomic study, fourteen feature selection methods and twelve classification methods were examined in terms of their performance and stability for predicting overall survival. A total of 440 radiomic features were extracted from pre-treatment computed tomography (CT) images of 464 lung cancer patients. To ensure the unbiased evaluation of different machine-learning methods, publicly available implementations along with reported parameter configurations were used. Furthermore, we used two independent radiomic cohorts for training (n = 310 patients) and validation (n = 154 patients). We identified that Wilcoxon test based feature selection method WLCX (stability = 0.84 ± 0.05, AUC = 0.65 ± 0.02) and a classification method random forest RF (RSD = 3.52%, AUC = 0.66 ± 0.03) had highest prognostic performance with high stability against data perturbation. Our variability analysis indicated that the choice of classification method is the most dominant source of performance variation (34.21% of total variance). Identification of optimal machine-learning methods for radiomic applications is a crucial step towards stable and clinically relevant radiomic biomarkers, providing a non-invasive way of quantifying and monitoring tumor-phenotypic characteristics in clinical practice. PMID:26278466

  5. Machine Learning methods for Quantitative Radiomic Biomarkers.

    PubMed

    Parmar, Chintan; Grossmann, Patrick; Bussink, Johan; Lambin, Philippe; Aerts, Hugo J W L

    2015-08-17

    Radiomics extracts and mines a large number of medical imaging features quantifying tumor phenotypic characteristics. Highly accurate and reliable machine-learning approaches can drive the success of radiomic applications in clinical care. In this radiomic study, fourteen feature selection methods and twelve classification methods were examined in terms of their performance and stability for predicting overall survival. A total of 440 radiomic features were extracted from pre-treatment computed tomography (CT) images of 464 lung cancer patients. To ensure the unbiased evaluation of different machine-learning methods, publicly available implementations along with reported parameter configurations were used. Furthermore, we used two independent radiomic cohorts for training (n = 310 patients) and validation (n = 154 patients). We identified that Wilcoxon test based feature selection method WLCX (stability = 0.84 ± 0.05, AUC = 0.65 ± 0.02) and a classification method random forest RF (RSD = 3.52%, AUC = 0.66 ± 0.03) had highest prognostic performance with high stability against data perturbation. Our variability analysis indicated that the choice of classification method is the most dominant source of performance variation (34.21% of total variance). Identification of optimal machine-learning methods for radiomic applications is a crucial step towards stable and clinically relevant radiomic biomarkers, providing a non-invasive way of quantifying and monitoring tumor-phenotypic characteristics in clinical practice.
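
    As a rough sketch of the pipeline the two records above single out (Wilcoxon rank-sum feature selection followed by a random forest classifier), here is a synthetic-data example; the cohort, the 30-feature cutoff, and all settings are invented, and SciPy plus scikit-learn are assumed to be available:

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for a radiomic cohort: 464 "patients", 440 "features", binary outcome.
    rng = np.random.default_rng(42)
    X = rng.standard_normal((464, 440))
    y = rng.integers(0, 2, size=464)
    X[:, :10] += y[:, None] * 0.8                     # make the first 10 features informative

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0, stratify=y)

    # Rank features by Wilcoxon rank-sum (Mann-Whitney U) p-value between outcome groups.
    pvals = np.array([mannwhitneyu(X_tr[y_tr == 0, j], X_tr[y_tr == 1, j]).pvalue
                      for j in range(X_tr.shape[1])])
    top = np.argsort(pvals)[:30]                      # keep the 30 most discriminative features

    clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr[:, top], y_tr)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te[:, top])[:, 1])
    print("validation AUC:", round(auc, 3))
    ```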

  6. Quantitative Hydrocarbon Energies from the PMO Method.

    ERIC Educational Resources Information Center

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  7. African Primary Care Research: quantitative analysis and presentation of results.

    PubMed

    Mash, Bob; Ogunbanjo, Gboyega A

    2014-06-06

    This article is part of a series on Primary Care Research Methods. The article describes types of continuous and categorical data, how to capture data in a spreadsheet, how to use descriptive and inferential statistics and, finally, gives advice on how to present the results in text, figures and tables. The article intends to help Master's level students with writing the data analysis section of their research proposal and presenting their results in their final research report.
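
    As a concrete illustration of the workflow the article describes (descriptive statistics for continuous and categorical data, then a simple inferential test, then presentable output), here is a minimal pandas/scipy sketch; the dataset and variable names are invented for demonstration.

    ```python
    # Minimal sketch: descriptive statistics plus one inferential test on a
    # hypothetical primary-care dataset; variable names are made up.
    import pandas as pd
    from scipy import stats

    df = pd.DataFrame({
        "group": ["control"] * 5 + ["intervention"] * 5,
        "systolic_bp": [142, 138, 150, 147, 139, 131, 128, 135, 126, 130],
        "smoker": ["yes", "no", "no", "yes", "no", "no", "no", "yes", "no", "no"],
    })

    # Descriptive statistics: mean/SD for continuous data, counts for categorical data.
    print(df.groupby("group")["systolic_bp"].agg(["mean", "std"]))
    print(pd.crosstab(df["group"], df["smoker"]))

    # Inferential statistics: independent-samples t-test between the two groups.
    a = df.loc[df.group == "control", "systolic_bp"]
    b = df.loc[df.group == "intervention", "systolic_bp"]
    t, p = stats.ttest_ind(a, b)
    print(f"t = {t:.2f}, p = {p:.3f}")
    ```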

  8. Review of Quantitative Software Reliability Methods

    SciTech Connect

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of digital systems

  9. Chemoenzymatic method for glycomics: isolation, identification, and quantitation

    PubMed Central

    Yang, Shuang; Rubin, Abigail; Eshghi, Shadi Toghi; Zhang, Hui

    2015-01-01

    Over the past decade, considerable progress has been made with respect to the analytical methods for analysis of glycans from biological sources. Regardless of the specific methods that are used, glycan analysis includes isolation, identification, and quantitation. Derivatization is indispensable for improving glycan identification. Derivatization of glycans can be performed by permethylation or carbodiimide coupling/esterification. By introducing a fluorophore or chromophore at their reducing end, glycans can be separated by electrophoresis or chromatography. The fluorogenically labeled glycans can be quantitated using fluorescent detection. The recently developed solid-phase approaches, such as glycoprotein immobilization for glycan extraction and on-tissue glycan mass spectrometry imaging, demonstrate advantages over methods performed in solution. Derivatization of sialic acids is favorably implemented on the solid support using carbodiimide coupling, and the released glycans can be further modified at the reducing end or permethylated for quantitative analysis. In this review, methods for glycan isolation, identification, and quantitation are discussed. PMID:26390280

  10. Method for quantitating sensitivity to a staphylococcal bacteriocin.

    PubMed Central

    Van Norman, G; Groman, N

    1979-01-01

    A convenient method for quantitating the sensitivity of large numbers of bacterial strains (presently Corynebacterium diphtheriae) to a Staphylococcus aureus phage type 71 bacteriocin is described. PMID:121117

  11. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, Frank A.

    1982-01-01

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  12. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, F.A.

    1980-12-12

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  13. In vivo osteogenesis assay: a rapid method for quantitative analysis.

    PubMed

    Dennis, J E; Konstantakos, E K; Arm, D; Caplan, A I

    1998-08-01

    A quantitative in vivo osteogenesis assay is a useful tool for the analysis of cells and bioactive factors that affect the amount or rate of bone formation. There are currently two assays in general use for the in vivo assessment of osteogenesis by isolated cells: diffusion chambers and porous calcium phosphate ceramics. Due to the relative ease of specimen preparation and reproducibility of results, the porous ceramic assay was chosen for the development of a rapid method for quantitating in vivo bone formation. The ceramic cube implantation technique consists of combining osteogenic cells with 27-mm3 porous calcium phosphate ceramics, implanting the cell-ceramic composites subcutaneously into an immuno-tolerant host, and, after 2-6 weeks, harvesting and preparing the ceramic implants for histologic analysis. A drawback to the analysis of bone formation within these porous ceramics is that the entire cube must be examined to find small foci of bone present in some samples; a single cross-sectional area is not representative. For this reason, image analysis of serial sections from ceramics is often prohibitively time-consuming. Two alternative scoring methodologies were tested and compared to bone volume measurements obtained by image analysis. The two subjective scoring methods were: (1) Bone Scale: the amount of bone within pores of the ceramic implant is estimated on a scale of 0-4 based on the degree of bone fill (0=no bone, 1=up to 25%, 2=25 to 75%, 4=75 to 100% fill); and (2) Percentage Bone: the amount of bone is estimated by determining the percentage of ceramic pores which contain bone. Every tenth section of serially sectioned cubes was scored by each of these methods under double-blind conditions, and the Bone Scale and Percentage Bone results were directly compared to image analysis measurements from identical samples. Correlation coefficients indicate that the Percentage Bone method was more accurate than the Bone Scale scoring method. The Bone Scale
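
    The "Percentage Bone" scoring and its comparison against image analysis can be illustrated with a small sketch; all pore counts, scores, and volumes below are made up, and only the scoring logic follows the description above.

    ```python
    # Illustrative sketch of the "Percentage Bone" score: for each serially
    # sectioned cube, score the fraction of ceramic pores containing bone, then
    # compare per-cube scores with image-analysis bone-volume measurements.
    # All numbers below are invented for demonstration.
    import numpy as np
    from scipy.stats import pearsonr

    # pores_with_bone[i][k]: bone present (1/0) in pore k of section i of one cube
    pores_with_bone = [
        [1, 0, 0, 1, 0, 0],
        [1, 1, 0, 1, 0, 0],
        [0, 1, 0, 1, 0, 1],
    ]
    percentage_bone = 100 * np.mean([np.mean(sec) for sec in pores_with_bone])
    print(f"Percentage Bone score for this cube: {percentage_bone:.1f}%")

    # Hypothetical per-cube scores vs. image-analysis bone volume (mm^3)
    scores  = np.array([12.0, 35.5, 48.0, 5.0, 60.2])
    volumes = np.array([0.4, 1.1, 1.6, 0.2, 2.0])
    r, p = pearsonr(scores, volumes)
    print(f"correlation with image analysis: r = {r:.2f} (p = {p:.3f})")
    ```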

  14. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve, among other purposes, as an initial assessment of dermal exposure, resulting in a ranking of tasks and subsequently of jobs. DREAM consists of an inventory and evaluation part. Two examples of dermal exposure of workers of a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outside clothing layer as well as on skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine the most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative dermal exposure assessment.

  15. Applying Quantitative Genetic Methods to Primate Social Behavior

    PubMed Central

    Brent, Lauren J. N.

    2013-01-01

    Increasingly, behavioral ecologists have applied quantitative genetic methods to investigate the evolution of behaviors in wild animal populations. The promise of quantitative genetics in unmanaged populations opens the door for simultaneous analysis of inheritance, phenotypic plasticity, and patterns of selection on behavioral phenotypes all within the same study. In this article, we describe how quantitative genetic techniques provide studies of the evolution of behavior with information that is unique and valuable. We outline technical obstacles for applying quantitative genetic techniques that are of particular relevance to studies of behavior in primates, especially those living in noncaptive populations, e.g., the need for pedigree information, non-Gaussian phenotypes, and demonstrate how many of these barriers are now surmountable. We illustrate this by applying recent quantitative genetic methods to spatial proximity data, a simple and widely collected primate social behavior, from adult rhesus macaques on Cayo Santiago. Our analysis shows that proximity measures are consistent across repeated measurements on individuals (repeatable) and that kin have similar mean measurements (heritable). Quantitative genetics may hold lessons of considerable importance for studies of primate behavior, even those without a specific genetic focus. PMID:24659839
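
    A minimal sketch of the repeatability component mentioned above: an intraclass correlation computed from repeated measurements per individual via one-way ANOVA, on synthetic data. Heritability estimation with an "animal model" would additionally require pedigree information and a mixed-model package and is not shown here.

    ```python
    # Repeatability (intraclass correlation) from repeated behavioural measures,
    # estimated with one-way ANOVA mean squares; data are synthetic.
    import numpy as np

    rng = np.random.default_rng(3)
    n_ind, k = 30, 4                                   # individuals, repeats each
    ind_effect = rng.normal(0, 1.0, n_ind)             # consistent individual differences
    data = ind_effect[:, None] + rng.normal(0, 0.7, (n_ind, k))

    grand = data.mean()
    ms_between = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n_ind - 1)
    ms_within = np.sum((data - data.mean(axis=1, keepdims=True)) ** 2) / (n_ind * (k - 1))

    repeatability = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    print(f"repeatability (ICC) ~ {repeatability:.2f}")
    ```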

  16. A Quantitative Vainberg Method for Black Box Scattering

    NASA Astrophysics Data System (ADS)

    Galkowski, Jeffrey

    2017-01-01

    We give a quantitative version of Vainberg's method relating pole free regions to propagation of singularities for black box scatterers. In particular, we show that there is a logarithmic resonance free region near the real axis of size τ with polynomial bounds on the resolvent if and only if the wave propagator gains derivatives at rate τ. Next we show that if there exist singularities in the wave trace at times tending to infinity which smooth at rate τ, then there are resonances in logarithmic strips whose width is given by τ. As our main application of these results, we give sharp bounds on the size of resonance free regions in scattering on geometrically nontrapping manifolds with conic points. Moreover, these bounds are generically optimal on exteriors of nontrapping polygonal domains.

  17. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology.

    PubMed

    Counsell, Alyssa; Cribbie, Robert A; Harlow, Lisa L

    2016-08-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada.

  18. A quantitative dimming method for LED based on PWM

    NASA Astrophysics Data System (ADS)

    Wang, Jiyong; Mou, Tongsheng; Wang, Jianping; Tian, Xiaoqing

    2012-10-01

    Traditional light sources were required to provide stable and uniform illumination for a living or working environment, considering the performance of the visual functions of human beings. This requirement was reasonable until the non-visual functions of the ganglion cells in the photosensitive layer of the retina were discovered. A new generation of lighting technology, however, is emerging, based on novel lighting materials such as LEDs and on the photobiological effects of light on human physiology and behavior. To realize dynamic LED lighting whose intensity and color are adjustable to the needs of photobiological effects, a quantitative dimming method based on Pulse Width Modulation (PWM) and light-mixing technology is presented. Beginning with two-channel PWM, this paper demonstrates the determinacy and limitation of PWM dimming for realizing Expected Photometric and Colorimetric Quantities (EPCQ), based on an analysis of geometrical, photometric, colorimetric and electrodynamic constraints. A quantitative model which maps the EPCQ into duty cycles is finally established. The deduced model suggests that determinacy holds only for two-channel and three-channel PWM, whereas the limitation is common to all multi-channel configurations. To examine the model, a light-mixing experiment with two kinds of white LED simulated the variations of illuminance and Correlated Color Temperature (CCT) from dawn to midday. Mean deviations between theoretical and measured values were 15 lx and 23 K, respectively. The results show that this method can effectively realize a light spectrum with a specific requirement of EPCQ, and provide a theoretical basis and a practical way for dynamic LED lighting.
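
    A hedged sketch of the two-channel case: because tristimulus values mix linearly with duty cycle, a target luminance and chromaticity can be mapped to duty cycles by solving a small linear system. The channel data below are invented; an exact solution exists only when the target chromaticity lies on the line joining the two channels' chromaticities, which mirrors the determinacy/limitation point made in the abstract.

    ```python
    # Two-channel PWM colour mixing sketch: tristimulus values add linearly with
    # duty cycle, so a target (luminance, chromaticity) maps to duty cycles via a
    # least-squares solve. Channel outputs are hypothetical, not measured data.
    import numpy as np

    def xyY_to_XYZ(x, y, Y):
        return np.array([x / y * Y, Y, (1 - x - y) / y * Y])

    # Full-duty tristimulus output of two hypothetical white LED channels
    warm = xyY_to_XYZ(0.44, 0.40, 300.0)   # warm-white channel at 100 % duty
    cool = xyY_to_XYZ(0.31, 0.33, 500.0)   # cool-white channel at 100 % duty

    target = xyY_to_XYZ(0.38, 0.37, 450.0) # desired mixed output

    A = np.column_stack([warm, cool])      # 3 equations, 2 unknown duty cycles
    duty, residual, *_ = np.linalg.lstsq(A, target, rcond=None)
    duty = np.clip(duty, 0.0, 1.0)         # physically realisable PWM range
    print(f"warm duty = {duty[0]:.2f}, cool duty = {duty[1]:.2f}")
    ```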

  19. Wave propagation models for quantitative defect detection by ultrasonic methods

    NASA Astrophysics Data System (ADS)

    Srivastava, Ankit; Bartoli, Ivan; Coccia, Stefano; Lanza di Scalea, Francesco

    2008-03-01

    Ultrasonic guided wave testing necessitates of quantitative, rather than qualitative, information on flaw size, shape and position. This quantitative diagnosis ability can be used to provide meaningful data to a prognosis algorithm for remaining life prediction, or simply to generate data sets for a statistical defect classification algorithm. Quantitative diagnostics needs models able to represent the interaction of guided waves with various defect scenarios. One such model is the Global-Local (GL) method, which uses a full finite element discretization of the region around a flaw to properly represent wave diffraction, and a suitable set of wave functions to simulate regions away from the flaw. Displacement and stress continuity conditions are imposed at the boundary between the global and the local regions. In this paper the GL method is expanded to take advantage of the Semi-Analytical Finite Element (SAFE) method in the global portion of the waveguide. The SAFE method is efficient because it only requires the discretization of the cross-section of the waveguide to obtain the wave dispersion solutions and it can handle complex structures such as multilayered sandwich panels. The GL method is applied to predicting quantitatively the interaction of guided waves with defects in aluminum and composites structural components.

  20. Gap analysis: Concepts, methods, and recent results

    USGS Publications Warehouse

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  1. Calibration of qualitative HBsAg assay results for quantitative HBsAg monitoring.

    PubMed

    Gunning, Hans; Adachi, Dena; Tang, Julian W

    2014-10-01

    Evidence is accumulating that quantitative hepatitis B surface antigen (HBsAg) monitoring may be useful in managing patients with chronic HBV infection on certain treatment regimens. Based on paired results obtained with the Abbott Architect qualitative and quantitative HBsAg assays, it seems feasible to convert qualitative to quantitative HBsAg values for this purpose.
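
    One plausible way to implement such a conversion is a log-log calibration of qualitative signal-to-cutoff (S/CO) values against quantitative HBsAg concentrations measured on paired samples; the sketch below assumes that model and uses invented numbers, not the study's data.

    ```python
    # Hypothetical calibration of qualitative assay signals (S/CO) against
    # quantitative HBsAg results (IU/mL) via a log-log linear fit.
    import numpy as np

    s_co  = np.array([50, 120, 300, 800, 2500, 6000])   # qualitative signal/cutoff
    iu_ml = np.array([10, 35, 90, 300, 1100, 3200])     # quantitative HBsAg, IU/mL

    slope, intercept = np.polyfit(np.log10(s_co), np.log10(iu_ml), 1)

    def qualitative_to_quantitative(signal_to_cutoff):
        """Estimate HBsAg (IU/mL) from a qualitative S/CO value via the fitted line."""
        return 10 ** (intercept + slope * np.log10(signal_to_cutoff))

    print(f"S/CO of 1000 -> ~{qualitative_to_quantitative(1000):.0f} IU/mL (fitted)")
    ```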

  2. A quantitative assessment method for Ascaris eggs on hands.

    PubMed

    Jeandron, Aurelie; Ensink, Jeroen H J; Thamsborg, Stig M; Dalsgaard, Anders; Sengupta, Mita E

    2014-01-01

    The importance of hands in the transmission of soil transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt] and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed that validate the hand washing method under field conditions, e.g. including people of different ages, lower levels of contamination and various levels of hand cleanliness.
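
    The recovery-rate arithmetic behind the comparison is simple; in the sketch below the egg counts are invented so that the rates reproduce the 95.6% and 82.7% figures quoted above.

    ```python
    # Recovery rate = eggs recovered / eggs seeded, per wash condition.
    # Counts are hypothetical; only the resulting percentages match the abstract.
    seeded = 1000
    recovered = {"7X 1%": 956, "de-ionized water": 827}

    for detergent, count in recovered.items():
        rate = 100 * count / seeded
        print(f"{detergent}: recovery rate = {rate:.1f}%")
    ```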

  3. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  4. A Novel Targeted Learning Method for Quantitative Trait Loci Mapping

    PubMed Central

    Wang, Hui; Zhang, Zhongyang; Rose, Sherri; van der Laan, Mark

    2014-01-01

    We present a novel semiparametric method for quantitative trait loci (QTL) mapping in experimental crosses. Conventional genetic mapping methods typically assume parametric models with Gaussian errors and obtain parameter estimates through maximum-likelihood estimation. In contrast with univariate regression and interval-mapping methods, our model requires fewer assumptions and also accommodates various machine-learning algorithms. Estimation is performed with targeted maximum-likelihood learning methods. We demonstrate our semiparametric targeted learning approach in a simulation study and a well-studied barley data set. PMID:25258376

  5. A novel targeted learning method for quantitative trait loci mapping.

    PubMed

    Wang, Hui; Zhang, Zhongyang; Rose, Sherri; van der Laan, Mark

    2014-12-01

    We present a novel semiparametric method for quantitative trait loci (QTL) mapping in experimental crosses. Conventional genetic mapping methods typically assume parametric models with Gaussian errors and obtain parameter estimates through maximum-likelihood estimation. In contrast with univariate regression and interval-mapping methods, our model requires fewer assumptions and also accommodates various machine-learning algorithms. Estimation is performed with targeted maximum-likelihood learning methods. We demonstrate our semiparametric targeted learning approach in a simulation study and a well-studied barley data set.

  6. Quantitative results of stellar evolution and pulsation theories.

    NASA Technical Reports Server (NTRS)

    Fricke, K.; Stobie, R. S.; Strittmatter, P. A.

    1971-01-01

    The discrepancy between the masses of Cepheid variables deduced from evolution theory and pulsation theory is examined. The effect of input physics on evolutionary tracks is first discussed; in particular, changes in the opacity are considered. The sensitivity of pulsation masses to opacity changes and to the ascribed values of luminosity and effective temperature are then analyzed. The Cepheid mass discrepancy is discussed in the light of the results already obtained. Other astronomical evidence, including the mass-luminosity relation for main sequence stars, the solar neutrino flux, and cluster ages are also considered in an attempt to determine the most likely source of error in the event that substantial mass loss has not occurred.

  7. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods on the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment, and the main window of the program is shown during dynamic analysis of the foot thermal image.

  8. Quantitative method of measuring cancer cell urokinase and metastatic potential

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  9. Testing of flat optical surfaces by the quantitative Foucault method.

    PubMed

    Simon, M C; Simon, J M

    1978-01-01

    The complete theory of measurement of optical flat mirrors of circular or elliptical shape using the quantitative Foucault method is described here. It has been used in Córdoba since 1939 in a partially intuitive but correct form. The surface, not yet flat and, at times, astigmatic, is assimilated to the sum of a spherical plus a cylindrical dome. The errors of the three possible ways of reckoning are calculated.

  10. Method for depth-resolved quantitation of optical properties in layered media using spatially modulated quantitative spectroscopy.

    PubMed

    Saager, Rolf B; Truong, Alex; Cuccia, David J; Durkin, Anthony J

    2011-07-01

    We have demonstrated that spatially modulated quantitative spectroscopy (SMoQS) is capable of extracting absolute optical properties from homogeneous tissue simulating phantoms that span both the visible and near-infrared wavelength regimes. However, biological tissue, such as skin, is highly structured, presenting challenges to quantitative spectroscopic techniques based on homogeneous models. In order to more accurately address the challenges associated with skin, we present a method for depth-resolved optical property quantitation based on a two layer model. Layered Monte Carlo simulations and layered tissue simulating phantoms are used to determine the efficacy and accuracy of SMoQS to quantify layer specific optical properties of layered media. Initial results from both the simulation and experiment show that this empirical method is capable of determining top layer thickness to within tens of microns across a physiological range for skin. Layer specific chromophore concentration can be determined to within ±10% of the actual values, on average, whereas bulk quantitation in either the visible or near infrared spectroscopic regime significantly underestimates the layer specific chromophore concentration and can be confounded by top layer thickness.

  11. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a certain disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies.

  12. Analytical methods for quantitation of prenylated flavonoids from hops.

    PubMed

    Nikolić, Dejan; van Breemen, Richard B

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach.

  13. Analytical methods for quantitation of prenylated flavonoids from hops

    PubMed Central

    Nikolić, Dejan; van Breemen, Richard B.

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106

  14. Synthesizing Regression Results: A Factored Likelihood Method

    ERIC Educational Resources Information Center

    Wu, Meng-Jia; Becker, Betsy Jane

    2013-01-01

    Regression methods are widely used by researchers in many fields, yet methods for synthesizing regression results are scarce. This study proposes using a factored likelihood method, originally developed to handle missing data, to appropriately synthesize regression models involving different predictors. This method uses the correlations reported…

  15. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis method, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or not possible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance with technical, radiological, and statistical experts developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
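
    For orientation, here is a small sketch of three of the metrology quantities discussed above: bias, linearity expressed as a regression of measured against true values, and a repeatability coefficient taken as 2.77 times the within-subject standard deviation. The test-retest values are made up and the exact estimators used in practice may differ.

    ```python
    # Illustrative computation of bias, linearity, and a repeatability coefficient
    # on made-up test-retest phantom data.
    import numpy as np

    truth     = np.array([10.0, 20.0, 40.0, 80.0])   # known reference values
    measured1 = np.array([11.2, 21.5, 41.0, 78.9])   # test
    measured2 = np.array([10.6, 20.9, 42.1, 80.4])   # retest

    bias = np.mean(np.concatenate([measured1, measured2]) - np.tile(truth, 2))

    # Linearity: slope/intercept of measured vs. true values (ideal: slope 1, intercept 0)
    slope, intercept = np.polyfit(np.tile(truth, 2),
                                  np.concatenate([measured1, measured2]), 1)

    # Repeatability coefficient: 2.77 x within-subject standard deviation
    wsd = np.sqrt(np.mean((measured1 - measured2) ** 2) / 2.0)
    rc = 2.77 * wsd
    print(f"bias = {bias:.2f}, slope = {slope:.2f}, intercept = {intercept:.2f}, RC = {rc:.2f}")
    ```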

  16. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  17. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.

  18. Quantitative analytical method to evaluate the metabolism of vitamin D.

    PubMed

    Mena-Bravo, A; Ferreiro-Vera, C; Priego-Capote, F; Maestro, M A; Mouriño, A; Quesada-Gómez, J M; Luque de Castro, M D

    2015-03-10

    A method for quantitative analysis of vitamin D (both D2 and D3) and its main metabolites - monohydroxylated vitamin D (25-hydroxyvitamin D2 and 25-hydroxyvitamin D3) and dihydroxylated metabolites (1,25-dihydroxyvitamin D2, 1,25-dihydroxyvitamin D3 and 24,25-dihydroxyvitamin D3) - in human serum is reported here. The method is based on direct analysis of serum by an automated platform involving on-line coupling of a solid-phase extraction workstation to a liquid chromatograph-tandem mass spectrometer. Detection of the seven analytes was carried out in the selected reaction monitoring (SRM) mode, and quantitative analysis was supported by the use of stable isotopically labeled internal standards (SIL-ISs). The detection limits ranged from 0.3 to 75 pg/mL for the target compounds, while precision (expressed as relative standard deviation) was below 13.0% for between-day variability. The method was externally validated according to the vitamin D External Quality Assurance Scheme (DEQAS) through the analysis of ten serum samples provided by this organization. The analytical features of the method support its applicability in nutritional and clinical studies targeted at elucidating the role of vitamin D metabolism.
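
    A minimal sketch of the stable-isotope-dilution quantitation step such an LC-MS/MS assay relies on: analyte/internal-standard peak-area ratios are regressed against calibrator concentrations and an unknown is back-calculated. All areas and concentrations are illustrative, not the method's validation data.

    ```python
    # Internal-standard calibration: response ratio vs. concentration, then
    # back-calculation of an unknown sample. Numbers are invented.
    import numpy as np

    calib_conc   = np.array([5.0, 10.0, 25.0, 50.0, 100.0])      # 25(OH)D3, ng/mL
    analyte_area = np.array([5200, 10100, 26300, 50800, 101500])
    is_area      = np.array([20000, 19800, 20500, 20100, 19900])  # SIL-IS peak areas

    ratios = analyte_area / is_area
    slope, intercept = np.polyfit(calib_conc, ratios, 1)

    unknown_ratio = 31500 / 20200          # sample analyte area / sample IS area
    conc = (unknown_ratio - intercept) / slope
    print(f"estimated concentration ~ {conc:.1f} ng/mL")
    ```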

  19. A New Kinetic Spectrophotometric Method for the Quantitation of Amorolfine.

    PubMed

    Soto, César; Poza, Cristian; Contreras, David; Yáñez, Jorge; Nacaratte, Fallon; Toral, M Inés

    2017-01-01

    Amorolfine (AOF) is a compound with fungicide activity based on the dual inhibition of growth of the fungal cell membrane, the biosynthesis and accumulation of sterols, and the reduction of ergosterol. In this work a sensitive kinetic and spectrophotometric method for the AOF quantitation based on the AOF oxidation by means of KMnO4 at 30 min (fixed time), pH alkaline, and ionic strength controlled was developed. Measurements of changes in absorbance at 610 nm were used as criterion of the oxidation progress. In order to maximize the sensitivity, different experimental reaction parameters were carefully studied via factorial screening and optimized by multivariate method. The linearity, intraday, and interday assay precision and accuracy were determined. The absorbance-concentration plot corresponding to tap water spiked samples was rectilinear, over the range of 7.56 × 10(-6)-3.22 × 10(-5) mol L(-1), with detection and quantitation limits of 2.49 × 10(-6) mol L(-1) and 7.56 × 10(-6) mol L(-1), respectively. The proposed method was successfully validated for the application of the determination of the drug in the spiked tap water samples and the percentage recoveries were 94.0-105.0%. The method is simple and does not require expensive instruments or complicated extraction steps of the reaction product.
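
    The detection and quantitation limits quoted above can be reproduced in form (not in value) with a standard calibration sketch, assuming the common 3.3·s/slope and 10·s/slope definitions; the absorbance data below are invented.

    ```python
    # Fixed-time calibration sketch: absorbance change at 610 nm vs. concentration,
    # with LOD = 3.3*s/slope and LOQ = 10*s/slope (s = SD of blank readings).
    import numpy as np

    conc = np.array([0.8e-5, 1.2e-5, 1.8e-5, 2.4e-5, 3.0e-5])   # mol/L
    dA   = np.array([0.062, 0.095, 0.139, 0.188, 0.231])        # delta absorbance
    blank_readings = np.array([0.004, 0.006, 0.003, 0.005, 0.007, 0.004])

    slope, intercept = np.polyfit(conc, dA, 1)
    s_blank = blank_readings.std(ddof=1)

    lod = 3.3 * s_blank / slope
    loq = 10.0 * s_blank / slope
    print(f"LOD ~ {lod:.2e} mol/L, LOQ ~ {loq:.2e} mol/L")
    ```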

  20. A New Kinetic Spectrophotometric Method for the Quantitation of Amorolfine

    PubMed Central

    Poza, Cristian; Contreras, David; Yáñez, Jorge; Nacaratte, Fallon; Toral, M. Inés

    2017-01-01

    Amorolfine (AOF) is a compound with fungicide activity based on the dual inhibition of growth of the fungal cell membrane, the biosynthesis and accumulation of sterols, and the reduction of ergosterol. In this work a sensitive kinetic and spectrophotometric method for the AOF quantitation based on the AOF oxidation by means of KMnO4 at 30 min (fixed time), pH alkaline, and ionic strength controlled was developed. Measurements of changes in absorbance at 610 nm were used as criterion of the oxidation progress. In order to maximize the sensitivity, different experimental reaction parameters were carefully studied via factorial screening and optimized by multivariate method. The linearity, intraday, and interday assay precision and accuracy were determined. The absorbance-concentration plot corresponding to tap water spiked samples was rectilinear, over the range of 7.56 × 10−6–3.22 × 10−5 mol L−1, with detection and quantitation limits of 2.49 × 10−6 mol L−1 and 7.56 × 10−6 mol L−1, respectively. The proposed method was successfully validated for the application of the determination of the drug in the spiked tap water samples and the percentage recoveries were 94.0–105.0%. The method is simple and does not require expensive instruments or complicated extraction steps of the reaction product. PMID:28348920

  1. Quantitative cell imaging using single beam phase retrieval method

    NASA Astrophysics Data System (ADS)

    Anand, Arun; Chhaniwal, Vani; Javidi, Bahram

    2011-06-01

    Quantitative three-dimensional imaging of cells can provide important information about their morphology as well as their dynamics, which is useful in studying their behavior under various conditions. There are several microscopic techniques for imaging unstained, semi-transparent specimens by converting phase information into intensity information. However, most quantitative phase contrast imaging techniques are realized either by using interference of the object wavefront with a known reference beam or by using phase shifting interferometry. A two-beam interferometric method is challenging to implement, especially with low-coherence sources, and it also requires fine adjustment of the beams to achieve high-contrast fringes. In this letter, the development of a single beam phase retrieval microscopy technique for quantitative phase contrast imaging of cells, using multiple intensity samplings of a volume speckle field in the axial direction, is described. Single beam illumination with multiple intensity samplings provides fast convergence and a unique solution for the object wavefront. Three-dimensional thickness profiles of different cells, such as red blood cells and onion skin cells, were reconstructed using this technique with an axial resolution of the order of several nanometers.
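
    A compact sketch of the single-beam, multi-plane idea: intensities recorded at several axial positions are imposed in turn while the field is numerically propagated between planes with the angular-spectrum method. The geometry, wavelength, iteration count, and the synthetic phase object standing in for a cell are all assumptions for illustration, not the authors' parameters.

    ```python
    # Multi-plane phase retrieval sketch: enforce measured amplitudes at several
    # axial planes while propagating with the angular-spectrum method.
    import numpy as np

    N, dx, wl = 128, 4e-6, 633e-9            # grid size, pixel pitch (m), wavelength (m)
    dz_list = [0.0, 1e-3, 2e-3, 3e-3]        # axial positions of recorded intensities

    def propagate(field, dz):
        """Angular-spectrum propagation of a sampled complex field by distance dz."""
        fx = np.fft.fftfreq(N, d=dx)
        FX, FY = np.meshgrid(fx, fx)
        arg = np.maximum(1.0 - (wl * FX) ** 2 - (wl * FY) ** 2, 0.0)
        kz = 2 * np.pi / wl * np.sqrt(arg)
        return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * dz))

    # Synthetic object: a smooth phase bump standing in for a transparent cell.
    y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2] * dx
    true_phase = 1.5 * np.exp(-(x ** 2 + y ** 2) / (2 * (40 * dx) ** 2))
    obj = np.exp(1j * true_phase)
    stack = [np.abs(propagate(obj, dz)) ** 2 for dz in dz_list]  # "recorded" intensities

    # Iterative retrieval: cycle through the planes, replacing amplitudes.
    field = np.sqrt(stack[0]).astype(complex)     # start with zero phase at plane 0
    for _ in range(50):
        for i in range(1, len(dz_list)):
            field = propagate(field, dz_list[i] - dz_list[i - 1])
            field = np.sqrt(stack[i]) * np.exp(1j * np.angle(field))
        field = propagate(field, dz_list[0] - dz_list[-1])       # back to plane 0
        field = np.sqrt(stack[0]) * np.exp(1j * np.angle(field))

    # Compare retrieved and true phase up to an arbitrary global phase offset.
    ret = np.angle(field)
    err = np.abs((ret - ret.mean()) - (true_phase - true_phase.mean())).mean()
    print(f"mean absolute phase error after retrieval: {err:.3f} rad")
    ```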

  2. Thermography as a quantitative imaging method for assessing postoperative inflammation

    PubMed Central

    Christensen, J; Matzen, LH; Vaeth, M; Schou, S; Wenzel, A

    2012-01-01

    Objective To assess differences in skin temperature between the operated and control side of the face after mandibular third molar surgery using thermography. Methods 127 patients had 1 mandibular third molar removed. Before the surgery, standardized thermograms were taken of both sides of the patient's face using a Flir ThermaCam™ E320 (Precisions Teknik AB, Halmstad, Sweden). The imaging procedure was repeated 2 days and 7 days after surgery. A region of interest including the third molar region was marked on each image. The mean temperature within each region of interest was calculated. The difference between sides and over time were assessed using paired t-tests. Results No significant difference was found between the operated side and the control side either before or 7 days after surgery (p > 0.3). The temperature of the operated side (mean: 32.39 °C, range: 28.9–35.3 °C) was higher than that of the control side (mean: 32.06 °C, range: 28.5–35.0 °C) 2 days after surgery [0.33 °C, 95% confidence interval (CI): 0.22–0.44 °C, p < 0.001]. No significant difference was found between the pre-operative and the 7-day post-operative temperature (p > 0.1). After 2 days, the operated side was not significantly different from the temperature pre-operatively (p = 0.12), whereas the control side had a lower temperature (0.57 °C, 95% CI: 0.29–0.86 °C, p < 0.001). Conclusions Thermography seems useful for quantitative assessment of inflammation between the intervention side and the control side after surgical removal of mandibular third molars. However, thermography cannot be used to assess absolute temperature changes due to normal variations in skin temperature over time. PMID:22752326
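
    The core comparison above (mean ROI temperature on the operated versus control side, tested with a paired t-test) can be sketched as follows; the simulated temperatures are seeded from the means reported in the abstract but are otherwise invented.

    ```python
    # Paired comparison of mean ROI skin temperature, operated vs. control side.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_patients = 20
    control  = rng.normal(32.06, 0.9, n_patients)             # control side, deg C
    operated = control + rng.normal(0.33, 0.25, n_patients)   # operated side, day 2

    t, p = stats.ttest_rel(operated, control)
    diff = operated - control
    print(f"mean difference = {diff.mean():.2f} degC, t = {t:.2f}, p = {p:.4f}")
    ```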

  3. Biological characteristics of crucian by quantitative inspection method

    NASA Astrophysics Data System (ADS)

    Chu, Mengqi

    2015-04-01

    Through a quantitative inspection method, the biological characteristics of the crucian carp were preliminarily studied. The crucian carp (Carassius auratus, family Cyprinidae, order Cypriniformes) is a mainly plant-eating omnivorous fish that is gregarious and widely distributed, with year-round production throughout the country. Indicators of the crucian carp were determined in the experiment to understand its growth and reproduction in this area. The measured data (such as scale length, scale size and annulus diameter) and the related functions were used to back-calculate the growth of the crucian carp at any given age. Maturity was determined from egg shape, color and weight, and the relative and absolute fecundity were calculated from the mean egg diameter of 20 eggs and the number of eggs per 0.5 g. The measured crucian carp were females past puberty. Based on the scale measurements, a linear relationship between scale diameter and body length was obtained: y = 1.530 + 3.0649x. The data show that fecundity is closely related to age: the older the fish, the more mature the gonads and the greater the number of eggs; absolute fecundity also increases with the development of the pituitary gland. Quantitative examination of the food items ingested by the crucian carp reveals its main foods, secondary foods and incidental foods, and indicates the degree to which the fish prefers the various bait organisms. Fecundity increases with weight gain and has characteristics of the species and population; it is also influenced by individual age, body length, body weight, environmental conditions (especially nutritional conditions), breeding habits, spawning times and egg size. This series of studies of the biological characteristics of the crucian carp provides an ecological basis for local crucian carp feeding and breeding

  4. [Quantitative and qualitative research methods, can they coexist yet?].

    PubMed

    Hunt, Elena; Lavoie, Anne-Marise

    2011-06-01

    Qualitative design is gaining ground in nursing research. In spite of this relative progress, however, the evidence-based practice movement continues to dominate and to emphasize the exclusive value of quantitative design (particularly that of randomized clinical trials) for clinical decision making. In the current context, which is convenient for those in power making utilitarian decisions, and in the face of nursing criticism of the establishment in favor of qualitative research, it is difficult to choose a practical and ethical path that values the nursing role within the health care system while keeping us committed to quality care and maintaining the researcher's integrity. Both qualitative and quantitative methods have advantages and disadvantages, and clearly, neither of them can, by itself, capture, describe and explain reality adequately. Therefore, a balance between the two methods is needed. Researchers bear a responsibility to society and to science, and they should opt for the design best suited to answering the research question, not promote the design favored by research funding distributors.

  5. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
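
    A small numerical sketch of the FOSM step described above: the model's Jacobian with respect to the uncertain inputs is estimated by finite differences and used to propagate the input covariance to an output covariance. The two-parameter "model" is a toy stand-in, not MODFLOW.

    ```python
    # First-Order Second Moment (FOSM) propagation: Cov_out = J Cov_in J^T,
    # with the Jacobian J estimated by central finite differences.
    import numpy as np

    def head_model(params):
        """Toy model: piezometric heads at two locations as a function of two log-K values."""
        k1, k2 = params
        return np.array([100.0 - 3.0 * k1 - 1.0 * k2,
                         98.0 - 1.5 * k1 - 2.5 * k2])

    params = np.array([2.0, 1.0])                 # mean log-conductivity values
    cov_in = np.array([[0.25, 0.05],
                       [0.05, 0.40]])             # input covariance (assumed)

    eps = 1e-4
    J = np.column_stack([
        (head_model(params + eps * e) - head_model(params - eps * e)) / (2 * eps)
        for e in np.eye(len(params))
    ])

    cov_out = J @ cov_in @ J.T                    # FOSM output covariance
    print("head variances:", np.diag(cov_out))
    ```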

  6. Novel method for the quantitative measurement of color vision deficiencies

    NASA Astrophysics Data System (ADS)

    Xiong, Kai; Hou, Minxian; Ye, Guanrong

    2005-01-01

    The method is based on chromatic visual evoked potential (VEP) measurements. The equiluminance of a color stimulus in normal subjects is characterized by L-cone and M-cone activation in the retina. For deuteranopes and protanopes, only the activation of the one relevant remaining cone type needs to be considered. An equiluminance tuning curve was established from the VEPs recorded while the luminance of the red and green color stimuli was varied, and the position of the equiluminance point was used to define the kind and degree of color vision deficiency. In a test of 47 volunteers, the VEP traces and equiluminance tuning curves were obtained, and the results were in accordance with the judgments made with the pseudoisochromatic plates used in the clinic. The method fulfills the requirements for an objective and quantitative test of color vision deficiencies.

  7. SWECS tower dynamics analysis methods and results

    NASA Technical Reports Server (NTRS)

    Wright, A. D.; Sexton, J. H.; Butterfield, C. P.; Thresher, R. M.

    1981-01-01

    Several different tower dynamics analysis methods and computer codes were used to determine the natural frequencies and mode shapes of both guyed and freestanding wind turbine towers. These analysis methods are described and the results for two types of towers, a guyed tower and a freestanding tower, are shown. The advantages and disadvantages in the use of and the accuracy of each method are also described.

  8. Quantitative Methods in the Study of Local History

    ERIC Educational Resources Information Center

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment rolls, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  9. Quantitative analysis of rib kinematics based on dynamic chest bone images: preliminary results.

    PubMed

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-04-01

    An image-processing technique for separating bones from soft tissue in static chest radiographs has been developed. The present study was performed to evaluate the usefulness of dynamic bone images in quantitative analysis of rib movement. Dynamic chest radiographs of 16 patients were obtained using a dynamic flat-panel detector and processed to create bone images by using commercial software (Clear Read BS, Riverain Technologies). Velocity vectors were measured in local areas on the dynamic images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as a reduced rib velocity field, resulting in an asymmetrical distribution of rib movement. Vector maps in all normal cases exhibited left/right symmetric distributions of the velocity field, whereas those in abnormal cases showed asymmetric distributions because of locally limited rib movements. Dynamic bone images were useful for accurate quantitative analysis of rib movements. The present method has a potential for an additional functional examination in chest radiography.
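
    A hedged sketch of the velocity-map idea: local displacement between two consecutive bone-image frames is estimated by exhaustive block matching and converted to velocity using an assumed frame rate and pixel spacing. The synthetic frames, frame rate, and pixel size are illustrative, and the commercial bone-suppression step is not reproduced.

    ```python
    # Block-matching displacement of a local region between two frames, converted
    # to a velocity; the two synthetic frames shift a bright band down by 2 pixels.
    import numpy as np

    def block_match(prev, curr, y, x, half=8, search=4):
        """Return (dy, dx) displacement of the block centred at (y, x)."""
        tpl = prev[y - half:y + half, x - half:x + half]
        best, best_dyx = np.inf, (0, 0)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                cand = curr[y - half + dy:y + half + dy, x - half + dx:x + half + dx]
                score = np.sum((cand - tpl) ** 2)
                if score < best:
                    best, best_dyx = score, (dy, dx)
        return best_dyx

    frame_rate = 15.0                                  # frames per second (assumed)
    pixel_mm = 0.4                                     # mm per pixel (assumed)
    prev = np.zeros((64, 64)); prev[20:24, :] = np.linspace(0.5, 1.5, 64)  # "rib" band
    curr = np.zeros((64, 64)); curr[22:26, :] = np.linspace(0.5, 1.5, 64)  # moved 2 px

    dy, dx = block_match(prev, curr, y=24, x=32)
    vy = dy * pixel_mm * frame_rate                    # mm/s, positive downwards
    print(f"local displacement = ({dy}, {dx}) px, vertical velocity = {vy:.1f} mm/s")
    ```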

  10. Experimental demonstration of quantitation errors in MR spectroscopy resulting from saturation corrections under changing conditions

    NASA Astrophysics Data System (ADS)

    Galbán, Craig J.; Ellis, Scott J.; Spencer, Richard G. S.

    2003-04-01

    Metabolite concentration measurements in in vivo NMR are generally performed under partially saturated conditions, with correction for partial saturation performed after data collection using a measured saturation factor. Here, we present an experimental test of the hypothesis that quantitation errors can occur due to application of such saturation factor corrections in changing systems. Thus, this extends our previous theoretical work on quantitation errors due to varying saturation factors. We obtained results for two systems frequently studied by 31P NMR, the ischemic rat heart and the electrically stimulated rat gastrocnemius muscle. The results are interpreted in light of previous theoretical work which defined the degree of saturation occurring in a one-pulse experiment for a system with given spin-lattice relaxation times, T1s, equilibrium magnetizations, M0s, and reaction rates. We found that (i) the assumption of constancy of saturation factors leads to quantitation errors on the order of 40% in inorganic phosphate; (ii) the dominant contributor to the quantitation errors in inorganic phosphate is most likely changes in T1; (iii) T1 and M0 changes between control and intervention periods, and chemical exchange contribute to different extents to quantitation errors in phosphocreatine and γ-ATP; (iv) relatively small increases in interpulse delay substantially decreased quantitation errors for metabolites in ischemic rat hearts; (v) random error due to finite SNR led to approximately 4% error in quantitation, and hence was a substantially smaller contributor than were changes in saturation factors.
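
    A worked numerical sketch of the effect under discussion, assuming 90° pulses, no chemical exchange, and a simple one-pulse saturation factor 1 − exp(−TR/T1): correcting an intervention-period signal with a control-period saturation factor biases the result when T1 changes. The TR and T1 values are illustrative only, not the paper's.

    ```python
    # Quantitation error from applying a "stale" saturation factor after T1 changes.
    import numpy as np

    TR = 1.0                                   # repetition time, s
    T1_control, T1_ischemia = 4.0, 2.5         # hypothetical Pi T1 values, s

    def saturation_factor(T1):
        return 1.0 - np.exp(-TR / T1)          # fraction of M0 observed per 90-deg scan

    M0 = 1.0                                   # true (fully relaxed) magnetization
    signal_ischemia = M0 * saturation_factor(T1_ischemia)

    # Correction applied with the control-period saturation factor:
    M0_estimated = signal_ischemia / saturation_factor(T1_control)
    error_pct = 100 * (M0_estimated - M0) / M0
    print(f"quantitation error from a stale saturation factor: {error_pct:+.1f}%")
    ```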

  11. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews published by Wiley Periodicals, Inc. Mass Spec Rev 34:148–165, 2015. PMID:24889823

  12. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. Polylines were used to mark the area around the entire sample (marked A) and to outline each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula N = Bt × 100 / At, where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples, and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimating the extent of bleeding in tissue samples.
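
    A minimal sketch of the reported arithmetic, with hypothetical AutoCAD-derived area values; only the N = Bt × 100 / At formula is taken from the abstract.

```python
def bleeding_percentage(bleeding_areas, total_sample_areas):
    """Percentage of tissue surface affected by bleeding, following
    N = Bt * 100 / At, where Bt is the summed bleeding ('polyline B')
    area and At the summed whole-sample ('polyline A') area."""
    Bt = sum(bleeding_areas)
    At = sum(total_sample_areas)
    return Bt * 100.0 / At

# Hypothetical areas measured in AutoCAD (units are arbitrary but consistent).
A_areas = [12.4, 10.8, 11.9]   # whole-sample area per microphotograph
B_areas = [1.1, 0.4, 0.9]      # summed bleeding areas per microphotograph
print(f"N = {bleeding_percentage(B_areas, A_areas):.1f}% of tissue surface affected")
```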

  13. Quantitative methods in electroencephalography to access therapeutic response.

    PubMed

    Diniz, Roseane Costa; Fontenele, Andrea Martins Melo; Carmo, Luiza Helena Araújo do; Ribeiro, Aurea Celeste da Costa; Sales, Fábio Henrique Silva; Monteiro, Sally Cristina Moutinho; Sousa, Ana Karoline Ferreira de Castro

    2016-07-01

    Pharmacometrics, or quantitative pharmacology, aims to quantitatively analyze the interaction between drugs and patients, resting on the tripod of pharmacokinetics, pharmacodynamics and disease monitoring to identify variability in drug response. As a subject of central interest in the training of pharmacists, this work was carried out with a view to promoting this idea through methods to assess the therapeutic response of drugs with central action. This paper discusses quantitative methods (Fast Fourier Transform, Magnitude Square Coherence, Conditional Entropy, Generalised Linear semi-canonical Correlation Analysis, Statistical Parametric Network and Mutual Information Function) used to evaluate EEG signals obtained after drug administration regimens, the main findings and their clinical relevance, pointing to them as a contribution to the construction of a different pharmaceutical practice. Peter Anderer et al. in 2000 showed the effect of 20 mg of buspirone in 20 healthy subjects at 1, 2, 4, 6 and 8 h after oral ingestion of the drug. The areas of increased power of the theta frequency occurred mainly in the temporo-occipital-parietal region. Sampaio et al. (2007) showed that the use of bromazepam, which allows the release of GABA (gamma-aminobutyric acid), an inhibitory neurotransmitter of the central nervous system, could theoretically promote dissociation of cortical functional areas, a decrease of functional connectivity and a decrease of cognitive functions by means of smaller coherence values (an electrophysiological magnitude measured from the EEG by software). Ahmad Khodayari-Rostamabad et al. in 2015 reported that such a measure could potentially be a useful clinical tool to assess adverse effects of opioids and hence give rise to treatment guidelines. A relation was found between changes in pain intensity and brain sources (at maximum activity locations) during remifentanil infusion despite its potent analgesic effect. The statement of mathematical and computational

  14. Methods for Quantitative Interpretation of Retarding Field Analyzer Data

    SciTech Connect

    Calvey, J.R.; Crittenden, J.A.; Dugan, G.F.; Palmer, M.A.; Furman, M.; Harkay, K.

    2011-03-28

    Over the course of the CesrTA program at Cornell, over 30 Retarding Field Analyzers (RFAs) have been installed in the CESR storage ring, and a great deal of data has been taken with them. These devices measure the local electron cloud density and energy distribution, and can be used to evaluate the efficacy of different cloud mitigation techniques. Obtaining a quantitative understanding of RFA data requires use of cloud simulation programs, as well as a detailed model of the detector itself. In a drift region, the RFA can be modeled by postprocessing the output of a simulation code, and one can obtain best fit values for important simulation parameters with a chi-square minimization method.

  15. Reproducibility of CSF quantitative culture methods for estimating rate of clearance in cryptococcal meningitis.

    PubMed

    Dyal, Jonathan; Akampurira, Andrew; Rhein, Joshua; Morawski, Bozena M; Kiggundu, Reuben; Nabeta, Henry W; Musubire, Abdu K; Bahr, Nathan C; Williams, Darlisha A; Bicanic, Tihana; Larsen, Robert A; Meya, David B; Boulware, David R

    2016-05-01

    Quantitative cerebrospinal fluid (CSF) cultures provide a measure of disease severity in cryptococcal meningitis. The fungal clearance rate by quantitative cultures has become a primary endpoint for phase II clinical trials. This study determined the inter-assay accuracy of three different quantitative culture methodologies. Among 91 participants with meningitis symptoms in Kampala, Uganda, during August-November 2013, 305 CSF samples were prospectively collected from patients at multiple time points during treatment. Samples were simultaneously cultured by three methods: (1) St. George's 100 mcl input volume of CSF with five 1:10 serial dilutions, (2) AIDS Clinical Trials Group (ACTG) method using 1000, 100, 10 mcl input volumes, and two 1:100 dilutions with 100 and 10 mcl input volume per dilution on seven agar plates; and (3) 10 mcl calibrated loop of undiluted and 1:100 diluted CSF (loop). Quantitative culture values did not statistically differ between St. George-ACTG methods (P= .09) but did for St. George-10 mcl loop (P< .001). Repeated measures pairwise correlation between any of the methods was high (r≥0.88). For detecting sterility, the ACTG-method had the highest negative predictive value of 97% (91% St. George, 60% loop), but the ACTG-method had occasional (∼10%) difficulties in quantification due to colony clumping. For CSF clearance rate, St. George-ACTG methods did not differ overall (mean -0.05 ± 0.07 log10CFU/ml/day;P= .14) on a group level; however, individual-level clearance varied. The St. George and ACTG quantitative CSF culture methods produced comparable but not identical results. Quantitative cultures can inform treatment management strategies.
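
    The sketch below illustrates, under assumed plating volumes and dilution factors, how colony counts are typically converted to CFU/ml and how a clearance rate could be estimated as the slope of log10 CFU/ml over time. The numbers are hypothetical, and the exact plating scheme of each method differs as described above.

```python
import numpy as np

def cfu_per_ml(colony_count, plated_volume_ml, dilution_factor):
    """Quantitative culture: CFU/ml = colonies / volume plated * dilution factor."""
    return colony_count * dilution_factor / plated_volume_ml

def clearance_rate(days, cfu_values):
    """Fungal clearance as the slope of log10(CFU/ml) versus time,
    estimated by ordinary least squares (log10 CFU/ml per day)."""
    slope, _intercept = np.polyfit(days, np.log10(cfu_values), 1)
    return slope

# Hypothetical serial CSF cultures from one patient (not study data).
days = [0, 3, 7, 14]
counts = [180, 60, 25, 4]                           # colonies on the 1:100-dilution plate
cfu = [cfu_per_ml(c, 0.1, 100) for c in counts]     # 100 mcl plated, 1:100 dilution
print(f"clearance rate: {clearance_rate(days, cfu):+.2f} log10 CFU/ml/day")
```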

  16. [Study on the multivariate quantitative analysis method for steel alloy elements using LIBS].

    PubMed

    Gu, Yan-hong; Li, Ying; Tian, Ye; Lu, Yuan

    2014-08-01

    Quantitative analysis of steel alloys was carried out using laser-induced breakdown spectroscopy (LIBS), taking into account the complex matrix effects in steel alloy samples. The laser-induced plasma was generated by a Q-switched Nd:YAG laser operating at 1064 nm with a pulse width of 10 ns and a repetition frequency of 10 Hz. The LIBS signal was coupled to an echelle spectrometer and recorded by a highly sensitive ICCD detector. To obtain the best experimental conditions, parameters such as the detection delay, the ICCD integration gate width and the detection position relative to the sample surface were optimized. The experimental results showed that the optimum detection delay was 1.5 μs, the optimal integration gate width was 2 μs and the best detection position was 1.5 mm below the alloy sample's surface. The samples used in the experiments were ten standard steel alloy samples and two unknown steel alloy samples. The quantitative analysis was performed with the optimum experimental parameters, with the elements Cr and Ni in the steel alloy samples taken as the detection targets. The analysis was carried out with methods based on conventional univariate calibration, multiple linear regression and partial least squares (PLS), respectively. The correlation coefficients of the calibration curves were not very high with the conventional univariate calibration method, and the relative errors for the two predicted samples were unsatisfactory, so this method cannot effectively serve the quantitative analysis of multi-component, complex-matrix steel alloy samples. With the multiple linear regression method, the analysis accuracy was effectively improved. The method based on partial least squares (PLS) turned out to be the best of the three quantitative analysis methods applied. Based on PLS, the correlation coefficient of the calibration curve for Cr is 0
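
    A hedged sketch of a PLS calibration of the kind described, using scikit-learn with placeholder spectra and synthetic certified Cr contents; the array shapes, component count and data are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Placeholder calibration set: rows are LIBS spectra of certified steel
# standards (intensities at 200 selected wavelengths); y holds certified
# Cr contents in wt% -- synthetic data, not the paper's measurements.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 200))
y = 1.0 + 0.05 * X[:, :20].sum(axis=1) + rng.normal(scale=0.05, size=10)

pls = PLSRegression(n_components=3)          # latent variables help absorb matrix effects
y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()  # leave-one-out predictions

pls.fit(X, y)                                # final calibration model
r = np.corrcoef(y, y_cv)[0, 1]               # cross-validated correlation coefficient
print(f"cross-validated r for the Cr calibration: {r:.3f}")
```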

  17. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
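
    For reference, the diagnostic measures reported above follow from a standard 2×2 table. The sketch below uses hypothetical joint-level counts chosen only to be roughly consistent with the reported sensitivity and specificity of the quantitative reading; they are not the study's actual counts.

```python
def diagnostic_performance(tp, fp, tn, fn):
    """Standard 2x2 measures used to compare a scintigraphic reading
    (visual or quantitative) against the clinical reference (painful TMD)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical joint-level counts for one TMJ-to-skull uptake ratio cutoff.
metrics = diagnostic_performance(tp=142, fp=67, tn=152, fn=99)
for name, value in metrics.items():
    print(f"{name}: {value:.1%}")
```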

  18. Method of fault-tree quantitative analysis for solid rocket motor

    NASA Astrophysics Data System (ADS)

    Hu, Baochao; Yang, Yicai; Xie, Weimin

    1993-08-01

    To address the problem of determining the failure probabilities of basic events in solid rocket motor fault-tree quantitative analysis, an engineering method of 'Solicitation Opinions to Give Marks' was put forward to determine the failure probability. A satisfactory result was obtained by analyzing a practical example of structural reliability for some solid rocket motors at the test sample stage.

  19. Method of fault-tree quantitative analysis for solid rocket motor

    NASA Astrophysics Data System (ADS)

    Hu, Baochao; Yang, Yicai; Xie, Weimin

    1993-08-01

    To address the problem of determining the failure probabilities of basic events in solid rocket motor fault-tree quantitative analysis, an engineering method of Solicitation Opinions to Give Marks is put forward to determine the failure probability. A more satisfactory result is obtained by analyzing an actual example of the structural reliability of solid rocket motors at the test sample stage.

  20. Implementation of a quantitative Foucault knife-edge method by means of isophotometry

    NASA Astrophysics Data System (ADS)

    Zhevlakov, A. P.; Zatsepina, M. E.; Kirillovskii, V. K.

    2014-06-01

    A detailed description of the stages of computer processing of shadowgrams during implementation of a modern quantitative Foucault knife-edge method is presented. The map of wave-front aberrations introduced by errors of an optical surface or system is shown, along with the results of calculating the set of required image-quality characteristics.

  1. Complementarity as a Program Evaluation Strategy: A Focus on Qualitative and Quantitative Methods.

    ERIC Educational Resources Information Center

    Lafleur, Clay

    Use of complementarity as a deliberate and necessary program evaluation strategy is discussed. Quantitative and qualitative approaches are viewed as complementary and can be integrated into a single study. The synergy that results from using complementary methods in a single study seems to enhance understanding and interpretation. A review of the…

  2. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  3. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    SciTech Connect

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be to inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict, both retrospectively and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempt at modeling conflict as a result of system-level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  4. Breast tumour visualization using 3D quantitative ultrasound methods

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Raheem, Abdul; Tadayyon, Hadi; Liu, Simon; Hadizad, Farnoosh; Czarnota, Gregory J.

    2016-04-01

    Breast cancer is one of the most common cancer types, accounting for 29% of all cancer cases. Early detection and treatment have a crucial impact on improving the survival of affected patients. Ultrasound (US) is a non-ionizing, portable, inexpensive, real-time imaging modality for screening and quantifying breast cancer. Due to these attractive attributes, the last decade has witnessed many studies on using quantitative ultrasound (QUS) methods in tissue characterization. However, these studies have mainly been limited to 2-D QUS methods using hand-held US (HHUS) scanners. With the availability of automated breast ultrasound (ABUS) technology, this study is the first to develop 3-D QUS methods for the ABUS visualization of breast tumours. Using an ABUS system, unlike the manual 2-D HHUS device, the patient's whole breast was scanned in an automated manner. The acquired frames were subsequently examined, and a region of interest (ROI) was selected in each frame where a tumour was identified. Standard 2-D QUS methods were used to compute spectral and backscatter coefficient (BSC) parametric maps on the selected ROIs. Next, the computed 2-D parameters were mapped to a Cartesian 3-D space, interpolated, and rendered to provide a transparent color-coded visualization of the entire breast tumour. Such 3-D visualization can potentially be used for further analysis of breast tumours in terms of their size and extension. Moreover, the 3-D volumetric scans can be used for tissue characterization and the categorization of breast tumours as benign or malignant by quantifying the computed parametric maps over the whole tumour volume.

  5. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells

    PubMed Central

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrone in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R2 > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrone in cells of the breast cancer cell line MCF-7 (34 fmol/10^6 cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrone, we successfully monitored changes in the metabolic expression level of estrone (17.7 fmol/10^6 letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422
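
    A minimal sketch of internal-standard quantitation as described (analyte amount from the peak-intensity ratio to a spiked deuterated standard). The intensities, spiked amount and unit response factor are assumptions chosen only to reproduce the reported order of magnitude.

```python
def estrone_amount_fmol(analyte_intensity, istd_intensity, istd_amount_fmol,
                        response_factor=1.0):
    """Absolute quantitation against a deuterated internal standard: the
    analyte amount is the spiked standard amount scaled by the MALDI
    peak-intensity ratio (response factor assumed ~1 for an isotopologue)."""
    return istd_amount_fmol * (analyte_intensity / istd_intensity) / response_factor

# Hypothetical peak intensities from one MCF-7 extract (10^6 cells).
amount = estrone_amount_fmol(analyte_intensity=8.5e3, istd_intensity=1.0e4,
                             istd_amount_fmol=40.0)
print(f"endogenous estrone: {amount:.0f} fmol per 10^6 cells")
```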

  6. Optimization of sample preparation for accurate results in quantitative NMR spectroscopy

    NASA Astrophysics Data System (ADS)

    Yamazaki, Taichi; Nakamura, Satoe; Saito, Takeshi

    2017-04-01

    Quantitative nuclear magnetic resonance (qNMR) spectroscopy has received high marks as an excellent measurement tool because it does not require a reference standard identical to the analyte. Measurement parameters have been discussed in detail, and high-resolution balances have been used for sample preparation. However, high-resolution balances such as the ultra-microbalance are not general-purpose analytical tools, and many analysts may find them difficult to use, thereby hindering accurate sample preparation for qNMR measurement. In this study, we examined the relationship between the resolution of the balance and the amount of sample weighed during sample preparation. We were able to confirm the accuracy of the assay results for samples weighed on a high-resolution balance, such as the ultra-microbalance. Furthermore, when an appropriate tare and amount of sample were weighed on a given balance, accurate assay results were also obtained with another high-resolution balance. Although this is a fundamental result, it offers important evidence that would enhance the versatility of the qNMR method.
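
    The sketch below illustrates, under a simple assumption about weighing error (about one display digit per reading, tare and gross combined in quadrature), how balance resolution translates into relative uncertainty for a given sample mass; it is not the authors' error model, and the balance resolutions are illustrative.

```python
def relative_weighing_uncertainty(sample_mass_mg, balance_resolution_mg):
    """Rough lower bound on the relative uncertainty contributed by the
    balance: two readings (tare and gross), each uncertain by about one
    display digit, combined in quadrature."""
    u_mass = (2 ** 0.5) * balance_resolution_mg
    return u_mass / sample_mass_mg

# Hypothetical comparison: 10 mg of analyte weighed on three balances.
for name, resolution_mg in [("ultra-microbalance", 1e-4),
                            ("semi-microbalance", 1e-2),
                            ("analytical balance", 1e-1)]:
    u_rel = relative_weighing_uncertainty(10.0, resolution_mg)
    print(f"{name}: ~{u_rel * 100:.3f}% relative uncertainty on 10 mg")
```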

  7. Quantitative Methods for Comparing Different Polyline Stream Network Models

    SciTech Connect

    Danny L. Anderson; Daniel P. Ames; Ping Yang

    2014-04-01

    Two techniques for exploring the relative horizontal accuracy of complex linear spatial features are described, and sample source code (pseudo code) is presented for this purpose. The first technique, relative sinuosity, is presented as a measure of the complexity or detail of a polyline network in comparison to a reference network. We term the second technique longitudinal root mean squared error (LRMSE) and present it as a means for quantitatively assessing the horizontal variance between two polyline data sets representing digitized (reference) and derived stream and river networks. Both relative sinuosity and LRMSE are shown to be suitable measures of horizontal stream network accuracy for assessing quality and variation in linear features. Both techniques have been used in two recent investigations involving the extraction of hydrographic features from LiDAR elevation data. One confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes yielded better stream network delineations, based on sinuosity and LRMSE, when using LiDAR-derived DEMs. The other demonstrated a new method of delineating stream channels directly from LiDAR point clouds, without the intermediate step of deriving a DEM, showing that direct delineation from LiDAR point clouds yielded a much better match, as indicated by the LRMSE.
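
    A simplified sketch of the two measures, assuming polylines given as vertex arrays. The LRMSE here uses nearest-vertex distances rather than the longitudinal point-to-segment sampling of the published technique, and the coordinates are hypothetical.

```python
import numpy as np

def path_length(line):
    """Total length of a polyline given as an (N, 2) array of vertices."""
    return np.sum(np.linalg.norm(np.diff(line, axis=0), axis=1))

def sinuosity(line):
    """Path length divided by the straight-line distance between endpoints."""
    return path_length(line) / np.linalg.norm(line[-1] - line[0])

def relative_sinuosity(derived, reference):
    """Complexity of the derived network relative to the reference network."""
    return sinuosity(derived) / sinuosity(reference)

def lrmse(derived, reference):
    """Simplified longitudinal RMSE: RMS of the distance from each derived
    vertex to its nearest reference vertex (point-to-segment distances
    would be more faithful to the published technique)."""
    d = np.linalg.norm(derived[:, None, :] - reference[None, :, :], axis=2)
    return np.sqrt(np.mean(d.min(axis=1) ** 2))

# Hypothetical digitized (reference) and LiDAR-derived stream centerlines.
reference = np.array([[0, 0], [1, 0.2], [2, 0.1], [3, 0.4], [4, 0.3]], float)
derived   = np.array([[0, 0.1], [1, 0.1], [2, 0.2], [3, 0.3], [4, 0.4]], float)
print(f"relative sinuosity: {relative_sinuosity(derived, reference):.3f}")
print(f"LRMSE: {lrmse(derived, reference):.3f} map units")
```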

  8. Quantitative methods to study epithelial morphogenesis and polarity.

    PubMed

    Aigouy, B; Collinet, C; Merkel, M; Sagner, A

    2017-01-01

    Morphogenesis of an epithelial tissue emerges from the behavior of its constituent cells, including changes in shape, rearrangements, and divisions. In many instances the directionality of these cellular events is controlled by the polarized distribution of specific molecular components. In recent years, our understanding of morphogenesis and polarity highly benefited from advances in genetics, microscopy, and image analysis. They now make it possible to measure cellular dynamics and polarity with unprecedented precision for entire tissues throughout their development. Here we review recent approaches to visualize and measure cell polarity and tissue morphogenesis. The chapter is organized like an experiment. We first discuss the choice of cell and polarity reporters and describe the use of mosaics to reveal hidden cell polarities or local morphogenetic events. Then, we outline application-specific advantages and disadvantages of different microscopy techniques and image projection algorithms. Next, we present methods to extract cell outlines to measure cell polarity and detect cellular events underlying morphogenesis. Finally, we bridge scales by presenting approaches to quantify the specific contribution of each cellular event to global tissue deformation. Taken together, we provide an in-depth description of available tools and theoretical concepts to quantitatively study cell polarity and tissue morphogenesis over multiple scales.

  9. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is summarized. Results of the analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship between concentration and fluorescence units is provided; it may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).
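
    A hedged sketch of how such a fluorescence calibration could be applied, assuming a linear calibration (slope and intercept), an extract volume and a filtered sample volume; all names and values are hypothetical, not taken from the report.

```python
def phycoerythrin_ug_per_l(fluorescence_units, slope, intercept=0.0,
                           extract_volume_ml=10.0, sample_volume_l=1.0):
    """Convert a fluorometer reading on an extract to PHE concentration in the
    original water sample using a linear calibration (concentration =
    slope * FU + intercept), then scale by extract and sample volumes.
    The slope and intercept must come from a calibration curve for the
    relevant emission band (e.g. ~575-580 nm or ~560 nm)."""
    extract_conc_ug_per_ml = slope * fluorescence_units + intercept
    return extract_conc_ug_per_ml * extract_volume_ml / sample_volume_l

# Hypothetical calibration slope and fluorometer reading, for illustration only.
print(f"{phycoerythrin_ug_per_l(42.0, slope=0.015):.2f} ug PHE per litre of sample")
```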

  10. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  11. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  12. Method for quantitative proteomics research by using metal element chelated tags coupled with mass spectrometry.

    PubMed

    Liu, Huiling; Zhang, Yangjun; Wang, Jinglan; Wang, Dong; Zhou, Chunxi; Cai, Yun; Qian, Xiaohong

    2006-09-15

    Mass spectrometry-based methods using a stable isotope as the internal standard in quantitative proteomics have developed quickly in recent years, but the use of some stable isotope reagents is limited by their relatively high price and synthetic difficulties. We have developed a new method for quantitative proteomics research by using metal element chelated tags (MECT) coupled with mass spectrometry. The bicyclic anhydride of diethylenetriamine-N,N,N′,N″,N″-pentaacetic acid (DTPA) is covalently coupled to primary amines of peptides, and the ligand is then chelated to the rare earth metals Y and Tb. The tagged peptides are mixed and analyzed by LC-ESI-MS/MS. Peptides are quantified by measuring the relative signal intensities for the Y and Tb tag pairs in MS, which permits quantitation of the original proteins generating the corresponding peptides. The protein is then identified by the corresponding peptide sequence from its MS/MS spectrum. The MECT method was evaluated by using standard proteins as a model sample. The experimental results showed that metal chelate-tagged peptides co-eluted chromatographically during reversed-phase LC analysis. The relative quantitation results for proteins using MECT were accurate. DTPA modification of the N-terminus of peptides promoted cleaner fragmentation (only y-series ions) in mass spectrometry and improved the confidence level of protein identification. The MECT strategy provides a simple, rapid, and economical alternative to current mass tagging technologies.

  13. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE PAGES

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...

    2016-09-14

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  14. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    SciTech Connect

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.

    2016-09-14

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.
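
    A hedged sketch of the calibration-and-normalization arithmetic implied above: fit a linear response from the arylglycerol monomer standards, then convert a sample peak area to a yield per gram of biomass. All concentrations, areas and volumes are hypothetical.

```python
import numpy as np

def monomer_calibration(std_concentrations, std_peak_areas):
    """Fit a linear response (area = a * concentration + b) from the
    arylglycerol monomer standards used to calibrate the assay."""
    a, b = np.polyfit(std_concentrations, std_peak_areas, 1)
    return a, b

def monomer_yield_umol_per_g(peak_area, slope, intercept,
                             final_volume_ml, biomass_mg):
    """Convert a GC peak area to a beta-O-4-linked monomer yield,
    normalized to the biomass weighed into the assay."""
    conc_umol_per_ml = (peak_area - intercept) / slope
    return conc_umol_per_ml * final_volume_ml / (biomass_mg / 1000.0)

# Hypothetical standards and sample reading (values for illustration only).
slope, intercept = monomer_calibration([0.01, 0.05, 0.1, 0.2], [120, 610, 1190, 2405])
yield_S = monomer_yield_umol_per_g(950, slope, intercept,
                                   final_volume_ml=1.0, biomass_mg=1.5)
print(f"S-monomer yield: {yield_S:.0f} umol/g biomass")
```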

  15. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  16. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Caffo, Brian; Frey, Eric C.

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
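
    Once the linear-model parameters (slope and noise standard deviation for each method) have been estimated by the NGS/RWT machinery, ranking by the noise-to-slope ratio is straightforward. The sketch below assumes such estimates are already available; the method names and values are hypothetical.

```python
def noise_to_slope_ratio(noise_sd, slope):
    """Precision figure of merit from the assumed linear model relating
    measured to true activity concentrations: lower NSR = higher precision."""
    return noise_sd / slope

# Hypothetical (slope, noise sd) pairs, as an NGS/RWT fit might return them.
estimates = {"method A": (0.95, 0.8), "method B": (0.90, 1.4), "method C": (0.85, 2.1)}

ranking = sorted(estimates,
                 key=lambda m: noise_to_slope_ratio(estimates[m][1], estimates[m][0]))
for rank, method in enumerate(ranking, start=1):
    slope, sd = estimates[method]
    print(f"{rank}. {method}: NSR = {noise_to_slope_ratio(sd, slope):.2f}")
```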

  17. Qualitative and quantitative PCR methods for detection of three lines of genetically modified potatoes.

    PubMed

    Rho, Jae Kyun; Lee, Theresa; Jung, Soon-Il; Kim, Tae-San; Park, Yong-Hwan; Kim, Young-Mi

    2004-06-02

    Qualitative and quantitative polymerase chain reaction (PCR) methods have been developed for the detection of genetically modified (GM) potatoes. The combination of specific primers for amplification of the promoter region of the Cry3A gene, the potato leafroll virus replicase gene, and the potato virus Y coat protein gene allows identification of each line of NewLeaf, NewLeaf Y, and NewLeaf Plus GM potatoes. A multiplex PCR method was also established for the simple and rapid detection of the three lines of GM potato in a mixed sample. For further quantitative detection, a real-time PCR method has been developed. This method features the use of a standard plasmid as a reference molecule. The standard plasmid contains both a specific region of the transgene Cry3A and an endogenous UDP-glucose pyrophosphorylase gene of the potato. Test samples containing 0.5, 1, 3, and 5% GM potatoes were quantified by this method. At the 3.0% level of each line of GM potato, the relative standard deviations ranged from 6.0 to 19.6%. These results show that the above PCR methods are applicable to the detection of GM potatoes quantitatively as well as qualitatively.

  18. Development and validation of event-specific quantitative PCR method for genetically modified maize MIR604.

    PubMed

    Mano, Junichi; Furui, Satoshi; Takashima, Kaori; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2012-01-01

    A GM maize event, MIR604, has been widely distributed and an analytical method to quantify its content is required to monitor the validity of food labeling. Here we report a novel real-time PCR-based quantitation method for MIR604 maize. We developed real-time PCR assays specific for MIR604 using event-specific primers designed by the trait developer, and for maize endogenous starch synthase IIb gene (SSIIb). Then, we determined the conversion factor, which is required to calculate the weight-based GM maize content from the copy number ratio of MIR604-specific DNA to the endogenous reference DNA. Finally, to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind samples containing MIR604 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The reproducibility (RSDr) of the developed method was evaluated to be less than 25%. The limit of quantitation of the method was estimated to be 0.5% based on the ISO 24276 guideline. These results suggested that the developed method would be suitable for practical quantitative analyses of MIR604 maize.
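
    A minimal sketch of the weight-based quantitation step described above (copy-number ratio divided by a conversion factor); the copy numbers and the conversion factor value are hypothetical, not the validated constants for MIR604.

```python
def gmo_content_percent(event_copies, endogenous_copies, conversion_factor):
    """Weight-based GM content (%) from a real-time PCR measurement: the
    copy-number ratio of event-specific DNA (e.g. MIR604) to the endogenous
    reference gene (SSIIb), divided by the event's conversion factor, times 100."""
    return (event_copies / endogenous_copies) / conversion_factor * 100.0

# Hypothetical copy numbers interpolated from plasmid standard curves,
# with an illustrative (not official) conversion factor.
print(f"GM maize content: {gmo_content_percent(1.2e3, 8.0e4, 0.30):.1f}%")
```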

  19. The expected results method for data verification

    NASA Astrophysics Data System (ADS)

    Monday, Paul

    2016-05-01

    The credibility of United States Army analytical experiments using distributed simulation depends on the quality of the simulation, the pedigree of the input data, and the appropriateness of the simulation system to the problem. The second of these factors is best met by using classified performance data from the Army Materiel Systems Analysis Activity (AMSAA) for essential battlefield behaviors, like sensors, weapon fire, and damage assessment. Until recently, using classified data has been a time-consuming and expensive endeavor: it requires significant technical expertise to load, and it is difficult to verify that it works correctly. Fortunately, new capabilities, tools, and processes are available that greatly reduce these costs. This paper will discuss these developments, a new method to verify that all of the components are configured and operate properly, and the application to recent Army Capabilities Integration Center (ARCIC) experiments. Recent developments have focused on improving the process of loading the data. OneSAF has redesigned its input data file formats and structures so that they correspond exactly to the Standard File Format (SFF) defined by AMSAA; ARCIC developed a library of supporting configurations that correlate directly to the AMSAA nomenclature; and the Entity Validation Tool was designed to quickly execute the essential models with a test-jig approach to identify problems with the loaded data. The missing part of the process is provided by the new Expected Results Method. Instead of the usual subjective assessment of quality, e.g., "It looks about right to me", this new approach compares the performance of a combat model with authoritative expectations to quickly verify that the model, data, and simulation are all working correctly. Integrated together, these developments now make it possible to use AMSAA classified performance data with minimal time and maximum assurance that the experiment's analytical results will be of the highest

  20. Methods and challenges in quantitative imaging biomarker development.

    PubMed

    Abramson, Richard G; Burton, Kirsteen R; Yu, John-Paul J; Scalzetti, Ernest M; Yankeelov, Thomas E; Rosenkrantz, Andrew B; Mendiratta-Lala, Mishal; Bartholmai, Brian J; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M

    2015-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This article, drafted by the Association of University Radiologists Radiology Research Alliance Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field.

  1. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  2. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    PubMed

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  3. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but scientists have not truly focused their work on the quantitative performance of this technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the UHPSFC quantitative performance was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization provides a guarantee of quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total-error approach using accuracy profiles. Even if UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results in a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC).

  4. Quantitative methods to characterize morphological properties of cell lines.

    PubMed

    Mancia, Annalaura; Elliott, John T; Halter, Michael; Bhadriraju, Kiran; Tona, Alessandro; Spurlin, Tighe A; Middlebrooks, Bobby L; Baatz, John E; Warr, Gregory W; Plant, Anne L

    2012-07-01

    Descriptive terms are often used to characterize cells in culture, but the use of nonquantitative and poorly defined terms can lead to ambiguities when comparing data from different laboratories. Although recently there has been a good deal of interest in unambiguous identification of cell lines via their genetic markers, it is also critical to have definitive, quantitative metrics to describe cell phenotypic characteristics. Quantitative metrics of cell phenotype will aid the comparison of data from experiments performed at different times and in different laboratories, where influences such as the age of the population and differences in culture conditions or protocols can potentially affect cellular metabolic state and gene expression in the absence of changes in the genetic profile. Here, we present examples of robust methodologies for quantitatively assessing characteristics of cell morphology and cell-cell interactions, and of growth rates of cells within the population. We performed these analyses with endothelial cell lines derived from dolphin, bovine and human tissue, and with a mouse fibroblast cell line. These metrics quantify some characteristics of these cell lines that clearly distinguish them from one another, and provide quantitative information on phenotypic changes in one of the cell lines over a large number of passages.

  5. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  6. [The method of quantitative assessment of dentition aesthetic parameters].

    PubMed

    Ryakhovsky, A N; Kalacheva, Ya A

    2016-01-01

    This article describes a formula for calculating an aesthetic index of treatment outcome. The formula was derived on the basis of regression equations showing the dependence of visual assessment on the magnitude of aesthetic violations. The formula can be used for objective quantitative evaluation of the aesthetics of the teeth when smiling, before and after dental treatment.

  7. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    NASA Astrophysics Data System (ADS)

    Hayashi, K.

    2014-12-01

    The near-surface is a region of day-to-day human activity on the earth. It is exposed to natural phenomena which sometimes cause disasters. This presentation covers a broad spectrum of geotechnical and geohazard ways of mitigating disaster and conserving the natural environment using geophysical methods, and it emphasizes the contribution of geophysics to such issues. The presentation focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than the theoretical details of a particular technique. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, preventing floods triggered by torrential rain, environmental conservation and studying the effect of global warming. Among the geophysical techniques, the active and passive surface wave, refraction and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access and databases, are also discussed. The presentation discusses the application of geophysical methods to engineering investigations from a non-uniqueness point of view and introduces the concepts of integrated and quantitative analysis. Most geophysical analyses are essentially non-unique, and it is very difficult to obtain unique and reliable engineering solutions from only one geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods, an integrated approach to geophysics. The result of a geophysical method is generally vague: "here is a high-velocity layer, it may be bedrock"; "this low-resistivity section may contain clayey soils". Such vague, qualitative and subjective interpretation is not worthwhile in general engineering design works

  8. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    NASA Astrophysics Data System (ADS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-01

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  9. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.

  10. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

    In combination with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither of these two methods can provide elastography over the full skin depth in current systems. This paper presents a feasibility study of an optical coherence elastography method which combines both vibration and SAW in order to give the quantitative mechanical properties of skin tissue over the full depth range, including epidermis, dermis and subcutaneous fat. Experiments were carried out on layered tissue-mimicking phantoms and on in vivo human forearm and palm skin. A ring actuator generated vibration, while a line actuator was used to excite SAWs. A PhS-OCT system was employed to provide ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that by combining the vibration and SAW methods the bulk mechanical properties of the full skin can be quantitatively measured, and elastography can be obtained with a sensing depth from ~0 mm to ~4 mm. This method is promising for clinical applications where the quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.
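
    For the SAW channel, elasticity is commonly estimated from the measured phase velocity via a Rayleigh-wave approximation. The sketch below uses that textbook relation with assumed tissue density, Poisson ratio and illustrative velocities; it is not the calibration used in the paper.

```python
def youngs_modulus_from_saw(c_saw_m_per_s, density_kg_m3=1000.0, poisson=0.5):
    """Young's modulus estimated from surface acoustic wave phase velocity
    using a common Rayleigh-wave approximation,
        c_R = (0.87 + 1.12*nu) / (1 + nu) * sqrt(E / (2*rho*(1 + nu))),
    solved for E. Soft tissue is usually treated as nearly incompressible
    (nu ~ 0.5) with a density of ~1000 kg/m^3."""
    factor = (0.87 + 1.12 * poisson) / (1.0 + poisson)
    return 2.0 * density_kg_m3 * (1.0 + poisson) * (c_saw_m_per_s / factor) ** 2

# Hypothetical phase velocities for two skin layers (illustrative values only).
for layer, c in [("dermis", 5.0), ("subcutaneous fat", 2.0)]:
    print(f"{layer}: E ~ {youngs_modulus_from_saw(c) / 1e3:.0f} kPa")
```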

  11. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    SciTech Connect

    Kiefel, Denis E-mail: Rainer.Stoessel@airbus.com; Stoessel, Rainer E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution, combined with 3-dimensional analysis and visualization capabilities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  12. MODIS Radiometric Calibration Program, Methods and Results

    NASA Technical Reports Server (NTRS)

    Xiong, Xiaoxiong; Guenther, Bruce; Angal, Amit; Barnes, William; Salomonson, Vincent; Sun, Junqiang; Wenny, Brian

    2012-01-01

    As a key instrument for NASA's Earth Observing System (EOS), the Moderate Resolution Imaging Spectroradiometer (MODIS) has made significant contributions to the remote sensing community with its unprecedented amount of data products continuously generated from its observations and freely distributed to users worldwide. MODIS observations, covering spectral regions from visible (VIS) to long-wave infrared (LWIR), have enabled a broad range of research activities and applications for studies of the Earth's interactive system of land, oceans, and atmosphere. In addition to extensive pre-launch measurements, developed to characterize sensor performance, MODIS carries a set of on-board calibrators (OBC) that can be used to track on-orbit changes of various sensor characteristics. Most importantly, dedicated and continuous calibration efforts have been made to maintain sensor data quality. This paper provides an overview of the MODIS calibration program, on-orbit calibration activities, methods, and performance. Key calibration results and lessons learned from the MODIS calibration effort are also presented in this paper.

  13. Sequencing human ribs into anatomical order by quantitative multivariate methods.

    PubMed

    Cirillo, John; Henneberg, Maciej

    2012-06-01

    Little research has focused on methods to anatomically sequence ribs. Correct anatomical sequencing of ribs assists in determining the location and distribution of regional trauma, age estimation, the number of puncture wounds, the number of individuals, and personal identification. The aim of the current study is to develop a method for placing fragmented and incomplete rib sets into correct anatomical position. Ribs 2-10 were used from eleven cadavers of an Australian population. Seven variables were measured from anatomical locations on the rib. General descriptive statistics were calculated for each variable along with an analysis of variance (ANOVA) and ANOVA with Bonferroni statistics. Considerable overlap was observed between ribs for univariate methods. Bivariate and multivariate methods were then applied. Results of the ANOVA with post hoc Bonferroni statistics show that ratios of various dimensions of a single rib can be used to sequence it among adjacent ribs. Using multiple regression formulae, the most accurate estimation of the anatomical rib number occurs when the entire rib is found in isolation. This, however, is not always possible. Even when only the head and neck of the rib are preserved, a modified multivariate regression formula assigned 91.95% of ribs into the correct anatomical position or to an adjacent rib. Using multivariate methods, it is possible to sequence a single human rib with a high level of accuracy, and these methods are superior to univariate methods. Left and right ribs were found to be highly symmetrical. Some rib dimensions were greater in males than in females, but overall the level of sexual dimorphism was low.
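
    The study's own seven measurements and regression coefficients are not reproduced in the abstract; the following sketch only illustrates the general idea of estimating an anatomical rib number from a few dimensions by multiple regression and rounding to the nearest rib. The two predictor columns and all numbers are invented for illustration.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical training data: rows = ribs of known anatomical number,
    # columns = illustrative head/neck dimensions (mm); not the authors' variables.
    X_train = np.array([
        [14.2, 9.1], [15.0, 9.8], [15.9, 10.4], [16.5, 11.0],
        [17.2, 11.5], [17.8, 12.1], [18.3, 12.6], [18.9, 13.0], [19.4, 13.5],
    ])
    y_train = np.arange(2, 11)          # ribs 2-10

    model = LinearRegression().fit(X_train, y_train)

    # An isolated rib fragment: predict its anatomical number and round it.
    fragment = np.array([[16.8, 11.2]])
    estimate = model.predict(fragment)[0]
    print(f"estimated rib number: {estimate:.1f} -> assign rib {round(estimate)}")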

  14. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from the more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by the improvement in computer hardware and software capability and novel computational approaches being slowly recognized by regulatory agencies. These events have helped reduce the reliance on experimental animals as well as better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches as described in the guidance documents from several of the regulatory agencies as it pertains to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.

  15. Quantitative method for enumeration of enterotoxigenic Escherichia coli.

    PubMed Central

    Calderon, R L; Levin, M A

    1981-01-01

    A rapid method was developed to quantify toxigenic Escherichia coli, using a membrane filter procedure. After filtration of samples, the membrane filter was first incubated on a medium selective for E. coli (24 h, 44 degrees C) and then transferred to tryptic soy agar (3%; 6 h, 37 degrees C). To assay for labile toxin-producing colonies, the filter was then transferred to a monolayer of Y-1 cells, the E. coli colonies were marked on the bottom of the petri dish, and the filter was removed after 15 min. The monolayer was observed for a positive rounding effect after a 15- to 24-h incubation. The method has an upper limit of detecting 30 toxigenic colonies per plate and can detect as few as one toxigenic colony per plate. A preliminary screening for these enterotoxigenic strains in polluted waters and known positive fecal samples was performed, and positive results were obtained with fecal samples only. PMID:7007415

  16. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using a scanning electron microscope equipped with an energy-dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but the discrimination power of this technique is limited due to low sensitivity to trace elements and difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and Particle Induced X-ray Emission (μ-PIXE), together with Elastic Backscattering Spectrometry (EBS), is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, the GSR particles analyzed all contained Pb, Ba and Sb. The precision of the method is assessed. The grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here, with a confidence >99%, irrespective of the firearm or population of particles selected.

  17. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-10-16

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To chart the progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks.

  18. Visual Display of Scientific Studies, Methods, and Results

    NASA Astrophysics Data System (ADS)

    Saltus, R. W.; Fedi, M.

    2015-12-01

    The need for efficient and effective communication of scientific ideas becomes more urgent each year. A growing number of societal and economic issues are tied to matters of science - e.g., climate change, natural resource availability, and public health. Societal and political debate should be grounded in a general understanding of scientific work in relevant fields. It is difficult for many participants in these debates to access science directly because the formal method for scientific documentation and dissemination is the journal paper, generally written for a highly technical and specialized audience. Journal papers are very effective and important for documentation of scientific results and are essential to the requirements of science to produce citable and repeatable results. However, journal papers are not effective at providing a quick and intuitive summary useful for public debate. Just as quantitative data are generally best viewed in graphic form, we propose that scientific studies also can benefit from visual summary and display. We explore the use of existing methods for diagramming logical connections and dependencies, such as Venn diagrams, mind maps, flow charts, etc., for rapidly and intuitively communicating the methods and results of scientific studies. We also discuss a method, specifically tailored to summarizing scientific papers, that we introduced last year at AGU. Our method diagrams the relative importance and connections between data, methods/models, results/ideas, and implications/importance using a single-page format with connected elements in these four categories. Within each category (e.g., data) the spatial location of individual elements (e.g., seismic, topographic, gravity) indicates relative novelty (e.g., are these new data?) and importance (e.g., how critical are these data to the results of the paper?). The goal is to find ways to rapidly and intuitively share both the results and the process of science, both for communication

  19. Accrual Patterns for Clinical Studies Involving Quantitative Imaging: Results of an NCI Quantitative Imaging Network (QIN) Survey

    PubMed Central

    Kurland, Brenda F.; Aggarwal, Sameer; Yankeelov, Thomas E.; Gerstner, Elizabeth R.; Mountz, James M.; Linden, Hannah M.; Jones, Ella F.; Bodeker, Kellie L.; Buatti, John M.

    2017-01-01

    Patient accrual is essential for the success of oncology clinical trials. Recruitment for trials involving the development of quantitative imaging biomarkers may face different challenges than treatment trials. This study surveyed investigators and study personnel for evaluating accrual performance and perceived barriers to accrual and for soliciting solutions to these accrual challenges that are specific to quantitative imaging-based trials. Responses for 25 prospective studies were received from 12 sites. The median percent annual accrual attained was 94.5% (range, 3%–350%). The most commonly selected barrier to recruitment (n = 11/25, 44%) was that “patients decline participation,” followed by “too few eligible patients” (n = 10/25, 40%). In a forced choice for the single greatest recruitment challenge, “too few eligible patients” was the most common response (n = 8/25, 32%). Quantitative analysis and qualitative responses suggested that interactions among institutional, physician, and patient factors contributed to accrual success and challenges. Multidisciplinary collaboration in trial design and execution is essential to accrual success, with attention paid to ensuring and communicating potential trial benefits to enrolled and future patients. PMID:28127586

  20. Facile colorimetric methods for the quantitative determination of tetramisole hydrochloride

    NASA Astrophysics Data System (ADS)

    Amin, A. S.; Dessouki, H. A.

    2002-10-01

    Facile, rapid and sensitive methods for the determination of tetramisole hydrochloride in pure form and in dosage forms are described. The procedures are based on the formation of coloured products with the chromogenic reagents alizarin blue BB (I), alizarin red S (II), alizarin violet 3R (III) and alizarin yellow G (IV). The coloured products showed absorption maxima at 605, 468, 631 and 388 nm for I-IV, respectively. The colours obtained were stable for 24 h. The colour systems obeyed Beer's law in the concentration ranges 1.0-36, 0.8-32, 1.2-42 and 0.8-30 μg ml-1, respectively. The results obtained showed good recoveries with relative standard deviations of 1.27, 0.96, 1.13 and 1.35%, respectively. The detection and determination limits were found to be 1.0 and 3.8, 1.2 and 4.2, 1.0 and 3.9, and finally 1.4 and 4.8 ng ml-1 for the I-IV complexes, respectively. Applications of the method to representative pharmaceutical formulations are presented, and the validity was assessed by applying the standard addition technique; the results are comparable with those obtained using the official method.
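
    As a generic illustration of how such a colorimetric method is calibrated and its limits estimated, the sketch below fits a Beer's-law line and derives detection and quantitation limits from the residual standard deviation. The calibration points and the 3.3s/10s convention are assumptions for illustration, not data or definitions from the paper.

    import numpy as np

    # Hypothetical calibration data for one chromogenic reagent (Beer's law region)
    conc = np.array([1.0, 5.0, 10.0, 20.0, 30.0, 36.0])      # ug/ml
    absorbance = np.array([0.021, 0.104, 0.209, 0.421, 0.630, 0.757])

    slope, intercept = np.polyfit(conc, absorbance, 1)
    residual_sd = np.std(absorbance - (slope * conc + intercept), ddof=2)

    # Common ICH-style estimates; the paper's own definitions may differ.
    lod = 3.3 * residual_sd / slope
    loq = 10.0 * residual_sd / slope
    print(f"slope={slope:.4f}, LOD={lod:.2f} ug/ml, LOQ={loq:.2f} ug/ml")

    # Quantify an unknown from its absorbance
    unknown_abs = 0.350
    print(f"unknown: {(unknown_abs - intercept) / slope:.2f} ug/ml")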

  1. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test.

    PubMed

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G

    2015-11-26

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as "gold standard" for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay.

  2. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  3. Criteria for quantitative and qualitative data integration: mixed-methods research methodology.

    PubMed

    Lee, Seonah; Smith, Carrol A M

    2012-05-01

    Many studies have emphasized the need and importance of a mixed-methods approach for evaluation of clinical information systems. However, those studies had no criteria to guide integration of multiple data sets. Integrating different data sets serves to actualize the paradigm that a mixed-methods approach argues; thus, we require criteria that provide the right direction to integrate quantitative and qualitative data. The first author used a set of criteria organized from a literature search for integration of multiple data sets from mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data of a previous mixed methods study, integration of quantitative and qualitative data was achieved in a systematic manner. It helped us obtain a better organized understanding of the results. The criteria of this article offer the potential to produce insightful analyses of mixed-methods evaluations of health information systems.

  4. Combinative Method Using Multi-components Quantitation and HPLC Fingerprint for Comprehensive Evaluation of Gentiana crassicaulis

    PubMed Central

    Song, Jiuhua; Chen, Fengzheng; Liu, Jiang; Zou, Yuanfeng; Luo, Yun; Yi, Xiaoyan; Meng, Jie; Chen, Xingfu

    2017-01-01

    Background: Gentiana crassicaulis is an important traditional Chinese herb. As with other herbs, its chemical composition varies greatly with environmental and genetic factors; as a result, quality differs even among samples from the same region, and quality evaluation is therefore necessary for its safe and effective use. In this study, a comprehensive method including HPLC quantitative analysis and fingerprinting was developed to evaluate the quality of Cujingqinjiao and to classify samples collected from Lijiang City of Yunnan Province. A total of 30 common peaks, including four identified peaks, were found and used for further characterization and quality control of Cujingqinjiao. Twenty-one batches of samples from Lijiang City of Yunnan Province were evaluated by similarity analysis (SA), hierarchical cluster analysis (HCA), principal component analysis (PCA) and factor analysis (FA) according to the characteristics of the common peaks. Results: The obtained data showed good stability and repeatability of the chromatographic fingerprint; similarity values were all above 0.90. This study demonstrated that a combination of chromatographic quantitative analysis and fingerprinting offered an efficient way to evaluate the quality consistency of Cujingqinjiao. Consistent results were obtained, showing that samples from the same origin could be successfully classified into two groups. Conclusion: This study revealed that the combinative method was reliable, simple and sensitive for fingerprint analysis and, moreover, for quality control and pattern recognition of Cujingqinjiao. SUMMARY HPLC quantitative analysis and fingerprinting were developed to evaluate the quality of Gentiana crassicaulis. Similarity analysis, hierarchical cluster analysis, principal component analysis and factor analysis were employed to analyze the chromatographic dataset. The results of multi-components quantitation analysis, similarity analysis, hierarchical cluster analysis, principal

  5. A novel method for quantitative geosteering using azimuthal gamma-ray logging.

    PubMed

    Yuan, Chao; Zhou, Cancan; Zhang, Feng; Hu, Song; Li, Chaoliu

    2015-02-01

    A novel method for quantitative geosteering by using azimuthal gamma-ray logging is proposed. Real-time up and bottom gamma-ray logs when a logging tool travels through a boundary surface with different relative dip angles are simulated with the Monte Carlo method. Study results show that response points of up and bottom gamma-ray logs when the logging tool moves towards a highly radioactive formation can be used to predict the relative dip angle, and then the distance from the drilling bit to the boundary surface is calculated.

  6. A quantitative SMRT cell sequencing method for ribosomal amplicons.

    PubMed

    Jones, Bethan M; Kustka, Adam B

    2017-04-01

    Advances in sequencing technologies continue to provide unprecedented opportunities to characterize microbial communities. For example, the Pacific Biosciences Single Molecule Real-Time (SMRT) platform has emerged as a unique approach harnessing DNA polymerase activity to sequence template molecules, enabling long reads at low costs. With the aim to simultaneously classify and enumerate in situ microbial populations, we developed a quantitative SMRT (qSMRT) approach that involves the addition of exogenous standards to quantify ribosomal amplicons derived from environmental samples. The V7-9 regions of 18S SSU rDNA were targeted and quantified from protistan community samples collected in the Ross Sea during the Austral summer of 2011. We used three standards of different length and optimized conditions to obtain accurate quantitative retrieval across the range of expected amplicon sizes, a necessary criterion for analyzing taxonomically diverse 18S rDNA molecules from natural environments. The ability to concurrently identify and quantify microorganisms in their natural environment makes qSMRT a powerful, rapid and cost-effective approach for defining ecosystem diversity and function.

  7. Automatic segmentation method of striatum regions in quantitative susceptibility mapping images

    NASA Astrophysics Data System (ADS)

    Murakawa, Saki; Uchiyama, Yoshikazu; Hirai, Toshinori

    2015-03-01

    Abnormal accumulation of brain iron has been detected in various neurodegenerative diseases. Quantitative susceptibility mapping (QSM) is a novel contrast mechanism in magnetic resonance (MR) imaging and enables quantitative analysis of local tissue susceptibility. Therefore, automatic segmentation tools for brain regions on QSM images would be helpful for radiologists' quantitative analysis in various neurodegenerative diseases. The purpose of this study was to develop an automatic segmentation and classification method for striatum regions on QSM images. Our image database consisted of 22 QSM images obtained from healthy volunteers. These images were acquired on a 3.0 T MR scanner. The voxel size was 0.9×0.9×2 mm. The matrix size of each slice image was 256×256 pixels. In our computerized method, a template matching technique was first used to detect a slice image containing striatum regions. An image registration technique was subsequently employed for the classification of striatum regions, taking anatomical knowledge into consideration. After the image registration, the voxels in the target image corresponding to striatum regions in the reference image were classified into three striatum regions, i.e., the head of the caudate nucleus, the putamen, and the globus pallidus. The experimental results indicated that 100% (21/21) of the slice images containing striatum regions were detected accurately. The subjective evaluation of the classification results indicated that 20 (95.2%) of 21 showed good or adequate quality. Our computerized method would be useful for the quantitative analysis of Parkinson's disease in QSM images.
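
    A minimal sketch of the two steps named above (template matching to find the striatum-bearing slice, then registration of a labeled reference slice) is given below using OpenCV. The function names, the use of normalized cross-correlation and an affine ECC registration are assumptions standing in for the authors' actual implementation.

    import cv2
    import numpy as np

    def find_striatum_slice(volume, template):
        """Return the index of the axial slice that best matches a striatum
        template (normalized cross-correlation), mirroring the detection step.
        volume: array of shape (n_slices, H, W), float32; template: float32."""
        scores = []
        for sl in volume:
            res = cv2.matchTemplate(sl, template, cv2.TM_CCOEFF_NORMED)
            scores.append(res.max())
        return int(np.argmax(scores))

    def register_reference(reference, target, iterations=200):
        """Affine registration of a labeled reference slice onto the target slice
        (ECC criterion); the resulting warp can then carry the caudate/putamen/
        globus pallidus labels onto the target image."""
        warp = np.eye(2, 3, dtype=np.float32)
        criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, iterations, 1e-6)
        _, warp = cv2.findTransformECC(reference, target, warp, cv2.MOTION_AFFINE, criteria)
        return warp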

  8. Relative Quantification of Coastal Cordillera (Ecuador) Uplift: Preliminary Results from Quantitative Geomorphology

    NASA Astrophysics Data System (ADS)

    Reyes, Pedro; Dauteuil, Olivier; Michaud, François

    2010-05-01

    The coastal cordillera of Ecuador (culminating at around 800 m) includes uplifted marine terraces along its littoral margin (the highest known at 360 m). The coastal cordillera constitutes an important drainage barrier: over nearly 600 km, drainage coming from the Andes is diverted towards the Río Guayas in the south and the Río Esmeraldas in the north. How does the coastal cordillera uplift? For how long has it constituted a drainage barrier? Is the uplift of the coastal cordillera linked to the uplift of the littoral margin? Has the cordillera risen in a homogeneous or a segmented way? What geodynamic process drives the uplift of the cordillera? Can this uplift be related to the subduction of the Carnegie ridge? The first objective of this work is to analyze the morphology of the coastal cordillera with the help of quantitative geomorphology, using digital techniques such as a DEM (produced at a resolution of 30 m by Marc Souris, IRD), in order to specify the evolution of the coastal cordillera uplift. The study first combines analysis of the morphology, slope maps, and anomalies in the drainage of the hydrographic network. Then, three methods were applied to the DEM data using the ArcGIS software: 1) digitization and interpolation of the basal surface of the last marine formation of regional extent (the Borbón formation on the geological map of Ecuador) to define a paleo-horizontal reference and assess its deformation; 2) extraction of 109 river profiles, which allow us to calculate for each river the vertical, horizontal, and total deviation from the theoretical profile and the associated SL index; 3) measurement of relief incision (depth + half width of the valley, 7500 measurements in total) according to the method of Bonnet et al. (1998). We adapted this method to represent the state of incision at any point, correcting for the influence of the lithology and
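
    The SL (stream length-gradient) index mentioned in point 2 is a standard quantity in this kind of analysis; the small sketch below computes it for a single channel segment. The definition used here (Hack's index) and the example numbers are assumptions, since the abstract does not spell out the formula.

    def sl_index(elev_upstream, elev_downstream, segment_length, distance_from_head):
        """Hack's stream length-gradient index for one channel segment:
        SL = (dH / dL) * L, with L the along-channel distance from the river
        head to the segment midpoint. Elevations and lengths in metres."""
        gradient = (elev_upstream - elev_downstream) / segment_length
        return gradient * distance_from_head

    # Hypothetical segment: 40 m of drop over 2 km, midpoint 15 km from the head
    print(f"SL = {sl_index(420.0, 380.0, 2000.0, 15000.0):.0f} m")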

  9. Quantitative estimation of poikilocytosis by the coherent optical method

    NASA Astrophysics Data System (ADS)

    Safonova, Larisa P.; Samorodov, Andrey V.; Spiridonov, Igor N.

    2000-05-01

    An investigation of the need for, and the reliability required of, poikilocytosis determination in hematology has shown that existing techniques suffer from serious shortcomings. To characterize the deviation of erythrocyte shape from the normal (rounded) form in blood smears, it is expedient to use an integrative estimate. An algorithm is suggested that is based on the correlation between erythrocyte morphological parameters and properties of the spatial-frequency spectrum of the blood smear. In the course of analytical and experimental research, an integrative form parameter (IFP) was proposed that characterizes an increase above 5% in the relative concentration of cells with altered form, as well as the predominant type of poikilocyte. An algorithm for statistically reliable estimation of the IFP on standard stained blood smears has been developed. To provide quantitative characterization of the morphological features of cells, a form vector has been proposed, and its validity for poikilocyte differentiation was shown.
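
    The paper's IFP is derived from the spatial-frequency spectrum of the smear and is not reproduced here; as a much simpler stand-in, the sketch below flags shape-deviant cells on a segmented smear using a plain circularity measure. The threshold and the use of scikit-image are illustrative assumptions.

    import numpy as np
    from skimage.measure import label, regionprops

    def fraction_poikilocytes(binary_mask, circularity_threshold=0.9):
        """Rough screen for non-round erythrocytes on a segmented smear.
        Circularity 4*pi*A/P**2 equals 1 for a perfect disc; cells falling
        below the threshold are counted as shape-deviant. This is a simple
        stand-in, not the authors' spectral integrative form parameter."""
        deviant = total = 0
        for region in regionprops(label(binary_mask)):
            if region.perimeter == 0:
                continue
            circularity = 4.0 * np.pi * region.area / region.perimeter ** 2
            total += 1
            deviant += circularity < circularity_threshold
        return deviant / total if total else 0.0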

  10. Validation of quantitative and qualitative methods for detecting allergenic ingredients in processed foods in Japan.

    PubMed

    Sakai, Shinobu; Adachi, Reiko; Akiyama, Hiroshi; Teshima, Reiko

    2013-06-19

    A labeling system for food allergenic ingredients was established in Japan in April 2002. To monitor the labeling, the Japanese government announced official methods for detecting allergens in processed foods in November 2002. The official methods consist of quantitative screening tests using enzyme-linked immunosorbent assays (ELISAs) and qualitative confirmation tests using Western blotting or polymerase chain reactions (PCR). In addition, the Japanese government designated 10 μg protein/g food (the corresponding allergenic ingredient soluble protein weight/food weight), determined by ELISA, as the labeling threshold. To standardize the official methods, the criteria for the validation protocol were described in the official guidelines. This paper, which was presented at the Advances in Food Allergen Detection Symposium, ACS National Meeting and Expo, San Diego, CA, Spring 2012, describes the validation protocol outlined in the official Japanese guidelines, the results of interlaboratory studies for the quantitative detection method (ELISA for crustacean proteins) and the qualitative detection method (PCR for shrimp and crab DNAs), and the reliability of the detection methods.

  11. On the quantitative method for measurement and analysis of the fine structure of Fraunhofer line profiles

    NASA Astrophysics Data System (ADS)

    Kuli-Zade, D. M.

    The methods of measurement and analysis of the fine structure of weak and moderate Fraunhofer line profiles are considered. The digital spectral data were obtained using rapidly scanning, high-dispersion and high-resolution double monochromators. The asymmetry-coefficient method, the bisector method and a new quantitative method proposed by the author are discussed. The new physical quantities of differential, integral, residual and relative asymmetry are introduced for the first time. These quantitative values permit us to investigate the dependence of asymmetry on microscopic (atomic) and macroscopic (photospheric) quantities. It is shown that the integral profile asymmetries grow appreciably with increasing line equivalent width. The average effective depths of formation of the Fraunhofer lines used, in the photosphere of the Sun, are determined. It is shown that, as the effective formation depth of the lines increases, the integral and residual asymmetries of the line profiles noticeably decrease. This is in good agreement with the results on the intensity dependence of the asymmetry. The above-mentioned methods are critically compared and the advantages of the author's method are shown. A computer program for calculating the line-profile asymmetry parameters has been developed.

  12. Quantitative evaluation of peptide-extraction methods by HPLC-triple-quad MS-MS.

    PubMed

    Du, Yan; Wu, Dapeng; Wu, Qian; Guan, Yafeng

    2015-02-01

    In this study, the efficiency of five peptide-extraction methods—acetonitrile (ACN) precipitation, ultrafiltration, C18 solid-phase extraction (SPE), dispersed SPE with mesoporous carbon CMK-3, and mesoporous silica MCM-41—was quantitatively investigated. With 28 tryptic peptides as target analytes, these methods were evaluated on the basis of recovery and reproducibility by using high-performance liquid chromatography-triple-quad tandem mass spectrometry in selected-reaction-monitoring mode. Because of the distinct extraction mechanisms of the methods, their preferences for extracting peptides of different properties were revealed to be quite different, usually depending on the pI values or hydrophobicity of peptides. When target peptides were spiked in bovine serum albumin (BSA) solution, the extraction efficiency of all the methods except ACN precipitation changed significantly. The binding of BSA with target peptides and nonspecific adsorption on adsorbents were believed to be the ways through which BSA affected the extraction behavior. When spiked in plasma, the performance of all five methods deteriorated substantially, with the number of peptides having recoveries exceeding 70% being 15 for ACN precipitation, and none for the other methods. Finally, the methods were evaluated in terms of the number of identified peptides for extraction of endogenous plasma peptides. Only ultrafiltration and CMK-3 dispersed SPE performed differently from the quantitative results with target peptides, and the wider distribution of the properties of endogenous peptides was believed to be the main reason.

  13. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    SciTech Connect

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-15

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
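
    A rough approximation of the two ICHE steps described above (centering the intensity histogram, then contrast-limited adaptive histogram equalization) can be written in a few lines with OpenCV. This is only a sketch inspired by the description; the centering rule, clip limit and tile size are assumptions, not the paper's parameters.

    import cv2
    import numpy as np

    def iche_like_normalize(gray, target_mean=128.0, clip_limit=2.0):
        """Two-step normalization inspired by the ICHE idea: shift the intensity
        histogram so its mean sits at a common point, then apply contrast-limited
        adaptive histogram equalization. gray: uint8 grayscale image."""
        shifted = gray.astype(np.float32) + (target_mean - gray.mean())
        shifted = np.clip(shifted, 0, 255).astype(np.uint8)
        clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=(8, 8))
        return clahe.apply(shifted)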

  14. American historical archeology: methods and results.

    PubMed

    Deetz, J

    1988-01-22

    For historical archeology to be effective, research methods must be employed that ensure that both archeological and historical data are synthesized in a constructive manner. An example from Flowerdew Hundred, a Virginia plantation, illustrates such an approach. Collections from eighteen sites (1619 to 1720) were studied and dated by the inside bore diameters of pipestem fragments from clay smoking pipes. The sites grouped into three distinct categories, each with a different date. The latest group of sites (1680 to 1720) contained Colono ware, a slave-produced pottery; none of the earlier sites did, although there were blacks at Flowerdew Hundred as early as 1619. On the basis of studies of probate data and other primary historical sources, it is suggested that this pattern of Colono ware occurrence is due to a change in the social and residential status of blacks during the century, and that only when they lived separately from the masters did they make this type of pottery.
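
    The abstract does not state which pipestem regression was used; a standard choice for converting bore diameters into a mean site date is Binford's straight-line formula, sketched below on invented fragment data.

    def binford_date(bore_diameters_64ths):
        """Binford's pipestem regression: estimated mean site date
        Y = 1931.85 - 38.26 * X, where X is the mean bore diameter of the
        fragments expressed in 64ths of an inch."""
        x_mean = sum(bore_diameters_64ths) / len(bore_diameters_64ths)
        return 1931.85 - 38.26 * x_mean

    # e.g. a fragment assemblage dominated by 7/64- and 8/64-inch bores
    print(round(binford_date([7, 7, 8, 8, 8, 7])))   # roughly the mid-1600s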

  15. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    PubMed

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. Key advantages of FIA-MS over chromatography-MS are fast analysis (typical run time <60 s) and method simplicity, and FIA-MS offers high-throughput without compromising sensitivity, precision and accuracy as much as ambient MS techniques. Consequently, FIA-MS is increasingly becoming recognized as a suitable technique for applications where quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis such as salting out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4(+) QuEChERS, and (3) the need to improve efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical

  16. Analyses on Regional Cultivated Land Change Based on Quantitative Method

    NASA Astrophysics Data System (ADS)

    Cao, Yingui; Yuan, Chun; Zhou, Wei; Wang, Jing

    The Three Gorges Project is one of the great engineering projects in the world and has accelerated economic development in the reservoir area. In the course of this development, cultivated land has become an important resource: a large amount of cultivated land has been occupied and converted to construction land, while further cultivated land has been flooded by the rising water level. This paper uses the cultivated land areas and socio-economic indicators of the Three Gorges reservoir area for 1990-2004 and carries out statistical analyses and case studies in order to analyze the process of cultivated land change, identify its driving forces, find new methods to simulate and forecast the cultivated land area in the future, and thereby serve cultivated land protection and sustainable development in the reservoir area. The results indicate the following. First, over the past 15 years the cultivated land area decreased by 200142 hm2, a decrease of 13343 hm2 per year. The whole reservoir area is divided into three sub-areas (the upper reaches, the belly area, and the lower reaches), and the trends of cultivated land change in each are similar to those of the whole reservoir area. Second, the curve of cultivated land area against per capita GDP takes the shape of an inverted U, and in some years the rate of cultivated land change and the rate of GDP change differ, which indicates that cultivated land change and GDP change are decoupled; in addition, cultivated land change is strongly connected with urbanization and with the policy of returning farmland to forest. Lastly, the precision of multiple regression is lower than that of the BP neural network in simulating cultivated land change, so the BP neural network is used to forecast the cultivated land areas in 2005, 2010 and 2015, and the forecast results are reasonable.
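
    To illustrate the forecasting step, the sketch below fits a small backpropagation (BP) neural network to a land-area time series and queries it for 2005, 2010 and 2015. The series is synthetic, scikit-learn's MLPRegressor stands in for the authors' BP network, and a real model would use the socio-economic driving forces (e.g. per capita GDP, urbanization rate) as inputs rather than the year alone.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    # Illustrative series only: year vs. cultivated land area (10^3 hm2)
    years = np.arange(1990, 2005).reshape(-1, 1)
    area = np.linspace(560, 360, 15) + np.random.default_rng(0).normal(0, 5, 15)

    scaler = StandardScaler().fit(years)
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit(scaler.transform(years), area)

    for y in (2005, 2010, 2015):
        pred = model.predict(scaler.transform([[y]]))[0]
        print(f"{y}: ~{pred:.0f} x 10^3 hm2")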

  17. Optimization of Quantitative PCR Methods for Enteropathogen Detection.

    PubMed

    Liu, Jie; Gratz, Jean; Amour, Caroline; Nshama, Rosemary; Walongo, Thomas; Maro, Athanasia; Mduma, Esto; Platts-Mills, James; Boisen, Nadia; Nataro, James; Haverstick, Doris M; Kabir, Furqan; Lertsethtakarn, Paphavee; Silapong, Sasikorn; Jeamwattanalert, Pimmada; Bodhidatta, Ladaporn; Mason, Carl; Begum, Sharmin; Haque, Rashidul; Praharaj, Ira; Kang, Gagandeep; Houpt, Eric R

    2016-01-01

    Detection and quantification of enteropathogens in stool specimens is useful for diagnosing the cause of diarrhea but is technically challenging. Here we evaluate several important determinants of quantification: specimen collection, nucleic acid extraction, and extraction and amplification efficiency. First, we evaluate the molecular detection and quantification of pathogens in rectal swabs versus stool, using paired flocked rectal swabs and whole stool collected from 129 children hospitalized with diarrhea in Tanzania. Swabs generally yielded a higher quantification cycle (Cq) (average 29.7, standard deviation 3.5 vs. 25.3 ± 2.9 from stool, P<0.001) but were still able to detect 80% of pathogens with a Cq < 30 in stool. Second, a simplified total nucleic acid (TNA) extraction procedure was compared to separate DNA and RNA extractions and showed 92% (318/344) sensitivity and 98% (951/968) specificity, with no difference in Cq value for the positive results (ΔCq(DNA+RNA-TNA) = -0.01 ± 1.17, P = 0.972, N = 318). Third, we devised a quantification scheme that adjusts pathogen quantity to the specimen's extraction and amplification efficiency, and show that this better estimates the quantity of spiked specimens than the raw target Cq. In sum, these methods for enteropathogen quantification, stool sample collection, and nucleic acid extraction will be useful for laboratories studying enteric disease.
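
    The efficiency-adjusted quantification described above can be illustrated with a small helper that corrects a target Cq by the shift observed in an extrinsic spiked control and then converts the corrected Cq to a quantity through a standard curve. The control convention, slope and intercept are placeholders, not the study's values.

    def adjusted_quantity(target_cq, control_cq_observed, control_cq_expected,
                          std_slope=-3.32, std_intercept=38.0):
        """Illustrative efficiency-adjusted quantification.
        An extrinsic control spiked into every specimen before extraction gives
        an expected Cq; any shift in the observed control Cq is attributed to
        extraction/amplification losses and subtracted from the target Cq before
        converting to quantity via a standard curve (log10 copies = (Cq - b)/m).
        Slope/intercept are placeholders, not the study's curves."""
        corrected_cq = target_cq - (control_cq_observed - control_cq_expected)
        return 10 ** ((corrected_cq - std_intercept) / std_slope)

    print(f"{adjusted_quantity(28.0, 26.5, 25.0):.2e} copies per reaction")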

  18. Optimization of Quantitative PCR Methods for Enteropathogen Detection

    PubMed Central

    Liu, Jie; Gratz, Jean; Amour, Caroline; Nshama, Rosemary; Walongo, Thomas; Maro, Athanasia; Mduma, Esto; Platts-Mills, James; Boisen, Nadia; Nataro, James; Haverstick, Doris M.; Kabir, Furqan; Lertsethtakarn, Paphavee; Silapong, Sasikorn; Jeamwattanalert, Pimmada; Bodhidatta, Ladaporn; Mason, Carl; Begum, Sharmin; Haque, Rashidul; Praharaj, Ira; Kang, Gagandeep; Houpt, Eric R.

    2016-01-01

    Detection and quantification of enteropathogens in stool specimens is useful for diagnosing the cause of diarrhea but is technically challenging. Here we evaluate several important determinants of quantification: specimen collection, nucleic acid extraction, and extraction and amplification efficiency. First, we evaluate the molecular detection and quantification of pathogens in rectal swabs versus stool, using paired flocked rectal swabs and whole stool collected from 129 children hospitalized with diarrhea in Tanzania. Swabs generally yielded a higher quantification cycle (Cq) (average 29.7, standard deviation 3.5 vs. 25.3 ± 2.9 from stool, P<0.001) but were still able to detect 80% of pathogens with a Cq < 30 in stool. Second, a simplified total nucleic acid (TNA) extraction procedure was compared to separate DNA and RNA extractions and showed 92% (318/344) sensitivity and 98% (951/968) specificity, with no difference in Cq value for the positive results (ΔCq(DNA+RNA-TNA) = -0.01 ± 1.17, P = 0.972, N = 318). Third, we devised a quantification scheme that adjusts pathogen quantity to the specimen’s extraction and amplification efficiency, and show that this better estimates the quantity of spiked specimens than the raw target Cq. In sum, these methods for enteropathogen quantification, stool sample collection, and nucleic acid extraction will be useful for laboratories studying enteric disease. PMID:27336160

  19. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    SciTech Connect

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.
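
    The report's scoring factors are not given in the abstract; as a generic illustration of a simplified quantitative risk estimate, the sketch below sums frequency times consequence over a scenario list for two hypothetical options, the kind of rapid comparison such a method is meant to support.

    def activity_risk(scenarios):
        """Generic point estimate of health-and-safety risk for one activity:
        sum over scenarios of (annual frequency) x (consequence, e.g. expected
        fatalities or dose). This mirrors the usual frequency-consequence
        formulation; the DOE report's own factors are not reproduced here."""
        return sum(freq * consequence for freq, consequence in scenarios)

    option_a = [(1e-2, 0.1), (1e-4, 5.0)]   # hypothetical scenario lists
    option_b = [(5e-3, 0.1), (1e-3, 5.0)]
    print(activity_risk(option_a), activity_risk(option_b))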

  20. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  1. Critical appraisal of quantitative PCR results in colorectal cancer research: can we rely on published qPCR results?

    PubMed

    Dijkstra, J R; van Kempen, L C; Nagtegaal, I D; Bustin, S A

    2014-06-01

    The use of real-time quantitative polymerase chain reaction (qPCR) in cancer research has become ubiquitous. The relative simplicity of qPCR experiments, which deliver fast and cost-effective results, means that each year an increasing number of papers utilizing this technique are being published. But how reliable are the published results? Since the validity of gene expression data is greatly dependent on appropriate normalisation to compensate for sample-to-sample and run-to-run variation, we have evaluated the adequacy of normalisation procedures in qPCR-based experiments. Consequently, we assessed all colorectal cancer publications that made use of qPCR from 2006 until August 2013 for the number of reference genes used and whether they had been validated. Using even these minimal evaluation criteria, the validity of only three percent (6/179) of the publications can be adequately assessed. We describe common errors, and conclude that the current state of reporting on qPCR in colorectal cancer research is disquieting. Extrapolated to the study of cancer in general, it is clear that the majority of studies using qPCR cannot be reliably assessed and that at best, the results of these studies may or may not be valid and at worst, pervasive incorrect normalisation is resulting in the wholesale publication of incorrect conclusions. This survey demonstrates that the existence of guidelines, such as MIQE, is necessary but not sufficient to address this problem and suggests that the scientific community should examine its responsibility and be aware of the implications of these findings for current and future research.
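
    Proper normalisation of the kind the survey calls for typically uses several validated reference genes combined by geometric mean (geNorm-style). The sketch below shows that calculation for one target gene; the equal-efficiency assumption and all Cq values are illustrative.

    from statistics import geometric_mean

    def relative_expression(target_cq, reference_cqs, calibrator_target_cq,
                            calibrator_reference_cqs, efficiency=2.0):
        """Relative expression normalized to several validated reference genes.
        Per-gene relative quantities are E**(Cq_calibrator - Cq_sample); the
        normalization factor is the geometric mean of the reference-gene
        quantities (geNorm-style). Assumes equal amplification efficiency E."""
        target_rq = efficiency ** (calibrator_target_cq - target_cq)
        ref_rqs = [efficiency ** (c_cal - c)
                   for c, c_cal in zip(reference_cqs, calibrator_reference_cqs)]
        return target_rq / geometric_mean(ref_rqs)

    # Sample vs. calibrator, two validated reference genes
    print(relative_expression(24.0, [20.5, 22.0], 26.0, [20.7, 21.8]))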

  2. Uncertainty in environmental health impact assessment: quantitative methods and perspectives.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Vanni, Tazio; Foss, Anna M

    2013-01-01

    Environmental health impact assessment models are subjected to great uncertainty due to the complex associations between environmental exposures and health. Quantifying the impact of uncertainty is important if the models are used to support health policy decisions. We conducted a systematic review to identify and appraise current methods used to quantify the uncertainty in environmental health impact assessment. In the 19 studies meeting the inclusion criteria, several methods were identified. These were grouped into random sampling methods, second-order probability methods, Bayesian methods, fuzzy sets, and deterministic sensitivity analysis methods. All 19 studies addressed the uncertainty in the parameter values but only 5 of the studies also addressed the uncertainty in the structure of the models. None of the articles reviewed considered conceptual sources of uncertainty associated with the framing assumptions or the conceptualisation of the model. Future research should attempt to broaden the way uncertainty is taken into account in environmental health impact assessments.
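
    Of the method families listed above, random sampling (Monte Carlo) propagation of parameter uncertainty is the most common; the sketch below propagates three uncertain inputs through a toy attributable-cases model and reports a 95% uncertainty interval. The model form, distributions and numbers are assumptions, not taken from the reviewed studies.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Toy health-impact model: attributable cases among the exposed =
    # population * exposed fraction * baseline rate * (RR - 1) / RR
    population = 1_000_000
    baseline_rate = rng.normal(5e-4, 5e-5, n)            # uncertain parameters
    relative_risk = rng.lognormal(np.log(1.15), 0.05, n)
    exposed_fraction = rng.beta(8, 12, n)

    cases = population * exposed_fraction * baseline_rate * (relative_risk - 1) / relative_risk
    print(f"median = {np.median(cases):.0f}, 95% interval = "
          f"({np.percentile(cases, 2.5):.0f}, {np.percentile(cases, 97.5):.0f})")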

  3. Chemical comparison of Tripterygium wilfordii and Tripterygium hypoglaucum based on quantitative analysis and chemometrics methods.

    PubMed

    Guo, Long; Duan, Li; Liu, Ke; Liu, E-Hu; Li, Ping

    2014-07-01

    Tripterygium wilfordii (T. wilfordii) and Tripterygium hypoglaucum (T. hypoglaucum), two commonly used Chinese herbal medicines derived from Tripterygium genus, have been widely used for the treatment of rheumatoid arthritis and other related inflammatory diseases in clinical therapy. In the present study, a rapid resolution liquid chromatography/electrospray ionization tandem mass spectrometry (RRLC-ESI-MS(n)) method has been developed and validated for simultaneous determination of 19 bioactive compounds including four catechins, three sesquiterpene alkaloids, four diterpenoids, and eight triterpenoids in these two similar herbs. The method validation results indicated that the developed method had desirable specificity, linearity, precision and accuracy. Quantitative analysis results showed that there were significant differences in the content of different types of compounds in T. wilfordii and T. hypoglaucum. Moreover, chemometrics methods such as one-way ANOVA, principal component analysis (PCA) and hierarchical clustering analysis (HCA) were performed to compare and discriminate the two Tripterygium herbs based on the quantitative data of analytes, and it was proven straightforward and reliable to differentiate T. wilfordii and T. hypoglaucum samples from different origins. In conclusion, simultaneous quantification of multiple-active component by RRLC-ESI-MS(n) coupled with chemometrics analysis could be a well-acceptable strategy to compare and evaluate the quality of T. wilfordii and T. hypoglaucum.
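
    As an illustration of the chemometric step, the sketch below runs PCA and Ward hierarchical clustering on a synthetic content matrix of 19 quantified compounds to separate two groups of samples. The data are invented and the preprocessing choices (autoscaling, Ward linkage) are assumptions, not necessarily those of the paper.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler
    from scipy.cluster.hierarchy import linkage, fcluster

    # Rows = herb samples, columns = contents of 19 quantified compounds (synthetic)
    rng = np.random.default_rng(1)
    t_wilfordii = rng.normal(1.0, 0.15, (10, 19))
    t_hypoglaucum = rng.normal(1.6, 0.15, (8, 19))
    X = StandardScaler().fit_transform(np.vstack([t_wilfordii, t_hypoglaucum]))

    scores = PCA(n_components=2).fit_transform(X)          # PCA score-plot coordinates
    clusters = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
    print(clusters)     # the two species should fall into separate clusters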

  4. Quantitative research on the primary process: method and findings.

    PubMed

    Holt, Robert R

    2002-01-01

    Freud always defined the primary process metapsychologically, but he described the ways it shows up in dreams, parapraxes, jokes, and symptoms with enough observational detail to make it possible to create an objective, reliable scoring system to measure its manifestations in Rorschach responses, dreams, TAT stories, free associations, and other verbal texts. That system can identify signs of the thinker's efforts, adaptive or maladaptive, to control or defend against the emergence of primary process. A prerequisite and a consequence of the research that used this system was clarification and elaboration of the psychoanalytic theory of thinking. Results of empirical tests of several propositions derived from psychoanalytic theory are summarized. Predictions concerning the method's most useful index, of adaptive vs. maladaptive regression, have been repeatedly verified: People who score high on this index (who are able to produce well-controlled "primary products" in their Rorschach responses), as compared to those who score at the maladaptive pole (producing primary-process-filled responses with poor reality testing, anxiety, and pathological defensive efforts), are better able to tolerate sensory deprivation, are more able to enter special states of consciousness comfortably (drug-induced, hypnotic, etc.), and have higher achievements in artistic creativity, while schizophrenics tend to score at the extreme of maladaptive regression. Capacity for adaptive regression also predicts success in psychotherapy, and rises with the degree of improvement after both psychotherapy and drug treatment. Some predictive failures have been theoretically interesting: Kris's hypothesis about creativity and the controlled use of primary process holds for males but usually not for females. This body of work is presented as a refutation of charges, brought by such critics as Crews, that psychoanalysis cannot become a science.

  5. Novel method for quantitative ANA measurement using near-infrared imaging.

    PubMed

    Peterson, Lisa K; Wells, Daniel; Shaw, Laura; Velez, Maria-Gabriela; Harbeck, Ronald; Dragone, Leonard L

    2009-09-30

    Antinuclear antibodies (ANA) have been detected in patients with systemic rheumatic diseases and are used in the screening and/or diagnosis of autoimmunity in patients as well as mouse models of systemic autoimmunity. Indirect immunofluorescence (IIF) on HEp-2 cells is the gold standard for ANA screening. However, its usefulness is limited in diagnosis, prognosis and monitoring of disease activity due to the lack of standardization in performing the technique, subjectivity in interpreting the results and the fact that it is only semi-quantitative. Various immunological techniques have been developed in an attempt to improve upon the method to quantify ANA, including enzyme-linked immunosorbent assays (ELISAs), line immunoassays (LIAs), multiplexed bead immunoassays and IIF on substrates other than HEp-2 cells. Yet IIF on HEp-2 cells remains the most common screening method for ANA. In this study, we describe a simple quantitative method to detect ANA which combines IIF on HEp-2 coated slides with analysis using a near-infrared imaging (NII) system. Using NII to determine ANA titer, 86.5% (32 of 37) of the titers for human patient samples were within 2 dilutions of those determined by IIF, which is the acceptable range for proficiency testing. Combining an initial screening for nuclear staining using microscopy with titration by NII resulted in 97.3% (36 of 37) of the titers detected to be within two dilutions of those determined by IIF. The NII method for quantitative ANA measurements using serum from both patients and mice with autoimmunity provides a fast, relatively simple, objective, sensitive and reproducible assay, which could easily be standardized for comparison between laboratories.

  6. An ECL-PCR method for quantitative detection of point mutation

    NASA Astrophysics Data System (ADS)

    Zhu, Debin; Xing, Da; Shen, Xingyan; Chen, Qun; Liu, Jinfeng

    2005-04-01

    A new method for identification of point mutations was proposed. Polymerase chain reaction (PCR) amplification of a sequence from genomic DNA was followed by digestion with a restriction enzyme that cuts only the wild-type amplicon containing its recognition site. Reaction products were detected by electrochemiluminescence (ECL) assay after adsorption of the resulting DNA duplexes to the solid phase. One strand of the PCR products carries biotin and is bound to a streptavidin-coated microbead for sample selection. The other strand carries Ru(bpy)₃²⁺ (TBR), which reacts with tripropylamine (TPA) to emit light for ECL detection. The method was applied to detect a specific point mutation in the H-ras oncogene in the T24 cell line. The results show that the detection limit for the H-ras amplicon is 100 fmol and the linear range spans more than 3 orders of magnitude, making quantitative analysis possible. The genotype can be clearly discriminated. These results suggest that ECL-PCR is a feasible quantitative method for safe, sensitive and rapid detection of point mutations in human genes.

  7. Rapid quantitative method for total brominated vegetable oil in soft drinks using ion chromatography.

    PubMed

    Yousef, Ashraf A; Abbas, Alaa B; Badawi, Bassam Sh; Al-Jowhar, Wafaa Y; Zain, Esam A; El-Mufti, Seham A

    2012-08-01

    A simple, quantitative and rapid method for total brominated vegetable oil (BVO) using ion chromatography (IC) with suppressed conductivity detection was developed and successfully applied to soft drinks, with results expressed as inorganic bromide anion. The procedure involves extraction of BVO with diethyl ether and treatment with zinc dust in a solution of acetic acid, giving recoveries ranging between 92.5 and 98.5%. The calibration curves obtained were linear with correlation coefficients (r²) of 0.998, a coefficient of variation (CV) of less than 5% and limit of detection (LOD) and limit of quantification (LOQ) of 250 and 750 µg l⁻¹, respectively. The method was successfully applied to the determination of BVO in several commercial soft drinks which were found to contain BVO in the range 1.8-14.510 mg l⁻¹. The method has fewer sources of error than previously published methods.
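
    The abstract does not spell out how the calibration figures were derived, but a conventional least-squares calibration with the common 3.3σ/10σ convention for LOD and LOQ can be sketched as follows; the concentrations and peak areas below are illustrative values, not the authors' data.

```python
import numpy as np

# Hypothetical bromide calibration data (concentration in ug/L vs. peak area);
# values are illustrative only, not taken from the paper.
conc = np.array([250.0, 500.0, 1000.0, 2000.0, 4000.0])   # ug/L
area = np.array([12.1, 24.8, 49.5, 99.7, 201.3])           # arbitrary units

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

# One common convention: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope,
# where sigma is the residual standard deviation of the calibration fit.
sigma = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))
lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
print(f"r^2 = {r2:.4f}, LOD = {lod:.0f} ug/L, LOQ = {loq:.0f} ug/L")
```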

  8. Semi-quantitative method to estimate levels of Campylobacter

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Introduction: Research projects utilizing live animals and/or systems often require reliable, accurate quantification of Campylobacter following treatments. Even with marker strains, conventional methods designed to quantify are labor and material intensive requiring either serial dilutions or MPN ...

  9. Methods for equine preantral follicles isolation: quantitative aspects.

    PubMed

    Leonel, E C R; Bento-Silva, V; Ambrozio, K S; Luna, H S; Costa e Silva, E V; Zúccari, C E S N

    2013-12-01

    The aim of this study was to test the use of mechanical and mechanical-enzymatic methods, saline solution (SS), and PBS solution for the manipulation and isolation of mare ovarian preantral follicles (PAFs). The ovaries were subjected to mechanical isolation (mixer) alone or in association with enzymatic digestion (collagenase). Incubation times of 10 and 20 min were employed. In the first group, 4.1 ± 4.9 PAFs were harvested with the mechanical-enzymatic method vs 71.1 ± 19.2 with the mechanical procedure, showing a significant difference between methods; using SS and PBS, these numbers were 35.7 ± 34.3 and 39.6 ± 39.6, respectively, with no significant difference between solutions. In the second group, there was a significant difference between methods, with 7.1 ± 10.6 follicles harvested with the mechanical-enzymatic method vs 63.2 ± 22.9 with the mechanical procedure; using SS and PBS, means were 35.5 ± 36.4 and 34.9 ± 31.1, respectively. The mechanical method proved more effective than the mechanical-enzymatic approach. Both SS and PBS can be used as media for equine PAF preparation.

  10. Quantitative Analysis of Intra-chromosomal Contacts: The 3C-qPCR Method.

    PubMed

    Ea, Vuthy; Court, Franck; Forné, Thierry

    2017-01-01

    The chromosome conformation capture (3C) technique is fundamental to many population-based methods investigating chromatin dynamics and organization in eukaryotes. Here, we provide a modified quantitative 3C (3C-qPCR) protocol for improved quantitative analyses of intra-chromosomal contacts. We also describe an algorithm for data normalization which allows more accurate comparisons between contact profiles.

  11. A general method for the quantitative assessment of mineral pigments.

    PubMed

    Ares, M C Zurita; Fernández, J M

    2016-01-01

    A general method for the estimation of mineral pigment contents in different bases has been proposed, using a single set of calibration curves (one for each pigment) calculated for a white standard base, so that elaborating calibration patterns for each base in use is not necessary. The method can be applied to different bases, and its validity has been proved even in strongly tinted bases. The method consists of a novel procedure that combines diffuse reflectance spectroscopy, second derivatives and the Kubelka-Munk function. This technique proved to be at least one order of magnitude more sensitive than X-ray diffraction for colored compounds, since it allowed determination of the pigment amount in colored samples containing 0.5 wt% of pigment that was not detected by X-ray diffraction. The method can be used to estimate the concentration of mineral pigments in a wide variety of either natural or artificial materials, since it does not require the calculation of each pigment pattern in every base. This fact could have important industrial consequences, as the proposed method would be more convenient, faster and cheaper.
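
    A minimal sketch of the spectral processing named in the abstract (the Kubelka-Munk remission function followed by a second-derivative spectrum). The reflectance values are synthetic, and the final calibration against pigment content is only indicated, since the paper's exact procedure is not reproduced here.

```python
import numpy as np

def kubelka_munk(reflectance):
    """Kubelka-Munk remission function F(R) = (1 - R)^2 / (2R) for diffuse reflectance R in (0, 1]."""
    r = np.clip(reflectance, 1e-6, 1.0)
    return (1.0 - r) ** 2 / (2.0 * r)

# Hypothetical diffuse reflectance spectrum sampled every 2 nm, with a single
# absorption band near 550 nm (illustrative values only).
wavelengths = np.arange(400, 700, 2.0)
reflectance = 0.85 - 0.35 * np.exp(-((wavelengths - 550.0) / 40.0) ** 2)

f_km = kubelka_munk(reflectance)
# Second-derivative spectrum, often used to resolve overlapping bands before calibration.
second_deriv = np.gradient(np.gradient(f_km, wavelengths), wavelengths)

# A calibration curve would then relate a band amplitude (e.g. the second-derivative
# minimum) to the pigment weight fraction measured on the white standard base.
band_amplitude = second_deriv.min()
idx_550 = np.argmin(np.abs(wavelengths - 550))
print(f"F(R) at 550 nm ~ {f_km[idx_550]:.3f}, band amplitude {band_amplitude:.2e}")
```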

  12. Comparison of QIAGEN automated nucleic acid extraction methods for CMV quantitative PCR testing.

    PubMed

    Miller, Steve; Seet, Henrietta; Khan, Yasmeen; Wright, Carolyn; Nadarajah, Rohan

    2010-04-01

    We examined the effect of nucleic acid extraction methods on the analytic characteristics of a quantitative polymerase chain reaction (PCR) assay for cytomegalovirus (CMV). Human serum samples were extracted with 2 automated instruments (BioRobot EZ1 and QIAsymphony SP, QIAGEN, Valencia, CA) and CMV PCR results compared with those of pp65 antigenemia testing. Both extraction methods yielded results that were comparably linear and precise, whereas the QIAsymphony SP had a slightly lower limit of detection (1.92 log(10) copies/mL vs 2.26 log(10) copies/mL). In both cases, PCR was more sensitive than CMV antigen detection, detecting CMV viremia in 12% (EZ1) and 21% (QIAsymphony) of antigen-negative specimens. This study demonstrates the feasibility of using 2 different extraction techniques to yield results within 0.5 log(10) copies/mL of the mean value, a level that would allow for clinical comparison between different laboratory assays.

  13. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectra of 39 teeth samples, classified by International Caries Detection and Assessment System levels, were investigated at 405 nm excitation. The major differences among caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of the color components was then computed by taking into account the spectral characteristics (e.g. autofluorescence, optical filter, and spectral sensitivity) of our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, fluorescence images of caries were acquired experimentally, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed; it can be applied to similar imaging systems.
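
    The grading rule reported above translates directly into code. The sketch below computes the per-pixel image component ratio R/(G + B) and bins it with the published thresholds; the pixel values are synthetic and only illustrate the classification step.

```python
import numpy as np

# Thresholds reported in the abstract for the image component ratio R/(G + B):
# <0.66 sound, 0.66-1.06 early decay, 1.06-1.62 established decay, >1.62 severe decay.
GRADES = ["sound", "early decay", "established decay", "severe decay"]
BOUNDS = [0.66, 1.06, 1.62]

def classify_pixels(rgb):
    """rgb: float array of shape (H, W, 3) holding fluorescence R, G, B components."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    ratio = r / np.clip(g + b, 1e-6, None)
    grade_index = np.digitize(ratio, BOUNDS)   # 0..3
    return ratio, grade_index

# Tiny synthetic example (values are illustrative, not from the paper).
rgb = np.array([[[0.2, 0.5, 0.4], [0.7, 0.5, 0.4]],
                [[1.2, 0.6, 0.3], [2.0, 0.6, 0.4]]])
ratio, grades = classify_pixels(rgb)
for rat, idx in zip(ratio.ravel(), grades.ravel()):
    print(f"R/(G+B) = {rat:.2f} -> {GRADES[idx]}")
```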

  14. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  15. General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models

    PubMed Central

    de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael

    2016-01-01

    Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. PMID:27591750

  16. General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.

    PubMed

    de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael

    2016-11-01

    Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population.
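
    The core computation described in this abstract, obtaining a data-scale quantity by integrating the inverse link function over the latent-scale normal distribution, can be illustrated with a small Python sketch. The authors' implementation is the R package QGglmm; this is only a schematic analogue for a logit-link model with an assumed latent mean and variance.

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm
from scipy.special import expit   # inverse logit link

def data_scale_mean(latent_mean, latent_var, inv_link=expit):
    """Expected value on the observed (data) scale for a latent-scale normal
    distribution: E[g^{-1}(l)] with l ~ N(latent_mean, latent_var)."""
    sd = np.sqrt(latent_var)
    integrand = lambda l: inv_link(l) * norm.pdf(l, loc=latent_mean, scale=sd)
    mean, _ = integrate.quad(integrand, latent_mean - 10 * sd, latent_mean + 10 * sd)
    return mean

# Hypothetical binomial (logit-link) GLMM: latent mean 0.5, total latent variance 1.2.
print(f"data-scale mean ~ {data_scale_mean(0.5, 1.2):.3f}")
```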

  17. Quantitative interpretation of mineral hyperspectral images based on principal component analysis and independent component analysis methods.

    PubMed

    Jiang, Xiping; Jiang, Yu; Wu, Fang; Wu, Fenghuang

    2014-01-01

    Interpretation of mineral hyperspectral images provides large amounts of high-dimensional data, which is often complicated by mixed pixels. The quantitative interpretation of hyperspectral images is known to be extremely difficult when three types of information are unknown, namely, the number of pure pixels, the spectrum of pure pixels, and the mixing matrix. The problem is made even more complex by the disturbance of noise. The key to interpreting abstract mineral component information, i.e., pixel unmixing and abundance inversion, is how to effectively reduce noise, dimension, and redundancy. A three-step procedure is developed in this study for quantitative interpretation of hyperspectral images. First, the principal component analysis (PCA) method can be used to process the pixel spectrum matrix and keep characteristic vectors with larger eigenvalues. This can effectively reduce the noise and redundancy, which facilitates the abstraction of major component information. Second, the independent component analysis (ICA) method can be used to identify and unmix the pixels based on the linear mixed model. Third, the pure-pixel spectrums can be normalized for abundance inversion, which gives the abundance of each pure pixel. In numerical experiments, both simulation data and actual data were used to demonstrate the performance of our three-step procedure. Under simulation data, the results of our procedure were compared with theoretical values. Under the actual data measured from core hyperspectral images, the results obtained through our algorithm are compared with those of similar software (Mineral Spectral Analysis 1.0, Nanjing Institute of Geology and Mineral Resources). The comparisons show that our method is effective and can provide reference for quantitative interpretation of hyperspectral images.
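
    A schematic version of the three-step procedure (PCA for noise and dimension reduction, ICA for unmixing, normalization for abundance inversion) is sketched below on a simulated linear mixture. It is not the authors' algorithm, and the abundance step in particular is only a crude stand-in.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)

# Simulated data: 3 pure-mineral spectra (endmembers) mixed linearly with noise.
n_bands, n_pixels, n_endmembers = 120, 500, 3
endmembers = np.abs(rng.normal(size=(n_endmembers, n_bands)))
abundances = rng.dirichlet(np.ones(n_endmembers), size=n_pixels)      # rows sum to 1
pixels = abundances @ endmembers + 0.01 * rng.normal(size=(n_pixels, n_bands))

# Step 1: PCA to suppress noise and redundancy, keeping the leading components.
pca = PCA(n_components=n_endmembers)
scores = pca.fit_transform(pixels)

# Step 2: ICA on the reduced data to recover statistically independent sources.
ica = FastICA(n_components=n_endmembers, random_state=0)
sources = ica.fit_transform(scores)

# Step 3: normalize the recovered sources so each pixel's loadings sum to one,
# giving a rough per-pixel abundance estimate for each recovered component.
loadings = np.abs(sources)
abund_est = loadings / loadings.sum(axis=1, keepdims=True)
print(abund_est[:3])
```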

  18. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer’s Disease: Results from the DIAN Study Group

    PubMed Central

    Su, Yi; Blazey, Tyler M.; Owen, Christopher J.; Christensen, Jon J.; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C.; Ances, Beau M.; Snyder, Abraham Z.; Cash, Lisa A.; Koeppe, Robert A.; Klunk, William E.; Galasko, Douglas; Brickman, Adam M.; McDade, Eric; Ringman, John M.; Thompson, Paul M.; Saykin, Andrew J.; Ghetti, Bernardino; Sperling, Reisa A.; Johnson, Keith A.; Salloway, Stephen P.; Schofield, Peter R.; Masters, Colin L.; Villemagne, Victor L.; Fox, Nick C.; Förster, Stefan; Chen, Kewei; Reiman, Eric M.; Xiong, Chengjie; Marcus, Daniel S.; Weiner, Michael W.; Morris, John C.; Bateman, Randall J.; Benzinger, Tammie L. S.

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer’s Network (DIAN), an autosomal dominant Alzheimer’s disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improves sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although it may add cost and time. Several technical variations to amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer’s disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959
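
    As a rough illustration of the reference-region step discussed above (not the DIAN processing pipeline), the sketch below computes target-to-reference retention ratios for hypothetical regional PiB values under each candidate reference region; all region names and numbers are invented for illustration.

```python
def suvr(regional_uptake, reference_region):
    """Standardized uptake value ratio: regional tracer retention divided by
    the mean retention in the chosen reference region."""
    ref = regional_uptake[reference_region]
    return {region: value / ref for region, value in regional_uptake.items()}

# Hypothetical mean PiB retention values (arbitrary units), illustrative only.
uptake = {"precuneus": 2.4, "prefrontal": 2.1, "cerebellar_cortex": 1.0,
          "brainstem": 1.1, "white_matter": 1.6}

for ref in ("cerebellar_cortex", "brainstem", "white_matter"):
    ratios = {k: round(v, 2) for k, v in suvr(uptake, ref).items()}
    print(ref, ratios)
```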

  19. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, Edward F.; Keller, Richard A.; Apel, Charles T.

    1983-01-01

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions.

  20. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1983-09-06

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions. 6 figs.

  1. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1981-02-25

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules or ions.

  2. Selection methods in forage breeding: a quantitative appraisal

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Forage breeding can be extraordinarily complex because of the number of species, perenniality, mode of reproduction, mating system, and the genetic correlation for some traits evaluated in spaced plants vs. performance under cultivation. Aiming to compare eight forage breeding methods for direct sel...

  3. Magnetic Ligation Method for Quantitative Detection of MicroRNAs

    PubMed Central

    Liong, Monty; Im, Hyungsoon; Majmudar, Maulik D.; Aguirre, Aaron D.; Sebas, Matthew; Lee, Hakho; Weissleder, Ralph

    2014-01-01

    A magnetic ligation method is utilized for the detection of microRNAs amongst a complex biological background without polymerase chain reaction or nucleotide modification. The sandwich probes assay can be adapted to analyze a panel of microRNAs associated with cardiovascular diseases in heart tissue samples. PMID:24532323

  4. Computer Image Analysis Method for Rapid Quantitation of Macrophage Phagocytosis

    DTIC Science & Technology

    1990-01-01

  5. Quantitative Decomposition of Dynamics of Mathematical Cell Models: Method and Application to Ventricular Myocyte Models.

    PubMed

    Shimayoshi, Takao; Cha, Chae Young; Amano, Akira

    2015-01-01

    Mathematical cell models are effective tools to understand cellular physiological functions precisely. For detailed analysis of model dynamics in order to investigate how much each component affects cellular behaviour, mathematical approaches are essential. This article presents a numerical analysis technique, which is applicable to any complicated cell model formulated as a system of ordinary differential equations, to quantitatively evaluate contributions of respective model components to the model dynamics in the intact situation. The present technique employs a novel mathematical index for decomposed dynamics with respect to each differential variable, along with a concept named instantaneous equilibrium point, which represents the trend of a model variable at some instant. This article also illustrates applications of the method to comprehensive myocardial cell models for gaining insights into the mechanisms of action potential generation and the calcium transient. The analysis results exhibit quantitative contributions of individual channel gating mechanisms and ion exchanger activities to membrane repolarization, and of calcium fluxes and buffers to the rise and decay of the cytosolic calcium level. These analyses quantitatively explicate the principles of the model, which leads to a better understanding of cellular dynamics.
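
    The general idea of attributing an instantaneous rate of change to individual model components can be illustrated on a toy membrane equation; this is only a schematic analogue, not the authors' decomposition index or their instantaneous equilibrium point. All conductances and reversal potentials below are made-up values.

```python
# Toy membrane model: C * dV/dt = -(I_Na + I_K + I_leak).
# At a given instant, each current's share of dV/dt quantifies its contribution.
def currents(v):
    g_na, e_na = 1.2, 50.0     # hypothetical conductances (mS/cm^2) and reversal potentials (mV)
    g_k, e_k = 3.6, -77.0
    g_l, e_l = 0.3, -54.4
    return {"I_Na": g_na * (v - e_na), "I_K": g_k * (v - e_k), "I_leak": g_l * (v - e_l)}

def decompose_dvdt(v, capacitance=1.0):
    comp = {name: -i / capacitance for name, i in currents(v).items()}
    total = sum(comp.values())
    shares = {name: val / total for name, val in comp.items()} if total else {}
    return total, comp, shares

total, comp, shares = decompose_dvdt(-30.0)
print(f"dV/dt = {total:.2f} mV/ms")
for name, val in comp.items():
    print(f"  {name}: {val:+.2f} mV/ms ({shares[name]:+.1%} of the net rate)")
```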

  6. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran

    PubMed Central

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2016-01-01

    Social changes have rapidly displaced arranged marriages, and it seems the change in marriage pattern has played a role in childbearing. On the other hand, many countries are experiencing a great reduction in population, which requires a comprehensive policy to manage the considerable drop. To achieve this goal, the factors affecting fertility must first be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women aged 15-49 years living in the north of Iran were studied using a cluster sampling strategy. The results showed that there are no significant differences in the reproductive behaviors of the three patterns of marriage in Babol, Iran. It seems there is a convergence in childbearing across the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning. PMID:26493414

  7. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran.

    PubMed

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2015-09-22

    Social changes have rapidly displaced arranged marriages, and it seems the change in marriage pattern has played a role in childbearing. On the other hand, many countries are experiencing a great reduction in population, which requires a comprehensive policy to manage the considerable drop. To achieve this goal, the factors affecting fertility must first be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women aged 15-49 years living in the north of Iran were studied using a cluster sampling strategy. The results showed that there are no significant differences in the reproductive behaviors of the three patterns of marriage in Babol, Iran. It seems there is a convergence in childbearing across the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning.

  8. Smaller, Scale-Free Gene Networks Increase Quantitative Trait Heritability and Result in Faster Population Recovery

    PubMed Central

    Malcom, Jacob W.

    2011-01-01

    One of the goals of biology is to bridge levels of organization. Recent technological advances are enabling us to span from genetic sequence to traits, and then from traits to ecological dynamics. The quantitative genetics parameter heritability describes how quickly a trait can evolve, and in turn describes how quickly a population can recover from an environmental change. Here I propose that we can link the details of the genetic architecture of a quantitative trait—i.e., the number of underlying genes and their relationships in a network—to population recovery rates by way of heritability. I test this hypothesis using a set of agent-based models in which individuals possess one of two network topologies or a linear genotype-phenotype map, 16–256 genes underlying the trait, and a variety of mutation and recombination rates and degrees of environmental change. I find that the network architectures introduce extensive directional epistasis that systematically hides and reveals additive genetic variance and affects heritability: network size, topology, and recombination explain 81% of the variance in average heritability in a stable environment. Network size and topology, the width of the fitness function, pre-change additive variance, and certain interactions account for ∼75% of the variance in population recovery times after a sudden environmental change. These results suggest that not only the amount of additive variance, but importantly the number of loci across which it is distributed, is important in regulating the rate at which a trait can evolve and populations can recover. Taken in conjunction with previous research focused on differences in degree of network connectivity, these results provide a set of theoretical expectations and testable hypotheses for biologists working to span levels of organization from the genotype to the phenotype, and from the phenotype to the environment. PMID:21347400

  9. Method Specific Calibration Corrects for DNA Extraction Method Effects on Relative Telomere Length Measurements by Quantitative PCR.

    PubMed

    Seeker, Luise A; Holland, Rebecca; Underwood, Sarah; Fairlie, Jennifer; Psifidi, Androniki; Ilska, Joanna J; Bagnall, Ainsley; Whitelaw, Bruce; Coffey, Mike; Banos, Georgios; Nussey, Daniel H

    2016-01-01

    Telomere length (TL) is increasingly being used as a biomarker in epidemiological, biomedical and ecological studies. A wide range of DNA extraction techniques have been used in telomere experiments, and recent quantitative PCR (qPCR) based studies suggest that the choice of DNA extraction method may influence average relative TL (RTL) measurements. Such extraction method effects may limit the use of historically collected DNA samples extracted with different methods. However, if extraction method effects are systematic, an extraction method specific (MS) calibrator might be able to correct for them, because systematic effects would influence the calibrator sample in the same way as all other samples. In the present study we tested whether leukocyte RTL in blood samples from Holstein Friesian cattle and Soay sheep measured by qPCR was influenced by DNA extraction method and whether MS calibration could account for any observed differences. We compared two silica membrane-based DNA extraction kits and a salting out method. All extraction methods were optimized to yield enough high quality DNA for TL measurement. In both species we found that silica membrane-based DNA extraction methods produced shorter RTL measurements than the non-membrane-based method when calibrated against an identical calibrator. However, these differences were not statistically detectable when an MS calibrator was used to calculate RTL. This approach produced RTL measurements that were highly correlated across extraction methods (r > 0.76) and had coefficients of variation lower than 10% across plates of identical samples extracted by different methods. Our results are consistent with previous findings that popular membrane-based DNA extraction methods may lead to shorter RTL measurements than non-membrane-based methods. However, we also demonstrate that these differences can be accounted for by using an extraction method-specific calibrator, offering researchers a simple means of accounting for such effects.

  10. Method Specific Calibration Corrects for DNA Extraction Method Effects on Relative Telomere Length Measurements by Quantitative PCR

    PubMed Central

    Holland, Rebecca; Underwood, Sarah; Fairlie, Jennifer; Psifidi, Androniki; Ilska, Joanna J.; Bagnall, Ainsley; Whitelaw, Bruce; Coffey, Mike; Banos, Georgios; Nussey, Daniel H.

    2016-01-01

    Telomere length (TL) is increasingly being used as a biomarker in epidemiological, biomedical and ecological studies. A wide range of DNA extraction techniques have been used in telomere experiments, and recent quantitative PCR (qPCR) based studies suggest that the choice of DNA extraction method may influence average relative TL (RTL) measurements. Such extraction method effects may limit the use of historically collected DNA samples extracted with different methods. However, if extraction method effects are systematic, an extraction method specific (MS) calibrator might be able to correct for them, because systematic effects would influence the calibrator sample in the same way as all other samples. In the present study we tested whether leukocyte RTL in blood samples from Holstein Friesian cattle and Soay sheep measured by qPCR was influenced by DNA extraction method and whether MS calibration could account for any observed differences. We compared two silica membrane-based DNA extraction kits and a salting out method. All extraction methods were optimized to yield enough high quality DNA for TL measurement. In both species we found that silica membrane-based DNA extraction methods produced shorter RTL measurements than the non-membrane-based method when calibrated against an identical calibrator. However, these differences were not statistically detectable when an MS calibrator was used to calculate RTL. This approach produced RTL measurements that were highly correlated across extraction methods (r > 0.76) and had coefficients of variation lower than 10% across plates of identical samples extracted by different methods. Our results are consistent with previous findings that popular membrane-based DNA extraction methods may lead to shorter RTL measurements than non-membrane-based methods. However, we also demonstrate that these differences can be accounted for by using an extraction method-specific calibrator, offering researchers a simple means of accounting for such effects.
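
    A minimal sketch of method-specific calibration for relative telomere length, assuming the common 2^-ΔΔCq formulation with roughly equal amplification efficiencies; the Cq values and method names below are hypothetical and not taken from the study.

```python
def relative_tl(cq_telomere, cq_reference, calibrator):
    """Relative telomere length by the 2^-ddCq approach against a calibrator
    extracted with the same method (assumes ~100% amplification efficiency)."""
    d_cq_sample = cq_telomere - cq_reference
    d_cq_cal = calibrator["telomere"] - calibrator["reference"]
    return 2.0 ** -(d_cq_sample - d_cq_cal)

# Hypothetical Cq values; each extraction method gets its own calibrator sample.
calibrators = {"membrane_kit": {"telomere": 14.2, "reference": 20.1},
               "salting_out": {"telomere": 13.6, "reference": 20.0}}

sample = {"method": "membrane_kit", "telomere": 15.0, "reference": 20.3}
rtl = relative_tl(sample["telomere"], sample["reference"], calibrators[sample["method"]])
print(f"RTL = {rtl:.2f}")
```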

  11. Comparison of Analytic Methods for Quantitative Real-Time Polymerase Chain Reaction Data

    PubMed Central

    Chen, Ping

    2015-01-01

    Abstract Polymerase chain reaction (PCR) is a laboratory procedure to amplify and simultaneously quantify targeted DNA molecules, and then detect the product of the reaction at the end of all the amplification cycles. A more modern technique, real-time PCR, also known as quantitative PCR (qPCR), detects the product after each cycle of the progressing reaction by applying a specific fluorescence technique. The quantitative methods currently used to analyze qPCR data result in varying levels of estimation quality. This study compares the accuracy and precision of the estimation achieved by eight different models when applied to the same qPCR dataset. Also, the study evaluates a newly introduced data preprocessing approach, the taking-the-difference approach, and compares it to the currently used approach of subtracting the background fluorescence. The taking-the-difference method subtracts the fluorescence in the former cycle from that in the latter cycle to avoid estimating the background fluorescence. The results obtained from the eight models show that taking-the-difference is a better way to preprocess qPCR data compared to the original approach because of a reduction in the background estimation error. The results also show that weighted models are better than non-weighted models, and that the precision of the estimation achieved by the mixed models is slightly better than that achieved by the linear regression models. PMID:26204477

  12. Comparison of analytic methods for quantitative real-time polymerase chain reaction data.

    PubMed

    Chen, Ping; Huang, Xuelin

    2015-11-01

    Polymerase chain reaction (PCR) is a laboratory procedure to amplify and simultaneously quantify targeted DNA molecules, and then detect the product of the reaction at the end of all the amplification cycles. A more modern technique, real-time PCR, also known as quantitative PCR (qPCR), detects the product after each cycle of the progressing reaction by applying a specific fluorescence technique. The quantitative methods currently used to analyze qPCR data result in varying levels of estimation quality. This study compares the accuracy and precision of the estimation achieved by eight different models when applied to the same qPCR dataset. Also, the study evaluates a newly introduced data preprocessing approach, the taking-the-difference approach, and compares it to the currently used approach of subtracting the background fluorescence. The taking-the-difference method subtracts the fluorescence in the former cycle from that in the latter cycle to avoid estimating the background fluorescence. The results obtained from the eight models show that taking-the-difference is a better way to preprocess qPCR data compared to the original approach because of a reduction in the background estimation error. The results also show that weighted models are better than non-weighted models, and that the precision of the estimation achieved by the mixed models is slightly better than that achieved by the linear regression models.

  13. A Powerful and Robust Method for Mapping Quantitative Trait Loci in General Pedigrees

    PubMed Central

    Diao, G. ; Lin, D. Y. 

    2005-01-01

    The variance-components model is the method of choice for mapping quantitative trait loci in general human pedigrees. This model assumes normally distributed trait values and includes a major gene effect, random polygenic and environmental effects, and covariate effects. Violation of the normality assumption has detrimental effects on the type I error and power. One possible way of achieving normality is to transform trait values. The true transformation is unknown in practice, and different transformations may yield conflicting results. In addition, the commonly used transformations are ineffective in dealing with outlying trait values. We propose a novel extension of the variance-components model that allows the true transformation function to be completely unspecified. We present efficient likelihood-based procedures to estimate variance components and to test for genetic linkage. Simulation studies demonstrated that the new method is as powerful as the existing variance-components methods when the normality assumption holds; when the normality assumption fails, the new method still provides accurate control of type I error and is substantially more powerful than the existing methods. We performed a genomewide scan of monoamine oxidase B for the Collaborative Study on the Genetics of Alcoholism. In that study, the results that are based on the existing variance-components method changed dramatically when three outlying trait values were excluded from the analysis, whereas our method yielded essentially the same answers with or without those three outliers. The computer program that implements the new method is freely available. PMID:15918154

  14. A quantitative sampling method for Oncomelania quadrasi by filter paper.

    PubMed

    Tanaka, H; Santos, M J; Matsuda, H; Yasuraoka, K; Santos, A T

    1975-08-01

    Filter paper was found to attract Oncomelania quadrasi in waters in the same way as fallen dried banana leaves, although fewer snails of other species were collected on the former than on the latter. Snails were collected in limited areas using a tube sampler (85 cm² cross-sectional area) and a filter paper sampler (20 × 20 cm). The sheet of filter paper was placed close to the spot where a tube sample was taken, and recovered after 24 hours. At each sampling, 30 samples were taken by each method in an area, and sampling was repeated four times. The correlation between the number of snails collected by the tube and that by filter paper was studied. The ratio of the snail counts by the tube sampler to those by the filter paper was 1.18. A loose correlation was observed between snail counts of the two methods, as shown by the correlation coefficient r = 0.6502. The formulas for the regression line were Y = 0.77 X + 1.6 and X = 0.55 Y + 1.35 for 3 experiments, where Y is the number of snails collected by tube sampling and X is the number of snails collected on the sheet of filter paper. The type of snail distribution was studied in the 30 samples taken by each method and was observed to be nearly the same for both sampling methods. All sampling data were found to fit the negative binomial distribution, with the values of the constant k in (q - p)^-k varying widely from 0.5775 to 5.9186. In each experiment, the constant k was always larger in tube sampling than in filter paper sampling. This indicates that the uneven distribution of snails on the soil surface becomes more conspicuous with filter paper sampling.

  15. A quantitative immunopolymerase chain reaction method for detection of vegetative insecticidal protein in genetically modified crops.

    PubMed

    Kumar, Rajesh

    2011-10-12

    Vegetative insecticidal protein (Vip) is being employed for transgenic expression in selected crops such as cotton, brinjal, and corn. For regulatory compliance, there is a need for a sensitive and reliable detection method, which can distinguish between approved and nonapproved genetically modified (GM) events and quantify GM contents as well. A quantitative immunopolymerase chain reaction (IPCR) method has been developed for the detection and quantification of Vip protein in GM crops. The developed assay displayed a detection limit of 1 ng/mL (1 ppb) and linear quantification range between 10 and 1000 ng/mL of Vip-S protein. The sensitivity of the assay was found to be 10 times higher than an analogous enzyme-linked immunosorbent assay for Vip-S protein. The results suggest that IPCR has the potential to become a standard method to quantify GM proteins.

  16. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

    Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of subsequent landslide susceptibility maps, while a particular emphasis was placed on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km²), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated by different inventories, classifiers and predictors appeared markedly different, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful to expose spatially varying inconsistencies of the modelling results while additionally providing evidence for slightly overfitted machine learning-based models. However, the highest predictive performances were obtained for
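
    A schematic of quantitative validation with spatial cross-validation, in the spirit of the study but not its implementation: landslide presence is modelled from synthetic predictors and AUROC is estimated with group-wise folds, where a block id stands in for a spatial cluster so that nearby observations stay in the same fold. All data below are simulated.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Hypothetical landslide data: two terrain predictors, a binary landslide label,
# and a block id standing in for a spatial cluster of grid cells.
n = 2000
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n) > 0).astype(int)
blocks = rng.integers(0, 10, size=n)

aucs = []
for train, test in GroupKFold(n_splits=5).split(X, y, groups=blocks):
    model = LogisticRegression().fit(X[train], y[train])
    aucs.append(roc_auc_score(y[test], model.predict_proba(X[test])[:, 1]))
print(f"spatial CV AUROC: {np.mean(aucs):.3f} +/- {np.std(aucs):.3f}")
```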

  17. A new method for quantitative real-time polymerase chain reaction data analysis.

    PubMed

    Rao, Xiayu; Lai, Dejian; Huang, Xuelin

    2013-09-01

    Quantitative real-time polymerase chain reaction (qPCR) is a sensitive gene quantification method that has been extensively used in biological and biomedical fields. The currently used methods for PCR data analysis, including the threshold cycle method and linear and nonlinear model-fitting methods, all require subtracting background fluorescence. However, the removal of background fluorescence can hardly be accurate and therefore can distort results. We propose a new method, the taking-difference linear regression method, to overcome this limitation. Briefly, for each two consecutive PCR cycles, we subtract the fluorescence in the former cycle from that in the latter cycle, transforming the n cycle raw data into n-1 cycle data. Then, linear regression is applied to the natural logarithm of the transformed data. Finally, PCR amplification efficiencies and the initial DNA molecular numbers are calculated for each reaction. This taking-difference method avoids the error in subtracting an unknown background, and thus it is more accurate and reliable. This method is easy to perform, and this strategy can be extended to all current methods for PCR data analysis.
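
    The taking-difference idea lends itself to a short sketch: for consecutive cycles the constant background cancels in the difference, whose logarithm is linear in cycle number, so a linear fit recovers the amplification efficiency and the initial signal. The exponential-phase window below is assumed for the simulated data; the published method's exact windowing and weighting are not reproduced.

```python
import numpy as np

def taking_difference_fit(fluorescence, window):
    """Fit amplification efficiency E and initial signal F0 from raw qPCR
    fluorescence using the taking-the-difference idea: for consecutive cycles,
    D_n = F_{n+1} - F_n = F0 * E * (1 + E)^n, so ln D_n is linear in n."""
    diffs = np.diff(fluorescence)                 # n cycles -> n-1 differences
    cycles = np.arange(len(diffs))[window]
    log_d = np.log(diffs[window])
    slope, intercept = np.polyfit(cycles, log_d, 1)
    efficiency = np.exp(slope) - 1.0
    f0 = np.exp(intercept) / efficiency
    return efficiency, f0

# Simulated raw data: exponential growth on top of a constant background.
true_e, true_f0, background = 0.95, 1e-4, 2.0
n_cycles = np.arange(40)
raw = background + true_f0 * (1 + true_e) ** n_cycles
window = slice(15, 25)                            # assumed exponential-phase cycles
e_hat, f0_hat = taking_difference_fit(raw, window)
print(f"estimated efficiency {e_hat:.2f}, initial signal {f0_hat:.2e}")
```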

  18. Statistical methods for mapping quantitative trait loci from a dense set of markers.

    PubMed Central

    Dupuis, J; Siegmund, D

    1999-01-01

    Lander and Botstein introduced statistical methods for searching an entire genome for quantitative trait loci (QTL) in experimental organisms, with emphasis on a backcross design and QTL having only additive effects. We extend their results to intercross and other designs, and we compare the power of the resulting test as a function of the magnitude of the additive and dominance effects, the sample size and intermarker distances. We also compare three methods for constructing confidence regions for a QTL: likelihood regions, Bayesian credible sets, and support regions. We show that with an appropriate evaluation of the coverage probability, a support region is approximately a confidence region, and we provide a theoretical explanation of the empirical observation that the size of the support region is proportional to the sample size, not the square root of the sample size, as one might expect from standard statistical theory. PMID:9872974

  19. A novel generalized ridge regression method for quantitative genetics.

    PubMed

    Shen, Xia; Alam, Moudud; Fikse, Freddy; Rönnegård, Lars

    2013-04-01

    As the molecular marker density grows, there is a strong need in both genome-wide association studies and genomic selection to fit models with a large number of parameters. Here we present a computationally efficient generalized ridge regression (RR) algorithm for situations in which the number of parameters largely exceeds the number of observations. The computationally demanding parts of the method depend mainly on the number of observations and not the number of parameters. The algorithm was implemented in the R package bigRR based on the previously developed package hglm. Using such an approach, a heteroscedastic effects model (HEM) was also developed, implemented, and tested. The efficiency for different data sizes was evaluated via simulation. The method was tested for a bacteria-hypersensitive trait in a publicly available Arabidopsis data set including 84 inbred lines and 216,130 SNPs. The computation of all the SNP effects required <10 sec using a single 2.7-GHz core. The advantage in run time makes permutation tests feasible for such a whole-genome model, so that a genome-wide significance threshold can be obtained. HEM was found to be more robust than ordinary RR (a.k.a. SNP-best linear unbiased prediction) in terms of QTL mapping, because SNP-specific shrinkage was applied instead of a common shrinkage. The proposed algorithm was also assessed for genomic evaluation and was shown to give better predictions than ordinary RR.
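
    The claim that the computationally demanding parts scale with the number of observations rather than the number of markers can be illustrated with the dual form of ridge regression, in which only an n x n system is solved. This is a generic sketch on simulated genotypes, not the bigRR or HEM implementation.

```python
import numpy as np

def ridge_dual(X, y, lam):
    """Ridge estimates for p >> n via the dual form
    beta = X^T (X X^T + lam * I_n)^{-1} y,
    so the matrix that is inverted is only n x n."""
    n = X.shape[0]
    alpha = np.linalg.solve(X @ X.T + lam * np.eye(n), y)
    return X.T @ alpha

rng = np.random.default_rng(2)
n, p = 200, 20000                                    # many more markers than observations
X = rng.integers(0, 3, size=(n, p)).astype(float)    # SNP genotypes coded 0/1/2
X -= X.mean(axis=0)                                  # center marker columns
true_beta = np.zeros(p)
true_beta[:10] = rng.normal(size=10)                 # ten causal markers
y = X @ true_beta + rng.normal(size=n)
y -= y.mean()

beta_hat = ridge_dual(X, y, lam=100.0)
print("largest |effects| at markers:", np.argsort(-np.abs(beta_hat))[:10])
```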

  20. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    NASA Astrophysics Data System (ADS)

    Gray, Jeffrey F.; Puri, Ashok

    2007-06-01

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hanchen effect based upon the real world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hanchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green’s function is defined which can be used to determine the effective transfer function of the near critical angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hanchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10⁻⁶, comparable with the recent results reported in the literature.

  1. A Method for Quantitative Phase Analysis of Nanocrystalline Zirconium Dioxide Polymorphs.

    PubMed

    Zhou, Zhiqiang; Guo, Li

    2015-04-01

    A method based on X-ray diffractometry was developed for quantitative phase analysis of nanocrystalline zirconium dioxide polymorphs, and the corresponding formulas were derived. The key factors therein were evaluated by rigorous theoretical calculation and fully verified by experimentation. An iterative process was introduced to allow the experimental verification to proceed in the absence of pure ZrO2 crystal polymorphs. By this method, the weight ratio of tetragonal ZrO2 (t-ZrO2) to monoclinic ZrO2 (m-ZrO2) in any mixture that contains nanocrystalline t-ZrO2 and m-ZrO2, or their weight fractions in a mixture composed of nanocrystalline t-ZrO2 and m-ZrO2, can be determined from a single XRD test. Both theoretical calculation and experiment show that mutual substitution of t-ZrO2 and cubic ZrO2 (c-ZrO2) over a wide range has almost no impact on the XRD patterns of their mixtures. Combined with the similarity in properties of t-ZrO2 and c-ZrO2, this means they can be treated as a single phase. The high agreement of the theoretical and experimental results in this work also proves the validity and reliability of the theoretical calculation based on X-ray diffractometry theory for such quantitative phase analysis. This method has the potential to be extended to other materials.

  2. Quantitative evaluation of material degradation by Barkhausen noise method

    SciTech Connect

    Yamaguchi, Atsunori; Maeda, Noriyoshi; Sugibayashi, Takuya

    1995-12-01

    Evaluating the life of nuclear power plants becomes inevitable when extending the plant operating period. This paper applied the magnetic method using Barkhausen noise (BHN) to detect degradation caused by fatigue and thermal aging. Low alloy steel (SA 508 cl.2) was fatigued at strain amplitudes of ±1% and ±0.4%, and duplex stainless steel (SCS14A) was heated at 400°C for a long period (thermal aging). For the material degraded by thermal aging, BHN was measured, and a good correlation between the magnetic properties and the absorption energy of the material was obtained. For the fatigued material, BHN was measured at each predetermined cycle, the effect of the stress or strain present in the material at the time of measurement was evaluated, and a good correlation between BHN and the fatigue damage ratio was obtained.

  3. Quantitative evaluation of solar wind time-shifting methods

    NASA Astrophysics Data System (ADS)

    Cameron, Taylor; Jackel, Brian

    2016-11-01

    Nine years of solar wind dynamic pressure and geosynchronous magnetic field data are used for a large-scale statistical comparison of uncertainties associated with several different algorithms for propagating solar wind measurements. The MVAB-0 scheme is best overall, performing on average a minute more accurately than a flat time-shift. We also evaluate the accuracy of these time-shifting methods as a function of solar wind magnetic field orientation. We find that all time-shifting algorithms perform significantly worse (>5 min) due to geometric effects when the solar wind magnetic field is radial (parallel or antiparallel to the Earth-Sun line). Finally, we present an empirical scheme that performs almost as well as MVAB-0 on average and slightly better than MVAB-0 for intervals with nonradial B.
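
    For reference, the simplest of the compared approaches, a flat (ballistic) time shift, just divides the monitor-to-target distance along the Earth-Sun line by the measured solar wind speed. The sketch below uses illustrative monitor and target positions and a typical speed; it is not the MVAB-0 algorithm.

```python
def flat_time_shift(x_monitor_km, x_target_km, vx_km_s):
    """Ballistic ('flat') propagation delay of a solar wind measurement from the
    monitor to the target plane, assuming the structure convects with Vx only."""
    return (x_monitor_km - x_target_km) / abs(vx_km_s)

R_E = 6371.0                                   # Earth radius in km
dt = flat_time_shift(x_monitor_km=220 * R_E,   # roughly an L1 monitor, illustrative
                     x_target_km=6.6 * R_E,    # geosynchronous orbit
                     vx_km_s=-400.0)           # typical anti-sunward solar wind speed
print(f"flat time shift ~ {dt / 60:.1f} minutes")
```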

  4. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    SciTech Connect

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary; Geller, Jil; Fisher, Susan; Hall, Steven; Hazen, Terry C.; Brenner, Steven; Butland, Gareth; Jin, Jian; Witkowska, H. Ewa; Chandonia, John-Marc; Biggin, Mark D.

    2016-04-20

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.

  5. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions*

    PubMed Central

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary E.; Geller, Jil T.; Fisher, Susan J.; Hall, Steven C.; Hazen, Terry C.; Brenner, Steven E.; Butland, Gareth; Jin, Jian; Witkowska, H. Ewa; Chandonia, John-Marc; Biggin, Mark D.

    2016-01-01

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR. PMID:27099342

  6. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    DOE PAGES

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; ...

    2016-04-20

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.

  7. Quantitative methods in the tuberculosis epidemiology and in the evaluation of BCG vaccination programs.

    PubMed

    Lugosi, L

    1986-01-01

    Controversies concerning the protective efficacy of the BCG vaccination result mostly from the fact that quantitative methods have not been used in the evaluation of the BCG programs. Therefore, to eliminate the current controversy, an unconditional requirement is to apply valid biostatistical models to analyse the results of the BCG programs. In order to achieve objective statistical inferences and epidemiological interpretations, the following conditions should be fulfilled: data for evaluation have to be taken from epidemiological trials free of sampling error; since the morbidity rates are not normally distributed, an appropriate normalizing transformation is needed for point and confidence interval estimation; only unbiased point estimates (dependent variables) should be used in valid models for hypothesis tests; and, where the null hypothesis is rejected, the ranked estimates of the compared groups must be evaluated in a multiple comparison model in order to reduce the Type I error in the decision. The following quantitative methods are presented to evaluate the effectiveness of BCG vaccination in Hungary: linear regression analysis, stepwise regression analysis and log-linear analysis.
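
    As a rough illustration of the kind of analysis the abstract calls for (not the actual Hungarian evaluation), the sketch below log-transforms hypothetical tuberculosis morbidity rates as a normalizing step and fits a simple linear regression against calendar year with scipy; all data values are invented.

        import numpy as np
        from scipy import stats

        # Hypothetical TB morbidity rates per 100,000 population for successive
        # years (invented values, for illustration only).
        years = np.arange(1970, 1980)
        rates = np.array([48.0, 44.5, 41.0, 38.2, 35.1, 33.0, 30.4, 28.8, 26.9, 25.3])

        # Log transformation as a normalizing step before point and
        # confidence-interval estimation on the regression slope.
        fit = stats.linregress(years, np.log(rates))

        annual_change = (np.exp(fit.slope) - 1) * 100
        ci_half_width = 1.96 * fit.stderr
        print(f"estimated annual change in morbidity: {annual_change:.1f}% per year")
        print(f"approximate 95% CI half-width of the slope (log scale): {ci_half_width:.4f}")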

  8. New methods for quantitative and qualitative facial studies: an overview.

    PubMed

    Thomas, I T; Hintz, R J; Frias, J L

    1989-01-01

    The clinical study of birth defects has traditionally followed the Gestalt approach, with a trend, in recent years, toward more objective delineation. Data collection, however, has been largely restricted to measurements from X-rays and anthropometry. In other fields, new techniques are being applied that capitalize on the use of modern computer technology. One such technique is that of remote sensing, of which photogrammetry is a branch. Cartographers, surveyors and engineers, using specially designed cameras, have applied geometrical techniques to locate points on an object precisely. These techniques, in their long-range application, have become part of our industrial technology and have assumed great importance with the development of satellite-borne surveillance systems. The close-range application of similar techniques has the potential for extremely accurate clinical measurement. We are currently evaluating the application of remote sensing to facial measurement using three conventional 35 mm still cameras. The subject is photographed in front of a carefully measured grid, and digitization is then carried out on the 35-mm slides: specific craniofacial landmarks are identified, along with points on the background grid and the four corners of the slide frame, and are registered as x-y coordinates by a digitizer. These coordinates are then converted into precise locations in object space. The technique is capable of producing measurements to within 1/100th of an inch. We suggest that remote sensing methods such as this may well be of great value in the study of congenital malformations.
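
    The conversion of digitized slide coordinates into object-space locations can be illustrated, under strong simplifications, by fitting a planar homography to the known background-grid points. This generic sketch is not the authors' three-camera photogrammetric solution; all coordinates below are invented.

        import numpy as np

        def fit_homography(img_pts, obj_pts):
            """Estimate a 3x3 planar homography from >= 4 point pairs (DLT)."""
            rows = []
            for (x, y), (X, Y) in zip(img_pts, obj_pts):
                rows.append([x, y, 1, 0, 0, 0, -X * x, -X * y, -X])
                rows.append([0, 0, 0, x, y, 1, -Y * x, -Y * y, -Y])
            _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
            return vt[-1].reshape(3, 3)

        def map_point(h, pt):
            """Map a digitized image point into object-space coordinates."""
            x, y, w = h @ np.array([pt[0], pt[1], 1.0])
            return x / w, y / w

        # Invented digitizer coordinates of four grid corners and their known
        # positions on the measured background grid (e.g. in inches).
        image_grid = [(102, 98), (412, 101), (408, 399), (105, 402)]
        object_grid = [(0, 0), (10, 0), (10, 10), (0, 10)]

        H = fit_homography(image_grid, object_grid)
        print(map_point(H, (255, 250)))   # approximate object-space location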

  9. A quantitative method for photovoltaic encapsulation system optimization

    NASA Technical Reports Server (NTRS)

    Garcia, A., III; Minning, C. P.; Cuddihy, E. F.

    1981-01-01

    It is pointed out that the design of encapsulation systems for flat plate photovoltaic modules requires the fulfillment of conflicting design requirements. An investigation was conducted with the objective to find an approach which will make it possible to determine a system with optimum characteristics. The results of the thermal, optical, structural, and electrical isolation analyses performed in the investigation indicate the major factors in the design of terrestrial photovoltaic modules. For defect-free materials, minimum encapsulation thicknesses are determined primarily by structural considerations. Cell temperature is not strongly affected by encapsulant thickness or thermal conductivity. The emissivity of module surfaces exerts a significant influence on cell temperature. Encapsulants should be elastomeric, and ribs are required on substrate modules. Aluminum is unsuitable as a substrate material. Antireflection coating is required on cell surfaces.

  10. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages

    PubMed Central

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction yet preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (by up to 2-fold) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and “naked” unicellular organisms indicates that our method could obtain the intact DNA of organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance. PMID:26218575

  11. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages.

    PubMed

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction yet preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (by up to 2-fold) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and "naked" unicellular organisms indicates that our method could obtain the intact DNA of organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance.

  12. Quantitative mineralogical composition of complex mineral wastes - Contribution of the Rietveld method

    SciTech Connect

    Mahieux, P.-Y.; Aubert, J.-E.; Cyr, M.; Coutand, M.; Husson, B.

    2010-03-15

    The objective of the work presented in this paper is the quantitative determination of the mineral composition of two complex mineral wastes: a sewage sludge ash (SSA) and a municipal solid waste incineration fly ash (MSWIFA). The mineral compositions were determined by two different methods: the first based on calculation using the qualitative mineralogical composition of the waste combined with physicochemical analyses; the second, the Rietveld method, which uses only X-ray diffraction patterns. The results obtained are coherent, showing that it is possible to quantify the mineral compositions of complex mineral waste with such methods. The apparent simplicity of the Rietveld method (due principally to the availability of software packages implementing the method) facilitates its use. However, care should be taken since the crystal structure analysis based on powder diffraction data needs experience and a thorough understanding of crystallography. So the use of another, complementary method, such as the first one used in this study, may sometimes be needed to confirm the results.

  13. A Comparative Study on Tobacco Cessation Methods: A Quantitative Systematic Review

    PubMed Central

    Heydari, Gholamreza; Masjedi, Mohammadreza; Ahmady, Arezoo Ebn; Leischow, Scott J.; Lando, Harry A.; Shadmehr, Mohammad Behgam; Fadaizadeh, Lida

    2014-01-01

    Background: During recent years, there have been many advances in different types of pharmacological and non-pharmacological tobacco control treatments. In this study, we aimed to identify the most effective smoking cessation methods used in quit attempts, based upon a review of the literature. Methods: We did a search of PubMed, limited to English publications from 2000 to 2012. Two trained reviewers independently assessed titles, abstracts and full texts of articles after a pilot inter-rater reliability assessment which was conducted by the author (GH). The total number of papers, and whether their conclusions recommended the method (positive) or did not support it (negative), was computed for each method. The number of negative papers was subtracted from the number of positive ones for each method. In cases of inconsistency between the two reviewers, these were adjudicated by the author. Results: Of the 932 articles that were critically assessed, 780 studies supported quit smoking methods. In 90 studies, the methods were neither supported nor rejected, and in 62 cases the methods were not supported. Nicotine replacement therapy (NRT), Champix and Zyban with 352, 117 and 71 studies respectively were the most supported methods, and e-cigarettes and non-Nicotine medications with one case were the least supported methods. Finally, NRT with a score of 39, and Champix and education each with a score of 36, were the most supported methods. Conclusions: Results of this review indicate that the scientific papers in the most recent decade recommend the use of NRT and Champix in combination with educational interventions. Additional research is needed to compare qualitative and quantitative studies for smoking cessation. PMID:25013685

  14. Spatial Access Priority Mapping (SAPM) with Fishers: A Quantitative GIS Method for Participatory Planning

    PubMed Central

    Yates, Katherine L.; Schoeman, David S.

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers’ spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers’ willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process.

  15. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    PubMed

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process
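
    SAPM itself is described as relying only on basic spreadsheet and GIS functions; purely as an illustration of the aggregation idea, the sketch below normalizes each fisher's allocated priority points and sums the normalized shares per planning grid cell with pandas. The fisher IDs, cell IDs, and point allocations are invented.

        import pandas as pd

        # Invented survey responses: each fisher distributes priority points
        # over the grid cells they fish in.
        responses = pd.DataFrame({
            "fisher": ["F1", "F1", "F1", "F2", "F2", "F3"],
            "cell":   ["A1", "A2", "B1", "A2", "B2", "A2"],
            "points": [50,   30,   20,   70,   30,   100],
        })

        # Normalize within each fisher so every respondent carries equal weight,
        # then sum the normalized shares per cell to map relative access priority.
        responses["share"] = responses["points"] / responses.groupby("fisher")["points"].transform("sum")
        priority = responses.groupby("cell")["share"].sum().sort_values(ascending=False)
        print(priority)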

  16. The Quantitative Methods Boot Camp: Teaching Quantitative Thinking and Computing Skills to Graduate Students in the Life Sciences

    PubMed Central

    Stefan, Melanie I.; Gutlerner, Johanna L.; Born, Richard T.; Springer, Michael

    2015-01-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a “boot camp” in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students’ engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  17. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    PubMed

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  18. Quantitative analysis of uranium in aqueous solutions using a semiconductor laser-based spectroscopic method.

    PubMed

    Cho, Hye-Ryun; Jung, Euo Chang; Cha, Wansik; Song, Kyuseok

    2013-05-07

    A simple analytical method based on the simultaneous measurement of the luminescence of hexavalent uranium ions (U(VI)) and the Raman scattering of water was investigated for determining the concentration of U(VI) in aqueous solutions. Both spectra were measured using a cw semiconductor laser beam at a center wavelength of 405 nm. The empirical calibration curve for the quantitative analysis of U(VI) was obtained by measuring the ratio of the luminescence intensity of U(VI) at 519 nm to the Raman scattering intensity of water at 469 nm. The limit of detection (LOD) in the parts per billion range and a dynamic range from the LOD up to several hundred parts per million were achieved. The concentration of uranium in groundwater determined by this method is in good agreement with the results determined by kinetic phosphorescence analysis and inductively coupled plasma mass spectrometry.
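
    The calibration step described (the ratio of U(VI) luminescence at 519 nm to the water Raman band at 469 nm plotted against concentration) can be sketched with a straight-line fit and a blank-based detection limit; the intensity ratios and blank replicates below are invented, and the paper's actual calibration is empirical.

        import numpy as np

        # Invented calibration data: U(VI) concentration (ppb) versus the ratio
        # of luminescence at 519 nm to water Raman scattering at 469 nm.
        conc = np.array([0.0, 10.0, 50.0, 100.0, 500.0, 1000.0])
        ratio = np.array([0.002, 0.013, 0.060, 0.118, 0.590, 1.190])

        slope, intercept = np.polyfit(conc, ratio, 1)

        # LOD estimated as 3 x (standard deviation of blank replicates) / slope.
        blank_ratios = np.array([0.0018, 0.0023, 0.0021, 0.0019])
        lod = 3 * blank_ratios.std(ddof=1) / slope
        print(f"slope = {slope:.4e} per ppb, LOD ~ {lod:.2f} ppb")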

  19. Quantitative methods of measuring the sensitivity of the mouse sperm morphology assay

    SciTech Connect

    Moore, D.H.; Bennett, D.E.; Kranzler, D.; Wyrobek, A.J.

    1982-09-01

    In this study murine sperm were subjected to graded doses of X irradiation (0 to 120 rad) to determine whether quantitative measurements made on enlarged photographs of the sperm heads are related to radiation dose. We found that the Mahalanobis distance statistic, when used to measure distance in a multivariate space from a control group of measurements, could be used to classify sperm as normal or abnormal. The percent classified as abnormal by this method was found to be linearly related to dose. The results suggest that sensitivity of the murine sperm assay can be improved by selecting an optimal set of measurements. This improvement can reduce the doubling dose from approximately 70 rad to 10 to 15 rad while keeping the percentage of abnormal sperm in control mice at 3%, equal to the current visual method.
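
    The classification step can be sketched as follows: estimate the mean and covariance of the head measurements from control sperm, compute each sperm's Mahalanobis distance from that centre, and call a sperm abnormal when its squared distance exceeds a chi-square cutoff chosen to keep about 3% of controls abnormal. The measurement variables and values below are invented; only the Mahalanobis-distance idea comes from the abstract.

        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(0)

        # Invented morphometric measurements (e.g. head length, width, area)
        # for control sperm and for sperm from an irradiated animal.
        control = rng.normal(loc=[8.0, 4.0, 25.0], scale=[0.3, 0.2, 1.5], size=(200, 3))
        treated = rng.normal(loc=[8.3, 4.3, 27.0], scale=[0.5, 0.3, 2.5], size=(200, 3))

        mean = control.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(control, rowvar=False))

        def mahalanobis_sq(x):
            d = x - mean
            return np.einsum("ij,jk,ik->i", d, cov_inv, d)

        # Cutoff set so that roughly 3% of control-like sperm are called abnormal.
        cutoff = chi2.ppf(0.97, df=3)
        pct_abnormal = 100 * np.mean(mahalanobis_sq(treated) > cutoff)
        print(f"treated sample: {pct_abnormal:.1f}% classified abnormal")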

  20. Intentional Movement Performance Ability (IMPA): a method for robot-aided quantitative assessment of motor function.

    PubMed

    Shin, Sung Yul; Kim, Jung Yoon; Lee, Sanghyeop; Lee, Junwon; Kim, Seung-Jong; Kim, ChangHwan

    2013-06-01

    The purpose of this paper is to propose a new assessment method for evaluating the motor function of patients who are suffering from physical weakness after stroke, incomplete spinal cord injury (iSCI) or other diseases. In this work, we use a robotic device to obtain information about the interaction that occurs between the patient and the robot, and use it as a measure for assessing the patients. The Intentional Movement Performance Ability (IMPA) is defined as the root mean square of the interactive torque while the subject performs a given periodic movement with the robot. IMPA is proposed to quantitatively determine the level of the subject's impaired motor function. The method is indirectly tested by asking healthy subjects to lift a barbell to disturb their motor function. The experimental result shows that the IMPA has the potential to provide proper information on the subject's level of motor function.
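
    Since IMPA is defined as the root mean square of the interactive torque recorded during a periodic movement, the computation itself is short; the sketch below uses an invented torque trace purely for illustration.

        import numpy as np

        # Invented interactive-torque samples (N*m) recorded while a subject
        # performs a periodic movement with the robot.
        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 10.0, 1000)
        interactive_torque = 0.8 * np.sin(2 * np.pi * 0.5 * t) + rng.normal(0, 0.1, t.size)

        # IMPA: root mean square of the interactive torque over the trial.
        impa = np.sqrt(np.mean(interactive_torque ** 2))
        print(f"IMPA = {impa:.3f} N*m")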

  1. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  2. The Ten Beads Method: A Novel Way to Collect Quantitative Data in Rural Uganda

    PubMed Central

    Bwambale, Francis Mulekya; Moyer, Cheryl A.; Komakech, Innocent; Wabwire-Mangen, Fred; Lori, Jody R

    2013-01-01

    This paper illustrates how locally appropriate methods can be used to collect quantitative data from illiterate respondents. This method uses local beads to represent quantities, which is a novel yet potentially valuable methodological improvement over standard Western survey methods. PMID:25170477

  3. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the
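
    The comparison of proportions reported above (chi-square with Cramer's V and an odds ratio) can be reproduced in outline with scipy on a 2x2 table; the counts below are invented and are not the study's data.

        import numpy as np
        from scipy.stats import chi2_contingency

        # Invented 2x2 table: rows = article type (mixed methods, quantitative),
        # columns = quantitative methodological component (present, absent).
        table = np.array([[103, 367],
                          [2210, 2485]])

        chi2_stat, p, dof, _ = chi2_contingency(table, correction=False)
        odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
        cramers_v = np.sqrt(chi2_stat / table.sum())
        print(f"chi2({dof}) = {chi2_stat:.2f}, p = {p:.4g}, OR = {odds_ratio:.2f}, V = {cramers_v:.2f}")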

  4. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    PubMed

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients has been investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the Dr. Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 was used. Biofilm formation was evaluated using two independent quantitative and qualitative methods with TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet) application. The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative method with TTC, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  5. Proteus mirabilis biofilm - Qualitative and quantitative colorimetric methods-based evaluation

    PubMed Central

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients has been investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the Dr. Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 was used. Biofilm formation was evaluated using two independent quantitative and qualitative methods with TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet) application. The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative method with TTC, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant. PMID:25763050

  6. Dynamic and Quantitative Method of Analyzing Service Consistency Evolution Based on Extended Hierarchical Finite State Automata

    PubMed Central

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in services' evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties in the usage of traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depicts the overall process by which service consistency states change. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, noncomposition of atomic services (13.12%) is the second most influential, and confusion of service versions (1.2%) is the least. Compared with previous qualitative analysis, SCEAs show good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA. PMID:24772033

  7. Dynamic and quantitative method of analyzing service consistency evolution based on extended hierarchical finite state automata.

    PubMed

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in services' evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties in the usage of traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depicts the overall process by which service consistency states change. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, noncomposition of atomic services (13.12%) is the second most influential, and confusion of service versions (1.2%) is the least. Compared with previous qualitative analysis, SCEAs show good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA.

  8. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method.

    PubMed

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high energy fast neutrons from a radioisotope and slowing-down by the light hydrogen atoms, is a useful technique for non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume, and is not affected by temperature, pressure, pH value and color. The most common choice for a radioisotope neutron source is (252)Cf or (241)Am-Be. In this study, (252)Cf with a neutron flux of 6.3x10(6) n/s has been used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra have been obtained by using an in-house built radioisotopic neutron spectrometric system equipped with a (3)He detector and multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of approximately 0.947 g/cc and area of 40 cm x 25 cm) was used for the determination of hydrogen content by using multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered a better performance in a quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system can be promising in the real-time, online monitoring of the powder process to determine the content of any type of molecules containing hydrogen nuclei.
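
    The multivariate-calibration step can be sketched with scikit-learn's PLSRegression on simulated pulse-height spectra; the spectra and hydrogen contents below are invented and bear no relation to the measured polyethylene data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(2)

        # Simulated pulse-height spectra whose shape scales with an invented
        # hydrogen content: 30 calibration samples x 128 channels.
        hydrogen = rng.uniform(10.0, 16.0, size=30)          # wt% (invented)
        channels = np.arange(128)
        spectra = (hydrogen[:, None] * np.exp(-channels / 40.0)
                   + rng.normal(0, 0.2, size=(30, 128)))

        pls = PLSRegression(n_components=3)
        predicted = cross_val_predict(pls, spectra, hydrogen, cv=5).ravel()
        rmsecv = np.sqrt(np.mean((predicted - hydrogen) ** 2))
        print(f"cross-validated RMSE = {rmsecv:.3f} wt% hydrogen")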

  9. Selective Weighted Least Squares Method for Fourier Transform Infrared Quantitative Analysis.

    PubMed

    Wang, Xin; Li, Yan; Wei, Haoyun; Chen, Xia

    2016-10-26

    Classical least squares (CLS) regression is a popular multivariate statistical method used frequently for quantitative analysis using Fourier transform infrared (FT-IR) spectrometry. Classical least squares provides the best unbiased estimator for uncorrelated residual errors with zero mean and equal variance. However, the noise in FT-IR spectra, which accounts for a large portion of the residual errors, is heteroscedastic. Thus, if this noise with zero mean dominates in the residual errors, the weighted least squares (WLS) regression method described in this paper is a better estimator than CLS. However, if bias errors, such as the residual baseline error, are significant, WLS may perform worse than CLS. In this paper, we compare the effect of noise and bias error in using CLS and WLS in quantitative analysis. Results indicated that for wavenumbers with low absorbance, the bias error significantly affected the error, such that the performance of CLS is better than that of WLS. However, for wavenumbers with high absorbance, the noise significantly affected the error, and WLS proves to be better than CLS. Thus, we propose a selective weighted least squares (SWLS) regression that processes data with different wavenumbers using either CLS or WLS based on a selection criterion, i.e., lower or higher than an absorbance threshold. The effects of various factors on the optimal threshold value (OTV) for SWLS have been studied through numerical simulations. These studies reported that: (1) the concentration and the analyte type had minimal effect on OTV; and (2) the major factor that influences OTV is the ratio between the bias error and the standard deviation of the noise. The last part of this paper is dedicated to quantitative analysis of methane gas spectra, and methane/toluene mixtures gas spectra as measured using FT-IR spectrometry and CLS, WLS, and SWLS. The standard error of prediction (SEP), bias of prediction (bias), and the residual sum of squares of the errors
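
    One way to realize the selection logic, shown here purely as a sketch with an invented single-component spectrum and heteroscedastic noise, is to estimate the concentration from the low-absorbance channels by unweighted (CLS-style) least squares and from the high-absorbance channels by noise-variance-weighted (WLS) least squares, switching at an absorbance threshold. The threshold, spectra, and noise model below are assumptions, not the authors' implementation.

        import numpy as np

        rng = np.random.default_rng(3)

        # Invented pure-component reference spectrum k (absorbance per unit
        # concentration) and a measured spectrum whose noise grows with absorbance.
        wavenumbers = np.linspace(1000, 1200, 200)
        k = np.exp(-((wavenumbers - 1100) / 25.0) ** 2)
        true_conc = 0.8
        noise_sd = 0.002 + 0.02 * k
        measured = true_conc * k + rng.normal(0, noise_sd)

        threshold = 0.3                      # illustrative absorbance cutoff
        low = k < threshold                  # CLS below threshold, WLS above

        c_cls = np.sum(k[low] * measured[low]) / np.sum(k[low] ** 2)
        w = 1.0 / noise_sd[~low] ** 2
        c_wls = np.sum(w * k[~low] * measured[~low]) / np.sum(w * k[~low] ** 2)

        print(f"CLS (low-absorbance channels):  {c_cls:.4f}")
        print(f"WLS (high-absorbance channels): {c_wls:.4f}  (true value {true_conc})")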

  10. An Improved Flow Cytometry Method For Precise Quantitation Of Natural-Killer Cell Activity

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Nehlsen-Cannarella, Sandra; Sams, Clarence

    2006-01-01

    The ability to assess NK cell cytotoxicity using flow cytometry has been previously described and can serve as a powerful tool to evaluate effector immune function in the clinical setting. Previous methods used membrane-permeable dyes to identify target cells. The use of these dyes requires great care to achieve optimal staining and results in a broad spectral emission that can make multicolor cytometry difficult. Previous methods have also used negative staining (the elimination of target cells) to identify effector cells. This makes a precise quantitation of effector NK cells impossible due to the interfering presence of T and B lymphocytes, and makes the data highly subject to the variable levels of NK cells normally found in human peripheral blood. In this study an improved version of the standard flow cytometry assay for NK activity is described that has several advantages over previous methods. Fluorescent antibody staining (CD45-FITC) is used to positively identify target cells in place of membrane-permeable dyes. Fluorescent antibody staining of target cells is less labor intensive and more easily reproducible than membrane dyes. NK cells (true effector lymphocytes) are also positively identified by fluorescent antibody staining (CD56-PE), allowing a simultaneous absolute count assessment of both NK cells and target cells. Dead cells are identified by membrane disruption using the DNA intercalating dye PI. Using this method, an exact NK:target ratio may be determined for each assessment, including quantitation of NK:target complexes. Back-immunoscatter gating may be used to track live vs. dead target cells via scatter properties. If desired, NK activity may then be normalized to standardized ratios for clinical comparisons between patients, making the determination of PBMC counts or NK cell percentages prior to testing unnecessary. This method provides an exact cytometric determination of NK activity that is highly reproducible and may be suitable for routine use in the

  11. Methods Used by Pre-Service Nigeria Certificate in Education Teachers in Solving Quantitative Problems in Chemistry

    ERIC Educational Resources Information Center

    Danjuma, Ibrahim Mohammed

    2011-01-01

    This paper reports part of the results of research on chemical problem solving behavior of pre-service teachers in Plateau and Northeastern states of Nigeria. Specifically, it examines and describes the methods used by 204 pre-service teachers in solving quantitative problems from four topics in chemistry. Namely, gas laws; electrolysis;…

  12. Nanoparticle-mediated photothermal effect enables a new method for quantitative biochemical analysis using a thermometer.

    PubMed

    Fu, Guanglei; Sanjay, Sharma T; Dou, Maowei; Li, XiuJun

    2016-03-14

    A new biomolecular quantitation method, nanoparticle-mediated photothermal bioassay, using a common thermometer as the signal reader was developed. Using an immunoassay as a proof of concept, iron oxide nanoparticles (NPs) captured in the sandwich-type assay system were transformed into a near-infrared (NIR) laser-driven photothermal agent, Prussian blue (PB) NPs, which acted as a photothermal probe to convert the assay signal into heat through the photothermal effect, thus allowing sensitive biomolecular quantitation using a thermometer. This is the first report of biomolecular quantitation using a thermometer and also serves as the first attempt to introduce the nanoparticle-mediated photothermal effect for bioassays.

  13. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy-dispersive X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using the Friedman test followed by the Wilcoxon signed-rank test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded results similar to those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating results similar to those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield results similar to both the SEM analysis and elemental mapping.
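
    The statistical workflow named above (a Friedman test across the scoring approaches followed by pairwise Wilcoxon signed-rank tests with Bonferroni correction) can be sketched with scipy; the ARI scores below are randomly generated stand-ins, not the study's data.

        import numpy as np
        from itertools import combinations
        from scipy.stats import friedmanchisquare, wilcoxon

        # Invented ARI scores (0-3) for 20 brackets under three scoring approaches.
        rng = np.random.default_rng(4)
        visual = rng.integers(0, 4, size=20)
        sem = np.clip(visual + rng.integers(-1, 2, size=20), 0, 3)
        elemental = np.clip(visual + rng.integers(0, 2, size=20), 0, 3)

        methods = {"visual": visual, "SEM": sem, "elemental": elemental}
        stat, p = friedmanchisquare(*methods.values())
        print(f"Friedman: chi2 = {stat:.2f}, p = {p:.4f}")

        # Pairwise Wilcoxon signed-rank tests with Bonferroni correction.
        pairs = list(combinations(methods, 2))
        for a, b in pairs:
            if np.any(methods[a] != methods[b]):     # wilcoxon needs non-zero differences
                _, p_pair = wilcoxon(methods[a], methods[b])
                print(f"{a} vs {b}: Bonferroni-corrected p = {min(p_pair * len(pairs), 1.0):.4f}")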

  14. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to reach 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using Bt176 corn-containing test samples and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 +/- 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials.
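
    The genome-size dependence of the relative LOQ can be made explicit with a simple relation: if roughly N quantifiable target copies are needed and amplification starts from a total mass m_total of plant DNA, then LOQ% is approximately 100 * N * m_copy / m_total, where m_copy is the genomic DNA mass carrying one target copy (for a hemizygous single-copy event, on the order of the 2C genome mass). The sketch below uses illustrative round-number genome masses, so the printed values only approximate the figures quoted in the abstract.

        # Relative LOQ of a GMO-specific real-time PCR as a function of genome size.
        # Genome masses (pg of DNA per target copy) are illustrative assumptions.
        N_LOQ = 40              # quantifiable target copies (abstract: 30-50)
        m_total_pg = 200_000    # 200 ng of genomic plant DNA in the reaction

        mass_per_copy_pg = {"rice": 0.9, "soybean": 2.3, "corn": 5.5, "wheat": 33.0}

        for species, m_copy in mass_per_copy_pg.items():
            loq_percent = 100 * N_LOQ * m_copy / m_total_pg
            print(f"{species:8s} LOQ ~ {loq_percent:.3f}% GMO")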

  15. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods.

    PubMed

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-15

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).

  16. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods

    NASA Astrophysics Data System (ADS)

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-01

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).

  17. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  18. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequences analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fire ball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the basic data actually available for the gas pipelines and the precision requirements of the risk assessment.
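
    The qualitative method's index system (causation, inherent risk, and consequence indices with corresponding weights) amounts to a weighted aggregation, sketched below with invented scores and weights; the actual indices and weights are defined in the paper.

        # Invented index scores (0-100) and weights for one pipeline segment.
        indices = {"causation": 62.0, "inherent_risk": 45.0, "consequence": 78.0}
        weights = {"causation": 0.35, "inherent_risk": 0.25, "consequence": 0.40}

        qualitative_risk = sum(indices[k] * weights[k] for k in indices)
        print(f"qualitative risk value: {qualitative_risk:.1f}")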

  19. Rapid quantitative analysis of lipids using a colorimetric method in a microplate format.

    PubMed

    Cheng, Yu-Shen; Zheng, Yi; VanderGheynst, Jean S

    2011-01-01

    A colorimetric sulfo-phospho-vanillin (SPV) method was developed for high throughput analysis of total lipids. The developed method uses a reaction mixture that is maintained in a 96-well microplate throughout the entire assay. The new assay provides the following advantages over other methods of lipid measurement: (1) background absorbance can be easily corrected for each well, (2) there is less risk of handling and transferring sulfuric acid contained in reaction mixtures, (3) color develops more consistently providing more accurate measurement of absorbance, and (4) the assay can be used for quantitative measurement of lipids extracted from a wide variety of sources. Unlike other spectrophotometric approaches that use fluorescent dyes, the optimal spectra and reaction conditions for the developed assay do not vary with the sample source. The developed method was used to measure lipids in extracts from four strains of microalgae. No significant difference was found in lipid determination when lipid content was measured using the new method and compared to results obtained using a macro-gravimetric method.

  20. How to use linear regression and correlation in quantitative method comparison studies.

    PubMed

    Twomey, P J; Kroll, M H

    2008-04-01

    Linear regression methods try to determine the best linear relationship between data points, while correlation coefficients assess the association (as opposed to agreement) between the two methods. Linear regression and correlation play an important part in the interpretation of quantitative method comparison studies. Their major strength is that they are widely known and as a result both are employed in the vast majority of method comparison studies. While previously performed by hand, the availability of statistical packages means that regression analysis is usually performed by software packages including MS Excel, with or without the software program Analyze-it, as well as by other software packages. Such techniques need to be employed in a way that assesses the agreement between the two methods examined and, more importantly, because we are dealing with individual patients, whether the degree of agreement is clinically acceptable. Despite their use for many years, there is a lot of ignorance about the validity as well as the pros and cons of linear regression and correlation techniques. This review article describes the types of linear regression and correlation (parametric and non-parametric methods) and the necessary general and specific requirements. The selection of the type of regression depends on where one has been trained, the tradition of the laboratory and the availability of adequate software.
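
    As an illustration of the regression and correlation statistics discussed here, the sketch below computes an ordinary least-squares fit, the Pearson correlation, and a Deming regression slope (a parametric errors-in-variables alternative, assuming an error-variance ratio of 1) for invented paired measurements from two methods.

        import numpy as np
        from scipy.stats import linregress, pearsonr

        rng = np.random.default_rng(5)

        # Invented paired results from two quantitative methods on 50 specimens.
        truth = rng.uniform(2.0, 12.0, 50)
        method_x = truth + rng.normal(0, 0.3, 50)
        method_y = 1.05 * truth - 0.2 + rng.normal(0, 0.3, 50)

        ols = linregress(method_x, method_y)
        r, _ = pearsonr(method_x, method_y)

        # Deming regression: errors in both methods, variance ratio lambda = 1.
        lam = 1.0
        sxx = np.var(method_x, ddof=1)
        syy = np.var(method_y, ddof=1)
        sxy = np.cov(method_x, method_y, ddof=1)[0, 1]
        slope = ((syy - lam * sxx) + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
        intercept = method_y.mean() - slope * method_x.mean()

        print(f"OLS slope {ols.slope:.3f}, Pearson r {r:.3f}")
        print(f"Deming slope {slope:.3f}, intercept {intercept:.3f}")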

  1. Defining the knee joint flexion-extension axis for purposes of quantitative gait analysis: an evaluation of methods.

    PubMed

    Schache, Anthony G; Baker, Richard; Lamoreux, Larry W

    2006-08-01

    Minimising measurement variability associated with hip axial rotation and avoiding knee joint angle cross-talk are two fundamental objectives of any method used to define the knee joint flexion-extension axis for purposes of quantitative gait analysis. The aim of this experiment was to compare three different methods of defining this axis: the knee alignment device (KAD) method, a method based on the transepicondylar axis (TEA) and an alternative numerical method (Dynamic). The former two methods are common approaches that have been applied clinically in many quantitative gait analysis laboratories; the latter is an optimisation procedure. A cohort of 20 subjects performed three different functional tasks (normal gait; squat; non-weight bearing knee flexion) on repeated occasions. Three-dimensional hip and knee angles were computed using the three alternative methods of defining the knee joint flexion-extension axis. The repeatability of hip axial rotation measurements during normal gait was found to be significantly better for the Dynamic method (p<0.01). Furthermore, both the variance in the knee varus-valgus kinematic profile and the degree of knee joint angle cross-talk were smallest for the Dynamic method across all functional tasks. The Dynamic method therefore provided superior results in comparison to the KAD and TEA-based methods and thus represents an attractive solution for orientating the knee joint flexion-extension axis for purposes of quantitative gait analysis.

  2. The use of electromagnetic induction methods for establishing quantitative permafrost models in West Greenland

    NASA Astrophysics Data System (ADS)

    Ingeman-Nielsen, Thomas; Brandt, Inooraq

    2010-05-01

    The sedimentary settings at West Greenlandic town and infrastructural development sites are dominated by fine-grained marine deposits of late to post glacial origin. Prior to permafrost formation, these materials were leached by percolating precipitation, resulting in depletion of salts. Present day permafrost in these deposits is therefore very ice-rich with ice contents approaching 50-70% vol. in some areas. Such formations are of great concern in building and construction projects in Greenland, as they lose strength and bearing capacity upon thaw. It is therefore of both technical and economic interest to develop methods to precisely investigate and determine parameters such as ice-content and depth to bedrock in these areas. In terms of geophysical methods for near surface investigations, traditional methods such as Electrical Resistivity Tomography (ERT) and Refraction Seismics (RS) have generally been applied with success. The Georadar method usually fails due to very limited penetration depth in the fine-grained materials, and Electromagnetic Induction (EMI) methods are seldom applicable for quantitative interpretation due to the very high resistivities causing low induced currents and thus small secondary fields. Nevertheless, in some areas of Greenland the marine sequence was exposed relatively late, and as a result the sediments may not be completely leached of salts. In such cases, layers with pore water salinity approaching that of sea water may be present below an upper layer of very ice rich permafrost. The saline pore water causes a freezing-point depression which results in technically unfrozen sediments at permafrost temperatures around -3 °C. Traditional ERT and VES measurements are severely affected by equivalency problems in these settings, practically prohibiting reasonable quantitative interpretation without constraining information. Such prior information may be obtained of course from boreholes, but equipment capable of drilling

  3. Effects of DNA extraction and purification methods on real-time quantitative PCR analysis of Roundup Ready soybean.

    PubMed

    Demeke, Tigst; Ratnayaka, Indira; Phan, Anh

    2009-01-01

    The quality of DNA affects the accuracy and repeatability of quantitative PCR results. Different DNA extraction and purification methods were compared for quantification of Roundup Ready (RR) soybean (event 40-3-2) by real-time PCR. DNA was extracted using cetyltrimethylammonium bromide (CTAB), DNeasy Plant Mini Kit, and Wizard Magnetic DNA purification system for food. CTAB-extracted DNA was also purified using the Zymo (DNA Clean & Concentrator 25 kit), Qtip 100 (Qiagen Genomic-Tip 100/G), and QIAEX II Gel Extraction Kit. The CTAB extraction method provided the largest amount of DNA, and the Zymo purification kit resulted in the highest percentage of DNA recovery. The Abs260/280 and Abs260/230 ratios were less than the expected values for some of the DNA extraction and purification methods used, indicating the presence of substances that could inhibit PCR reactions. Real-time quantitative PCR results were affected by the DNA extraction and purification methods used. Further purification or dilution of the CTAB DNA was required for successful quantification of RR soybean. Less variability of quantitative PCR results was observed among experiments and replications for DNA extracted and/or purified by CTAB, CTAB+Zymo, CTAB+Qtip 100, and DNeasy methods. Correct and repeatable results for real-time PCR quantification of RR soybean were achieved using CTAB DNA purified with Zymo and Qtip 100 methods.

  4. Quantitative interferometric microscopic flow cytometer with expanded principal component analysis method

    NASA Astrophysics Data System (ADS)

    Wang, Shouyu; Jin, Ying; Yan, Keding; Xue, Liang; Liu, Fei; Li, Zhenhua

    2014-11-01

    Quantitative interferometric microscopy is used in biological and medical fields, and a wealth of applications has been proposed to detect different kinds of biological samples. Here, we develop a phase-detecting cytometer based on quantitative interferometric microscopy with an expanded principal component analysis phase retrieval method to obtain phase distributions of red blood cells with a spatial resolution of ~1.5 μm. Since the expanded principal component analysis method is a time-domain phase retrieval algorithm, it avoids the disadvantages of traditional frequency-domain algorithms. Additionally, the phase retrieval method realizes high-speed phase imaging from multiple microscopic interferograms captured by a CCD camera as the biological cells are scanned through the field of view. We believe this method can be a powerful tool to quantitatively measure the phase distributions of different biological samples in biological and medical fields.

  5. An improved simple colorimetric method for quantitation of non-transferrin-bound iron in serum.

    PubMed

    Zhang, D; Okada, S; Kawabata, T; Yasuda, T

    1995-03-01

    A simple method for direct quantitation of non-transferrin-bound iron (NTBI) in serum is introduced. NTBI was separated from serum by adding excess nitrilotriacetic acid disodium salt (NTA) to form an Fe-NTA complex, which was then ultrafiltered using a micro-filter. The NTBI in the ultrafiltrate was quantitated using a bathophenanthroline-based method. The optimal detection conditions and several potential confounding factors were investigated. Measurements of in vivo and in vitro samples showed that the method is practical.

  6. Agreement experiments: a method for quantitatively testing new medical image display approaches

    NASA Astrophysics Data System (ADS)

    Johnston, Richard E.; Yankaskas, Bonnie C.; Perry, John R.; Pizer, Stephen M.; Delany, David J.; Parker, L. A.

    1990-08-01

    New medical image display devices or processes are commonly evaluated by anecdotal reports or subjective evaluations, which are informative and relatively easy to acquire but do not provide quantitative measures. On the other hand, experiments employing ROC analysis yield quantitative measurements but are very laborious and demand pathological proof of outcome. We have designed and are employing a new approach, which we have termed "agreement experiments," to quantitatively test the equivalence of observer performance on two systems. This was specifically developed to test whether a radiologist using a new display technique, which has some clear advantages over the standard technique, will detect and interpret diagnostic signs as he would with the standard display technique. Agreement experiments use checklists and confidence ratings to measure how well two radiologists agree on the presence of diagnostic signs when both view images on the standard display. This yields a baseline measure of agreement. Agreement measurements are then obtained when the two radiologists view cases using the new display, or display method, compared to the standard technique. If the levels of agreement when one reads from the new and one reads from the standard display are not statistically different from the baseline measures of agreement, we conclude the two systems are equivalent in conveying diagnostic signs. We will report on an experiment using this test. The experiment compares the agreement of radiological findings for chest CT studies viewed on the conventional multiformat film/lightbox to agreement of radiological findings from chest CT images presented on a multiple-screen video system. The study consists of 80 chest CT studies. The results showed 81% to 86% agreement between the two viewing modalities, which fell within our criteria for showing agreement.
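
    As a rough illustration of the agreement statistic described above, the sketch below computes per-sign agreement rates for two readers under a baseline condition and a test condition, using synthetic confidence ratings collapsed to present/absent; the case counts, rating scale and threshold are assumptions, not values from the study.

        # Sketch (not the authors' code): per-sign agreement between two readers,
        # computed for a baseline condition (both on the standard display) and a
        # test condition (one reader on the new display). Ratings are hypothetical
        # confidence scores 1-5, collapsed to present (>=3) / absent (<3).
        import numpy as np

        rng = np.random.default_rng(0)

        def agreement_rate(reader_a, reader_b, threshold=3):
            """Fraction of checklist items on which the two readers agree."""
            a = np.asarray(reader_a) >= threshold
            b = np.asarray(reader_b) >= threshold
            return np.mean(a == b)

        # Hypothetical confidence ratings: 80 cases x 10 checklist signs, flattened.
        baseline_a = rng.integers(1, 6, size=800)
        baseline_b = np.where(rng.random(800) < 0.85, baseline_a, rng.integers(1, 6, size=800))
        test_a = rng.integers(1, 6, size=800)
        test_b = np.where(rng.random(800) < 0.82, test_a, rng.integers(1, 6, size=800))

        base = agreement_rate(baseline_a, baseline_b)
        test = agreement_rate(test_a, test_b)
        print(f"baseline agreement: {base:.2%}, new-display agreement: {test:.2%}")
        # If the two rates are not statistically different (e.g., by a two-proportion
        # z-test), the displays would be judged equivalent in conveying diagnostic signs.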

  7. Quantitative Analysis of Single Amino Acid Variant Peptides Associated with Pancreatic Cancer in Serum by an Isobaric Labeling Quantitative Method

    PubMed Central

    2015-01-01

    Single amino acid variations are highly associated with many human diseases. The direct detection of peptides containing single amino acid variants (SAAVs) derived from nonsynonymous single nucleotide polymorphisms (SNPs) in serum can provide unique opportunities for SAAV-associated biomarker discovery. In the present study, an isobaric labeling quantitative strategy was applied to identify and quantify variant peptides in serum samples of pancreatic cancer patients and other benign controls. The largest set of SAAV peptides in serum to date, comprising 96 unique variant peptides, was quantified in this analysis; five of these variant peptides showed a statistically significant difference between pancreatic cancer and other controls (p-value < 0.05). Significant differences in the variant peptide SDNCEDTPEAGYFAVAVVK from serotransferrin were detected between pancreatic cancer and controls, which was further validated by selected reaction monitoring (SRM) analysis. The novel biomarker panel obtained by combining α-1-antichymotrypsin (AACT), Thrombospondin-1 (THBS1) and this variant peptide showed an excellent diagnostic performance in discriminating pancreatic cancer from healthy controls (AUC = 0.98) and chronic pancreatitis (AUC = 0.90). These results suggest that large-scale analysis of SAAV peptides in serum may provide a new direction for biomarker discovery research. PMID:25393578

  8. A simple method for the subnanomolar quantitation of seven ophthalmic drugs in the rabbit eye.

    PubMed

    Latreille, Pierre-Luc; Banquy, Xavier

    2015-05-01

    This study describes the development and validation of a new liquid chromatography-tandem mass spectrometry (MS/MS) method capable of simultaneous quantitation of seven ophthalmic drugs-pilocarpine, lidocaine, atropine, proparacaine, timolol, prednisolone, and triamcinolone acetonide-within regions of the rabbit eye. The complete validation of the method was performed using an Agilent 1100 series high-performance liquid chromatography system coupled to a 4000 QTRAP MS/MS detector in positive TurboIonSpray mode with pooled drug solutions. The method sensitivity, evaluated by the lower limit of quantitation in two simulated matrices, yielded lower limits of quantitation of 0.25 nmol L(-1) for most of the drugs. The precision in the low, medium, and high ranges of the calibration curves, the freeze-thaw stability over 1 month, the intraday precision, and the interday precision were all within a 15% limit. The method was used to quantitate the different drugs in the cornea, aqueous humor, vitreous humor, and remaining eye tissues of the rabbit eye. It was validated to a concentration of up to 1.36 ng/g in humors and 5.43 ng/g in tissues. The unprecedented low detection limit of the present method and its ease of implementation allow easy, robust, and reliable quantitation of multiple drugs for rapid in vitro and in vivo evaluation of the local pharmacokinetics of these compounds.

  9. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry

    PubMed Central

    Chavez, Juan D.; Eng, Jimmy K.; Schweppe, Devin K.; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E.

    2016-01-01

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as x-ray crystallography, NMR and cryo-electron microscopy[1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise ultimately limiting more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open source software package Skyline, for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that is supported by and validates previously published data on quantified cross-linked peptide pairs. This advance provides an easy to use resource so that any lab with access to a LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions. PMID:27997545

  10. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    PubMed

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms.
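
    A minimal sketch of the kind of image analysis described above (not the published algorithm): threshold a stained photograph and report the stained-area fraction and mean stain intensity as a proxy for biofilm accumulation. The image, threshold rule and intensity values are invented for illustration.

        # Minimal sketch (not the published algorithm): quantify stained biofilm
        # coverage from a grayscale photograph by thresholding and reporting the
        # stained-area fraction and mean stain intensity. The image here is synthetic.
        import numpy as np

        def quantify_biofilm(gray_image, threshold=None):
            """Return (stained_fraction, mean_intensity_of_stained_pixels)."""
            img = np.asarray(gray_image, dtype=float)
            if threshold is None:
                # Simple global threshold halfway between background and peak signal;
                # a real pipeline might use Otsu's method or a calibrated value.
                threshold = 0.5 * (img.min() + img.max())
            mask = img > threshold
            frac = mask.mean()
            mean_intensity = img[mask].mean() if mask.any() else 0.0
            return frac, mean_intensity

        # Synthetic 200x200 "photograph": dark background with a brighter stained patch.
        rng = np.random.default_rng(1)
        img = rng.normal(30, 5, size=(200, 200))
        img[60:140, 50:150] += 120  # hypothetical stained region
        frac, intensity = quantify_biofilm(img)
        print(f"stained fraction: {frac:.2f}, mean stained intensity: {intensity:.1f}")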

  11. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGES

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; ...

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision of the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  12. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    SciTech Connect

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision of the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  13. Hotspot Identification for Shanghai Expressways Using the Quantitative Risk Assessment Method

    PubMed Central

    Chen, Can; Li, Tienan; Sun, Jian; Chen, Feng

    2016-01-01

    Hotspot identification (HSID) is the first and key step of the expressway safety management process. This study presents a new HSID method using the quantitative risk assessment (QRA) technique. Crashes that are likely to happen for a specific site are treated as the risk. The aggregation of the crash occurrence probability for all exposure vehicles is estimated based on the empirical Bayesian method. As for the consequences of crashes, crashes may not only cause direct losses (e.g., occupant injuries and property damages) but also result in indirect losses. The indirect losses are expressed by the extra delays calculated using the deterministic queuing diagram method. The direct losses and indirect losses are uniformly monetized to be considered as the consequences of this risk. The potential costs of crashes, as a criterion to rank high-risk sites, can be explicitly expressed as the sum of the crash probability for all passing vehicles and the corresponding consequences of crashes. A case study on the urban expressways of Shanghai is presented. The results show that the new QRA method for HSID enables the identification of a set of high-risk sites that truly reveal the potential crash costs to society. PMID:28036009

  14. Quantitative measurement of ultrasound pressure field by optical phase contrast method and acoustic holography

    NASA Astrophysics Data System (ADS)

    Oyama, Seiji; Yasuda, Jun; Hanayama, Hiroki; Yoshizawa, Shin; Umemura, Shin-ichiro

    2016-07-01

    A fast and accurate measurement of an ultrasound field with various exposure sequences is necessary to ensure the efficacy and safety of various ultrasound applications in medicine. The most common method used to measure an ultrasound pressure field, that is, hydrophone scanning, requires a long scanning time and potentially disturbs the field. This may limit the efficiency of developing applications of ultrasound. In this study, an optical phase contrast method enabling fast and noninterfering measurements is proposed. In this method, the modulated phase of light caused by the focused ultrasound pressure field is measured. Then, a computed tomography (CT) algorithm used to quantitatively reconstruct a three-dimensional (3D) pressure field is applied. For a high-intensity focused ultrasound field, a new approach that combines the optical phase contrast method and acoustic holography was attempted. First, the optical measurement of focused ultrasound was rapidly performed over the field near a transducer. Second, the nonlinear propagation of the measured ultrasound was simulated. The result of the new approach agreed well with that of the measurement using a hydrophone and was improved from that of the phase contrast method alone with phase unwrapping.

  15. Comparison of Overlap Methods for Quantitatively Synthesizing Single-Subject Data

    ERIC Educational Resources Information Center

    Wolery, Mark; Busick, Matthew; Reichow, Brian; Barton, Erin E.

    2010-01-01

    Four overlap methods for quantitatively synthesizing single-subject data were compared to visual analysts' judgments. The overlap methods were percentage of nonoverlapping data, pairwise data overlap squared, percentage of data exceeding the median, and percentage of data exceeding a median trend. Visual analysts made judgments about 160 A-B data…

  16. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    ERIC Educational Resources Information Center

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a…

  17. A method for the quantitative determination of crystalline phases by X-ray

    NASA Technical Reports Server (NTRS)

    Petzenhauser, I.; Jaeger, P.

    1988-01-01

    A mineral analysis method is described for rapid quantitative determination of crystalline substances in those cases in which the sample is present in pure form or in a mixture of known composition. With this method there is no need for prior chemical analysis.

  18. Student Performance in a Quantitative Methods Course under Online and Face-to-Face Delivery

    ERIC Educational Resources Information Center

    Verhoeven, Penny; Wakeling, Victor

    2011-01-01

    In a study conducted at a large public university, the authors assessed, for an upper-division quantitative methods business core course, the impact of delivery method (online versus face-to-face) on the success rate (percentage of enrolled students earning a grade of A, B, or C in the course). The success rate of the 161 online students was 55.3%,…

  19. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  20. Integrating Qualitative Methods in a Predominantly Quantitative Evaluation: A Case Study and Some Reflections.

    ERIC Educational Resources Information Center

    Mark, Melvin M.; Feller, Irwin; Button, Scott B.

    1997-01-01

    A review of qualitative methods used in a predominantly quantitative evaluation indicates a variety of roles for such a mixing of methods, including framing and revising research questions, assessing the validity of measures and adaptations to program implementation, and gauging the degree of uncertainty and generalizability of conclusions.…

  1. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    ERIC Educational Resources Information Center

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…

  2. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses

    PubMed Central

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated to the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by immediate detection and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425

  3. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Dorssaeler, Alain Van; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-03-01

    This data article describes a controlled, spiked proteomic dataset for which the "ground truth" of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tool parameters, but also for testing new algorithms for label-free quantitative analysis, or for evaluating downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in detail in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tool results for the detection of variant proteins with different absolute expression levels and fold change values.

  4. Using the Taguchi method for rapid quantitative PCR optimization with SYBR Green I.

    PubMed

    Thanakiatkrai, Phuvadol; Welch, Lindsey

    2012-01-01

    Here, we applied the Taguchi method, an engineering optimization process, to successfully determine the optimal conditions for three SYBR Green I-based quantitative PCR assays. This method balanced the effects of all factors and their associated levels by using an orthogonal array rather than a factorial array. Instead of running 27 experiments with the conventional factorial method, the Taguchi method achieved the same optimal conditions using only nine experiments, saving valuable resources.
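
    The saving from 27 to 9 experiments comes from replacing the full 3-factor, 3-level factorial with the standard L9 orthogonal array. The sketch below lists the L9 runs for three hypothetical qPCR factors; the factor names and levels are illustrative assumptions, not those of the cited assays.

        # Sketch of why the Taguchi approach needs only 9 runs: the standard L9(3^4)
        # orthogonal array covers three factors at three levels in 9 balanced runs,
        # versus 27 runs for the full factorial. Factor names and levels below are
        # hypothetical examples, not those used in the cited study.
        import itertools

        L9 = [  # rows: runs; columns: factor level indices (first 3 of the 4 L9 columns)
            (0, 0, 0), (0, 1, 1), (0, 2, 2),
            (1, 0, 1), (1, 1, 2), (1, 2, 0),
            (2, 0, 2), (2, 1, 0), (2, 2, 1),
        ]

        factors = {
            "primer (nM)": [100, 300, 500],
            "MgCl2 (mM)": [1.5, 2.5, 3.5],
            "annealing temp (C)": [58, 60, 62],
        }

        full_factorial = list(itertools.product(*[range(3)] * 3))
        print(f"full factorial runs: {len(full_factorial)}, Taguchi L9 runs: {len(L9)}")
        for run, levels in enumerate(L9, start=1):
            settings = {name: vals[i] for (name, vals), i in zip(factors.items(), levels)}
            print(run, settings)
        # Each factor level appears exactly three times and every pair of columns is
        # balanced, which lets main effects be estimated from only nine reactions.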

  5. Measuring access to medicines: a review of quantitative methods used in household surveys

    PubMed Central

    2010-01-01

    Background Medicine access is an important goal of medicine policy; however the evaluation of medicine access is a subject under conceptual and methodological development. The aim of this study was to describe quantitative methodologies to measure medicine access on household level, access expressed as paid or unpaid medicine acquisition. Methods Searches were carried out in electronic databases and health institutional sites; within references from retrieved papers and by contacting authors. Results Nine papers were located. The methodologies of the studies presented differences in the recall period, recruitment of subjects and medicine access characterization. Conclusions The standardization of medicine access indicators and the definition of appropriate recall periods are required to evaluate different medicines and access dimensions, improving studies comparison. Besides, specific keywords must be established to allow future literature reviews about this topic. PMID:20509960

  6. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    PubMed

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  7. Infectious titres of sheep scrapie and bovine spongiform encephalopathy agents cannot be accurately predicted from quantitative laboratory test results.

    PubMed

    González, Lorenzo; Thorne, Leigh; Jeffrey, Martin; Martin, Stuart; Spiropoulos, John; Beck, Katy E; Lockey, Richard W; Vickery, Christopher M; Holder, Thomas; Terry, Linda

    2012-11-01

    It is widely accepted that abnormal forms of the prion protein (PrP) are the best surrogate marker for the infectious agent of prion diseases and, in practice, the detection of such disease-associated (PrP(d)) and/or protease-resistant (PrP(res)) forms of PrP is the cornerstone of diagnosis and surveillance of the transmissible spongiform encephalopathies (TSEs). Nevertheless, some studies question the consistent association between infectivity and abnormal PrP detection. To address this discrepancy, 11 brain samples of sheep affected with natural scrapie or experimental bovine spongiform encephalopathy were selected on the basis of the magnitude and predominant types of PrP(d) accumulation, as shown by immunohistochemical (IHC) examination; contra-lateral hemi-brain samples were inoculated at three different dilutions into transgenic mice overexpressing ovine PrP and were also subjected to quantitative analysis by three biochemical tests (BCTs). Six samples gave 'low' infectious titres (10⁶·⁵ to 10⁶·⁷ LD₅₀ g⁻¹) and five gave 'high titres' (10⁸·¹ to ≥ 10⁸·⁷ LD₅₀ g⁻¹) and, with the exception of the Western blot analysis, those two groups tended to correspond with samples with lower PrP(d)/PrP(res) results by IHC/BCTs. However, no statistical association could be confirmed due to high individual sample variability. It is concluded that although detection of abnormal forms of PrP by laboratory methods remains useful to confirm TSE infection, infectivity titres cannot be predicted from quantitative test results, at least for the TSE sources and host PRNP genotypes used in this study. Furthermore, the near inverse correlation between infectious titres and Western blot results (high protease pre-treatment) argues for a dissociation between infectivity and PrP(res).

  8. A quantitative method for risk assessment of agriculture due to climate change

    NASA Astrophysics Data System (ADS)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2016-11-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks because of its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help to actively deal with climate change and ensure food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude. The probability of occurrence can be calculated by the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method was shown to be feasible and practical using spring wheat in Wuchuan County, Inner Mongolia, as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. The maximum yield decrease and its probability were 3.5 and 64.6%, respectively, for the maximum temperature increase (88.3%), and its risk was 2.2%. The maximum yield decrease and its probability were 14.1 and 56.1%, respectively, for the maximum precipitation decrease (35.2%), and its risk was 7.9%. For the combined impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. If appropriate adaptation strategies are not adopted, the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will both increase, and the risk will grow accordingly.
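
    The risk definition above (degree of loss multiplied by probability of occurrence) can be checked directly against the figures quoted in the abstract; the short calculation below reproduces them.

        # Risk as defined above: risk = degree of loss x probability of occurrence,
        # with the degree of loss expressed as the yield-decrease amplitude. The
        # three cases reproduce the spring wheat figures quoted in the abstract.
        cases = {
            "temperature": (3.5, 64.6),   # max yield decrease (%), probability (%)
            "precipitation": (14.1, 56.1),
            "temperature + precipitation": (17.6, 53.4),
        }
        for name, (loss_pct, prob_pct) in cases.items():
            risk_pct = loss_pct * prob_pct / 100.0
            print(f"{name}: loss {loss_pct}% x probability {prob_pct}% -> risk {risk_pct:.2f}%")
        # Prints risks of 2.26%, 7.91% and 9.40%, in close agreement with the
        # quoted 2.2%, 7.9% and 9.4%.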

  9. A Novel Method of Quantitative Anterior Chamber Depth Estimation Using Temporal Perpendicular Digital Photography

    PubMed Central

    Zamir, Ehud; Kong, George Y.X.; Kowalski, Tanya; Coote, Michael; Ang, Ghee Soon

    2016-01-01

    Purpose We hypothesize that: (1) Anterior chamber depth (ACD) is correlated with the relative anteroposterior position of the pupillary image, as viewed from the temporal side. (2) Such a correlation may be used as a simple quantitative tool for estimation of ACD. Methods Two hundred sixty-six phakic eyes had lateral digital photographs taken from the temporal side, perpendicular to the visual axis, and underwent optical biometry (Nidek AL scanner). The relative anteroposterior position of the pupillary image was expressed using the ratio between: (1) lateral photographic temporal limbus to pupil distance (“E”) and (2) lateral photographic temporal limbus to cornea distance (“Z”). In the first chronological half of patients (Correlation Series), E:Z ratio (EZR) was correlated with optical biometric ACD. The correlation equation was then used to predict ACD in the second half of patients (Prediction Series) and compared to their biometric ACD for agreement analysis. Results A strong linear correlation was found between EZR and ACD, R = −0.91, R2 = 0.81. Bland-Altman analysis showed good agreement between predicted ACD using this method and the optical biometric ACD. The mean error was −0.013 mm (range −0.377 to 0.336 mm), standard deviation 0.166 mm. The 95% limits of agreement were ±0.33 mm. Conclusions Lateral digital photography and EZR calculation is a novel method to quantitatively estimate ACD, requiring minimal equipment and training. Translational Relevance EZ ratio may be employed in screening for angle closure glaucoma. It may also be helpful in outpatient medical clinic settings, where doctors need to judge the safety of topical or systemic pupil-dilating medications versus their risk of triggering acute angle closure glaucoma. Similarly, non ophthalmologists may use it to estimate the likelihood of acute angle closure glaucoma in emergency presentations. PMID:27540496
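
    A hedged sketch of the two-stage analysis described above: fit a linear correlation equation between EZR and biometric ACD on one series, predict ACD on a second series, and summarize agreement with Bland-Altman limits. The data, slope and noise level below are synthetic assumptions, not the study's measurements.

        # Sketch of the two-stage analysis, on synthetic data: fit ACD ~ EZR on a
        # "correlation series", predict ACD for a "prediction series", and summarize
        # agreement Bland-Altman style. Slope, intercept and noise are invented.
        import numpy as np

        rng = np.random.default_rng(2)

        def simulate(n, slope=-3.0, intercept=4.5, noise=0.15):
            ezr = rng.uniform(0.3, 0.9, n)                             # E:Z ratio
            acd = intercept + slope * ezr + rng.normal(0, noise, n)    # biometric ACD (mm)
            return ezr, acd

        ezr_corr, acd_corr = simulate(133)   # correlation series
        ezr_pred, acd_pred = simulate(133)   # prediction series

        slope, intercept = np.polyfit(ezr_corr, acd_corr, 1)   # correlation equation
        predicted = intercept + slope * ezr_pred

        errors = predicted - acd_pred
        mean_err, sd_err = errors.mean(), errors.std(ddof=1)
        print(f"r = {np.corrcoef(ezr_corr, acd_corr)[0, 1]:.2f}")
        print(f"mean error = {mean_err:.3f} mm, 95% limits of agreement = "
              f"{mean_err - 1.96 * sd_err:.2f} to {mean_err + 1.96 * sd_err:.2f} mm")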

  10. Comparative Application of PLS and PCR Methods to Simultaneous Quantitative Estimation and Simultaneous Dissolution Test of Zidovudine - Lamivudine Tablets.

    PubMed

    Üstündağ, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli

    2015-01-01

    In development strategies for new drug products and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indicator for the quantitative estimation of the efficiency and biopharmaceutical characteristics of drug substances. This drives scientists in the field to develop powerful analytical methods that give more reliable, precise and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were developed for the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly encourage their use for quality control, routine analysis and dissolution testing of marketed tablets containing the ZID and LAM drugs.

  11. A validated method for the quantitation of 1,1-difluoroethane using a gas in equilibrium method of calibration.

    PubMed

    Avella, Joseph; Lehrer, Michael; Zito, S William

    2008-10-01

    1,1-Difluoroethane (DFE), also known as Freon 152A, is a member of a class of compounds known as halogenated hydrocarbons. A number of these compounds have gained notoriety because of their ability to induce rapid onset of intoxication after inhalation exposure. Abuse of DFE has necessitated development of methods for its detection and quantitation in postmortem and human performance specimens. Furthermore, methodologies applicable to research studies are required, as there have been limited toxicokinetic and toxicodynamic reports published on DFE. This paper describes a method for the quantitation of DFE using a gas chromatography-flame-ionization headspace technique that employs solventless standards for calibration. Two calibration curves using 0.5 mL whole blood calibrators were developed, ranging from 0.225 to 1.350 mg/L (curve A) and from 9.0 to 180.0 mg/L (curve B). These were evaluated for linearity (0.9992 and 0.9995), limit of detection (0.018 mg/L), limit of quantitation (0.099 mg/L; recovery 111.9%, CV 9.92%), and upper limit of linearity (27,000.0 mg/L). Combined-curve recovery results for a 98.0 mg/L DFE control that was prepared using an alternate technique were 102.2% with a CV of 3.09%. No matrix interference was observed in DFE-enriched blood, urine or brain specimens, nor did analysis of variance detect any significant differences (alpha = 0.01) in the area under the curve of blood, urine or brain specimens at three identical DFE concentrations. The method is suitable for use in forensic laboratories because validation was performed on instrumentation routinely used in forensic labs and because of the ease with which the calibration range can be adjusted. Perhaps more importantly, it is also useful for research-oriented studies because the removal of solvent from standard preparation eliminates the possibility of solvent-induced changes to the gas/liquid partitioning of DFE or chromatographic interference due to the presence of solvent in specimens.

  12. Quantitative Assessment of the Impact of Blood Pulsation on Intraocular Pressure Measurement Results in Healthy Subjects

    PubMed Central

    2017-01-01

    Background. Blood pulsation affects the results obtained using various medical devices in many different ways. Method. The paper demonstrates the effect of blood pulsation on intraocular pressure measurements. Six measurements for each of the 10 healthy subjects were performed in various phases of blood pulsation. A total of 8400 corneal deformation images were recorded. The results of intraocular pressure measurements were related to the heartbeat phases measured with a pulse oximeter placed on the index finger of the subject's left hand. Results. The correlation between the heartbeat phase measured with a pulse oximeter and intraocular pressure is 0.69 ± 0.26 (p < 0.05). The phase shift calculated for the maximum correlation is equal to 60 ± 40° (p < 0.05). When the moment of measuring intraocular pressure with an air-puff tonometer is not synchronized, the changes in IOP for the analysed group of subjects can vary in the range of ±2.31 mmHg (p < 0.3). Conclusions. Blood pulsation has a statistically significant effect on the results of intraocular pressure measurement. For this reason, in modern ophthalmic devices, the measurement should be synchronized with the heartbeat phases. The paper proposes an additional method for synchronizing the time of pressure measurement with the blood pulsation phase. PMID:28250983
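
    One plausible way to estimate the phase shift between the pulse-oximeter waveform and the IOP-related signal is to locate the lag that maximizes their cross-correlation; the sketch below does this on synthetic sinusoidal signals with an assumed 60° offset, not on the study's recordings.

        # Sketch (synthetic signals, not the study's data): estimate the phase shift
        # between a pulse-oximeter waveform and an IOP-related signal by locating the
        # lag that maximizes their cross-correlation.
        import numpy as np

        fs = 200.0                 # sampling rate (Hz), hypothetical
        heart_rate_hz = 1.2        # ~72 beats per minute
        t = np.arange(0, 10, 1 / fs)
        true_shift_deg = 60.0

        pulse = np.sin(2 * np.pi * heart_rate_hz * t)
        iop = np.sin(2 * np.pi * heart_rate_hz * t - np.deg2rad(true_shift_deg))
        iop += np.random.default_rng(3).normal(0, 0.1, t.size)   # measurement noise

        # Cross-correlate zero-mean signals and convert the best lag to degrees.
        a = pulse - pulse.mean()
        b = iop - iop.mean()
        xcorr = np.correlate(b, a, mode="full")
        lags = np.arange(-len(a) + 1, len(a))
        best_lag = lags[np.argmax(xcorr)]
        phase_deg = (best_lag / fs) * heart_rate_hz * 360.0
        print(f"estimated phase shift: {phase_deg % 360:.1f} degrees")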

  13. Rapid method for protein quantitation by Bradford assay after elimination of the interference of polysorbate 80.

    PubMed

    Cheng, Yongfeng; Wei, Haiming; Sun, Rui; Tian, Zhigang; Zheng, Xiaodong

    2016-02-01

    The Bradford assay is one of the most common methods for measuring protein concentrations. However, some pharmaceutical excipients, such as detergents, interfere with the Bradford assay even at low concentrations. Protein precipitation can be used to overcome sample incompatibility with protein quantitation, but the rate of protein recovery after acetone precipitation is only about 70%. In this study, we found that sucrose not only increased the rate of protein recovery after 1 h of acetone precipitation but also did not interfere with the Bradford assay. We therefore developed a method for rapid protein quantitation in protein drugs even when they contain interfering substances.

  14. [Quantitative analysis method of natural gas combustion process combining wavelength selection and outlier spectra detection].

    PubMed

    Cao, Hui; Hu, Luo-Na; Zhou, Yan

    2012-10-01

    The present paper uses a combined method of wavelength selection and outlier spectra detection for quantitative analysis of the natural gas combustion process based on its near-infrared spectra. Based on the statistical distributions of partial least squares (PLS) model coefficients and prediction errors, the method realizes wavelength selection and outlier spectra detection, respectively. Compared with PLS, PLS after leave-one-out outlier detection (LOO-PLS), uninformative variable elimination by PLS (UVE-PLS) and UVE-PLS after leave-one-out outlier detection (LOO-UVE-PLS), the root-mean-squared error of prediction (RMSEP) of the method's CH4 prediction model is reduced by 14.33%, 14.33%, 10.96% and 12.21%, respectively; the RMSEP of the CO prediction model is reduced by 67.26%, 72.58%, 11.32% and 4.52%; and the RMSEP of the CO2 prediction model is reduced by 5.95%, 19.7%, 36.71% and 4.04%. Experimental results demonstrate that the method can significantly decrease the number of selected wavelengths, reduce model complexity and effectively detect outlier spectra, and the resulting prediction models are both more accurate and more robust.

  15. Intracranial aneurysm segmentation in 3D CT angiography: method and quantitative validation

    NASA Astrophysics Data System (ADS)

    Firouzian, Azadeh; Manniesing, R.; Flach, Z. H.; Risselada, R.; van Kooten, F.; Sturkenboom, M. C. J. M.; van der Lugt, A.; Niessen, W. J.

    2010-03-01

    Accurately quantifying aneurysm shape parameters is of clinical importance, as it is an important factor in choosing the right treatment modality (i.e. coiling or clipping), in predicting rupture risk and operative risk and for pre-surgical planning. The first step in aneurysm quantification is to segment it from other structures that are present in the image. As manual segmentation is a tedious procedure and prone to inter- and intra-observer variability, there is a need for an automated method which is accurate and reproducible. In this paper a novel semi-automated method for segmenting aneurysms in Computed Tomography Angiography (CTA) data based on Geodesic Active Contours is presented and quantitatively evaluated. Three different image features are used to steer the level set to the boundary of the aneurysm, namely intensity, gradient magnitude and variance in intensity. The method requires minimum user interaction, i.e. clicking a single seed point inside the aneurysm which is used to estimate the vessel intensity distribution and to initialize the level set. The results show that the developed method is reproducible, and performs in the range of interobserver variability in terms of accuracy.

  16. Quantitative methods for reconstructing tissue biomechanical properties in optical coherence elastography: a comparison study

    PubMed Central

    Han, Zhaolong; Li, Jiasong; Singh, Manmohan; Wu, Chen; Liu, Chih-hao; Wang, Shang; Idugboe, Rita; Raghunathan, Raksha; Sudheendran, Narendran; Aglyamov, Salavat R.; Twa, Michael D.; Larin, Kirill V.

    2015-01-01

    We present a systematic analysis of the accuracy of five different methods for extracting the biomechanical properties of soft samples using optical coherence elastography (OCE). OCE is an emerging noninvasive technique, which allows assessing biomechanical properties of tissues with a micrometer spatial resolution. However, in order to accurately extract biomechanical properties from OCE measurements, application of a proper mechanical model is required. In this study, we utilize tissue-mimicking phantoms with controlled elastic properties and investigate the feasibility of four available methods for reconstructing elasticity (Young's modulus) based on OCE measurements of an air-pulse induced elastic wave. The approaches are based on the shear wave equation (SWE), the surface wave equation (SuWE), the Rayleigh-Lamb frequency equation (RLFE), and the finite element method (FEM). Elasticity values were compared with uniaxial mechanical testing. The results show that the RLFE and the FEM are more robust in quantitatively assessing elasticity than the other simplified models. This study provides a foundation and reference for reconstructing the biomechanical properties of tissues from OCE data, which is important for the further development of noninvasive elastography methods. PMID:25860076
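
    For the two simpler models named above, the conversion from measured wave speed to Young's modulus can be written in closed form; the sketch below uses the standard shear-wave relation E = 2ρ(1+ν)c_s² and the approximate Rayleigh surface-wave relation, with hypothetical phantom-like parameter values (the RLFE and FEM approaches are not reproduced here).

        # Sketch of the two simpler conversions: Young's modulus from a measured
        # elastic-wave speed using the standard shear-wave and surface (Rayleigh)
        # wave relations. Density, Poisson ratio and wave speed are hypothetical
        # soft-phantom values, not the study's measurements.
        def youngs_modulus_shear(c_shear, rho=1000.0, nu=0.49):
            """E = 2*rho*(1+nu)*c_s^2 (shear modulus mu = rho*c_s^2)."""
            return 2.0 * rho * (1.0 + nu) * c_shear**2

        def youngs_modulus_surface(c_rayleigh, rho=1000.0, nu=0.49):
            """E = 2*rho*(1+nu)^3*c_R^2 / (0.87 + 1.12*nu)^2, from c_R ~= c_s*(0.87+1.12*nu)/(1+nu)."""
            return 2.0 * rho * (1.0 + nu) ** 3 * c_rayleigh**2 / (0.87 + 1.12 * nu) ** 2

        c = 2.0  # measured wave speed in m/s (hypothetical)
        print(f"shear-wave model:   E ~ {youngs_modulus_shear(c) / 1e3:.1f} kPa")
        print(f"surface-wave model: E ~ {youngs_modulus_surface(c) / 1e3:.1f} kPa")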

  17. Quantitative analysis of eugenol in clove extract by a validated HPLC method.

    PubMed

    Yun, So-Mi; Lee, Myoung-Heon; Lee, Kwang-Jick; Ku, Hyun-Ok; Son, Seong-Wan; Joo, Yi-Seok

    2010-01-01

    Clove (Eugenia caryophyllata) is a well-known medicinal plant used for diarrhea and digestive disorders, or in antiseptics, in Korea. Eugenol is the main active ingredient of clove and has been chosen as a marker compound for the chemical evaluation or QC of clove. This paper reports the development and validation of an HPLC-diode array detection (DAD) method for the determination of eugenol in clove. HPLC separation was accomplished on an XTerra RP18 column (250 x 4.6 mm id, 5 microm) with an isocratic mobile phase of 60% methanol and DAD at 280 nm. Calibration graphs were linear with very good correlation coefficients (r2 > 0.9999) from 12.5 to 1000 ng/mL. The LOD was 0.81 ng/mL and the LOQ was 2.47 ng/mL. The method showed good intraday precision (%RSD 0.08-0.27%) and interday precision (%RSD 0.32-1.19%). The method was applied to the analysis of eugenol from clove cultivated in various countries (Indonesia, Singapore, and China). Quantitative analysis of the 15 clove samples showed that the content of eugenol varied significantly, ranging from 163 to 1049 ppb. Based on the results of this study, the HPLC method for determining eugenol is sufficiently accurate to evaluate the quality and safety of clove.
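
    As a generic illustration of how LOD and LOQ figures of this kind are commonly derived from a linear calibration curve (LOD = 3.3σ/S, LOQ = 10σ/S, with σ the residual standard deviation and S the slope), the sketch below fits a synthetic calibration series; the numbers are not the clove data reported above.

        # Generic sketch: LOD and LOQ from a linear calibration curve using
        # LOD = 3.3*sigma/S and LOQ = 10*sigma/S. The calibration data are synthetic,
        # not the measurements reported in the abstract.
        import numpy as np

        conc = np.array([12.5, 25, 50, 100, 250, 500, 1000])   # ng/mL
        area = 1.8 * conc + 5 + np.random.default_rng(4).normal(0, 2, conc.size)  # peak areas

        slope, intercept = np.polyfit(conc, area, 1)
        residuals = area - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)          # ddof=2: two fitted parameters
        r = np.corrcoef(conc, area)[0, 1]

        print(f"r^2 = {r**2:.5f}")
        print(f"LOD = {3.3 * sigma / slope:.2f} ng/mL, LOQ = {10 * sigma / slope:.2f} ng/mL")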

  18. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    PubMed

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability precision were evaluated. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has been proven to be efficient for the extraction and determination of azoxystrobin residues in green beans and peas.

  19. Quantitative analysis of collagen change between normal and cancerous thyroid tissues based on SHG method

    NASA Astrophysics Data System (ADS)

    Chen, Xiwen; Huang, Zufang; Xi, Gangqin; Chen, Yongjian; Lin, Duo; Wang, Jing; Li, Zuanfang; Sun, Liqing; Chen, Jianxin; Chen, Rong

    2012-03-01

    Second-harmonic generation (SHG) has proven to be a method offering high spatial resolution, large penetration depth and freedom from photobleaching. In our study, the SHG method was used to investigate normal and cancerous thyroid tissue. For SHG imaging, system parameters were adjusted for high-contrast image acquisition. Each x-y image was recorded in pseudo-color, which matches the wavelength range in the visible spectrum. The acquisition time for a 512×512-pixel image was 1.57 s; each acquired image was an average of four frames to improve the signal-to-noise ratio. Our results indicated that collagen presence, as determined by the ratio of SHG pixels to all pixels, was 0.48+/-0.05 for normal and 0.33+/-0.06 for cancerous thyroid tissue. In addition, to quantitatively assess collagen-related changes, we applied GLCM texture analysis to the SHG images. The corresponding results showed that the correlation fell off with distance in both the normal and the cancerous group. The calculated values of Corr50 (the distance at which the correlation fell to 50% of its initial value) indicated a significant difference. This study demonstrates that the SHG method can be used as a complementary tool in thyroid histopathology.
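
    A simplified sketch of the two measures described above, run on a synthetic texture: the fraction of SHG-positive pixels and a correlation-versus-distance curve from which Corr50 is read. A plain pixel-offset correlation stands in for the study's full GLCM analysis, and the image is invented.

        # Simplified sketch on a synthetic image: (1) fraction of SHG-positive pixels
        # and (2) correlation versus pixel offset, from which Corr50 (distance where
        # the correlation falls to 50% of its initial value) is read. This uses a
        # plain pixel-offset correlation rather than the study's GLCM analysis.
        import numpy as np

        rng = np.random.default_rng(5)
        img = rng.normal(0, 1, (256, 256))
        # Smooth with a box filter (via an integral image) to give the texture a
        # finite correlation length.
        k = 9
        img = np.cumsum(np.cumsum(img, axis=0), axis=1)
        img = (img[k:, k:] - img[:-k, k:] - img[k:, :-k] + img[:-k, :-k]) / k**2

        shg_positive_fraction = np.mean(img > img.mean())

        def corr_at_distance(image, d):
            return np.corrcoef(image[:, :-d].ravel(), image[:, d:].ravel())[0, 1]

        distances = np.arange(1, 40)
        corr = np.array([corr_at_distance(img, d) for d in distances])
        corr50 = distances[np.argmax(corr < 0.5 * corr[0])]

        print(f"SHG-positive pixel fraction: {shg_positive_fraction:.2f}")
        print(f"Corr50: {corr50} pixels")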

  1. Comparison of Concentration Methods for Quantitative Detection of Sewage-Associated Viral Markers in Environmental Waters

    PubMed Central

    Harwood, V. J.; Gyawali, P.; Sidhu, J. P. S.; Toze, S.

    2015-01-01

    Pathogenic human viruses cause over half of gastroenteritis cases associated with recreational water use worldwide. They are relatively difficult to concentrate from environmental waters due to typically low concentrations and their small size. Although rapid enumeration of viruses by quantitative PCR (qPCR) has the potential to greatly improve water quality analysis and risk assessment, the upstream steps of capturing and recovering viruses from environmental water sources along with removing PCR inhibitors from extracted nucleic acids remain formidable barriers to routine use. Here, we compared the efficiency of virus recovery for three rapid methods of concentrating two microbial source tracking (MST) viral markers, human adenoviruses (HAdVs) and polyomaviruses (HPyVs), from one-liter tap water and river water samples on HA membranes (90 mm in diameter). Samples were spiked with raw sewage, and viral adsorption to membranes was promoted by acidification (method A) or addition of MgCl2 (methods B and C). Viral nucleic acid was extracted directly from membranes (method A), or viruses were eluted with NaOH and concentrated by centrifugal ultrafiltration (methods B and C). No inhibition of qPCR was observed for samples processed by method A, but inhibition occurred in river samples processed by B and C. Recovery efficiencies of HAdVs and HPyVs were ∼10-fold greater for method A (31 to 78%) than for methods B and C (2.4 to 12%). Further analysis of membranes from method B revealed that the majority of viruses were not eluted from the membrane, resulting in poor recovery. The modification of the originally published method A to include a larger diameter membrane and a nucleic acid extraction kit that could accommodate the membrane resulted in a rapid virus concentration method with good recovery and lack of inhibitory compounds. The frequently used strategy of viral adsorption with added cations (Mg2+) and elution with acid was inefficient and more prone to

  2. Quantitative Differentiation of Bloodstain Patterns Resulting from Gunshot and Blunt Force Impacts.

    PubMed

    Siu, Sonya; Pender, Jennifer; Springer, Faye; Tulleners, Frederic; Ristenpart, William

    2017-02-10

    Bloodstain pattern analysis (BPA) provides significant evidentiary value in crime scene interpretation and reconstruction. In this work, we develop a quantitative methodology using digital image analysis techniques to differentiate impact bloodstain patterns. The bloodstain patterns were digitally imaged and analyzed using image analysis algorithms. Our analysis of 72 unique bloodstain patterns, comprising more than 490,000 individual droplet stains, indicates that the mean drop size in a gunshot spatter pattern is at most 30% smaller than the mean drop stain size in blunt instrument patterns. In contrast, we demonstrate that the spatial distribution of the droplet stains-their density as a function of position in the pattern-significantly differs between gunshot and blunt instrument patterns, with densities as much as 400% larger for gunshot impacts. Thus, quantitative metrics involving the spatial distribution of droplet stains within a bloodstain pattern can be useful for objective differentiation between blunt instrument and gunshot bloodstain patterns.
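
    The droplet-stain statistics described above (mean stain size and stain density as a function of position) can be computed from a binarized pattern with standard connected-component labelling; the sketch below does so on a synthetic pattern, so the counts and sizes are assumptions for illustration only.

        # Sketch on a synthetic binary pattern: label individual droplet stains,
        # compute their mean size, and count stain density in horizontal strips.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(6)
        pattern = np.zeros((400, 400), dtype=bool)
        # Hypothetical pattern: many small stains concentrated toward the left side.
        xs = (rng.beta(2, 5, 600) * 400).astype(int)
        ys = rng.integers(0, 400, 600)
        for x, y in zip(xs, ys):
            rr = rng.integers(1, 4)
            pattern[max(0, y - rr):y + rr, max(0, x - rr):x + rr] = True

        labels, n_stains = ndimage.label(pattern)
        index = np.arange(1, n_stains + 1)
        sizes = ndimage.sum(pattern, labels, index=index)
        centroids = np.array(ndimage.center_of_mass(pattern, labels, index))

        print(f"stains: {n_stains}, mean stain area: {sizes.mean():.1f} px")
        # Density as a function of position: stain counts in four vertical strips.
        strip = np.digitize(centroids[:, 1], bins=[100, 200, 300])
        print("stains per strip (left to right):", np.bincount(strip, minlength=4))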

  3. X-ray fluorescence-based method for the quantitative determination of uranium in the aqueous solutions

    NASA Astrophysics Data System (ADS)

    Dubrovka, S.; Chursin, S.; Verkhoturova, V.

    2017-01-01

    Currently, one of the important issues in the field of nuclear technology is the special handling of nuclear materials, owing to their energy and commercial significance as well as their potential radiation contamination threat. Special handling of nuclear materials requires information about the full qualitative and quantitative composition of the sample. Spectrometric methods solve this problem effectively. One of these methods is X-ray fluorescence analysis, which is fast, nondestructive and environmentally friendly, with high accuracy and reproducibility of the results. The development of a method for the quantitative determination of uranium in aqueous solutions, to address the accounting and control of nuclear materials, is the subject of this article. The method for determining the uranium concentration in aqueous solutions of uranyl nitrate UO2(NO3)2 was developed using the Spectroscan MAKC-G wavelength-dispersive crystal-diffraction XRF spectrometer.

  4. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
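
    A rough sketch of how local velocity vectors between consecutive frames might be measured (the study's own algorithm is not reproduced here): simple block matching that finds, for each block, the integer shift minimizing the sum of squared differences. The frames below are synthetic.

        # Sketch (not the study's implementation): local velocity vectors between two
        # consecutive frames by block matching. For each block in frame 1, find the
        # integer shift within a search window that minimizes the SSD against frame 2.
        import numpy as np

        def block_match(f1, f2, block=32, search=6):
            """Return an array of (row, col, dy, dx) velocity vectors in pixels/frame."""
            vectors = []
            h, w = f1.shape
            for r in range(search, h - block - search, block):
                for c in range(search, w - block - search, block):
                    ref = f1[r:r + block, c:c + block]
                    best, best_dy, best_dx = np.inf, 0, 0
                    for dy in range(-search, search + 1):
                        for dx in range(-search, search + 1):
                            cand = f2[r + dy:r + dy + block, c + dx:c + dx + block]
                            ssd = np.sum((ref - cand) ** 2)
                            if ssd < best:
                                best, best_dy, best_dx = ssd, dy, dx
                    vectors.append((r, c, best_dy, best_dx))
            return np.array(vectors)

        # Two synthetic "bone image" frames: the second is the first shifted down by 3 px.
        rng = np.random.default_rng(7)
        frame1 = np.cumsum(rng.normal(size=(160, 160)), axis=0)  # smooth-ish structure
        frame2 = np.roll(frame1, 3, axis=0)

        v = block_match(frame1, frame2)
        print("median velocity (dy, dx):", np.median(v[:, 2]), np.median(v[:, 3]))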

  5. Simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and dead reckoning

    NASA Astrophysics Data System (ADS)

    Davey, Neil S.; Godil, Haris

    2013-05-01

    This article presents a comparative study between a well-known SLAM (Simultaneous Localization and Mapping) algorithm, called Gmapping, and a standard Dead-Reckoning algorithm; the study is based on experimental results of both approaches by using a commercial skid-based turning robot, P3DX. Five main base-case scenarios are conducted to evaluate and test the effectiveness of both algorithms. The results show that SLAM outperformed the Dead Reckoning in terms of map-making accuracy in all scenarios but one, since SLAM did not work well in a rapidly changing environment. Although the main conclusion about the excellence of SLAM is not surprising, the presented test method is valuable to professionals working in this area of mobile robots, as it is highly practical, and provides solid and valuable results. The novelty of this study lies in its simplicity. The simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and Dead Reckoning and some applications using autonomous robots are being patented by the authors in U.S. Patent Application Nos. 13/400,726 and 13/584,862.

  6. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods

    PubMed Central

    SAVAS, Selcuk; KAVRIK, Fevzi; KUCUKYILMAZ, Ebru

    2016-01-01

    ABSTRACT Objective The aim of this study was to evaluate the efficacy of CPP-ACP containing fluoride varnish for remineralizing white spot lesions (WSLs) with four different quantitative methods. Material and Methods Four windows (3x3 mm) were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week) in an acidified gel system. The test material (MI Varnish) was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH), quantitative light-induced fluorescence-digital (QLF-D), energy-dispersive spectroscopy (EDS) and laser fluorescence (LF pen). The data were statistically analyzed (α=0.05). Results While the LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p<0.05), the difference between the 1- and 4-week measurements was not significant (p>0.05). With regard to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05). After the 1- and 4-week treatment periods, the calcium (Ca) and phosphate (P) concentrations and the Ca/P ratio were higher compared to those of the demineralized surfaces (p<0.05). Conclusion CPP-ACP containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use. PMID:27383699

  7. Performance Observations of Scanner Qualification of NCI-Designated Cancer Centers: Results From the Centers of Quantitative Imaging Excellence (CQIE) Program

    PubMed Central

    Rosen, Mark; Kinahan, Paul E.; Gimpel, James F.; Opanowski, Adam; Siegel, Barry A.; Hill, G. Craig; Weiss, Linda; Shankar, Lalitha

    2016-01-01

    We present an overview of the Centers for Quantitative Imaging Excellence (CQIE) program, which was initiated in 2010 to establish a resource of clinical trial-ready sites within the National Cancer Institute (NCI)-designated Cancer Centers (NCI-CCs) network. The intent was to ensure that imaging centers in the NCI-CC network were capable of conducting treatment trials with advanced quantitative imaging end points. We describe the motivations for establishing the CQIE, the process used to initiate the network, the methods of site qualification for positron emission tomography, computed tomography, and magnetic resonance imaging, and the results of the evaluations over the subsequent 3 years. PMID:28395794

  8. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  9. A method for comprehensive glycosite-mapping and direct quantitation of plasma glycoproteins

    PubMed Central

    Hong, Qiuting; Ruhaak, L. Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B.

    2015-01-01

    A comprehensive glycan map was constructed for the top eight abundant plasma glycoproteins using both specific and non-specific enzyme digestions followed by nano LC–Chip/QTOF mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20-min UPLC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared to quantitation methods that involve protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries. PMID:26510530
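    The normalization step described above is simple arithmetic; a minimal sketch is shown below, with the function name normalize_glycoforms and the example peak areas invented for illustration rather than taken from the paper.

        def normalize_glycoforms(glycopeptide_areas, protein_abundance):
            """Express each site-specific glycoform relative to its parent protein so
            changes in glycosylation are separated from changes in protein expression."""
            return {glycoform: area / protein_abundance
                    for glycoform, area in glycopeptide_areas.items()}

        # Illustrative MRM peak areas for one IgG glycosite (hypothetical values).
        igg_site = {"G0F": 4.2e6, "G1F": 6.8e6, "G2F": 2.1e6}
        print(normalize_glycoforms(igg_site, protein_abundance=8.5e6))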

  10. Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT

    PubMed Central

    Yoon, Hyun Jin

    2016-01-01

    We integrated visual and quantitative methods for analyzing the stability of respiration using four methods: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline positions. Visual grading was done by inspecting the shape of each diagram and classifying it into two states: regular and irregular. Quantitation was done by measuring the standard deviations of the x and v coordinates of the Poincaré map (SDx, SDv), the height of the fundamental peak (A1) in the Fourier spectrum, and the difference between maximal upward and downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. There were differences in the quantitative parameters (SDx, SDv, A1, and MUD-MDD) among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv, p = 0.0002 for A1). In ROC analysis, the cutoff values were 0.11 for SDx (AUC: 0.982, p < 0.0001), 0.062 for SDv (AUC: 0.847, p < 0.0001), 0.117 for A1 (AUC: 0.876, p < 0.0001), and 0.349 for MUD-MDD (AUC: 0.948, p < 0.0001). This is the first study to analyze multiple aspects of respiration using various mathematical constructs; it provides quantitative indices of respiratory stability and cutoff values for differentiating regular from irregular respiration. PMID:27872857
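    A rough sketch of how such indices could be computed from a respiration trace is given below; the function respiratory_indices, the sampling rate, and the synthetic trace are assumptions for illustration, and the paper's exact definitions of SDx, SDv, and A1 may differ.

        import numpy as np

        def respiratory_indices(signal, fs):
            """Rough stability indices for a respiration trace (illustrative only)."""
            x = np.asarray(signal, dtype=float)
            v = np.gradient(x) * fs                    # velocity (second phase-space axis)
            sd_x, sd_v = np.std(x), np.std(v)          # spread of the phase-space cloud
            spectrum = np.abs(np.fft.rfft(x - x.mean()))
            freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
            a1 = spectrum[1:].max()                    # height of the dominant (fundamental) peak
            f1 = freqs[1:][spectrum[1:].argmax()]      # its frequency, Hz
            return sd_x, sd_v, a1, f1

        # Example with a synthetic, slightly irregular breathing trace.
        fs = 25.0
        t = np.arange(0, 60, 1 / fs)
        trace = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.randn(t.size)
        print(respiratory_indices(trace, fs))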

  11. A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.

    PubMed

    Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B

    2015-12-04

    A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra high performance (UP)LC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with the quantitation method that involves protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries.

  12. Quantitative structure-activity relationships of imidazole-containing farnesyltransferase inhibitors using different chemometric methods.

    PubMed

    Shayanfar, Ali; Ghasemi, Saeed; Soltani, Somaieh; Asadpour-Zeynali, Karim; Doerksen, Robert J; Jouyban, Abolghasem

    2013-05-01

    Farnesyltransferase inhibitors (FTIs) are one of the most promising classes of anticancer agents, but although some compounds in this class are in clinical trials, no drug in this class has yet been marketed. Quantitative structure-activity relationship (QSAR) models can be used for predicting the activity of FTI candidates in early stages of drug discovery. In this study 192 imidazole-containing FTIs were obtained from the literature, structures of the molecules were optimized using Hyperchem software, and molecular descriptors were calculated using Dragon software. The most suitable descriptors were selected using genetic algorithm-partial least squares (GA-PLS) and stepwise regression, and they indicated that the volume, shape and polarity of the FTIs are important for their activities. 2D-QSAR models were prepared using both linear methods, i.e., multiple linear regression (MLR), and non-linear methods, i.e., artificial neural networks (ANN) and support vector machines (SVM). The proposed QSAR models were validated using internal and external validation methods. The results show that the proposed 2D-QSAR models are valid and that they can be applied to predict the activities of imidazole-containing FTIs. The prediction capability of the 2D-QSAR (linear and non-linear) models is comparable to and somewhat better than that of previous 3D-QSAR models, and the non-linear models are more accurate than the linear models.
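    The linear-versus-non-linear comparison can be sketched with scikit-learn as below; the descriptor matrix, activities, and model settings are synthetic placeholders, not the paper's Dragon descriptors or tuned models.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.svm import SVR
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score

        # Illustrative only: random placeholders stand in for selected descriptors
        # and activities of the 192 imidazole-containing FTIs used in the paper.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(192, 8))                              # molecular descriptors
        y = X @ rng.normal(size=8) + 0.3 * rng.normal(size=192)    # synthetic activities

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        for name, model in [("MLR", LinearRegression()), ("SVM", SVR(kernel="rbf", C=10.0))]:
            model.fit(X_tr, y_tr)
            print(name, "external R2:", round(r2_score(y_te, model.predict(X_te)), 3))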

  13. Integrating Quantitative and Qualitative Data in Mixed Methods Research--Challenges and Benefits

    ERIC Educational Resources Information Center

    Almalki, Sami

    2016-01-01

    This paper is concerned with investigating the integration of quantitative and qualitative data in mixed methods research and whether, in spite of its challenges, it can be of positive benefit to many investigative studies. The paper introduces the topic, defines the terms with which this subject deals and undertakes a literature review to outline…

  14. Improved GC/MS method for quantitation of n-Alkanes in plant and fecal material

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A gas chromatography-mass spectrometry (GC/MS) method for the quantitation of n-alkanes (carbon backbones ranging from 21 to 36 carbon atoms) in forage and fecal samples has been developed. Automated solid-liquid extraction using elevated temperature and pressure minimized extraction time to 30 min...

  15. Paradigms Lost and Pragmatism Regained: Methodological Implications of Combining Qualitative and Quantitative Methods

    ERIC Educational Resources Information Center

    Morgan, David L.

    2007-01-01

    This article examines several methodological issues associated with combining qualitative and quantitative methods by comparing the increasing interest in this topic with the earlier renewal of interest in qualitative research during the 1980s. The first section argues for the value of Kuhn's concept of paradigm shifts as a tool for examining…

  16. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  17. A GC-FID method for quantitative analysis of N,N-carbonyldiimidazole.

    PubMed

    Lee, Claire; Mangion, Ian

    2016-03-20

    N,N-Carbonyldiimidazole (CDI), a common synthetic reagent used in commercial scale pharmaceutical synthesis, is known to be sensitive to hydrolysis from ambient moisture. This liability demands a simple, robust analytical method to quantitatively determine reagent quality to ensure reproducible performance in chemical reactions. This work describes a protocol for a rapid GC-FID based analysis of CDI.

  18. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  19. A method for the rapid qualitative and quantitative analysis of 4,4-dimethyl sterols.

    PubMed

    Gibbons, G F; Mitropoulos, K A; Ramananda, K

    1973-09-01

    A simple and relatively rapid technique has been developed for the separation of several 4,4-dimethyl steryl acetates, some of which contain sterically hindered nuclear double bonds. The method involves thin-layer chromatography on silver nitrate-impregnated silica gel and silver nitrate-impregnated alumina. The separated steryl acetates may then be analyzed quantitatively by gas-liquid chromatography.

  20. Potential Guidelines for Conducting and Reporting Environmental Education Research: Quantitative Methods of Inquiry.

    ERIC Educational Resources Information Center

    Smith-Sebasto, N. J.

    2001-01-01

    Presents potential guidelines for conducting and reporting environmental education research using quantitative methods of inquiry that were developed during a 10-hour (1-1/2 day) workshop sponsored by the North American Commission on Environmental Education Research during the 1998 annual meeting of the North American Association for Environmental…

  1. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  2. Qualitative and Quantitative Research Methods: Old Wine in New Bottles? On Understanding and Interpreting Educational Phenomena

    ERIC Educational Resources Information Center

    Smeyers, Paul

    2008-01-01

    Generally educational research is grounded in the empirical traditions of the social sciences (commonly called quantitative and qualitative methods) and is as such distinguished from other forms of scholarship such as theoretical, conceptual or methodological essays, critiques of research traditions and practices and those studies grounded in the…

  3. The Use of Quantitative Methods as an Aid to Decision Making in Educational Administration.

    ERIC Educational Resources Information Center

    Alkin, Marvin C.

    Three quantitative methods are outlined, with suggestions for application to particular problem areas of educational administration: (1) The Leontief input-output analysis, incorporating a "transaction table" for displaying relationships between economic outputs and inputs, mainly applicable to budget analysis and planning; (2) linear programming,…

  4. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  5. A Quantitative Comparison of Calibration Methods for RGB-D Sensors Using Different Technologies

    PubMed Central

    Villena-Martínez, Víctor; Fuster-Guilló, Andrés; Azorín-López, Jorge; Saval-Calvo, Marcelo; Mora-Pascual, Jeronimo; Garcia-Rodriguez, Jose; Garcia-Garcia, Alberto

    2017-01-01

    RGB-D (Red Green Blue and Depth) sensors are devices that can provide color and depth information from a scene at the same time. Recently, they have been widely used in many solutions due to their commercial growth from the entertainment market to many diverse areas (e.g., robotics, CAD, etc.). In the research community, these devices have had good uptake due to their acceptable level of accuracy for many applications and their low cost, but in some cases they work at the limit of their sensitivity, near the minimum feature size that can be perceived. For this reason, calibration processes are critical in order to increase their accuracy and enable them to meet the requirements of such kinds of applications. To the best of our knowledge, there is no comparative study of calibration algorithms that evaluates their results on multiple RGB-D sensors. Specifically, in this paper, a comparison of the three most widely used calibration methods has been applied to three different RGB-D sensors based on structured light and time-of-flight. The comparison of methods has been carried out by a set of experiments to evaluate the accuracy of depth measurements. Additionally, an object reconstruction application has been used as an example of an application for which the sensor works at the limit of its sensitivity. The obtained reconstruction results have been evaluated through visual inspection and quantitative measurements. PMID:28134826

  6. Development of a quantitative diagnostic method of estrogen receptor expression levels by immunohistochemistry using organic fluorescent material-assembled nanoparticles.

    PubMed

    Gonda, Kohsuke; Miyashita, Minoru; Watanabe, Mika; Takahashi, Yayoi; Goda, Hideki; Okada, Hisatake; Nakano, Yasushi; Tada, Hiroshi; Amari, Masakazu; Ohuchi, Noriaki

    2012-09-28

    The detection of estrogen receptors (ERs) by immunohistochemistry (IHC) using 3,3'-diaminobenzidine (DAB) is a somewhat weak prognostic marker, but it is essential to the application of endocrine therapy, such as antiestrogen tamoxifen-based therapy. IHC using DAB is a poor quantitative method because horseradish peroxidase (HRP) activity depends on reaction time, temperature and substrate concentration. In contrast, IHC using fluorescent materials is well suited to quantitation because the signal intensity is proportional to the intensity of the photon excitation energy. However, the high level of autofluorescence has impeded the development of quantitative IHC using fluorescence. We developed organic fluorescent material (tetramethylrhodamine)-assembled nanoparticles for IHC. Tissue autofluorescence is comparable to the fluorescence intensity of quantum dots, which are the most representative fluorescent nanoparticles. The fluorescence intensity of our novel nanoparticles was 10.2-fold greater than that of quantum dots, and they did not bind non-specifically to breast cancer tissues owing to the polyethylene glycol chains that coated their surfaces. Therefore, the fluorescence intensity of our nanoparticles significantly exceeded autofluorescence, which produced a significantly higher signal-to-noise ratio on IHC-imaged cancer tissues than previous methods. Moreover, immunostaining data from our nanoparticle fluorescent IHC and IHC with DAB were compared in the same region of adjacent tissue sections to quantitatively examine the two methods. The results demonstrated that our nanoparticle staining analyzed a wide range of ER expression levels with higher accuracy and quantitative sensitivity than DAB staining. This enhancement in the diagnostic accuracy and sensitivity for ERs using our immunostaining method will improve the prediction of responses to therapies that target ERs and progesterone receptors that are induced by a downstream ER signal.

  7. Quantitative and chemical fingerprint analysis for quality control of rhizoma Coptidis chinensis based on UPLC-PAD combined with chemometrics methods.

    PubMed

    Kong, Wei-Jun; Zhao, Yan-Ling; Xiao, Xiao-He; Jin, Cheng; Li, Zu-Lun

    2009-10-01

    To control the quality of rhizoma Coptidis, a method based on ultra performance liquid chromatography with a photodiode array detector (UPLC-PAD) was developed for the quantitative analysis of five active alkaloids and for chemical fingerprint analysis. In the quantitative analysis, the five alkaloids showed good regression (R > 0.9992) within the test ranges, and the recovery of the method was in the range of 98.4-100.8%. The limits of detection and quantification for the five alkaloids in PAD were less than 0.07 and 0.22 microg/ml, respectively. In order to compare the UPLC fingerprints of rhizoma Coptidis from different origins, chemometric procedures, including similarity analysis (SA), hierarchical clustering analysis (HCA), and principal component analysis (PCA), were applied to classify the rhizoma Coptidis samples according to their cultivated origins. Consistent results showed that rhizoma Coptidis samples could be successfully grouped in accordance with their province of origin. Furthermore, five constituents were screened out as the main chemical markers, which could be applied to accurate discrimination and quality control of rhizoma Coptidis by quantitative analysis. This study revealed that the UPLC-PAD method is simple, sensitive and reliable for quantitative and chemical fingerprint analysis, and hence for the quality evaluation and control of rhizoma Coptidis.

  8. A field- and laboratory-based quantitative analysis of alluvium: Relating analytical results to TIMS data

    NASA Technical Reports Server (NTRS)

    Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.

    1995-01-01

    Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, what environmental variable(s), in the absence of bedrock, is/are responsible for influencing the spectral properties of the desert alluvial surface.

  9. Path Integrals and Exotic Options:. Methods and Numerical Results

    NASA Astrophysics Data System (ADS)

    Bormetti, G.; Montagna, G.; Moreni, N.; Nicrosini, O.

    2005-09-01

    In the framework of the Black-Scholes-Merton model of financial derivatives, a path integral approach to option pricing is presented. A general formula to price path-dependent options on multidimensional and correlated underlying assets is obtained and implemented by means of various flexible and efficient algorithms. As an example, we detail the case of Asian call options. The numerical results are compared with those obtained with other procedures used in quantitative finance and are found to be in good agreement. In particular, when pricing at-the-money (ATM) and out-of-the-money (OTM) options, the path integral approach exhibits competitive performance.
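    As a point of comparison for such benchmarks, a plain Monte Carlo pricer for an arithmetic-average Asian call under geometric Brownian motion is sketched below; the function asian_call_mc and the ATM parameter values are illustrative and are not the paper's path-integral algorithm.

        import numpy as np

        def asian_call_mc(s0, strike, r, sigma, maturity, n_steps, n_paths, seed=0):
            """Monte Carlo price of an arithmetic-average Asian call under geometric
            Brownian motion (a standard benchmark, not the paper's path-integral code)."""
            rng = np.random.default_rng(seed)
            dt = maturity / n_steps
            drift = (r - 0.5 * sigma**2) * dt
            vol = sigma * np.sqrt(dt)
            log_paths = np.cumsum(drift + vol * rng.standard_normal((n_paths, n_steps)), axis=1)
            averages = s0 * np.exp(log_paths).mean(axis=1)     # arithmetic average price
            payoff = np.maximum(averages - strike, 0.0)
            return np.exp(-r * maturity) * payoff.mean()

        # ATM example: spot 100, strike 100, 1-year maturity, daily averaging.
        print(asian_call_mc(s0=100, strike=100, r=0.05, sigma=0.2,
                            maturity=1.0, n_steps=252, n_paths=100_000))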

  10. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    PubMed

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival.

  11. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers.

    PubMed

    Harman-Ware, Anne E; Foster, Cliff; Happs, Renee M; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F

    2016-10-01

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput, as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method is reported that is accessible with general laboratory equipment, uses a non-chlorinated organic solvent, and is tailored for higher-throughput analysis. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentration and drying, have been eliminated to aid the consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties representing hardwoods, softwoods, and grasses.

  12. Results of Studying Astronomy Students’ Science Literacy, Quantitative Literacy, and Information Literacy

    NASA Astrophysics Data System (ADS)

    Buxner, Sanlyn; Impey, Chris David; Follette, Katherine B.; Dokter, Erin F.; McCarthy, Don; Vezino, Beau; Formanek, Martin; Romine, James M.; Brock, Laci; Neiberding, Megan; Prather, Edward E.

    2017-01-01

    Introductory astronomy courses often serve as terminal science courses for non-science majors and present an opportunity to assess future non-scientists' attitudes toward science, as well as basic scientific knowledge and scientific analysis skills that may remain unchanged after college. Through a series of studies, we have been able to evaluate students' basic science knowledge, attitudes toward science, quantitative literacy, and information literacy. In the Fall of 2015, we conducted a case study of a single class, administering all relevant surveys to an undergraduate class of 20 students. We will present our analysis of trends from each of these studies as well as the comparison case study. In general, we have found that students' basic scientific knowledge has remained stable over the past quarter century. In all of our studies, there is a strong relationship between student attitudes and their science and quantitative knowledge and skills. Additionally, students' information literacy is strongly connected to their attitudes and basic scientific knowledge. We are currently expanding these studies to include new audiences and will discuss the implications of our findings for instructors.

  13. Quantitative Susceptibility Mapping Using the Multiple Dipole-Inversion Combination with k-space Segmentation Method.

    PubMed

    Sato, Ryota; Shirai, Toru; Taniguchi, Yo; Murase, Takenori; Bito, Yoshitaka; Ochi, Hisaaki

    2017-03-27

    Quantitative susceptibility mapping (QSM) is a new magnetic resonance imaging (MRI) technique for noninvasively estimating the magnetic susceptibility of biological tissue. Several methods for QSM have been proposed. One of these methods can estimate susceptibility with high accuracy in tissues whose contrast is consistent between magnitude images and susceptibility maps, such as deep gray-matter nuclei. However, the susceptibility of small veins is underestimated and not well depicted by this approach, because the contrast of small veins is inconsistent between a magnitude image and a susceptibility map. In order to improve the estimation accuracy and visibility of small veins without streaking artifacts, a multiple dipole-inversion combination with k-space segmentation (MUDICK) method has been proposed. In the proposed method, k-space is divided into three domains (low-frequency, magic-angle, and high-frequency). The k-space data in the low-frequency and magic-angle domains were obtained by L1-norm regularization using structural information from a pre-estimated susceptibility map. The k-space data in the high-frequency domain were obtained from the pre-estimated susceptibility map in order to preserve small-vein contrast. Using numerical simulation and a human brain study at 3 Tesla, streaking artifacts and small-vein susceptibility were compared between MUDICK and conventional methods (MEDI and TKD). The numerical simulation and human brain study showed that neither MUDICK nor MEDI produced severe streaking artifacts, and that MUDICK showed higher contrast and accuracy of susceptibility in small veins compared to MEDI. These results suggest that MUDICK can improve the accuracy and visibility of susceptibility in small veins without severe streaking artifacts.

  14. Electron paramagnetic resonance method for the quantitative assay of ketoconazole in pharmaceutical preparations.

    PubMed

    Morsy, Mohamed A; Sultan, Salah M; Dafalla, Hatim

    2009-08-15

    In this study, electron paramagnetic resonance (EPR) is used, for the first time, as an analytical tool for the quantitative assay of ketoconazole (KTZ) in drug formulations. The drug was successfully characterized by the prominent signals of two radical species produced as a result of its oxidation with 400 microg/mL cerium(IV) in 0.10 mol dm(-3) sulfuric acid. The EPR signal of the reaction mixture was measured in eight capillary tubes housed in a 4 mm EPR sample tube. The radical stability was investigated by obtaining multiple EPR scans of each KTZ sample solution at 2.5 min intervals after the reaction mixing time. The plot of the disappearance of the radical species shows that the disappearance is apparently of zero order. The zero-time intercept of the EPR signal amplitude, which should be proportional to the initial radical concentration, is linear in the sample concentration in the range between 100 and 400 microg/mL, with a correlation coefficient, r, of 0.999. The detection limit was determined to be 11.7 +/- 2.5 microg/mL. The newly adopted method was fully validated following the United States Pharmacopeia (USP) monograph protocol in both the generic and the proprietary forms. The method is very accurate, such that we were able to measure the concentration at confidence levels of 99.9%. The method was also found to be suitable for the assay of KTZ in its tablet and cream pharmaceutical preparations, as no interferences were encountered from excipients of the proprietary drugs. High specificity, simplicity, and rapidity are the merits of the present method compared to previously reported methods.
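    The zero-time extrapolation and linear calibration described above amount to two straight-line fits; the sketch below illustrates them with numpy, using invented amplitudes and calibration points rather than the paper's data.

        import numpy as np

        def zero_time_amplitude(times_min, amplitudes):
            """Extrapolate an apparently zero-order (linear) signal decay back to the
            mixing time t = 0 and return the intercept."""
            slope, intercept = np.polyfit(times_min, amplitudes, 1)
            return intercept

        # Illustrative numbers only: decaying EPR amplitude of one sample, then a
        # calibration line built from zero-time intercepts of standard solutions.
        t = np.array([2.5, 5.0, 7.5, 10.0])
        signal = np.array([410.0, 395.0, 381.0, 366.0])
        a0 = zero_time_amplitude(t, signal)

        calib_conc = np.array([100, 200, 300, 400])          # micrograms per mL
        calib_a0 = np.array([150.0, 302.0, 449.0, 601.0])    # zero-time amplitudes
        k, b = np.polyfit(calib_conc, calib_a0, 1)
        print("estimated concentration:", (a0 - b) / k, "ug/mL")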

  15. Linking multidimensional functional diversity to quantitative methods: a graphical hypothesis-evaluation framework.

    PubMed

    Boersma, Kate S; Dee, Laura E; Miller, Steve J; Bogan, Michael T; Lytle, David A; Gitelman, Alix I

    2016-03-01

    Functional trait analysis is an appealing approach to study differences among biological communities because traits determine species' responses to the environment and their impacts on ecosystem functioning. Despite a rapidly expanding quantitative literature, it remains challenging to conceptualize concurrent changes in multiple trait dimensions ("trait space") and select quantitative functional diversity methods to test hypotheses prior to analysis. To address this need, we present a widely applicable framework for visualizing ecological phenomena in trait space to guide the selection, application, and interpretation of quantitative functional diversity methods. We describe five hypotheses that represent general patterns of responses to disturbance in functional community ecology and then apply a formal decision process to determine appropriate quantitative methods to test ecological hypotheses. As a part of this process, we devise a new statistical approach to test for functional turnover among communities. Our combination of hypotheses and metrics can be applied broadly to address ecological questions across a range of systems and study designs. We illustrate the framework with a case study of disturbance in freshwater communities. This hypothesis-driven approach will increase the rigor and transparency of applied functional trait studies.

  16. Nanoparticle-mediated photothermal effect enables a new method for quantitative biochemical analysis using a thermometer

    NASA Astrophysics Data System (ADS)

    Fu, Guanglei; Sanjay, Sharma T.; Dou, Maowei; Li, Xiujun

    2016-03-01

    A new biomolecular quantitation method, nanoparticle-mediated photothermal bioassay, using a common thermometer as the signal reader was developed. Using an immunoassay as a proof of concept, iron oxide nanoparticles (NPs) captured in the sandwich-type assay system were transformed into a near-infrared (NIR) laser-driven photothermal agent, Prussian blue (PB) NPs, which acted as a photothermal probe to convert the assay signal into heat through the photothermal effect, thus allowing sensitive biomolecular quantitation using a thermometer. This is the first report of biomolecular quantitation using a thermometer and also serves as the first attempt to introduce the nanoparticle-mediated photothermal effect for bioassays.

  17. Reconciling incongruous qualitative and quantitative findings in mixed methods research: exemplars from research with drug using populations.

    PubMed

    Wagner, Karla D; Davidson, Peter J; Pollini, Robin A; Strathdee, Steffanie A; Washburn, Rachel; Palinkas, Lawrence A

    2012-01-01

    Mixed methods research is increasingly being promoted in the health sciences as a way to gain more comprehensive understandings of how social processes and individual behaviours shape human health. Mixed methods research most commonly combines qualitative and quantitative data collection and analysis strategies. Often, integrating findings from multiple methods is assumed to confirm or validate the findings from one method with the findings from another, seeking convergence or agreement between methods. Cases in which findings from different methods are congruous are generally thought of as ideal, whilst conflicting findings may, at first glance, appear problematic. However, the latter situation provides the opportunity for a process through which apparently discordant results are reconciled, potentially leading to new emergent understandings of complex social phenomena. This paper presents three case studies drawn from the authors' research on HIV risk amongst injection drug users in which mixed methods studies yielded apparently discrepant results. We use these case studies (involving injection drug users [IDUs] using a Needle/Syringe Exchange Program in Los Angeles, CA, USA; IDUs seeking to purchase needle/syringes at pharmacies in Tijuana, Mexico; and young street-based IDUs in San Francisco, CA, USA) to identify challenges associated with integrating findings from mixed methods projects, summarize lessons learned, and make recommendations for how to more successfully anticipate and manage the integration of findings. Despite the challenges inherent in reconciling apparently conflicting findings from qualitative and quantitative approaches, in keeping with others who have argued in favour of integrating mixed methods findings, we contend that such an undertaking has the potential to yield benefits that emerge only through the struggle to reconcile discrepant results and may provide a sum that is greater than the individual qualitative and quantitative parts.

  18. Reconciling incongruous qualitative and quantitative findings in mixed methods research: exemplars from research with drug using populations

    PubMed Central

    Wagner, Karla D.; Davidson, Peter J.; Pollini, Robin A.; Strathdee, Steffanie A.; Washburn, Rachel; Palinkas, Lawrence A.

    2011-01-01

    Mixed methods research is increasingly being promoted in the health sciences as a way to gain more comprehensive understandings of how social processes and individual behaviours shape human health. Mixed methods research most commonly combines qualitative and quantitative data collection and analysis strategies. Often, integrating findings from multiple methods is assumed to confirm or validate the findings from one method with the findings from another, seeking convergence or agreement between methods. Cases in which findings from different methods are congruous are generally thought of as ideal, while conflicting findings may, at first glance, appear problematic. However, the latter situation provides the opportunity for a process through which apparently discordant results are reconciled, potentially leading to new emergent understandings of complex social phenomena. This paper presents three case studies drawn from the authors’ research on HIV risk among injection drug users in which mixed methods studies yielded apparently discrepant results. We use these case studies (involving injection drug users [IDUs] using a needle/syringe exchange program in Los Angeles, California, USA; IDUs seeking to purchase needle/syringes at pharmacies in Tijuana, Mexico; and young street-based IDUs in San Francisco, CA, USA) to identify challenges associated with integrating findings from mixed methods projects, summarize lessons learned, and make recommendations for how to more successfully anticipate and manage the integration of findings. Despite the challenges inherent in reconciling apparently conflicting findings from qualitative and quantitative approaches, in keeping with others who have argued in favour of integrating mixed methods findings, we contend that such an undertaking has the potential to yield benefits that emerge only through the struggle to reconcile discrepant results and may provide a sum that is greater than the individual qualitative and quantitative parts.

  19. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Zakharov, S.M.

    1997-01-01

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation.

  20. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Zakharov, Sergei M.

    1997-01-10

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation.

  1. An Evaluation of Quantitative Methods of Determining the Degree of Melting Experienced by a Chondrule

    NASA Technical Reports Server (NTRS)

    Nettles, J. W.; Lofgren, G. E.; Carlson, W. D.; McSween, H. Y., Jr.

    2004-01-01

    Many workers have considered the degree to which partial melting occurred in chondrules they have studied, and this has led to attempts to find reliable methods of determining the degree of melting. At least two quantitative methods have been used in the literature: a convolution index (CVI), which is the ratio of the perimeter of the chondrule as seen in thin section to the perimeter of a circle with the same area as the chondrule, and nominal grain size (NGS), which is the inverse square root of the number density of olivines and pyroxenes in a chondrule (again, as seen in thin section). We have evaluated both nominal grain size and convolution index as melting indicators. Nominal grain size was measured on the results of a set of dynamic crystallization experiments previously described, where aliquots of LEW97008(L3.4) were heated to peak temperatures of 1250, 1350, 1370, and 1450 C, representing varying degrees of partial melting of the starting material. Nominal grain size should correlate with peak temperature (and therefore degree of partial melting) if it is a good melting indicator. The convolution index is not directly testable with these experiments because the experiments do not actually create chondrules (and therefore there is no outline on which to measure a CVI). Thus we had no means to directly test how well the CVI predicts different degrees of melting. Therefore, we discuss the use of the CVI measurement and support the discussion with X-ray Computed Tomography (CT) data.
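    Written out from the definitions quoted above (with P the chondrule perimeter in thin section, A its cross-sectional area, and n the areal number density of olivine and pyroxene grains; these symbols are introduced here only for illustration), the two indices are:

        \mathrm{CVI} = \frac{P}{P_{\mathrm{circle}}} = \frac{P}{2\sqrt{\pi A}},
        \qquad
        \mathrm{NGS} = \frac{1}{\sqrt{n}}

    A perfectly circular outline gives CVI = 1, with larger values indicating a more convoluted margin, while NGS increases as the grains become fewer and coarser.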

  2. Laser flare photometry: a noninvasive, objective, and quantitative method to measure intraocular inflammation.

    PubMed

    Tugal-Tutkun, Ilknur; Herbort, Carl P

    2010-10-01

    Aqueous flare and cells are the two inflammatory parameters of anterior chamber inflammation resulting from disruption of the blood-ocular barriers. When examined with the slit lamp, measurement of intraocular inflammation remains subjective with considerable intra- and interobserver variations. Laser flare cell photometry is an objective quantitative method that enables accurate measurement of these parameters with very high reproducibility. Laser flare photometry allows detection of subclinical alterations in the blood-ocular barriers, identifying subtle pathological changes that could not have been recorded otherwise. With the use of this method, it has been possible to compare the effect of different surgical techniques, surgical adjuncts, and anti-inflammatory medications on intraocular inflammation. Clinical studies of uveitis patients have shown that flare measurements by laser flare photometry allowed precise monitoring of well-defined uveitic entities and prediction of disease relapse. Relationships of laser flare photometry values with complications of uveitis and visual loss further indicate that flare measurement by laser flare photometry should be included in the routine follow-up of patients with uveitis.

  3. Identification and quantitation method for nonylphenol and lower oligomer nonylphenol ethoxylates in fish tissues.

    PubMed

    Snyder, S A; Keith, T L; Naylor, C G; Staples, C A; Giesy, J P

    2001-09-01

    Substantial research is currently focused on the toxicological effects of alkylphenol ethoxylates (APEs) and alkylphenols (APs) on aquatic animals. Considerable data are available on the concentrations of APEs and APs in river systems in the United States; however, few if any data are available on the tissue concentrations of fish living in these rivers. A reliable method for the analysis of nonylphenol (NP) and lower oligomer nonylphenol ethoxylates (NPE1-3) in fish tissues has been developed. Nonylphenol and NPE1-3 were extracted from fish tissues using extractive steam distillation. Normal phase high-performance liquid chromatography (HPLC) was used as a cleanup step prior to analysis by gas chromatography with mass selective detection (GC/MSD) using selected ion monitoring. Optimization of this technique resulted in consistent recoveries in excess of 70%, with the exception of NPE3 (17%). Method detection limits (MDLs) and limits of quantitation using the technique range from 3 to 20 and 5 to 29 ng/g wet weight, respectively. Nonylphenol and NPE1 were detected in subsamples (n = 6) of a single common carp captured in the Las Vegas Bay of Lake Mead (NV, USA) at average concentrations of 184+/-4 ng/g and 242+/-9 ng/g wet weight, respectively. The remaining nonylphenol ethoxylates were not detected in the carp collected at Lake Mead.

  4. A quantitative cell modeling and wound-healing analysis based on the Electric Cell-substrate Impedance Sensing (ECIS) method.

    PubMed

    Yang, Jen Ming; Chen, Szi-Wen; Yang, Jhe-Hao; Hsu, Chih-Chin; Wang, Jong-Shyan

    2016-02-01

    In this paper, a quantitative modeling and wound-healing analysis of fibroblast and human keratinocyte cells is presented. Our study was conducted using a continuous cellular impedance monitoring technique, dubbed Electric Cell-substrate Impedance Sensing (ECIS). In a previous work, we constructed a mathematical model for quantitatively analyzing cultured cell growth using the time series data directly derived from ECIS. In this study, the applicability of our model to keratinocyte cell growth modeling was assessed first. In addition, an electrical "wound-healing" assay was used as a means to evaluate the healing process of keratinocyte cells at a variety of pressures. Two newly defined indicators, dubbed cell power and cell electroactivity, were developed for quantitatively characterizing the biophysical behavior of cells. We then employed the wavelet transform method to perform a multi-scale analysis so that cell power and cell electroactivity across multiple observational time scales could be captured. Numerical results indicated that our model can fit the data measured from the keratinocyte cell culture well for cell growth modeling analysis. Also, the results produced by our quantitative analysis showed that the wound healing process was fastest at a negative pressure of 125 mmHg, which agrees with the qualitative analysis results reported in previous works.
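    A minimal sketch of the kind of multi-scale decomposition mentioned above is given below using PyWavelets; the wavelet choice, decomposition level, energy summary, and synthetic impedance trace are assumptions for illustration, not the authors' exact indicators.

        import numpy as np
        import pywt   # PyWavelets

        def multiscale_energy(impedance, wavelet="db4", level=5):
            """Decompose an impedance time series into several observational scales and
            return the signal energy at each scale (an illustrative proxy, not the
            paper's cell power or cell electroactivity definitions)."""
            coeffs = pywt.wavedec(np.asarray(impedance, dtype=float), wavelet, level=level)
            return [float(np.sum(c**2)) for c in coeffs]   # [approximation, detail_L, ..., detail_1]

        # Example: a synthetic slow-growth trend plus fast micromotion fluctuations.
        t = np.arange(0, 24, 1 / 60.0)                     # 24 h sampled once per minute
        trace = 500 * (1 - np.exp(-t / 8)) + 5 * np.random.randn(t.size)
        print(multiscale_energy(trace))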

  5. Quantitative evaluation of registration methods for atlas-based diffuse optical tomography

    NASA Astrophysics Data System (ADS)

    Wu, Xue; Eggebrecht, Adam T.; Culver, Joseph P.; Zhan, Yuxuan; Basevi, Hector; Dehghani, Hamid

    2013-06-01

    In Diffuse Optical Tomography (DOT), an atlas-based model can be used as an alternative to a subject-specific anatomical model for recovery of brain activity. The main step in generating an atlas-based subject model is the registration of the atlas model to the subject head, so the accuracy of the DOT reconstruction relies on the accuracy of the registration method. In this work, 11 registration methods are quantitatively evaluated. The registration method based on the EEG 10/20 system with 19 landmarks and a non-iterative point-to-point algorithm provides a surface error of approximately 1.4 mm and is considered the most efficient registration method.
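    A non-iterative point-to-point alignment of corresponding landmarks can be computed in closed form; the sketch below uses a generic Kabsch-style rigid fit as a stand-in (the function rigid_landmark_fit and the synthetic landmarks are assumptions, and the paper's algorithm may include additional steps such as scaling).

        import numpy as np

        def rigid_landmark_fit(atlas_pts, subject_pts):
            """Closed-form rigid alignment of corresponding 3-D landmarks (Kabsch style):
            returns rotation R and translation t so that R @ atlas + t approximates subject."""
            a = atlas_pts - atlas_pts.mean(axis=0)
            b = subject_pts - subject_pts.mean(axis=0)
            u, _, vt = np.linalg.svd(a.T @ b)
            d = np.sign(np.linalg.det(vt.T @ u.T))          # guard against reflections
            rot = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
            trans = subject_pts.mean(axis=0) - rot @ atlas_pts.mean(axis=0)
            return rot, trans

        # Synthetic check: 19 landmarks (e.g. EEG 10/20 points), known rotation + shift.
        rng = np.random.default_rng(1)
        atlas = rng.normal(size=(19, 3))
        theta = np.radians(10)
        true_r = np.array([[np.cos(theta), -np.sin(theta), 0],
                           [np.sin(theta),  np.cos(theta), 0],
                           [0, 0, 1]])
        subject = atlas @ true_r.T + np.array([5.0, -2.0, 1.0])
        r, t = rigid_landmark_fit(atlas, subject)
        print(np.allclose(r, true_r), np.round(t, 3))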

  6. Precision of dehydroascorbic acid quantitation with the use of the subtraction method--validation of HPLC-DAD method for determination of total vitamin C in food.

    PubMed

    Mazurek, Artur; Jamroz, Jerzy

    2015-04-15

    In food analysis, a method for the determination of vitamin C should enable measurement of the total content of ascorbic acid (AA) and dehydroascorbic acid (DHAA), because both chemical forms exhibit biological activity. The aim of the work was to confirm the applicability of an HPLC-DAD method for analysis of the total content of vitamin C (TC) and ascorbic acid in various types of food by determination of validation parameters such as selectivity, precision, accuracy, linearity, and limits of detection and quantitation. The results showed that the method applied for determination of TC and AA was selective, linear and precise. The precision of DHAA determination by the subtraction method was also evaluated. It was revealed that the results of DHAA determination obtained by the subtraction method were not precise, which follows directly from the assumption of this method and the principles of uncertainty propagation. The proposed chromatographic method can be recommended for routine determinations of total vitamin C in various foods.
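    The imprecision of the subtraction approach follows from standard propagation of independent uncertainties; with c denoting concentrations and u() standard uncertainties (notation introduced here for illustration):

        c_{\mathrm{DHAA}} = c_{\mathrm{TC}} - c_{\mathrm{AA}},
        \qquad
        u(c_{\mathrm{DHAA}}) = \sqrt{u(c_{\mathrm{TC}})^{2} + u(c_{\mathrm{AA}})^{2}}

    Because the absolute uncertainty of the difference is at least as large as that of either measured quantity, the relative uncertainty of DHAA becomes large whenever DHAA is a small difference between two similar, much larger values.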

  7. Quantitative interferometric microscopy with two dimensional Hilbert transform based phase retrieval method

    NASA Astrophysics Data System (ADS)

    Wang, Shouyu; Yan, Keding; Xue, Liang

    2017-01-01

    In order to obtain high-contrast images and detailed descriptions of label-free samples, quantitative interferometric microscopy is combined with phase retrieval to obtain sample phase distributions from fringe patterns. As the accuracy and efficiency of the recovered phases are affected by the phase retrieval method, approaches offering higher precision and faster processing are still in demand. Here, a two-dimensional Hilbert transform based phase retrieval method is adopted for cellular phase imaging; it not only preserves more sample detail than the classical fast Fourier transform based method, but also overcomes the drawback of the traditional Hilbert transform algorithm, whose one-dimensional processing can cause phase ambiguities. Both simulations and experiments are provided, proving that the proposed phase retrieval approach can acquire quantitative sample phases with high accuracy and at high speed.
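    For context, the classical row-wise (1-D) Hilbert demodulation that the 2-D method improves on can be sketched as below; the function fringe_phase_rowwise, the carrier period, and the synthetic interferogram are assumptions for illustration.

        import numpy as np
        from scipy.signal import hilbert

        def fringe_phase_rowwise(interferogram, carrier_period_px):
            """Baseline fringe demodulation: a 1-D Hilbert transform applied row by row
            (the classical approach; the paper's true 2-D variant avoids its ambiguities).
            Returns the wrapped phase with the linear carrier removed."""
            analytic = hilbert(interferogram - interferogram.mean(), axis=1)
            wrapped = np.angle(analytic)
            cols = np.arange(interferogram.shape[1])
            carrier = 2 * np.pi * cols / carrier_period_px      # carrier assumed along x
            return np.angle(np.exp(1j * (wrapped - carrier)))   # re-wrap after removal

        # Synthetic off-axis interferogram of a smooth phase bump (illustrative only).
        y, x = np.mgrid[0:256, 0:256]
        phase = 2.0 * np.exp(-((x - 128) ** 2 + (y - 128) ** 2) / (2 * 40 ** 2))
        fringes = 1 + np.cos(2 * np.pi * x / 8 + phase)
        print(np.round(fringe_phase_rowwise(fringes, carrier_period_px=8)[128, 100:105], 2))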

  8. Qualification of HSQC methods for quantitative composition of heparin and low molecular weight heparins.

    PubMed

    Mauri, Lucio; Boccardi, Giovanni; Torri, Giangiacomo; Karfunkle, Michael; Macchi, Eleonora; Muzi, Laura; Keire, David; Guerrini, Marco

    2017-03-20

    An NMR HSQC method has recently been proposed for the quantitative determination of the mono- and disaccharide subunits of heparin and low molecular weight heparins (LMWH). The focus of the current study was the validation of this procedure to make the 2D-NMR method suitable for pharmaceutical quality control applications. Pre-validation work investigated the effects of several experimental parameters to assess robustness and to optimize critical factors. Important experimental parameters were pulse sequence selection, equilibration interval between pulse trains and temperature. These observations were needed so that the NMR method was sufficiently understood to enable continuous improvement. A standard validation study on heparin then examined linearity, repeatability, intermediate precision and limits of detection and quantitation; selected validation parameters were also determined for LMWH.

  9. Quantitative proteomics: assessing the spectrum of in-gel protein detection methods

    PubMed Central

    Gauci, Victoria J.; Wright, Elise P.

    2010-01-01

    Proteomics research relies heavily on visualization methods for detection of proteins separated by polyacrylamide gel electrophoresis. Commonly used staining approaches involve colorimetric dyes such as Coomassie Brilliant Blue, fluorescent dyes including Sypro Ruby, newly developed reactive fluorophores, as well as a plethora of others. The most desired characteristic in selecting one stain over another is sensitivity, but this is far from the only important parameter. This review evaluates protein detection methods in terms of their quantitative attributes, including limit of detection (i.e., sensitivity), linear dynamic range, inter-protein variability, capacity for spot detection after 2D gel electrophoresis, and compatibility with subsequent mass spectrometric analyses. Unfortunately, many of these quantitative criteria are not routinely or consistently addressed by most of the studies published to date. We would urge more rigorous routine characterization of stains and detection methodologies as a critical approach to systematically improving these critically important tools for quantitative proteomics. In addition, substantial improvements in detection technology, particularly over the last decade or so, emphasize the need to consider renewed characterization of existing stains; the quantitative stains we need, or at least the chemistries required for their future development, may well already exist. PMID:21686332

  10. Pleistocene Lake Bonneville and Eberswalde Crater of Mars: Quantitative Methods for Recognizing Poorly Developed Lacustrine Shorelines

    NASA Astrophysics Data System (ADS)

    Jewell, P. W.

    2014-12-01

    The ability to quantify shoreline features on Earth has been aided by advances in the acquisition of high-resolution topography through laser imaging and photogrammetry. Well-defined and well-documented features such as the Bonneville, Provo, and Stansbury shorelines of Late Pleistocene Lake Bonneville are recognizable to the untrained eye and easily mappable on aerial photos. The continuity and correlation of lesser shorelines must rely on quantitative algorithms for processing high-resolution data in order to gain widespread scientific acceptance. Using Savitzky-Golay filters and the geomorphic methods and criteria described by Hare et al. [2001], minor, transgressive, erosional shorelines of Lake Bonneville have been identified and correlated across the basin with varying degrees of statistical confidence. The results solve one of the key paradoxes of Lake Bonneville first described by G. K. Gilbert in the late 19th century and point the way toward understanding climatically driven oscillations of the Last Glacial Maximum in the Great Basin of the United States. Similar techniques have been applied to the Eberswalde Crater area of Mars using HiRISE DEMs (1 m horizontal resolution), where a paleolake is hypothesized to have existed. The results illustrate the challenges of identifying shorelines where long-term aeolian processes have degraded the shorelines and field validation is not possible. The work illustrates the promise and challenges of identifying remnants of a global ocean elsewhere on the red planet.
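    The sketch below shows one way a Savitzky-Golay filter could be used to pick out a subtle low-gradient bench in a noisy elevation profile; the function bench_mask, the window, polynomial order, slope cutoff, and synthetic profile are illustrative assumptions, not the criteria of Hare et al. [2001] or of this study.

        import numpy as np
        from scipy.signal import savgol_filter

        def bench_mask(elevation, spacing_m, window=21, poly=3, slope_cut=0.1):
            """Smooth a topographic profile with a Savitzky-Golay filter and flag
            low-slope segments as candidate wave-cut benches."""
            slope = savgol_filter(elevation, window, poly, deriv=1, delta=spacing_m)
            return np.abs(slope) < slope_cut

        # Synthetic profile: a 0.2 gradient hillslope with one gentler (0.02) bench.
        x = np.arange(0.0, 500.0, 1.0)                         # 1 m horizontal spacing
        slopes = np.where((x >= 200) & (x < 260), 0.02, 0.2)
        z = np.cumsum(slopes) + np.random.normal(0, 0.05, x.size)
        mask = bench_mask(z, spacing_m=1.0)
        print("candidate bench between", x[mask].min(), "and", x[mask].max(), "m")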

  11. Apparatus and method for quantitatively evaluating total fissile and total fertile nuclide content in samples

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Cates, Michael R.; Franks, Larry A.

    1985-01-01

    Simultaneous photon and neutron interrogation of samples for the quantitative determination of total fissile nuclide and total fertile nuclide material present is made possible by the use of an electron accelerator. Prompt and delayed neutrons produced from resulting induced fissions are counted using a single detection system and allow the resolution of the contributions from each interrogating flux leading in turn to the quantitative determination sought. Detection limits for ²³⁹Pu are estimated to be about 3 mg using prompt fission neutrons and about 6 mg using delayed neutrons.

  12. Methods of experimentation with models and utilization of results

    NASA Technical Reports Server (NTRS)

    Robert,

    1924-01-01

    The present report treats the subject of testing small models in a wind tunnel and of the methods employed for rendering the results constant, accurate and comparable with one another. Detailed experimental results are given.

  13. Method Development and Validation of a Stability-Indicating RP-HPLC Method for the Quantitative Analysis of Dronedarone Hydrochloride in Pharmaceutical Tablets

    PubMed Central

    Dabhi, Batuk; Jadeja, Yashwantsinh; Patel, Madhavi; Jebaliya, Hetal; Karia, Denish; Shah, Anamik

    2013-01-01

    A simple, precise, and accurate HPLC method has been developed and validated for the quantitative analysis of Dronedarone Hydrochloride in tablet form. An isocratic separation was achieved using a Waters Symmetry C8 (100 × 4.6 mm), 5 μm particle size column with a flow rate of 1 ml/min and UV detection at 290 nm. The mobile phase consisted of buffer: methanol (40:60 v/v) (buffer: 50 mM KH2PO4 + 1 ml triethylamine in 1 liter water, pH=2.5 adjusted with ortho-phosphoric acid). The method was validated for specificity, linearity, precision, accuracy, robustness, and solution stability. The specificity of the method was determined by assessing interference from the placebo and by stress testing the drug (forced degradation). The method was linear over the concentration range 20–80 μg/ml (r² = 0.999) with a Limit of Detection (LOD) and Limit of Quantitation (LOQ) of 0.1 and 0.3 μg/ml respectively. The accuracy of the method was between 99.2% and 100.5%. The method was found to be robust and suitable for the quantitative analysis of Dronedarone Hydrochloride in a tablet formulation. Degradation products resulting from the stress studies did not interfere with the detection of Dronedarone Hydrochloride; the assay is thus stability-indicating. PMID:23641332
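
    The LOD and LOQ quoted above are conventionally derived from the calibration line; the short calculation below illustrates that arithmetic using the generic ICH-style formulas LOD = 3.3σ/S and LOQ = 10σ/S, with made-up calibration points rather than the authors' data.

    ```python
    # Hedged illustration of ICH-style LOD/LOQ estimation from a calibration line.
    import numpy as np

    conc = np.array([20, 35, 50, 65, 80], dtype=float)           # ug/ml (hypothetical)
    area = np.array([402, 703, 1001, 1302, 1598], dtype=float)   # detector response (hypothetical)

    slope, intercept = np.polyfit(conc, area, 1)      # least-squares calibration line
    residuals = area - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)                     # residual SD of the regression

    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    r2 = np.corrcoef(conc, area)[0, 1] ** 2
    print(f"slope={slope:.2f}, r^2={r2:.4f}, LOD={lod:.2f} ug/ml, LOQ={loq:.2f} ug/ml")
    ```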

  14. Depth determination for shallow teleseismic earthquakes Methods and results

    SciTech Connect

    Stein, S.; Wiens, D.A.

    1986-11-01

    Contemporary methods used to determine depths of moderate-sized shallow teleseismic earthquakes are described. These include techniques based on surface wave spectra, and methods which estimate focal depth from the waveforms of body waves. The advantages of different methods and their limitations are discussed, and significant results for plate tectonics, obtained in the last five years by the application of these methods, are presented. 119 references.

  15. Depth determination for shallow teleseismic earthquakes Methods and results

    NASA Technical Reports Server (NTRS)

    Stein, Seth; Wiens, Douglas A.

    1986-01-01

    Contemporary methods used to determine depths of moderate-sized shallow teleseismic earthquakes are described. These include techniques based on surface wave spectra, and methods which estimate focal depth from the waveforms of body waves. The advantages of different methods and their limitations are discussed, and significant results for plate tectonics, obtained in the last five years by the application of these methods, are presented.

  16. [Quantitative bacteriological evaluation of a method for skin disinfection in blood donors].

    PubMed

    Folléa, G; Saint-Laurent, P; Bigey, F; Gayet, S; Bientz, M; Cazenave, J P

    1997-12-01

    Skin disinfection at the site of venipuncture is a critical point in every blood transfusion collection procedure, as it helps to ensure the bacterial safety of transfusion. Quantitative and qualitative analysis of bacteria present in the antecubital fossae before and after skin disinfection may be one method of assessing the anti-bacterial efficiency of disinfection. Swab culture systems and contact plates are the two techniques usually employed for this purpose. A washing and swabbing technique was used to quantify bacteria before and after skin disinfection of the antecubital fossae in blood donors. This placebo-controlled study was carried out on 32 donors, each of whom served as his own control, with a random choice of test arm and opposing control arm. Bacterial counts were determined in the antecubital fossae without skin disinfection (control, n = 32) and after a 3-step skin preparation procedure (cleaning, wiping, disinfection) using placebo (distilled water, n = 16) or an antiseptic product (mixture of chlorhexidine, benzalkonium chloride and benzyl alcohol, n = 16). The absence of a statistical difference in bacterial counts between the right and left antecubital fossae without disinfection was confirmed in a preliminary study of 20 subjects. Mean bacterial counts were 25,000/cm2 and 27,400/cm2 respectively for aerobic and anaerobic bacteria before disinfection, with a wide variation in results between individuals. When using placebo, preparation of the venipuncture site by the 3-step method (cleaning, wiping, disinfection) resulted in a non-significant mean reduction of 0.56 log in aerobic and anaerobic bacteria. Using the antiseptic product, the same method resulted in a significant mean reduction of 1.8 and 1.7 log respectively in aerobic (p = 0.015) and anaerobic flora (p = 0.005). On average, 2,750 aerobic bacteria/cm2 and 2,910 anaerobic bacteria/cm2 remained after disinfection, while qualitative analysis showed that disinfection suppressed the
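
    A "log reduction" as reported above is simply a difference of base-10 logarithms. The snippet below shows the arithmetic; the per-donor counts are hypothetical, and the comparison of group means need not reproduce the paper's mean per-donor reduction, since a mean of per-donor log reductions generally differs from the log of the ratio of group means.

    ```python
    # Worked log-reduction arithmetic (illustrative numbers only).
    import math

    mean_before = 25_000   # aerobic bacteria/cm2 before disinfection (group mean from the study)
    mean_after = 2_750     # aerobic bacteria/cm2 after the antiseptic procedure (group mean)
    print(f"log10 reduction of group means: {math.log10(mean_before / mean_after):.2f}")

    # Per-donor log reduction (hypothetical counts for a single donor)
    donor_before, donor_after = 40_000, 630
    print(f"per-donor log reduction: {math.log10(donor_before / donor_after):.2f}")
    ```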

  17. Raman spectroscopy provides a rapid, non-invasive method for quantitation of starch in live, unicellular microalgae.

    PubMed

    Ji, Yuetong; He, Yuehui; Cui, Yanbin; Wang, Tingting; Wang, Yun; Li, Yuanguang; Huang, Wei E; Xu, Jian

    2014-12-01

    Conventional methods for quantitation of starch content in cells generally involve starch extraction steps and are usually labor intensive, thus a rapid and non-invasive method will be valuable. Using the starch-producing unicellular microalga Chlamydomonas reinhardtii as a model, we employed a customized Raman spectrometer to capture the Raman spectra of individual single cells under distinct culture conditions and along various growth stages. The results revealed a nearly linear correlation (R(2) = 0.9893) between the signal intensity at 478 cm(-1) and the starch content of the cells. We validated the specific correlation by showing that the starch-associated Raman peaks were eliminated in a mutant strain where the AGPase (ADP-glucose pyrophosphorylase) gene was disrupted and consequentially the biosynthesis of starch blocked. Furthermore, the method was validated in an industrial algal strain of Chlorella pyrenoidosa. This is the first demonstration of starch quantitation in individual live cells. Compared to existing cellular starch quantitation methods, this single-cell Raman spectra-based approach is rapid, label-free, non-invasive, culture-independent, low-cost, and potentially able to simultaneously track multiple metabolites in individual live cells, therefore should enable many new applications.

  18. Evaluating the Economic Impact of Smart Care Platforms: Qualitative and Quantitative Results of a Case Study

    PubMed Central

    Van der Auwermeulen, Thomas; Van Ooteghem, Jan; Jacobs, An; Verbrugge, Sofie; Colle, Didier

    2016-01-01

    Background In response to the increasing societal pressure of a graying society, a wave of new Information and Communication Technology (ICT)-supported care services (eCare) has emerged. Their common goal is to increase the quality of care while decreasing its costs. Smart Care Platforms (SCPs), installed in the homes of care-dependent people, foster the interoperability of these services and offer a set of eCare services that are complementary on one platform. These eCare services could not only result in higher-quality care for care receivers, but also offer care providers opportunities to optimize their processes. Objective The objective of the study was to identify and describe the expected added values and impacts of integrating SCPs in current home care delivery processes for all actors. In addition, the potential economic impact of SCP deployment is quantified from the perspective of home care organizations. Methods Semistructured and informal interviews, focus groups, and cocreation workshops with service providers, managers of home care organizations, and formal and informal care providers led to the identification of added values of SCP integration. In a second step, process breakdown analyses of home care provisioning allowed the operational impact for home care organizations to be defined. Impacts on 2 different process steps of providing home care were quantified. After modeling the investment, an economic evaluation compared the business as usual (BAU) scenario versus the integrated SCP scenario. Results The added value of SCP integration for all actors involved in home care was identified. Most impacts were qualitative, such as increased peace of mind, better quality of care, strengthened involvement in care provisioning, and more transparent care communication. For home care organizations, integrating SCPs could lead to a decrease of 38% in the current annual expenses for two administrative process steps, namely

  19. A novel method for morphological pleomorphism and heterogeneity quantitative measurement: Named cell feature level co-occurrence matrix

    PubMed Central

    Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro

    2016-01-01

    Background: Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological analysis approach to pathology diagnosis, which can connect these molecular data and clinical diagnosis, is still mostly subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts still remain largely non-quantitative and their associated data measurements depend on the pathologist's sense and experience. Such features include pleomorphism and heterogeneity. Methods and Results: In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity, using the cell-level co-occurrence matrix. Our method is based on the widely used gray-level co-occurrence matrix (GLCM), where relations between neighboring pixel intensity levels are captured into a co-occurrence matrix, followed by the application of analysis functions such as Haralick features. In the pathological tissue image, through image processing techniques, each nucleus can be measured, and each nucleus has its own measurable features such as size, roundness, contour length, and intra-nucleus texture data (GLCM is one such method). In our GLCM-like approach, each nucleus in the tissue image plays the role of one pixel, and the most important point is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, then create the co-occurrence matrix and apply Haralick feature functions. In each image, pleomorphism and heterogeneity are then determined quantitatively. Because one pixel corresponds to one nucleus feature, we named our method the Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features. Conclusion: CFLCM is shown to be a useful quantitative method for pleomorphism
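
    To make the co-occurrence idea concrete, the sketch below builds a toy "cell feature level" co-occurrence matrix: nucleus centroids and a single per-nucleus feature are random placeholders, the neighborhood is simply the k nearest nuclei (the paper defines three neighborhood types of its own), and only one Haralick-style statistic (contrast) is computed.

    ```python
    # Hedged sketch of a nucleus-level co-occurrence matrix with a Haralick-style contrast.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 1000, size=(200, 2))    # nucleus centroids (placeholder)
    feature = rng.normal(60, 15, size=200)      # a per-nucleus feature, e.g. area (placeholder)

    levels = 8
    edges = np.quantile(feature, np.linspace(0, 1, levels + 1))
    level = np.clip(np.digitize(feature, edges[1:-1]), 0, levels - 1)

    _, nbrs = cKDTree(xy).query(xy, k=4)        # each nucleus plus its 3 nearest neighbours

    cooc = np.zeros((levels, levels))
    for i, row in enumerate(nbrs):
        for j in row[1:]:                       # skip the nucleus itself
            cooc[level[i], level[j]] += 1
    cooc /= cooc.sum()

    ii, jj = np.indices(cooc.shape)
    contrast = np.sum((ii - jj) ** 2 * cooc)    # high when neighbouring nuclei differ strongly
    print(f"cell-feature-level co-occurrence contrast: {contrast:.3f}")
    ```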

  20. A method for operative quantitative interpretation of multispectral images of biological tissues

    NASA Astrophysics Data System (ADS)

    Lisenko, S. A.; Kugeiko, M. M.

    2013-10-01

    A method for operative retrieval of spatial distributions of biophysical parameters of a biological tissue by using a multispectral image of it has been developed. The method is based on multiple regressions between linearly independent components of the diffuse reflection spectrum of the tissue and unknown parameters. Possibilities of the method are illustrated by an example of determining biophysical parameters of the skin (concentrations of melanin, hemoglobin and bilirubin, blood oxygenation, and scattering coefficient of the tissue). Examples of quantitative interpretation of the experimental data are presented.
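
    One generic way to realize the regression described above is sketched below: reduce each pixel's reflectance spectrum to a few linearly independent components and regress the biophysical parameters on them. All arrays are random placeholders and the component/regression choices are assumptions, not the authors' calibration.

    ```python
    # Hedged sketch: per-pixel parameter maps from multispectral reflectance via PCA + regression.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    train_spectra = rng.random((500, 30))   # simulated spectra, 30 wavelengths (placeholder)
    train_params = rng.random((500, 3))     # e.g. melanin, haemoglobin, oxygenation (placeholder)

    pca = PCA(n_components=5).fit(train_spectra)
    reg = LinearRegression().fit(pca.transform(train_spectra), train_params)

    image_spectra = rng.random((480 * 640, 30))            # flattened multispectral image
    maps = reg.predict(pca.transform(image_spectra)).reshape(480, 640, 3)
    print(maps.shape)                                      # per-pixel parameter maps
    ```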

  1. Comparison of reconstruction methods and quantitative accuracy in Siemens Inveon PET scanner

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su; Kang, Joo Hyun; Moo Lim, Sang

    2015-04-01

    …concentrations for radioactivity. Our data collectively showed that the OSEM 2D reconstruction method provides quantitatively accurate reconstructed PET data.

  2. Quantitative determination of sibutramine in adulterated herbal slimming formulations by TLC-image analysis method.

    PubMed

    Phattanawasin, Panadda; Sotanaphun, Uthai; Sukwattanasinit, Tasamaporn; Akkarawaranthorn, Jariya; Kitchaiya, Sarunyaporn

    2012-06-10

    A simple thin layer chromatographic (TLC)-image analysis method was developed for rapid determination and quantitation of sibutramine hydrochloride (SH) adulterated in herbal slimming products. Chromatographic separation of SH was achieved on a silica gel 60 F(254) TLC plate, using toluene-n-hexane-diethylamine (9:1:0.3, v/v/v) as the mobile phase and Dragendorff reagent as spot detection. Image analysis of the scanned TLC plate was performed to quantify the amount of SH. The polynomial regression data for the calibration plots showed good linear relationship in the concentration range of 1-6 μg/spot. The limits of detection and quantitation were 190 and 634 ng/spot, respectively. The method gave satisfactory specificity, precision, accuracy, robustness and was applied for determination of SH in herbal formulations. The contents of SH in adulterated samples determined by the TLC-image analysis and TLC-densitometry were also compared.

  3. Qualitative and quantitative results of interferon-γ release assays for monitoring the response to anti-tuberculosis treatment

    PubMed Central

    Park, I-Nae; Shim, Tae Sun

    2017-01-01

    Background/Aims The usefulness of interferon-γ release assays (IGRAs) in monitoring responses to anti-tuberculosis (TB) treatment is controversial. We compared the results of two IGRAs before and after anti-TB treatment in the same patients with active TB. Methods From a retrospective review, we selected patients with active TB who underwent repeated QuantiFERON-TB Gold (QFN-Gold, Cellestis Limited) and T-SPOT.TB (Oxford Immunotec) assays before and after anti-TB treatment with first-line drugs. Both tests were performed prior to the start of anti-TB treatment or within 1 week after the start of anti-TB treatment, and again after completion of treatment. Results A total of 33 active TB patients were included in the study. On the QFN-Gold test, at baseline, 23 cases (70%) were early secreted antigenic target 6-kDa protein (ESAT-6) or culture filtrate protein 10 (CFP-10) positive. On the T-SPOT.TB test, at baseline, 31 cases (94%) were ESAT-6 or CFP-10 positive. Most patients remained positive on both tests after anti-TB treatment. Although changes in interferon-γ release responses over time were highly variable in both tests, there was a mean decline of 27 and 24 spot-forming counts for ESAT-6 and CFP-10, respectively, on the T-SPOT.TB test (p < 0.05 for all). Conclusions Although limited by the small number of patients and a short-term follow-up, there was a significant decline in the quantitative result of the T-SPOT.TB test with treatment. However, both commercial IGRAs may not provide evidence regarding the cure of disease in Korea, a country where the prevalence of TB is within the intermediate range. PMID:27951621

  4. "PERFEXT": a direct method for quantitative assessment of cytokine production in vivo at the local level.

    PubMed

    Villavedra, M; Carol, H; Hjulström, M; Holmgren, J; Czerkinsky, C

    1997-05-01

    A method termed "PERFEXT", based on sequential perfusion and detergent extraction of lymphoid and non-lymphoid organs, has been developed for the quantitative measurement of cytokines produced at a local level in a given tissue. In vivo treatment of mice with Staphylococcus enterotoxin B (SEB) or lipopolysaccharide (LPS) served as the model systems. Interleukin-2 (IL2) and interferon-gamma (IFN gamma) levels were monitored by ELISA analysis of extracted samples. After local footpad (FP) injection with SEB, spleen and serum IL2 levels peaked at 2-4 h, while IL2 levels peaked at around 4-8 h in both FP and popliteal lymph nodes. SEB injection resulted in increased IFN gamma levels both in the FP and the draining lymph node. The detection of cytokines in the intestine allows for the application of the method at mucosal sites as well, provided enzyme inhibitors are present during the extraction procedure. After FP injection with LPS, IFN gamma production was significantly increased in the draining lymph node and was detectable in the FP, whereas IL2 was undetectable in any organ examined. IL2 and IFN gamma could also be detected at the site of elicitation of a delayed-type hypersensitivity reaction following local FP challenge. Local cytokine production correlated with the swelling response, whereas cytokine production in the spleen did not. IL2 peaked early, followed by a late increase in IFN gamma production, corresponding to the maximum swelling. This simple method should prove useful for analysing the production of other cytokines in vivo in distinct anatomical compartments.

  5. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo)

    PubMed Central

    Li, Yi; Kim, Jong-Joo

    2015-01-01

    The efficiency of genome-wide association analysis (GWAS) depends on power of detection for quantitative trait loci (QTL) and precision for QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle, Hanwoo: a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA) and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). Also the genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model, such that the top five windows, each of which comprised 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods had high concordance, found in 64.1 to 64.9 Mb for Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb for BTA14 for CWT, 0.5 to 1.5 Mb for BTA6 for BFT and 26.3 to 33.4 Mb for BTA29 for BFT. Several candidate genes (i.e. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions. PMID:26104396

  6. Development of a Strain-Specific Molecular Method for Quantitating Individual Campylobacter Strains in Mixed Populations▿

    PubMed Central

    Elvers, Karen T.; Helps, Christopher R.; Wassenaar, Trudy M.; Allen, Vivien M.; Newell, Diane G.

    2008-01-01

    The identification of sites resulting in cross-contamination of poultry flocks in the abattoir and determination of the survival and persistence of campylobacters at these sites are essential for the development of intervention strategies aimed at reducing the microbial burden on poultry at retail. A novel molecule-based method, using strain- and genus-specific oligonucleotide probes, was developed to detect and enumerate specific campylobacter strains in mixed populations. Strain-specific oligonucleotide probes were designed for the short variable regions (SVR) of the flaA gene in individual Campylobacter jejuni strains. A 16S rRNA Campylobacter genus-specific probe was also used. Both types of probes were used to investigate populations of campylobacters by colony lift hybridization. The specificity and proof of principle of the method were tested using strains with closely related SVR sequences and mixtures of these strains. Colony lifts of campylobacters were hybridized sequentially with up to two labeled strain-specific probes, followed by the generic 16S rRNA probe. SVR probes were highly specific, differentiating down to 1 nucleotide in the target sequence, and were sufficiently sensitive to detect colonies of a single strain in a mixed population. The 16S rRNA probe detected all Campylobacter spp. tested but not closely related species, such as Arcobacter skirrowi and Helicobacter pullorum. Preliminary field studies demonstrated the application of this technique to target strains isolated from poultry transport crate wash tank water. This method is quantitative, sensitive, and highly specific and allows the identification and enumeration of selected strains among all of the campylobacters in environmental samples. PMID:18281428

  7. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo).

    PubMed

    Li, Yi; Kim, Jong-Joo

    2015-07-01

    The efficiency of genome-wide association analysis (GWAS) depends on power of detection for quantitative trait loci (QTL) and precision for QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle, Hanwoo: a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA) and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). Also the genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model, such that the top five windows, each of which comprised 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods had high concordance, found in 64.1 to 64.9 Mb for Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb for BTA14 for CWT, 0.5 to 1.5 Mb for BTA6 for BFT and 26.3 to 33.4 Mb for BTA29 for BFT. Several candidate genes (i.e. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions.

  8. Quantitative methods for genome-scale analysis of in situ hybridization and correlation with microarray data

    PubMed Central

    Lee, Chang-Kyu; Sunkin, Susan M; Kuan, Chihchau; Thompson, Carol L; Pathak, Sayan; Ng, Lydia; Lau, Chris; Fischer, Shanna; Mortrud, Marty; Slaughterbeck, Cliff; Jones, Allan; Lein, Ed; Hawrylycz, Michael

    2008-01-01

    With the emergence of genome-wide colorimetric in situ hybridization (ISH) data sets such as the Allen Brain Atlas, it is important to understand the relationship between this gene expression modality and those derived from more quantitative based technologies. This study introduces a novel method for standardized relative quantification of colorimetric ISH signal that enables a large-scale cross-platform expression level comparison of ISH with two publicly available microarray brain data sources. PMID:18234097

  9. Models and methods for quantitative analysis of surface-enhanced Raman spectra.

    PubMed

    Li, Shuo; Nyagilo, James O; Dave, Digant P; Gao, Jean

    2014-03-01

    The quantitative analysis of surface-enhanced Raman spectra using scattering nanoparticles has shown potential and promising applications for in vivo molecular imaging. Diverse approaches have been used for quantitative analysis of Raman spectra, which can be categorized as direct classical least squares models, full spectrum multivariate calibration models, selected multivariate calibration models, and latent variable regression (LVR) models. However, the working principle of these methods in the Raman spectra application remains poorly understood and a clear picture of the overall performance of each model is missing. Based on the characteristics of the Raman spectra, in this paper, we first provide the theoretical foundation of the aforementioned commonly used models and show why the LVR models are more suitable for quantitative analysis of the Raman spectra. Then, we demonstrate the fundamental connections and differences between different LVR methods, such as principal component regression, reduced-rank regression, partial least square regression (PLSR), canonical correlation regression, and robust canonical analysis, by comparing their objective functions and constraints. We further show that PLSR is essentially a blend of multivariate calibration and feature extraction that relates concentrations of nanotags to spectrum intensity. These features (a.k.a. latent variables) satisfy two purposes: the best representation of the predictor matrix and correlation with the response matrix. These illustrations give a new understanding of the traditional PLSR and explain why PLSR outperforms other methods in quantitative analysis of the Raman spectra problem. Finally, all the methods are tested on Raman spectra datasets with different evaluation criteria to evaluate their performance.
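
    As a small illustration of the PLSR approach discussed above (synthetic spectra with one Gaussian "nanotag" band, not the authors' datasets or models), the sketch below regresses concentration on spectra using a few latent variables in scikit-learn.

    ```python
    # Hedged PLSR sketch on synthetic SERS-like spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(42)
    n_samples, n_channels = 60, 500
    conc = rng.uniform(0, 1, size=(n_samples, 1))                       # nanotag concentration
    band = np.exp(-0.5 * ((np.arange(n_channels) - 250) / 10.0) ** 2)   # one Raman band
    spectra = conc @ band[None, :] + 0.05 * rng.standard_normal((n_samples, n_channels))

    pls = PLSRegression(n_components=3)
    pred = cross_val_predict(pls, spectra, conc, cv=5)
    rmse = float(np.sqrt(np.mean((pred - conc) ** 2)))
    print(f"cross-validated RMSE of predicted concentration: {rmse:.3f}")
    ```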

  10. Integrated multiplatform method for in vitro quantitative assessment of cellular uptake for fluorescent polymer nanoparticles

    NASA Astrophysics Data System (ADS)

    Ferrari, Raffaele; Lupi, Monica; Falcetta, Francesca; Bigini, Paolo; Paolella, Katia; Fiordaliso, Fabio; Bisighini, Cinzia; Salmona, Mario; D'Incalci, Maurizio; Morbidelli, Massimo; Moscatelli, Davide; Ubezio, Paolo

    2014-01-01

    Studies of cellular internalization of nanoparticles (NPs) play a paramount role for the design of efficient drug delivery systems, but so far they lack a robust experimental technique able to quantify the NP uptake in terms of number of NPs internalized in each cell. In this work we propose a novel method which provides a quantitative evaluation of fluorescent NP uptake by combining flow cytometry and plate fluorimetry with measurements of number of cells. Single cell fluorescence signals measured by flow cytometry were associated with the number of internalized NPs, exploiting the observed linearity between average flow cytometric fluorescence and overall plate fluorimeter measures, and previous calibration of the microplate reader with serial dilutions of NPs. This precise calibration has been made possible by using biocompatible fluorescent NPs in the range of 20-300 nm with a narrow particle size distribution, functionalized with a covalently bonded dye, Rhodamine B, and synthesized via emulsion free-radical polymerization. We report the absolute number of NPs internalized in mouse mammary tumor cells (4T1) as a function of time for different NP dimensions and surface charges and at several exposure concentrations. The obtained results indicate that 4T1 cells incorporated 10³–10⁴ polymer NPs in a short time, reaching an intracellular concentration 15 times higher than the external one.

  11. A quantitative and standardized robotic method for the evaluation of arm proprioception after stroke.

    PubMed

    Simo, Lucia S; Ghez, Claude; Botzer, Lior; Scheidt, Robert A

    2011-01-01

    Stroke often results in both motor and sensory deficits, which may interact in the manifested functional impairment. Proprioception is known to play important roles in the planning and control of limb posture and movement; however, the impact of proprioceptive deficits on motor function has been difficult to elucidate due in part to the qualitative nature of available clinical tests. We present a quantitative and standardized method for evaluating proprioception in tasks directly relevant to those used to assess motor function. Using a robotic manipulandum that exerted controlled displacements of the hand, stroke participants were evaluated, and compared with a control group, in their ability to detect such displacements in a 2-alternative, forced-choice paradigm. A psychometric function parameterized the decision process underlying the detection of the hand displacements. The shape of this function was determined by a signal detection threshold and by the variability of the response about this threshold. Our automatic procedure differentiates between participants with and without proprioceptive deficits and quantifies functional proprioceptive sensation on a magnitude scale that is meaningful for ongoing studies of degraded motor function in comparable horizontal movements.
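
    The psychometric fit described above can be sketched as follows: in a 2-alternative forced-choice task the proportion of correct responses is modelled as a guess rate of 0.5 plus a cumulative Gaussian of displacement amplitude, whose mean acts as the detection threshold and whose standard deviation captures response variability. The data points and parameter values below are hypothetical.

    ```python
    # Hedged sketch of a 2AFC psychometric-function fit.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    displacement = np.array([1, 2, 4, 6, 8, 12], dtype=float)       # hand displacement (a.u.)
    p_correct = np.array([0.52, 0.60, 0.71, 0.84, 0.93, 0.99])      # proportion correct

    def psychometric(x, mu, sigma):
        # 0.5 guess rate for 2AFC, rising toward 1.0 with a cumulative Gaussian
        return 0.5 + 0.5 * norm.cdf(x, loc=mu, scale=sigma)

    (mu, sigma), _ = curve_fit(psychometric, displacement, p_correct, p0=[4.0, 2.0])
    print(f"detection threshold ~{mu:.1f}, response variability ~{sigma:.1f}")
    ```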

  12. EFFECTIVE REMOVAL METHOD OF ILLEGAL PARKING BICYCLES BASED ON THE QUANTITATIVE CHANGE AFTER REMOVAL

    NASA Astrophysics Data System (ADS)

    Toi, Satoshi; Kajita, Yoshitaka; Nishikawa, Shuichirou

    This study aims to find an effective method for removing illegally parked bicycles, based on an analysis of how their numbers change after removal. We built a spatio-temporal distribution model of illegally parked bicycles after removal, accounting for logistic growth in their numbers and for behaviors such as direct or indirect return to the original parking place and avoidance of that place, based on a survey of actual illegal bicycle parking in the TENJIN area of FUKUOKA city. We then built a simulation model incorporating this distribution model and calculated the number of illegally parked bicycles for different removal frequencies and numbers of bicycles removed per operation. Four main results were obtained. (1) The speed of recovery after removal differs by zone. (2) Thorough removal is effective in keeping the number of illegally parked bicycles at a lower level. (3) Removal in one zone increases the number of bicycles in other zones where the level of illegal parking is lower. (4) The relationship between the effects and the costs of removing illegally parked bicycles was clarified.
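
    For orientation only, the toy simulation below shows the kind of logistic accumulation with periodic removal that the model describes; the capacity, growth rate, and removal schedule are hypothetical, and the paper's return/avoidance behaviors and multi-zone structure are not reproduced.

    ```python
    # Toy single-zone simulation: logistic accumulation of illegally parked bicycles with removals.
    K, r = 500.0, 0.15                # carrying capacity and daily growth rate (hypothetical)
    removal_every, removed = 7, 200   # remove 200 bicycles once a week (hypothetical)

    n, history = 50.0, []
    for day in range(1, 91):
        n += r * n * (1 - n / K)      # logistic increase
        if day % removal_every == 0:
            n = max(n - removed, 0.0) # removal operation
        history.append(n)

    print(f"bicycles after 90 days: {history[-1]:.0f}, peak: {max(history):.0f}")
    ```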

  13. Test characteristics of urinary biomarkers depend on quantitation method in acute kidney injury.

    PubMed

    Ralib, Azrina Md; Pickering, John W; Shaw, Geoffrey M; Devarajan, Prasad; Edelstein, Charles L; Bonventre, Joseph V; Endre, Zoltan H

    2012-02-01

    The concentration of urine influences the concentration of urinary biomarkers of AKI. Whether normalization to urinary creatinine concentration, as commonly performed to quantitate albuminuria, is the best method to account for variations in urinary biomarker concentration among patients in the intensive care unit is unknown. Here, we compared the diagnostic and prognostic performance of three methods of biomarker quantitation: absolute concentration, biomarker normalized to urinary creatinine concentration, and biomarker excretion rate. We measured urinary concentrations of alkaline phosphatase, γ-glutamyl transpeptidase, cystatin C, neutrophil gelatinase-associated lipocalin, kidney injury molecule-1, and IL-18 in 528 patients on admission and after 12 and 24 hours. Absolute concentration best diagnosed AKI on admission, but normalized concentrations best predicted death, dialysis, or subsequent development of AKI. Excretion rate on admission did not diagnose or predict outcomes better than either absolute or normalized concentration. Estimated 24-hour biomarker excretion associated with AKI severity, and for neutrophil gelatinase-associated lipocalin and cystatin C, with poorer survival. In summary, normalization to urinary creatinine concentration improves the prediction of incipient AKI and outcome but provides no advantage in diagnosing established AKI. The ideal method for quantitating biomarkers of urinary AKI depends on the outcome of interest.
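
    The three quantitation approaches compared above reduce to simple arithmetic on the same measurement; the example below uses made-up numbers purely to show the calculations (absolute concentration, creatinine-normalised concentration, and excretion rate from a timed collection).

    ```python
    # Worked example of the three biomarker quantitation methods (hypothetical values).
    biomarker_ng_per_ml = 120.0     # e.g. NGAL concentration in a urine sample
    creatinine_mg_per_ml = 0.9      # urinary creatinine in the same sample
    urine_flow_ml_per_h = 60.0      # timed urine output

    absolute = biomarker_ng_per_ml                               # ng/ml
    normalised = biomarker_ng_per_ml / creatinine_mg_per_ml      # ng per mg creatinine
    excretion_rate = biomarker_ng_per_ml * urine_flow_ml_per_h   # ng per hour

    print(f"absolute: {absolute:.0f} ng/ml, normalised: {normalised:.0f} ng/mg creatinine, "
          f"excretion rate: {excretion_rate:.0f} ng/h")
    ```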

  14. Development of a quantitative diagnostic method of estrogen receptor expression levels by immunohistochemistry using organic fluorescent material-assembled nanoparticles

    SciTech Connect

    Gonda, Kohsuke; Miyashita, Minoru; Watanabe, Mika; Takahashi, Yayoi; Goda, Hideki; Okada, Hisatake; Nakano, Yasushi; Tada, Hiroshi; Amari, Masakazu; Ohuchi, Noriaki

    2012-09-28

    …quantitatively examine the two methods. The results demonstrated that our nanoparticle staining analyzed a wide range of ER expression levels with higher accuracy and quantitative sensitivity than DAB staining. This enhancement in the diagnostic accuracy and sensitivity for ERs using our immunostaining method will improve the prediction of responses to therapies that target ERs and progesterone receptors that are induced by a downstream ER signal.

  15. The estimation of the measurement results with using statistical methods

    NASA Astrophysics Data System (ADS)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A number of international standards and guides describe statistical methods that apply to the management, control, and improvement of processes for the analysis of technical measurement results. This paper analyses international standards and guides on statistical methods for the estimation of measurement results, and their recommendations for application in laboratories. To support this analysis, cause-and-effect (Ishikawa) diagrams concerning the application of statistical methods to the estimation of measurement results are constructed.

  16. The Isotope-Coded Affinity Tag Method for Quantitative Protein Profile Comparison and Relative Quantitation of Cysteine Redox Modifications.

    PubMed

    Chan, James Chun Yip; Zhou, Lei; Chan, Eric Chun Yong

    2015-11-02

    The isotope-coded affinity tag (ICAT) technique has been applied to measure pairwise changes in protein expression through differential stable isotopic labeling of proteins or peptides followed by identification and quantification using a mass spectrometer. Changes in protein expression are observed when the identical peptide from each of two biological conditions is identified and a difference is detected in the measurements comparing the peptide labeled with the heavy isotope to the one with a normal isotopic distribution. This approach allows the simultaneous comparison of the expression of many proteins between two different biological states (e.g., yeast grown on galactose versus glucose, or normal versus cancer cells). Due to the cysteine-specificity of the ICAT reagents, the ICAT technique has also been applied to perform relative quantitation of cysteine redox modifications such as oxidation and nitrosylation. This unit describes both protein quantitation and profiling of cysteine redox modifications using the ICAT technique.

  17. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Pavlov, Konstantin A.

    1997-01-10

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing.

  18. Post-reconstruction non-local means filtering methods using CT side information for quantitative SPECT

    NASA Astrophysics Data System (ADS)

    Chun, Se Young; Fessler, Jeffrey A.; Dewaraja, Yuni K.

    2013-09-01

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose-response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation-maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved -2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower RCs
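
    As a point of reference for the filtering step discussed above, the sketch below applies a plain (non-CT-guided) non-local means filter from scikit-image to a synthetic noisy slice; the CT-side-information variants evaluated in the paper (NLM CT-B/M/S/H) change how patch similarity is weighted and are not reproduced here.

    ```python
    # Hedged baseline: plain non-local means denoising of a synthetic "reconstructed" slice.
    import numpy as np
    from skimage.restoration import denoise_nl_means, estimate_sigma

    rng = np.random.default_rng(3)
    activity = np.zeros((128, 128))
    activity[40:80, 50:90] = 1.0                        # a hot target region
    noisy = activity + 0.2 * rng.standard_normal(activity.shape)

    sigma = estimate_sigma(noisy)
    filtered = denoise_nl_means(noisy, patch_size=5, patch_distance=6,
                                h=0.8 * sigma, sigma=sigma, fast_mode=True)
    print(f"background std before: {noisy[activity == 0].std():.3f}, "
          f"after: {filtered[activity == 0].std():.3f}")
    ```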

  19. Post-reconstruction non-local means filtering methods using CT side information for quantitative SPECT.

    PubMed

    Chun, Se Young; Fessler, Jeffrey A; Dewaraja, Yuni K

    2013-09-07

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose–response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation–maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved −2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower

  20. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    ERIC Educational Resources Information Center

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  1. Response monitoring using quantitative ultrasound methods and supervised dictionary learning in locally advanced breast cancer

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Fung, Brandon; Tadayyon, Hadi; Tran, William T.; Czarnota, Gregory J.

    2016-03-01

    A non-invasive computer-aided-theragnosis (CAT) system was developed for the early assessment of responses to neoadjuvant chemotherapy in patients with locally advanced breast cancer. The CAT system was based on quantitative ultrasound spectroscopy methods comprising several modules including feature extraction, a metric to measure the dissimilarity between "pre-" and "mid-treatment" scans, and a supervised learning algorithm for the classification of patients to responders/non-responders. One major requirement for the successful design of a high-performance CAT system is to accurately measure the changes in parametric maps before treatment onset and during the course of treatment. To this end, a unified framework based on Hilbert-Schmidt independence criterion (HSIC) was used for the design of feature extraction from parametric maps and the dissimilarity measure between the "pre-" and "mid-treatment" scans. For the feature extraction, HSIC was used to design a supervised dictionary learning (SDL) method by maximizing the dependency between the scans taken from "pre-" and "mid-treatment" with "dummy labels" given to the scans. For the dissimilarity measure, an HSIC-based metric was employed to effectively measure the changes in parametric maps as an indication of treatment effectiveness. The HSIC-based feature extraction and dissimilarity measure used a kernel function to nonlinearly transform input vectors into a higher dimensional feature space and computed the population means in the new space, where enhanced group separability was ideally obtained. The results of the classification using the developed CAT system indicated an improvement of performance compared to a CAT system with basic features using histogram of intensity.
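
    For readers unfamiliar with HSIC, the snippet below computes the standard biased empirical HSIC estimator with Gaussian kernels on toy data; it is only the generic dependence measure, not the authors' supervised dictionary learning or dissimilarity pipeline, and the kernel bandwidths are arbitrary.

    ```python
    # Hedged sketch: biased empirical HSIC with Gaussian kernels.
    import numpy as np

    def gaussian_kernel(X, sigma):
        sq = np.sum(X ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def hsic(X, Y, sigma_x=1.0, sigma_y=1.0):
        n = X.shape[0]
        K, L = gaussian_kernel(X, sigma_x), gaussian_kernel(Y, sigma_y)
        H = np.eye(n) - np.ones((n, n)) / n          # centring matrix
        return np.trace(K @ H @ L @ H) / (n - 1) ** 2

    rng = np.random.default_rng(0)
    x = rng.standard_normal((200, 1))
    y_dep = x + 0.1 * rng.standard_normal((200, 1))   # strongly dependent on x
    y_ind = rng.standard_normal((200, 1))             # independent of x
    print(f"HSIC(dependent) = {hsic(x, y_dep):.4f}, HSIC(independent) = {hsic(x, y_ind):.4f}")
    ```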

  2. Method for collecting and immobilizing individual cumulus cells enabling quantitative immunofluorescence analysis of proteins.

    PubMed

    Appeltant, R; Maes, D; Van Soom, A

    2015-07-01

    Most immunofluorescence methods rely on techniques dealing with a very large number of cells. However, when the number of cells in a sample is low (e.g., when cumulus cells must be analyzed from individual cumulus-oocyte complexes), specific techniques are required to conserve, fix, and analyze cells individually. We established and validated a simple and effective method for collecting and immobilizing low numbers of cumulus cells that enables easy and quick quantitative immunofluorescence analysis of proteins from individual cells. To illustrate this technique, we stained proprotein of a disintegrin and metalloproteinase with thrombospondin-like repeats-1 (proADAMTS-1) and analyzed its levels in individual porcine cumulus cells.

  3. A method for quantitative analysis of aquatic humic substances in clear water based on carbon concentration.

    PubMed

    Tsuda, Kumiko; Takata, Akihiro; Shirai, Hidekado; Kozaki, Katsutoshi; Fujitake, Nobuhide

    2012-01-01

    Aquatic humic substances (AHSs) are major constituents of dissolved organic matter (DOM) in freshwater, where they perform a number of important ecological and geochemical functions, yet no method exists for quantifying all AHSs. We have developed a method for the quantitative analysis of AHSs based on their carbon concentration. Our approach includes: (1) the development of techniques for clear-water samples with low AHS concentrations, which normally complicate quantification; (2) avoiding carbon contamination in the laboratory; and (3) optimizing the AHS adsorption conditions.

  4. Detection of human herpesvirus 8 by quantitative polymerase chain reaction: development and standardisation of methods

    PubMed Central

    2012-01-01

    Background Human herpesvirus 8 (HHV-8), the aetiological agent of Kaposi’s sarcoma (KS), multicentric Castleman’s disease (MCD), and primary effusion lymphoma (PEL) is rare in Australia, but endemic in Sub-Saharan Africa, parts of South-east Asia and Oceania. While the treatment of external KS lesions can be monitored by clinical observation, the internal lesions of KS, MCD and PEL require extensive and expensive internal imaging, or autopsy. In patients with MCD and PEL, if HHV-8 viraemia is not reduced quickly, ~50% die within 24 months. HHV-8 qPCR is a valuable tool for monitoring HHV-8 viraemia, but is not available in many parts of the world, including those with high prevalence of KS and HHV-8. Methods A new molecular facility with stringent three-phase workflow was established, adhering to NPAAC and CLSI guidelines. Three fully validated quantitative assays were developed: two for detection and quantification of HHV-8; one for GAPDH, necessary for normalisation of viral loads in tissue and peripheral blood. Results The HHV-8 ORF73 and ORF26 qPCR assays were 100% specific. All qPCR assays displayed a broad dynamic range (10² to 10¹⁰ copies/μL TE Buffer) with a limit of detection of 4.85×10³, 5.61×10², and 2.59×10² copies/μL TE Buffer and a limit of quantification of 4.85×10³, 3.01×10², and 1.38×10² copies/μL TE Buffer for HHV-8 ORF73, HHV-8 ORF26, and GAPDH respectively. The assays were tested on a panel of 35 KS biopsies from Queensland. All were HHV-8 qPCR positive with an average viral load of 2.96×10⁵ HHV-8 copies/μL DNA extract (range: 4.37×10³ to 1.47×10⁶ copies/μL DNA extract); when normalised these equate to an average viral load of 2.44×10⁴ HHV-8 copies/10³ cells (range: 2.20×10² to 7.38×10⁵ HHV-8 copies/10³ cells). Conclusions These are the first fully optimised, validated and MIQE compliant HHV-8 qPCR assays established in Australia. They worked well for qualitative detection of HHV-8 in archival tissue, and are well-suited for
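
    Absolute quantification with assays like these rests on a log-linear standard curve (Cq versus log10 copy number); the arithmetic below is a generic illustration with hypothetical Cq, slope, and intercept values, not figures from this study.

    ```python
    # Generic qPCR standard-curve arithmetic (hypothetical numbers).
    slope, intercept = -3.32, 38.0     # fitted standard curve: Cq = slope*log10(copies) + intercept
    sample_cq = 24.5

    copies_per_ul = 10 ** ((sample_cq - intercept) / slope)
    efficiency = 10 ** (-1.0 / slope) - 1.0            # ~1.0 means ~100% amplification efficiency
    print(f"~{copies_per_ul:.2e} copies/uL, efficiency ~{efficiency:.0%}")
    ```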

  5. A fluorescence-quenching method for quantitative analysis of Ponceau 4R in beverage.

    PubMed

    Zhang, Jianpo; Na, Lihua; Jiang, Yunxia; Han, Dandan; Lou, Dawei; Jin, Li

    2017-04-15

    CdTe quantum dots were synthesized and used for the quantitative analysis of Ponceau 4R in solution. At an excitation wavelength of 380 nm, the emission of the CdTe quantum dots was markedly quenched by Ponceau 4R. To detect Ponceau 4R under mild conditions, the influences of the fluorescence emission wavelength of the CdTe quantum dots, pH, temperature, and reaction time were examined to establish the experimental conditions. The linear response of the fluorescence intensity of the CdTe quantum dots to Ponceau 4R allowed the quantitative analysis of Ponceau 4R in the range of 2.5-25 μg/mL, and the limit of detection for Ponceau 4R was 0.025 μg/mL. In addition, the response mechanism of this reaction system was investigated in detail using the modified Stern-Volmer equation and thermodynamic calculations. Finally, the method was used to quantitatively analyze a real sample, indicating that it could be applied more widely to similar samples.
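
    The quenching analysis referred to above is conventionally framed with the Stern-Volmer relation; the textbook forms are reproduced below for orientation, and the exact "modified" variant used by the authors may differ. Here F₀ and F are the intensities without and with quencher Q, K_SV is the Stern-Volmer constant, k_q the bimolecular quenching rate constant, τ₀ the unquenched lifetime, and, in the Lehrer form, f_a and K_a are the accessible fluorophore fraction and its quenching constant.

    ```latex
    % Classical Stern-Volmer relation
    \frac{F_0}{F} = 1 + K_{\mathrm{SV}}[Q] = 1 + k_q \tau_0 [Q]

    % Lehrer's modified Stern-Volmer relation (one common "modified" form)
    \frac{F_0}{F_0 - F} = \frac{1}{f_a K_a [Q]} + \frac{1}{f_a}
    ```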

  6. Verifying quantitative stigma and medication adherence scales using qualitative methods among Thai youth living with HIV/AIDS.

    PubMed

    Fongkaew, Warunee; Viseskul, Nongkran; Suksatit, Benjamas; Settheekul, Saowaluck; Chontawan, Ratanawadee; Grimes, Richard M; Grimes, Deanna E

    2014-01-01

    HIV/AIDS-related stigma has been linked to poor adherence resulting in drug resistance and the failure to control HIV. This study used both quantitative and qualitative methods to examine stigma and its relationship to adherence in 30 HIV-infected Thai youth aged 14 to 21 years. Stigma was measured using the HIV stigma scale and its 4 subscales, and adherence was measured using a visual analog scale. Stigma and adherence were also examined by in-depth interviews. The interviews were intended to determine whether verbal responses would match the scale's results. The mean score of stigma perception from the overall scale and its 4 subscales ranged from 2.14 to 2.45 on a scale of 1 to 4, indicating moderate levels of stigma. The mean adherence score was 0.74. The stigma scale and its subscales did not correlate with adherence. In total, 17 of the respondents were interviewed. Contrary to the quantitative results, the interviewees reported that the stigma led to poor adherence because the fear of disclosure often caused them to miss medication doses. The differences between the quantitative and the qualitative results highlight the importance of validating psychometric scales when they are translated and used in other cultures.

  7. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    NASA Astrophysics Data System (ADS)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and overwhelming related healthcare costs. In recent works, we have developed a multi-spectral, frequency domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over pure ultrasonic or optical methods as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of the ultrasound propagation. These characteristics include both the Speed of Sound (SOS) and Broadband Ultrasonic Attenuation (BUA). To test the method's quantitative predictions, we have constructed a combined ultrasound and photoacoustic setup. Here, we present a dual-modality system and compare the methods experimentally on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into the bone structure and functional status.

  8. Quantitative Evaluation of the Total Magnetic Moments of Colloidal Magnetic Nanoparticles: A Kinetics-based Method.

    PubMed

    Liu, Haiyi; Sun, Jianfei; Wang, Haoyao; Wang, Peng; Song, Lina; Li, Yang; Chen, Bo; Zhang, Yu; Gu, Ning

    2015-06-08

    A kinetics-based method is proposed to quantitatively characterize the collective magnetization of colloidal magnetic nanoparticles. The method is based on the relationship between the magnetic force on a colloidal droplet and the movement of the droplet under a gradient magnetic field. Through computational analysis of the kinetic parameters, such as displacement, velocity, and acceleration, the magnetization of colloidal magnetic nanoparticles can be calculated. In our experiments, the values measured using our method exhibited a better linear correlation with magnetothermal heating than those obtained using a vibrating sample magnetometer and a magnetic balance. This finding indicates that this method may be more suitable for evaluating the collective magnetism of colloidal magnetic nanoparticles under low magnetic fields than the commonly used methods. Accurate evaluation of the magnetic properties of colloidal nanoparticles is of great importance for the standardization of magnetic nanomaterials and for their practical application in biomedicine.
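
    The relation between droplet kinetics and magnetic moment described above is commonly written as a low-Reynolds-number force balance of the kind sketched below; the Stokes drag term and the symbols (M droplet mass, m total magnetic moment, η viscosity, R droplet radius, v velocity) are generic assumptions rather than the authors' exact model. Tracking displacement, velocity, and acceleration then allows m, and hence the collective magnetization, to be solved for from the known field gradient.

    ```latex
    % Generic force balance for a magnetic droplet moving in a gradient field
    M \frac{dv}{dt} \;=\; \underbrace{(\mathbf{m}\cdot\nabla)\mathbf{B}}_{\text{magnetic driving force}}
                     \;-\; \underbrace{6\pi\eta R\, v}_{\text{Stokes drag}}
    ```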

  9. [Study of infrared spectroscopy quantitative analysis method for methane gas based on data mining].

    PubMed

    Zhang, Ai-Ju

    2013-10-01

    Monitoring of methane gas is an important factor in coal mine safety, and online real-time monitoring of methane is used for mine safety protection. To improve the accuracy of the analysis model, the present paper uses infrared spectroscopy to study a quantitative infrared gas analysis algorithm. By applying data mining techniques to the multi-component infrared quantitative analysis algorithm, it was found that a cluster-analysis partial least squares algorithm is clearly superior in accuracy to partial least squares alone. In addition, to reduce the influence of errors in individual calibration samples on model accuracy, cluster analysis was used for data preprocessing; this denoising step was found to improve the analysis accuracy.

  10. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background:  Nonrandomized studies typically cannot account for confounding from unmeasured factors.  Method:  A method is presented that exploits the recently-identified phenomenon of  “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors.  Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure.  Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results:  Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met.  Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations:  Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions:  To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is

  11. Are three generations of quantitative molecular methods sufficient in medical virology? Brief review.

    PubMed

    Clementi, Massimo; Bagnarelli, Patrizia

    2015-10-01

    In the last two decades, the development of quantitative molecular methods has characterized the evolution of clinical virology more than any other methodological advancement. Using these methods, a great number of studies have efficiently addressed in vivo the role of viral load, viral replication activity, and viral transcriptional profiles as correlates of disease outcome and progression, and have highlighted the physiopathology of important virus diseases of humans. Furthermore, these studies have contributed to a better understanding of virus-host interactions and have sharply revolutionized research strategies in basic and medical virology. In addition, and importantly from a medical point of view, quantitative methods have provided a rationale for therapeutic intervention and therapy monitoring in medically important viral diseases. Despite the advances in technology and the development of three generations of molecular methods within the last two decades (competitive PCR, real-time PCR, and digital PCR), great challenges still remain for viral testing, related not only to standardization, accuracy, and precision, but also to the selection of the best molecular targets for clinical use and to the identification of thresholds for risk stratification and therapeutic decisions. Future research directions, novel methods, and technical improvements could be important to address these challenges.

  12. Evaluation of a High Intensity Focused Ultrasound-Immobilized Trypsin Digestion and 18O-Labeling Method for Quantitative Proteomics

    SciTech Connect

    Lopez-Ferrer, Daniel; Hixson, Kim K.; Smallwood, Heather S.; Squier, Thomas C.; Petritis, Konstantinos; Smith, Richard D.

    2009-08-01

    A new method that uses immobilized trypsin concomitant with ultrasonic irradiation results in ultra-rapid digestion and thorough 18O labeling for quantitative protein comparisons. The reproducible and highly efficient method provided effective digestions in <1 min and minimized the amount of enzyme required compared to traditional methods. This method was demonstrated for digestion of both simple and complex protein mixtures, including bovine serum albumin, a global proteome extract from bacteria Shewanella oneidensis, and mouse plasma, as well as for the labeling of complex protein mixtures, which validated the application of this method for differential proteomic measurements. This approach is simple, reproducible, cost effective, and rapid, and thus well-suited for automation.

  13. Quantitative imaging of volcanic plumes — Results, needs, and future trends

    USGS Publications Warehouse

    Platt, Ulrich; Lübcke, Peter; Kuhn, Jonas; Bobrowski, Nicole; Prata, Fred; Burton, Mike; Kern, Christoph

    2015-01-01

    Recent technology allows two-dimensional “imaging” of trace gas distributions in plumes. In contrast to older, one-dimensional remote sensing techniques, which are only capable of measuring total column densities, the new imaging methods give insight into details of transport and mixing processes as well as chemical transformation within plumes. We give an overview of gas imaging techniques already being applied at volcanoes (SO2 cameras, imaging DOAS, FT-IR imaging), present techniques where first field experiments were conducted (LED-LIDAR, tomographic mapping), and describe some techniques where only theoretical studies with application to volcanology exist (e.g. Fabry–Pérot imaging, gas correlation spectroscopy, bi-static LIDAR). Finally, we discuss current needs and future trends in imaging technology.

  14. Numerical results for extended field method applications. [thin plates

    NASA Technical Reports Server (NTRS)

    Donaldson, B. K.; Chander, S.

    1973-01-01

    This paper presents the numerical results obtained when a new method of analysis, called the extended field method, was applied to several thin plate problems including one with non-rectangular geometry, and one problem involving both beams and a plate. The numerical results show that the quality of the single plate solutions was satisfactory for all cases except those involving a freely deflecting plate corner. The results for the beam and plate structure were satisfactory even though the structure had a freely deflecting corner.

  15. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a method for solution identification and quantitative concentration analysis based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions, and test solutions can then be discriminated by these functions. After determining the variety of a test solution, the Spearman correlation test and principal component analysis are used to filter and reduce the dimensions of the eight characteristic values, producing a new representative parameter. A cubic spline interpolation function is built between this parameter and concentration, based on which the concentration of the test solution can be calculated. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and the other two concentrations compose the test group. Using the methods described above, all eight test solutions are correctly identified and the average relative error of the quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of liquid recognition based on the COT and also improves the precision of concentration quantification.
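
    A compact sketch of the classify-then-quantify pipeline described above, assuming scikit-learn and SciPy; the feature values, class labels, and single-parameter reduction are placeholders rather than the instrument's actual COT features.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.decomposition import PCA
        from scipy.interpolate import CubicSpline

        rng = np.random.default_rng(1)
        X_train = rng.normal(size=(40, 8))                 # 8 COT characteristic values per droplet
        labels = np.repeat(["ethanol", "saline"], 20)      # known solution types
        conc = np.tile(np.linspace(1, 10, 20), 2)          # known concentrations (%)

        # Stage 1: discriminate the solution type from the 8 characteristic values.
        lda = LinearDiscriminantAnalysis().fit(X_train, labels)

        # Stage 2 (per identified type): reduce the 8 values to one representative parameter
        # and build a spline from that parameter to concentration.
        mask = labels == "ethanol"
        pca = PCA(n_components=1).fit(X_train[mask])
        p = pca.transform(X_train[mask]).ravel()
        order = np.argsort(p)
        spline = CubicSpline(p[order], conc[mask][order])

        x_new = X_train[0]
        kind = lda.predict(x_new.reshape(1, -1))[0]
        c_est = float(spline(pca.transform(x_new.reshape(1, -1)).ravel()[0]))
        print(kind, c_est)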

  16. Improved Methods for Capture, Extraction, and Quantitative Assay of Environmental DNA from Asian Bigheaded Carp (Hypophthalmichthys spp.)

    PubMed Central

    Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207

  17. Improved methods for capture, extraction, and quantitative assay of environmental DNA from Asian bigheaded carp (Hypophthalmichthys spp.).

    PubMed

    Turner, Cameron R; Miller, Derryl J; Coyne, Kathryn J; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species.

  18. Development and application of quantitative detection method for viral hemorrhagic septicemia virus (VHSV) genogroup IVa.

    PubMed

    Kim, Jong-Oh; Kim, Wi-Sik; Kim, Si-Woo; Han, Hyun-Ja; Kim, Jin Woo; Park, Myoung Ae; Oh, Myung-Joo

    2014-05-23

    Viral hemorrhagic septicemia virus (VHSV) is a problematic pathogen in olive flounder (Paralichthys olivaceus) aquaculture farms in Korea. Thus, it is necessary to develop a rapid and accurate diagnostic method to detect this virus. We developed a quantitative RT-PCR (qRT-PCR) method based on the nucleocapsid (N) gene sequence of Korean VHSV isolate (Genogroup IVa). The slope and R² values of the primer set developed in this study were -0.2928 (96% efficiency) and 0.9979, respectively. Its comparison with viral infectivity calculated by traditional quantifying method (TCID₅₀) showed a similar pattern of kinetic changes in vitro and in vivo. The qRT-PCR method reduced detection time compared to that of TCID₅₀, making it a very useful tool for VHSV diagnosis.

  19. Development and Evaluation of Event-Specific Quantitative PCR Method for Genetically Modified Soybean MON87701.

    PubMed

    Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument; the determined Cf was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases and RSDr values were less than 30% and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method would thus be applicable for practical analyses for the detection and quantification of MON87701.
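
    As a hedged illustration of how a conversion factor is typically used in event-specific qPCR methods of this kind (the exact formula is not given in the abstract), GMO content can be computed from the ratio of event-specific to taxon-specific copy numbers divided by Cf; the names and sample numbers below are illustrative.

        def gmo_percent(event_copies, taxon_copies, cf):
            """Estimate GMO content (%) from qPCR copy-number estimates.

            event_copies: copies measured with the event-specific assay (e.g., MON87701)
            taxon_copies: copies measured with the soybean taxon-specific assay
            cf: experimentally determined conversion factor (1.24 in the abstract)
            """
            return (event_copies / taxon_copies) / cf * 100.0

        # Hypothetical sample: 620 event copies against 10,000 taxon copies.
        print(round(gmo_percent(620, 10_000, 1.24), 2))   # -> 5.0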

  20. A simple regression-based method to map quantitative trait loci underlying function-valued phenotypes.

    PubMed

    Kwak, Il-Youp; Moore, Candace R; Spalding, Edgar P; Broman, Karl W

    2014-08-01

    Most statistical methods for quantitative trait loci (QTL) mapping focus on a single phenotype. However, multiple phenotypes are commonly measured, and recent technological advances have greatly simplified the automated acquisition of numerous phenotypes, including function-valued phenotypes, such as growth measured over time. While methods exist for QTL mapping with function-valued phenotypes, they are generally computationally intensive and focus on single-QTL models. We propose two simple, fast methods that maintain high power and precision and are amenable to extensions with multiple-QTL models using a penalized likelihood approach. After identifying multiple QTL by these approaches, we can view the function-valued QTL effects to provide a deeper understanding of the underlying processes. Our methods have been implemented as a package for R, funqtl.
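
    To make the regression-based idea concrete, the sketch below computes a per-time-point LOD score at a single marker by simple linear regression and then combines the scores by averaging and by taking the maximum (the SLOD/MLOD-style summaries associated with this approach); the genotype coding and phenotype data are synthetic, and this is not the funqtl implementation.

        import numpy as np

        rng = np.random.default_rng(2)
        n, T = 100, 30
        genotype = rng.integers(0, 2, size=n)                      # backcross-style coding at one marker
        pheno = 0.4 * genotype[:, None] + rng.normal(size=(n, T))  # growth measured at T time points

        def lod_at_timepoint(y, g):
            """LOD score for one time point from simple linear regression on genotype."""
            rss0 = np.sum((y - y.mean()) ** 2)                     # null model: mean only
            beta = np.polyfit(g, y, 1)
            rss1 = np.sum((y - np.polyval(beta, g)) ** 2)          # alternative: genotype effect
            return (len(y) / 2.0) * np.log10(rss0 / rss1)

        lods = np.array([lod_at_timepoint(pheno[:, t], genotype) for t in range(T)])
        slod, mlod = lods.mean(), lods.max()                       # averaged and maximal LOD summaries
        print(round(slod, 2), round(mlod, 2))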

  1. [Application and Integration of Qualitative and Quantitative Research Methods in Intervention Studies in Rehabilitation Research].

    PubMed

    Wirtz, M A; Strohmer, J

    2016-06-01

    In order to develop and evaluate interventions in rehabilitation research, a wide range of empirical research methods may be adopted. Qualitative research methods emphasize the relevance of an open research focus and a natural proximity to the research objects; accordingly, qualitative methods offer particular benefits when researchers strive to identify and organize unknown information aspects (inductive purpose). Quantitative research methods, in contrast, require a high degree of standardization and transparency of the research process, and a clear definition of efficacy and effectiveness exists (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes, and quality criteria. Hence, specific aspects have to be considered if researchers aim to select or combine these approaches in order to ensure an optimal gain in knowledge.

  2. Aspects of bioanalytical method validation for the quantitative determination of trace elements.

    PubMed

    Levine, Keith E; Tudan, Christopher; Grohse, Peter M; Weber, Frank X; Levine, Michael A; Kim, Yu-Seon J

    2011-08-01

    Bioanalytical methods are used to quantitatively determine the concentration of drugs, biotransformation products or other specified substances in biological matrices and are often used to provide critical data to pharmacokinetic or bioequivalence studies in support of regulatory submissions. In order to ensure that bioanalytical methods are capable of generating reliable, reproducible data that meet or exceed current regulatory guidance, they are subjected to a rigorous method validation process. At present, regulatory guidance does not necessarily account for nuances specific to trace element determinations. This paper is intended to provide the reader with guidance related to trace element bioanalytical method validation from the authors' perspective for two prevalent and powerful instrumental techniques: inductively coupled plasma-optical emission spectrometry and inductively coupled plasma-MS.

  3. Development and Application of Quantitative Detection Method for Viral Hemorrhagic Septicemia Virus (VHSV) Genogroup IVa

    PubMed Central

    Kim, Jong-Oh; Kim, Wi-Sik; Kim, Si-Woo; Han, Hyun-Ja; Kim, Jin Woo; Park, Myoung Ae; Oh, Myung-Joo

    2014-01-01

    Viral hemorrhagic septicemia virus (VHSV) is a problematic pathogen in olive flounder (Paralichthys olivaceus) aquaculture farms in Korea. Thus, it is necessary to develop a rapid and accurate diagnostic method to detect this virus. We developed a quantitative RT-PCR (qRT-PCR) method based on the nucleocapsid (N) gene sequence of Korean VHSV isolate (Genogroup IVa). The slope and R2 values of the primer set developed in this study were −0.2928 (96% efficiency) and 0.9979, respectively. Its comparison with viral infectivity calculated by traditional quantifying method (TCID50) showed a similar pattern of kinetic changes in vitro and in vivo. The qRT-PCR method reduced detection time compared to that of TCID50, making it a very useful tool for VHSV diagnosis. PMID:24859343

  4. Development of an analytical method for quantitative comparison of the e-waste management systems in Thailand, Laos, and China.

    PubMed

    Liang, Li; Sharp, Alice

    2016-11-01

    This study employed a set of quantitative criteria to analyse three parameters, namely policy, process, and practice, of the respective e-waste management systems adopted in Thailand, Laos, and China. Questionnaire surveys were conducted to determine the current status of the three parameters in relation to mobile phones. A total of five, three, and six variables under Policy (P1), Process (P2), and Practice (P3), respectively, were analysed and their weighted averages were calculated. The results showed that, among the three countries surveyed, significant differences at p<0.01 were observed in all P1, P2, and P3 variables except P305 (sending e-waste to recovery centres) and P306 (treating e-waste by retailers themselves). Based on the quantitative method developed in this study, Laos' e-waste management system received the highest scores in both the P1 average (0.130) and the P3 average (0.129). However, in the combined Ptotal, China scored the highest (0.141), followed by Laos (0.132) and Thailand (0.121). This method could be used to assist decision makers in performing quantitative analysis of the complex issues associated with e-waste management in a country.

  5. A method of quantitative characterization for the component of C/C composites based on the PLM video

    NASA Astrophysics Data System (ADS)

    Li, Y. X.; Qi, L. H.; Song, Y. S.; Li, H. J.

    2016-07-01

    PLM video is used for studying the microstructure of C/C composites because it contains both structural and motion information. This means that PLM video can provide more comprehensive microstructure features of C/C composites, whose components can then be quantitatively characterized by image processing. However, several unavoidable displacements, introduced during image acquisition, still exist in the PLM video. Therefore, an image registration method based on phase correlation was put forward to correct these displacements, and the quantitative characterization of the components was then achieved by combining image fusion and threshold segmentation on the PLM video of C/C composites. Specifically, the PLM video was first decomposed into a frame sequence. A series of processing steps was then carried out, including selecting frames at equal intervals, segmenting the static and dynamic regions, and correcting the relative displacements between adjacent frames. The result of the image registration was verified through image fusion, which indicates that the proposed method eliminates the displacements effectively. Finally, image processing operations were used to segment the components and calculate their fractions, thus achieving the quantitative characterization.
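
    A minimal NumPy sketch of shift estimation by phase correlation, the registration principle mentioned above; it assumes pure translation between frames and uses synthetic images rather than PLM video frames.

        import numpy as np

        def phase_correlation_shift(ref, mov):
            """Return (dy, dx) such that mov is approximately np.roll(ref, (dy, dx))."""
            F_ref, F_mov = np.fft.fft2(ref), np.fft.fft2(mov)
            cross_power = np.conj(F_ref) * F_mov
            cross_power /= np.abs(cross_power) + 1e-12        # keep the phase only
            corr = np.abs(np.fft.ifft2(cross_power))
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            # Wrap large positive indices around to negative shifts.
            if dy > ref.shape[0] // 2:
                dy -= ref.shape[0]
            if dx > ref.shape[1] // 2:
                dx -= ref.shape[1]
            return int(dy), int(dx)

        rng = np.random.default_rng(3)
        frame0 = rng.random((128, 128))
        frame1 = np.roll(frame0, shift=(5, -3), axis=(0, 1))   # simulated inter-frame displacement
        print(phase_correlation_shift(frame0, frame1))          # -> (5, -3)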

  6. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors describe the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analyses of clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences was discussed based on data from RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can serve as a secondary criterion. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  7. A quantitative method for the evaluation of three-dimensional structure of temporal bone pneumatization

    PubMed Central

    Hill, Cheryl A.; Richtsmeier, Joan T.

    2010-01-01

    Temporal bone pneumatization has been included in lists of characters used in phylogenetic analyses of human evolution. While studies suggest that the extent of pneumatization has decreased over the course of human evolution, little is known about the processes underlying these changes or their significance. In short, reasons for the observed reduction and the potential reorganization within pneumatized spaces are unknown. Technological limitations have limited previous analyses of pneumatization in extant and fossil species to qualitative observations of the extent of temporal bone pneumatization. In this paper, we introduce a novel application of quantitative methods developed for the study of trabecular bone to the analysis of pneumatized spaces of the temporal bone. This method utilizes high-resolution X-ray computed tomography (HRXCT) images and quantitative software to estimate three-dimensional parameters (bone volume fractions, anisotropy, and trabecular thickness) of bone structure within defined units of pneumatized spaces. We apply this approach in an analysis of temporal bones of diverse but related primate species, Gorilla gorilla, Pan troglodytes, Homo sapiens, and Papio hamadryas anubis, to illustrate the potential of these methods. In demonstrating the utility of these methods, we show that there are interspecific differences in the bone structure of pneumatized spaces, perhaps reflecting changes in the localized growth dynamics, location of muscle attachments, encephalization, or basicranial flexion. PMID:18715622
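
    As an illustration of one of the trabecular-style parameters mentioned above (bone volume fraction), the sketch below computes BV/TV from a thresholded HRXCT sub-volume. The array, threshold, and region definition are placeholders, and anisotropy and trabecular thickness would require additional steps not shown.

        import numpy as np

        rng = np.random.default_rng(4)
        hrxct = rng.normal(loc=100.0, scale=40.0, size=(64, 64, 64))   # synthetic HRXCT sub-volume

        bone_threshold = 140.0                        # assumed grey-value threshold for bone
        region = hrxct[8:56, 8:56, 8:56]              # defined unit of the pneumatized space (assumed)

        bone_mask = region > bone_threshold
        bv_tv = bone_mask.sum() / bone_mask.size      # bone volume fraction, BV/TV
        print(f"BV/TV = {bv_tv:.3f}")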

  8. Perception of mobbing during the study: results of a national quantitative research among Slovenian midwifery students.

    PubMed

    Došler, Anita Jug; Skubic, Metka; Mivšek, Ana Polona

    2014-09-01

    Mobbing, defined as sustained harassment among workers, in particular towards subordinates, merits investigation. This study aims to investigate the perception of mobbing among Slovenian midwifery students (2nd and 3rd year students of midwifery at the Faculty for Health Studies Ljubljana, the single educational institution for midwives in Slovenia), since the management of acceptable behavioural interrelationships in the midwifery profession is already formed during the study, through professional socialization. A descriptive, causal non-experimental method with a questionnaire was used. Basic descriptive statistics and tests of statistical significance were carried out with SPSS software, version 20.0. All necessary ethical measures were taken into consideration during the study to protect the participants. The results revealed that many participants experienced mobbing during their studies (82.3%); 58.8% of them during practical training and 23.5% from midwifery teachers. Students are often anxious and nervous in clinical settings (60.8%) or before faculty commitments such as exams and presentations (41.2%). Many of them (40.4%) estimate that mobbing affected their health, and they did not show effective strategies for solving relationship problems. According to the findings, everyone involved in midwifery education, but above all students, should be provided with more knowledge and skills for the successful management of conflict situations.

  9. The quantitative and qualitative recovery of Campylobacter from raw poultry using USDA and Health Canada methods.

    PubMed

    Sproston, E L; Carrillo, C D; Boulter-Bitzer, J

    2014-12-01

    Harmonisation of methods between Canadian government agencies is essential to accurately assess and compare the prevalence and concentrations of Campylobacter present on retail poultry intended for human consumption. The standard qualitative procedure used by Health Canada differs from that used by the USDA for both quantitative and qualitative methods. A comparison of three methods was performed on raw poultry samples obtained from an abattoir to determine whether one method is superior to the others in isolating Campylobacter from chicken carcass rinses. The average percentage of positive samples was 34.72% (95% CI, 29.2-40.2), 39.24% (95% CI, 33.6-44.9), and 39.93% (95% CI, 34.3-45.6) for the US direct plating method, the US enrichment method, and the Health Canada enrichment method, respectively. Overall, there were significant differences when comparing either of the enrichment methods to the direct plating method using McNemar's chi-squared test. In weekly comparisons (Fisher's exact test), direct plating was inferior to the enrichment methods on only a single occasion. Direct plating is important for enumeration and for establishing the concentration of Campylobacter present on raw poultry. However, enrichment methods are also vital for identifying positive samples where concentrations are below the detection limit of direct plating.
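
    For the paired method comparison described above, a McNemar test on matched positive/negative calls can be sketched as follows using statsmodels; the 2x2 counts are hypothetical and do not reproduce the study's data.

        import numpy as np
        from statsmodels.stats.contingency_tables import mcnemar

        # Paired calls on the same carcass rinses:
        # rows = direct plating (pos, neg), columns = enrichment (pos, neg). Counts are hypothetical.
        table = np.array([[95, 5],
                          [18, 170]])

        result = mcnemar(table, exact=True)           # exact binomial test on the discordant pairs
        print(f"statistic={result.statistic}, p-value={result.pvalue:.4f}")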

  10. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Atencio, James D.

    1984-01-01

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.

  11. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Atencio, J.D.

    1982-03-31

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether permanent low-level burial is appropriate for the waste sample.

  12. Summary of the workshop on issues in risk assessment: quantitative methods for developmental toxicology.

    PubMed

    Mattison, D R; Sandler, J D

    1994-08-01

    This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Products Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.

  13. Quantitative Assessment of the CCMC's Experimental Real-time SWMF-Geospace Results

    NASA Astrophysics Data System (ADS)

    Liemohn, Michael; Ganushkina, Natalia; De Zeeuw, Darren; Welling, Daniel; Toth, Gabor; Ilie, Raluca; Gombosi, Tamas; van der Holst, Bart; Kuznetsova, Maria; Maddox, Marlo; Rastaetter, Lutz

    2016-04-01

    Experimental real-time simulations of the Space Weather Modeling Framework (SWMF) are conducted at the Community Coordinated Modeling Center (CCMC), with results available there (http://ccmc.gsfc.nasa.gov/realtime.php), through the CCMC Integrated Space Weather Analysis (iSWA) site (http://iswa.ccmc.gsfc.nasa.gov/IswaSystemWebApp/), and the Michigan SWMF site (http://csem.engin.umich.edu/realtime). Presently, two configurations of the SWMF are running in real time at CCMC, both focusing on the geospace modules and using the BATS-R-US magnetohydrodynamic model and the Ridley Ionosphere Model, one with and one without the Rice Convection Model for inner magnetospheric drift physics. While both have been running for several years, nearly continuous results are available since July 2015. Dst from the model output is compared against the Kyoto real-time Dst, in particular the daily minimum value of Dst, to quantify the ability of the model to capture storms. Contingency tables are presented, showing that the run with the inner magnetosphere model is much better at reproducing storm-time values. For disturbances with a minimum Dst lower than -50 nT, this version yields a probability of event detection of 0.86 and a Heidke Skill Score of 0.60. In the other version of the SWMF, without the inner magnetospheric module included, the modeled Dst never dropped below -50 nT during the examined epoch.
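
    The skill scores quoted above can be computed from a standard 2x2 contingency table of storm events (hits, false alarms, misses, correct negatives); the sketch below shows the conventional formulas with hypothetical counts, not the study's actual table.

        def pod_and_hss(hits, false_alarms, misses, correct_negatives):
            """Probability of detection and Heidke Skill Score from a 2x2 contingency table."""
            a, b, c, d = hits, false_alarms, misses, correct_negatives
            pod = a / (a + c)
            hss = 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
            return pod, hss

        # Hypothetical event counts for storms with minimum Dst < -50 nT.
        pod, hss = pod_and_hss(hits=18, false_alarms=10, misses=4, correct_negatives=140)
        print(f"POD={pod:.2f}, HSS={hss:.2f}")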

  14. Comparative Evaluation of Four Real-Time PCR Methods for the Quantitative Detection of Epstein-Barr Virus from Whole Blood Specimens.

    PubMed

    Buelow, Daelynn; Sun, Yilun; Tang, Li; Gu, Zhengming; Pounds, Stanley; Hayden, Randall

    2016-07-01

    Monitoring of Epstein-Barr virus (EBV) load in immunocompromised patients has become integral to their care. An increasing number of reagents are available for quantitative detection of EBV; however, there are few published comparative data. Four real-time PCR systems (one using laboratory-developed reagents and three using analyte-specific reagents) were compared with one another for detection of EBV from whole blood. Whole blood specimens seeded with EBV were used to determine quantitative linearity, analytical measurement range, lower limit of detection, and CV for each assay. Retrospective testing of 198 clinical samples was performed in parallel with all methods; results were compared to determine relative quantitative and qualitative performance. All assays showed similar performance. No significant difference was found in limit of detection (3.12-3.49 log10 copies/mL; P = 0.37). A strong qualitative correlation was seen among all assays on clinical samples (positive detection rates of 89.5%-95.8%). Quantitative correlation of clinical samples across assays was also seen in pairwise regression analysis, with R(2) ranging from 0.83 to 0.95. Normalizing clinical sample results to IU/mL did not alter the quantitative correlation between assays. Quantitative EBV detection by real-time PCR can be performed over a wide linear dynamic range, using three different commercially available reagents and laboratory-developed methods. EBV was detected with comparable sensitivity and quantitative correlation for all assays.

  15. Quantitation of Compounds in Wine Using (1)H NMR Spectroscopy: Description of the Method and Collaborative Study.

    PubMed

    Godelmann, Rolf; Kost, Christian; Patz, Claus-Dieter; Ristow, Reinhard; Wachter, Helmut

    2016-09-01

    To examine whether NMR analysis is a suitable method for the quantitative determination of wine components, an international collaborative trial was organized to evaluate the method according to the international regulations and guidelines of the German Institute for Standardization/International Organization for Standardization, AOAC INTERNATIONAL, the International Union of Pure and Applied Chemistry, and the International Organization of Vine and Wine. Sugars such as glucose; acids such as malic, acetic, fumaric, and shikimic acids (the latter two as minor components); and sorbic acid, a preservative, were selected for the exemplary quantitative determination of substances in wine. Selection criteria for the sample material included different NMR signal types (singlet and multiplet) as well as the suitability of the proposed substances for manual integration at different levels of challenge (e.g., interference as a result of the necessary suppression of the water signal, or coverage of the concentration ranges typical of wine for a selection of major components, minor components, and additives). To show that the method can be universally applied, the NMR measurement and the method of evaluation were not strictly prescribed. Fifteen international laboratories participated in the collaborative trial and determined six parameters in 10 samples. The values, in particular the reproducibility SD (SR), were compared with the expected Horwitz SD (SH) by forming the quotient SR/SH (i.e., the HorRat value). The resulting HorRat values for most parameters were predominantly between 0.6 and 1.5, and thus within an acceptable range.
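
    The HorRat comparison described above follows the standard Horwitz relation; a small sketch, with an illustrative concentration and observed reproducibility rather than the trial's values:

        import math

        def horrat(observed_rsd_r_percent, mass_fraction):
            """HorRat = observed reproducibility RSD divided by the Horwitz-predicted RSD.

            mass_fraction: analyte concentration as a dimensionless mass fraction
            (e.g., 1 g/L in wine with density ~1 kg/L is roughly 1e-3).
            """
            predicted_rsd_r = 2.0 ** (1.0 - 0.5 * math.log10(mass_fraction))   # Horwitz equation, in %
            return observed_rsd_r_percent / predicted_rsd_r

        # Illustrative: glucose at ~2 g/L (mass fraction ~2e-3) with an observed RSD_R of 4%.
        print(round(horrat(4.0, 2e-3), 2))   # -> ~0.79, within the 0.6-1.5 acceptance band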

  16. Exploring the use of storytelling in quantitative research fields using a multiple case study method

    NASA Astrophysics Data System (ADS)

    Matthews, Lori N. Hamlet

    The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work, and my personal experience with the subject matter was also used as a source of data according to the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight into why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes that emerged from this study and were consistent with the existing literature are: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts, and topics for further research are presented as well.

  17. Modeling Bone Surface Morphology: A Fully Quantitative Method for Age-at-Death Estimation Using the Pubic Symphysis.

    PubMed

    Slice, Dennis E; Algee-Hewitt, Bridget F B

    2015-07-01

    The pubic symphysis is widely used in age estimation for the adult skeleton. Standard practice requires the visual comparison of surface morphology against criteria representing predefined phases and the estimation of case-specific age from an age range associated with the chosen phase. Known problems of method and observer error necessitate alternative tools to quantify age-related change in pubic morphology. This paper presents an objective, fully quantitative method for estimating age-at-death from the skeleton, which exploits a variance-based score of surface complexity computed from vertices obtained from a scanner sampling the pubic symphysis. For laser scans from 41 modern American male skeletons, this method produces results that are significantly associated with known age-at-death (RMSE = 17.15 years). Chronological age is therefore predicted equally well, if not better, with this robust, objective, and fully quantitative method than with prevailing phase-aging systems. This method contributes to forensic casework by responding to medico-legal expectations for evidence standards.

  18. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR are its accurate function with external calibration, the lack of any requirement for identical reference materials, high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
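
    To make the quantitation principle concrete, the sketch below applies the conventional qHNMR relation between integrals, proton counts, and molar masses for an internal calibrant of known mass. The numbers are illustrative only, and external calibration (as highlighted in the review) would replace the calibrant integral with an externally measured reference.

        def qhnmr_mass(i_analyte, n_analyte, m_analyte,
                       i_cal, n_cal, m_cal, mass_cal):
            """Analyte mass from relative 1H integrals against a calibrant of known mass.

            i_*: signal integrals, n_*: protons contributing to each signal,
            m_*: molar masses (g/mol), mass_cal: weighed calibrant mass (mg).
            """
            return (i_analyte / i_cal) * (n_cal / n_analyte) * (m_analyte / m_cal) * mass_cal

        # Illustrative numbers only: a 3H analyte singlet vs. a 9H calibrant singlet.
        print(round(qhnmr_mass(i_analyte=1.25, n_analyte=3, m_analyte=180.16,
                               i_cal=1.00, n_cal=9, m_cal=216.32, mass_cal=10.0), 2))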

  19. Investigation of a diffuse optical measurements-assisted quantitative photoacoustic tomographic method in reflection geometry

    PubMed Central

    Xu, Chen; Kumavor, Patrick D.; Aguirre, Andres

    2012-01-01

    Photoacoustic tomography provides the distribution of absorbed optical energy density, which is the product of the optical absorption coefficient and the optical fluence distribution. We report the experimental investigation of a novel fitting procedure that quantitatively determines the optical absorption coefficient of chromophores. The experimental setup consisted of a hybrid system combining a 64-channel photoacoustic imaging system with a frequency-domain diffuse optical measurement system. The fitting procedure included a complete photoacoustic forward model and an analytical solution for a target chromophore using the diffusion approximation. It combines the information from the photoacoustic image with the background information from the diffuse optical measurements to minimize the difference between the photoacoustic measurements and the forward-model data and thereby recover the target absorption coefficient quantitatively. 1-cm-cube phantom absorbers of high and low contrast were imaged at depths of up to 3.0 cm. The fitted absorption coefficients were at least 80% of their true values. The sensitivity of this fitting procedure to target location, target radius, and background optical properties was also investigated. We found that the procedure was most sensitive to the accurate determination of the target radius and depth. A blood sample in a thin tube of radius 0.58 mm, simulating a blood vessel, was also studied. The photoacoustic images and fitted absorption coefficients are presented. These results demonstrate the clinical potential of this fitting procedure to quantitatively characterize small lesions in breast imaging. PMID:22734743

  20. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    PubMed

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data.
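
    The two model-free metrics named above (lactate time-to-peak and the lactate-to-pyruvate AUC ratio) can be computed directly from the dynamic signal curves; a sketch with synthetic curves, assuming NumPy:

        import numpy as np

        t = np.arange(0, 60, 3.0)                              # acquisition times (s)
        pyruvate = 100.0 * np.exp(-t / 25.0)                   # synthetic signal curves
        lactate = 40.0 * (np.exp(-t / 40.0) - np.exp(-t / 10.0))

        ttp = t[np.argmax(lactate)]                            # lactate time-to-peak (s)
        auc_ratio = np.trapz(lactate, t) / np.trapz(pyruvate, t)   # lactate-to-pyruvate AUC ratio
        print(f"time-to-peak = {ttp:.0f} s, AUC ratio = {auc_ratio:.2f}")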

  1. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications

    PubMed Central

    Wei, Wentao; Huang, Qiu; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is significant for the treatment monitoring of brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of the computer simulations show that the derived CRMs have high similarities with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach localizes recovered regions with an accuracy of 93.4%. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution for generating a parametric image and deriving quantitative indexes from longitudinal SPECT brain images for treatment monitoring. PMID:28251150

  2. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications.

    PubMed

    Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is significant for the treatment monitoring of brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of the computer simulations show that the derived CRMs have high similarities with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach localizes recovered regions with an accuracy of 93.4%. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution for generating a parametric image and deriving quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
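
    A schematic reading of the change-rate map idea (not the paper's implementation): after the longitudinal scans are co-registered, a per-voxel rate of change can be computed from baseline and follow-up rCBF images, as sketched below with synthetic arrays and an assumed 20% threshold motivated by the simulation result quoted above.

        import numpy as np

        rng = np.random.default_rng(5)
        baseline = rng.uniform(30, 60, size=(64, 64, 64))      # baseline rCBF-like SPECT volume
        followup = baseline.copy()
        followup[20:30, 20:30, 20:30] *= 1.35                  # simulated 35% regional recovery

        eps = 1e-6
        crm = (followup - baseline) / (baseline + eps) * 100.0 # change-rate map (%), voxel-wise
        recovered = crm > 20.0                                 # assumed detectability threshold
        print(f"voxels flagged as recovered: {int(recovered.sum())}")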

  3. Parents' decision-making about the human papillomavirus vaccine for their daughters: I. Quantitative results.

    PubMed

    Krawczyk, Andrea; Knäuper, Bärbel; Gilca, Vladimir; Dubé, Eve; Perez, Samara; Joyal-Desmarais, Keven; Rosberger, Zeev

    2015-01-01

    Vaccination against the human papillomavirus (HPV) is an effective primary prevention measure for HPV-related diseases. For children and young adolescents, the uptake of the vaccine is contingent on parental consent. This study sought to identify key differences between parents who obtain (acceptors) and parents who refuse (non-acceptors) the HPV vaccine for their daughters. In the context of a free, universal, school-based HPV vaccination program in Québec, 774 parents of 9-10 year-old girls completed and returned a questionnaire by mail. The questionnaire was based on the theoretical constructs of the Health Belief Model (HBM), along with constructs from other theoretical frameworks. Of the 774 parents, 88.2% reported their daughter having received the HPV vaccine. Perceived susceptibility of daughters to HPV infection, perceived benefits of the vaccine, perceived barriers (including safety of the vaccine), and cues to action significantly distinguished between parents whose daughters had received the HPV vaccine and those whose daughters had not. Other significant factors associated with daughter vaccine uptake were parents' general vaccination attitudes, anticipated regret, adherence to other routinely recommended vaccines, social norms, and positive media influence. The results of this study identify a number of important correlates related to parents' decisions to accept or refuse the HPV vaccine uptake for their daughters. Future work may benefit from targeting such factors and incorporating other health behavior theories in the design of effective HPV vaccine uptake interventions.

  4. Parents’ decision-making about the human papillomavirus vaccine for their daughters: I. Quantitative results

    PubMed Central

    Krawczyk, Andrea; Knäuper, Bärbel; Gilca, Vladimir; Dubé, Eve; Perez, Samara; Joyal-Desmarais, Keven; Rosberger, Zeev

    2015-01-01

    Vaccination against the human papillomavirus (HPV) is an effective primary prevention measure for HPV-related diseases. For children and young adolescents, the uptake of the vaccine is contingent on parental consent. This study sought to identify key differences between parents who obtain (acceptors) and parents who refuse (non-acceptors) the HPV vaccine for their daughters. In the context of a free, universal, school-based HPV vaccination program in Québec, 774 parents of 9–10 year-old girls completed and returned a questionnaire by mail. The questionnaire was based on the theoretical constructs of the Health Belief Model (HBM), along with constructs from other theoretical frameworks. Of the 774 parents, 88.2% reported their daughter having received the HPV vaccine. Perceived susceptibility of daughters to HPV infection, perceived benefits of the vaccine, perceived barriers (including safety of the vaccine), and cues to action significantly distinguished between parents whose daughters had received the HPV vaccine and those whose daughters had not. Other significant factors associated with daughter vaccine uptake were parents’ general vaccination attitudes, anticipated regret, adherence to other routinely recommended vaccines, social norms, and positive media influence. The results of this study identify a number of important correlates related to parents' decisions to accept or refuse the HPV vaccine uptake for their daughters. Future work may benefit from targeting such factors and incorporating other health behavior theories in the design of effective HPV vaccine uptake interventions. PMID:25692455

  5. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, Leon J.; Cremers, David A.

    1985-01-01

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is demonstrated. Significant shortening of analysis time is achieved from those of the usual chemical techniques of analysis.

  6. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, L.J.; Cremers, D.A.

    1982-09-07

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids are described. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is shown. Significant shortening of analysis time is achieved from the usual chemical techniques of analysis.

  7. Collaborating to improve the use of free-energy and other quantitative methods in drug discovery

    NASA Astrophysics Data System (ADS)

    Sherborne, Bradley; Shanmugasundaram, Veerabahu; Cheng, Alan C.; Christ, Clara D.; DesJarlais, Renee L.; Duca, Jose S.; Lewis, Richard A.; Loughney, Deborah A.; Manas, Eric S.; McGaughey, Georgia B.; Peishoff, Catherine E.; van Vlijmen, Herman

    2016-12-01

    In May and August, 2016, several pharmaceutical companies convened to discuss and compare experiences with Free Energy Perturbation (FEP). This unusual synchronization of interest was prompted by Schrödinger's FEP+ implementation and offered the opportunity to share fresh studies with FEP and enable broader discussions on the topic. This article summarizes key conclusions of the meetings, including a path forward of actions for this group to aid the accelerated evaluation, application and development of free energy and related quantitative, structure-based design methods.

  8. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part II.

    PubMed

    Tavakol, Mohsen; Sandars, John

    2014-10-01

    Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.

  9. Collaborating to improve the use of free-energy and other quantitative methods in drug discovery.

    PubMed

    Sherborne, Bradley; Shanmugasundaram, Veerabahu; Cheng, Alan C; Christ, Clara D; DesJarlais, Renee L; Duca, Jose S; Lewis, Richard A; Loughney, Deborah A; Manas, Eric S; McGaughey, Georgia B; Peishoff, Catherine E; van Vlijmen, Herman

    2016-12-01

    In May and August, 2016, several pharmaceutical companies convened to discuss and compare experiences with Free Energy Perturbation (FEP). This unusual synchronization of interest was prompted by Schrödinger's FEP+ implementation and offered the opportunity to share fresh studies with FEP and enable broader discussions on the topic. This article summarizes key conclusions of the meetings, including a path forward of actions for this group to aid the accelerated evaluation, application and development of free energy and related quantitative, structure-based design methods.

  10. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part I.

    PubMed

    Tavakol, Mohsen; Sandars, John

    2014-09-01

    Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.

  11. Development and evaluation of event-specific quantitative PCR method for genetically modified soybean A2704-12.

    PubMed

    Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, A2704-12. During the plant transformation, DNA fragments derived from the pUC19 plasmid were integrated in A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared in both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we therefore constructed a plasmid using pBR322. The conversion factor (C(f)), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined C(f) values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSD(R)), and the determined bias and RSD(R) values for the method were each less than 20%. These results suggest that the developed method would be suitable for practical analyses for the detection and quantification of A2704-12.

  12. Semi-quantitative characterisation of ambient ultrafine aerosols resulting from emissions of coal fired power stations.

    PubMed

    Hinkley, J T; Bridgman, H A; Buhre, B J P; Gupta, R P; Nelson, P F; Wall, T F

    2008-02-25

    Emissions from coal fired power stations are known to be a significant anthropogenic source of fine atmospheric particles, both through direct primary emissions and secondary formation of sulfate and nitrate from emissions of gaseous precursors. However, there is relatively little information available in the literature regarding the contribution emissions make to the ambient aerosol, particularly in the ultrafine size range. In this study, the contribution of emissions to particles smaller than 0.3 μm in the ambient aerosol was examined at a sampling site 7 km from two large Australian coal fired power stations equipped with fabric filters. A novel approach was employed using conditional sampling based on sulfur dioxide (SO2) as an indicator species, and a relatively new sampler, the TSI Nanometer Aerosol Sampler. Samples were collected on transmission electron microscope (TEM) grids and examined using a combination of TEM imaging and energy dispersive X-ray (EDX) analysis for qualitative chemical analysis. The ultrafine aerosol in low SO2 conditions was dominated by diesel soot from vehicle emissions, while significant quantities of particles, which were unstable under the electron beam, were observed in the high SO2 samples. The behaviour of these particles was consistent with literature accounts of sulfate and nitrate species, believed to have been derived from precursor emissions from the power stations. A significant carbon peak was noted in the residues from the evaporated particles, suggesting that some secondary organic aerosol formation may also have been catalysed by these acid seed particles. No primary particulate material was observed in the sub-0.3 μm fraction. The results of this study indicate that the contribution of species more commonly associated with gas-to-particle conversion may be more significant than expected, even close to source.

  13. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, together with physicochemical studies, using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for the quantitative HPLC determination. These impurities were detected by a newly developed gradient, reverse-phase high performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions, to demonstrate the power of the newly developed HPLC method.

  14. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method.

    PubMed

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, together with physicochemical studies, using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for the quantitative HPLC determination. These impurities were detected by a newly developed gradient, reverse-phase high performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions, to demonstrate the power of the newly developed HPLC method.

  15. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Davis, H. B.

    2015-12-01

    The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method--the Quantitative Collaborative Impact Analysis Method--how it works, and the successes we've had with it in the NASA Astrobiology education community.

  16. Method for quantitative estimation of position perception using a joystick during linear movement.

    PubMed

    Wada, Y; Tanaka, M; Mori, S; Chen, Y; Sumigama, S; Naito, H; Maeda, M; Yamamoto, M; Watanabe, S; Kajitani, N

    1996-12-01

    We designed a method for quantitatively estimating self-motion perceptions during passive body movement on a sled. The subjects were instructed to tilt a joystick in proportion to perceived displacement from a given starting position during linear movement with varying displacements of 4 m, 10 m and 16 m induced by constant acceleration of 0.02 g, 0.05 g and 0.08 g along the antero-posterior axis. With this method, we could monitor not only subjective position perceptions but also response latencies for the beginning (RLbgn) and end (RLend) of the linear movement. Perceived body position fitted Stevens' power law, R = kS^n (where R is the joystick output, k is a constant, S is the displacement of the linear movement and n is an exponent). RLbgn decreased as linear acceleration increased. We conclude that this method is useful in analyzing the features and sensitivities of self-motion perceptions during movement.
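
    Illustrative sketch (not from the record): the exponent n and constant k in Stevens' power law R = kS^n can be recovered from displacement/response pairs by a linear fit in log-log space. The joystick readings below are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    S = np.array([4.0, 10.0, 16.0, 4.0, 10.0, 16.0])          # displacements (m)
    R = 0.8 * S ** 0.9 * np.exp(rng.normal(0, 0.05, S.size))  # synthetic noisy joystick output

    # log R = log k + n * log S  ->  a straight line in log-log space
    n_hat, log_k_hat = np.polyfit(np.log(S), np.log(R), 1)
    print(f"estimated exponent n = {n_hat:.2f}, constant k = {np.exp(log_k_hat):.2f}")
    ```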

  17. Initial Results of an MDO Method Evaluation Study

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley MDO method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. In the first phase of the study, three MDO methods were implemented in the SIGHT framework and used to solve a set of ten relatively simple problems. In this paper, we comment on the general considerations for conducting method evaluation studies and report some initial results obtained to date; the results are not conclusive because of the small initial test set. An MDO formulation can be analyzed in terms of its relation to other formulations, its optimality conditions, and the sensitivity of its solutions to various perturbations. Optimization algorithms are used to solve a particular MDO formulation. It is then appropriate to speak of local convergence rates and of global convergence properties of an optimization algorithm applied to a specific formulation. An analogous distinction exists in the field of partial differential equations. On the one hand, equations are analyzed in terms of regularity, well-posedness, and the existence and uniqueness of solutions. On the other, one considers numerous algorithms for solving differential equations. The area of MDO methods studies MDO formulations combined with optimization algorithms, although at times the distinction is blurred. It is important to

  18. A method for estimating the effective number of loci affecting a quantitative character.

    PubMed

    Slatkin, Montgomery

    2013-11-01

    A likelihood method is introduced that jointly estimates the number of loci and the additive effect of alleles that account for the genetic variance of a normally distributed quantitative character in a randomly mating population. The method assumes that measurements of the character are available from one or both parents and an arbitrary number of full siblings. The method uses the fact, first recognized by Karl Pearson in 1904, that the variance of a character among offspring depends on both the parental phenotypes and on the number of loci. Simulations show that the method performs well provided that data from a sufficient number of families (on the order of thousands) are available. This method assumes that the loci are in Hardy-Weinberg and linkage equilibrium but does not assume anything about the linkage relationships. It performs equally well if all loci are on the same non-recombining chromosome provided they are in linkage equilibrium. The method can be adapted to take account of loci already identified as being associated with the character of interest. In that case, the method estimates the number of loci not already known to affect the character. The method applied to measurements of crown-rump length in 281 family trios in a captive colony of African green monkeys (Chlorocebus aethiops sabaeus) estimates the number of loci to be 112 and the additive effect to be 0.26 cm. A parametric bootstrap analysis shows that a rough confidence interval has a lower bound of 14 loci.

  19. Quantitative measurement of speech sound distortions with the aid of minimum variance spectral estimation method for dentistry use.

    PubMed

    Bereteu, L; Drăgănescu, G E; Stănescu, D; Sinescu, C

    2011-12-01

    In this paper, we search for an adequate quantitative method, based on minimum variance spectral analysis, that reflects the dependence of speech quality on the correct positioning of dental prostheses. We also identify quantitative parameters that reflect the correct position of dental prostheses in a sensitive manner.

  20. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.

  1. A Rapid, Quantitative Method to Characterize The Human Lymphocyte Concentration for Automated High-Throughput Radiation Biodosimetry

    PubMed Central

    Xu, Yanping; Turner, Helen C.; Garty, Guy; Brenner, David

    2013-01-01

    We have developed a Quantitative Light Absorption Analysis (QLAA) method to rapidly estimate human lymphocyte concentrations isolated from small volumes of whole blood. Measurements of the light absorption analysis were calibrated for lymphocyte concentration levels using a hemocytometer. To validate the QLAA system, blood samples were collected from 17 healthy donors and lymphocyte absorption measurements were directly compared with the manual microscope counting. The results showed that lymphocyte measurements obtained using the QLAA system were comparable with the manually scored lymphocyte counts but with measurements taken in seconds. PMID:23781493

  2. The Quantitative Ideas and Methods in Assessment of Four Properties of Chinese Medicinal Herbs.

    PubMed

    Fu, Jialei; Pang, Jingxiang; Zhao, Xiaolei; Han, Jinxiang

    2015-04-01

    The purpose of this review is to summarize and reflect on the current status and problems of research on the properties of Chinese medicinal herbs. Hot, warm, cold, and cool are the four properties/natures of Chinese medicinal herbs. They are defined based on the interaction between the herbs and the human body. How to quantitatively assess the therapeutic effect of Chinese medicinal herbs based on the theoretical system of Traditional Chinese Medicine (TCM) remains a challenge. Previous studies approaching the topic from several perspectives are presented, and their results and problems are discussed. New ideas based on the technology of biophoton radiation detection are proposed. With the development of biophoton detection technology, detection and characterization of human biophoton emission has led to its potential applications in TCM. The possibility of using a biophoton analysis system to study the interaction of Chinese medicinal herbs with the human body and to quantitatively determine the effect of the herbs is entirely consistent with the holistic concept of TCM theory. The statistical entropy of electromagnetic radiation from biological systems can characterize the four properties of Chinese medicinal herbs, and its spectrum can characterize their meridian tropism. Therefore, we hypothesize that, by use of a biophoton analysis system, the four properties and meridian tropism of Chinese medicinal herbs can be quantitatively expressed.

  3. Methods for Quantitative Detection of Antibody-induced Complement Activation on Red Blood Cells

    PubMed Central

    Meulenbroek, Elisabeth M.; Wouters, Diana; Zeerleder, Sacha

    2014-01-01

    Antibodies against red blood cells (RBCs) can lead to complement activation resulting in an accelerated clearance via complement receptors in the liver (extravascular hemolysis) or leading to intravascular lysis of RBCs. Alloantibodies (e.g. ABO) or autoantibodies to RBC antigens (as seen in autoimmune hemolytic anemia, AIHA) leading to complement activation are potentially harmful and can be - especially when leading to intravascular lysis - fatal [1]. Currently, complement activation due to (auto)-antibodies on RBCs is assessed in vitro by using the Coombs test reflecting complement deposition on RBC or by a nonquantitative hemolytic assay reflecting RBC lysis [1-4]. However, to assess the efficacy of complement inhibitors, it is mandatory to have quantitative techniques. Here we describe two such techniques. First, an assay to detect C3 and C4 deposition on red blood cells that is induced by antibodies in patient serum is presented. For this, FACS analysis is used with fluorescently labeled anti-C3 or anti-C4 antibodies. Next, a quantitative hemolytic assay is described. In this assay, complement-mediated hemolysis induced by patient serum is measured making use of spectrophotometric detection of the released hemoglobin. Both of these assays are very reproducible and quantitative, facilitating studies of antibody-induced complement activation. PMID:24514151
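
    Illustrative sketch (not the authors' protocol): a spectrophotometric hemolysis readout is commonly expressed relative to background and full-lysis controls. The absorbance values and wavelength below are assumptions for illustration only.

    ```python
    def percent_hemolysis(a_sample, a_background, a_total_lysis):
        """Percent lysis from the absorbance of released hemoglobin (e.g. ~405-415 nm).
        a_background: buffer/serum-only control; a_total_lysis: fully lysed RBC control.
        The values used in the example are illustrative, not from the record."""
        return 100.0 * (a_sample - a_background) / (a_total_lysis - a_background)

    print(round(percent_hemolysis(0.62, 0.08, 1.15), 1), "% lysis")
    ```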

  4. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Pavlov, K.A.

    1997-01-01

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing. © 1997 American Institute of Physics.

  5. A fast semi-quantitative method for Plutonium determination in an alpine firn/ice core

    NASA Astrophysics Data System (ADS)

    Gabrieli, J.; Cozzi, G.; Vallelonga, P.; Schwikowski, M.; Sigl, M.; Boutron, C.; Barbante, C.

    2009-04-01

    deposition decreased very sharply, reaching a minimum in 1967. The third period (1967-1975) is characterized by irregular Pu profiles with smaller peaks (about 20-30% compared to the 1964 peak), which could be due to French and Chinese tests. Comparison with the Pu profiles obtained from the Col du Dome and Belukha ice cores by AMS (Accelerator Mass Spectrometry) shows very good agreement. Considering the semi-quantitative method and the analytical uncertainty, the results are also quantitatively comparable. However, the Pu concentrations at Colle Gnifetti are normally 2-3 times greater than at Col du Dome. This could be explained by different air mass transport or, more likely, different accumulation rates at each site.

  6. Simultaneous quantitation and validation of method for the quality evaluation of Eucommiae cortex by HPLC/UV.

    PubMed

    Zhao, Bing Tian; Jeong, Su Yang; Kim, Tae In; Seo, Eun Kyoung; Min, Byung Sun; Son, Jong Keun; Woo, Mi Hee

    2015-12-01

    A new HPLC/UV method has been developed for the simultaneous quantitative determination of four major components in Eucommiae cortex, namely geniposidic acid (1), geniposide (2), pinoresinol di-O-β-D-glucopyranoside (3), and liriodendrin (4). Simultaneous separations of these four components were achieved on a J'sphere ODS C18 column (250 × 4.6 mm, 4 µm). The elution was done using water with 0.1% phosphoric acid (A) and acetonitrile with 0.1% phosphoric acid (B) in a two-step elution of the mobile phase at a flow rate of 1.0 mL/min and a wavelength of 230 nm. The method was validated for linearity, recovery, precision, accuracy, stability and robustness. All calibration curves showed good linear regression (r² > 0.999) within the test ranges. This method showed good recovery and reproducibility for the quantification of these four components in 85 samples of Eucommiae cortex. The intra-day and inter-day precisions were lower than 0.53% (as relative standard deviation, RSD), and accuracies were between 93.00 and 106.28% for all standards. The results indicate that the established HPLC/UV method is suitable for the quantitation and quality evaluation of Eucommiae cortex.

  7. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development.

  8. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    NASA Technical Reports Server (NTRS)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.
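
    Illustrative sketch (not the ChemCam data or the authors' code): a PLS1-style calibration of the kind compared in the record, using scikit-learn with random placeholder spectra and compositions.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 500))                     # 120 spectra x 500 channels (placeholder)
    y = X @ rng.normal(size=500) * 0.01 + rng.normal(scale=0.1, size=120)  # synthetic wt% of one element

    pls = PLSRegression(n_components=8)                 # latent variables; tune by cross-validation
    y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
    print(f"RMSECV = {np.sqrt(np.mean((y - y_cv) ** 2)):.3f}")
    ```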

  9. Setting health research priorities using the CHNRI method: VI. Quantitative properties of human collective opinion

    PubMed Central

    Yoshida, Sachiyo; Rudan, Igor; Cousens, Simon

    2016-01-01

    Introduction Crowdsourcing has become an increasingly important tool to address many problems – from government elections in democracies and stock market prices to modern online tools such as TripAdvisor or the Internet Movie Database (IMDB). The CHNRI method (the acronym for the Child Health and Nutrition Research Initiative) for setting health research priorities has crowdsourcing as its major component, which it uses to generate, assess and prioritize between many competing health research ideas. Methods We conducted a series of analyses using data from a group of 91 scorers to explore the quantitative properties of their collective opinion. We were interested in the stability of their collective opinion as the sample size increases from 15 to 90. From a pool of 91 scorers who took part in a previous CHNRI exercise, we used sampling with replacement to generate multiple random samples of different size. First, for each sample generated, we identified the top 20 ranked research ideas, among the 205 that were proposed and scored, and calculated the concordance with the ranking generated by the 91 original scorers. Second, we used rank correlation coefficients to compare the ranks assigned to all 205 proposed research ideas when samples of different size are used. We also analysed the original pool of 91 scorers to look for evidence of scoring variations based on scorers' characteristics. Results The sample sizes investigated ranged from 15 to 90. The concordance for the top 20 scored research ideas increased with sample sizes up to about 55 experts. At this point, the median level of concordance stabilized at 15/20 top ranked questions (75%), with the interquartile range also generally stable (14–16). There was little further increase in overlap when the sample size increased from 55 to 90. When analysing the ranking of all 205 ideas, the rank correlation coefficient increased as the sample size increased, with a median correlation of 0.95 reached at the sample size
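
    Illustrative sketch (synthetic scores, hypothetical helper names) of the resampling analysis described: draw scorer samples with replacement, rank the ideas by mean score, and compare the top-20 overlap and rank correlation against the full panel's ranking.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    scores = rng.random((91, 205))                 # 91 scorers x 205 research ideas (synthetic)
    full_mean = scores.mean(axis=0)
    top20_full = set(np.argsort(-full_mean)[:20])  # top-20 ideas for the full panel

    def concordance(sample_size, n_draws=200):
        overlaps, rhos = [], []
        for _ in range(n_draws):
            idx = rng.integers(0, scores.shape[0], size=sample_size)  # sample with replacement
            sub_mean = scores[idx].mean(axis=0)
            overlaps.append(len(top20_full & set(np.argsort(-sub_mean)[:20])))
            rho, _ = spearmanr(full_mean, sub_mean)
            rhos.append(rho)
        return np.median(overlaps), np.median(rhos)

    for n in (15, 35, 55, 90):
        print(n, concordance(n))
    ```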

  10. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. Therefore, a quantitative and simple distortion evaluation method is imperative for both the endoscopic industry and medical device regulatory agencies. However, no such method is available yet. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data using complex mathematical models, which makes them difficult to understand. Some commonly used distortion evaluation methods, such as the picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion. Based on this method, we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning over the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, the method can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.

  11. Quantitation of mRNA levels of steroid 5alpha-reductase isozymes: a novel method that combines quantitative RT-PCR and capillary electrophoresis.

    PubMed

    Torres, Jesús M; Ortega, Esperanza

    2004-01-01

    A novel, accurate, rapid and modestly labor-intensive method has been developed to quantitate specific mRNA species by reverse transcription-polymerase chain reaction (RT-PCR). This strategy combines the high degree of specificity of competitive PCR with the sensitivity of laser-induced fluorescence capillary electrophoresis (LIF-CE). The specific target mRNA and a mimic DNA fragment, used as an internal standard (IS), were co-amplified in a single reaction in which the same primers are used. The amount of mRNA was then quantitated by extrapolation from the standard curve generated with the internal standard. PCR primers were designed to amplify both a 185 bp fragment of the target cDNA for steroid 5alpha-reductase 1 (5alpha-R1) and a 192 bp fragment of the target cDNA for steroid 5alpha-reductase type 2 (5alpha-R2). The 5' forward primers were end-labeled with 6-carboxy-fluorescein (6-FAM). Two synthetic internal standard DNAs of 300 bp were synthesized from the sequence of plasmid pEGFP-C1. The ratio of fluorescence intensity between amplified products of the target cDNA (185 or 192 bp fragments) and the competitive DNA (300 bp fragment) was determined quantitatively after separation by capillary electrophoresis and fluorescence analysis. The accurate quantitation of low-abundance mRNAs by the present method allows low-level gene expression to be characterized.

  12. 4D Seismic Monitoring at the Ketzin Pilot Site during five years of storage - Results and Quantitative Assessment

    NASA Astrophysics Data System (ADS)

    Lüth, Stefan; Ivanova, Alexandra; Ivandic, Monika; Götz, Julia

    2015-04-01

    The Ketzin pilot site for geological CO2 storage was operative between June 2008 and August 2013. In this period, 67 kt of CO2 were injected (Martens et al., this conference). Repeated 3D seismic monitoring surveys were performed before and during CO2 injection. A third repeat survey, providing data from the post-injection phase, is currently being prepared for the autumn of 2015. The large-scale 3D surface seismic measurements have been complemented by other geophysical and geochemical monitoring methods, among which are high-resolution seismic surface-downhole observations. These observations have concentrated on the reservoir area in the vicinity of the injection well and provide high-resolution images as well as data for petrophysical quantification of the CO2 distribution in the reservoir. The Ketzin pilot site is a saline aquifer site in an onshore environment, which poses specific challenges for reliable monitoring of the injected CO2. Although much effort was made to keep acquisition conditions as identical as possible, a high degree of repeatability noise was observed, mainly due to varying weather conditions and to variations in the acquisition geometries for logistical reasons. Nevertheless, time-lapse processing succeeded in generating 3D time-lapse data sets which could be interpreted in terms of CO2-storage-related amplitude variations in the depth range of the storage reservoir. The time-lapse seismic data, pulsed-neutron-gamma logging results (saturation), and petrophysical core measurements were interpreted together in order to estimate the amount of injected carbon dioxide imaged by the seismic repeat data. For the first repeat survey, the mass estimate summed to 20.5 kt, which is approximately 7% less than had been injected by that time. For the second repeat survey, the mass estimate was approximately 10-15% less than had been injected. The deviations may be explained by several factors

  13. A Rapid and Quantitative Flow Cytometry Method for the Analysis of Membrane Disruptive Antimicrobial Activity

    PubMed Central

    O’Brien-Simpson, Neil M.; Pantarat, Namfon; Attard, Troy J.; Walsh, Katrina A.; Reynolds, Eric C.

    2016-01-01

    We describe a microbial flow cytometry method that quantifies antimicrobial peptide (AMP) activity within 3 hours in terms of a Minimum Membrane Disruptive Concentration (MDC). Increasing peptide concentration correlates positively with the extent of bacterial membrane disruption, and the calculated MDC is equivalent to its MBC. The activity of AMPs representing three different membranolytic modes of action could be determined for a range of Gram-positive and Gram-negative bacteria, including the ESKAPE pathogens, E. coli and MRSA. By using the MDC50 concentration of the parent AMP, the method provides high-throughput, quantitative screening of AMP analogues. A unique feature of the MDC assay is that it directly measures peptide/bacteria interactions and lysed cell numbers rather than bacterial survival, as with MIC and MBC assays. With the threat of multi-drug resistant bacteria, this high-throughput MDC assay has the potential to aid in the development of novel antimicrobials that target bacteria with improved efficacy. PMID:26986223

  14. Enantiomer labelling, a method for the quantitative analysis of amino acids.

    PubMed

    Frank, H; Nicholson, G J; Bayer, E

    1978-12-21

    Enantiomer labelling, a method for the quantitative analysis of optically active natural compounds by gas chromatography, involves the use of the unnatural enantiomer as an internal standard. With Chirasil-Val, a chiral stationary phase that is thermally stable up to 240 degrees, the enantiomers of amino acids and a variety of other compounds can be separated and quantitated. Incomplete recovery from the sample, incomplete derivatization, hydrolysis and thermal decomposition of the derivative, and shifting response factors can be compensated for by adding the unnatural enantiomer. The accuracy of amino acid analysis by enantiomer labelling is equal to or better than that of hitherto known methods. The procedure affords a complete analysis of peptides with respect to both amino acid composition and the optical purity of each amino acid.
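
    Illustrative calculation (not from the record): with the unnatural enantiomer spiked as internal standard, losses and derivatization yields cancel in the peak-area ratio. The peak areas and amounts below are hypothetical; a response factor of about 1 is assumed, since the two enantiomers behave identically apart from their elution order.

    ```python
    def amount_by_enantiomer_labelling(area_natural, area_unnatural, spiked_amount_nmol,
                                       response_factor=1.0):
        """Amount of the natural enantiomer from its peak-area ratio to the spiked
        unnatural enantiomer; the chemistry is identical, so losses cancel out."""
        return response_factor * (area_natural / area_unnatural) * spiked_amount_nmol

    # Hypothetical GC peak areas for L- and D-alanine after derivatization
    print(f"{amount_by_enantiomer_labelling(15400, 12100, 10.0):.2f} nmol L-alanine")
    ```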

  15. Simple saponification method for the quantitative determination of carotenoids in green vegetables.

    PubMed

    Larsen, Erik; Christensen, Lars P

    2005-08-24

    A simple, reliable, and gentle saponification method for the quantitative determination of carotenoids in green vegetables was developed. The method involves an extraction procedure with acetone and the selective removal of the chlorophylls and esterified fatty acids from the organic phase using a strongly basic resin (Ambersep 900 OH). Extracts from common green vegetables (beans, broccoli, green bell pepper, chive, lettuce, parsley, peas, and spinach) were analyzed by high-performance liquid chromatography (HPLC) for their content of major carotenoids before and after action of Ambersep 900 OH. The mean recovery percentages for most carotenoids [(all-E)-violaxanthin, (all-E)-lutein epoxide, (all-E)-lutein, neolutein A, and (all-E)-beta-carotene] after saponification of the vegetable extracts with Ambersep 900 OH were close to 100% (99-104%), while the mean recovery percentages of (9'Z)-neoxanthin increased to 119% and that of (all-E)-neoxanthin and neolutein B decreased to 90% and 72%, respectively.

  16. Genetic programming:  a novel method for the quantitative analysis of pyrolysis mass spectral data.

    PubMed

    Gilbert, R J; Goodacre, R; Woodward, A M; Kell, D B

    1997-11-01

    A technique for the analysis of multivariate data by genetic programming (GP) is described, with particular reference to the quantitative analysis of orange juice adulteration data collected by pyrolysis mass spectrometry (PyMS). The dimensionality of the input space was reduced by ranking variables according to product moment correlation or mutual information with the outputs. The GP technique as described gives predictive errors equivalent to, if not better than, more widespread methods such as partial least squares and artificial neural networks but additionally can provide a means for easing the interpretation of the correlation between input and output variables. The described application demonstrates that by using the GP method for analyzing PyMS data the adulteration of orange juice with 10% sucrose solution can be quantified reliably over a 0-20% range with an RMS error in the estimate of ∼1%.
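
    Illustrative sketch of the pre-processing step described (ranking input variables by correlation or mutual information with the output before model building); the spectra are random placeholders, and scikit-learn's mutual_info_regression stands in for whatever mutual-information estimator the authors used.

    ```python
    import numpy as np
    from sklearn.feature_selection import mutual_info_regression

    rng = np.random.default_rng(2)
    X = rng.normal(size=(150, 150))                     # pyrolysis mass spectra (placeholder)
    y = 2 * X[:, 10] + rng.normal(scale=0.5, size=150)  # % adulteration (synthetic)

    abs_r = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    corr_rank = np.argsort(-abs_r)
    mi_rank = np.argsort(-mutual_info_regression(X, y, random_state=0))

    k = 20                                 # keep the top-k masses for the GP / regression model
    print("top by |r|:", corr_rank[:5], " top by MI:", mi_rank[:5])
    X_reduced = X[:, corr_rank[:k]]
    ```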

  17. Age-related changes in rat cerebellar basket cells: a quantitative study using unbiased stereological methods

    PubMed Central

    HENRIQUE, RUI M. F.; ROCHA, EDUARDO; REIS, ALCINDA; MARCOS, RICARDO; OLIVEIRA, MARIA H.; SILVA, MARIA W.; MONTEIRO, ROGÉRIO A. F.

    2001-01-01

    Cortical cerebellar basket cells are stable postmitotic cells; hence, they are liable to endure age-related changes. Since the cerebellum is a vital organ for postural control, equilibrium and motor coordination, we aimed to determine the quantitative morphological changes in these interneurons with the ageing process, using unbiased techniques. Material from the cerebellar cortex (Crus I and Crus II) was collected from female rats aged 2, 6, 9, 12, 15, 18, 21 and 24 mo (5 animals per age group), fixed by intracardiac perfusion, and processed for transmission electron microscopy, using conventional techniques. Serial semithin sections were obtained (5 blocks from each rat), enabling the determination of the number-weighted mean nuclear volume (by the nucleator method). On ultrathin sections, 25 cell profiles from each animal were photographed. The volume density of the nucleus, ground substance, mitochondria, Golgi apparatus (Golgi) and dense bodies (DB), and the mean surface density of the rough endoplasmic reticulum (RER) were determined, by point counting, using a morphometric grid. The mean total volumes of the soma and organelles and the mean total surface area of the RER [s̄N (RER)] were then calculated. The results were analysed with 1-way ANOVA; post hoc pairwise comparisons of group means were performed using the Newman-Keuls test. The relation between age and each of the parameters was studied by regression analysis. Significant age-related changes were observed for the mean volumes of the soma, ground substance, Golgi and DB, and for s̄N (RER). Positive linear trends were found for the mean volumes of the ground substance, Golgi, and DB; a negative linear trend was found for the s̄N (RER). These results indicate that rat cerebellar basket cells endure important age-related changes. The significant decrease in the s̄N (RER) may be responsible for a reduction in the rate of protein synthesis. Additionally, it may be implicated in a cascade of events

  18. DNAPL characterization using the Ribbon NAPL sampler: Methods and results

    SciTech Connect

    Riha, B.D.

    2000-04-25

    The Ribbon NAPL Sampler (RNS) is a direct sampling device that provides detailed depth discrete mapping of Non Aqueous Phase Liquids (NAPLs) in a borehole. This characterization method provides a yes or no answer to the presence of NAPLs and is used to complement and enhance other characterization techniques. Several cone penetrometer deployment methods are in use and methods for other drilling techniques are under development. The RNS has been deployed in the vadose and saturated zones at four different sites. Three of the sites contain DNAPLs from cleaning and degreasing operations and the fourth site contains creosote from a wood preserving plant. A brief description of the process history and geology is provided for each site. Where available, lithology and contaminant concentration information is provided and discussed in context with the RNS results.

  19. A quantitative method for measurement of HL-60 cell apoptosis based on diffraction imaging flow cytometry technique.

    PubMed

    Yang, Xu; Feng, Yuanming; Liu, Yahui; Zhang, Ning; Lin, Wang; Sa, Yu; Hu, Xin-Hua

    2014-07-01

    A quantitative method for measurement of apoptosis in HL-60 cells based on a polarization diffraction imaging flow cytometry technique is presented in this paper. Through a comparative study with existing methods and the analysis of diffraction images by a gray level co-occurrence matrix (GLCM) algorithm, we found that four GLCM parameters, contrast (CON), cluster shade (CLS), correlation (COR) and dissimilarity (DIS), exhibit high sensitivity to the apoptotic rate. It was further demonstrated that the CLS parameter correlates significantly (R² = 0.899) with the degree of nuclear fragmentation, and the other three parameters showed very good correlations (R² ranging from 0.69 to 0.90). These results demonstrated that the new method has the capability for rapid and accurate extraction of morphological features to quantify cellular apoptosis without the need for cell staining.
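
    Illustrative sketch of GLCM texture-feature extraction of the kind the record describes, using scikit-image (graycomatrix/graycoprops in recent versions) for contrast, correlation and dissimilarity, plus a hand-computed cluster-shade term, which graycoprops does not provide. The input image is a random placeholder, not a diffraction image.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    img = (np.random.default_rng(3).random((128, 128)) * 255).astype(np.uint8)  # placeholder image
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)

    con = graycoprops(glcm, "contrast")[0, 0]
    cor = graycoprops(glcm, "correlation")[0, 0]
    dis = graycoprops(glcm, "dissimilarity")[0, 0]

    # Cluster shade (CLS), computed directly from the normalized co-occurrence matrix
    p = glcm[:, :, 0, 0]
    i, j = np.meshgrid(np.arange(256), np.arange(256), indexing="ij")
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    cls = ((i + j - mu_i - mu_j) ** 3 * p).sum()

    print(f"CON={con:.1f}  COR={cor:.3f}  DIS={dis:.1f}  CLS={cls:.1f}")
    ```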

  20. A quantitative method for measurement of HL-60 cell apoptosis based on diffraction imaging flow cytometry technique

    PubMed Central

    Yang, Xu; Feng, Yuanming; Liu, Yahui; Zhang, Ning; Lin, Wang; Sa, Yu; Hu, Xin-Hua

    2014-01-01

    A quantitative method for measurement of apoptosis in HL-60 cells based on a polarization diffraction imaging flow cytometry technique is presented in this paper. Through a comparative study with existing methods and the analysis of diffraction images by a gray level co-occurrence matrix (GLCM) algorithm, we found that four GLCM parameters, contrast (CON), cluster shade (CLS), correlation (COR) and dissimilarity (DIS), exhibit high sensitivity to the apoptotic rate. It was further demonstrated that the CLS parameter correlates significantly (R² = 0.899) with the degree of nuclear fragmentation, and the other three parameters showed very good correlations (R² ranging from 0.69 to 0.90). These results demonstrated that the new method has the capability for rapid and accurate extraction of morphological features to quantify cellular apoptosis without the need for cell staining. PMID:25071957

  1. [THE COMPARATIVE ANALYSIS OF RESULTS OF DETECTION OF CARCINOGENIC TYPES OF HUMAN PAPILLOMA VIRUS BY QUALITATIVE AND QUANTITATIVE TESTS].

    PubMed

    Kuzmenko, E T; Labigina, A V; Leshenko, O Ya; Rusanov, D N; Kuzmenko, V V; Fedko, L P; Pak, I P

    2015-05-01

    The results of a screening programme (n = 3208; sexually active individuals aged 18 to 59 years) were analysed to detect oncogenic types of human papillomavirus using qualitative tests (1150 females and 720 males) and quantitative real-time polymerase chain reaction (843 females and 115 males). High-oncogenic-risk human papillomavirus was detected in 65% and 68.4% of females and in 48.6% and 53% of males, respectively. Among the 12 oncogenic types, human papillomavirus 16 was the most frequently diagnosed, regardless of the gender of those examined or the technique of analysis. In females, the detection rate of human papillomavirus 16 was 18.3% (n = 280) with the qualitative tests and 14.9% (n = 126; p ≤ 0.05) with the quantitative tests. In males, the detection rate of human papillomavirus 16 was 8.3% (n = 60) with the qualitative tests and 12.2% (n = 14; p ≥ 0.05) with the quantitative tests. With the qualitative tests, the detection rates of the remaining oncogenic types of human papillomavirus varied from 3.4% to 8.4% in females and from 1.8% to 5.9% in males. With the quantitative tests in females, the rate of human papillomavirus with a high viral load was 68.4%, with a medium viral load 2.85% (n = 24), and with a low viral load 0.24% (n = 2). With the quantitative tests in males, the detection rate of human papillomavirus types was 53%, and in all cases a high viral load was found. In females, most oncogenic types of human papillomavirus (except for 31, 39 and 59) were detected significantly more often than in males.

  2. A computational quantitative structure-activity relationship study of carbamate anticonvulsants using quantum pharmacological methods.

    PubMed

    Knight, J L; Weaver, D F

    1998-10-01

    A pattern recognition quantitative structure-activity relationship (QSAR) study has been performed to determine the molecular features of carbamate anticonvulsants which influence biological activity. Although carbamates, such as felbamate, have been used to treat epilepsy, their mechanisms of efficacy and toxicity are not completely understood. Quantum and classical mechanics calculations have been exploited to describe 46 carbamate drugs. Employing a principal component analysis and multiple linear regression calculations, five crucial structural descriptors were identified which directly relate to the bioactivity of the carbamate family. With the resulting mathematical model, the biological activity of carbamate analogues can be predicted with 85-90% accuracy.
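
    Generic sketch (random descriptors, not the carbamate dataset) of a principal-component-plus-multiple-linear-regression pipeline of the kind the record describes; the dimensions and names are assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(4)
    X = rng.normal(size=(46, 30))   # 46 carbamates x 30 quantum/classical descriptors (synthetic)
    y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.3, size=46)  # activity (synthetic)

    model = make_pipeline(StandardScaler(), PCA(n_components=5), LinearRegression())
    model.fit(X, y)
    print("training R^2:", round(model.score(X, y), 2))
    ```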

  3. Development of a rapid method for the quantitative determination of deoxynivalenol using Quenchbody.

    PubMed

    Yoshinari, Tomoya; Ohashi, Hiroyuki; Abe, Ryoji; Kaigome, Rena; Ohkawa, Hideo; Sugita-Konishi, Yoshiko

    2015-08-12

    Quenchbody (Q-body) is a novel fluorescent biosensor based on the antigen-dependent removal of a quenching effect on a fluorophore attached to antibody domains. In order to develop a method using Q-body for the quantitative determination of deoxynivalenol (DON), a trichothecene mycotoxin produced by some Fusarium species, anti-DON Q-body was synthesized from the sequence information of a monoclonal antibody specific to DON. When the purified anti-DON Q-body was mixed with DON, a dose-dependent increase in the fluorescence intensity was observed and the detection range was between 0.0003 and 3 mg L⁻¹. The coefficients of variation were 7.9% at 0.003 mg L⁻¹, 5.0% at 0.03 mg L⁻¹ and 13.7% at 0.3 mg L⁻¹, respectively. The limit of detection was 0.006 mg L⁻¹ for DON in wheat. The Q-body showed an antigen-dependent fluorescence enhancement even in the presence of wheat extracts. To validate the analytical method using Q-body, a spike-and-recovery experiment was performed using four spiked wheat samples. The recoveries were in the range of 94.9-100.2%. The concentrations of DON in twenty-one naturally contaminated wheat samples were quantitated by the Q-body method, LC-MS/MS and an immunochromatographic assay kit. The LC-MS/MS analysis showed that the levels of DON contamination in the samples were between 0.001 and 2.68 mg kg⁻¹. The concentrations of DON quantitated by LC-MS/MS were more strongly correlated with those using the Q-body method (R² = 0.9760) than the immunochromatographic assay kit (R² = 0.8824). These data indicate that the Q-body system for the determination of DON in wheat samples was successfully developed and Q-body is expected to have a range of applications in the field of food safety.

  4. Development and validation of a LC-MS method for quantitation of ergot alkaloids in lateral saphenous vein tissue

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A liquid chromatography-mass spectrometry (LC/MS) method for simultaneous quantitation of seven ergot alkaloids (lysergic acid, ergonovine, ergovaline, ergocornine, ergotamine, ergocryptine and ergocrystine) in vascular tissue was developed and validated. Reverse-phase chromatography, coupled to an...

  5. Targeted LC-MS/MS Method for the Quantitation of Plant Lignans and Enterolignans in Biofluids from Humans and Pigs.

    PubMed

    Nørskov, Natalja P; Olsen, Anja; Tjønneland, Anne; Bolvig, Anne Katrine; Lærke, Helle Nygaard; Knudsen, Knud Erik Bach

    2015-07-15

    Lignans have gained nutritional interest due to their promising role in the prevention of lifestyle diseases. However, epidemiological studies are in need of more evidence to link the intake of lignans to this promising role. In this context, it is necessary to study large population groups to obtain sufficient statistical power. Therefore, there is a demand for fast, sensitive, and accurate methods for quantitation with high throughput of samples. This paper presents a validated LC-MS/MS method for the quantitation of eight plant lignans (matairesinol, hydroxymatairesinol, secoisolariciresinol, lariciresinol, isolariciresinol, syringaresinol, medioresinol, and pinoresinol) and two enterolignans (enterodiol and enterolactone) in both human and pig plasma and urine. The method showed high selectivity and sensitivity allowing quantitation of lignans in the range of 0.024-100 ng/mL and with a run time of only 4.8 min per sample. The method was successfully applied to quantitate lignans in biofluids from ongoing studies with humans and pigs.

  6. Quantitative method for determining serum alkaline phosphatase isoenzyme activity II. Development and clinical application of method for measuring four serum alkaline phosphatase isoenzymes.

    PubMed Central

    Shephard, M D; Peake, M J; Walmsley, R N

    1986-01-01

    A method for quantitating the liver, bone, intestinal and placental alkaline phosphatase activity of serum, using an algorithm for converting selective inactivation by guanidine hydrochloride, L-phenylalanine, and heat into equivalent isoenzyme activity, is described. The method can individually quantify mixtures of isoenzymes to within a margin of 3%; it has acceptable reproducibility and has been used to develop both age- and sex-related reference ranges. Analysis time is about 30 minutes. The clinical reliability of this method has been shown in a study of 101 patients, in 79% of whom isoenzyme results were compatible with the final clinical diagnosis; in 10% a clinical diagnosis resulted from isoenzyme analysis, and in a further 11% the source of the increased alkaline phosphatase activity was identified and supported by electrophoresis, with a definite clinical diagnosis yet to be made. PMID:3760234
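
    Illustrative sketch (not the published algorithm or its coefficients): converting selective inactivation into isoenzyme activities amounts to solving a small linear system in which each treatment leaves a known residual fraction of each isoenzyme's activity. All numbers below are made up for illustration.

    ```python
    import numpy as np

    # Rows: untreated total, guanidine HCl, L-phenylalanine, heat.
    # Columns: liver, bone, intestinal, placental.
    # Entries: fraction of each isoenzyme's activity surviving the treatment
    # (purely illustrative values, not those of the published method).
    F = np.array([
        [1.00, 1.00, 1.00, 1.00],
        [0.70, 0.10, 0.55, 0.85],
        [0.95, 0.95, 0.25, 0.20],
        [0.35, 0.05, 0.45, 0.90],
    ])

    measured = np.array([210.0, 109.5, 163.5, 69.5])  # U/L after each treatment (illustrative)
    iso = np.linalg.solve(F, measured)
    for name, act in zip(["liver", "bone", "intestinal", "placental"], iso):
        print(f"{name}: {act:.0f} U/L")
    ```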

  7. Test Results for Entry Guidance Methods for Space Vehicles

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Jones, Robert E.

    2004-01-01

    There are a number of approaches to advanced guidance and control that have the potential for achieving the goals of significantly increasing reusable launch vehicle (or any space vehicle that enters an atmosphere) safety and reliability, and reducing the cost. This paper examines some approaches to entry guidance. An effort called Integration and Testing of Advanced Guidance and Control Technologies has recently completed a rigorous testing phase where these algorithms faced high-fidelity vehicle models and were required to perform a variety of representative tests. The algorithm developers spent substantial effort improving the algorithm performance in the testing. This paper lists the test cases used to demonstrate that the desired results are achieved, shows an automated test scoring method that greatly reduces the evaluation effort required, and displays results of the tests. Results show a significant improvement over previous guidance approaches. The two best-scoring algorithm approaches show roughly equivalent results and are ready to be applied to future vehicle concepts.

  8. Test Results for Entry Guidance Methods for Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Jones, Robert E.

    2003-01-01

    There are a number of approaches to advanced guidance and control (AG&C) that have the potential for achieving the goals of significantly increasing reusable launch vehicle (RLV) safety and reliability, and reducing the cost. This paper examines some approaches to entry guidance. An effort called Integration and Testing of Advanced Guidance and Control Technologies (ITAGCT) has recently completed a rigorous testing phase where these algorithms faced high-fidelity vehicle models and were required to perform a variety of representative tests. The algorithm developers spent substantial effort improving the algorithm performance in the testing. This paper lists the test cases used to demonstrate that the desired results are achieved, shows an automated test scoring method that greatly reduces the evaluation effort required, and displays results of the tests. Results show a significant improvement over previous guidance approaches. The two best-scoring algorithm approaches show roughly equivalent results and are ready to be applied to future reusable vehicle concepts.

  9. Quantitative GSL-glycome analysis of human whole serum based on an EGCase digestion and glycoblotting method

    PubMed Central

    Furukawa, Jun-ichi; Sakai, Shota; Yokota, Ikuko; Okada, Kazue; Hanamatsu, Hisatoshi; Kobayashi, Takashi; Yoshida, Yasunobu; Higashino, Kenichi; Tamura, Tomohiro; Igarashi, Yasuyuki; Shinohara, Yasuro

    2015-01-01

    Glycosphingolipids (GSLs) are lipid molecules linked to carbohydrate units that form the plasma membrane lipid raft, which is clustered with sphingolipids, sterols, and specific proteins, and thereby contributes to membrane physical properties and specific recognition sites for various biological events. These bioactive GSL molecules consequently affect the pathophysiology and pathogenesis of various diseases. Thus, altered expression of GSLs in various diseases may be of importance for disease-related biomarker discovery. However, analysis of GSLs in blood is particularly challenging because GSLs are present at extremely low concentrations in serum/plasma. In this study, we established absolute GSL-glycan analysis of human serum based on endoglycoceramidase digestion and glycoblotting purification. We established two sample preparation protocols, one with and the other without GSL extraction using chloroform/methanol. Similar amounts of GSL-glycans were recovered with the two protocols. Both protocols permitted absolute quantitation of GSL-glycans using as little as 20 μl of serum. Using 10 healthy human serum samples, up to 42 signals corresponding to GSL-glycan compositions could be quantitatively detected, and the total serum GSL-glycan concentration was calculated to be 12.1–21.4 μM. We further applied this method to TLC-prefractionated serum samples. These findings will assist the discovery of disease-related biomarkers by serum GSL-glycomics. PMID:26420879

  10. Simple absolute quantification method correcting for quantitative PCR efficiency variations for microbial community samples.

    PubMed

    Brankatschk, Robert; Bodenhausen, Natacha; Zeyer, Josef; Bürgmann, Helmut

    2012-06-01

    Real-time quantitative PCR (qPCR) is a widely used technique in microbial community analysis, allowing the quantification of the number of target genes in a community sample. Currently, the standard-curve (SC) method of absolute quantification is widely employed for these kinds of analysis. However, the SC method assumes that the amplification efficiency (E) is the same for both the standard and the sample target template. We analyzed 19 bacterial strains and nine environmental samples in qPCR assays, targeting the nifH and 16S rRNA genes. The E values of the qPCRs differed significantly, depending on the template. This has major implications for the quantification. If the sample and standard differ in their E values, quantification errors of up to orders of magnitude are possible. To address this problem, we propose and test the one-point calibration (OPC) method for absolute quantification. The OPC method corrects for differences in E and was derived from the ΔΔCT method with correction for E, which is commonly used for relative quantification in gene expression studies. The SC and OPC methods were compared by quantifying artificial template mixtures from Geobacter sulfurreducens (DSM 12127) and Nostoc commune (Culture Collection of Algae and Protozoa [CCAP] 1453/33), which differ in their E values. While the SC method deviated from the expected nifH gene copy number by 3- to 5-fold, the OPC method quantified the template mixtures with high accuracy. Moreover, analyzing environmental samples, we show that even small differences in E between the standard and the sample can cause significant differences between the copy numbers calculated by the SC and the OPC methods.
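
    Illustrative sketch of efficiency-corrected one-point calibration of the general kind described (the idea that both reactions hold the same amount of product at the threshold cycle); the function name, efficiencies and Ct values are assumptions, not the authors' implementation.

    ```python
    def copies_opc(ct_sample, e_sample, ct_standard, e_standard, copies_standard):
        """One-point calibration with template-specific amplification efficiencies E
        (E = 1.0 means doubling each cycle). At the threshold cycle both reactions
        contain the same amount of product, so
        N0_sample * (1 + E_sample)**Ct_sample = N0_standard * (1 + E_standard)**Ct_standard.
        Illustrative only; the values below are made up."""
        return copies_standard * (1 + e_standard) ** ct_standard / (1 + e_sample) ** ct_sample

    # Same observed Ct, but the sample template amplifies less efficiently than the standard
    print(f"{copies_opc(24.0, 0.85, 21.0, 0.95, 1e4):.2e} target copies")
    ```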

  11. Membrane chromatographic immunoassay method for rapid quantitative analysis of specific serum antibodies.

    PubMed

    Ghosh, Raja

    2006-02-05

    This paper discusses a membrane chromatographic immunoassay method for rapid detection and quantitative analysis of specific serum antibodies. A type of polyvinylidene fluoride (PVDF) microfiltration membrane was used in the method for its ability to reversibly and specifically bind IgG antibodies from antiserum samples by hydrophobic interaction. Using this form of selective antibody binding and enrichment, an affinity membrane with antigen-binding ability was obtained in situ. This was done by passing a pulse of diluted antiserum sample through a stack of microporous PVDF membranes. The affinity membrane thus formed was challenged with a pulse of antigen solution and the amount of antigen bound was accurately determined using chromatographic methods. The antigen binding correlated well with the antibody loading on the membrane. This method is direct, rapid and accurate, does not involve any chemical reaction, and uses very few reagents. Moreover, the same membrane could be repeatedly used for sequential immunoassays on account of the reversible nature of the antibody binding. Proof of concept of this method is provided using human hemoglobin as the model antigen and rabbit antiserum against human hemoglobin as the antibody source.

  12. Advances in statistical methods to map quantitative trait loci in outbred populations.

    PubMed

    Hoeschele, I; Uimari, P; Grignola, F E; Zhang, Q; Gage, K M

    1997-11-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown.
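
    Illustrative sketch (simulated sib pairs, not data from the review) of one of the "simple and computationally inexpensive" approaches mentioned: regressing squared phenotypic differences of relative pairs on estimated identity-by-descent (IBD) proportions, in the spirit of Haseman-Elston regression.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_pairs = 500
    pi_hat = rng.choice([0.0, 0.5, 1.0], size=n_pairs, p=[0.25, 0.5, 0.25])  # IBD sharing at the marker

    # Toy sib-pair model: the QTL contribution shared by a pair scales with IBD sharing
    qtl_var, env_var = 1.0, 1.0
    shared = rng.normal(scale=np.sqrt(qtl_var * pi_hat))
    y1 = shared + rng.normal(scale=np.sqrt(qtl_var * (1 - pi_hat) + env_var))
    y2 = shared + rng.normal(scale=np.sqrt(qtl_var * (1 - pi_hat) + env_var))

    # E[(y1 - y2)^2] decreases linearly with IBD sharing when a linked QTL is present
    slope, intercept = np.polyfit(pi_hat, (y1 - y2) ** 2, 1)
    print(f"slope = {slope:.2f} (a significantly negative slope suggests a linked QTL)")
    ```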

  13. Validation procedures for quantitative gluten ELISA methods: AOAC allergen community guidance and best practices.

    PubMed

    Koerner, Terry B; Abbott, Michael; Godefroy, Samuel Benrejeb; Popping, Bert; Yeung, Jupiter M; Diaz-Amigo, Carmen; Roberts, James; Taylor, Steve L; Baumert, Joseph L; Ulberth, Franz; Wehling, Paul; Koehler, Peter

    2013-01-01

    The food allergen analytical community is endeavoring to create harmonized guidelines for the validation of food allergen ELISA methodologies to help protect food-sensitive individuals and promote consumer confidence. This document provides additional guidance to existing method validation publications for quantitative food allergen ELISA methods. The gluten-specific criteria provided in this document are divided into sections covering the information required by the method developer about the assay and the information needed for the implementation of the multilaboratory validation study. Many of these recommendations and guidance are built upon the widely accepted Codex Alimentarius definitions and recommendations for gluten-free foods. The information in this document can be used as the basis of a harmonized validation protocol for any ELISA method for gluten, whether proprietary or nonproprietary, that will be submitted to AOAC and/or regulatory authorities or other bodies for status recognition. Future work is planned for the implementation of this guidance document for the validation of gluten methods and the creation of gluten reference materials.

  14. Advances in Statistical Methods to Map Quantitative Trait Loci in Outbred Populations

    PubMed Central

    Hoeschele, I.; Uimari, P.; Grignola, F. E.; Zhang, Q.; Gage, K. M.

    1997-01-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown. PMID:9383084

  15. Development of a quantitative method to monitor the effect of a tooth whitening agent.

    PubMed

    Amaechi, Bennett T; Higham, Susan M

    2002-01-01

    This study demonstrated a quantitative method for assessing the effect of a tooth whitening agent. Forty human teeth were stained with a tea solution, and randomly assigned to two groups (A, B) of twenty teeth. The teeth were subsequently treated with either sodium hypochlorite (NaOCl) or deionized distilled water (DDW) by intermittent immersion (60 seconds on each occasion) in a 1:10 dilution of NaOCl (group A) or DDW (group B). Prior to whitening and following each immersion, the color of the teeth at the stained spot was measured using the ShadeEye-Ex Dental Chroma Meter and quantitative light-induced fluorescence (QLF). The ShadeEye-Ex instantly gave a numerical value for the stain intensity, chroma (C), which is the average of three measurements taken automatically by the machine. QLF gave a quantitative value for the stain, delta Q (% mm2), following analysis of the fluorescence image of the tooth. Immersion was stopped after four readings when one specimen, in group A, was observed to have regained its natural color. There was a good correlation between C and delta Q with both NaOCl (Pearson correlation coefficient (r) = 0.974; p < 0.05) and DDW (r = 0.978; p < 0.05). With NaOCl, an inverse linear relationship was observed between exposure time and both stain measurements, C (linear fit correlation R = -0.982; p < 0.05) and delta Q (R = -0.988; p < 0.05); no such relationship was observed with DDW. ANOVA showed a significant difference between the means (n = 20) of the readings at the measurement intervals (0, 60, 120 and 180 seconds) for both C (p < 0.001) and delta Q (p < 0.001) with NaOCl but not with DDW. In conclusion, the study highlighted the potential of the ShadeEye-Ex Dental Chroma Meter as a tool for the quantitative assessment of the gradual change in shade of discolored teeth treated with tooth whitening products.

  16. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research.

  17. Characterization of a method for quantitating food consumption for mutation assays in Drosophila

    SciTech Connect

    Thompson, E.D.; Reeder, B.A.; Bruce, R.D. )

    1991-01-01

    Quantitation of food consumption is necessary when determining mutation responses to multiple chemical exposures in the sex-linked recessive lethal assay in Drosophila. One method proposed for quantitating food consumption by Drosophila is to measure the incorporation of 14C-leucine into the flies during the feeding period. Three sources of variation in the technique of Thompson and Reeder have been identified and characterized. First, the amount of food consumed by individual flies differed by almost 30% in a 24 hr feeding period. Second, the variability from vial to vial (each containing multiple flies) was around 15%. Finally, the amount of food consumed in identical feeding experiments performed over the course of 1 year varied nearly 2-fold. The use of chemical consumption values in place of exposure levels provided a better means of expressing the combined mutagenic response. In addition, the kinetics of food consumption over a 3-day feeding period for lethality-producing exposures to cyclophosphamide were compared to those for non-lethal exposures. Extensive characterization of lethality induced by exposures to cyclophosphamide demonstrates that the lethality is most likely due to starvation, not chemical toxicity.

  18. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method

    PubMed Central

    Yang, Ganglong; Xu, Zhipeng; Lu, Wei; Li, Xiang; Sun, Chengwen; Guo, Jia; Xue, Peng; Guan, Feng

    2015-01-01

    The best way to increase patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low grade nonmuscle invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography LTQ Orbitrap mass spectrometry. Among 3721 unique identified and annotated proteins in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer. PMID:26230496

  19. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  20. Smartphone based hand-held quantitative phase microscope using the transport of intensity equation method.

    PubMed

    Meng, Xin; Huang, Huachuan; Yan, Keding; Tian, Xiaolin; Yu, Wei; Cui, Haoyang; Kong, Yan; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-12-20

    In order to realize high contrast imaging with portable devices for potential mobile healthcare, we demonstrate a hand-held smartphone based quantitative phase microscope using the transport of intensity equation method. With a cost-effective illumination source and compact microscope system, multi-focal images of samples can be captured by the smartphone's camera via manual focusing. Phase retrieval is performed using a self-developed Android application, which calculates sample phases from multi-plane intensities via solving the Poisson equation. We test the portable microscope using a random phase plate with known phases, and to further demonstrate its performance, a red blood cell smear, a Pap smear and monocot root and broad bean epidermis sections are also successfully imaged. Considering its advantages as an accurate, high-contrast, cost-effective and field-portable device, the smartphone based hand-held quantitative phase microscope is a promising tool which can be adopted in the future in remote healthcare and medical diagnosis.
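
    Under the common simplifying assumption of a nearly uniform in-focus intensity, the transport of intensity equation reduces to a Poisson equation for the phase that can be solved with FFTs. The sketch below (Python/NumPy) illustrates that reconstruction step only; it is not the authors' Android implementation, and the parameter values in the commented call are placeholders.

      import numpy as np

      def tie_phase(i_minus, i_plus, i_focus, dz, wavelength, pixel_size, eps=1e-9):
          # Transport of intensity equation with uniform-intensity approximation:
          # laplacian(phi) = -(k / I0) * dI/dz, solved via an FFT Poisson solver
          # (periodic boundaries assumed).
          k = 2.0 * np.pi / wavelength
          didz = (i_plus - i_minus) / (2.0 * dz)
          rhs = -k * didz / np.mean(i_focus)
          ny, nx = rhs.shape
          kx, ky = np.meshgrid(2 * np.pi * np.fft.fftfreq(nx, d=pixel_size),
                               2 * np.pi * np.fft.fftfreq(ny, d=pixel_size))
          lap = -(kx ** 2 + ky ** 2)
          lap[0, 0] = -eps                      # avoid division by zero at DC
          phi = np.real(np.fft.ifft2(np.fft.fft2(rhs) / lap))
          return phi - phi.mean()               # phase is defined up to a constant

      # Hypothetical call with three captured planes (lengths in metres):
      # phase = tie_phase(i_under, i_over, i_focus, dz=2e-6,
      #                   wavelength=532e-9, pixel_size=1.4e-6)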

  1. A novel image-based quantitative method for the characterization of NETosis

    PubMed Central

    Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.

    2015-01-01

    NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex-vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624

  2. Qualitative and quantitative characterization of protein-phosphoinositide interactions with liposome-based methods.

    PubMed

    Busse, Ricarda A; Scacioc, Andreea; Hernandez, Javier M; Krick, Roswitha; Stephan, Milena; Janshoff, Andreas; Thumm, Michael; Kühnel, Karin

    2013-05-01

    We characterized phosphoinositide binding of the S. cerevisiae PROPPIN Hsv2 qualitatively with density flotation assays and quantitatively through isothermal titration calorimetry (ITC) measurements using liposomes. We discuss the design of these experiments and show with liposome flotation assays that Hsv2 binds with high specificity to both PtdIns3P and PtdIns(3,5)P2. We propose liposome flotation assays as a more accurate alternative to the commonly used PIP strips for the characterization of phosphoinositide-binding specificities of proteins. We further quantitatively characterized PtdIns3P binding of Hsv2 with ITC measurements and determined a dissociation constant of 0.67 µM and a stoichiometry of 2:1 for PtdIns3P binding to Hsv2. PtdIns3P is crucial for the biogenesis of autophagosomes and their precursors. Besides the PROPPINs there are other PtdIns3P binding proteins with a link to autophagy, which include the FYVE-domain containing proteins ZFYVE1/DFCP1 and WDFY3/ALFY and the PX-domain containing proteins Atg20 and Snx4/Atg24. The methods described could be useful tools for the characterization of these and other phosphoinositide-binding proteins.

  3. A quantitative method for zoning of protected areas and its spatial ecological implications.

    PubMed

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

    Zoning is a key prescriptive tool for administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and we evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. Main innovations involve a quadratic function of distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e. a solution can only be obtained in non-polynomial time [Nemhausser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. For that purpose, we applied a simulated annealing heuristic implemented as a FORTRAN language routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; non-reciprocal land use weightings contributed to better representing management decisions, and influenced mainly the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many zonation scheme alternatives with minimum cost, time and effort. This ability provides a new tool to
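
    The simulated annealing search used for the zoning problem can be sketched in a few lines. The toy below assigns three zone types to a small grid and penalizes incompatible adjacencies through a non-symmetric weight matrix; the weights, grid size and cooling schedule are arbitrary illustrations and omit the quadratic distance term and connectivity constraint of the authors' model.

      import math, random

      random.seed(1)
      N = 10                       # toy N x N grid of land units
      # Non-reciprocal adjacency penalties between zone types 0 (core),
      # 1 (buffer) and 2 (intensive use).
      W = [[0.0, 1.0, 4.0],
           [0.5, 0.0, 1.0],
           [3.0, 1.0, 0.0]]

      def cost(z):
          # Full recomputation for clarity (a real implementation would update
          # the cost incrementally).
          c = 0.0
          for i in range(N):
              for j in range(N):
                  if i + 1 < N:
                      c += W[z[i][j]][z[i + 1][j]] + W[z[i + 1][j]][z[i][j]]
                  if j + 1 < N:
                      c += W[z[i][j]][z[i][j + 1]] + W[z[i][j + 1]][z[i][j]]
          return c

      z = [[random.randrange(3) for _ in range(N)] for _ in range(N)]
      best_cost, t = cost(z), 5.0
      for _ in range(20000):
          i, j, new = random.randrange(N), random.randrange(N), random.randrange(3)
          old = z[i][j]
          c0 = cost(z)
          z[i][j] = new
          c1 = cost(z)
          if c1 <= c0 or random.random() < math.exp((c0 - c1) / t):
              best_cost = min(best_cost, c1)    # accept the move
          else:
              z[i][j] = old                     # reject and restore
          t *= 0.9997                           # geometric cooling
      print("best cost found:", best_cost)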

  4. Multiple Frequency Contrast Source Inversion Method for Vertical Electromagnetic Profiling: 2D Simulation Results and Analyses

    NASA Astrophysics Data System (ADS)

    Li, Jinghe; Song, Linping; Liu, Qing Huo

    2016-02-01

    A simultaneous multiple frequency contrast source inversion (CSI) method is applied to reconstructing hydrocarbon reservoir targets in a complex multilayered medium in two dimensions. It simulates the effects of a salt dome sedimentary formation in the context of reservoir monitoring. In this method, the stabilized biconjugate-gradient fast Fourier transform (BCGS-FFT) algorithm is applied as a fast solver for the 2D volume integral equation for the forward computation. The inversion technique with CSI combines the efficient FFT algorithm to speed up the matrix-vector multiplication and the stable convergence of the simultaneous multiple frequency CSI in the iteration process. As a result, this method is capable of making quantitative conductivity image reconstruction effectively for large-scale electromagnetic oil exploration problems, including the vertical electromagnetic profiling (VEP) survey investigated here. A number of numerical examples have been demonstrated to validate the effectiveness and capacity of the simultaneous multiple frequency CSI method for a limited array view in VEP.
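
    The computational point emphasized above is that the integral operator acts as a convolution, so the matrix-vector product inside the Krylov iteration can be evaluated with FFTs rather than a dense matrix. The sketch below (Python/SciPy) illustrates only that idea: the kernel, contrast and incident field are arbitrary stand-ins rather than a physical Green's function, and only the forward domain equation is shown, not the full CSI inversion.

      import numpy as np
      from scipy.sparse.linalg import LinearOperator, bicgstab

      n = 64
      chi = np.zeros((n, n), dtype=complex)
      chi[24:40, 24:40] = 0.5 + 0.1j                            # toy contrast object
      kernel = 0.002 * np.random.default_rng(0).normal(size=(n, n))  # stand-in kernel
      G_hat = np.fft.fft2(kernel)
      e_inc = np.ones((n, n), dtype=complex)                    # toy incident field

      def apply_A(x):
          # Contrast-source domain equation w - chi * (G * w) = chi * e_inc,
          # with the convolution G * w evaluated through FFTs.
          w = x.reshape(n, n)
          conv = np.fft.ifft2(G_hat * np.fft.fft2(w))
          return (w - chi * conv).ravel()

      A = LinearOperator((n * n, n * n), matvec=apply_A, dtype=complex)
      w, info = bicgstab(A, (chi * e_inc).ravel())
      print("bicgstab info:", info)                             # 0 indicates convergence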

  5. Supersonic cruise research aircraft structural studies: Methods and results

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Gross, D.; Kurtze, W.; Newsom, J.; Wrenn, G.; Greene, W.

    1981-01-01

    NASA Langley Research Center SCAR in-house structural studies are reviewed. In methods development, advances include a new system of integrated computer programs called ISSYS, progress in determining aerodynamic loads and aerodynamically induced structural loads (including those due to gusts), flutter optimization for composite and metal airframe configurations using refined and simplified mathematical models, and synthesis of active controls. Results given address several aspects of various SCR configurations. These results include flutter penalties on composite wing, flutter suppression using active controls, roll control effectiveness, wing tip ground clearance, tail size effect on flutter, engine weight and mass distribution influence on flutter, and strength and flutter optimization of new configurations. The ISSYS system of integrated programs performed well in all the applications illustrated by the results, the diversity of which attests to ISSYS' versatility.

  6. A processing method and results of meteor shower radar observations

    NASA Technical Reports Server (NTRS)

    Belkovich, O. I.; Suleimanov, N. I.; Tokhtasjev, V. S.

    1987-01-01

    Studies of meteor showers permit the solving of some principal problems of meteor astronomy: to obtain the structure of a stream in cross section and along its orbits; to retrace the evolution of particle orbits of the stream taking into account gravitational and nongravitational forces and to discover the orbital elements of its parent body; to find out the total mass of solid particles ejected from the parent body taking into account physical and chemical evolution of meteor bodies; and to use meteor streams as natural probes for investigation of the average characteristics of the meteor complex in the solar system. A simple and effective method of determining the flux density and mass exponent parameter was worked out. This method and its results are discussed.

  7. Methane generation in tropical landfills: simplified methods and field results.

    PubMed

    Machado, Sandro L; Carvalho, Miriam F; Gourc, Jean-Pierre; Vilar, Orencio M; do Nascimento, Julio C F

    2009-01-01

    This paper deals with the use of simplified methods to predict methane generation in tropical landfills. Methane recovery data obtained on site as part of a research program being carried out at the Metropolitan Landfill, Salvador, Brazil, is analyzed and used to obtain field methane generation over time. Laboratory data from MSW samples of different ages are presented and discussed; and simplified procedures to estimate the methane generation potential, Lo, and the constant related to the biodegradation rate, k are applied. The first order decay method is used to fit field and laboratory results. It is demonstrated that despite the assumptions and the simplicity of the adopted laboratory procedures, the values Lo and k obtained are very close to those measured in the field, thus making this kind of analysis very attractive for first approach purposes.
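
    The first-order decay fit described above can be reproduced in outline with a cumulative-yield form of the model, G(t) = Lo·(1 − exp(−k·t)). The sketch below uses invented data points and starting values, not the Metropolitan Landfill measurements.

      import numpy as np
      from scipy.optimize import curve_fit

      def first_order_yield(t, L0, k):
          # Cumulative methane yield per Mg of waste after t years.
          return L0 * (1.0 - np.exp(-k * t))

      t_obs = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0])        # years
      g_obs = np.array([12.0, 22.0, 39.0, 62.0, 75.0, 83.0, 88.0])  # m3 CH4 / Mg

      (L0_fit, k_fit), _ = curve_fit(first_order_yield, t_obs, g_obs, p0=(100.0, 0.1))
      print(f"Lo = {L0_fit:.1f} m3 CH4/Mg, k = {k_fit:.3f} 1/yr")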

  8. Setting health research priorities using the CHNRI method: V. Quantitative properties of human collective knowledge

    PubMed Central

    Rudan, Igor; Yoshida, Sachiyo; Wazny, Kerri; Chan, Kit Yee; Cousens, Simon

    2016-01-01

    Introduction The CHNRI method for setting health research priorities has crowdsourcing as the major component. It uses the collective opinion of a group of experts to generate, assess and prioritize between many competing health research ideas. It is difficult to compare the accuracy of human individual and collective opinions in predicting uncertain future outcomes before the outcomes are known. However, this limitation does not apply to existing knowledge, which is an important component underlying opinion. In this paper, we report several experiments to explore the quantitative properties of human collective knowledge and discuss their relevance to the CHNRI method. Methods We conducted a series of experiments in groups of about 160 (range: 122–175) undergraduate Year 2 medical students to compare their collective knowledge to their individual knowledge. We asked them to answer 10 questions on each of the following: (i) an area in which they have a degree of expertise (undergraduate Year 1 medical curriculum); (ii) an area in which they likely have some knowledge (general knowledge); and (iii) an area in which they are not expected to have any knowledge (astronomy). We also presented them with 20 pairs of well–known celebrities and asked them to identify the older person of the pair. In all these experiments our goal was to examine how the collective answer compares to the distribution of students’ individual answers. Results When answering the questions in their own area of expertise, the collective answer (the median) was in the top 20.83% of the most accurate individual responses; in general knowledge, it was in the top 11.93%; and in an area with no expertise, the group answer was in the top 7.02%. However, the collective answer based on mean values fared much worse, ranging from top 75.60% to top 95.91%. Also, when confronted with guessing the older of the two celebrities, the collective response was correct in 18/20 cases (90%), while the 8 most

  9. A simple, quantitative method using alginate gel to determine rat colonic tumor volume in vivo.

    PubMed

    Irving, Amy A; Young, Lindsay B; Pleiman, Jennifer K; Konrath, Michael J; Marzella, Blake; Nonte, Michael; Cacciatore, Justin; Ford, Madeline R; Clipson, Linda; Amos-Landgraf, James M; Dove, William F

    2014-04-01

    Many studies of the response of colonic tumors to therapeutics use tumor multiplicity as the endpoint to determine the effectiveness of the agent. These studies can be greatly enhanced by accurate measurements of tumor volume. Here we present a quantitative method to easily and accurately determine colonic tumor volume. This approach uses a biocompatible alginate to create a negative mold of a tumor-bearing colon; this mold is then used to make positive casts of dental stone that replicate the shape of each original tumor. The weight of the dental stone cast correlates highly with the weight of the dissected tumors. After refinement of the technique, overall error in tumor volume was 16.9% ± 7.9% and includes error from both the alginate and dental stone procedures. Because this technique is limited to molding of tumors in the colon, we utilized the Apc(Pirc/+) rat, which has a propensity for developing colonic tumors that reflect the location of the majority of human intestinal tumors. We have successfully used the described method to determine tumor volumes ranging from 4 to 196 mm³. Alginate molding combined with dental stone casting is a facile method for determining tumor volume in vivo without costly equipment or knowledge of analytic software. This broadly accessible method creates the opportunity to objectively study colonic tumors over time in living animals in conjunction with other experiments and without transferring animals from the facility where they are maintained.

  10. A Simple, Quantitative Method Using Alginate Gel to Determine Rat Colonic Tumor Volume In Vivo

    PubMed Central

    Irving, Amy A; Young, Lindsay B; Pleiman, Jennifer K; Konrath, Michael J; Marzella, Blake; Nonte, Michael; Cacciatore, Justin; Ford, Madeline R; Clipson, Linda; Amos-Landgraf, James M; Dove, William F

    2014-01-01

    Many studies of the response of colonic tumors to therapeutics use tumor multiplicity as the endpoint to determine the effectiveness of the agent. These studies can be greatly enhanced by accurate measurements of tumor volume. Here we present a quantitative method to easily and accurately determine colonic tumor volume. This approach uses a biocompatible alginate to create a negative mold of a tumor-bearing colon; this mold is then used to make positive casts of dental stone that replicate the shape of each original tumor. The weight of the dental stone cast correlates highly with the weight of the dissected tumors. After refinement of the technique, overall error in tumor volume was 16.9% ± 7.9% and includes error from both the alginate and dental stone procedures. Because this technique is limited to molding of tumors in the colon, we utilized the ApcPirc/+ rat, which has a propensity for developing colonic tumors that reflect the location of the majority of human intestinal tumors. We have successfully used the described method to determine tumor volumes ranging from 4 to 196 mm3. Alginate molding combined with dental stone casting is a facile method for determining tumor volume in vivo without costly equipment or knowledge of analytic software. This broadly accessible method creates the opportunity to objectively study colonic tumors over time in living animals in conjunction with other experiments and without transferring animals from the facility where they are maintained. PMID:24674588

  11. Spectrophotometric Method for Quantitative Determination of Cefixime in Bulk and Pharmaceutical Preparation Using Ferroin Complex

    NASA Astrophysics Data System (ADS)

    Naeem Khan, M.; Qayum, A.; Ur Rehman, U.; Gulab, H.; Idrees, M.

    2015-09-01

    A method was developed for the quantitative determination of cefixime in bulk and pharmaceutical preparations using ferroin complex. The method is based on the oxidation of the cefixime with Fe(III) in acidic medium. The formed Fe(II) reacts with 1,10-phenanthroline, and the ferroin complex is measured spectrophotometrically at 510 nm against reagent blank. Beer's law was obeyed in the concentration range 0.2-10 μg/ml with a good correlation of 0.993. The molar absorptivity was calculated and was found to be 1.375 × 10⁵ L/(mol·cm). The limit of detection (LOD) and limit of quantification (LOQ) were found to be 0.030 and 0.101 μg/ml respectively. The proposed method has reproducibility with a relative standard deviation of 5.28% (n = 6). The developed method was validated statistically by performing a recoveries study and successfully applied for the determination of cefixime in bulk powder and pharmaceutical formulations without interferences from common excipients. Percent recoveries were found to range from 98.00 to 102.05% for the pure form and 97.83 to 102.50% for pharmaceutical preparations.
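
    The calibration arithmetic behind figures of this kind (the Beer's-law slope, and LOD/LOQ from the residual scatter of the calibration line) can be reproduced with a short script. The sketch below uses invented absorbance readings and the common 3.3σ/slope and 10σ/slope estimates, which may differ in detail from the procedure used in the paper.

      import numpy as np

      conc = np.array([0.2, 0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0])        # ug/mL
      absorbance = np.array([0.012, 0.031, 0.060, 0.121, 0.239,
                             0.362, 0.479, 0.601])                      # at 510 nm

      slope, intercept = np.polyfit(conc, absorbance, 1)
      s_y = (absorbance - (slope * conc + intercept)).std(ddof=2)       # regression SD

      lod = 3.3 * s_y / slope
      loq = 10.0 * s_y / slope
      unknown = (0.300 - intercept) / slope                             # back-calculation
      print(f"LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL, unknown = {unknown:.2f} ug/mL")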

  12. Polymorphism in nimodipine raw materials: development and validation of a quantitative method through differential scanning calorimetry.

    PubMed

    Riekes, Manoela Klüppel; Pereira, Rafael Nicolay; Rauber, Gabriela Schneider; Cuffini, Silvia Lucia; de Campos, Carlos Eduardo Maduro; Silva, Marcos Antonio Segatto; Stulzer, Hellen Karine

    2012-11-01

    Due to the physical-chemical and therapeutic impacts of polymorphism, its monitoring in raw materials is necessary. The purpose of this study was to develop and validate a quantitative method to determine the polymorphic content of nimodipine (NMP) raw materials based on differential scanning calorimetry (DSC). The polymorphs required for the development of the method were characterized through DSC, X-ray powder diffraction (XRPD) and Raman spectroscopy and their polymorphic identity was confirmed. The developed method was found to be linear, robust, precise, accurate and specific. Three different samples obtained from distinct suppliers (NMP 1, NMP 2 and NMP 3) were firstly characterized through XRPD and DSC as polymorphic mixtures. The determination of their polymorphic identity revealed that all samples presented the Modification I (Mod I) or metastable form in greatest proportion. Since the commercial polymorph is Mod I, the polymorphic characteristic of the samples analyzed needs to be investigated. Thus, the proposed method provides a useful tool for the monitoring of the polymorphic content of NMP raw materials.

  13. Quantitative radiochemical method for determination of major sources of natural radioactivity in ores and minerals

    USGS Publications Warehouse

    Rosholt, J.N.

    1954-01-01

    When an ore sample contains radioactivity other than that attributable to the uranium series in equilibrium, a quantitative analysis of the other emitters must be made in order to determine the source of this activity. Thorium-232, radon-222, and lead-210 have been determined by isolation and subsequent activity analysis of some of their short-lived daughter products. The sulfides of bismuth and polonium are precipitated out of solutions of thorium or uranium ores, and the α-particle activity of polonium-214, polonium-212, and polonium-210 is determined by scintillation-counting techniques. Polonium-214 activity is used to determine radon-222, polonium-212 activity for thorium-232, and polonium-210 for lead-210. The development of these methods of radiochemical analysis will facilitate the rapid determination of some of the major sources of natural radioactivity.

  14. Rapid and Inexpensive Screening of Genomic Copy Number Variations Using a Novel Quantitative Fluorescent PCR Method

    PubMed Central

    Han, Joan C.; Elsea, Sarah H.; Pena, Heloísa B.; Pena, Sérgio Danilo Junho

    2013-01-01

    Detection of human microdeletion and microduplication syndromes poses a significant burden on public healthcare systems in developing countries. With genome-wide diagnostic assays frequently inaccessible, targeted low-cost PCR-based approaches are preferred. However, their reproducibility depends on equally efficient amplification using a number of target and control primers. To address this, the recently described technique called Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR) was shown to reliably detect four human syndromes by quantifying DNA amplification in an internally controlled PCR reaction. Here, we confirm its utility in the detection of eight human microdeletion syndromes, including the more common WAGR, Smith-Magenis, and Potocki-Lupski syndromes, with 100% sensitivity and 100% specificity. We present the selection, design, and performance evaluation of detection primers using a variety of approaches. We conclude that MQF-PCR is an easily adaptable method for the detection of human pathological chromosomal aberrations. PMID:24288428

  15. A method to optimize selection on multiple identified quantitative trait loci

    PubMed Central

    Chakraborty, Reena; Moreau, Laurence; Dekkers, Jack CM

    2002-01-01

    A mathematical approach was developed to model and optimize selection on multiple known quantitative trait loci (QTL) and polygenic estimated breeding values in order to maximize a weighted sum of responses to selection over multiple generations. The model allows for linkage between QTL with multiple alleles and arbitrary genetic effects, including dominance, epistasis, and gametic imprinting. Gametic phase disequilibrium between the QTL and between the QTL and polygenes is modeled but polygenic variance is assumed constant. Breeding programs with discrete generations, differential selection of males and females and random mating of selected parents are modeled. Polygenic EBV obtained from best linear unbiased prediction models can be accommodated. The problem was formulated as a multiple-stage optimal control problem and an iterative approach was developed for its solution. The method can be used to develop and evaluate optimal strategies for selection on multiple QTL for a wide range of situations and genetic models. PMID:12081805

  16. A quantitative and qualitative method to control chemotherapeutic preparations by Fourier transform infrared-ultraviolet spectrophotometry.

    PubMed

    Dziopa, Florian; Galy, Guillaume; Bauler, Stephanie; Vincent, Benoit; Crochon, Sarah; Tall, Mamadou Lamine; Pirot, Fabrice; Pivot, Christine

    2013-06-01

    The preparation of chemotherapy products in hospitals includes a reconstitution step in which manufactured drugs are adjusted to provide a dosage adapted to each patient. The administration of highly iatrogenic drugs raises the question of patients' safety and treatment efficiency. In order to reduce administration errors due to faulty preparations, we introduced a new qualitative and quantitative routine control based on Fourier Transform Infrared (FTIR) and UV-Visible spectrophotometry. This automated method enabled fast and specific control for 14 anticancer drugs. A 1.2 mL sample was used to assay and identify each preparation in less than 90 sec. Over a two-year period, 9370 controlled infusion bags showed a 1.49% nonconformity rate against acceptance criteria of a 15% tolerance from the theoretical concentration and a minimum identification matching factor of 96%. This study evaluated the reliability of the control process, as well as its accordance with chemotherapy deliverance requirements. Thus, corrective measures were defined to improve the control process.

  17. Quantitative test method for evaluation of anti-fingerprint property of coated surfaces

    NASA Astrophysics Data System (ADS)

    Wu, Linda Y. L.; Ngian, S. K.; Chen, Z.; Xuan, D. T. T.

    2011-01-01

    An artificial fingerprint liquid is formulated from artificial sweat, hydroxyl-terminated polydimethylsiloxane and a solvent for direct determination of anti-fingerprint property of a coated surface. A range of smooth and rough surfaces with different anti-fingerprint (AF) properties were fabricated by sol-gel technology, on which the AF liquid contact angles, artificial fingerprint and real human fingerprints (HF) were verified and correlated. It is proved that a surface with AF contact angle above 87° is fingerprint free. This provides an objective and quantitative test method to determine anti-fingerprint property of coated surfaces. It is also concluded that AF property can be achieved on smooth and optically clear surfaces. Deep porous structures are more favorable than bumpy structure for oleophobic and AF properties.

  18. Risk assessment of false-positive quantitative real-time PCR results in food, due to detection of DNA originating from dead cells.

    PubMed

    Wolffs, Petra; Norling, Börje; Rådström, Peter

    2005-03-01

    Real-time PCR technology is increasingly used for detection and quantification of pathogens in food samples. A main disadvantage of nucleic acid detection is the inability to distinguish between signals originating from viable cells and DNA released from dead cells. In order to gain knowledge concerning risks of false-positive results due to detection of DNA originating from dead cells, quantitative PCR (qPCR) was used to investigate the degradation kinetics of free DNA in four types of meat samples. Results showed that the fastest degradation rate was observed (1 log unit per 0.5 h) in chicken homogenate, whereas the slowest rate was observed in pork rinse (1 log unit per 120.5 h). Overall results indicated that degradation occurred faster in chicken samples than in pork samples and faster at higher temperatures. Based on these results, it was concluded that, especially in pork samples, there is a risk of false-positive PCR results. This was confirmed in a quantitative study on cell death and signal persistence over a period of 28 days, employing three different methods, i.e. viable counts, direct qPCR, and finally floatation, a recently developed discontinuous density centrifugation method, followed by qPCR. Results showed that direct qPCR resulted in an overestimation of up to 10 times of the amount of cells in the samples compared to viable counts, due to detection of DNA from dead cells. However, after using floatation prior to qPCR, results resembled the viable count data. This indicates that by using of floatation as a sample treatment step prior to qPCR, the risk of false-positive PCR results due to detection of dead cells, can be minimized.

  19. Verification methods: Rigorous results using floating-point arithmetic

    NASA Astrophysics Data System (ADS)

    Rump, Siegfried M.

    A classical mathematical proof is constructed using pencil and paper. However, there are many ways in which computers may be used in a mathematical proof. But "proof by computer", or even the use of computers in the course of a proof, is not so readily accepted (the December 2008 issue of the Notices of the American Mathematical Society is devoted to formal proofs by computer). In the following we introduce verification methods and discuss how they can assist in achieving a mathematically rigorous result. In particular we emphasize how floating-point arithmetic is used.
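
    A central ingredient of such verification methods is interval arithmetic with outward rounding, so that every computed enclosure is mathematically guaranteed to contain the exact real result. The toy Python class below stands in for that idea by widening each computed bound to the adjacent floating-point number (real verification libraries use hardware directed rounding); only addition and multiplication are shown.

      import math   # math.nextafter requires Python 3.9+

      class Interval:
          def __init__(self, lo, hi):
              self.lo, self.hi = lo, hi

          def __add__(self, other):
              return Interval(math.nextafter(self.lo + other.lo, -math.inf),
                              math.nextafter(self.hi + other.hi, math.inf))

          def __mul__(self, other):
              p = [self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi]
              return Interval(math.nextafter(min(p), -math.inf),
                              math.nextafter(max(p), math.inf))

          def __repr__(self):
              return f"[{self.lo!r}, {self.hi!r}]"

      def enclose(x):
          # One-ulp-wide interval around a floating-point number.
          return Interval(math.nextafter(x, -math.inf), math.nextafter(x, math.inf))

      # 0.1 and 0.2 are not exactly representable; the printed interval is still
      # guaranteed to contain the exact real value of 0.1 * 3 + 0.2.
      print(enclose(0.1) * Interval(3.0, 3.0) + enclose(0.2))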

  20. A convenient method for the quantitative determination of elemental sulfur in coal by HPLC analysis of perchloroethylene extracts

    USGS Publications Warehouse

    Buchanan, D.H.; Coombs, K.J.; Murphy, P.M.; Chaven, C.

    1993-01-01

    A convenient method for the quantitative determination of elemental sulfur in coal is described. Elemental sulfur is extracted from the coal with hot perchloroethylene (PCE) (tetrachloroethene, C2Cl4) and quantitatively determined by HPLC analysis on a C18 reverse-phase column using UV detection. Calibration solutions were prepared from sublimed sulfur. Results of quantitative HPLC analyses agreed with those of a chemical/spectroscopic analysis. The HPLC method was found to be linear over the concentration range of 6 × 10⁻⁴ to 2 × 10⁻² g/L. The lower detection limit was 4 × 10⁻⁴ g/L, which for a coal sample of 20 g is equivalent to 0.0006% by weight of coal. Since elemental sulfur is known to react slowly with hydrocarbons at the temperature of boiling PCE, standard solutions of sulfur in PCE were heated with coals from the Argonne Premium Coal Sample program. Pseudo-first-order uptake of sulfur by the coals was observed over several weeks of heating. For the Illinois No. 6 premium coal, the rate constant for sulfur uptake was 9.7 × 10⁻⁷ s⁻¹, too small for retrograde reactions between solubilized sulfur and coal to cause a significant loss in elemental sulfur isolated during the analytical extraction. No elemental sulfur was produced when the following pure compounds were heated to reflux in PCE for up to 1 week: benzyl sulfide, octyl sulfide, thiane, thiophene, benzothiophene, dibenzothiophene, sulfuric acid, or ferrous sulfate. A slurry of mineral pyrite in PCE contained elemental sulfur which increased in concentration with heating time. © 1993 American Chemical Society.

  1. Quantitative prediction of in vivo profiles of CYP3A4 induction in humans from in vitro results with a reporter gene assay.

    PubMed

    Kozawa, Masanari; Honma, Masashi; Suzuki, Hiroshi

    2009-06-01

    Although primary human hepatocytes are commonly used for induction studies, the evaluation method is associated with several problems. More recently, a reporter gene assay has been suggested to be an alternative, although the contribution of only transfected nuclear receptors can be evaluated. The aim of the present study was to establish a method by which the extent of in vivo CYP3A4 induction in humans can be quantitatively predicted based on in vitro results with a reporter gene assay. From previous reports, we calculated in vivo induction ratios (R(in vivo)) caused by prototypical inducers based on the alterations in the hepatic intrinsic clearance of probe drugs. Next, we derived equations by which these R(in vivo) values can be predicted from the results of a reporter gene assay. To use the data obtained from a reporter gene assay, rifampicin was used as a reference drug. The correction coefficient (CC), which is used to quantitatively correlate the activity of inducers between in vitro and in vivo situations, was calculated by comparing the predicted data with the observed R(in vivo) values for rifampicin. With the calculated CC value, good correlations were found between the predicted and observed R(in vivo) values for other inducers such as phenobarbital, phenytoin, and omeprazole. Taken together, with the equations derived in the present study, we have been able to predict the extent of in vivo induction of human CYP3A4 by inducers in a time-dependent and quantitative manner from in vitro data.

  2. A Practical Method of Monitoring the Results of Health Care

    PubMed Central

    Daugharty, G. D.

    1979-01-01

    To meet our goal of improving health care through more productive use of the data we are collecting about the delivery of health care we need to define our concepts of health and quality. The WHO definition of health allows the design of useful functional outcome criteria which give us measurable standards for the outcome of the health care. By recording, retrieving, and reviewing pertinent information from the structure and the process of health care for a valid comparison with its outcome, the most effective and efficient health care is identified. A practical system is presented which identifies the better methods of management and produces the motivation for change that results in improved care. The successful use of this system in a private practice supports its universal adaptability for health care providers. The initial encouraging results suggest that future trials in other types of practices will be even more encouraging.

  3. A quantitative real-time PCR method for monitoring Clostridium botulinum type A in rice samples.

    PubMed

    Takahashi, Hajime; Takakura, Chikako; Kimura, Bon

    2010-04-01

    A quantitative real-time PCR using SYBR Green dye was developed to target the neurotoxin type A (boNT/A) gene of Clostridium botulinum type A. Primer specificity was confirmed by analyzing 63 strains including 5 strains of C. botulinum type A and 11 of non-type A C. botulinum. The highly similar amplification efficiencies of the real-time PCR assay were observed for 5 strains of C. botulinum type A. The DNA extraction with NucliSENS miniMAG provided sufficient performance to obtain the purified DNA from steamed rice samples and to develop the standard curve for the enumeration of C. botulinum in steamed rice samples. The real-time PCR assay could detect 10 cells per milliliter of 10 x rice homogenate, thus indicating that more than 100 C. botulinum cells per g of rice sample was quantifiable by the real-time PCR assay. The inoculation of aseptic rice samples with low numbers of C. botulinum type A cells revealed that the fate of inoculated C. botulinum type A cells in rice samples could be monitored accurately by the real-time PCR assay. These results indicate that the real-time PCR assay developed in this study provides rapid, effective, and quantitative monitoring of C. botulinum in steamed rice samples.

  4. Transconvolution and the virtual positron emission tomograph - A new method for cross calibration in quantitative PET/CT imaging

    SciTech Connect

    Prenosil, George A.; Weitzel, Thilo; Hentschel, Michael; Klaeser, Bernd; Krause, Thomas

    2013-06-15

    Purpose: Positron emission tomography (PET)/computed tomography (CT) measurements on small lesions are impaired by the partial volume effect, which is intrinsically tied to the point spread function of the actual imaging system, including the reconstruction algorithms. The variability resulting from different point spread functions hinders the assessment of quantitative measurements in clinical routine and especially degrades comparability within multicenter trials. To improve quantitative comparability there is a need for methods to match different PET/CT systems through elimination of this systemic variability. Consequently, a new method was developed and tested that transforms the image of an object as produced by one tomograph to another image of the same object as it would have been seen by a different tomograph. The proposed new method, termed Transconvolution, compensates for differing imaging properties of different tomographs and particularly aims at quantitative comparability of PET/CT in the context of multicenter trials. Methods: To solve the problem of image normalization, the theory of Transconvolution was mathematically established together with new methods to handle point spread functions of different PET/CT systems. Knowing the point spread functions of two different imaging systems allows determining a Transconvolution function to convert one image into the other. This function is calculated by convolving one point spread function with the inverse of the other point spread function which, when adhering to certain boundary conditions such as the use of linear acquisition and image reconstruction methods, is a numerically accessible operation. For reliable measurement of such point spread functions characterizing different PET/CT systems, a dedicated solid-state phantom incorporating ⁶⁸Ge/⁶⁸Ga filled spheres was developed. To iteratively determine and represent such point spread functions, exponential density functions in combination
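
    In the Fourier domain, the step described above (convolving one point spread function with the inverse of the other) amounts to a regularized division of transfer functions. The sketch below is a simplified illustration assuming shift-invariant PSFs on a common grid, with Gaussian PSFs standing in for two scanners; it is not the authors' phantom-based procedure, and the regularization constant is an arbitrary choice.

      import numpy as np

      def transconvolve(image_a, psf_a, psf_b, eps=1e-3):
          # Map an image measured on system A to the image system B would have
          # produced: apply psf_b combined with the regularised inverse of psf_a.
          # Only sensible when psf_b is broader than psf_a.
          A = np.fft.fft2(np.fft.ifftshift(psf_a))
          B = np.fft.fft2(np.fft.ifftshift(psf_b))
          H = B * np.conj(A) / (np.abs(A) ** 2 + eps)      # regularised B / A
          return np.real(np.fft.ifft2(np.fft.fft2(image_a) * H))

      def gaussian_psf(n, fwhm_px):
          y, x = np.meshgrid(np.arange(n) - n // 2, np.arange(n) - n // 2, indexing="ij")
          sigma = fwhm_px / 2.355
          psf = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
          return psf / psf.sum()

      obj = np.zeros((64, 64)); obj[30:34, 30:34] = 1.0    # small "lesion"
      psf_a, psf_b = gaussian_psf(64, 3.0), gaussian_psf(64, 6.0)
      image_a = np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(np.fft.ifftshift(psf_a))))
      image_b = transconvolve(image_a, psf_a, psf_b)       # as system B would see it
      print(image_a.max(), image_b.max())                  # broader PSF lowers peak recovery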

  5. Application of quantitative 1H-NMR method to determination of gentiopicroside in Gentianae radix and Gentianae scabrae radix.

    PubMed

    Tanaka, Rie; Hasebe, Yuko; Nagatsu, Akito

    2014-07-01

    A quantitative (1)H-NMR method (qHNMR) was used to measure gentiopicroside content in Gentianae radix and Gentianae scabrae radix. Gentiopicroside is a major component of Gentianae radix and Gentianae scabrae radix. The purity of gentiopicroside was calculated from the ratio of the intensity of the gentiopicroside H-3 signal at δ 7.44 ppm or H-8 signal at δ 5.78 ppm, measured in methanol-d4, to that of a hexamethyldisilane (HMD) signal at 0 ppm. The concentration of HMD was corrected with SI traceability by using potassium hydrogen phthalate of certified reference material (CRM) grade. As a result, the gentiopicroside content in two lots of Gentianae radix as determined by qHNMR was found to be 1.76 and 2.17%, respectively. The gentiopicroside content in two lots of Gentianae scabrae radix was 2.73 and 3.99%, respectively. We demonstrated that this method is useful for the quantitative analysis of crude drugs.
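
    The content calculation in qHNMR follows the standard internal-standard relation between integrated signal areas, proton counts, molar masses and weighed masses. In the sketch below the integrals, weighed masses and standard purity are invented, while the molar masses and proton counts for gentiopicroside (356.3 g/mol, one proton for the H-3 signal) and hexamethyldisilane (146.4 g/mol, 18 equivalent protons) are standard values.

      def qhnmr_content(i_analyte, n_analyte, mw_analyte,
                        i_std, n_std, mw_std,
                        mass_std_mg, purity_std, mass_sample_mg):
          # Mass of analyte (mg) inferred from the integral ratio, then expressed
          # as a fraction of the weighed sample.
          mass_analyte = ((i_analyte / i_std) * (n_std / n_analyte)
                          * (mw_analyte / mw_std) * mass_std_mg * purity_std)
          return mass_analyte / mass_sample_mg

      content = qhnmr_content(i_analyte=0.057, n_analyte=1, mw_analyte=356.3,
                              i_std=5.0, n_std=18, mw_std=146.4,
                              mass_std_mg=2.0, purity_std=0.999,
                              mass_sample_mg=50.0)
      print(f"gentiopicroside content = {100 * content:.2f} %")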

  6. Simulation of collaborative studies for real-time PCR-based quantitation methods for genetically modified crops.

    PubMed

    Watanabe, Satoshi; Sawada, Hiroshi; Naito, Shigehiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Kitta, Kazumi; Hino, Akihiro

    2013-01-01

    To study impacts of various random effects and parameters of collaborative studies on the precision of quantitation methods of genetically modified (GM) crops, we developed a set of random effects models for cycle time values of a standard curve-based relative real-time PCR that makes use of an endogenous gene sequence as the internal standard. The models and data from a published collaborative study for six GM lines at four concentration levels were used to simulate collaborative studies under various conditions. Results suggested that by reducing the numbers of well replications from three to two, and standard levels of endogenous sequence from five to three, the number of unknown samples analyzable on a 96-well PCR plate in routine analyses could be almost doubled, and still the acceptable repeatability RSD (RSDr ≤ 25%) and the reproducibility RSD (RSDR < 35%) of the collaborative study could be met. Further, RSDr and RSDR were found most sensitive to random effects attributable to inhomogeneity among blind replicates, but they were little influenced by those attributable to DNA extractions. The proposed models are expected to be useful for optimizing standard curve-based relative quantitation methods for GM crops by real-time PCR and their collaborative studies.
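
    The precision measures such simulations target can be illustrated with a toy collaborative-study generator: a between-laboratory random effect plus within-laboratory repeatability error, followed by one-way ANOVA variance components for RSDr and RSDR. The sketch below is a generic illustration with assumed standard deviations, not the authors' cycle-time model.

      import numpy as np

      rng = np.random.default_rng(7)
      n_labs, n_rep, true_value = 12, 2, 1.0          # 1.0 % GM content, assumed
      lab_sd, repeat_sd = 0.12, 0.15                  # assumed variance components

      lab_effect = rng.normal(0.0, lab_sd, size=n_labs)
      data = true_value + lab_effect[:, None] + rng.normal(0.0, repeat_sd, size=(n_labs, n_rep))

      lab_means = data.mean(axis=1)
      ms_within = ((data - lab_means[:, None]) ** 2).sum() / (n_labs * (n_rep - 1))
      ms_between = n_rep * ((lab_means - data.mean()) ** 2).sum() / (n_labs - 1)
      s2_repeat = ms_within
      s2_lab = max((ms_between - ms_within) / n_rep, 0.0)

      rsd_r = 100 * np.sqrt(s2_repeat) / data.mean()            # repeatability
      rsd_R = 100 * np.sqrt(s2_repeat + s2_lab) / data.mean()   # reproducibility
      print(f"RSDr = {rsd_r:.1f} %, RSDR = {rsd_R:.1f} %")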

  7. A quantitative method for evaluating numerical simulation accuracy of time-transient Lamb wave propagation with its applications to selecting appropriate element size and time step.

    PubMed

    Wan, Xiang; Xu, Guanghua; Zhang, Qing; Tse, Peter W; Tan, Haihui

    2016-01-01

    Lamb wave technique has been widely used in non-destructive evaluation (NDE) and structural health monitoring (SHM). However, due to the multi-mode characteristics and dispersive nature, Lamb wave propagation behavior is much more complex than that of bulk waves. Numerous numerical simulations of Lamb wave propagation have been conducted to study its physical principles. However, few quantitative studies evaluating the accuracy of these numerical simulations have been reported. In this paper, a method based on cross correlation analysis for quantitatively evaluating the simulation accuracy of time-transient Lamb wave propagation is proposed. Two kinds of error, affecting position accuracy and shape accuracy, are first identified. Consequently, two quantitative indices, i.e., the GVE (group velocity error) and MACCC (maximum absolute value of cross correlation coefficient), derived from cross correlation analysis between a simulated signal and a reference waveform, are proposed to assess the position and shape errors of the simulated signal. In this way, the simulation accuracy on position and shape is quantitatively evaluated. In order to apply this proposed method to select appropriate element size and time step, a specialized 2D-FEM program combined with the proposed method is developed. Then, the proper element size considering different element types and the proper time step considering different time integration schemes are selected. These results proved that the proposed method is feasible and effective, and can be used as an efficient tool for quantitatively evaluating and verifying the simulation accuracy of time-transient Lamb wave propagation.
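
    The two indices can be computed from a simulated signal and a reference waveform roughly as follows. In this Python/NumPy sketch the normalization, lag convention and toy toneburst are illustrative assumptions and may differ from the exact definitions in the paper.

      import numpy as np

      def maccc_and_gve(sim, ref, dt, travel_distance, v_ref):
          # MACCC: maximum absolute normalised cross-correlation (shape error).
          # GVE: relative group-velocity error inferred from the lag at which
          # that maximum occurs (position error), assuming the reference
          # arrival corresponds to propagation at v_ref over travel_distance.
          sim0 = (sim - sim.mean()) / (sim.std() * len(sim))
          ref0 = (ref - ref.mean()) / ref.std()
          cc = np.correlate(sim0, ref0, mode="full")
          k = int(np.argmax(np.abs(cc)))
          lag = (k - (len(ref) - 1)) * dt          # > 0: simulated arrival is later
          t_ref = travel_distance / v_ref
          v_sim = travel_distance / (t_ref + lag)
          return abs(cc[k]), (v_sim - v_ref) / v_ref

      # Toy check: a windowed toneburst delayed by 2 microseconds versus itself.
      fs, f0, cycles = 10e6, 200e3, 5
      t = np.arange(0, 200e-6, 1 / fs)

      def toneburst(t0):
          tau = t - t0
          env = np.where((tau >= 0) & (tau <= cycles / f0),
                         0.5 * (1 - np.cos(2 * np.pi * f0 * tau / cycles)), 0.0)
          return env * np.sin(2 * np.pi * f0 * tau)

      maccc, gve = maccc_and_gve(toneburst(52e-6), toneburst(50e-6), dt=1 / fs,
                                 travel_distance=0.25, v_ref=5000.0)
      print(f"MACCC = {maccc:.3f}, GVE = {100 * gve:.1f} %")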

  8. Preliminary Results from a Mercury Dry Deposition Measurement Methods Intercomparison

    NASA Astrophysics Data System (ADS)

    Marsik, F. J.; Brooks, S.; Gustin, M. S.; Holsen, T.; Landis, M.; Prestbo, E. M.; Poissant, L.

    2009-12-01

    Over the past fifteen years, a number of intensive field campaigns and measurement networks have provided valuable information on the estimated rates of mercury wet deposition to sensitive ecosystems throughout the world. In contrast, the ability to place bounds on the rates of mercury dry deposition has been hampered by the relative lack of direct measurements of this process. Recently, a number of researchers have performed measurements of mercury dry deposition using a variety of direct and indirect measurement techniques. While these studies have provided important information regarding the potential rates of mercury dry deposition to natural surfaces, little is known about the comparability of the results utilizing these different measurement approaches. During the month of August 2008, a mercury dry deposition measurement methods comparison was conducted in Ann Arbor, Michigan over a nine-day period. Seven research groups participated in the study, with the following measurement approaches: water, cation exchange membrane, chemically treated filter and turf surrogate surfaces; and several micrometeorological modeling methods. Continuous monitoring was conducted for ambient meteorological conditions and elemental, oxidized and particulate mercury concentrations. Preliminary results suggest that study-average mercury dry deposition estimates ranged from 0.17 to 0.59 ng/m2/hour for the group of pure-water surrogate surfaces, the cation exchange membrane and a micrometeorological flux gradient approach. The turf surrogate surface, BrCl spiked-water surface and a gold-coated quartz fiber filter surface resulted in significantly higher mercury dry deposition estimates, with the latter two approaches having been designed to measure total mercury dry deposition. Given that the turf surrogate surface and the cation exchange membrane samplers were designed for long-term deployment (up to one week), these methods were deployed for an additional series of four one

  9. Quantitative Analysis of Single and Mix Food Antiseptics Basing on SERS Spectra with PLSR Method

    NASA Astrophysics Data System (ADS)

    Hou, Mengjing; Huang, Yu; Ma, Lingwei; Zhang, Zhengjun

    2016-06-01

    The usage and dosage of food antiseptics are of great concern due to their decisive influence on food safety. The surface-enhanced Raman scattering (SERS) effect was employed in this research to realize trace detection of potassium sorbate (PS) and sodium benzoate (SB). An HfO2 ultrathin film-coated Ag NR array was fabricated as the SERS substrate. Protected by the HfO2 film, the SERS substrate possesses good acid resistance, which enables it to be applicable in the acidic environments where PS and SB work. Regression relationship between SERS spectra of 0.3~10 mg/L PS solution and their concentration was calibrat