Science.gov

Sample records for quantitative methods results

  1. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  2. A Method for Quantitative Evaluation of the Results of Postural Tests.

    PubMed

    Alifirova, V M; Brazovskii, K S; Zhukova, I A; Pekker, Ya S; Tolmachev, I V; Fokin, V A

    2016-07-01

    A method for quantitative evaluation of the results of postural tests is proposed. The method is based on contact-free measurements of 3D coordinates of body point movements. The result can serve as an integral test based on the Mahalanobis distance. PMID:27492397

  3. A quantitative method for evaluating results of treating Legg-Perthes syndrome.

    PubMed

    Harry, J D; Gross, R H

    1987-01-01

    A new quantitative method of analyzing hip joint architecture in Legg-Perthes syndrome is presented. Outlines of the bony femoral head and acetabular configuration as seen on the anteroposterior (AP) view were traced on a digitizer. Computer analysis provided measures of joint congruity, containment, and femoral head shape. The method's ability to distinguish pathologic from normal hips and to trace the course of the bony deformity of the hip joint was demonstrated in a group of 14 patients. Interobserver reliability was established. The method provides an objective quantification of treatment results and a reliable means for comparison of data between groups of patients.

  4. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  5. Quantitative vs qualitative research methods.

    PubMed

    Lakshman, M; Sinha, L; Biswas, M; Charles, M; Arora, N K

    2000-05-01

    Quantitative methods have been widely used because things that can be measured or counted gain scientific credibility over the unmeasurable. However, the extent of biological abnormality, its severity, its consequences, and the impact of illness cannot be satisfactorily captured by quantitative research alone. In such situations, qualitative methods take a holistic perspective, preserving the complexities of human behavior by addressing the "why" and "how" questions. This paper highlights the strengths and weaknesses of both approaches and argues that a balanced mix of qualitative and quantitative methods yields the most valid and reliable results.

  6. [The evaluation of the results after coronary angioplasty by intracoronary Doppler and quantitative angiography. The correlation of both methods].

    PubMed

    Goicolea Ruigómez, F J; Iñíguez Romo, A; Macaya, C; Alfonso, F; Hernández Antolín, R; Casado, J; Zamorano, J; Zarco, P

    1992-03-01

    To study the importance of measuring coronary flow reserve immediately after coronary angioplasty, we analysed the results obtained after 28 angioplasties performed in 21 patients. Coronary flow reserve was measured with a 3F intracoronary catheter selectively placed in the dilated artery. The corresponding coronary angiography was analysed with an automatic edge detection program (ARTREK) and by visual estimation. Coronary flow reserve increased in 26/27 cases after angioplasty, from 2.4 ± 1.3 to 4.1 ± 2.7 (p < 0.001). A correlation was found between minimal luminal area and minimal luminal diameter after coronary angioplasty and coronary flow reserve (r = 0.46, p < 0.05 and r = 0.47, p < 0.05, respectively). The finding of a normal coronary flow reserve (≥ 3.5) had 100% specificity but only 56% sensitivity for detecting angiographic success (residual stenosis < 50%). However, 47% of patients with angiographic success did not reach normal values of coronary flow reserve. Visual estimation of the stenosis correlated well with the automatic evaluation, but significant scatter was observed at visual estimates ≤ 25%. Visual assessment underestimated residual stenosis in all but one of the procedures. We conclude that coronary flow reserve is a potentially useful index for assessing the results of angioplasty that may complement coronary angiography. Nonetheless, substantial differences between the two methods exist in a significant number of cases. The relative merits of both methods, as well as the particular circumstances in which coronary flow reserve should be used, require further study.

  7. Qualitative versus quantitative methods in psychiatric research.

    PubMed

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining credibility after a period of being dismissed as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of events using quantitative methods; qualitative methods, however, provide a broader understanding of and more thorough reasoning about those events, and are therefore considered of special importance in psychiatry. Besides hypothesis generation in the earlier phases of research, qualitative methods can be employed in questionnaire design, establishment of diagnostic criteria, feasibility studies, and studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be confirmed statistically with quantitative methods. The tendency to combine qualitative and quantitative methods as complementary approaches has emerged over recent years. By applying both methods, scientists can take advantage of the interpretative strengths of qualitative methods as well as the experimental dimensions of quantitative methods.

  8. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  9. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from Midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents.

    PubMed

    Sivaganesan, Mano; Siefring, Shawn; Varma, Manju; Haugland, Richard A

    2014-06-01

    Enterococci target sequence density estimates from analyses of diluted river water DNA extracts by EPA Methods 1611 and 1609 and estimates with lower detection limits from undiluted DNA extracts by Method 1609 were indistinguishable. These methods should be equally suitable for comparison with U.S. EPA 2012 Recreational Water Quality Criteria values.

  10. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
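
    The arithmetic at the heart of this method is straightforward: each voltage difference divided by the known separation of its antenna pair yields a local estimate of the field component along that pair's axis. A minimal sketch of that calculation follows; the array geometry, variable names, and sample values are illustrative assumptions rather than details taken from the patent.

    import numpy as np

    # Hypothetical measurements: voltage differences (V) between selected antenna
    # pairs and the known separations (m) of those pairs.
    voltage_differences = np.array([0.12, 0.11, 0.13, 0.12])  # volts
    separations = np.array([0.05, 0.05, 0.05, 0.05])          # metres

    # Each quotient is a local estimate of the field magnitude along the array axis.
    field_estimates = voltage_differences / separations        # V/m
    print("local E-field estimates (V/m):", field_estimates)
    print("mean field over the region (V/m):", field_estimates.mean())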

  11. Quantitative methods for food allergens: a review.

    PubMed

    Kirsch, Stéphanie; Fourdrilis, Séverine; Dobson, Rowan; Scippo, Marie-Louise; Maghuin-Rogister, Guy; De Pauw, Edwin

    2009-09-01

    The quantitative detection of allergens in the food chain is a strategic health objective as the prevalence of allergy continues to rise. Food allergenicity is caused by proteins either in their native form or in forms resulting from food processing. Progress in mass spectrometry has greatly opened up the field of proteomics, and these advances are now available for the detection and quantification of traces of allergenic proteins in complex mixtures, complementing the set of biological tests used until now, such as ELISA or PCR. We review methods classified according to their ability to simultaneously quantify and identify allergenic proteins and underline major advances in the mass-spectrometric methods.

  12. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts.
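
    To make the normalization step concrete, the sketch below applies two widely used schemes, total-intensity scaling and median fold-change (probabilistic-quotient-style) scaling, to a small intensity matrix. The data and function names are hypothetical; the review itself surveys many more approaches.

    import numpy as np

    # Hypothetical metabolite intensity matrix: rows = samples, columns = metabolites.
    X = np.array([[100., 200., 50.],
                  [220., 410., 95.],
                  [ 90., 180., 45.]])

    def total_intensity_normalize(x):
        # Scale each sample so its summed intensity equals the mean total intensity.
        totals = x.sum(axis=1, keepdims=True)
        return x / totals * totals.mean()

    def median_fold_change_normalize(x, reference=None):
        # Probabilistic-quotient-style scaling against a reference metabolite profile.
        if reference is None:
            reference = np.median(x, axis=0)
        factors = np.median(x / reference, axis=1, keepdims=True)
        return x / factors

    print(total_intensity_normalize(X))
    print(median_fold_change_normalize(X))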

  13. Quantitative Methods in Psychology: Inevitable and Useless

    PubMed Central

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, which is based on Aristotelian thinking, and the associative-quantitative, which is based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause–effect relationships between events, with no possible access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about the structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199

  14. A new HPLC method for azithromycin quantitation.

    PubMed

    Zubata, Patricia; Ceresole, Rita; Rosasco, Maria Ana; Pizzorno, Maria Teresa

    2002-02-01

    A simple liquid chromatographic method was developed for the estimation of azithromycin as a raw material and in pharmaceutical dosage forms. The sample was chromatographed on a reverse-phase C18 column and the eluent monitored at a wavelength of 215 nm. The method was accurate, precise, and sufficiently selective, and is applicable to quantitation, stability, and dissolution testing of azithromycin.

  15. From themes to hypotheses: following up with quantitative methods.

    PubMed

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field.

  16. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  17. Quantitative laser-induced breakdown spectroscopy data using peak area step-wise regression analysis: an alternative method for interpretation of Mars science laboratory results

    SciTech Connect

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Dyar, Melinda D; Schafer, Martha W; Tucker, Jonathan M

    2008-01-01

    The ChemCam instrument on the Mars Science Laboratory (MSL) will include a laser-induced breakdown spectrometer (LIBS) to quantify major and minor elemental compositions. The traditional analytical chemistry approach to calibration curves for these data regresses a single diagnostic peak area against concentration for each element. This approach contrasts with a new multivariate method in which elemental concentrations are predicted by step-wise multiple regression analysis based on areas of a specific set of diagnostic peaks for each element. The method is tested on LIBS data from igneous and metamorphosed rocks. Between 4 and 13 partial regression coefficients are needed to describe each elemental abundance accurately (i.e., with a regression line of R² > 0.9995 for the relationship between predicted and measured elemental concentration) for all major and minor elements studied. Validation plots suggest that the method is limited at present by the small data set, and will work best for prediction of concentration when a wide variety of compositions and rock types has been analyzed.
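
    A hedged sketch of the kind of forward step-wise multiple regression the abstract describes is given below, using scikit-learn's ordinary least squares with cross-validation as the selection criterion. The peak-area matrix, stopping rule, and synthetic data are illustrative assumptions, not the ChemCam calibration pipeline.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    def forward_stepwise(peak_areas, concentrations, max_terms=13):
        """Greedy forward selection of diagnostic peak areas (illustrative sketch)."""
        n_peaks = peak_areas.shape[1]
        selected, best_score = [], -np.inf
        while len(selected) < max_terms:
            scores = {}
            for j in range(n_peaks):
                if j in selected:
                    continue
                cols = selected + [j]
                scores[j] = cross_val_score(LinearRegression(), peak_areas[:, cols],
                                            concentrations, cv=3).mean()
            j_best = max(scores, key=scores.get)
            if scores[j_best] <= best_score:     # stop when no remaining peak improves the fit
                break
            selected.append(j_best)
            best_score = scores[j_best]
        model = LinearRegression().fit(peak_areas[:, selected], concentrations)
        return selected, model

    # Hypothetical data: 40 spectra, 30 candidate peak areas, one element's concentration.
    rng = np.random.default_rng(0)
    areas = rng.random((40, 30))
    conc = 3.0 * areas[:, 2] + 1.5 * areas[:, 7] + 0.05 * rng.standard_normal(40)
    peaks, fit = forward_stepwise(areas, conc)
    print("selected peaks:", peaks, "R^2:", fit.score(areas[:, peaks], conc))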

  18. Quantitative analysis of penile ultrasonographic shape during the erectile cycle: a new diagnostic tool for erectile dysfunction? Repeatability of the method and preliminary results.

    PubMed

    Sforza, C; Montorsi, F; Bianchi, A; Ferrario, V F

    1998-12-01

    Penile modifications during erection involve not only penile length and circumference but also overall size and shape. In this investigation, the size and shape repeatability of a single standardized ultrasonographic image of the penis during flaccidity and erection was quantified in a group of 19 patients complaining of erectile dysfunction. The penis of each patient was placed in the dorsal position and scanned by a 10 MHz linear ultrasonographic probe at the site of maximal corporeal size: (1) during flaccidity; (2) after an intracorporeal injection of 10 micrograms alprostadil; and (3) after genital stimulation. Each scan was repeated three times for each patient and printed. On each print the outline of the tunica albuginea was drawn, digitized, and mathematically reconstructed by Fourier series, which allow a separate quantification of the size and shape differences. Reliability of the tunica albuginea outline and repeatability of probe positioning were separately assessed within patient and functional phase, and found to be good. Examinations of two patients with erectile dysfunction performed using the described method are also presented. The shape of the corpora cavernosa may be indicative of the homogeneous function of the tunica albuginea, and it may be of use in patient follow-up.
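
    The Fourier-series reconstruction of a digitized closed outline can be illustrated with generic complex Fourier descriptors, in which the first harmonic carries overall size and the size-normalized higher harmonics describe shape. This is a schematic stand-in for the authors' formulation, with a synthetic ellipse in place of a real tunica albuginea trace.

    import numpy as np

    def fourier_descriptors(x, y, n_harmonics=10):
        """Complex Fourier coefficients of a digitized closed outline (generic sketch)."""
        z = np.asarray(x) + 1j * np.asarray(y)     # outline points as complex numbers
        coeffs = np.fft.fft(z) / len(z)            # one coefficient per harmonic
        return coeffs[:n_harmonics + 1]

    def size_and_shape(coeffs):
        size = np.abs(coeffs[1])                   # first harmonic ~ overall scale
        shape = coeffs[2:] / coeffs[1]             # higher harmonics, size-normalized
        return size, np.abs(shape)

    # Hypothetical digitized outline: an ellipse standing in for a corporeal cross-section.
    t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
    x, y = 12 * np.cos(t), 8 * np.sin(t)
    size, shape = size_and_shape(fourier_descriptors(x, y))
    print("size index:", round(size, 2), "shape indices:", np.round(shape, 3))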

  19. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  20. Quantitative mass spectrometry methods for pharmaceutical analysis.

    PubMed

    Loos, Glenn; Van Schepdael, Ann; Cabooter, Deirdre

    2016-10-28

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance the ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally aimed for to provide automated multi-residue analysis, (less sensitive) miniaturized set-ups have a great potential due to their ability for in-field usage. This article is part of the themed issue 'Quantitative mass spectrometry'.

  1. Optimization method for quantitative calculation of clay minerals in soil

    NASA Astrophysics Data System (ADS)

    Hao, Libo; Wei, Qiaoqiao; Zhao, Yuyan; Lu, Jilong; Zhao, Xinyun

    2015-04-01

    Determination of the types and amounts of clay minerals in soil is important in environmental, agricultural, and geological investigations. Many reliable methods have been established to identify clay mineral types. However, no reliable method for quantitative analysis of clay minerals has been established so far. In this study, an attempt was made to propose an optimization method for the quantitative determination of clay minerals in soil based on bulk chemical composition data. The fundamental principles and processes of the calculation are elucidated. Several samples were used for reliability verification of the method, and the results demonstrate the simplicity and efficacy of the approach.
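
    One standard way to pose the calculation the abstract describes is as a constrained least-squares problem: a mineral-by-oxide composition matrix multiplied by unknown mineral fractions should reproduce the measured bulk chemistry. The sketch below uses non-negative least squares; the composition matrix and bulk analysis are hypothetical, and the paper's own optimization scheme may differ.

    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical composition matrix: rows = oxides (SiO2, Al2O3, K2O),
    # columns = candidate minerals (kaolinite, illite, quartz), as weight fractions.
    A = np.array([[0.465, 0.540, 1.000],
                  [0.395, 0.290, 0.000],
                  [0.000, 0.080, 0.000]])

    # Measured bulk oxide content of the soil sample (weight fraction, hypothetical).
    b = np.array([0.62, 0.18, 0.03])

    # Non-negative least squares gives mineral fractions that best reproduce the bulk chemistry.
    fractions, residual = nnls(A, b)
    print("estimated mineral fractions:", np.round(fractions, 3))
    print("fit residual:", round(residual, 4))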

  2. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    PubMed

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-01

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data, using both controlled experiments with known quantitative differences for specific proteins used as standards and "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteomics tools and is straightforward to apply. PMID:26321463

  4. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. PMID:26456251
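
    The final step described, combining the individual uncertainty components mathematically into a combined relative uncertainty, is conventionally written as a sum in quadrature. Schematically (the specific components listed here are examples, not the paper's exact uncertainty budget):

    \[
      u_{c,\mathrm{rel}} \;=\; \sqrt{\,u_{\text{strain}}^{2} + u_{\text{product}}^{2} + u_{\text{reading}}^{2} + \cdots\,} \;\le\; 35\%
    \]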

  5. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has left the concepts obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  7. Quantitative gold nanoparticle analysis methods: A review.

    PubMed

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticle (AuNP) preparation, characterization, and applications have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns accompany the mass production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many are there? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs. The methods summarized here include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.

  8. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  9. Extended Rearrangement Inequalities and Applications to Some Quantitative Stability Results

    NASA Astrophysics Data System (ADS)

    Lemou, Mohammed

    2016-09-01

    In this paper, we prove a new functional inequality of Hardy-Littlewood type for generalized rearrangements of functions. We then show how this inequality provides quantitative stability results for steady states of evolution systems that essentially preserve the rearrangements and a suitable energy functional, under minimal regularity assumptions on the perturbations. In particular, the inequality yields a quantitative stability result for a large class of steady state solutions to the Vlasov-Poisson systems; more precisely, we derive a quantitative control of the L¹ norm of the perturbation by the relative Hamiltonian (the energy functional) and rearrangements. A general nonlinear stability result was obtained by Lemou et al. (Invent Math 187:145-194, 2012) in the gravitational context; however, that proof relied in a crucial way on compactness arguments which, by construction, provide no quantitative control of the perturbation. Our functional inequality is also applied to 2D Euler systems and likewise provides quantitative stability results for a large class of steady states of this system in a natural energy space.
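
    The quantitative control described in the abstract can be written schematically as follows, where f_0 is the steady state, f* denotes the rearrangement of f, and H is the energy (Hamiltonian) functional; the constant C and the modulus Φ are left abstract here and are not taken from the paper:

    \[
      \| f - f_0 \|_{L^1} \;\le\; C \,\Phi\!\left( \mathcal{H}(f) - \mathcal{H}(f_0),\; \| f^{*} - f_0^{*} \|_{L^1} \right)
    \]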

  10. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadowmaps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the largely unexplored problem of estimating the wavefront's gradient from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the first time that a complete mathematically grounded treatment of this optical phenomenon has been presented, complemented by an image-processing algorithm that allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device just through its associated foucaultgrams. PMID:27505659
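
    The closing step the abstract mentions, Fourier-based integration of the estimated partial derivatives into a surface profile, can be sketched with a standard Frankot-Chellappa-style least-squares integrator. The code below is a generic illustration of that integration step, not the authors' implementation, and the test gradient field is synthetic.

    import numpy as np

    def integrate_gradients(gx, gy):
        """Fourier-domain least-squares integration of a gradient field (generic sketch)."""
        rows, cols = gx.shape
        wx = np.fft.fftfreq(cols) * 2 * np.pi      # spatial angular frequencies
        wy = np.fft.fftfreq(rows) * 2 * np.pi
        WX, WY = np.meshgrid(wx, wy)
        denom = WX**2 + WY**2
        denom[0, 0] = 1.0                          # avoid division by zero at DC
        Z = (-1j * WX * np.fft.fft2(gx) - 1j * WY * np.fft.fft2(gy)) / denom
        Z[0, 0] = 0.0                              # piston (mean height) is arbitrary
        return np.real(np.fft.ifft2(Z))

    # Hypothetical gradient maps of a shallow paraboloid standing in for a mirror surface.
    y, x = np.mgrid[-64:64, -64:64] / 64.0
    gx, gy = 2 * x * 1e-3, 2 * y * 1e-3
    surface = integrate_gradients(gx, gy)
    print("reconstructed sag (arbitrary units):", round(surface.max() - surface.min(), 5))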

  12. Quantitative rotating frame relaxometry methods in MRI.

    PubMed

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ) and the rotating frame transverse relaxation rate constant (R2ρ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ. The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd.
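
    For context, R1ρ is commonly obtained by fitting a mono-exponential spin-lock signal model to images acquired at several spin-lock durations t_SL (an analogous model applies to R2ρ). The exact signal equation depends on the pulse sequence, so the expression below is a schematic rather than a sequence-specific formula:

    \[
      S(t_{\mathrm{SL}}) \;=\; S_0 \, e^{-t_{\mathrm{SL}} R_{1\rho}}, \qquad R_{1\rho} = \frac{1}{T_{1\rho}}
    \]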

  13. [Progress in stable isotope labeled quantitative proteomics methods].

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter having become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed that support the rapid progress of biological research. In this work, we discuss the progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantification, and then give our opinions on the outlook for proteome quantification methods.

  14. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional benefits are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
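
    A minimal sketch of a hierarchical weighted average score is shown below: sub-criteria are combined by weights into factor scores, which are then combined by top-level weights into a single score per alternative. The hierarchy, weights, and scores are invented for illustration and are not taken from the report.

    # Hypothetical hierarchy: two top-level factors, each with weighted sub-criteria.
    # Scores are on a 0-10 scale for one design alternative.
    hierarchy = {
        "performance": (0.6, {"throughput": (0.7, 8.0), "latency": (0.3, 6.5)}),
        "cost":        (0.4, {"hardware":   (0.5, 7.0), "maintenance": (0.5, 9.0)}),
    }

    def hierarchical_weighted_average(tree):
        total = 0.0
        for factor_weight, criteria in tree.values():
            # Weighted average of the sub-criteria, then weighted again at the top level.
            factor_score = sum(w * s for w, s in criteria.values())
            total += factor_weight * factor_score
        return total

    print("alternative score:", hierarchical_weighted_average(hierarchy))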

  15. Quantitative MR imaging in fracture dating--Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

    For exact age determination of bone fractures in a forensic context (e.g., in cases of child abuse), improved knowledge of the time course of the healing process and the use of non-invasive modern imaging technology are of high importance. To date, fracture dating is based on radiographic methods that determine the callus status and thereby rely on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined in this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11, ♂: 6; median age 34 ± 15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or the collar bone. Both qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of the quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area within defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7, ♂: 5) showed an initial peak in T1 values in the fractured area (T1 = 1895 ± 607 ms), which decreased over time to a value of 1094 ± 182 ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2 = 115 ± 80 ms) and decreased to 73 ± 33 ms within 21 days after the fracture event. After that time point, no

  17. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.

  18. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    In light of the problem that the stress vulnerability assessment used in groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants within this system was established by analyzing three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards posed by the representative contaminants depended strongly on the chosen research emphasis, and that the ranking of the three representative contaminants' hazards differed from the rankings of their corresponding individual properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculation results. In addition, using rank order to normalize the three properties and to unify the quantified property scores would amplify or attenuate the relative property characteristics of the different representative contaminants.
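
    The analytic hierarchy process step mentioned in the abstract can be illustrated with the standard priority-vector computation: weights are the normalized principal eigenvector of a pairwise comparison matrix. The matrix values and property names below are hypothetical.

    import numpy as np

    # Hypothetical pairwise comparison matrix for three contaminant properties
    # (e.g. toxicity vs. mobility vs. degradability), using Saaty's 1-9 scale.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    # Priority weights = normalized principal eigenvector of the comparison matrix.
    eigenvalues, eigenvectors = np.linalg.eig(A)
    principal = np.real(eigenvectors[:, np.argmax(np.real(eigenvalues))])
    weights = principal / principal.sum()

    # Consistency index indicates whether the pairwise judgments are acceptably consistent.
    lambda_max = np.max(np.real(eigenvalues))
    consistency_index = (lambda_max - len(A)) / (len(A) - 1)
    print("property weights:", np.round(weights, 3), "CI:", round(consistency_index, 3))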

  19. Evaluating quantitative proton-density-mapping methods.

    PubMed

    Mezer, Aviv; Rokem, Ariel; Berman, Shai; Hastie, Trevor; Wandell, Brian A

    2016-10-01

    Quantitative magnetic resonance imaging (qMRI) aims to quantify tissue parameters by eliminating instrumental bias. We describe qMRI theory, simulations, and software designed to estimate proton density (PD), the apparent local concentration of water protons in the living human brain. First, we show that, in the absence of noise, multichannel coil data contain enough information to separate PD and coil sensitivity, a limiting instrumental bias. Second, we show that, in the presence of noise, regularization by a constraint on the relationship between T1 and PD produces accurate coil sensitivity and PD maps. The ability to measure PD quantitatively has applications in the analysis of in-vivo human brain tissue and enables multisite comparisons between individuals and across instruments. Hum Brain Mapp 37:3623-3635, 2016. © 2016 Wiley Periodicals, Inc.
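
    Schematically, the estimation problem factors each coil image into a smooth coil sensitivity times a proton density map, with the noise-regularizing constraint expressed through a relationship between PD and T1. The form shown below is an assumed illustration of that idea rather than the paper's exact parameterization:

    \[
      M_i(\mathbf{x}) \;=\; C_i(\mathbf{x})\,\mathrm{PD}(\mathbf{x}), \qquad
      \frac{1}{\mathrm{PD}(\mathbf{x})} \;\approx\; a + \frac{b}{T_1(\mathbf{x})}
    \]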

  1. Informatics Methods to Enable Sharing of Quantitative Imaging Research Data

    PubMed Central

    Levy, Mia A.; Freymann, John B.; Kirby, Justin S.; Fedorov, Andriy; Fennessy, Fiona M.; Eschrich, Steven A.; Berglund, Anders E.; Fenstermacher, David A.; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L.; Brown, Bartley J.; Braun, Terry A.; Dekker, Andre; Roelofs, Erik; Mountz, James M.; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-01-01

    Introduction The National Cancer Institute (NCI) Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms, and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable sharing data and to promote reuse of quantitative imaging data in the community. Methods We performed a survey of the tools currently in use by the QIN member sites for representation and storage of their QIN research data, including images, image meta-data, and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. Results There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are working to close so that members can share research images and image meta-data across the network. Conclusions As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms, and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. PMID:22770688

  2. Correlation of qualitative and quantitative results from testing respirator fit

    SciTech Connect

    Hardis, K.E.

    1983-02-01

    Three qualitative respirator fit tests were evaluated for their ability to measure respiratory protection adequately. The methods were the negative pressure test, the isoamyl acetate test, and the irritant smoke test. Each test was performed concurrently with a single quantitative fit test, the dioctylphthalate (DOP) test, during 274 half-mask and 274 full facepiece wearings. Most (95%) of the study subjects had adequately fitting respirators as determined by quantitative testing. Of these subjects, 96-100% passed the qualitative fit tests. Of the 5% of the study subjects with inadequately fitting half-mask respirators, 93-100% of the inadequate fits were detected by qualitative methods. Twenty-three to 46% of the poorly fitting full face masks were detected by qualitative methods. The probability of passing or failing a qualitative test with an inadequately fitting respirator can be estimated; however, the uncertainty associated with each estimate is large, owing to the small number of study subjects with poorly fitting respirators.

  3. Blending Qualitative & Quantitative Research Methods in Theses and Dissertations.

    ERIC Educational Resources Information Center

    Thomas, R. Murray

    This guide discusses combining qualitative and quantitative research methods in theses and dissertations. It covers a wide array of methods, the strengths and limitations of each, and how they can be effectively interwoven into various research designs. The first chapter is "The Qualitative and the Quantitative." Part 1, "A Catalogue of…

  4. A quantitative method for determining the robustness of complex networks

    NASA Astrophysics Data System (ADS)

    Qin, Jun; Wu, Hongrun; Tong, Xiaonian; Zheng, Bojin

    2013-06-01

    Most current studies estimate the invulnerability of complex networks using a qualitative method that analyzes the decay rate of network performance. This method results in confusion over the invulnerability of various types of complex networks. By normalizing network performance and defining a baseline, this paper defines the invulnerability index as the integral of the normalized network performance curve minus the baseline. This quantitative method measures network invulnerability under both edge and node attacks and provides a basis for distinguishing between the robustness and fragility of networks. To demonstrate the proposed method, three small-world networks were selected as test beds. The simulation results indicate that the proposed invulnerability index can effectively and accurately quantify network resilience and can handle both node and edge attacks. The index can provide a valuable reference for determining network invulnerability in future research.
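
    The definition in the abstract, the integral of the normalized performance curve minus a baseline, translates directly into a numerical integration over an attack simulation. The sketch below uses a trapezoidal rule and hypothetical performance data; it illustrates the definition rather than reproducing the authors' code.

    import numpy as np

    def invulnerability_index(fraction_removed, performance, baseline=0.0):
        """Integral of the normalized performance curve minus a baseline (trapezoidal rule)."""
        frac = np.asarray(fraction_removed, dtype=float)
        perf = np.asarray(performance, dtype=float)
        perf = perf / perf[0]                      # normalize to the intact network's performance
        heights = (perf[:-1] + perf[1:]) / 2.0 - baseline
        return float(np.sum(heights * np.diff(frac)))

    # Hypothetical attack simulation: network performance as nodes are removed.
    removed = np.linspace(0.0, 1.0, 11)
    performance = [1.0, 0.95, 0.88, 0.80, 0.66, 0.50, 0.33, 0.20, 0.10, 0.03, 0.0]
    print("invulnerability index:", round(invulnerability_index(removed, performance), 3))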

  5. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful for determining the type attribute of an object because it reveals the content of its constituents. QPA by the Rietveld method requires neither the measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with a Material Analysis Using Diffraction program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case that was solved with the help of the Rietveld QPA method is also described. This method will allow forensic investigators to acquire detailed information about material evidence, which can point the way for case detection and court proceedings.
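
    For context, Rietveld quantitative phase analysis typically converts the refined scale factors into weight fractions through the Hill-Howard relation, which the refinement software applies internally:

    \[
      W_p \;=\; \frac{S_p \,(ZMV)_p}{\sum_{i} S_i \,(ZMV)_i}
    \]

    where S is the refined scale factor of a phase, Z the number of formula units per unit cell, M the formula mass, and V the unit-cell volume.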

  6. A quantitative method for measuring the quality of history matches

    SciTech Connect

    Shaw, T.S.; Knapp, R.M.

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" match and a "better" one. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for that purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure the magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. They also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
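
    A hedged sketch of three indices of the kind described, shape conformity, bias, and magnitude of deviation, is given below using the correlation coefficient, mean error, and root-mean-square error respectively; the paper's exact statistics may differ, and the production data are hypothetical.

    import numpy as np

    def match_quality(simulated, observed):
        """Three illustrative match-quality indices (not necessarily the paper's exact statistics)."""
        sim, obs = np.asarray(simulated, float), np.asarray(observed, float)
        shape_conformity = np.corrcoef(sim, obs)[0, 1]     # does the simulated curve track the history?
        bias = np.mean(sim - obs)                          # systematic over- or under-prediction
        deviation = np.sqrt(np.mean((sim - obs) ** 2))     # overall magnitude of mismatch
        return shape_conformity, bias, deviation

    # Hypothetical production history vs. two candidate simulation runs.
    observed = np.array([100., 96., 90., 83., 75., 68., 62.])
    run_a = np.array([101., 95., 91., 82., 76., 67., 61.])
    run_b = np.array([110., 104., 97., 90., 82., 74., 68.])
    for name, run in [("run A", run_a), ("run B", run_b)]:
        print(name, [round(v, 3) for v in match_quality(run, observed)])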

  7. A quantitative method for optimized placement of continuous air monitors.

    PubMed

    Whicker, Jeffrey J; Rodgers, John C; Moxley, John S

    2003-11-01

    Alarming continuous air monitors (CAMs) are a critical component for worker protection in facilities that handle large amounts of hazardous materials. In nuclear facilities, continuous air monitors alarm when levels of airborne radioactive materials exceed alarm thresholds, thus prompting workers to exit the room to reduce inhalation exposures. To maintain a high level of worker protection, continuous air monitors are required to detect radioactive aerosol clouds quickly and with good sensitivity. This requires that there are sufficient numbers of continuous air monitors in a room and that they are well positioned. Yet there are no published methodologies to quantitatively determine the optimal number and placement of continuous air monitors in a room. The goal of this study was to develop and test an approach to quantitatively determine optimal number and placement of continuous air monitors in a room. The method we have developed uses tracer aerosol releases (to simulate accidental releases) and the measurement of the temporal and spatial aspects of the dispersion of the tracer aerosol through the room. The aerosol dispersion data is then analyzed to optimize continuous air monitor utilization based on simulated worker exposure. This method was tested in a room within a Department of Energy operated plutonium facility at the Savannah River Site in South Carolina, U.S. Results from this study show that the value of quantitative airflow and aerosol dispersion studies is significant and that worker protection can be significantly improved while balancing the costs associated with CAM programs.

  8. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    PubMed Central

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  9. Advancing the study of violence against women using mixed methods: integrating qualitative methods into a quantitative research program.

    PubMed

    Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol

    2011-02-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.

  10. Quantitative Hydrocarbon Energies from the PMO Method.

    ERIC Educational Resources Information Center

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  11. Review of Quantitative Software Reliability Methods

    SciTech Connect

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of digital systems

  12. A quantitative method for blood lipoproteins using cellulose acetate electrophoresis

    PubMed Central

    Magnani, H. N.; Howard, A. N.

    1971-01-01

    A rapid, inexpensive, and quantitative method is described for obtaining the levels of plasma very low, low, and high density lipoproteins using cellulose acetate electrophoresis and lipid assays without prior separation by ultracentrifuge or other techniques. It involves separation of the lipoproteins by cellulose acetate electrophoresis, followed by their identification with the ozone-Schiff reaction. The total lipoprotein concentration is estimated from the total plasma phospholipid, and the percentage of each component obtained by densitometric analysis of the stained electrophoretograms, using reflected light. For samples with a raised level of very low density lipoprotein, plasma triglyceride analysis is also required. The results obtained by the cellulose acetate electrophoresis method are in good agreement with those by the analytical ultracentrifuge and the preparative ultracentrifuge with refractometry. The theoretical assumptions on which the method is based have been shown to be valid. PMID:4110791

  13. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, Frank A.

    1982-01-01

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  14. Fluorometric method of quantitative cell mutagenesis

    SciTech Connect

    Dolbeare, F.A.

    1982-08-17

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  15. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, F.A.

    1980-12-12

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  16. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular. PMID:23650936

  17. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.
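    The studies above estimate scale values with probabilistic methods (AFM and MLDS). As a much simpler stand-in that conveys the idea of recovering a latent intensity scale from graded pair comparisons, the sketch below fits scale values to the graded difference judgements by ordinary least squares; it is explicitly not the estimator used in the studies, and all data are invented.

    ```python
    import numpy as np

    def ls_difference_scaling(pairs, ratings, n_stimuli):
        """Least-squares scale values from graded pair comparisons.

        pairs:   list of (i, j) stimulus indices compared on each trial
        ratings: graded difference judgement for each pair (positive: i > j)
        Returns scale values with stimulus 0 fixed at 0. Simplified stand-in
        for the AFM/MLDS estimators discussed in the abstract.
        """
        A = np.zeros((len(pairs), n_stimuli))
        for row, (i, j) in enumerate(pairs):
            A[row, i], A[row, j] = 1.0, -1.0
        # Fix the first scale value at zero to remove the additive indeterminacy.
        coef, *_ = np.linalg.lstsq(A[:, 1:], np.asarray(ratings, float), rcond=None)
        return np.concatenate(([0.0], coef))

    # Toy data: 4 stimuli, noisy graded difference judgements
    pairs = [(1, 0), (2, 1), (3, 2), (3, 0), (2, 0), (3, 1)]
    ratings = [1.1, 0.9, 1.2, 3.0, 2.1, 2.0]
    print(ls_difference_scaling(pairs, ratings, n_stimuli=4))
    ```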

  18. A quantitative method for silica flux evaluation

    NASA Astrophysics Data System (ADS)

    Schonewille, R. H.; O'Connell, G. J.; Toguri, J. M.

    1993-02-01

    In the smelting of copper and copper/nickel concentrates, the role of silica flux is to aid in the removal of iron by forming a slag phase. Alternatively, the role of flux may be regarded as a means of controlling the formation of magnetite, which can severely hinder the operation of a furnace. To adequately control the magnetite level, the flux must react rapidly with all of the FeO within the bath. In the present study, a rapid method for silica flux evaluation that can be used directly in the smelter has been developed. Samples of flux are mixed with iron sulfide and magnetite and then smelted at a temperature of 1250 °C. Argon was swept over the reaction mixture and analyzed continuously for sulfur dioxide. The sulfur dioxide concentration with time was found to contain two peaks, the first one being independent of the flux content of the sample. A flux quality parameter has been defined as the height-to-time ratio of the second peak. The value of this parameter for pure silica is 5100 ppm/min. The effects of silica content, silica particle size, and silicate mineralogy were investigated. It was found that a limiting flux quality is achieved for particle sizes less than 0.1 mm in diameter and that fluxes containing feldspar are generally of a poorer quality. The relative importance of free silica and melting point was also studied using synthetic flux mixtures, with free silica displaying the strongest effect.
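    The flux quality parameter is defined above as the height-to-time ratio of the second SO2 peak. The sketch below extracts that ratio from a synthetic SO2-versus-time trace; the peak-prominence threshold is an assumption, since the paper's exact peak-detection criteria are not given here.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def flux_quality(time_min, so2_ppm):
        """Height-to-time ratio of the second SO2 peak (ppm/min)."""
        so2 = np.asarray(so2_ppm, dtype=float)
        # Prominence threshold is an illustrative assumption.
        peaks, _ = find_peaks(so2, prominence=0.05 * so2.max())
        if len(peaks) < 2:
            raise ValueError("fewer than two SO2 peaks detected")
        second = peaks[1]
        return so2[second] / time_min[second]

    # Synthetic SO2 trace with two peaks (illustration only)
    t = np.linspace(0.1, 30, 300)
    so2 = (4000 * np.exp(-0.5 * ((t - 5) / 1.5) ** 2)
           + 3000 * np.exp(-0.5 * ((t - 18) / 3.0) ** 2))
    print(round(flux_quality(t, so2), 1), "ppm/min")
    ```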

  19. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, amongst others, resulting in a ranking of tasks and subsequently jobs. DREAM consists of an inventory and evaluation part. Two examples of dermal exposure of workers of a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outside clothing layer as well as on skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative, dermal exposure assessment. PMID:12505908

  20. Methods and algorithms for quantitative proteomics by mass spectrometry.

    PubMed

    Matthiesen, Rune; Carvalho, Ana Sofia

    2013-01-01

    Protein quantitation by mass spectrometry (MS) is attractive since it is possible to obtain both identification and quantitative values of proteins and their posttranslational modifications in a single experiment. In contrast, protein arrays only provide quantitative values of targeted proteins and their modifications. There are an overwhelming number of quantitative MS methods for protein and peptide quantitation. The aim here is to provide an overview of the most common MS methods and algorithms used in quantitative proteomics and discuss the computational algorithms needed to reliably quantitate proteins, peptides, and their posttranslational modifications. One of the main challenges in the data analysis of many experimental projects is to pipe together a number of software solutions that are either commercial or freely available. The aim of this chapter is to provide a good set of algorithms, ideas, and resources that can easily be implemented in a scripting language such as R, Python, or Perl. By understanding the algorithmic ideas presented here, data from any instrument or modified experimental protocol can be analyzed, which is, in the authors' opinion, more valuable than a black-box approach.
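    As a concrete example of the kind of label-free algorithm that can be implemented in a scripting language, the sketch below computes normalized spectral abundance factors (NSAF) from spectral counts; it is one common approach, not necessarily the specific algorithm described in the chapter, and the counts and lengths are invented.

    ```python
    def nsaf(spectral_counts, lengths):
        """Normalized spectral abundance factor for label-free quantitation.

        NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j), where SpC is the spectral
        count and L the protein length (residues).
        """
        saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
        total = sum(saf.values())
        return {p: value / total for p, value in saf.items()}

    # Illustrative spectral counts and protein lengths
    counts  = {"P1": 40, "P2": 12, "P3": 75}
    lengths = {"P1": 350, "P2": 120, "P3": 900}
    print(nsaf(counts, lengths))
    ```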

  1. Quantitative methods for the analysis of zoosporic fungi.

    PubMed

    Marano, Agostina V; Gleason, Frank H; Bärlocher, Felix; Pires-Zottarelli, Carmen L A; Lilje, Osu; Schmidt, Steve K; Rasconi, Serena; Kagami, Maiko; Barrera, Marcelo D; Sime-Ngando, Télesphore; Boussiba, Sammy; de Souza, José I; Edwards, Joan E

    2012-04-01

    Quantitative estimations of zoosporic fungi in the environment have historically received little attention, primarily due to methodological challenges and their complex life cycles. Conventional methods for quantitative analysis of zoosporic fungi to date have mainly relied on direct observation and baiting techniques, with subsequent fungal identification in the laboratory using morphological characteristics. Although these methods are still fundamentally useful, there has been an increasing preference for quantitative microscopic methods based on staining with fluorescent dyes, as well as the use of hybridization probes. More recently, however, PCR-based methods for profiling and quantification (semi-quantitative and absolute) have proven to be rapid and accurate diagnostic tools for assessing zoosporic fungal assemblages in environmental samples. Further application of next-generation sequencing technologies will advance our quantitative understanding not only of zoosporic fungal ecology but also of zoosporic fungal function, through the analysis of genomes and gene expression, as resources and databases expand in the future. Nevertheless, it is still necessary to complement these molecular-based approaches with cultivation-based methods in order to gain a fuller quantitative understanding of the ecological and physiological roles of zoosporic fungi.

  2. Applying Quantitative Genetic Methods to Primate Social Behavior.

    PubMed

    Blomquist, Gregory E; Brent, Lauren J N

    2014-02-01

    Increasingly, behavioral ecologists have applied quantitative genetic methods to investigate the evolution of behaviors in wild animal populations. The promise of quantitative genetics in unmanaged populations opens the door for simultaneous analysis of inheritance, phenotypic plasticity, and patterns of selection on behavioral phenotypes all within the same study. In this article, we describe how quantitative genetic techniques provide studies of the evolution of behavior with information that is unique and valuable. We outline technical obstacles to applying quantitative genetic techniques that are of particular relevance to studies of behavior in primates, especially those living in noncaptive populations (e.g., the need for pedigree information and non-Gaussian phenotypes), and demonstrate how many of these barriers are now surmountable. We illustrate this by applying recent quantitative genetic methods to spatial proximity data, a simple and widely collected primate social behavior, from adult rhesus macaques on Cayo Santiago. Our analysis shows that proximity measures are consistent across repeated measurements on individuals (repeatable) and that kin have similar mean measurements (heritable). Quantitative genetics may hold lessons of considerable importance for studies of primate behavior, even those without a specific genetic focus.
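    The study itself fits quantitative genetic ("animal") models that also use pedigree information; as a simpler illustration of the repeatability estimate mentioned above, the sketch below computes the intra-class correlation of repeated proximity measures from one-way ANOVA variance components. The data are invented.

    ```python
    import numpy as np

    def repeatability(groups):
        """One-way ANOVA repeatability (intra-class correlation) of repeated
        measurements on individuals. Simplified sketch, not an animal model.

        groups: list of arrays, one per individual, of repeated measures.
        """
        k = len(groups)
        n = np.array([len(g) for g in groups], dtype=float)
        means = np.array([np.mean(g) for g in groups])
        grand = np.concatenate(groups).mean()
        ms_among = np.sum(n * (means - grand) ** 2) / (k - 1)
        ms_within = sum(np.sum((np.asarray(g) - m) ** 2)
                        for g, m in zip(groups, means)) / (n.sum() - k)
        n0 = (n.sum() - np.sum(n ** 2) / n.sum()) / (k - 1)  # effective group size
        s2_among = (ms_among - ms_within) / n0
        return s2_among / (s2_among + ms_within)

    # Repeated proximity scores for three individuals (toy numbers)
    data = [np.array([0.40, 0.45, 0.43]),
            np.array([0.10, 0.15, 0.12, 0.11]),
            np.array([0.30, 0.28, 0.33])]
    print(round(repeatability(data), 3))
    ```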

  3. Comparison of a quantitative microtiter method, a quantitative automated method, and the plate-count method for determining microbial complement resistance.

    PubMed

    Lee, M D; Wooley, R E; Brown, J; Spears, K R; Nolan, L K; Shotts, E B

    1991-01-01

    A quantitative microtiter method for determining the degree of complement resistance or sensitivity of microorganisms is described. The microtiter method is compared with a quantitative automated system and the standard plate-count technique. Data were accumulated from 30 avian Escherichia coli isolates incubated at 35 C with either chicken plasma or heat-inactivated chicken plasma. Analysis of data generated by the automated system and plate-count techniques resulted in a classification of the microorganisms into three groups: those sensitive to the action of complement; those of intermediate sensitivity to the action of complement; and those resistant to the action of complement. Although the three methods studied did not agree absolutely, there were statistically significant correlations among them.

  4. A quantitative dimming method for LED based on PWM

    NASA Astrophysics Data System (ADS)

    Wang, Jiyong; Mou, Tongsheng; Wang, Jianping; Tian, Xiaoqing

    2012-10-01

    Traditional light sources are expected to provide stable and uniform illumination for living and working environments, reflecting the visual performance requirements of human observers. That requirement seemed sufficient until the non-visual functions of the photosensitive ganglion cells in the retina were discovered. A new generation of lighting technology is now emerging, based on novel light sources such as LEDs and on the photobiological effects of light on human physiology and behavior. To realize dynamic LED lighting whose intensity and color can be adjusted to the needs of these photobiological effects, a quantitative dimming method based on Pulse Width Modulation (PWM) and light-mixing technology is presented. Starting from two-channel PWM, this paper demonstrates the determinacy and the limitations of PWM dimming for realizing Expected Photometric and Colorimetric Quantities (EPCQ), based on an analysis of the geometrical, photometric, colorimetric, and electrodynamic constraints. A quantitative model that maps the EPCQ into duty cycles is finally established. The model shows that this mapping is uniquely determined only for two-channel and three-channel PWM, whereas the limitation applies to any number of channels. To examine the model, a light-mixing experiment with two kinds of white LED simulated the variation of illuminance and Correlated Color Temperature (CCT) from dawn to midday. The mean deviations between theoretical and measured values were 15 lx and 23 K, respectively. The results show that this method can effectively realize a light spectrum meeting specific EPCQ requirements and provides a theoretical basis and a practical way to implement dynamic LED lighting.
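    For the two-channel case, mapping target photometric and colorimetric quantities to duty cycles reduces to a small linear system if the mixing is assumed additive and linear in duty cycle. The sketch below solves for the duty cycles that reach a target luminance and chromaticity x; the channel tristimulus values are placeholders and the formulation is an illustration of the idea, not the authors' exact model.

    ```python
    import numpy as np

    def two_channel_duty_cycles(xyz1, xyz2, target_Y, target_x):
        """Duty cycles (d1, d2) of two LED channels so the mixture reaches a
        target luminance Y and chromaticity x.

        xyz1, xyz2: full-duty-cycle tristimulus values (X, Y, Z) per channel.
        Assumes additive mixing linear in duty cycle (hedged sketch).
        """
        X1, Y1, Z1 = xyz1
        X2, Y2, Z2 = xyz2
        S1, S2 = X1 + Y1 + Z1, X2 + Y2 + Z2
        # Equations: d1*Y1 + d2*Y2 = target_Y
        #            d1*(X1 - target_x*S1) + d2*(X2 - target_x*S2) = 0
        A = np.array([[Y1, Y2],
                      [X1 - target_x * S1, X2 - target_x * S2]])
        b = np.array([target_Y, 0.0])
        d = np.linalg.solve(A, b)
        if np.any(d < 0) or np.any(d > 1):
            raise ValueError("target outside the gamut/luminance range of the channels")
        return d

    # Placeholder channel data: warm-white and cool-white LEDs at 100% duty cycle
    warm = (110.0, 100.0, 35.0)    # X, Y, Z in arbitrary luminance units
    cool = (95.0, 100.0, 120.0)
    print(two_channel_duty_cycles(warm, cool, target_Y=120.0, target_x=0.40))
    ```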

  5. Quantitative methods for somatosensory evaluation in atypical odontalgia.

    PubMed

    Porporatti, André Luís; Costa, Yuri Martins; Stuginski-Barbosa, Juliana; Bonjardim, Leonardo Rigoldi; Conti, Paulo César Rodrigues; Svensson, Peter

    2015-01-01

    A systematic review was conducted to identify reliable somatosensory evaluation methods for atypical odontalgia (AO) patients. The computerized search included the main databases (MEDLINE, EMBASE, and Cochrane Library). The studies included used the following quantitative sensory testing (QST) methods: mechanical detection threshold (MDT), mechanical pain threshold (MPT) (pinprick), pressure pain threshold (PPT), dynamic mechanical allodynia with a cotton swab (DMA1) or a brush (DMA2), warm detection threshold (WDT), cold detection threshold (CDT), heat pain threshold (HPT), cold pain threshold (CPT), and/or wind-up ratio (WUR). The publications meeting the inclusion criteria revealed that only the mechanical allodynia measures (DMA1, DMA2, and WUR) were significantly higher, and the heat pain threshold (HPT) significantly lower, on the affected side compared with the contralateral side in AO patients; for MDT, MPT, PPT, CDT, and WDT, the differences were not significant. These data support the presence of central sensitization features, such as allodynia and temporal summation. In contrast, considerable inconsistencies between studies were found when AO patients were compared with healthy subjects. In clinical settings, the most reliable evaluation method for AO in patients with persistent idiopathic facial pain would be intraindividual assessments using HPT or mechanical allodynia tests. PMID:25627886

  6. University Students' Orientation to Qualitative and Quantitative Research Methods.

    ERIC Educational Resources Information Center

    Murtonen, Mari

    This study aimed to determine whether different orientations toward qualitative and quantitative methods can be found among students. Data were collected during 3 years from different research methodology course students. There were 195 Finnish students and 122 U.S. students who answered a questionnaire about the appreciation of research methods…

  7. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  8. Quantitative methods for analyzing cell-cell adhesion in development.

    PubMed

    Kashef, Jubin; Franz, Clemens M

    2015-05-01

    During development cell-cell adhesion is not only crucial to maintain tissue morphogenesis and homeostasis, it also activates signalling pathways important for the regulation of different cellular processes including cell survival, gene expression, collective cell migration and differentiation. Importantly, gene mutations of adhesion receptors can cause developmental disorders and different diseases. Quantitative methods to measure cell adhesion are therefore necessary to understand how cells regulate cell-cell adhesion during development and how aberrations in cell-cell adhesion contribute to disease. Different in vitro adhesion assays have been developed in the past, but not all of them are suitable to study developmentally-related cell-cell adhesion processes, which usually requires working with low numbers of primary cells. In this review, we provide an overview of different in vitro techniques to study cell-cell adhesion during development, including a semi-quantitative cell flipping assay, and quantitative single-cell methods based on atomic force microscopy (AFM)-based single-cell force spectroscopy (SCFS) or dual micropipette aspiration (DPA). Furthermore, we review applications of Förster resonance energy transfer (FRET)-based molecular tension sensors to visualize intracellular mechanical forces acting on cell adhesion sites. Finally, we describe a recently introduced method to quantitate cell-generated forces directly in living tissues based on the deformation of oil microdroplets functionalized with adhesion receptor ligands. Together, these techniques provide a comprehensive toolbox to characterize different cell-cell adhesion phenomena during development.

  9. Analysis of 129I in Groundwater Samples: Direct and Quantitative Results below the Drinking Water Standard

    SciTech Connect

    Brown, Christopher F.; Geiszler, Keith N.; Lindberg, Michael J.

    2007-03-03

    Due to its long half-life (15.7 million years) and relatively unencumbered migration in subsurface environments, 129I has been recognized as a contaminant of concern at numerous federal, private, and international facilities. In order to understand the long-term risk associated with 129I at these locations, quantitative analysis of groundwater samples must be performed. However, the ability to quantitatively assess the 129I content in groundwater samples requires specialized extraction and sophisticated analytical techniques, which are complicated and not always available to the general scientific community. This paper highlights an analytical method capable of directly quantifying 129I in groundwater samples at concentrations below the maximum contaminant level (MCL) without the need for sample pre-concentration. Samples were analyzed on a Perkin Elmer ELAN DRC II ICP-MS after minimal dilution using O2 as the reaction gas. Analysis of continuing calibration verification standards indicated that the DRC mode could be used for quantitative analysis of 129I in samples below the drinking water standard (0.0057 ng/ml or 1 pCi/L). The low analytical detection limit of 129I analysis in the DRC mode coupled with minimal sample dilution (1.02x) resulted in a final sample limit of quantification of 0.0051 ng/ml. Subsequent analysis of three groundwater samples containing 129I resulted in fully quantitative results in the DRC mode, and spike recovery analyses performed on all three samples confirmed that the groundwater matrix did not adversely impact the analysis of 129I in the DRC mode. This analytical approach has been proven to be a cost-effective, high-throughput technique for the direct, quantitative analysis of 129I in groundwater samples at concentrations below the current MCL.

  10. Gap analysis: Concepts, methods, and recent results

    USGS Publications Warehouse

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  11. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on some selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab and enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the area of the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. PMID:26556680

  12. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on some selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab and enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the area of the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment.

  13. Quantitative results of stellar evolution and pulsation theories.

    NASA Technical Reports Server (NTRS)

    Fricke, K.; Stobie, R. S.; Strittmatter, P. A.

    1971-01-01

    The discrepancy between the masses of Cepheid variables deduced from evolution theory and pulsation theory is examined. The effect of input physics on evolutionary tracks is first discussed; in particular, changes in the opacity are considered. The sensitivity of pulsation masses to opacity changes and to the ascribed values of luminosity and effective temperature are then analyzed. The Cepheid mass discrepancy is discussed in the light of the results already obtained. Other astronomical evidence, including the mass-luminosity relation for main sequence stars, the solar neutrino flux, and cluster ages are also considered in an attempt to determine the most likely source of error in the event that substantial mass loss has not occurred.

  14. Quantitative method of measuring cancer cell urokinase and metastatic potential

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  15. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, in terms of both quantitative assays and qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized together with recently published studies. PMID:24889823

  16. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, in terms of both quantitative assays and qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized together with recently published studies.

  17. Comparison of multivariate calibration methods for quantitative spectral analysis

    SciTech Connect

    Thomas, E.V.; Haaland, D.M. )

    1990-05-15

    The quantitative prediction abilities of four multivariate calibration methods for spectral analyses are compared by using extensive Monte Carlo simulations. The calibration methods compared include inverse least-squares (ILS), classical least-squares (CLS), partial least-squares (PLS), and principal component regression (PCR) methods. ILS is a frequency-limited method while the latter three are capable of full-spectrum calibration. The simulations were performed assuming Beer's law holds and that spectral measurement errors and concentration errors associated with the reference method are normally distributed. Eight different factors that could affect the relative performance of the calibration methods were varied in a two-level, eight-factor experimental design in order to evaluate their effect on the prediction abilities of the four methods. It is found that each of the three full-spectrum methods has its range of superior performance. The frequency-limited ILS method was never the best method, although in the presence of relatively large concentration errors it sometimes yields comparable analysis precision to the full-spectrum methods for the major spectral component. The importance of each factor in the absolute and relative performances of the four methods is compared.
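    A small-scale analogue of such a comparison can be run with scikit-learn: simulate Beer's-law spectra with noise and compare the prediction error of PLS against PCR. This is only a toy illustration of the study design, not a reproduction of the paper's Monte Carlo experiments; all spectra, concentrations, and noise levels are invented.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)

    # Synthetic Beer's-law data: 2 components with Gaussian pure-component spectra
    wavelengths = np.linspace(0, 1, 120)
    pure = np.vstack([np.exp(-0.5 * ((wavelengths - c) / 0.06) ** 2) for c in (0.35, 0.60)])
    conc_train = rng.uniform(0, 1, size=(60, 2))
    conc_test = rng.uniform(0, 1, size=(30, 2))
    X_train = conc_train @ pure + rng.normal(0, 0.01, size=(60, 120))
    X_test = conc_test @ pure + rng.normal(0, 0.01, size=(30, 120))
    y_train, y_test = conc_train[:, 0], conc_test[:, 0]   # predict component 1

    pls = PLSRegression(n_components=2).fit(X_train, y_train)
    pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X_train, y_train)

    rmsep = lambda model: np.sqrt(np.mean((model.predict(X_test).ravel() - y_test) ** 2))
    print("PLS RMSEP:", round(rmsep(pls), 4))
    print("PCR RMSEP:", round(rmsep(pcr), 4))
    ```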

  18. Analytical methods for quantitation of prenylated flavonoids from hops.

    PubMed

    Nikolić, Dejan; van Breemen, Richard B

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106

  19. Analytical methods for quantitation of prenylated flavonoids from hops

    PubMed Central

    Nikolić, Dejan; van Breemen, Richard B.

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106

  20. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America and the Quantitative Imaging Biomarkers Alliance, together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
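    Two of the metrology quantities discussed (repeatability and bias/linearity) can be illustrated with a short sketch: a within-subject standard deviation and repeatability coefficient from test-retest pairs, and a linear fit of measured against reference values. The data are invented and the formulas are common metrology conventions, not necessarily the exact estimators prescribed by the framework.

    ```python
    import numpy as np

    def repeatability_coefficient(test, retest):
        """Within-subject SD and repeatability coefficient (RC ~ 2.77 * wSD)
        from paired test-retest measurements of the same biomarker."""
        d = np.asarray(test, float) - np.asarray(retest, float)
        wsd = float(np.sqrt(np.mean(d ** 2) / 2.0))
        return wsd, 2.77 * wsd

    def linearity_and_bias(measured, truth):
        """Slope/intercept of measured vs. reference values and the mean bias."""
        slope, intercept = np.polyfit(truth, measured, 1)
        bias = float(np.mean(np.asarray(measured) - np.asarray(truth)))
        return float(slope), float(intercept), bias

    # Toy test-retest tumor-volume measurements (mL) and phantom reference values
    test, retest = [10.1, 14.8, 22.3, 30.5], [10.6, 14.1, 23.0, 29.6]
    measured, truth = [10.4, 15.2, 21.8, 30.9], [10.0, 15.0, 22.0, 30.0]
    print(repeatability_coefficient(test, retest))
    print(linearity_and_bias(measured, truth))
    ```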

  1. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  2. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  3. Quantitative cell imaging using single beam phase retrieval method

    NASA Astrophysics Data System (ADS)

    Anand, Arun; Chhaniwal, Vani; Javidi, Bahram

    2011-06-01

    Quantitative three-dimensional imaging of cells can provide important information about their morphology as well as their dynamics, which will be useful in studying their behavior under various conditions. There are several microscopic techniques for imaging unstained, semi-transparent specimens by converting phase information into intensity information. However, most quantitative phase contrast imaging techniques are realized either by interfering the object wavefront with a known reference beam or by phase shifting interferometry. A two-beam interferometric method is challenging to implement, especially with low-coherence sources, and it also requires fine adjustment of the beams to achieve high-contrast fringes. In this letter, the development of a single-beam phase retrieval microscopy technique for quantitative phase contrast imaging of cells, using multiple intensity samplings of a volume speckle field along the axial direction, is described. Single-beam illumination with multiple intensity samplings provides fast convergence and a unique solution for the object wavefront. Three-dimensional thickness profiles of different cells, such as red blood cells and onion skin cells, were reconstructed using this technique with an axial resolution of the order of several nanometers.
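    A generic version of this kind of single-beam, multi-plane phase retrieval propagates an estimate of the complex field between the axial measurement planes with the angular spectrum method and replaces its amplitude with the measured one at each plane. The sketch below follows that general scheme on synthetic data; it is written from the description in the abstract, not from the authors' implementation, and all optical parameters are placeholders.

    ```python
    import numpy as np

    def angular_spectrum(field, dz, wavelength, dx):
        """Propagate a complex field by distance dz with the angular spectrum method."""
        n = field.shape[0]
        fx = np.fft.fftfreq(n, d=dx)
        FX, FY = np.meshgrid(fx, fx)
        arg = 1.0 / wavelength**2 - FX**2 - FY**2
        kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
        H = np.exp(1j * kz * dz) * (arg > 0)     # evanescent components dropped
        return np.fft.ifft2(np.fft.fft2(field) * H)

    def multiplane_phase_retrieval(intensities, dz, wavelength, dx, n_iter=20):
        """Recover the complex field at the first plane from a stack of axially
        separated intensity images (spacing dz). Generic multi-plane scheme."""
        amps = [np.sqrt(I) for I in intensities]
        field = amps[0] * np.exp(1j * np.zeros_like(amps[0]))
        for _ in range(n_iter):
            for k in range(1, len(amps)):                      # forward through the stack
                field = angular_spectrum(field, dz, wavelength, dx)
                field = amps[k] * np.exp(1j * np.angle(field))  # keep phase, fix amplitude
            field = angular_spectrum(field, -dz * (len(amps) - 1), wavelength, dx)
            field = amps[0] * np.exp(1j * np.angle(field))      # back to the first plane
        return field

    # Tiny synthetic demo: random-phase object recorded at 4 planes 20 um apart
    rng = np.random.default_rng(1)
    n, dx, wl, dz = 128, 2e-6, 633e-9, 20e-6
    obj = np.exp(1j * rng.uniform(0, 1.5, (n, n)))
    stack = [np.abs(angular_spectrum(obj, k * dz, wl, dx)) ** 2 for k in range(4)]
    rec = multiplane_phase_retrieval(stack, dz, wl, dx)
    print("recovered phase std:", float(np.std(np.angle(rec))))
    ```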

  4. [Development of rapid methods for quantitative analysis of proteolytic reactions].

    PubMed

    Beloivan, O A; Tsvetkova, M N; Bubriak, O A

    2002-01-01

    Approaches to the development of rapid methods for the quantitative control of proteolytic reactions are discussed. Recently, these reactions have taken on special significance for addressing many important problems of theoretical and practical medicine and biology, as well as for technological, pharmacological, and ecological monitoring. Traditional methods can be improved both by the use of immobilized enzymes and substrates and by combining various classical biochemical and immunological approaches. The synthesis of substrates with specified properties allows new methods to be realized for studying proteinase activity and the kinetic characteristics of the corresponding reactions, both in vitro and in vivo. The application of biosensor technology is a promising trend, since it saves analysis time and cost, allows the direct interaction between enzymes and their inhibitors and activators to be studied in real time, and allows quantitative measurements to be performed both in liquids and in air. In addition, biosensor techniques are readily compatible with computer data processing. PMID:12924013

  5. Biological characteristics of crucian by quantitative inspection method

    NASA Astrophysics Data System (ADS)

    Chu, Mengqi

    2015-04-01

    The biological characteristics of crucian carp were preliminarily investigated using a quantitative inspection method. The crucian carp (Carassius auratus), of the order Cypriniformes and family Cyprinidae, is a mainly plant-eating omnivorous fish that is gregarious and exhibits selection and ranking behaviour. Crucian carp are widely distributed and are produced in perennial waters throughout the country. Indicators were measured in the experiment to understand the growth and reproduction of crucian carp in this area. Growth in any given year was back-calculated from the measured data (such as scale length, scale size, and annulus diameter) and the related functions. Maturity was judged from egg shape, colour, and weight, and the relative and absolute fecundity of the fish were calculated from the mean egg diameter per 20 eggs and the number of eggs per 0.5 g. The measured crucian carp were females at puberty. From the relation between scale diameter and body length, a linear relationship was obtained: y = 1.530 + 3.0649x. The data show that fecundity is closely related to age: the older the fish, the more mature the gonads and the greater the number of eggs; in addition, absolute fecundity increases with pituitary development. Quantitative inspection of the ingested bait organisms revealed the main foods, secondary foods, and incidental foods of crucian carp, and indicated the degree to which crucian carp prefer various bait organisms. Fecundity increases with weight; it is characteristic of species and populations and is also influenced by individual age, body length, body weight, environmental conditions (especially nutritional conditions), breeding habits, spawning frequency, and egg size. This series of studies of the biological characteristics of crucian carp provides an ecological basis for local crucian carp feeding and breeding.

  6. Mapcurves: A Quantitative Method for Comparing Categorical Maps

    SciTech Connect

    Hargrove, William Walter; Hoffman, Forrest M; Hessburg, Paul

    2006-01-01

    We present Mapcurves, a quantitative goodness-of-fit (GOF) method that unambiguously shows the degree of spatial concordance between two or more categorical maps. Mapcurves graphically and quantitatively evaluate the degree of fit among any number of maps and quantify a GOF for each polygon, as well as the entire map. The Mapcurve method indicates a perfect fit even if all polygons in one map are comprised of unique sets of the polygons in another map, if the coincidence among map categories is absolute. It is not necessary to interpret (or even know) legend descriptors for the categories in the maps to be compared, since the degree of fit in the spatial overlay alone forms the basis for the comparison. This feature makes Mapcurves ideal for comparing maps derived from remotely sensed images. A translation table is provided for the categories in each map as an output. Since the comparison is category-based rather than cell-based, the GOF is resolution-independent. Mapcurves can be applied either to entire map categories or to individual raster patches or vector polygons. Mapcurves also have applications for quantifying the spatial uncertainty of particular map features.
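    Following the Mapcurves idea, the goodness of fit of a category a in map A against map B can be written as GOF_a = Σ_b (C_ab/A_a)(C_ab/B_b), where C_ab is the overlap area, A_a the total area of category a in map A, and B_b the total area of category b in map B. The sketch below computes this per category for two small rasters; it is a compact illustration of the concept, not the authors' code, and the rasters are invented.

    ```python
    import numpy as np

    def mapcurves_gof(map_a, map_b):
        """Per-category goodness of fit of map_a against map_b,
        GOF_a = sum_b (C_ab / A_a) * (C_ab / B_b), computed cell-by-cell."""
        gof = {}
        for a in np.unique(map_a):
            in_a = map_a == a
            A_a = in_a.sum()
            score = 0.0
            for b in np.unique(map_b):
                in_b = map_b == b
                C_ab = np.logical_and(in_a, in_b).sum()
                if C_ab:
                    score += (C_ab / A_a) * (C_ab / in_b.sum())
            gof[int(a)] = score
        return gof

    # Two small categorical rasters (e.g. land-cover classifications)
    map_a = np.array([[1, 1, 2, 2],
                      [1, 1, 2, 2],
                      [3, 3, 3, 2]])
    map_b = np.array([[1, 1, 2, 2],
                      [1, 2, 2, 2],
                      [3, 3, 1, 2]])
    print(mapcurves_gof(map_a, map_b))
    ```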

  7. Comparison of the multiple-sample means with composite sample results for fecal indicator bacteria by quantitative PCR and culture.

    PubMed

    Converse, Reagan R; Wymer, Larry J; Dufour, Alfred P; Wade, Timothy J

    2012-10-01

    Few studies have addressed the efficacy of composite sampling for measuring indicator bacteria by quantitative PCR (qPCR). We compared results from composited samples with multiple-sample means for culture- and qPCR-based water quality monitoring. Results from composited samples for both methods were similarly correlated to multiple-sample means and predicted criteria exceedances equally.

  8. Quantitative Methods in the Study of Local History

    ERIC Educational Resources Information Center

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment roles, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  9. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula: N = Bt x 100 / At where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples. PMID:24190861

  10. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula: N = Bt x 100 / At where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples.
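    The reported formula N = Bt x 100 / At reduces to a one-line computation once the "polyline" areas have been exported from AutoCAD; a minimal sketch with invented areas:

    ```python
    def bleeding_percentage(bleeding_areas, total_areas):
        """N = Bt * 100 / At, with Bt the summed bleeding ('polyline') areas and
        At the summed sample areas, both in the same (AutoCAD) drawing units."""
        return 100.0 * sum(bleeding_areas) / sum(total_areas)

    # Example areas (arbitrary drawing units) for three microphotographs
    print(round(bleeding_percentage([1.2, 0.8, 2.1], [14.5, 13.9, 15.2]), 2))
    ```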

  11. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, in terms of both quantitative assays and qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized together with recently published studies. PMID:24889823

  12. Quantitative methods in electroencephalography to access therapeutic response.

    PubMed

    Diniz, Roseane Costa; Fontenele, Andrea Martins Melo; Carmo, Luiza Helena Araújo do; Ribeiro, Aurea Celeste da Costa; Sales, Fábio Henrique Silva; Monteiro, Sally Cristina Moutinho; Sousa, Ana Karoline Ferreira de Castro

    2016-07-01

    Pharmacometrics, or quantitative pharmacology, aims to quantitatively analyze the interaction between drugs and patients, resting on the tripod of pharmacokinetics, pharmacodynamics and disease monitoring, in order to identify variability in drug response. As a subject of central interest in the training of pharmacists, this work was carried out with a view to promoting this idea through methods to assess the therapeutic response of centrally acting drugs. This paper discusses quantitative methods (Fast Fourier Transform, Magnitude Squared Coherence, Conditional Entropy, Generalised Linear Semi-canonical Correlation Analysis, Statistical Parametric Network and Mutual Information Function) used to evaluate EEG signals obtained after drug administration regimens, the main findings and their clinical relevance, presenting them as a contribution to the construction of a different pharmaceutical practice. Anderer et al. (2000) showed the effect of 20 mg of buspirone in 20 healthy subjects at 1, 2, 4, 6 and 8 h after oral ingestion of the drug; the areas of increased theta-band power occurred mainly in the temporo-occipito-parietal region. Sampaio et al. (2007) showed that the use of bromazepam, which allows the release of GABA (gamma-aminobutyric acid), an inhibitory neurotransmitter of the central nervous system, could theoretically promote dissociation of cortical functional areas, a decrease of functional connectivity and a decrease of cognitive functions, reflected in smaller coherence values (an electrophysiological magnitude computed from the EEG by software). Khodayari-Rostamabad et al. (2015) reported that such a measure could be a useful clinical tool to assess adverse effects of opioids and hence inform treatment guidelines; a relation was found between changes in pain intensity and brain sources (at maximum activity locations) during remifentanil infusion despite its potent analgesic effect. The statement of mathematical and computational

  13. Quantitative methods in electroencephalography to access therapeutic response.

    PubMed

    Diniz, Roseane Costa; Fontenele, Andrea Martins Melo; Carmo, Luiza Helena Araújo do; Ribeiro, Aurea Celeste da Costa; Sales, Fábio Henrique Silva; Monteiro, Sally Cristina Moutinho; Sousa, Ana Karoline Ferreira de Castro

    2016-07-01

    Pharmacometrics, or quantitative pharmacology, aims to quantitatively analyze the interaction between drugs and patients, resting on the tripod of pharmacokinetics, pharmacodynamics and disease monitoring, in order to identify variability in drug response. As a subject of central interest in the training of pharmacists, this work was carried out with a view to promoting this idea through methods to assess the therapeutic response of centrally acting drugs. This paper discusses quantitative methods (Fast Fourier Transform, Magnitude Squared Coherence, Conditional Entropy, Generalised Linear Semi-canonical Correlation Analysis, Statistical Parametric Network and Mutual Information Function) used to evaluate EEG signals obtained after drug administration regimens, the main findings and their clinical relevance, presenting them as a contribution to the construction of a different pharmaceutical practice. Anderer et al. (2000) showed the effect of 20 mg of buspirone in 20 healthy subjects at 1, 2, 4, 6 and 8 h after oral ingestion of the drug; the areas of increased theta-band power occurred mainly in the temporo-occipito-parietal region. Sampaio et al. (2007) showed that the use of bromazepam, which allows the release of GABA (gamma-aminobutyric acid), an inhibitory neurotransmitter of the central nervous system, could theoretically promote dissociation of cortical functional areas, a decrease of functional connectivity and a decrease of cognitive functions, reflected in smaller coherence values (an electrophysiological magnitude computed from the EEG by software). Khodayari-Rostamabad et al. (2015) reported that such a measure could be a useful clinical tool to assess adverse effects of opioids and hence inform treatment guidelines; a relation was found between changes in pain intensity and brain sources (at maximum activity locations) during remifentanil infusion despite its potent analgesic effect. The statement of mathematical and computational
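    Several of the measures named in these records are standard signal-processing quantities. As a minimal illustration, the sketch below computes magnitude-squared coherence between two channels using scipy on synthetic signals that stand in for real EEG recordings; the sampling rate, band limits and signals are assumptions, not values from the cited studies.

```python
# Minimal sketch of one of the quantitative EEG measures named above:
# magnitude-squared coherence between two channels, on synthetic signals.
import numpy as np
from scipy.signal import coherence

fs = 256.0                                  # assumed sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
theta = np.sin(2 * np.pi * 6 * t)           # shared 6 Hz (theta) component
ch1 = theta + 0.5 * rng.standard_normal(t.size)
ch2 = 0.8 * theta + 0.5 * rng.standard_normal(t.size)

f, cxy = coherence(ch1, ch2, fs=fs, nperseg=int(4 * fs))
theta_band = (f >= 4) & (f <= 8)
print(f"mean theta-band coherence: {cxy[theta_band].mean():.2f}")
```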

  14. System and methods for wide-field quantitative fluorescence imaging during neurosurgery.

    PubMed

    Valdes, Pablo A; Jacobs, Valerie L; Wilson, Brian C; Leblond, Frederic; Roberts, David W; Paulsen, Keith D

    2013-08-01

    We report an accurate, precise and sensitive method and system for quantitative fluorescence image-guided neurosurgery. With a low-noise, high-dynamic-range CMOS array, we perform rapid (integration times as low as 50 ms per wavelength) hyperspectral fluorescence and diffuse reflectance detection and apply a correction algorithm to compensate for the distorting effects of tissue absorption and scattering. Using this approach, we generated quantitative wide-field images of fluorescence in tissue-simulating phantoms for the fluorophore PpIX, having concentrations and optical absorption and scattering variations over clinically relevant ranges. The imaging system was tested in a rodent model of glioma, detecting quantitative levels down to 20 ng/ml. The resulting performance is a significant advance on existing wide-field quantitative imaging techniques, and provides performance comparable to a point-spectroscopy probe that has previously demonstrated significant potential for improved detection of malignant brain tumors during surgical resection. PMID:23903142
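    The published correction algorithm is based on hyperspectral fluorescence and diffuse reflectance measurements; the sketch below is only a generic, simplified illustration of the idea of normalizing raw fluorescence by measured diffuse reflectance to reduce absorption and scattering effects. The functional form and the exponents are hypothetical and are not taken from the paper.

```python
# Generic illustration only (not the published correction algorithm):
# normalizing raw fluorescence by measured diffuse reflectance to reduce
# the influence of tissue absorption and scattering. The exponents are
# hypothetical tuning parameters, not values from the paper.
import numpy as np

def corrected_fluorescence(f_raw, r_excitation, r_emission, a=1.0, b=1.0):
    """f_raw, r_excitation, r_emission: 2-D arrays (wide-field images)."""
    eps = 1e-9  # guard against division by zero in dark regions
    return f_raw / (np.power(r_excitation + eps, a) * np.power(r_emission + eps, b))

# Hypothetical 64 x 64 pixel images.
rng = np.random.default_rng(1)
f_raw = rng.uniform(0.0, 1.0, (64, 64))
r_ex = rng.uniform(0.2, 0.8, (64, 64))
r_em = rng.uniform(0.2, 0.8, (64, 64))
q = corrected_fluorescence(f_raw, r_ex, r_em)
print(q.shape, float(q.mean()))
```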

  15. Methods for Quantitative Interpretation of Retarding Field Analyzer Data

    SciTech Connect

    Calvey, J.R.; Crittenden, J.A.; Dugan, G.F.; Palmer, M.A.; Furman, M.; Harkay, K.

    2011-03-28

    Over the course of the CesrTA program at Cornell, more than 30 Retarding Field Analyzers (RFAs) have been installed in the CESR storage ring, and a great deal of data has been taken with them. These devices measure the local electron cloud density and energy distribution, and can be used to evaluate the efficacy of different cloud mitigation techniques. Obtaining a quantitative understanding of RFA data requires the use of cloud simulation programs, as well as a detailed model of the detector itself. In a drift region, the RFA can be modeled by postprocessing the output of a simulation code, and one can obtain best-fit values for important simulation parameters with a chi-square minimization method.
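    As a rough illustration of the fitting step mentioned above, the sketch below minimizes a chi-square objective over two parameters of a toy detector model using scipy.optimize; the model, data and uncertainties are invented stand-ins for the actual cloud-simulation postprocessing.

```python
# Generic sketch of the fitting step: choose model parameters that minimize
# chi-square against measured detector signals. The "simulate" model and the
# data are hypothetical stand-ins for the actual RFA/cloud-simulation output.
import numpy as np
from scipy.optimize import minimize

def simulate(params, energy_bins):
    amplitude, temperature = params
    return amplitude * np.exp(-energy_bins / temperature)

energy_bins = np.linspace(10, 200, 20)             # eV, hypothetical
true_signal = simulate((5.0, 60.0), energy_bins)
rng = np.random.default_rng(2)
measured = true_signal + rng.normal(0, 0.05, true_signal.size)
sigma = np.full_like(measured, 0.05)

def chi_square(params):
    residual = (measured - simulate(params, energy_bins)) / sigma
    return np.sum(residual ** 2)

fit = minimize(chi_square, x0=(1.0, 30.0), method="Nelder-Mead")
print("best-fit parameters:", fit.x, "chi-square:", round(fit.fun, 2))
```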

  16. Reproducibility of CSF quantitative culture methods for estimating rate of clearance in cryptococcal meningitis.

    PubMed

    Dyal, Jonathan; Akampurira, Andrew; Rhein, Joshua; Morawski, Bozena M; Kiggundu, Reuben; Nabeta, Henry W; Musubire, Abdu K; Bahr, Nathan C; Williams, Darlisha A; Bicanic, Tihana; Larsen, Robert A; Meya, David B; Boulware, David R

    2016-05-01

    Quantitative cerebrospinal fluid (CSF) cultures provide a measure of disease severity in cryptococcal meningitis. The fungal clearance rate by quantitative cultures has become a primary endpoint for phase II clinical trials. This study determined the inter-assay accuracy of three different quantitative culture methodologies. Among 91 participants with meningitis symptoms in Kampala, Uganda, during August-November 2013, 305 CSF samples were prospectively collected from patients at multiple time points during treatment. Samples were simultaneously cultured by three methods: (1) the St. George's method, using a 100 mcl input volume of CSF with five 1:10 serial dilutions; (2) the AIDS Clinical Trials Group (ACTG) method, using 1000, 100 and 10 mcl input volumes and two 1:100 dilutions with 100 and 10 mcl input volumes per dilution on seven agar plates; and (3) a 10 mcl calibrated loop of undiluted and 1:100 diluted CSF (loop). Quantitative culture values did not statistically differ between the St. George and ACTG methods (P = .09) but did between the St. George and 10 mcl loop methods (P < .001). Repeated-measures pairwise correlation between any of the methods was high (r ≥ 0.88). For detecting sterility, the ACTG method had the highest negative predictive value of 97% (91% St. George, 60% loop), but the ACTG method had occasional (∼10%) difficulties in quantification due to colony clumping. For CSF clearance rate, the St. George and ACTG methods did not differ overall (mean -0.05 ± 0.07 log10 CFU/ml/day; P = .14) on a group level; however, individual-level clearance varied. The St. George and ACTG quantitative CSF culture methods produced comparable but not identical results. Quantitative cultures can inform treatment management strategies.
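    The quantities reported here rest on two simple calculations: converting colony counts to CFU/ml using the dilution factor and plated volume, and taking the clearance rate as the least-squares slope of log10(CFU/ml) against time. The sketch below illustrates both with hypothetical counts and volumes.

```python
# Sketch of the arithmetic behind quantitative cultures and the clearance
# rate used as a trial endpoint: CFU/ml from colony counts, and the rate of
# clearance as the least-squares slope of log10(CFU/ml) versus day.
# Counts, dilutions and volumes below are hypothetical.
import numpy as np

def cfu_per_ml(colonies, dilution_factor, plated_volume_ml):
    return colonies * dilution_factor / plated_volume_ml

# Example: 42 colonies on a 1:100 dilution plate, 0.1 ml (100 mcl) plated.
print(cfu_per_ml(42, 100, 0.1))                 # 42,000 CFU/ml

def clearance_rate(days, cfu_values):
    """Slope in log10 CFU/ml per day (negative = clearing)."""
    logs = np.log10(cfu_values)
    slope, _intercept = np.polyfit(days, logs, 1)
    return slope

days = np.array([0, 3, 7, 14])
cfu = np.array([1e5, 2e4, 3e3, 2e2])
print(f"clearance rate: {clearance_rate(days, cfu):.2f} log10 CFU/ml/day")
```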

  17. Experimental demonstration of quantitation errors in MR spectroscopy resulting from saturation corrections under changing conditions

    NASA Astrophysics Data System (ADS)

    Galbán, Craig J.; Ellis, Scott J.; Spencer, Richard G. S.

    2003-04-01

    Metabolite concentration measurements in in vivo NMR are generally performed under partially saturated conditions, with correction for partial saturation performed after data collection using a measured saturation factor. Here, we present an experimental test of the hypothesis that quantitation errors can occur due to application of such saturation factor corrections in changing systems. Thus, this extends our previous theoretical work on quantitation errors due to varying saturation factors. We obtained results for two systems frequently studied by 31P NMR, the ischemic rat heart and the electrically stimulated rat gastrocnemius muscle. The results are interpreted in light of previous theoretical work which defined the degree of saturation occurring in a one-pulse experiment for a system with given spin-lattice relaxation times, T1s, equilibrium magnetizations, M0s, and reaction rates. We found that (i) the assumption of constancy of saturation factors leads to quantitation errors on the order of 40% in inorganic phosphate; (ii) the dominant contributor to the quantitation errors in inorganic phosphate is most likely changes in T1; (iii) T1 and M0 changes between control and intervention periods, and chemical exchange contribute to different extents to quantitation errors in phosphocreatine and γ-ATP; (iv) relatively small increases in interpulse delay substantially decreased quantitation errors for metabolites in ischemic rat hearts; (v) random error due to finite SNR led to approximately 4% error in quantitation, and hence was a substantially smaller contributor than were changes in saturation factors.
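    The underlying issue can be illustrated with the single-exponential saturation model for a one-pulse experiment, ignoring chemical exchange: the observed signal is M0(1 - exp(-TR/T1)), and dividing by a saturation factor measured under control conditions produces an error once T1 has changed. The sketch below demonstrates this with hypothetical TR and T1 values; it is not the authors' analysis.

```python
# Sketch of the partial-saturation correction discussed above, ignoring
# chemical exchange: in a simple one-pulse experiment the observed signal is
# S = M0 * (1 - exp(-TR / T1)), so the saturation factor is (1 - exp(-TR/T1)).
# Applying a saturation factor measured under control conditions to data
# acquired after T1 has changed produces a quantitation error. Values are
# hypothetical.
import numpy as np

def saturation_factor(tr, t1):
    return 1.0 - np.exp(-tr / t1)

tr = 2.0                      # s, repetition time
t1_control = 4.0              # s, T1 during the control period
t1_intervention = 3.0         # s, T1 after the intervention (changed)

m0 = 1.0                      # arbitrary fully relaxed magnetization
observed = m0 * saturation_factor(tr, t1_intervention)

# Correct with the (now stale) control saturation factor:
corrected_stale = observed / saturation_factor(tr, t1_control)
error_pct = 100 * (corrected_stale - m0) / m0
print(f"apparent M0 = {corrected_stale:.3f}, error = {error_pct:+.1f}%")
```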

  18. Implementation of a quantitative Foucault knife-edge method by means of isophotometry

    NASA Astrophysics Data System (ADS)

    Zhevlakov, A. P.; Zatsepina, M. E.; Kirillovskii, V. K.

    2014-06-01

    A detailed description of the stages of computer processing of shadowgrams in the implementation of a modern quantitative Foucault knife-edge method is presented. The map of wave-front aberrations introduced by errors of an optical surface or system is shown, along with the results of calculating the set of required image-quality characteristics.

  19. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  20. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    ERIC Educational Resources Information Center

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  1. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  2. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    SciTech Connect

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be to inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict, both retrospectively and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempt at modeling conflict as a result of system-level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, along with recommendations for future attempts at modeling conflict.

  3. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
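    The reported indices follow directly from the 2 x 2 table of test result versus final diagnosis. The sketch below shows that arithmetic; the counts are hypothetical and are not the study's data.

```python
# Sketch of how the reported diagnostic indices are derived from a 2 x 2
# table of test result versus final diagnosis; the counts are hypothetical.
def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, ppv, npv, accuracy

sens, spec, ppv, npv, acc = diagnostic_metrics(tp=150, fp=90, fn=91, tn=129)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}, accuracy {acc:.1%}")
```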

  4. Breast tumour visualization using 3D quantitative ultrasound methods

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Raheem, Abdul; Tadayyon, Hadi; Liu, Simon; Hadizad, Farnoosh; Czarnota, Gregory J.

    2016-04-01

    Breast cancer is one of the most common cancer types, accounting for 29% of all cancer cases. Early detection and treatment have a crucial impact on improving the survival of affected patients. Ultrasound (US) is a non-ionizing, portable, inexpensive, real-time imaging modality for screening and quantifying breast cancer. Due to these attractive attributes, the last decade has witnessed many studies on using quantitative ultrasound (QUS) methods in tissue characterization. However, these studies have mainly been limited to 2-D QUS methods using hand-held US (HHUS) scanners. With the availability of automated breast ultrasound (ABUS) technology, this study is the first to develop 3-D QUS methods for the ABUS visualization of breast tumours. Using an ABUS system, unlike a manual 2-D HHUS device, the patient's whole breast was scanned in an automated manner. The acquired frames were subsequently examined, and a region of interest (ROI) was selected in each frame where a tumour was identified. Standard 2-D QUS methods were used to compute spectral and backscatter coefficient (BSC) parametric maps on the selected ROIs. Next, the computed 2-D parameters were mapped to a Cartesian 3-D space, interpolated, and rendered to provide a transparent color-coded visualization of the entire breast tumour. Such 3-D visualization can potentially be used for further analysis of breast tumours in terms of their size and extension. Moreover, the 3-D volumetric scans can be used for tissue characterization and the categorization of breast tumours as benign or malignant by quantifying the computed parametric maps over the whole tumour volume.
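    The final mapping-and-interpolation step described above can be sketched with scipy's scattered-data interpolation: per-ROI parameter values at 3-D positions are resampled onto a regular grid that a renderer could then display. The coordinates, parameter values and grid size below are synthetic placeholders.

```python
# Minimal sketch of the visualization step: scattered per-ROI QUS parameter
# values located in 3-D space are interpolated onto a regular grid (which
# could then be volume-rendered). All values here are synthetic placeholders.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(3)
roi_xyz = rng.uniform(0, 40, size=(200, 3))        # mm, ROI centre positions
bsc_values = rng.uniform(-10, 10, size=200)        # dB, e.g. a backscatter parameter

grid_x, grid_y, grid_z = np.mgrid[0:40:40j, 0:40:40j, 0:40:40j]
volume = griddata(roi_xyz, bsc_values, (grid_x, grid_y, grid_z),
                  method="linear")                 # NaN outside the convex hull
print(volume.shape, np.nanmean(volume))
```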

  5. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells

    PubMed Central

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrones in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R2 > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrones in cells of the breast cancer cell line MCF-7 (34 fmol/10⁶ cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrones, we successfully monitored changes in the metabolic expression level of estrones (17.7 fmol/10⁶ letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422
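    Internal-standard quantitation of this kind reduces to a ratio calculation: the analyte amount follows from the analyte-to-internal-standard intensity ratio and the known amount of spiked deuterated standard. The sketch below illustrates this; the intensities, spike amount and response factor are hypothetical.

```python
# Sketch of stable-isotope internal-standard quantitation: the analyte amount
# is inferred from the analyte-to-internal-standard peak intensity ratio and
# the known amount of deuterated standard spiked in. All numbers are hypothetical.
def analyte_amount_fmol(analyte_intensity, istd_intensity,
                        istd_amount_fmol, response_factor=1.0):
    """response_factor corrects for any difference in ionization efficiency
    between analyte and internal standard (assumed ~1 here)."""
    return response_factor * (analyte_intensity / istd_intensity) * istd_amount_fmol

# Hypothetical example: spike 100 fmol of deuterated standard per 1e6 cells.
amount = analyte_amount_fmol(analyte_intensity=3.4e4, istd_intensity=1.0e5,
                             istd_amount_fmol=100.0)
print(f"{amount:.1f} fmol per 1e6 cells")
```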

  6. From the manual method of Topliss to a modified quantitative method.

    PubMed

    Yunes, Rosendo A; Heinzen, Vilma E Fonseca; Cechinel Filho, Valdir; Lazzarotto, Marcelo

    2002-01-01

    The optimization of the properties of a lead compound is the first goal of most pre-clinical research projects. Optimization strategies may be applied to the synthesis of analogous compounds in order to minimize cost and time. One strategy of synthesis is the change of the substituents in the molecule. The manual method of Topliss was introduced for predicting which substituted compounds will have the most potent activity in a series of aromatic substituted analogues. A modified Topliss method is proposed that consists of the quantitative correlation, by a single regression equation, of the activity of a series of 4 or 5 substituted aromatic compounds with the descriptor parameters hydrophobic (pi), electronic (sigma) and steric (Es and MR), and some combinations of them, in order to predict future syntheses or to obtain a training set of compounds to be used in the application of more advanced experimental design methods. These results, when compared with those of multiple regression analysis applying the Hansch equation, are very satisfactory.
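    The single-equation correlation at the heart of the modified method is an ordinary least-squares fit of activity against substituent descriptors. The sketch below shows such a fit and a prediction for a candidate substituent; the descriptor values, activities and candidate are invented for illustration.

```python
# Sketch of a Hansch-type single-equation correlation: regressing activity on
# substituent descriptors (pi, sigma, Es) by ordinary least squares.
# Descriptor and activity values are invented for illustration only.
import numpy as np

# rows: 5 substituted analogues; columns: pi, sigma, Es
descriptors = np.array([
    [0.56,  0.23, -1.24],
    [0.00,  0.00,  0.00],
    [0.86,  0.54, -0.46],
    [-0.67, -0.17, -0.55],
    [1.98,  0.88, -2.40],
])
activity = np.array([1.10, 0.80, 1.45, 0.60, 1.70])   # e.g. log(1/C)

X = np.column_stack([np.ones(len(activity)), descriptors])
coeffs, *_ = np.linalg.lstsq(X, activity, rcond=None)
print("intercept and coefficients (pi, sigma, Es):", np.round(coeffs, 3))

# Predicted activity of a candidate substituent not yet synthesized:
candidate = np.array([1.0, 0.71, 0.37, -0.97])
print("predicted activity:", float(candidate @ coeffs))
```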

  7. A quantitative method for measuring innate phagocytosis by human monocytes using real-time flow cytometry.

    PubMed

    Gu, Ben J; Sun, Chun; Fuller, Stephen; Skarratt, Kristen K; Petrou, Steven; Wiley, James S

    2014-04-01

    Phagocytosis is central to immunity; however, a rapid and standardized method is much needed for quantitative assessment of the phagocytic process. We describe a real-time flow cytometric method to quantitate the phagocytosis of fluorescent latex beads by human monocytes in serum-free conditions. Effects of buffer composition, temperature, pH, and bead surface on phagocytic rate are described. The innate phagocytic ability of human monocytes from single subjects measured by this method was relatively stable over many months, although the phagocytosis rate varied as much as two-fold between individuals. Comparable results were obtained with a simplified method using several mL of whole blood, which is suitable for routine clinical application. This method also allows two-color flow cytometric measurement of cytosolic calcium levels during the phagocytic uptake of fluorescent beads.

  8. Establishment and evaluation of event-specific quantitative PCR method for genetically modified soybean MON89788.

    PubMed

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Kitta, Kazumi

    2010-01-01

    A novel real-time PCR-based analytical method was established for the event-specific quantification of the GM soybean event MON89788. The conversion factor (C(f)), which is required to calculate the GMO amount, was experimentally determined. The quantitative method was evaluated by a single-laboratory analysis and a blind test in a multi-laboratory trial. The limit of quantitation for the method was estimated to be 0.1% or lower. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSD(R)), respectively, and the determined bias and RSD(R) values for the method were both less than 20%. These results suggest that the established method would be suitable for practical detection and quantification of MON89788. PMID:21071908

  9. Establishment and evaluation of event-specific quantitative PCR method for genetically modified soybean MON89788.

    PubMed

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Kitta, Kazumi

    2010-01-01

    A novel real-time PCR-based analytical method was established for the event-specific quantification of the GM soybean event MON89788. The conversion factor (C(f)), which is required to calculate the GMO amount, was experimentally determined. The quantitative method was evaluated by a single-laboratory analysis and a blind test in a multi-laboratory trial. The limit of quantitation for the method was estimated to be 0.1% or lower. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSD(R)), respectively, and the determined bias and RSD(R) values for the method were both less than 20%. These results suggest that the established method would be suitable for practical detection and quantification of MON89788.
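    The role of the conversion factor can be illustrated with the calculation that is typically applied in event-specific GM quantitation: the copy-number ratio of event-specific to endogenous reference DNA is divided by C(f) to give a weight-based GM content. The sketch below shows that arithmetic with hypothetical copy numbers and an assumed C(f); it does not use the validated values from this study.

```python
# Sketch of how a conversion factor (Cf) is typically applied in event-specific
# quantitation: the copy-number ratio of event-specific DNA to the endogenous
# reference is divided by Cf to give a weight-based GM content. The copy
# numbers and Cf below are hypothetical, not the validated values.
def gm_content_percent(event_copies, endogenous_copies, cf):
    return (event_copies / endogenous_copies) / cf * 100.0

print(f"{gm_content_percent(event_copies=480, endogenous_copies=100000, cf=0.98):.2f}% GM")
```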

  10. A fully automated method for quantitative cerebral hemodynamic analysis using DSC-MRI.

    PubMed

    Bjørnerud, Atle; Emblem, Kyrre E

    2010-05-01

    Dynamic susceptibility contrast (DSC)-based perfusion analysis from MR images has become an established method for analysis of cerebral blood volume (CBV) in glioma patients. To date, little emphasis has, however, been placed on quantitative perfusion analysis of these patients, mainly due to the associated increased technical complexity and lack of sufficient stability in a clinical setting. The aim of our study was to develop a fully automated analysis framework for quantitative DSC-based perfusion analysis. The method presented here generates quantitative hemodynamic maps without user interaction, combined with automatic segmentation of normal-appearing cerebral tissue. Validation of 101 patients with confirmed glioma after surgery gave mean values for CBF, CBV, and MTT, extracted automatically from normal-appearing whole-brain white and gray matter, in good agreement with literature values. The measured age- and gender-related variations in the same parameters were also in agreement with those in the literature. Several established analysis methods were compared and the resulting perfusion metrics depended significantly on method and parameter choice. In conclusion, we present an accurate, fast, and automatic quantitative perfusion analysis method where all analysis steps are based on raw DSC data only. PMID:20087370
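    Two of the standard relationships behind such perfusion maps can be sketched briefly: relative CBV as the ratio of the areas under the tissue and arterial concentration-time curves, and MTT from the central volume theorem MTT = CBV/CBF. The curves and the CBF value below are synthetic placeholders, and the paper's full pipeline (including deconvolution to obtain CBF) is not reproduced.

```python
# Sketch of two standard DSC perfusion relationships (not the paper's full
# pipeline): relative CBV as the ratio of areas under the tissue and arterial
# concentration-time curves, and MTT = CBV / CBF (central volume theorem).
# Curves and the CBF value are synthetic placeholders.
import numpy as np
from scipy.integrate import trapezoid

t = np.arange(0, 60, 1.0)                                  # s

def gamma_variate(t, t0, a, b, scale):
    out = np.zeros_like(t)
    m = t > t0
    out[m] = scale * (t[m] - t0) ** a * np.exp(-(t[m] - t0) / b)
    return out

c_aif = gamma_variate(t, 10, 3.0, 1.5, 1.0)                # arterial input curve
c_tissue = gamma_variate(t, 12, 3.0, 2.5, 0.04)            # tissue curve

rel_cbv = trapezoid(c_tissue, t) / trapezoid(c_aif, t)     # relative CBV
cbf = 0.01                                                  # assumed, same units as CBV per second
mtt = rel_cbv / cbf                                         # central volume theorem
print(f"relative CBV = {rel_cbv:.4f}, MTT = {mtt:.1f} s")
```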

  11. A new method to estimate quantitatively seminal vesicle and prostate gland contributions to ejaculate

    PubMed Central

    Ndovi, Themba T; Parsons, Teresa; Choi, Leena; Caffo, Brian; Rohde, Charles; Hendrix, Craig W

    2007-01-01

    Aims: We sought to optimize a quantitative noninvasive method to determine the concentration in their glands of origin of biochemical markers of compartments of the male genital tract as the first step towards validation of a novel method for estimation of drug concentrations in these male genital tract compartments. Methods: Sixty-eight men participated. We compared four collection devices to split ejaculate into fractions. Fractions were assayed for fructose and prostate specific antigen (PSA) as unique markers of the seminal vesicle and prostate, respectively. Seminal vesicle fructose and prostatic PSA were estimated using a linear regression method, based on fructose-PSA axis intercepts, and compared with an older method which solves a simultaneous series of equations. Results: A five-compartment collection device performed best with mean (95% confidence interval) PSA vs. fructose r2 of 0.84 (0.71, 0.98, P < 0.001). Using resampling simulations, glandular PSA and fructose estimates were highly variable and often implausible when using only two fractions. Using our method, the prostate contributed 37–44% to the whole ejaculate and the seminal vesicle contributed 55–61%. The novel regression method was highly correlated (r2 ≥ 0.98) with older methods. Conclusions: We developed a noninvasive quantitative method of male genital tract biochemical marker estimation using a five-compartment tray to collect three to five ejaculate fractions. Our novel regression method is quantitative and more fully developed than older methods. This noninvasive method for determining glandular marker concentrations should be useful to provide quantitative estimates of drug concentrations in these glands. PMID:17076697

  12. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A summary of a new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is described. Results of the analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship between concentration and fluorescence units is provided; it may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).

  13. Quantitative Methods for Comparing Different Polyline Stream Network Models

    SciTech Connect

    Danny L. Anderson; Daniel P. Ames; Ping Yang

    2014-04-01

    Two techniques for exploring the relative horizontal accuracy of complex linear spatial features are described, and sample source code (pseudo code) is presented for this purpose. The first technique, relative sinuosity, is presented as a measure of the complexity or detail of a polyline network in comparison to a reference network. We term the second technique longitudinal root mean squared error (LRMSE) and present it as a means for quantitatively assessing the horizontal variance between two polyline data sets representing digitized (reference) and derived stream and river networks. Both relative sinuosity and LRMSE are shown to be suitable measures of horizontal stream network accuracy for assessing quality and variation in linear features. Both techniques have been used in two recent investigations involving the extraction of hydrographic features from LiDAR elevation data. One confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes yielded better stream network delineations, based on sinuosity and LRMSE, when using LiDAR-derived DEMs. The other demonstrated a new method of delineating stream channels directly from LiDAR point clouds, without the intermediate step of deriving a DEM, showing that direct delineation from LiDAR point clouds yielded a much better match, as indicated by the LRMSE.
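    Both measures can be illustrated conceptually in a few lines. Sinuosity is path length divided by the straight-line distance between endpoints, and the relative version compares a derived network to the reference; the LRMSE stand-in below simply takes the RMSE of nearest-vertex distances sampled along the derived polyline, which is a simplification of the published measure. The coordinates are invented.

```python
# Conceptual sketch only. Sinuosity = path length / straight-line distance;
# "relative sinuosity" compares a derived polyline to a reference one. The
# simple_lrmse function is a simplified stand-in for the published LRMSE:
# it takes the RMSE of distances from derived vertices to their nearest
# reference vertices.
import numpy as np

def path_length(xy):
    return float(np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1)))

def sinuosity(xy):
    straight = float(np.linalg.norm(xy[-1] - xy[0]))
    return path_length(xy) / straight

def simple_lrmse(derived_xy, reference_xy):
    d = np.linalg.norm(derived_xy[:, None, :] - reference_xy[None, :, :], axis=2)
    nearest = d.min(axis=1)
    return float(np.sqrt(np.mean(nearest ** 2)))

reference = np.array([[0, 0], [1, 0.4], [2, -0.3], [3, 0.5], [4, 0]], float)
derived = np.array([[0, 0.1], [1, 0.3], [2, -0.1], [3, 0.4], [4, 0.05]], float)

print("relative sinuosity:", round(sinuosity(derived) / sinuosity(reference), 3))
print("simplified LRMSE:", round(simple_lrmse(derived, reference), 3))
```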

  14. Quantitative methods in the study of trypanosomes and their applications*

    PubMed Central

    Lumsden, W. H. R.

    1963-01-01

    In the first part of this paper the author summarizes and discusses previous quantitative work on trypanosomes, with particular reference to biometrical studies, in vivo and in vitro studies on numbers of trypanosomes, studies on hosts infected with trypanosomes, and physiological studies. The second part discusses recent work done at the East African Trypanosomiasis Research Organization. A method for the measurement of the infectivity of trypanosome suspensions, based on serial dilution and inoculation into test animals, is outlined, and applications likely to improve diagnostic procedures are suggested for it. Such applications might include: the establishment of experimental procedures not significantly reducing the infectivity of trypanosomes under experiment; determination of the effects on the infectivity of preserved material of some of the factors in the process of preservation, important for the preparation of standard material; comparison of the efficiency of different culture media for the isolation of trypanosomes; study of the distribution of trypanosomes in the vertebrate host; and measurement of the susceptibility of trypanosomes to drugs. The author stresses the importance of relating future experimental work with trypanosomes to preserved material for which comprehensive documentation is available. PMID:20604152

  15. Synthesizing regression results: a factored likelihood method.

    PubMed

    Wu, Meng-Jia; Becker, Betsy Jane

    2013-06-01

    Regression methods are widely used by researchers in many fields, yet methods for synthesizing regression results are scarce. This study proposes using a factored likelihood method, originally developed to handle missing data, to appropriately synthesize regression models involving different predictors. This method uses the correlations reported in the regression studies to calculate synthesized standardized slopes. It uses available correlations to estimate missing ones through a series of regressions, allowing us to synthesize correlations among variables as if each included study contained all the same variables. Great accuracy and stability of this method under fixed-effects models were found through Monte Carlo simulation. An example was provided to demonstrate the steps for calculating the synthesized slopes through sweep operators. By rearranging the predictors in the included regression models or omitting a relatively small number of correlations from those models, we can easily apply the factored likelihood method to many situations involving synthesis of linear models. Limitations and other possible methods for synthesizing more complicated models are discussed. Copyright © 2012 John Wiley & Sons, Ltd. PMID:26053653

  16. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  17. How Many Proteins Are Missed in Quantitative Proteomics Based on MS/MS Sequencing Methods?

    PubMed Central

    Mulvey, Claire; Thur, Bettina; Crawford, Mark; Godovac-Zimmermann, Jasminka

    2014-01-01

    Current bottom-up quantitative proteomics methods based on MS/MS sequencing of peptides are shown to be strongly dependent on sample preparation. Using cytosolic proteins from MCF-7 breast cancer cells, it is shown that protein pre-fractionation based on pI and MW is more effective than pre-fractionation using only MW in increasing the number of observed proteins (947 vs. 704 proteins) and the number of spectral counts per protein. Combination of MS data from the different pre-fractionation methods results in further improvements (1238 proteins). We discuss that at present the main limitation on quantitation by MS/MS sequencing is not MS sensitivity and protein abundance, but rather extensive peptide overlap and limited MS/MS sequencing throughput, and that this favors internally calibrated methods such as SILAC, ICAT or ITRAQ over spectral counting methods in attempts to drastically improve proteome coverage of biological samples. PMID:25729266

  18. Quantitative EDXS analysis of organic materials using the ζ-factor method.

    PubMed

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. PMID:24012932
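    The core ζ-factor relations (leaving aside the iterative absorption correction described in the paper) can be written compactly: weight fractions follow from the ζ-weighted intensities, and the sum of those weighted intensities divided by the electron dose gives the mass thickness. The sketch below uses hypothetical ζ-factors, intensities and dose.

```python
# Sketch of the basic zeta-factor relations (without the iterative absorption
# correction): weight fractions follow C_i = zeta_i * I_i / sum_j(zeta_j * I_j),
# and the same sum divided by the electron dose gives the mass thickness.
# The zeta factors, intensities and dose below are hypothetical numbers.
def zeta_quantify(intensities, zetas, dose):
    """intensities: X-ray counts per element; zetas: sensitivity factors;
    dose: total electron dose. Returns (weight fractions, mass thickness)."""
    weighted = [z * i for z, i in zip(zetas, intensities)]
    total = sum(weighted)
    fractions = [w / total for w in weighted]
    mass_thickness = total / dose
    return fractions, mass_thickness

fractions, rho_t = zeta_quantify(intensities=[12000, 3500, 900],
                                 zetas=[1.0, 1.6, 2.3],
                                 dose=5.0e8)
print("weight fractions:", [round(f, 3) for f in fractions])
print("mass thickness (arb. units):", rho_t)
```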

  19. Testing alleged mediumship: methods and results.

    PubMed

    O'keeffe, Ciarán; Wiseman, Richard

    2005-05-01

    Mediums claim to be able to communicate with the deceased. Such claims attract a considerable amount of public interest and, if valid, have important implications for many areas of psychology. For over 100 years, researchers have tested alleged mediums. This work has obtained mixed results and provoked a considerable amount of methodological debate. This paper reviews the key issues in this debate, describes how the authors devised a method of testing that aimed to prevent the many problems that have hindered past research, and how they then used this method to test several professional mediums. The results of this work did not support the existence of genuine mediumistic ability. Competing interpretations of these results are discussed, along with ways in which the methodology presented in the paper could be used to assess conceptually similar, but non-paranormal, claims made in clinical, occupational and forensic contexts.

  20. A novel benzene quantitative analysis method using miniaturized metal ionization gas sensor and non-linear bistable dynamic system

    PubMed Central

    Tang, Xuxiang; Liu, Fuqi

    2015-01-01

    In this paper, a novel benzene quantitative analysis method utilizing a miniaturized metal ionization gas sensor and a non-linear bistable dynamic system was investigated. An Al-plate anodic gas-ionization sensor was installed for electrical current-voltage data measurement. The measurement data were analyzed with the non-linear bistable dynamic system. The results demonstrated that this method achieves quantitative determination of benzene concentration. The method is promising for benzene leak detection in laboratory safety management. PMID:26218927

  1. A novel benzene quantitative analysis method using miniaturized metal ionization gas sensor and non-linear bistable dynamic system.

    PubMed

    Tang, Xuxiang; Liu, Fuqi

    2015-01-01

    In this paper, a novel benzene quantitative analysis method utilizing a miniaturized metal ionization gas sensor and a non-linear bistable dynamic system was investigated. An Al-plate anodic gas-ionization sensor was installed for electrical current-voltage data measurement. The measurement data were analyzed with the non-linear bistable dynamic system. The results demonstrated that this method achieves quantitative determination of benzene concentration. The method is promising for benzene leak detection in laboratory safety management.

  2. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Caffo, Brian; Frey, Eric C.

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  3. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  4. Quantitative assessment of susceptibility weighted imaging processing methods

    PubMed Central

    Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.

    2013-01-01

    Purpose To evaluate different susceptibility weighted imaging (SWI) phase processing methods and parameter selection, thereby improving understanding of potential artifacts, as well as facilitating choice of methodology in clinical settings. Materials and Methods Two major phase processing methods, Homodyne-filtering and phase unwrapping-high pass (HP) filtering, were investigated with various phase unwrapping approaches, filter sizes, and filter types. Magnitude and phase images were acquired from a healthy subject and brain injury patients on a 3T clinical Siemens MRI system. Results were evaluated based on image contrast to noise ratio and presence of processing artifacts. Results When using a relatively small filter size (32 pixels for the matrix size 512 × 512 pixels), all Homodyne-filtering methods were subject to phase errors leading to 2% to 3% masked brain area in lower and middle axial slices. All phase unwrapping-filtering/smoothing approaches demonstrated fewer phase errors and artifacts compared to the Homodyne-filtering approaches. For performing phase unwrapping, Fourier-based methods, although less accurate, were 2–4 orders of magnitude faster than the PRELUDE, Goldstein and Quality-guide methods. Conclusion Although Homodyne-filtering approaches are faster and more straightforward, phase unwrapping followed by HP filtering approaches perform more accurately in a wider variety of acquisition scenarios. PMID:24923594
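    The homodyne approach can be illustrated in a compact form: the complex image is divided by a low-pass-filtered copy of itself, built here from a small central k-space window, and the residual phase is taken as the high-pass-filtered phase. The filter size and the synthetic test image below are illustrative assumptions, not the study's acquisition or parameter choices.

```python
# Minimal sketch of homodyne high-pass filtering: divide the complex image by
# a low-pass-filtered copy of itself (built from a small central k-space
# window) and keep the residual phase. Filter size and test image are
# illustrative only.
import numpy as np

def homodyne_highpass_phase(complex_image, filter_size=32):
    n, m = complex_image.shape
    k = np.fft.fftshift(np.fft.fft2(complex_image))
    window = np.zeros_like(k)
    cy, cx = n // 2, m // 2
    h = filter_size // 2
    # Hann taper over the central filter_size x filter_size k-space block
    hann = np.outer(np.hanning(filter_size), np.hanning(filter_size))
    window[cy - h:cy + h, cx - h:cx + h] = hann
    lowpass = np.fft.ifft2(np.fft.ifftshift(k * window))
    return np.angle(complex_image / (lowpass + 1e-12))

# Synthetic 512 x 512 complex image with a smooth background phase ramp.
n = 512
y, x = np.mgrid[0:n, 0:n]
phase = 2 * np.pi * (x + y) / n + 0.3 * np.sin(2 * np.pi * x / 64)
img = np.exp(1j * phase)
hp_phase = homodyne_highpass_phase(img, filter_size=32)
print(hp_phase.shape, float(hp_phase.std()))
```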

  5. A quantitative method for visual phantom image quality evaluation

    NASA Astrophysics Data System (ADS)

    Chakraborty, Dev P.; Liu, Xiong; O'Shea, Michael; Toto, Lawrence C.

    2000-04-01

    This work presents an image quality evaluation technique for uniform-background target-object phantom images. The Degradation-Comparison-Threshold (DCT) method involves degrading the image quality of a target-containing region with a blocking process and comparing the resulting image to a similarly degraded target-free region. The threshold degradation needed for 92% correct detection of the target region is the image quality measure of the target. Images of the American College of Radiology (ACR) mammography accreditation program phantom were acquired under varying x-ray conditions on a digital mammography machine. Five observers performed ACR and DCT evaluations of the images. A figure-of-merit (FOM) of an evaluation method was defined which takes into account measurement noise and the change of the measure as a function of x-ray exposure to the phantom. The FOM of the DCT method was 4.1 times that of the ACR method for the specks, 2.7 times better for the fibers and 1.4 times better for the masses. For the specks, inter-reader correlations on the same image set increased significantly, from 87% for the ACR method to 97% for the DCT method. The viewing time per target for the DCT method was 3-5 minutes. The observed greater sensitivity of the DCT method could lead to more precise Quality Control (QC) testing of digital images, which should improve the sensitivity of the QC process to genuine image quality variations. Another benefit of the method is that it can measure the image quality of high-detectability target objects, which is impractical with existing methods.

  6. A quantitative measurement method for comparison of seated postures.

    PubMed

    Hillman, Susan J; Hollington, James

    2016-05-01

    This technical note proposes a method to measure and compare seated postures. The three-dimensional locations of palpable anatomical landmarks corresponding to the anterior superior iliac spines, clavicular notch, head, shoulders and knees are measured in terms of x, y and z co-ordinates in the reference system of the measuring apparatus. These co-ordinates are then transformed onto a body-based axis system which allows comparison within-subject. The method was tested on eleven unimpaired adult participants and the resulting data used to calculate a Least Significant Difference (LSD) for the measure, which is used to determine whether two postures are significantly different from one another. The method was found to be sensitive to the four following standardised static postural perturbations: posterior pelvic tilt, pelvic obliquity, pelvic rotation, and abduction of the thighs. The resulting data could be used as an outcome measure for the postural alignment aspect of seating interventions in wheelchairs. PMID:26920073

  7. A Longitudinal Study of Man: A Course of Study. Volume II: Quantitative Results. Final Report.

    ERIC Educational Resources Information Center

    Cort, H. Russell, Jr.; Peskowitz, Nancy

    This second volume of the summative evaluation of "Man: A Course of Study" (MACOS) presents results of quantitative analyses of what MACOS students seemed to learn, what they retained one year later, and how what they learned was different from what students in other social studies courses learned. The first part of the document compares MACOS and…

  8. The LASSO and sparse least square regression methods for SNP selection in predicting quantitative traits.

    PubMed

    Feng, Zeny Z; Yang, Xiaojian; Subedi, Sanjeena; McNicholas, Paul D

    2012-01-01

    Recent work concerning quantitative traits of interest has focused on selecting a small subset of single nucleotide polymorphisms (SNPs) from amongst the SNPs responsible for the phenotypic variation of the trait. When considered as covariates, the large number of variables (SNPs) and their association with those in close proximity pose challenges for variable selection. The features of sparsity and shrinkage of regression coefficients of the least absolute shrinkage and selection operator (LASSO) method appear attractive for SNP selection. Sparse partial least squares (SPLS) is also appealing, as it combines the features of sparsity in subset selection and dimension reduction to handle correlations amongst SNPs. In this paper we investigate the application of the LASSO and SPLS methods for selecting SNPs that predict quantitative traits. We evaluate the performance of both methods with different criteria and under different scenarios using simulation studies. Results indicate that these methods can be effective in selecting SNPs that predict quantitative traits but are limited under some conditions. Both methods perform similarly overall, but each exhibits advantages over the other in given situations. Both methods are applied to Canadian Holstein cattle data to compare their performance.
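    A minimal LASSO-based SNP selection run can be sketched with scikit-learn on simulated data: genotypes coded 0/1/2, a trait driven by a few causal SNPs, and the non-zero coefficients taken as the selected SNPs. The sample sizes, effect sizes and penalty below are arbitrary illustrations, not the settings used in the paper.

```python
# Sketch of LASSO-based SNP selection on simulated data: genotypes coded
# 0/1/2, a quantitative trait driven by a few causal SNPs, and the non-zero
# LASSO coefficients taken as the selected SNPs. All settings are arbitrary.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
n_samples, n_snps = 300, 1000
genotypes = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)

causal = [10, 250, 777]                       # indices of true causal SNPs
effects = np.array([0.8, -0.6, 0.5])
trait = genotypes[:, causal] @ effects + rng.normal(0, 1.0, n_samples)

model = Lasso(alpha=0.1).fit(genotypes, trait)
selected = np.flatnonzero(model.coef_)
print("SNPs selected:", selected[:10], "... total:", selected.size)
print("causal SNPs recovered:", sorted(set(causal) & set(selected)))
```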

  9. Development and validation of event-specific quantitative PCR method for genetically modified maize MIR604.

    PubMed

    Mano, Junichi; Furui, Satoshi; Takashima, Kaori; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2012-01-01

    A GM maize event, MIR604, has been widely distributed and an analytical method to quantify its content is required to monitor the validity of food labeling. Here we report a novel real-time PCR-based quantitation method for MIR604 maize. We developed real-time PCR assays specific for MIR604 using event-specific primers designed by the trait developer, and for maize endogenous starch synthase IIb gene (SSIIb). Then, we determined the conversion factor, which is required to calculate the weight-based GM maize content from the copy number ratio of MIR604-specific DNA to the endogenous reference DNA. Finally, to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind samples containing MIR604 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The reproducibility (RSDr) of the developed method was evaluated to be less than 25%. The limit of quantitation of the method was estimated to be 0.5% based on the ISO 24276 guideline. These results suggested that the developed method would be suitable for practical quantitative analyses of MIR604 maize. PMID:23132355

  10. A quantitative analytical method to test for salt effects on giant unilamellar vesicles.

    PubMed

    Hadorn, Maik; Boenzli, Eva; Hotz, Peter Eggenberger

    2011-01-01

    Today, free-standing membranes, i.e. liposomes and vesicles, are used in a multitude of applications, e.g. as drug delivery devices and artificial cell models. Because current laboratory techniques do not allow handling of large sample sizes, systematic and quantitative studies on the impact of different effectors, e.g. electrolytes, are limited. In this work, we evaluated the Hofmeister effects of ten alkali metal halides on giant unilamellar vesicles made of palmitoyloleoylphosphatidylcholine for a large sample size by combining the highly parallel water-in-oil emulsion transfer vesicle preparation method with automatic haemocytometry. We found that this new quantitative screening method is highly reliable and consistent with previously reported results. Thus, this method may provide a significant methodological advance in analysis of effects on free-standing model membranes. PMID:22355683

  11. Methods and challenges in quantitative imaging biomarker development.

    PubMed

    Abramson, Richard G; Burton, Kirsteen R; Yu, John-Paul J; Scalzetti, Ernest M; Yankeelov, Thomas E; Rosenkrantz, Andrew B; Mendiratta-Lala, Mishal; Bartholmai, Brian J; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M

    2015-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This article, drafted by the Association of University Radiologists Radiology Research Alliance Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field.

  12. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    PubMed

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of people's experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  13. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  14. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but scientists have not truly focused their work on the quantitative performance of this technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using the Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). PMID:24513349

  15. Projection methods: Results and open problems

    SciTech Connect

    Bauschke, H.

    1994-12-31

    The convex feasibility problem consists of finding a point in the intersection of convex constraint sets. It is very easy to state and has an immediate geometric appeal; its real significance, however, lies in the wide range of applications in mathematics and the physical sciences. The method of cyclic projections is an iterative attempt to solve the convex feasibility problem: a convergent sequence is generated by projecting cyclically onto the constraint sets, and its limit is a solution. This and other similar methods have been used in practice with great success, and quite a few theoretical results are known. Nonetheless, some fundamental questions remain open. In this talk, I will present some (of my favorite) open problems and what's known about them.
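
    As a minimal, self-contained sketch of the method of cyclic projections, the code below alternates projections onto two simple convex sets (a closed ball and a half-space); these sets, the starting point, and the tolerance are illustrative assumptions, not taken from the talk.

```python
# Minimal sketch of the method of cyclic projections for the convex feasibility problem.
import numpy as np

def project_ball(x, center, radius):
    """Project x onto the closed ball {y : ||y - center|| <= radius}."""
    d = x - center
    norm = np.linalg.norm(d)
    return x if norm <= radius else center + radius * d / norm

def project_halfspace(x, a, b):
    """Project x onto the half-space {y : a.y <= b}."""
    viol = a @ x - b
    return x if viol <= 0 else x - viol * a / (a @ a)

center, radius = np.array([0.0, 0.0]), 1.0
a, b = np.array([1.0, 1.0]), 1.2          # half-space x + y <= 1.2 intersects the unit ball

x = np.array([3.0, -2.0])                 # arbitrary starting point
for k in range(200):
    x_new = project_halfspace(project_ball(x, center, radius), a, b)
    if np.linalg.norm(x_new - x) < 1e-10: # stop when the iterates no longer move
        break
    x = x_new

print(f"approximate point in the intersection after {k + 1} sweeps: {x}")
```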

  16. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    NASA Astrophysics Data System (ADS)

    Hayashi, K.

    2014-12-01

    The near-surface is the region of day-to-day human activity on the earth and is exposed to natural phenomena that sometimes cause disasters. This presentation covers a broad spectrum of geotechnical and geohazard approaches to mitigating disasters and conserving the natural environment using geophysical methods, and emphasizes the contribution of geophysics to such issues. The presentation focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than on the theoretical details of particular techniques. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, prevention of floods triggered by torrential rain, environmental conservation, and studies of the effect of global warming. Among the geophysical techniques, active and passive surface-wave, refraction, and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access, and databases, are also discussed. The presentation discusses the application of geophysical methods to engineering investigations from the point of view of non-uniqueness and introduces the concepts of integrated and quantitative analysis. Most geophysical analyses are essentially non-unique, and it is very difficult to obtain unique and reliable engineering solutions from a single geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods, i.e., an integrated approach to geophysics. The result of a geophysical method is often stated vaguely: "here is a high-velocity layer, it may be bedrock," or "this low-resistivity section may contain clayey soils." Such vague, qualitative, and subjective interpretation is of little value in general engineering design work.

  17. [The method of quantitative assessment of dentition aesthetic parameters].

    PubMed

    Ryakhovsky, A N; Kalacheva, Ya A

    2016-01-01

    This article describes a formula for calculating an aesthetic index of treatment outcome. The formula was derived from regression equations showing the dependence of visual assessment on the magnitude of aesthetic violations. The formula can be used for objective quantitative evaluation of the aesthetics of the teeth when smiling, before and after dental treatment.

  18. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  19. Teaching Quantitative Research Methods: A Quasi-Experimental Analysis.

    ERIC Educational Resources Information Center

    Bridges, George S.; Gillmore, Gerald M.; Pershing, Jana L.; Bates, Kristin A.

    1998-01-01

    Describes an experiment designed to introduce aspects of quantitative reasoning to a large, substantively-focused class in the social sciences. Reveals that participating students' abilities to interpret and manipulate empirical data increased significantly, independent of baseline SAT verbal and mathematics scores. Discusses implications for…

  20. Analyzing the Students' Academic Integrity using Quantitative Methods

    ERIC Educational Resources Information Center

    Teodorescu, Daniel; Andrei, Tudorel; Tusa, Erika; Herteliu, Claudiu; Stancu, Stelian

    2007-01-01

    The transition period in Romania has generated a series of important changes, including the reform of Romanian tertiary education. This process accelerated after the signing of the Bologna treaty. Important changes were recorded in many of the quantitative aspects (such as the number of students enrolled, pupil-student ratio, etc.) as…

  1. [The method of quantitative assessment of dentition aesthetic parameters].

    PubMed

    Ryakhovsky, A N; Kalacheva, Ya A

    2016-01-01

    This article describes a formula for calculating an aesthetic index of treatment outcome. The formula was derived from regression equations showing the dependence of visual assessment on the magnitude of aesthetic violations. The formula can be used for objective quantitative evaluation of the aesthetics of the teeth when smiling, before and after dental treatment. PMID:27367198

  2. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.
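
    The method above combines event-specific and endogenous-reference PCR signals with a conversion factor to express GM content on a weight basis. The sketch below illustrates that general style of calculation only; the standard-curve parameters, Cq values, and conversion factor are hypothetical placeholders, not the validated values from the study.

```python
# Minimal sketch of weight-based GM content estimation from real-time PCR data.
# All numbers (standard-curve slopes/intercepts, Cq values, conversion factor) are hypothetical.
def copies_from_cq(cq, slope, intercept):
    """Convert a quantification cycle into copy number via a log-linear standard curve:
    Cq = slope * log10(copies) + intercept."""
    return 10 ** ((cq - intercept) / slope)

# Hypothetical standard-curve parameters for the event-specific and endogenous assays.
event_curve = dict(slope=-3.35, intercept=38.0)
endog_curve = dict(slope=-3.32, intercept=37.5)

cq_event, cq_endog = 30.1, 24.6      # hypothetical measured Cq values for one sample
conversion_factor = 0.45             # hypothetical event/endogenous copy ratio in pure GM seed

event_copies = copies_from_cq(cq_event, **event_curve)
endog_copies = copies_from_cq(cq_endog, **endog_curve)

# GM content (%) = (copy ratio in the sample) / (copy ratio in pure GM seed) * 100
gmo_percent = (event_copies / endog_copies) / conversion_factor * 100
print(f"estimated GM content: {gmo_percent:.2f} % (w/w)")
```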

  3. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

    Combined with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither method alone can provide elastography over the full skin depth in current systems. This paper presents a feasibility study of an optical coherence elastography method that combines vibration and SAW excitation in order to obtain quantitative mechanical properties of skin tissue over the full depth range, including epidermis, dermis and subcutaneous fat. Experiments were carried out on layered tissue-mimicking phantoms and on in vivo human forearm and palm skin. A ring actuator generated the vibration, while a line actuator excited the SAWs. A PhS-OCT system was employed to provide ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that, by combining the vibration and SAW methods, the bulk mechanical properties of the full skin can be quantitatively measured and the elastography obtained over a sensing depth from ~0 mm to ~4 mm. This method is promising for clinical applications in which the quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.
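
    In SAW-based elastography, Young's modulus is commonly estimated from the measured surface-wave phase velocity via the standard Rayleigh-wave approximation. The sketch below shows that generic relation only; the tissue density and Poisson's ratio are assumed values, and this is not necessarily the exact processing used in the paper.

```python
# Minimal sketch: estimating Young's modulus from a measured surface acoustic wave (SAW)
# phase velocity using the Rayleigh-wave approximation. All parameter values are assumptions.
def youngs_modulus_from_saw(c_r, density=1000.0, poisson=0.48):
    """E = 2 * rho * c_R^2 * (1 + nu)^3 / (0.87 + 1.12 * nu)^2, with c_R in m/s."""
    return 2.0 * density * c_r**2 * (1.0 + poisson) ** 3 / (0.87 + 1.12 * poisson) ** 2

for c_r in (3.0, 5.0, 8.0):  # hypothetical SAW phase velocities (m/s) in different skin layers
    e_kpa = youngs_modulus_from_saw(c_r) / 1e3
    print(f"c_R = {c_r:.1f} m/s  ->  E ~ {e_kpa:.1f} kPa")
```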

  4. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize. PMID:23470871

  5. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    NASA Astrophysics Data System (ADS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-01

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce aircraft weight by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system that allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization capabilities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  6. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    SciTech Connect

    Kiefel, Denis E-mail: Rainer.Stoessel@airbus.com; Stoessel, Rainer E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce aircraft weight by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system that allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization capabilities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  7. Bridging the gap between qualitative and quantitative colocalization results in fluorescence microscopy studies

    PubMed Central

    Zinchuk, Vadim; Wu, Yong; Grossenbacher-Zinchuk, Olga

    2013-01-01

    Quantitative colocalization studies suffer from the lack of a unified approach to interpreting the obtained results. We developed a tool to characterize the results of colocalization experiments in a way that makes them understandable and comparable both qualitatively and quantitatively. Employing a fuzzy system model and computer simulation, we produced a set of just five linguistic variables tied to the values of popular colocalization coefficients: “Very Weak”, “Weak”, “Moderate”, “Strong”, and “Very Strong”. The use of these variables ensures that the results of colocalization studies are properly reported, easily shared, and universally understood by all researchers working in the field. When new coefficients are introduced, their values can be readily fitted into the set. PMID:23455567
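
    As a hedged illustration of mapping a coefficient value onto the five linguistic labels named above, the sketch below bins a colocalization coefficient into categories; the numeric bin edges are placeholder assumptions, not the boundaries derived from the paper's fuzzy system model.

```python
# Illustrative mapping of a colocalization coefficient (0..1) onto five linguistic labels.
# The bin edges are assumptions; the paper derives its boundaries from a fuzzy system model.
import bisect

LABELS = ["Very Weak", "Weak", "Moderate", "Strong", "Very Strong"]
EDGES = [0.2, 0.4, 0.6, 0.8]   # hypothetical boundaries between adjacent labels

def describe(coefficient):
    if not 0.0 <= coefficient <= 1.0:
        raise ValueError("expected a coefficient in [0, 1]")
    return LABELS[bisect.bisect_right(EDGES, coefficient)]

for value in (0.12, 0.55, 0.93):   # e.g. Manders- or Pearson-style coefficient values
    print(f"coefficient {value:.2f} -> colocalization is '{describe(value)}'")
```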

  8. Sequencing human ribs into anatomical order by quantitative multivariate methods.

    PubMed

    Cirillo, John; Henneberg, Maciej

    2012-06-01

    Little research has focused on methods for anatomically sequencing ribs. Correct anatomical sequencing of ribs assists in determining the location and distribution of regional trauma, age estimation, the number of puncture wounds, the number of individuals, and personal identification. The aim of the current study is to develop a method for placing fragmented and incomplete rib sets into correct anatomical position. Ribs 2-10 were used from eleven cadavers of an Australian population. Seven variables were measured at anatomical locations on each rib. General descriptive statistics were calculated for each variable, along with an analysis of variance (ANOVA) and ANOVA with Bonferroni statistics. Considerable overlap was observed between ribs for univariate methods. Bivariate and multivariate methods were then applied. Results of the ANOVA with post hoc Bonferroni statistics show that ratios of various dimensions of a single rib can be used to sequence it among adjacent ribs. Using multiple regression formulae, the most accurate estimation of the anatomical rib number occurs when the entire rib is found in isolation. This, however, is not always possible. Even when only the head and neck of the rib are preserved, a modified multivariate regression formula assigned 91.95% of ribs to the correct anatomical position or to an adjacent rib. Using multivariate methods it is possible to sequence a single human rib with a high level of accuracy, and these methods are superior to univariate methods. Left and right ribs were found to be highly symmetrical. Some rib dimensions were greater in males than in females, but overall the level of sexual dimorphism was low.
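
    The sketch below illustrates the general idea of assigning an anatomical rib number from rib measurements via multiple regression. The measurement names and all data are synthetic placeholders; the study's actual seven variables and fitted formulae are not reproduced here.

```python
# Minimal sketch of estimating anatomical rib number from rib measurements with multiple
# regression. The three "measurements" and the training data are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic training set: rows are (head diameter, neck length, curvature ratio), loosely
# made to vary with rib number 2..10 across 11 "cadavers".
rib_numbers = np.repeat(np.arange(2, 11), 11)
X = np.column_stack([
    8 + 0.6 * rib_numbers + rng.normal(0, 0.4, rib_numbers.size),
    15 + 1.1 * rib_numbers + rng.normal(0, 0.8, rib_numbers.size),
    0.9 - 0.02 * rib_numbers + rng.normal(0, 0.01, rib_numbers.size),
])

model = LinearRegression().fit(X, rib_numbers)

unknown = np.array([[12.1, 22.0, 0.78]])        # measurements from an isolated rib
estimate = model.predict(unknown)[0]
print(f"estimated anatomical position: rib {round(estimate)} (raw estimate {estimate:.2f})")
```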

  9. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from the more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by the improvement in computer hardware and software capability and novel computational approaches being slowly recognized by regulatory agencies. These events have helped reduce the reliance on experimental animals as well as better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches as described in the guidance documents from several of the regulatory agencies as it pertains to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.

  10. New fluorescence microscopy methods for microbiology: sharper, faster, and quantitative.

    PubMed

    Gitai, Zemer

    2009-06-01

    In addition to the inherent interest stemming from their ecological and human health impacts, microbes have many advantages as model organisms, including ease of growth and manipulation and relatively simple genomes. However, the imaging of bacteria via light microscopy has been limited by their small sizes. Recent advances in fluorescence microscopy that allow imaging of structures at extremely high resolutions are thus of particular interest to the modern microbiologist. In addition, advances in high-throughput microscopy and quantitative image analysis are enabling cellular imaging to finally take advantage of the full power of bacterial numbers and ease of manipulation. These technical developments are ushering in a new era of using fluorescence microscopy to understand bacterial systems in a detailed, comprehensive, and quantitative manner.

  11. Four-Point Bending as a Method for Quantitatively Evaluating Spinal Arthrodesis in a Rat Model

    PubMed Central

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-01-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague–Dawley rat spines after single-level posterolateral fusion procedures at L4–L5. Segments were classified as ‘not fused,’ ‘restricted motion,’ or ‘fused’ by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4–L5 motion segment, and stiffness was measured as the slope of the moment–displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery. PMID:25730756
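
    The stiffness measure described above is the slope of the moment-displacement curve in 4-point bending. The sketch below shows that calculation in generic form; the fixture geometry, loads, and displacement data are hypothetical placeholders rather than values from the study.

```python
# Minimal sketch of computing flexural stiffness from a 4-point bend test as the slope of the
# moment-displacement curve. Geometry and data below are hypothetical placeholders.
import numpy as np

a = 5.0e-3                                   # distance from support to inner loading pin (m)
displacement = np.linspace(0.0, 0.5e-3, 11)  # crosshead displacement (m)
force = 8.0e3 * displacement + np.random.default_rng(1).normal(0, 0.05, 11)  # total load (N)

# In 4-point bending the moment between the inner pins is constant: M = (F/2) * a.
moment = 0.5 * force * a

# Stiffness = slope of the moment-displacement curve over the (assumed) linear region.
slope, intercept = np.polyfit(displacement, moment, 1)
print(f"bending stiffness: {slope:.3f} N*m per m of displacement")
```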

  12. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics with innovative computational methods that go beyond function enrichment and biological validation. To chart progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  13. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
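
    For the prediction sub-task described above, the sketch below shows k-Nearest Neighbours regression of analyte concentration from already-reduced spectral features. The data are synthetic placeholders, and the Genetic Algorithm data-reduction step discussed in the paper is omitted.

```python
# Minimal sketch of the prediction sub-task: k-Nearest Neighbours regression of analyte
# concentration from (reduced) Raman spectral features. Data are synthetic placeholders.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_samples, n_features = 60, 10                           # already-reduced feature set (assumption)
concentration = rng.uniform(0, 100, n_samples)           # e.g. % analyte in a solid mixture
X = np.outer(concentration, rng.uniform(0.5, 1.5, n_features))  # peak intensities ~ concentration
X += rng.normal(0, 2.0, X.shape)                                 # measurement noise

X_train, X_test, y_train, y_test = train_test_split(X, concentration, random_state=0)
knn = KNeighborsRegressor(n_neighbors=3).fit(X_train, y_train)
print(f"mean absolute error on held-out mixtures: "
      f"{np.mean(np.abs(knn.predict(X_test) - y_test)):.2f} % (synthetic data)")
```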

  14. [Methods and clinical results of interstitial thermoradiotherapy].

    PubMed

    Seegenschmiedt, M H; Sauer, R

    1989-05-01

    Besides percutaneous hyperthermia techniques, several new interstitial procedures have been developed and added to the arsenal of radiotherapists. Compared with percutaneous techniques, interstitial methods offer considerable benefits: a more homogeneous distribution of therapeutic temperatures, better sparing of normal tissue, the possibility of treating deep tumors, and better therapy control and evaluation through extensive "thermal mapping" within the target volume. This study presents the technical principles and clinical possibilities of the interstitial hyperthermia methods developed to date: resistive radiofrequency hyperthermia, radiative microwave hyperthermia, inductive ferromagnetic seed hyperthermia, and conductive hot-water perfusion hyperthermia. Until now, interstitial thermoradiotherapy has only been performed in a number of phase I/II studies. It has been applied for palliative reasons in persistent and recurrent tumors accessible to implantation and located in superficial to semi-deep sites, with no or insufficient response to variously combined treatment modalities (surgery, radiotherapy, or chemotherapy). The preliminary clinical results obtained in almost 300 patients are quite promising: complete remission rates of up to 70% have been achieved. Interstitial hyperthermia represents an effective and safe treatment modality, especially when combined with radiotherapy in palliative tumor therapy. At present, several prospective randomized multicenter studies are being conducted to investigate its value as an adjuvant therapy modality.

  15. The expected results method for data verification

    NASA Astrophysics Data System (ADS)

    Monday, Paul

    2016-05-01

    The credibility of United States Army analytical experiments using distributed simulation depends on the quality of the simulation, the pedigree of the input data, and the appropriateness of the simulation system to the problem. The second of these factors is best met by using classified performance data from the Army Materiel Systems Analysis Activity (AMSAA) for essential battlefield behaviors such as sensors, weapon fire, and damage assessment. Until recently, using classified data has been a time-consuming and expensive endeavor: it requires significant technical expertise to load, and it is difficult to verify that it works correctly. Fortunately, new capabilities, tools, and processes are available that greatly reduce these costs. This paper will discuss these developments, a new method to verify that all of the components are configured and operate properly, and the application to recent Army Capabilities Integration Center (ARCIC) experiments. Recent developments have focused on improving the process of loading the data. OneSAF has redesigned its input data file formats and structures so that they correspond exactly with the Standard File Format (SFF) defined by AMSAA, ARCIC has developed a library of supporting configurations that correlate directly to the AMSAA nomenclature, and the Entity Validation Tool was designed to quickly execute the essential models with a test-jig approach to identify problems with the loaded data. The missing part of the process is provided by the new Expected Results Method. Instead of the usual subjective assessment of quality, e.g., "It looks about right to me", this new approach compares the performance of a combat model with authoritative expectations to quickly verify that the model, data, and simulation are all working correctly. Integrated together, these developments now make it possible to use AMSAA classified performance data with minimal time and maximum assurance that the experiment's analytical results will be of the highest quality.

  16. [Application of calibration curve method and partial least squares regression analysis to quantitative analysis of nephrite samples using XRF].

    PubMed

    Liu, Song; Su, Bo-min; Li, Qing-hui; Gan, Fu-xi

    2015-01-01

    The authors sought a method for quantitative analysis using pXRF without solid bulk stone/jade reference samples. Twenty-four nephrite samples were selected; 17 served as calibration samples and the other 7 as test samples. All the nephrite samples were analyzed quantitatively by proton-induced X-ray emission spectroscopy (PIXE). Based on the PIXE results of the calibration samples, calibration curves were created for the components/elements of interest and used to analyze the test samples quantitatively; the qualitative spectra of all nephrite samples were then obtained by pXRF. Using the PIXE results and the qualitative spectra of the calibration samples, the partial least squares (PLS) method was applied to the quantitative analysis of the test samples. Finally, the results for the test samples obtained by the calibration curve method, the PLS method, and PIXE were compared with each other, and the accuracy of the calibration curve method and the PLS method was estimated. The results indicate that the PLS method is a suitable alternative for quantitative analysis of stone/jade samples.
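
    The sketch below contrasts the two quantification strategies compared in the study, a single-channel calibration curve and PLS regression on the full spectrum. All intensities and reference concentrations are synthetic stand-ins for the pXRF and PIXE data, and the channel count is an assumption.

```python
# Minimal sketch of the two quantification strategies: a calibration curve and PLS regression.
# All data are synthetic stand-ins for pXRF intensities and PIXE reference concentrations.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)

sens = rng.uniform(0.8, 1.2, 5)                           # channel sensitivities (assumed)
conc_cal = rng.uniform(0.5, 10.0, 17)                     # 17 calibration samples (wt%)
conc_test = rng.uniform(0.5, 10.0, 7)                     # 7 test samples (wt%)
spectra_cal = np.outer(conc_cal, sens) + rng.normal(0, 0.2, (17, 5))
spectra_test = np.outer(conc_test, sens) + rng.normal(0, 0.2, (7, 5))

# (1) Calibration curve: regress reference concentration on the intensity of one channel.
slope, intercept = np.polyfit(spectra_cal[:, 0], conc_cal, 1)
pred_curve = slope * spectra_test[:, 0] + intercept

# (2) PLS: use the whole spectrum of each calibration sample.
pls = PLSRegression(n_components=2).fit(spectra_cal, conc_cal)
pred_pls = pls.predict(spectra_test).ravel()

for name, pred in (("calibration curve", pred_curve), ("PLS", pred_pls)):
    rmse = np.sqrt(np.mean((pred - conc_test) ** 2))
    print(f"{name}: RMSE = {rmse:.2f} wt% (synthetic data)")
```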

  17. The Vulcan Project: Methods, Results, and Evaluation

    NASA Astrophysics Data System (ADS)

    Gurney, K. R.; Mendoza, D.; Miller, C.; Ojima, D.; Knox, S.; Corbin, K.; Denning, S.; Fischer, M.; de La Rue Du Can, S.

    2008-12-01

    The Vulcan Project has quantified fossil fuel CO2 for the United States at the sub-county spatial scale, hourly, for the year 2002. It approached quantification of fossil fuel CO2 from a novel perspective: leveraging the information already contained within the National Emissions Inventory for the assessment of nationally regulated air pollution. By utilizing the inventory emissions of carbon monoxide and nitrogen oxides, combined with emission factors specific to combustion device technology, we have calculated CO2 emissions for industrial point sources, power plants, mobile sources, and the residential and commercial sectors, using information on the fuel used and source classification. In this presentation, we provide an overview of the Vulcan inventory methods and results, and an evaluation of the Vulcan inventory by comparison with state-level inventories and other independent estimates. The inventory has recently been placed onto Google Earth, and we will provide a preview of this capability. Finally, we will present fossil fuel CO2 concentrations as transported by an atmospheric transport model and a comparison to in situ CO2 observations.

  18. A Quantitative Method for Estimating Probable Public Costs of Hurricanes.

    PubMed

    BOSWELL; DEYLE; SMITH; BAKER

    1999-04-01

    A method is presented for estimating probable public costs resulting from damage caused by hurricanes, measured as local government expenditures approved for reimbursement under the Stafford Act Section 406 Public Assistance Program. The method employs a multivariate model developed through multiple regression analysis of an array of independent variables that measure meteorological, socioeconomic, and physical conditions related to the landfall of hurricanes within a local government jurisdiction. From the regression analysis we chose a log-log (base 10) model that explains 74% of the variance in the expenditure data using population and wind speed as predictors. We illustrate application of the method for a local jurisdiction (Lee County, Florida, USA). The results show that potential public costs range from $4.7 million for a category 1 hurricane with winds of 137 kilometers per hour (85 miles per hour) to $130 million for a category 5 hurricane with winds of 265 kilometers per hour (165 miles per hour). Based on these figures, we estimate expected annual public costs of $2.3 million. These cost estimates: (1) provide useful guidance for anticipating the magnitude of the federal, state, and local expenditures that would be required for the array of possible hurricanes that could affect that jurisdiction; (2) allow policy makers to assess the implications of alternative federal and state policies for providing public assistance to jurisdictions that experience hurricane damage; and (3) provide information needed to develop a contingency fund or other financial mechanism for assuring that the community has sufficient funds available to meet its obligations. KEY WORDS: Hurricane; Public costs; Local government; Disaster recovery; Disaster response; Florida; Stafford Act
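
    The sketch below shows the general form of such a log-log (base 10) cost model, fitted by ordinary least squares and used for prediction. The coefficients and training data are synthetic; they are not the regression reported in the paper.

```python
# Minimal sketch of a log-log (base 10) public-cost model of the kind described:
# log10(cost) = b0 + b1*log10(population) + b2*log10(wind speed). Coefficients and data
# are synthetic, not the paper's fitted regression.
import numpy as np

rng = np.random.default_rng(7)
population = rng.uniform(5e4, 2e6, 40)                 # jurisdiction population
wind_kph = rng.uniform(120, 270, 40)                   # landfall wind speed (km/h)
true_log_cost = 0.5 + 0.8 * np.log10(population) + 2.5 * np.log10(wind_kph)
log_cost = true_log_cost + rng.normal(0, 0.15, 40)     # synthetic "observed" expenditures

# Ordinary least squares on the log-transformed variables.
X = np.column_stack([np.ones_like(log_cost), np.log10(population), np.log10(wind_kph)])
b, *_ = np.linalg.lstsq(X, log_cost, rcond=None)

# Predict the cost for a hypothetical jurisdiction and storm.
pred = 10 ** (b @ np.array([1.0, np.log10(4.4e5), np.log10(137.0)]))
print(f"predicted public cost: ${pred:,.0f} (synthetic model)")
```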

  19. Quantitative sacroiliac scintigraphy. The effect of method of selection of region of interest

    SciTech Connect

    Davis, M.C.; Turner, D.A.; Charters, J.R.; Golden, H.E.; Ali, A.; Fordham, E.W.

    1984-06-01

    Various authors have advocated quantitative methods of evaluating bone scintigrams to detect sacroiliitis, while others have not found them useful. Many explanations for this disagreement have been offered, including differences in the method of case selection, ethnicity, gender, and previous drug therapy. It would appear that one of the most important impediments to consistent results is the variability of selecting sacroiliac joint and reference regions of interest (ROIs). The effect of ROI selection would seem particularly important because of the normal variability of radioactivity within the reference regions that have been used (sacrum, spine, iliac wing) and the inhomogeneity of activity in the SI joints. We have investigated the effect of ROI selection, using five different methods representative of, though not necessarily identical to, those found in the literature. Each method produced unique mean indices that were different for patients with ankylosing spondylitis (AS) and controls. The method of Ayres (19) proved superior (largest mean difference, smallest variance), but none worked well as a diagnostic tool because of substantial overlap of the distributions of indices of patient and control groups. We conclude that ROI selection is important in determining results, and quantitative scintigraphic methods in general are not effective tools for diagnosing AS. Among the possible factors limiting success, difficulty in selecting a stable reference area seems of particular importance.

  20. Criteria for quantitative and qualitative data integration: mixed-methods research methodology.

    PubMed

    Lee, Seonah; Smith, Carrol A M

    2012-05-01

    Many studies have emphasized the need for and importance of a mixed-methods approach for the evaluation of clinical information systems. However, those studies offered no criteria to guide the integration of multiple data sets. Integrating different data sets serves to realize the paradigm that a mixed-methods approach advocates; thus, we require criteria that provide the right direction for integrating quantitative and qualitative data. The first author used a set of criteria organized from a literature search on the integration of multiple data sets in mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data from a previous mixed-methods study, integration of quantitative and qualitative data was achieved in a systematic manner, which helped us obtain a better organized understanding of the results. The criteria offered in this article have the potential to produce insightful analyses of mixed-methods evaluations of health information systems.

  1. Quantitative electromechanical impedance method for nondestructive testing based on a piezoelectric bimorph cantilever

    NASA Astrophysics Data System (ADS)

    Fu, Ji; Tan, Chi; Li, Faxin

    2015-06-01

    The electromechanical impedance (EMI) method, which holds great promise in structural health monitoring (SHM), is usually treated as a qualitative method. In this work, we proposed a quantitative EMI method based on a piezoelectric bimorph cantilever using the sample’s local contact stiffness (LCS) as the identification parameter for nondestructive testing (NDT). Firstly, the equivalent circuit of the contact vibration system was established and the analytical relationship between the cantilever’s contact resonance frequency and the LCS was obtained. As the LCS is sensitive to typical defects such as voids and delamination, the proposed EMI method can then be used for NDT. To verify the equivalent circuit model, two piezoelectric bimorph cantilevers were fabricated and their free resonance frequencies were measured and compared with theoretical predictions. It was found that the stiff cantilever’s EMI can be well predicted by the equivalent circuit model while the soft cantilever’s cannot. Then, both cantilevers were assembled into a homemade NDT system using a three-axis motorized stage for LCS scanning. Testing results on a specimen with a prefabricated defect showed that the defect could be clearly reproduced in the LCS image, indicating the validity of the quantitative EMI method for NDT. It was found that the single-frequency mode of the EMI method can also be used for NDT, which is faster but not quantitative. Finally, several issues relating to the practical application of the NDT method were discussed. The proposed EMI-based NDT method offers a simple and rapid solution for damage evaluation in engineering structures and may also shed some light on EMI-based SHM.

  2. MODIS Radiometric Calibration Program, Methods and Results

    NASA Technical Reports Server (NTRS)

    Xiong, Xiaoxiong; Guenther, Bruce; Angal, Amit; Barnes, William; Salomonson, Vincent; Sun, Junqiang; Wenny, Brian

    2012-01-01

    As a key instrument for NASA's Earth Observing System (EOS), the Moderate Resolution Imaging Spectroradiometer (MODIS) has made significant contributions to the remote sensing community with its unprecedented amount of data products continuously generated from its observations and freely distributed to users worldwide. MODIS observations, covering spectral regions from visible (VIS) to long-wave infrared (LWIR), have enabled a broad range of research activities and applications for studies of the earth's interactive system of land, oceans, and atmosphere. In addition to extensive pre-launch measurements, developed to characterize sensor performance, MODIS carries a set of on-board calibrators (OBC) that can be used to track on-orbit changes of various sensor characteristics. Most importantly, dedicated and continuous calibration efforts have been made to maintain sensor data quality. This paper provides an overview of the MODIS calibration program, on-orbit calibration activities, methods, and performance. Key calibration results and lessons learned from the MODIS calibration effort are also presented in this paper.

  3. Automatic segmentation method of striatum regions in quantitative susceptibility mapping images

    NASA Astrophysics Data System (ADS)

    Murakawa, Saki; Uchiyama, Yoshikazu; Hirai, Toshinori

    2015-03-01

    Abnormal accumulation of brain iron has been detected in various neurodegenerative diseases. Quantitative susceptibility mapping (QSM) is a novel contrast mechanism in magnetic resonance (MR) imaging that enables quantitative analysis of local tissue susceptibility. Automatic segmentation tools for brain regions on QSM images would therefore be helpful for radiologists' quantitative analysis of various neurodegenerative diseases. The purpose of this study was to develop an automatic segmentation and classification method for striatum regions on QSM images. Our image database consisted of 22 QSM images obtained from healthy volunteers. These images were acquired on a 3.0 T MR scanner. The voxel size was 0.9×0.9×2 mm, and the matrix size of each slice image was 256×256 pixels. In our computerized method, a template matching technique was first used to detect the slice images containing striatum regions. An image registration technique was subsequently employed to classify the striatum regions, taking anatomical knowledge into account. After the image registration, the voxels in the target image corresponding to striatum regions in the reference image were classified into three striatum regions: the head of the caudate nucleus, the putamen, and the globus pallidus. The experimental results indicated that 100% (21/21) of the slice images containing striatum regions were detected accurately. Subjective evaluation of the classification results indicated that 20 of 21 (95.2%) showed good or adequate quality. Our computerized method would be useful for the quantitative analysis of Parkinson disease in QSM images.
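
    The sketch below illustrates only the first step described, detecting the slice that contains a target structure via normalized cross-correlation template matching. The images are synthetic placeholders, and skimage's matcher is one possible implementation of this step, not necessarily the authors' tool.

```python
# Minimal sketch of slice detection by normalized cross-correlation template matching.
# Images are synthetic placeholders; skimage is one possible implementation of this step.
import numpy as np
from skimage.feature import match_template

rng = np.random.default_rng(5)
template = rng.normal(0, 1, (32, 32))                  # stands in for a striatum template patch

slices = [rng.normal(0, 1, (256, 256)) for _ in range(20)]
target_slice = 12
slices[target_slice][100:132, 90:122] += template      # embed the pattern into one slice

scores = [match_template(s, template).max() for s in slices]
best = int(np.argmax(scores))
print(f"slice with the highest template-matching score: {best} (expected {target_slice})")
```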

  4. Application of new least-squares methods for the quantitative infrared analysis of multicomponent samples

    SciTech Connect

    Haaland, D.M.; Easterling, R.G.

    1982-11-01

    Improvements have been made in previous least-squares regression analyses of infrared spectra for the quantitative estimation of concentrations of multicomponent mixtures. Spectral baselines are fitted by least-squares methods, and overlapping spectral features are accounted for in the fitting procedure. Selection of peaks above a threshold value reduces computation time and data storage requirements. Four weighted least-squares methods incorporating different baseline assumptions were investigated using FT-IR spectra of the three pure xylene isomers and their mixtures. By fitting only regions of the spectra that follow Beer's Law, accurate results can be obtained using three of the fitting methods even when baselines are not corrected to zero. Accurate results can also be obtained using one of the fits even in the presence of Beer's Law deviations. This is a consequence of pooling the weighted results for each spectral peak such that the greatest weighting is automatically given to those peaks that adhere to Beer's Law. It has been shown with the xylene spectra that semiquantitative results can be obtained even when all the major components are not known or when expected components are not present. This improvement over previous methods greatly expands the utility of quantitative least-squares analyses.
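
    The sketch below shows the core idea of fitting pure-component spectra together with baseline terms by least squares; it is a simplified, unweighted version with synthetic "spectra" standing in for the xylene data, not the weighted procedures developed in the paper.

```python
# Minimal sketch of least-squares quantitation of a multicomponent mixture with a fitted
# baseline (simplified, unweighted). Pure-component "spectra" are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(11)
wavenumbers = np.linspace(0, 1, 200)

def band(center, width):
    return np.exp(-((wavenumbers - center) / width) ** 2)

# Pure-component reference spectra for three components (stand-ins for the xylene isomers).
pures = np.column_stack([band(0.3, 0.04), band(0.5, 0.05), band(0.7, 0.04)])

true_conc = np.array([0.2, 0.5, 0.3])
mixture = pures @ true_conc + 0.05 + 0.02 * wavenumbers          # linear baseline offset
mixture += rng.normal(0, 0.002, mixture.size)                    # noise

# Design matrix: pure spectra plus baseline terms (constant offset and linear slope).
design = np.column_stack([pures, np.ones_like(wavenumbers), wavenumbers])
coef, *_ = np.linalg.lstsq(design, mixture, rcond=None)

print("estimated concentrations:", np.round(coef[:3], 3), "(true:", true_conc, ")")
```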

  5. Difficulties Experienced by Education and Sociology Students in Quantitative Methods Courses.

    ERIC Educational Resources Information Center

    Murtonen, Mari; Lehtinen, Erno

    2003-01-01

    Examined difficulties Finnish university students experienced in learning quantitative methods. Education and sociology students rated different topics on the basis of their difficulty. Overall, students considered statistics and quantitative methods more difficult than other domains. They tended to polarize academic subjects into "easier"…

  6. Visual Display of Scientific Studies, Methods, and Results

    NASA Astrophysics Data System (ADS)

    Saltus, R. W.; Fedi, M.

    2015-12-01

    The need for efficient and effective communication of scientific ideas becomes more urgent each year. A growing number of societal and economic issues are tied to matters of science, e.g., climate change, natural resource availability, and public health. Societal and political debate should be grounded in a general understanding of scientific work in relevant fields. It is difficult for many participants in these debates to access science directly because the formal method for scientific documentation and dissemination is the journal paper, generally written for a highly technical and specialized audience. Journal papers are very effective and important for documentation of scientific results and are essential to the requirement that science produce citable and repeatable results. However, journal papers are not effective at providing a quick and intuitive summary useful for public debate. Just as quantitative data are generally best viewed in graphic form, we propose that scientific studies can also benefit from visual summary and display. We explore the use of existing methods for diagramming logical connections and dependencies, such as Venn diagrams, mind maps, flow charts, etc., for rapidly and intuitively communicating the methods and results of scientific studies. We also discuss a method, specifically tailored to summarizing scientific papers, that we introduced last year at AGU. Our method diagrams the relative importance and connections between data, methods/models, results/ideas, and implications/importance using a single-page format with connected elements in these four categories. Within each category (e.g., data) the spatial location of individual elements (e.g., seismic, topographic, gravity) indicates relative novelty (e.g., are these new data?) and importance (e.g., how critical are these data to the results of the paper?). The goal is to find ways to rapidly and intuitively share both the results and the process of science, both for communication

  7. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test.

    PubMed

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G

    2015-12-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as "gold standard" for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  8. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  9. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test.

    PubMed

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G

    2015-11-26

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as "gold standard" for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay.

  10. [Research progress of real-time quantitative PCR method for group A rotavirus detection].

    PubMed

    Guo, Yan-Qing; Li, Dan-Di; Duan, Zhao-Jun

    2013-11-01

    Group A rotavirus is one of the most significant etiological agents of acute gastroenteritis among infants and young children worldwide. So far, several methods have been established for the detection of rotavirus, including electron microscopy (EM), enzyme immunoassay (EIA), reverse transcription-polymerase chain reaction (RT-PCR), and real-time quantitative PCR. Compared with other methods, real-time quantitative PCR has advantages in specificity, sensitivity, genotyping, and quantitative accuracy. This article provides an overview of the application of the real-time quantitative PCR technique to the detection of group A rotavirus.

  11. Quantitative estimation of poikilocytosis by the coherent optical method

    NASA Astrophysics Data System (ADS)

    Safonova, Larisa P.; Samorodov, Andrey V.; Spiridonov, Igor N.

    2000-05-01

    An investigation of the need for, and the reliability required of, the determination of poikilocytosis in hematology has shown that existing techniques suffer from serious shortcomings. To determine the deviation of erythrocyte shape from the normal (rounded) form in blood smears, it is expedient to use an integrative estimate. An algorithm is suggested that is based on the correlation between erythrocyte morphological parameters and properties of the spatial-frequency spectrum of the blood smear. Analytical and experimental research led to an integrative form parameter (IFP) that characterizes an increase in the relative concentration of cells with changed form above 5% and the predominant type of poikilocyte. An algorithm for statistically reliable estimation of the IFP on standard stained blood smears has been developed. To provide quantitative characterization of the morphological features of cells, a form vector has been proposed, and its validity for differentiating poikilocytes was shown.

  12. Development of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    PubMed

    Oguchi, Taichi; Onishi, Mari; Minegishi, Yasutaka; Kurosawa, Yasunori; Kasahara, Masaki; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2009-06-01

    A duplex real-time PCR method was developed for quantitative screening analysis of GM maize. The duplex real-time PCR simultaneously detected two GM-specific segments, namely the cauliflower mosaic virus (CaMV) 35S promoter (P35S) segment and an event-specific segment for GA21 maize which does not contain P35S. Calibration was performed with a plasmid calibrant specially designed for the duplex PCR. The result of an in-house evaluation suggested that the analytical precision of the developed method was almost equivalent to those of simplex real-time PCR methods, which have been adopted as ISO standard methods for the analysis of GMOs in foodstuffs and have also been employed for the analysis of GMOs in Japan. In addition, this method will reduce both the cost and time requirement of routine GMO analysis by half. The high analytical performance demonstrated in the current study would be useful for the quantitative screening analysis of GM maize. We believe the developed method will be useful for practical screening analysis of GM maize, although interlaboratory collaborative studies should be conducted to confirm this. PMID:19602858

  13. Validation of quantitative and qualitative methods for detecting allergenic ingredients in processed foods in Japan.

    PubMed

    Sakai, Shinobu; Adachi, Reiko; Akiyama, Hiroshi; Teshima, Reiko

    2013-06-19

    A labeling system for food allergenic ingredients was established in Japan in April 2002. To monitor the labeling, the Japanese government announced official methods for detecting allergens in processed foods in November 2002. The official methods consist of quantitative screening tests using enzyme-linked immunosorbent assays (ELISAs) and qualitative confirmation tests using Western blotting or polymerase chain reactions (PCR). In addition, the Japanese government designated 10 μg protein/g food (the corresponding allergenic ingredient soluble protein weight/food weight), determined by ELISA, as the labeling threshold. To standardize the official methods, the criteria for the validation protocol were described in the official guidelines. This paper, which was presented at the Advances in Food Allergen Detection Symposium, ACS National Meeting and Expo, San Diego, CA, Spring 2012, describes the validation protocol outlined in the official Japanese guidelines, the results of interlaboratory studies for the quantitative detection method (ELISA for crustacean proteins) and the qualitative detection method (PCR for shrimp and crab DNAs), and the reliability of the detection methods.

  14. Development of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    PubMed

    Oguchi, Taichi; Onishi, Mari; Minegishi, Yasutaka; Kurosawa, Yasunori; Kasahara, Masaki; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2009-06-01

    A duplex real-time PCR method was developed for quantitative screening analysis of GM maize. The duplex real-time PCR simultaneously detected two GM-specific segments, namely the cauliflower mosaic virus (CaMV) 35S promoter (P35S) segment and an event-specific segment for GA21 maize which does not contain P35S. Calibration was performed with a plasmid calibrant specially designed for the duplex PCR. The result of an in-house evaluation suggested that the analytical precision of the developed method was almost equivalent to those of simplex real-time PCR methods, which have been adopted as ISO standard methods for the analysis of GMOs in foodstuffs and have also been employed for the analysis of GMOs in Japan. In addition, this method will reduce both the cost and time requirement of routine GMO analysis by half. The high analytical performance demonstrated in the current study would be useful for the quantitative screening analysis of GM maize. We believe the developed method will be useful for practical screening analysis of GM maize, although interlaboratory collaborative studies should be conducted to confirm this.

  15. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    PubMed

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. Key advantages of FIA-MS over chromatography-MS are fast analysis (typical run time <60 s) and method simplicity, and FIA-MS offers high-throughput without compromising sensitivity, precision and accuracy as much as ambient MS techniques. Consequently, FIA-MS is increasingly becoming recognized as a suitable technique for applications where quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis such as salting out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4(+) QuEChERS, and (3) the need to improve efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical

  16. Analyses on Regional Cultivated Land Changebased on Quantitative Method

    NASA Astrophysics Data System (ADS)

    Cao, Yingui; Yuan, Chun; Zhou, Wei; Wang, Jing

    The Three Gorges Project is one of the largest engineering projects in the world and has accelerated economic development in its reservoir area. In the course of this development, cultivated land has become an important resource: large areas have been occupied and converted to construction land, while other areas have been flooded by the rising water level. This paper uses cultivated land areas and socio-economic indicators for the Three Gorges reservoir area from 1990 to 2004, applying statistical analyses and case studies to describe the process of cultivated land change, identify its driving forces, and develop methods to simulate and forecast future cultivated land areas in support of cultivated land protection and sustainable development in the reservoir area. The results are as follows. First, over the past 15 years the cultivated land area decreased by 200142 hm², an average loss of 13343 hm² per year. The whole reservoir area was divided into three sub-areas (upper reaches, belly area, and lower reaches), and the trends of cultivated land change in each sub-area are similar to those of the whole reservoir area. Second, the curve of cultivated land area against per capita GDP takes the form of an inverted U, and in some years the rate of cultivated land change and the rate of GDP change diverge, indicating that the two are decoupled; cultivated land change is also strongly connected with urbanization and the policy of returning farmland to forest. Finally, multiple regression was less precise than a BP neural network in simulating cultivated land areas, so the BP neural network was used to forecast the cultivated land areas in 2005, 2010 and 2015, and the forecast results are reasonable.
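
    A minimal sketch of the forecasting step, using scikit-learn's MLPRegressor as a stand-in for the back-propagation (BP) neural network; the yearly cultivated-land areas and socio-economic drivers below are invented for illustration and are not the paper's data.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    # Invented drivers: year, per-capita GDP index, urbanization rate.
    X = np.array([[year, 1.0 + 0.08 * i, 0.20 + 0.01 * i]
                  for i, year in enumerate(range(1990, 2005))])
    # Invented cultivated land areas (10^4 hm^2), roughly decreasing over time.
    rng = np.random.default_rng(0)
    y_area = np.linspace(160.0, 140.0, num=15) + rng.normal(0.0, 0.5, 15)

    scaler = StandardScaler().fit(X)
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit(scaler.transform(X), y_area)

    # Forecast 2005, 2010 and 2015 by extrapolating the driver trends (an assumption).
    X_future = np.array([[2005, 2.2, 0.35], [2010, 2.6, 0.40], [2015, 3.0, 0.45]])
    print(model.predict(scaler.transform(X_future)))
    ```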

  17. Optimization of Quantitative PCR Methods for Enteropathogen Detection.

    PubMed

    Liu, Jie; Gratz, Jean; Amour, Caroline; Nshama, Rosemary; Walongo, Thomas; Maro, Athanasia; Mduma, Esto; Platts-Mills, James; Boisen, Nadia; Nataro, James; Haverstick, Doris M; Kabir, Furqan; Lertsethtakarn, Paphavee; Silapong, Sasikorn; Jeamwattanalert, Pimmada; Bodhidatta, Ladaporn; Mason, Carl; Begum, Sharmin; Haque, Rashidul; Praharaj, Ira; Kang, Gagandeep; Houpt, Eric R

    2016-01-01

    Detection and quantification of enteropathogens in stool specimens is useful for diagnosing the cause of diarrhea but is technically challenging. Here we evaluate several important determinants of quantification: specimen collection, nucleic acid extraction, and extraction and amplification efficiency. First, we evaluate the molecular detection and quantification of pathogens in rectal swabs versus stool, using paired flocked rectal swabs and whole stool collected from 129 children hospitalized with diarrhea in Tanzania. Swabs generally yielded a higher quantification cycle (Cq) (average 29.7, standard deviation 3.5 vs. 25.3 ± 2.9 from stool, P<0.001) but were still able to detect 80% of pathogens with a Cq < 30 in stool. Second, a simplified total nucleic acid (TNA) extraction procedure was compared to separate DNA and RNA extractions and showed 92% (318/344) sensitivity and 98% (951/968) specificity, with no difference in Cq value for the positive results (ΔCq(DNA+RNA-TNA) = -0.01 ± 1.17, P = 0.972, N = 318). Third, we devised a quantification scheme that adjusts pathogen quantity to the specimen's extraction and amplification efficiency, and show that this better estimates the quantity of spiked specimens than the raw target Cq. In sum, these methods for enteropathogen quantification, stool sample collection, and nucleic acid extraction will be useful for laboratories studying enteric disease. PMID:27336160
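
    A minimal sketch of one way to implement the efficiency adjustment described above, assuming an extrinsic spiked control with a known expected Cq; the correction formula and all numbers are illustrative assumptions rather than the authors' exact scheme.

    ```python
    def adjusted_quantity(target_cq, control_cq_observed, control_cq_expected,
                          slope=-3.32, intercept=38.0):
        """
        Estimate pathogen quantity from a target Cq after shifting it by the deviation
        of a spiked extraction/amplification control from its expected Cq. The slope and
        intercept describe an assumed standard curve: Cq = slope*log10(quantity) + intercept.
        """
        correction = control_cq_observed - control_cq_expected  # > 0 suggests losses/inhibition
        corrected_cq = target_cq - correction
        return 10 ** ((corrected_cq - intercept) / slope)

    # Hypothetical specimen: the spiked control came up 1.5 cycles late, so the raw
    # target Cq is shifted by the same amount before conversion to a quantity.
    print(adjusted_quantity(target_cq=27.0, control_cq_observed=29.5, control_cq_expected=28.0))
    ```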

  18. Optimization of Quantitative PCR Methods for Enteropathogen Detection

    PubMed Central

    Liu, Jie; Gratz, Jean; Amour, Caroline; Nshama, Rosemary; Walongo, Thomas; Maro, Athanasia; Mduma, Esto; Platts-Mills, James; Boisen, Nadia; Nataro, James; Haverstick, Doris M.; Kabir, Furqan; Lertsethtakarn, Paphavee; Silapong, Sasikorn; Jeamwattanalert, Pimmada; Bodhidatta, Ladaporn; Mason, Carl; Begum, Sharmin; Haque, Rashidul; Praharaj, Ira; Kang, Gagandeep; Houpt, Eric R.

    2016-01-01

    Detection and quantification of enteropathogens in stool specimens is useful for diagnosing the cause of diarrhea but is technically challenging. Here we evaluate several important determinants of quantification: specimen collection, nucleic acid extraction, and extraction and amplification efficiency. First, we evaluate the molecular detection and quantification of pathogens in rectal swabs versus stool, using paired flocked rectal swabs and whole stool collected from 129 children hospitalized with diarrhea in Tanzania. Swabs generally yielded a higher quantification cycle (Cq) (average 29.7, standard deviation 3.5 vs. 25.3 ± 2.9 from stool, P<0.001) but were still able to detect 80% of pathogens with a Cq < 30 in stool. Second, a simplified total nucleic acid (TNA) extraction procedure was compared to separate DNA and RNA extractions and showed 92% (318/344) sensitivity and 98% (951/968) specificity, with no difference in Cq value for the positive results (ΔCq(DNA+RNA-TNA) = -0.01 ± 1.17, P = 0.972, N = 318). Third, we devised a quantification scheme that adjusts pathogen quantity to the specimen’s extraction and amplification efficiency, and show that this better estimates the quantity of spiked specimens than the raw target Cq. In sum, these methods for enteropathogen quantification, stool sample collection, and nucleic acid extraction will be useful for laboratories studying enteric disease. PMID:27336160

  19. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  20. A novel microflow LC-MS method for the quantitation of endocannabinoids in serum.

    PubMed

    Kirkwood, Jay S; Broeckling, Corey D; Donahue, Seth; Prenni, Jessica E

    2016-10-15

    Endocannabinoids (ECs) represent a class of endogenous, small molecules that bind and activate the G-protein coupled EC receptors. They are involved in a variety of fundamental biological processes and are associated with many disease states. Endocannabinoids are often present in complex matrices and at low concentrations, complicating their measurement. Here we describe a highly sensitive method for the quantitation of the following ECs in serum: N-arachidonoylethanolamine (anandamide), N-oleoylethanolamine, N-palmitoylethanolamine, 2-arachidonoylglycerol, and its inactive isomer 1-arachidonoylglycerol. On-line sample trapping coupled with separation via microflow liquid chromatography and detection by tandem quadrupole mass spectrometry results in the necessary sensitivity for accurate quantitation of ECs in less than 50 μL of serum, without the need for off-line solid phase extraction. Limits of quantitation between 1.2 and 13.4 pg/mL were achieved, representing a significant increase in sensitivity compared to previous methods using analytical flow rates. An additional benefit of microflow chromatography is the reduction of solvent consumption by more than two orders of magnitude. The experimental utility of the assay is demonstrated through the analysis of serum from hibernating bears to assess seasonal changes in circulating EC concentrations. PMID:27592285

  1. Quantitative assessment of contact and non-contact lateral force calibration methods for atomic force microscopy.

    PubMed

    Tran Khac, Bien Cuong; Chung, Koo-Hyun

    2016-02-01

    Atomic Force Microscopy (AFM) has been widely used for measuring friction force at the nano-scale. However, one of the key challenges faced by AFM researchers is to calibrate an AFM system to interpret a lateral force signal as a quantifiable force. In this study, five rectangular cantilevers were used to quantitatively compare three different lateral force calibration methods to demonstrate the legitimacy and to establish confidence in the quantitative integrity of the proposed methods. The Flat-Wedge method is based on a variation of the lateral output on a surface with flat and changing slopes, the Multi-Load Pivot method is based on taking pivot measurements at several locations along the cantilever length, and the Lateral AFM Thermal-Sader method is based on determining the optical lever sensitivity from the thermal noise spectrum of the first torsional mode with a known torsional spring constant from the Sader method. The results of the calibration using the Flat-Wedge and Multi-Load Pivot methods were found to be consistent within experimental uncertainties, and the experimental uncertainties of the two methods were found to be less than 15%. However, the lateral force sensitivity determined by the Lateral AFM Thermal-Sader method was found to be 8-29% smaller than those obtained from the other two methods. This discrepancy decreased to 3-19% when the torsional mode correction factor for an ideal cantilever was used, which suggests that the torsional mode correction should be taken into account to establish confidence in Lateral AFM Thermal-Sader method.

  2. Intra-laboratory validation of chronic bee paralysis virus quantitation using an accredited standardised real-time quantitative RT-PCR method.

    PubMed

    Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali

    2012-03-01

    Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease in adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and the repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6 log range from 10(2) to 10(8) with a detection limit of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee. PMID:22207079

  3. Chemical comparison of Tripterygium wilfordii and Tripterygium hypoglaucum based on quantitative analysis and chemometrics methods.

    PubMed

    Guo, Long; Duan, Li; Liu, Ke; Liu, E-Hu; Li, Ping

    2014-07-01

    Tripterygium wilfordii (T. wilfordii) and Tripterygium hypoglaucum (T. hypoglaucum), two commonly used Chinese herbal medicines derived from the Tripterygium genus, have been widely used for the treatment of rheumatoid arthritis and other related inflammatory diseases in clinical therapy. In the present study, a rapid resolution liquid chromatography/electrospray ionization tandem mass spectrometry (RRLC-ESI-MS(n)) method has been developed and validated for simultaneous determination of 19 bioactive compounds including four catechins, three sesquiterpene alkaloids, four diterpenoids, and eight triterpenoids in these two similar herbs. The method validation results indicated that the developed method had desirable specificity, linearity, precision and accuracy. Quantitative analysis results showed that there were significant differences in the content of different types of compounds in T. wilfordii and T. hypoglaucum. Moreover, chemometrics methods such as one-way ANOVA, principal component analysis (PCA) and hierarchical clustering analysis (HCA) were performed on the quantitative data to compare and discriminate the two Tripterygium herbs, and this approach proved straightforward and reliable for differentiating T. wilfordii and T. hypoglaucum samples from different origins. In conclusion, simultaneous quantification of multiple active components by RRLC-ESI-MS(n) coupled with chemometrics analysis could be a well-accepted strategy to compare and evaluate the quality of T. wilfordii and T. hypoglaucum.

  4. Quantitative research on the primary process: method and findings.

    PubMed

    Holt, Robert R

    2002-01-01

    Freud always defined the primary process metapsychologically, but he described the ways it shows up in dreams, parapraxes, jokes, and symptoms with enough observational detail to make it possible to create an objective, reliable scoring system to measure its manifestations in Rorschach responses, dreams, TAT stories, free associations, and other verbal texts. That system can identify signs of the thinker's efforts, adaptive or maladaptive, to control or defend against the emergence of primary process. A prerequisite and a consequence of the research that used this system was clarification and elaboration of the psychoanalytic theory of thinking. Results of empirical tests of several propositions derived from psychoanalytic theory are summarized. Predictions concerning the method's most useful index, of adaptive vs. maladaptive regression, have been repeatedly verified: People who score high on this index (who are able to produce well-controlled "primary products" in their Rorschach responses), as compared to those who score at the maladaptive pole (producing primary-process-filled responses with poor reality testing, anxiety, and pathological defensive efforts), are better able to tolerate sensory deprivation, are more able to enter special states of consciousness comfortably (drug-induced, hypnotic, etc.), and have higher achievements in artistic creativity, while schizophrenics tend to score at the extreme of maladaptive regression. Capacity for adaptive regression also predicts success in psychotherapy, and rises with the degree of improvement after both psychotherapy and drug treatment. Some predictive failures have been theoretically interesting: Kris's hypothesis about creativity and the controlled use of primary process holds for males but usually not for females. This body of work is presented as a refutation of charges, brought by such critics as Crews, that psychoanalysis cannot become a science.

  5. An ECL-PCR method for quantitative detection of point mutation

    NASA Astrophysics Data System (ADS)

    Zhu, Debin; Xing, Da; Shen, Xingyan; Chen, Qun; Liu, Jinfeng

    2005-04-01

    A new method for identification of point mutations was proposed. Polymerase chain reaction (PCR) amplification of a sequence from genomic DNA was followed by digestion with a restriction enzyme that cuts only the wild-type amplicon containing its recognition site. Reaction products were detected by electrochemiluminescence (ECL) assay after adsorption of the resulting DNA duplexes to the solid phase. One strand of the PCR products carries biotin and is bound to a streptavidin-coated microbead for sample selection; the other strand carries Ru(bpy)₃²⁺ (TBR), which reacts with tripropylamine (TPA) to emit light for ECL detection. The method was applied to detect a specific point mutation in the H-ras oncogene in the T24 cell line. The results show that the detection limit for the H-ras amplicon is 100 fmol and the linear range spans more than 3 orders of magnitude, thus making quantitative analysis possible. The genotype can be clearly discriminated. The results of the study suggest that ECL-PCR is a feasible quantitative method for safe, sensitive and rapid detection of point mutations in human genes.

  6. Use of quantitative PCR to evaluate methods of bacteria sampling in periodontal patients.

    PubMed

    Masunaga, Hiroshi; Tsutae, Wataru; Oh, Hyun; Shinozuka, Naoki; Kishimoto, Noriyoshi; Ogata, Yorimasa

    2010-12-01

    Periodontal disease is associated with specific periodontal pathogens and may persist as gingivitis or progress to more severe disease. The bacteria involved in disease initiation and progression have not been identified. We used quantitative polymerase chain reaction (PCR) to compare the levels of Porphyromonas gingivalis, Tannerella forsythia, and Treponema denticola, and total bacteria detected by different sampling methods. On the basis of the results of clinical examinations, 57 patients were divided into 3 groups: healthy group (group A), gingivitis group (group B), and periodontitis group (group C). Bacterial samples were collected from saliva, mouthwash, and by paper-point sampling of gingival crevicular fluid (GCF), and the samples were analyzed with quantitative PCR targeting 16S rRNA. The numbers of total bacteria in samples of GCF, saliva, and mouthwash were 10⁵ to 10⁶, 10⁸, and 10⁷, respectively, per milliliter. The number of P. gingivalis in GCF samples was lower than 10 in group A; however, in groups B and C, the values were 10³ and 10⁴, respectively, indicating that the number of P. gingivalis increased with worsening clinical status. Findings were similar in the samples of saliva and mouthwash. The numbers of T. forsythia showed a pattern similar to that of P. gingivalis in all 3 samples. These results suggest that saliva and mouthwash samples are clinically useful for bacterial testing of periodontal diseases by quantitative PCR. In addition, mouthwash sampling is more feasible and straightforward than saliva sampling.

  7. An experimental method for quantitatively evaluating the elemental processes of indoor radioactive aerosol behavior.

    PubMed

    Yamazawa, H; Yamada, S; Xu, Y; Hirao, S; Moriizumi, J

    2015-11-01

    An experimental method for quantitatively evaluating the elemental processes governing the indoor behaviour of naturally occurring radioactive aerosols was proposed. This method utilises the transient response of aerosol concentrations to an artificial change in the aerosol removal rate caused by turning an air purifier on and off. It was shown that the indoor-outdoor exchange rate and the indoor deposition rate could be estimated both from continuous measurements of outdoor and indoor aerosol number concentrations and by the method proposed in this study. Although the scatter of the estimated parameters is relatively large, both methods gave consistent results. It was also found that the size distribution of radioactive aerosol particles, and hence the activity median aerodynamic diameter, remained largely unaffected by the operation of the air purifier, implying that the exchange and deposition processes predominate over other processes that change the size distribution, such as size growth by coagulation and the size dependence of deposition.

  8. Rapid quantitative method for total brominated vegetable oil in soft drinks using ion chromatography.

    PubMed

    Yousef, Ashraf A; Abbas, Alaa B; Badawi, Bassam Sh; Al-Jowhar, Wafaa Y; Zain, Esam A; El-Mufti, Seham A

    2012-08-01

    A simple, quantitative and rapid method for total brominated vegetable oil (BVO) using ion chromatography (IC) with suppressed conductivity detection was developed and successfully applied to soft drinks, with results expressed as inorganic bromide anion. The procedure involves extraction of BVO with diethyl ether and treatment with zinc dust in a solution of acetic acid, giving recoveries ranging between 92.5 and 98.5%. The calibration curves obtained were linear with correlation coefficients (r²) of 0.998, a coefficient of variation (CV) of less than 5% and a limit of detection (LOD) and limit of quantification (LOQ) of 250 and 750 µg l⁻¹, respectively. The method was successfully applied to the determination of BVO in several commercial soft drinks, which were found to contain BVO in the range 1.8-14.510 mg l⁻¹. The method has fewer sources of error than previously published methods.

  9. Full quantitative phase analysis of hydrated lime using the Rietveld method

    SciTech Connect

    Lassinantti Gualtieri, Magdalena

    2012-09-15

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2-15 wt.%.
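
    A minimal sketch of the internal-standard arithmetic that underlies this kind of full quantitative phase analysis; the relation below is the widely used Rietveld internal-standard equation, and the weighed and refined fractions are invented numbers, not results from this study.

    ```python
    def amorphous_fraction(w_std_weighed, w_std_refined):
        """
        Amorphous (plus unaccounted) fraction of the original sample from a spiked
        internal standard: w_std_weighed is the weighed mass fraction of the standard
        in the mixture and w_std_refined is its Rietveld-refined (normalized) fraction.
        """
        if w_std_refined < w_std_weighed:
            raise ValueError("Refined standard fraction below weighed fraction; check the refinement.")
        return (1.0 - w_std_weighed / w_std_refined) / (1.0 - w_std_weighed)

    # Hypothetical mixture: 20 wt.% standard weighed in, refined as 22.5 wt.%.
    print(f"amorphous content ~ {100 * amorphous_fraction(0.20, 0.225):.1f} wt.%")
    ```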

  10. Stress echocardiography: methods, indications and results

    PubMed Central

    Baur, L.H.B.

    2002-01-01

    Stress echocardiography has become an important clinical tool to detect cardiac ischaemia and viability in addition to single photon emission tomography. Stress echocardiography has a high positive and negative predictive value, is less expensive than the nuclear methods and has no radiation exposure. It can easily be used in an emergency room and coronary care unit. Because of its feasibility, low cost and high diagnostic accuracy, it will become a very important technique in every hospital and will soon be a real alternative to the more time-consuming and expensive nuclear techniques. The current article gives a review of the methods of stress echocardiography. PMID:25696080

  11. Semi-quantitative method to estimate levels of Campylobacter

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Introduction: Research projects utilizing live animals and/or systems often require reliable, accurate quantification of Campylobacter following treatments. Even with marker strains, conventional methods designed to quantify are labor and material intensive requiring either serial dilutions or MPN ...

  12. Reconstruction-classification method for quantitative photoacoustic tomography.

    PubMed

    Malone, Emma; Powell, Samuel; Cox, Ben T; Arridge, Simon

    2015-01-01

    We propose a combined reconstruction-classification method for simultaneously recovering absorption and scattering in turbid media from images of absorbed optical energy. This method exploits knowledge that optical parameters are determined by a limited number of classes to iteratively improve their estimate. Numerical experiments show that the proposed approach allows for accurate recovery of absorption and scattering in two and three dimensions, and delivers superior image quality with respect to traditional reconstruction-only approaches. PMID:26662815

  13. Reconstruction-classification method for quantitative photoacoustic tomography.

    PubMed

    Malone, Emma; Powell, Samuel; Cox, Ben T; Arridge, Simon

    2015-01-01

    We propose a combined reconstruction-classification method for simultaneously recovering absorption and scattering in turbid media from images of absorbed optical energy. This method exploits knowledge that optical parameters are determined by a limited number of classes to iteratively improve their estimate. Numerical experiments show that the proposed approach allows for accurate recovery of absorption and scattering in two and three dimensions, and delivers superior image quality with respect to traditional reconstruction-only approaches.

  14. A method and fortran program for quantitative sampling in paleontology

    USGS Publications Warehouse

    Tipper, J.C.

    1976-01-01

    The Unit Sampling Method is a binomial sampling method applicable to the study of fauna preserved in rocks too well cemented to be disaggregated. Preliminary estimates of the probability of detecting each group in a single sampling unit can be converted to estimates of the group's volumetric abundance by means of correction curves obtained by a computer simulation technique. This paper describes the technique and gives the FORTRAN program. © 1976.

  15. University Students' Research Orientations: Do Negative Attitudes Exist toward Quantitative Methods?

    ERIC Educational Resources Information Center

    Murtonen, Mari

    2005-01-01

    This paper examines university social science and education students' views of research methodology, especially asking whether a negative research orientation towards quantitative methods exists. Finnish (n = 196) and US (n = 122) students answered a questionnaire concerning their views on quantitative, qualitative, empirical, and theoretical…

  16. A quantitative method based on total relative change for dynamic electrical impedance tomography.

    PubMed

    You, Fusheng; Shi, Xuetao; Dong, Xiuzhen; Fu, Feng; Liu, Ruigang; Shuai, Wanjun; Li, Zheng

    2008-03-01

    We proposed a new method based on total relative change (TRC) from measured boundary voltages to quantify the volume changes of fluid during electrical impedance tomography (EIT) monitoring. The results showed that TRC correlated linearly with the volume of saline solution infused into a phantom, and the slope of the TRC change was approximately linear with the infusion speed. An inserted copper tube at different positions did not affect TRC significantly. The linear relationship between TRC and volume change indicates that TRC could be a good quantitative index for dynamic EIT.
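
    A minimal sketch of a total-relative-change style index computed from boundary voltage frames; the abstract does not give the exact formula, so the sum of absolute relative changes over all measurement channels used below is an assumption, and the frames are synthetic.

    ```python
    import numpy as np

    def total_relative_change(frame, reference):
        """Assumed TRC definition: sum over all channels of |(V - V_ref) / V_ref|."""
        frame, reference = np.asarray(frame, float), np.asarray(reference, float)
        return np.sum(np.abs((frame - reference) / reference))

    rng = np.random.default_rng(1)
    reference = 1.0 + 0.1 * rng.random(208)               # e.g. a 208-channel voltage frame
    volumes_ml = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
    # Synthetic frames in which the perturbation grows with the infused volume.
    frames = [reference * (1.0 + 0.002 * v * rng.random(208)) for v in volumes_ml]

    trc = [total_relative_change(f, reference) for f in frames]
    print(np.round(trc, 3))    # grows roughly linearly with infused volume
    ```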

  17. Critical appraisal of quantitative PCR results in colorectal cancer research: can we rely on published qPCR results?

    PubMed

    Dijkstra, J R; van Kempen, L C; Nagtegaal, I D; Bustin, S A

    2014-06-01

    The use of real-time quantitative polymerase chain reaction (qPCR) in cancer research has become ubiquitous. The relative simplicity of qPCR experiments, which deliver fast and cost-effective results, means that each year an increasing number of papers utilizing this technique are being published. But how reliable are the published results? Since the validity of gene expression data is greatly dependent on appropriate normalisation to compensate for sample-to-sample and run-to-run variation, we have evaluated the adequacy of normalisation procedures in qPCR-based experiments. Consequently, we assessed all colorectal cancer publications that made use of qPCR from 2006 until August 2013 for the number of reference genes used and whether they had been validated. Using even these minimal evaluation criteria, the validity of only three percent (6/179) of the publications can be adequately assessed. We describe common errors, and conclude that the current state of reporting on qPCR in colorectal cancer research is disquieting. Extrapolated to the study of cancer in general, it is clear that the majority of studies using qPCR cannot be reliably assessed and that at best, the results of these studies may or may not be valid and at worst, pervasive incorrect normalisation is resulting in the wholesale publication of incorrect conclusions. This survey demonstrates that the existence of guidelines, such as MIQE, is necessary but not sufficient to address this problem and suggests that the scientific community should examine its responsibility and be aware of the implications of these findings for current and future research.
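
    A minimal sketch of the kind of normalisation the authors argue is often missing: relative expression computed against the geometric mean of several validated reference genes, in the spirit of MIQE-style recommendations; the Cq values and the common efficiency of 2 below are invented.

    ```python
    import numpy as np

    def relative_expression(target_cq_sample, target_cq_control,
                            ref_cq_sample, ref_cq_control, efficiency=2.0):
        """
        Relative expression of a target gene (sample vs. control), normalised to the
        geometric mean of several reference genes, assuming one common efficiency.
        """
        # Geometric mean of E^-Cq over the reference genes equals E^-(mean Cq).
        ref_norm_sample = efficiency ** -np.mean(ref_cq_sample)
        ref_norm_control = efficiency ** -np.mean(ref_cq_control)
        target_ratio = efficiency ** (target_cq_control - target_cq_sample)
        return target_ratio / (ref_norm_sample / ref_norm_control)

    # Hypothetical tumour vs. normal sample with three validated reference genes.
    print(relative_expression(24.0, 26.5, [18.2, 20.1, 22.4], [18.0, 20.3, 22.1]))
    ```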

  18. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences among the caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.
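
    A minimal sketch of the image-ratio grading described above, using the thresholds reported in the abstract; the region-of-interest handling and the synthetic pixel values are illustrative assumptions.

    ```python
    import numpy as np

    GRADES = [(0.66, "sound"), (1.06, "early decay"), (1.62, "established decay")]

    def caries_grade(rgb_roi):
        """Grade a fluorescence-image region of interest by its mean R/(G+B) ratio."""
        rgb = np.asarray(rgb_roi, dtype=float)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        ratio = float(np.mean(r / (g + b + 1e-9)))    # small epsilon avoids division by zero
        for threshold, label in GRADES:
            if ratio < threshold:
                return ratio, label
        return ratio, "severe decay"

    # Synthetic 4x4 ROI whose red channel dominates, mimicking decayed tissue.
    roi = np.stack([np.full((4, 4), 180.0), np.full((4, 4), 90.0), np.full((4, 4), 40.0)], axis=-1)
    print(caries_grade(roi))    # ratio ~1.38 -> "established decay"
    ```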

  19. Comparative evaluation of two quantitative precipitation estimation methods in Korea

    NASA Astrophysics Data System (ADS)

    Ko, H.; Nam, K.; Jung, H.

    2013-12-01

    The spatial distribution and intensity of rainfall are necessary inputs for hydrological models, particularly grid-based distributed models. Weather radar offers much higher spatial resolution (1 km × 1 km) than rain gauges (~13 km spacing), although radar measures rainfall indirectly whereas rain gauges observe it directly. Radar also provides areal, gridded rainfall information, while rain gauges provide point data. Therefore, radar rainfall data can be useful as input data for hydrological models. In this study, we compared two QPE schemes for producing radar rainfall for hydrological use. The two methods are 1) spatial adjustment and 2) real-time Z-R relationship adjustment (hereafter RAR; Radar-AWS Rain rate). We computed and analyzed statistics such as ME (mean error), RMSE (root mean square error), and correlation using a cross-validation method (here, the leave-one-out method).
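
    A minimal sketch of the verification statistics named above (ME, RMSE, correlation) under leave-one-out cross-validation; the gauge and radar values, and the use of a simple mean-field bias factor as the adjustment step, are illustrative assumptions rather than either of the operational QPE schemes.

    ```python
    import numpy as np

    def loo_verification(radar, gauge):
        """Leave-one-out: adjust radar with a bias factor from the other gauges, then score."""
        radar, gauge = np.asarray(radar, float), np.asarray(gauge, float)
        adjusted = np.empty_like(radar)
        for i in range(len(radar)):
            keep = np.arange(len(radar)) != i
            bias = gauge[keep].sum() / radar[keep].sum()   # simple mean-field bias factor
            adjusted[i] = bias * radar[i]
        err = adjusted - gauge
        return err.mean(), np.sqrt((err ** 2).mean()), np.corrcoef(adjusted, gauge)[0, 1]

    gauge = np.array([4.0, 7.5, 12.0, 3.2, 9.8, 6.1])   # hypothetical hourly gauge totals (mm)
    radar = np.array([3.1, 6.2, 10.5, 2.5, 8.0, 5.0])   # collocated radar estimates (mm)
    me, rmse, corr = loo_verification(radar, gauge)
    print(f"ME={me:.2f} mm, RMSE={rmse:.2f} mm, r={corr:.3f}")
    ```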

  20. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and −20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  1. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1983-09-06

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions. 6 figs.

  2. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1981-02-25

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules or ions.

  3. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, Edward F.; Keller, Richard A.; Apel, Charles T.

    1983-01-01

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions.

  4. Quantitative Methods in the Study of Drug Use.

    ERIC Educational Resources Information Center

    Huba, George J.

    1983-01-01

    Addresses the use of data sources and multivariate data analysis methods for modeling naturally occurring data. Two studies in this special section deal with longitudinal perspectives on drug use and emotional adjustment and an application of log-linear model to examine the stress-buffering function of alcohol. (LLL)

  5. Quantitative Decomposition of Dynamics of Mathematical Cell Models: Method and Application to Ventricular Myocyte Models.

    PubMed

    Shimayoshi, Takao; Cha, Chae Young; Amano, Akira

    2015-01-01

    Mathematical cell models are effective tools to understand cellular physiological functions precisely. For detailed analysis of model dynamics in order to investigate how much each component affects cellular behaviour, mathematical approaches are essential. This article presents a numerical analysis technique, which is applicable to any complicated cell model formulated as a system of ordinary differential equations, to quantitatively evaluate contributions of respective model components to the model dynamics in the intact situation. The present technique employs a novel mathematical index for decomposed dynamics with respect to each differential variable, along with a concept named instantaneous equilibrium point, which represents the trend of a model variable at some instant. This article also illustrates applications of the method to comprehensive myocardial cell models to gain insights into the mechanisms of action potential generation and the calcium transient. The analysis results exhibit quantitative contributions of individual channel gating mechanisms and ion exchanger activities to membrane repolarization and of calcium fluxes and buffers to the rise and decay of the cytosolic calcium level. These analyses quantitatively explicate the principles of the model, leading to a better understanding of cellular dynamics. PMID:26091413
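
    A minimal sketch of the underlying idea, decomposing dV/dt of a toy two-current membrane model into per-component contributions and locating the instantaneous equilibrium point (the voltage at which dV/dt would vanish with the gate frozen); this simplified model is a stand-in for the comprehensive ventricular myocyte models analysed in the paper, and all parameter values are invented.

    ```python
    import numpy as np

    # Toy membrane: capacitance, a delayed-rectifier-like K+ current and a leak current.
    CM, GK, EK, GLEAK, ELEAK = 1.0, 0.8, -90.0, 0.1, -60.0

    def currents(v, n):
        return {"I_K": GK * n * (v - EK), "I_leak": GLEAK * (v - ELEAK)}

    def dn_dt(v, n):
        n_inf = 1.0 / (1.0 + np.exp(-(v + 30.0) / 10.0))    # toy steady-state activation
        return (n_inf - n) / 50.0                            # toy time constant (ms)

    v, n, dt = -20.0, 0.5, 0.01
    for step in range(5001):                                 # ~50 ms of explicit Euler integration
        comps = currents(v, n)
        contributions = {k: -i / CM for k, i in comps.items()}  # each term's share of dV/dt
        # Instantaneous equilibrium point: V* where dV/dt = 0 with the gate frozen at n.
        v_star = (GK * n * EK + GLEAK * ELEAK) / (GK * n + GLEAK)
        if step % 1000 == 0:
            terms = "  ".join(f"{k}:{c:+.3f}" for k, c in contributions.items())
            print(f"t={step * dt:5.1f} ms  V={v:7.2f}  V*={v_star:7.2f}  {terms}")
        n += dt * dn_dt(v, n)
        v += dt * sum(contributions.values())
    ```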

  6. Quantitative Decomposition of Dynamics of Mathematical Cell Models: Method and Application to Ventricular Myocyte Models

    PubMed Central

    Shimayoshi, Takao; Cha, Chae Young; Amano, Akira

    2015-01-01

    Mathematical cell models are effective tools to understand cellular physiological functions precisely. For detailed analysis of model dynamics in order to investigate how much each component affects cellular behaviour, mathematical approaches are essential. This article presents a numerical analysis technique, which is applicable to any complicated cell model formulated as a system of ordinary differential equations, to quantitatively evaluate contributions of respective model components to the model dynamics in the intact situation. The present technique employs a novel mathematical index for decomposed dynamics with respect to each differential variable, along with a concept named instantaneous equilibrium point, which represents the trend of a model variable at some instant. This article also illustrates applications of the method to comprehensive myocardial cell models to gain insights into the mechanisms of action potential generation and the calcium transient. The analysis results exhibit quantitative contributions of individual channel gating mechanisms and ion exchanger activities to membrane repolarization and of calcium fluxes and buffers to the rise and decay of the cytosolic calcium level. These analyses quantitatively explicate the principles of the model, leading to a better understanding of cellular dynamics. PMID:26091413

  7. Limitations of the ferrozine method for quantitative assay of mineral systems for ferrous and total iron

    NASA Astrophysics Data System (ADS)

    Anastácio, Alexandre S.; Harris, Brittany; Yoo, Hae-In; Fabris, José Domingos; Stucki, Joseph W.

    2008-10-01

    The quantitative assay of clay minerals, soils, and sediments for Fe(II) and total Fe is fundamental to understanding biogeochemical cycles occurring therein. The commonly used ferrozine method was originally designed to assay extracted forms of Fe(II) from non-silicate aqueous systems. It is becoming, however, increasingly the method of choice to report the total reduced state of Fe in soils and sediments. Because Fe in soils and sediments commonly exists in the structural framework of silicates, extraction by HCl, as used in the ferrozine method, fails to dissolve all of the Fe. The phenanthroline (phen) method, on the other hand, was designed to assay silicate minerals for Fe(II) and total Fe and has been proven to be highly reliable. In the present study potential sources of error in the ferrozine method were evaluated by comparing its results to those obtained by the phen method. Both methods were used to analyze clay mineral and soil samples for Fe(II) and total Fe. Results revealed that the conventional ferrozine method under-reports total Fe in samples containing Fe in silicates and gives erratic results for Fe(II). The sources of error in the ferrozine method are: (1) HCl fails to dissolve silicates and (2) if the analyte solution contains Fe³⁺, the analysis for Fe²⁺ will be photosensitive, and reported Fe(II) values will likely be greater than the actual amount in solution. Another difficulty with the ferrozine method is that it is tedious and much more labor intensive than the phen method. For these reasons, the phen method is preferred and recommended. Its procedure is simpler, takes less time, and avoids the errors found in the ferrozine method.

  8. Method Specific Calibration Corrects for DNA Extraction Method Effects on Relative Telomere Length Measurements by Quantitative PCR

    PubMed Central

    Holland, Rebecca; Underwood, Sarah; Fairlie, Jennifer; Psifidi, Androniki; Ilska, Joanna J.; Bagnall, Ainsley; Whitelaw, Bruce; Coffey, Mike; Banos, Georgios; Nussey, Daniel H.

    2016-01-01

    Telomere length (TL) is increasingly being used as a biomarker in epidemiological, biomedical and ecological studies. A wide range of DNA extraction techniques have been used in telomere experiments and recent quantitative PCR (qPCR) based studies suggest that the choice of DNA extraction method may influence average relative TL (RTL) measurements. Such extraction method effects may limit the use of historically collected DNA samples extracted with different methods. However, if extraction method effects are systematic an extraction method specific (MS) calibrator might be able to correct for them, because systematic effects would influence the calibrator sample in the same way as all other samples. In the present study we tested whether leukocyte RTL in blood samples from Holstein Friesian cattle and Soay sheep measured by qPCR was influenced by DNA extraction method and whether MS calibration could account for any observed differences. We compared two silica membrane-based DNA extraction kits and a salting out method. All extraction methods were optimized to yield enough high quality DNA for TL measurement. In both species we found that silica membrane-based DNA extraction methods produced shorter RTL measurements than the non-membrane-based method when calibrated against an identical calibrator. However, these differences were not statistically detectable when a MS calibrator was used to calculate RTL. This approach produced RTL measurements that were highly correlated across extraction methods (r > 0.76) and had coefficients of variation lower than 10% across plates of identical samples extracted by different methods. Our results are consistent with previous findings that popular membrane-based DNA extraction methods may lead to shorter RTL measurements than non-membrane-based methods. However, we also demonstrate that these differences can be accounted for by using an extraction method-specific calibrator, offering researchers a simple means of accounting for extraction method effects.
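
    A minimal sketch of relative telomere length calculated against an extraction-method-specific calibrator, assuming equal amplification efficiencies of 2 for the telomere and single-copy-gene assays; the Cq values are invented.

    ```python
    def relative_telomere_length(telo_cq, scg_cq, calib_telo_cq, calib_scg_cq, efficiency=2.0):
        """
        RTL as the telomere/single-copy-gene ratio of a sample divided by the same ratio
        in a calibrator that was extracted with the same DNA extraction method.
        """
        telomere_ratio = efficiency ** (calib_telo_cq - telo_cq)
        scg_ratio = efficiency ** (calib_scg_cq - scg_cq)
        return telomere_ratio / scg_ratio

    # Hypothetical sample and method-specific calibrator measured on the same plate.
    print(relative_telomere_length(telo_cq=14.8, scg_cq=21.3,
                                   calib_telo_cq=15.2, calib_scg_cq=21.0))
    ```

    Because the calibrator is extracted with the same method as the samples it normalises, any systematic extraction effect appears in both numerator and denominator and largely cancels, which is the logic the abstract describes.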

  9. Quantitative trait locus gene mapping: a new method for locating alcohol response genes.

    PubMed

    Crabbe, J C

    1996-01-01

    Alcoholism is a multigenic trait with important non-genetic determinants. Studies with genetic animal models of susceptibility to several of alcohol's effects suggest that several genes contributing modest effects on susceptibility (Quantitative Trait Loci, or QTLs) are important. A new technique of QTL gene mapping has allowed the identification of the location in mouse genome of several such QTLs. The method is described, and the locations of QTLs affecting the acute alcohol withdrawal reaction are described as an example of the method. Verification of these QTLs in ancillary studies is described and the strengths, limitations, and future directions to be pursued are discussed. QTL mapping is a promising method for identifying genes in rodents with the hope of directly extrapolating the results to the human genome. This review is based on a paper presented at the First International Congress of the Latin American Society for Biomedical Research on Alcoholism, Santiago, Chile, November 1994. PMID:12893462

  10. A non-invasive method of qualitative and quantitative measurement of drugs.

    PubMed

    Westerman, S T; Gilbert, L M

    1981-09-01

    Methods for quick qualitative and quantitative evaluation of drug intake are needed, especially during emergency situations such as drug overdose and alcohol intoxication. The electronystagmograph was used in an attempt to develop a non-invasive method for identification of drug intake, and to study the effects of alcohol and other drugs on the vestibular system. Results of the study reveal that alcohol, diazepam, opiates, barbiturates, cocaine, marijuana, and hallucinogenic drugs produce a characteristic printout pattern which can be evaluated qualitatively. This method is a practical, non-invasive, objective procedure that provides rapid assessment of the quality of drug intake. Its potential uses are extensive, including such possibilities as evaluation of drug intake in emergency drug overdose situations, monitoring anesthesia during surgery, evaluating drug intake in women about to deliver (as well as the effects on the newborn), and determining whether or not persons who are being tested on a polygraph are under the influence of drugs.

  11. [Study on quantitative methods of cleistocalycis operculati cortex].

    PubMed

    Chen, Li-Si; Ou, Jia-Ju; Li, Shu-Yuan; Lu, Song-Gui

    2014-08-01

    Cleistocalycis Operculati Cortex is the dried bark of Cleistocalyx operculatus and is a raw material of Compound Hibiscuse, an externally applied disinfectant and antipruritic preparation. The quality standard for Cleistocalycis Operculati Cortex in the Guangdong Province "Standard for Traditional Chinese Medicine" (second volume) contains only TLC identification and therefore cannot effectively monitor and control the quality of the drug. A reversed-phase HPLC method was established for the determination of 3,3'-O-dimethylellagic acid in Cleistocalycis Operculati Cortex, and the content was calculated by the external standard method for the first time. Under the selected chromatographic conditions, the target component was effectively separated from neighbouring peaks. The 3,3'-O-dimethylellagic acid standard solution showed good linearity over the concentration range of 1.00-25.0 mg·L⁻¹, with a standard curve of Y = 77.33X + 7.904 (r = 0.9995). The average recovery was 101.0% with an RSD of 1.3%. The HPLC method for the determination of 3,3'-O-dimethylellagic acid in Cleistocalycis Operculati Cortex is accurate and reliable and can provide strong technical support for monitoring the quality of this drug. PMID:25509300
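
    A minimal sketch of the external-standard arithmetic implied by the reported calibration curve Y = 77.33X + 7.904, taken here to relate peak area (Y) to concentration (X, mg·L⁻¹), which is an assumption; the sample peak area, extract volume, and sample mass are invented.

    ```python
    SLOPE, INTERCEPT = 77.33, 7.904   # reported calibration line: Y = 77.33*X + 7.904

    def content_mg_per_g(peak_area, extract_volume_ml, sample_mass_g, dilution_factor=1.0):
        """Concentration from the calibration line, converted to mg of analyte per g of crude drug."""
        conc_mg_per_l = (peak_area - INTERCEPT) / SLOPE
        return conc_mg_per_l * dilution_factor * (extract_volume_ml / 1000.0) / sample_mass_g

    # Hypothetical injection: peak area 850, from a 25 mL extract of 0.5 g of bark.
    print(f"{content_mg_per_g(850.0, 25.0, 0.5):.3f} mg/g")
    ```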

  12. Qualitative and quantitative determination of ubiquinones by the method of high-efficiency liquid chromatography

    SciTech Connect

    Yanotovskii, M.T.; Mogilevskaya, M.P.; Obol'nikova, E.A.; Kogan, L.M.; Samokhvalov, G.I.

    1986-07-10

    A method has been developed for the qualitative and quantitative determination of ubiquinones CoQ₆-CoQ₁₀, using high-efficiency reversed-phase liquid chromatography. Tocopherol acetate was used as the internal standard.

  13. Compatibility of Qualitative and Quantitative Methods: Studying Child Sexual Abuse in America.

    ERIC Educational Resources Information Center

    Phelan, Patricia

    1987-01-01

    Illustrates how the combined use of qualitative and quantitative methods were necessary in obtaining a clearer understanding of the process of incest in American society. Argues that the exclusive use of one methodology would have obscured important information. (FMW)

  14. A bead-based method for multiplexed identification and quantitation of DNA sequences using flow cytometry.

    PubMed

    Spiro, A; Lowe, M; Brown, D

    2000-10-01

    A new multiplexed, bead-based method which utilizes nucleic acid hybridizations on the surface of microscopic polystyrene spheres to identify specific sequences in heterogeneous mixtures of DNA sequences is described. The method consists of three elements: beads (5.6-µm diameter) with oligomer capture probes attached to the surface, three fluorophores for multiplexed detection, and flow cytometry instrumentation. Two fluorophores are impregnated within each bead in varying amounts to create different bead types, each associated with a unique probe. The third fluorophore is a reporter. Following capture of fluorescent cDNA sequences from environmental samples, the beads are analyzed by flow cytometric techniques which yield a signal intensity for each capture probe proportional to the amount of target sequences in the analyte. In this study, a direct hybrid capture assay was developed and evaluated with regard to sequence discrimination and quantitation of abundances. The target sequences (628 to 728 bp in length) were obtained from the 16S/23S intergenic spacer region of microorganisms collected from polluted groundwater at the nuclear waste site in Hanford, Wash. A fluorescence standard consisting of beads with a known number of fluorescent DNA molecules on the surface was developed, and the resolution, sensitivity, and lower detection limit for measuring abundances were determined. The results were compared with those of a DNA microarray using the same sequences. The bead method exhibited far superior sequence discrimination and possesses features which facilitate accurate quantitation. PMID:11010868

  15. Composition and Quantitation of Microalgal Lipids by ERETIC 1H NMR Method

    PubMed Central

    Nuzzo, Genoveffa; Gallo, Carmela; d’Ippolito, Giuliana; Cutignano, Adele; Sardo, Angela; Fontana, Angelo

    2013-01-01

    Accurate characterization of biomass constituents is a crucial aspect of research in the biotechnological application of natural products. Here we report an efficient, fast and reproducible method for the identification and quantitation of fatty acids and complex lipids (triacylglycerols, glycolipids, phospholipids) in microalgae under investigation for the development of functional health products (probiotics, food ingredients, drugs, etc.) or third generation biofuels. The procedure consists of extraction of the biological matrix by modified Folch method and direct analysis of the resulting material by proton nuclear magnetic resonance (1H NMR). The protocol uses a reference electronic signal as external standard (ERETIC method) and allows assessment of total lipid content, saturation degree and class distribution in both high throughput screening of algal collection and metabolic analysis during genetic or culturing studies. As proof of concept, the methodology was applied to the analysis of three microalgal species (Thalassiosira weissflogii, Cyclotella cryptica and Nannochloropsis salina) which drastically differ for the qualitative and quantitative composition of their fatty acid-based lipids. PMID:24084790

  16. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    SciTech Connect

    Gray, Jeffrey F.; Puri, Ashok

    2007-06-15

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hanchen effect based upon the real world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hanchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near the critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green's function is defined which can be used to determine the effective transfer function of the near-critical-angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hanchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10⁻⁶, comparable with the recent results reported in the literature.

  17. A bead-based method for multiplexed identification and quantitation of DNA sequences using flow cytometry.

    PubMed

    Spiro, A; Lowe, M; Brown, D

    2000-10-01

    A new multiplexed, bead-based method which utilizes nucleic acid hybridizations on the surface of microscopic polystyrene spheres to identify specific sequences in heterogeneous mixtures of DNA sequences is described. The method consists of three elements: beads (5.6-µm diameter) with oligomer capture probes attached to the surface, three fluorophores for multiplexed detection, and flow cytometry instrumentation. Two fluorophores are impregnated within each bead in varying amounts to create different bead types, each associated with a unique probe. The third fluorophore is a reporter. Following capture of fluorescent cDNA sequences from environmental samples, the beads are analyzed by flow cytometric techniques which yield a signal intensity for each capture probe proportional to the amount of target sequences in the analyte. In this study, a direct hybrid capture assay was developed and evaluated with regard to sequence discrimination and quantitation of abundances. The target sequences (628 to 728 bp in length) were obtained from the 16S/23S intergenic spacer region of microorganisms collected from polluted groundwater at the nuclear waste site in Hanford, Wash. A fluorescence standard consisting of beads with a known number of fluorescent DNA molecules on the surface was developed, and the resolution, sensitivity, and lower detection limit for measuring abundances were determined. The results were compared with those of a DNA microarray using the same sequences. The bead method exhibited far superior sequence discrimination and possesses features which facilitate accurate quantitation.

  18. A Combinatorial Partitioning Method to Identify Multilocus Genotypic Partitions That Predict Quantitative Trait Variation

    PubMed Central

    Nelson, M.R.; Kardia, S.L.R.; Ferrell, R.E.; Sing, C.F.

    2001-01-01

    Recent advances in genome research have accelerated the process of locating candidate genes and the variable sites within them and have simplified the task of genotype measurement. The development of statistical and computational strategies to utilize information on hundreds, soon thousands, of variable loci to investigate the relationships between genome variation and phenotypic variation has not kept pace, particularly for quantitative traits that do not follow simple Mendelian patterns of inheritance. We present here the combinatorial partitioning method (CPM) that examines multiple genes, each containing multiple variable loci, to identify partitions of multilocus genotypes that predict interindividual variation in quantitative trait levels. We illustrate this method with an application to plasma triglyceride levels collected on 188 males, ages 20–60 yr, ascertained without regard to health status, from Rochester, Minnesota. Genotype information included measurements at 18 diallelic loci in six coronary heart disease–candidate susceptibility gene regions: APOA1-C3-A4, APOB, APOE, LDLR, LPL, and PON1. To illustrate the CPM, we evaluated all possible partitions of two-locus genotypes into two to nine partitions (~10⁶ evaluations). We found that many combinations of loci are involved in sets of genotypic partitions that predict triglyceride variability and that the most predictive sets show nonadditivity. These results suggest that traditional methods of building multilocus models that rely on statistically significant marginal, single-locus effects, may fail to identify combinations of loci that best predict trait variability. The CPM offers a strategy for exploring the high-dimensional genotype state space so as to predict the quantitative trait variation in the population at large that does not require the conditioning of the analysis on a prespecified genetic model. PMID:11230170
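
    A minimal sketch of the combinatorial search at the heart of the CPM, restricted for brevity to two-set partitions of the two-locus genotype classes and scored by the fraction of trait variance explained (between-set sum of squares over total); the genotypes and triglyceride-like trait values are simulated, not the Rochester data.

    ```python
    import itertools
    import numpy as np

    def variance_explained(labels, trait):
        """Between-partition sum of squares divided by the total sum of squares."""
        trait = np.asarray(trait, float)
        grand = trait.mean()
        ss_total = ((trait - grand) ** 2).sum()
        ss_between = sum(len(trait[labels == g]) * (trait[labels == g].mean() - grand) ** 2
                         for g in np.unique(labels))
        return ss_between / ss_total

    def best_two_set_partition(geno_a, geno_b, trait):
        """Exhaustively assign each observed two-locus genotype class to one of two sets."""
        classes = sorted(set(zip(geno_a, geno_b)))
        cell = np.array([classes.index(c) for c in zip(geno_a, geno_b)])
        best_score, best_mask = 0.0, None
        for mask in itertools.product([0, 1], repeat=len(classes)):
            if len(set(mask)) < 2:               # skip the trivial single-set partition
                continue
            score = variance_explained(np.array([mask[c] for c in cell]), trait)
            if score > best_score:
                best_score, best_mask = score, mask
        return best_score, best_mask

    rng = np.random.default_rng(2)
    geno_a, geno_b = rng.integers(0, 3, 200), rng.integers(0, 3, 200)   # genotypes coded 0/1/2
    trait = 1.2 + 0.4 * (geno_a == 2) * (geno_b == 0) + rng.normal(0.0, 0.3, 200)
    print(best_two_set_partition(geno_a, geno_b, trait))
    ```

    The full CPM additionally searches over partitions into more than two sets and over many locus pairs, and validates candidate partitions; this sketch covers only the core enumeration-and-scoring step.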

  19. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer's Disease: Results from the DIAN Study Group.

    PubMed

    Su, Yi; Blazey, Tyler M; Owen, Christopher J; Christensen, Jon J; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C; Ances, Beau M; Snyder, Abraham Z; Cash, Lisa A; Koeppe, Robert A; Klunk, William E; Galasko, Douglas; Brickman, Adam M; McDade, Eric; Ringman, John M; Thompson, Paul M; Saykin, Andrew J; Ghetti, Bernardino; Sperling, Reisa A; Johnson, Keith A; Salloway, Stephen P; Schofield, Peter R; Masters, Colin L; Villemagne, Victor L; Fox, Nick C; Förster, Stefan; Chen, Kewei; Reiman, Eric M; Xiong, Chengjie; Marcus, Daniel S; Weiner, Michael W; Morris, John C; Bateman, Randall J; Benzinger, Tammie L S

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer's Network (DIAN), an autosomal dominant Alzheimer's disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brainstem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although it may add cost and time. Several technical variations in amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer's disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted.
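
    For context, relative tracer retention in amyloid PET is commonly summarized as a standardized uptake value ratio (SUVR): regional uptake divided by uptake in a reference region assumed to lack specific binding. The sketch below shows that generic calculation only; it is not the DIAN processing pipeline, and the region names and numbers are illustrative.

```python
def suvr(regional_uptake, reference="cerebellar_cortex"):
    """Standardized uptake value ratios: each region's mean tracer uptake divided
    by the mean uptake of a reference region assumed to lack specific binding."""
    ref = regional_uptake[reference]
    return {region: value / ref for region, value in regional_uptake.items()}

# Illustrative regional mean PiB uptake values (arbitrary units, not study data)
uptake = {
    "precuneus": 1.9,
    "prefrontal_cortex": 1.7,
    "cerebellar_cortex": 1.0,
    "brainstem": 1.1,
}
print(suvr(uptake, reference="brainstem"))
```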

  20. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer’s Disease: Results from the DIAN Study Group

    PubMed Central

    Su, Yi; Blazey, Tyler M.; Owen, Christopher J.; Christensen, Jon J.; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C.; Ances, Beau M.; Snyder, Abraham Z.; Cash, Lisa A.; Koeppe, Robert A.; Klunk, William E.; Galasko, Douglas; Brickman, Adam M.; McDade, Eric; Ringman, John M.; Thompson, Paul M.; Saykin, Andrew J.; Ghetti, Bernardino; Sperling, Reisa A.; Johnson, Keith A.; Salloway, Stephen P.; Schofield, Peter R.; Masters, Colin L.; Villemagne, Victor L.; Fox, Nick C.; Förster, Stefan; Chen, Kewei; Reiman, Eric M.; Xiong, Chengjie; Marcus, Daniel S.; Weiner, Michael W.; Morris, John C.; Bateman, Randall J.; Benzinger, Tammie L. S.

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer’s Network (DIAN), an autosomal dominant Alzheimer’s disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brainstem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although it may add cost and time. Several technical variations in amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer’s disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959

  1. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer's Disease: Results from the DIAN Study Group.

    PubMed

    Su, Yi; Blazey, Tyler M; Owen, Christopher J; Christensen, Jon J; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C; Ances, Beau M; Snyder, Abraham Z; Cash, Lisa A; Koeppe, Robert A; Klunk, William E; Galasko, Douglas; Brickman, Adam M; McDade, Eric; Ringman, John M; Thompson, Paul M; Saykin, Andrew J; Ghetti, Bernardino; Sperling, Reisa A; Johnson, Keith A; Salloway, Stephen P; Schofield, Peter R; Masters, Colin L; Villemagne, Victor L; Fox, Nick C; Förster, Stefan; Chen, Kewei; Reiman, Eric M; Xiong, Chengjie; Marcus, Daniel S; Weiner, Michael W; Morris, John C; Bateman, Randall J; Benzinger, Tammie L S

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer's Network (DIAN), an autosomal dominant Alzheimer's disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brainstem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although it may add cost and time. Several technical variations in amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer's disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959

  2. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    DOE PAGES

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary; Geller, Jil; Fisher, Susan; Hall, Steven; Hazen, Terry C.; et al

    2016-04-20

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.
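
    A common way to turn fraction-level abundance data into candidate PPIs is to correlate the copurification profiles of protein pairs across chromatography fractions. The Python sketch below illustrates that general idea with invented data and a hypothetical threshold; it is not the authors' analysis pipeline.

```python
import pandas as pd

def copurification_candidates(profiles: pd.DataFrame, min_r: float = 0.9):
    """Return protein pairs whose abundance profiles across chromatographic
    fractions have a Pearson correlation of at least min_r.

    profiles: rows = proteins, columns = fractions,
              values = relative abundances (e.g., iTRAQ reporter-ion intensities).
    """
    corr = profiles.T.corr()          # protein-by-protein correlation matrix
    proteins = list(corr.index)
    pairs = []
    for i, a in enumerate(proteins):
        for b in proteins[i + 1:]:
            r = float(corr.loc[a, b])
            if r >= min_r:
                pairs.append((a, b, round(r, 3)))
    return pairs

# Toy data: three proteins measured across six fractions (invented values)
profiles = pd.DataFrame(
    [[1, 5, 20, 18, 4, 1],
     [2, 6, 22, 17, 5, 1],
     [9, 8, 2, 1, 7, 10]],
    index=["ProtA", "ProtB", "ProtC"],
)
print(copurification_candidates(profiles, min_r=0.9))
```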

  3. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions.

    PubMed

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary E; Geller, Jil T; Fisher, Susan J; Hall, Steven C; Hazen, Terry C; Brenner, Steven E; Butland, Gareth; Jin, Jian; Witkowska, H Ewa; Chandonia, John-Marc; Biggin, Mark D

    2016-06-01

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR. PMID:27099342

  4. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    SciTech Connect

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary; Geller, Jil; Fisher, Susan; Hall, Steven; Hazen, Terry C; Brenner, Steven; Butland, Gareth; Jin, Jian; Witkowska, H. Ewa; Chandonia, John-Marc; Biggin, Mark D.

    2016-01-01

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.

  5. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions*

    PubMed Central

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary E.; Geller, Jil T.; Fisher, Susan J.; Hall, Steven C.; Hazen, Terry C.; Brenner, Steven E.; Butland, Gareth; Jin, Jian; Witkowska, H. Ewa; Chandonia, John-Marc; Biggin, Mark D.

    2016-01-01

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR. PMID:27099342

  6. Quantitative Trait Locus Mapping Methods for Diversity Outbred Mice

    PubMed Central

    Gatti, Daniel M.; Svenson, Karen L.; Shabalin, Andrey; Wu, Long-Yang; Valdar, William; Simecek, Petr; Goodwin, Neal; Cheng, Riyan; Pomp, Daniel; Palmer, Abraham; Chesler, Elissa J.; Broman, Karl W.; Churchill, Gary A.

    2014-01-01

    Genetic mapping studies in the mouse and other model organisms are used to search for genes underlying complex phenotypes. Traditional genetic mapping studies that employ single-generation crosses have poor mapping resolution and limit discovery to loci that are polymorphic between the two parental strains. Multiparent outbreeding populations address these shortcomings by increasing the density of recombination events and introducing allelic variants from multiple founder strains. However, multiparent crosses present new analytical challenges and require specialized software to take full advantage of these benefits. Each animal in an outbreeding population is genetically unique and must be genotyped using a high-density marker set; regression models for mapping must accommodate multiple founder alleles, and complex breeding designs give rise to polygenic covariance among related animals that must be accounted for in mapping analysis. The Diversity Outbred (DO) mice combine the genetic diversity of eight founder strains in a multigenerational breeding design that has been maintained for >16 generations. The large population size and randomized mating ensure the long-term genetic stability of this population. We present a complete analytical pipeline for genetic mapping in DO mice, including algorithms for probabilistic reconstruction of founder haplotypes from genotyping array intensity data, and mapping methods that accommodate multiple founder haplotypes and account for relatedness among animals. Power analysis suggests that studies with as few as 200 DO mice can detect loci with large effects, but loci that account for <5% of trait variance may require a sample size of up to 1000 animals. The methods described here are implemented in the freely available R package DOQTL. PMID:25237114
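
    DOQTL itself is an R package; purely to illustrate the kind of per-locus model such a scan fits, the Python sketch below regresses a phenotype on founder-haplotype dosages and reports a LOD score. It omits the haplotype reconstruction and kinship/polygenic correction that DOQTL performs, and all data are simulated.

```python
import numpy as np

def lod_score(founder_probs, phenotype):
    """LOD score comparing a linear model with founder-haplotype dosages at one
    locus against an intercept-only model (no kinship correction).

    founder_probs: (n_mice, 8) expected founder contributions at the locus
    phenotype    : (n_mice,) quantitative trait values
    """
    y = np.asarray(phenotype, dtype=float)
    n = y.size
    # Drop one founder column: the eight dosages sum to a constant, so keeping
    # all of them alongside an intercept would be collinear.
    X = np.column_stack([np.ones(n), np.asarray(founder_probs)[:, 1:]])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss_full = ((y - X @ beta) ** 2).sum()
    rss_null = ((y - y.mean()) ** 2).sum()
    return (n / 2.0) * np.log10(rss_null / rss_full)

rng = np.random.default_rng(1)
probs = rng.dirichlet(np.ones(8), size=200)               # fake founder probabilities
y = 2.0 * probs[:, 0] + rng.normal(scale=1.0, size=200)   # trait driven by founder 0
print(round(lod_score(probs, y), 2))
```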

  7. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran.

    PubMed

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2015-09-22

    Social changes have rapidly removed arranged marriages, and it seems the change in marriage pattern has played a role in childbearing. On the other hand, there is a great reduction in population in many countries, which requires a comprehensive policy to manage the considerable drop in population. To achieve this goal, initially, the factors affecting fertility must be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women aged 15-49 years, living in the north of Iran, were studied using a cluster sampling strategy. The results showed that there are no significant differences in the reproductive behaviors of the three patterns of marriage in Babol city, Iran. It seems there is a convergence in childbearing across the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning.

  8. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran

    PubMed Central

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2016-01-01

    Social changes have rapidly removed arranged marriages, and it seems the change in marriage pattern has played a role in childbearing. On the other hand, there is a great reduction in population in many countries, which requires a comprehensive policy to manage the considerable drop in population. To achieve this goal, initially, the factors affecting fertility must be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women aged 15-49 years, living in the north of Iran, were studied using a cluster sampling strategy. The results showed that there are no significant differences in the reproductive behaviors of the three patterns of marriage in Babol city, Iran. It seems there is a convergence in childbearing across the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning. PMID:26493414

  9. Quantitative assessment of breast lesion viscoelasticity: initial clinical results using supersonic shear imaging.

    PubMed

    Tanter, Mickael; Bercoff, Jeremy; Athanasiou, Alexandra; Deffieux, Thomas; Gennisson, Jean-Luc; Montaldo, Gabriel; Muller, Marie; Tardivon, Anne; Fink, Mathias

    2008-09-01

    This paper presents an initial clinical evaluation of in vivo elastography for breast lesion imaging using the concept of supersonic shear imaging. This technique is based on the combination of a radiation force induced in tissue by an ultrasonic beam and an ultrafast imaging sequence capable of catching in real time the propagation of the resulting shear waves. The local shear wave velocity is recovered using a time-of-flight technique and enables the 2-D mapping of shear elasticity. This imaging modality is implemented on a conventional linear probe driven by a dedicated ultrafast echographic device. Consequently, it can be performed during a standard echographic examination. The clinical investigation was performed on 15 patients, which corresponded to 15 lesions (4 cases BI-RADS 3, 7 cases BI-RADS 4 and 4 cases BI-RADS 5). The ability of the supersonic shear imaging technique to provide a quantitative and local estimation of the shear modulus of abnormalities with a millimetric resolution is illustrated on several malignant (invasive ductal and lobular carcinoma) and benign cases (fibrocystic changes and viscous cysts). In the investigated cases, malignant lesions were found to be significantly different from benign solid lesions with respect to their elasticity values. Cystic lesions showed no shear wave propagation at all in the lesion (because shear waves do not propagate in liquids). These preliminary clinical results directly demonstrate the clinical feasibility of this new elastography technique in providing quantitative assessment of relative stiffness of breast tissues. This technique of evaluating tissue elasticity gives valuable information that is complementary to the B-mode morphologic information. More extensive studies are necessary to validate the assumption that this new mode potentially helps the physician in both false-positive and false-negative rejection.
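
    For context, the quantitative elasticity values reported by shear wave methods follow directly from the measured shear wave speed: assuming locally homogeneous, isotropic, nearly incompressible tissue with density ρ ≈ 1000 kg/m³, the shear modulus is μ = ρc² and Young's modulus is approximately E ≈ 3μ. A tiny sketch:

```python
def shear_modulus_kpa(shear_speed_m_s, density_kg_m3=1000.0):
    """Shear modulus mu = rho * c^2, returned in kPa."""
    return density_kg_m3 * shear_speed_m_s ** 2 / 1000.0

def youngs_modulus_kpa(shear_speed_m_s, density_kg_m3=1000.0):
    """E ~ 3 * mu for nearly incompressible soft tissue."""
    return 3.0 * shear_modulus_kpa(shear_speed_m_s, density_kg_m3)

# A shear wave travelling at 3 m/s implies mu ~ 9 kPa and E ~ 27 kPa
print(shear_modulus_kpa(3.0), youngs_modulus_kpa(3.0))
```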

  10. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282
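
    The statistical comparison described (a paired t-test between the two quantification methods applied to the same samples) takes only a few lines; the values below are placeholders, not the study's data.

```python
from scipy import stats

# gamma-oryzanol content (e.g., mg/g) of the same samples by the two methods (placeholder values)
densitometric  = [2.31, 2.45, 2.52, 2.38, 2.60, 2.44]
image_analysis = [2.29, 2.48, 2.50, 2.41, 2.57, 2.46]

t_stat, p_value = stats.ttest_rel(densitometric, image_analysis)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")  # p > 0.05 -> no significant difference
```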

  11. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

    Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of subsequent landslide susceptibility maps while a particular emphasis was placed on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km²) which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation technique. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated by different inventories, classifiers and predictors differed in appearance, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful to expose spatially varying inconsistencies of the modelling results while additionally providing evidence for slightly overfitted machine learning-based models. However, the highest predictive performances were obtained for
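
    A minimal illustration of the validation strategy, AUROC estimated under spatial cross-validation, is sketched below using scikit-learn's GroupKFold with spatial block IDs as the grouping variable. The data, block assignment, and classifier are synthetic stand-ins, not the models or predictors used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(42)
n = 1000
X = rng.normal(size=(n, 3))                   # stand-ins for terrain predictors
noise = rng.normal(scale=1.0, size=n)
y = (X[:, 0] + 0.5 * X[:, 1] + noise > 0.5).astype(int)   # landslide presence/absence
blocks = rng.integers(0, 10, size=n)          # spatial block IDs used as CV groups

aucs = []
for train, test in GroupKFold(n_splits=5).split(X, y, groups=blocks):
    model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
    aucs.append(roc_auc_score(y[test], model.predict_proba(X[test])[:, 1]))

print(f"spatial CV AUROC: {np.mean(aucs):.3f} +/- {np.std(aucs):.3f}")
```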

  12. Automatic segmentation of cell nuclei in Feulgen-stained histological sections of prostate cancer and quantitative evaluation of segmentation results.

    PubMed

    Nielsen, Birgitte; Albregtsen, Fritz; Danielsen, Håvard E

    2012-07-01

    Digital image analysis of cell nuclei is useful to obtain quantitative information for the diagnosis and prognosis of cancer. However, the lack of a reliable automatic nuclear segmentation is a limiting factor for high-throughput nuclear image analysis. We have developed a method for automatic segmentation of nuclei in Feulgen-stained histological sections of prostate cancer. A local adaptive thresholding with an object perimeter gradient verification step detected the nuclei and was combined with an active contour model that featured an optimized initialization and worked within a restricted region to improve convergence of the segmentation of each nucleus. The method was tested on 30 randomly selected image frames from three cases, comparing the results from the automatic algorithm to a manual delineation of 924 nuclei. The automatic method segmented a few more nuclei compared to the manual method, and about 73% of the manually segmented nuclei were also segmented by the automatic method. For each nucleus segmented both manually and automatically, the accuracy (i.e., agreement with manual delineation) was estimated. The mean segmentation sensitivity/specificity were 95%/96%. The results from the automatic method were not significantly different from the ground truth provided by manual segmentation. This opens the possibility for large-scale nuclear analysis based on automatic segmentation of nuclei in Feulgen-stained histological sections.
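
    The first stage of such a pipeline, local adaptive thresholding followed by labelling of candidate nuclei, can be sketched with scikit-image as below. This is a simplified stand-in only: the published method additionally verifies object perimeter gradients and refines each nucleus with an active contour model.

```python
import numpy as np
from skimage import filters, measure, morphology

def candidate_nuclei(image: np.ndarray, block_size: int = 51, min_area: int = 100):
    """Detect dark, Feulgen-stained nuclei with a local adaptive threshold.

    image: 2-D grayscale array in which nuclei are darker than the background.
    Returns the labelled mask and per-object region properties.
    """
    local_thresh = filters.threshold_local(image, block_size=block_size)
    mask = image < local_thresh                            # darker than local background
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    labels = measure.label(mask)
    return labels, measure.regionprops(labels)

# Hypothetical usage: labels, props = candidate_nuclei(grayscale_image)
```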

  13. Quantitative mineralogical composition of complex mineral wastes--contribution of the Rietveld method.

    PubMed

    Mahieux, P-Y; Aubert, J-E; Cyr, M; Coutand, M; Husson, B

    2010-03-01

    The objective of the work presented in this paper is the quantitative determination of the mineral composition of two complex mineral wastes: a sewage sludge ash (SSA) and a municipal solid waste incineration fly ash (MSWIFA). The mineral compositions were determined by two different methods: the first based on calculation using the qualitative mineralogical composition of the waste combined with physicochemical analyses; the second the Rietveld method, which uses only X-ray diffraction patterns. The results obtained are coherent, showing that it is possible to quantify the mineral compositions of complex mineral waste with such methods. The apparent simplicity of the Rietveld method (due principally to the availability of software packages implementing the method) facilitates its use. However, care should be taken since the crystal structure analysis based on powder diffraction data needs experience and a thorough understanding of crystallography. So the use of another, complementary, method such as the first one used in this study, may sometimes be needed to confirm the results. PMID:19948395
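
    As background on the quantitative step: once a Rietveld refinement yields a scale factor S for each crystalline phase, weight fractions follow from the Hill and Howard relation W_p = S_p(ZMV)_p / Σ_i S_i(ZMV)_i, where Z is the number of formula units per unit cell, M the formula mass, and V the unit-cell volume. A small sketch with placeholder values:

```python
def rietveld_weight_fractions(phases):
    """Weight fractions from refined Rietveld scale factors (Hill & Howard relation).

    phases: dict mapping phase name -> (scale_factor, Z, M, V).
    """
    szmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
    total = sum(szmv.values())
    return {name: value / total for name, value in szmv.items()}

# Placeholder refined values for two hypothetical phases
phases = {
    "quartz":    (1.2e-4, 3, 60.08, 113.0),
    "anhydrite": (0.8e-4, 4, 136.14, 305.0),
}
print(rietveld_weight_fractions(phases))
```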

  14. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages

    PubMed Central

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction yet preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (by up to 2-fold) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and “naked” unicellular organisms indicates that our method could obtain the intact DNA of organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance. PMID:26218575

  15. Quantitative mineralogical composition of complex mineral wastes - Contribution of the Rietveld method

    SciTech Connect

    Mahieux, P.-Y.; Aubert, J.-E.; Cyr, M.; Coutand, M.; Husson, B.

    2010-03-15

    The objective of the work presented in this paper is the quantitative determination of the mineral composition of two complex mineral wastes: a sewage sludge ash (SSA) and a municipal solid waste incineration fly ash (MSWIFA). The mineral compositions were determined by two different methods: the first based on calculation using the qualitative mineralogical composition of the waste combined with physicochemical analyses; the second the Rietveld method, which uses only X-ray diffraction patterns. The results obtained are coherent, showing that it is possible to quantify the mineral compositions of complex mineral waste with such methods. The apparent simplicity of the Rietveld method (due principally to the availability of software packages implementing the method) facilitates its use. However, care should be taken since the crystal structure analysis based on powder diffraction data needs experience and a thorough understanding of crystallography. So the use of another, complementary, method such as the first one used in this study, may sometimes be needed to confirm the results.

  16. A Comparative Study on Tobacco Cessation Methods: A Quantitative Systematic Review

    PubMed Central

    Heydari, Gholamreza; Masjedi, Mohammadreza; Ahmady, Arezoo Ebn; Leischow, Scott J.; Lando, Harry A.; Shadmehr, Mohammad Behgam; Fadaizadeh, Lida

    2014-01-01

    Background: During recent years, there have been many advances in different types of pharmacological and non-pharmacological tobacco control treatments. In this study, we aimed to identify the most effective smoking cessation methods used to quit, based upon a review of the literature. Methods: We did a search of PubMed, limited to English publications from 2000 to 2012. Two trained reviewers independently assessed titles, abstracts and full texts of articles after a pilot inter-rater reliability assessment which was conducted by the author (GH). The total number of papers and their conclusions, either recommending that method (positive) or not supporting it (negative), was computed for each method. The number of negative papers was subtracted from the number of positive ones for each method. In cases of inconsistency between the two reviewers, these were adjudicated by the author. Results: Of the 932 articles that were critically assessed, 780 studies supported quit smoking methods. In 90 studies, the methods were neither supported nor rejected, and in 62 cases the methods were not supported. Nicotine replacement therapy (NRT), Champix and Zyban, with 352, 117 and 71 studies respectively, were the most supported methods, and e-cigarettes and non-nicotine medications, with one case each, were the least supported methods. Finally, NRT with a net score of 39, and Champix and education each with a score of 36, were the most supported methods. Conclusions: Results of this review indicate that the scientific papers in the most recent decade recommend the use of NRT and Champix in combination with educational interventions. Additional research is needed to compare qualitative and quantitative studies for smoking cessation. PMID:25013685

  17. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    PubMed

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process
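
    Once the questionnaire maps are digitized, the quantitative core of this kind of participatory mapping is the aggregation of individual priority scores onto planning grid cells, which standard spreadsheet or pandas operations can handle. The sketch below uses hypothetical column names and values; it is not the SAPM workbook itself.

```python
import pandas as pd

# Each row: one fisher's priority score allocated to one planning grid cell (hypothetical)
responses = pd.DataFrame({
    "fisher_id": [1, 1, 2, 2, 3],
    "grid_cell": ["A1", "B2", "A1", "C3", "B2"],
    "priority":  [60, 40, 80, 20, 100],   # e.g., points allocated out of 100 per fisher
})

# Total and mean access priority per cell, plus how many fishers value each cell
summary = (responses
           .groupby("grid_cell")["priority"]
           .agg(total_priority="sum", mean_priority="mean", n_fishers="count")
           .sort_values("total_priority", ascending=False))
print(summary)
```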

  18. Quantitative assessment of trabecular bone micro-architecture of the wrist via 7 Tesla MRI: preliminary results

    PubMed Central

    Wang, Ligong; Liang, Guoyuan; Babb, James S.; Wiggins, Graham C.; Saha, Punam K.; Regatte, Ravinder R.

    2013-01-01

    Object The goal of this study was to determine the feasibility of performing quantitative 7T magnetic resonance imaging (MRI) assessment of trabecular bone micro-architecture of the wrist, a common fracture site. Materials and methods The wrists of 4 healthy subjects (1 woman, 3 men, 28 ± 8.9 years) were scanned on a 7T whole body MR scanner using a 3D fast low-angle shot (FLASH) sequence (TR/TE = 20/4.5 ms, 0.169 × 0.169 × 0.5 mm). Trabecular bone was segmented and divided into 4 or 8 angular subregions. Total bone volume (TBV), bone volume fraction (BVF), surface-curve ratio (SC), and erosion index (EI) were computed. Subjects were scanned twice to assess measurement reproducibility. Results Group mean subregional values for TBV, BVF, SC, and EI (8 subregion analysis) were as follows: 8489 ± 3686, 0.27 ± 0.045, 9.61 ± 6.52, and 1.43 ± 1.25. Within each individual, there was subregional variation in TBV, SC, and EI (>5%), but not BVF (<5%). Intersubject variation (≥12%) existed for all parameters. Within-subject coefficients of variation were ≤10%. Conclusion This is the first study to perform quantitative 7T MRI assessment of trabecular bone micro-architecture of the wrist. This method could be utilized to study perturbations in bone structure in subjects with osteoporosis or other bone disorders. PMID:21544680

  19. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    PubMed

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  20. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  1. Proteus mirabilis biofilm - Qualitative and quantitative colorimetric methods-based evaluation

    PubMed Central

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    Proteus mirabilis strains' ability to form biofilm is a current topic of a number of research studies worldwide. In this study, the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total number of 39 P. mirabilis strains isolated from the urine samples of the patients of dr Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 was used. Biofilm formation was evaluated using two independent quantitative and qualitative methods with TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet) application. The obtained results confirmed biofilm formation by all the examined strains, except for the quantitative method with TTC, in which 7.7% of the strains did not have this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of the catheterized and non-catheterized patients were not statistically significant. PMID:25763050

  2. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method.

    PubMed

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high-energy fast neutrons from a radioisotope and their slowing down by light hydrogen atoms, is a useful technique for non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume, and is not affected by temperature, pressure, pH value and color. The most common choice for a radioisotope neutron source is ²⁵²Cf or ²⁴¹Am-Be. In this study, ²⁵²Cf with a neutron flux of 6.3×10⁶ n/s has been used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra have been obtained by using an in-house-built radioisotopic neutron spectrometric system equipped with a ³He detector and a multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of approximately 0.947 g/cc and area of 40 cm × 25 cm) was used for the determination of hydrogen content by using multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered a better performance in a quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system can be promising in the real-time, online monitoring of the powder process to determine the content of any type of molecules containing hydrogen nuclei.
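
    The multivariate calibration step described (PLS regression of hydrogen content, represented here by block thickness, on pulse-height spectra) maps directly onto standard tooling. The sketch below uses scikit-learn with synthetic spectra; it is not the authors' data or code.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_channels = 40, 256
thickness = rng.uniform(1, 10, size=n_samples)   # cm of polyethylene, a proxy for hydrogen content
signature = rng.normal(size=n_channels)          # fake spectral response per unit thickness
spectra = np.outer(thickness, signature) + rng.normal(scale=0.5, size=(n_samples, n_channels))

pls = PLSRegression(n_components=3).fit(spectra, thickness)
print(f"calibration R^2: {pls.score(spectra, thickness):.3f}")
print(pls.predict(spectra[:3]).ravel(), thickness[:3])   # predicted vs. true for 3 samples
```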

  3. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    PubMed

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    Proteus mirabilis strains' ability to form biofilm is a current topic of a number of research studies worldwide. In this study, the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total number of 39 P. mirabilis strains isolated from the urine samples of the patients of dr Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 was used. Biofilm formation was evaluated using two independent quantitative and qualitative methods with TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet) application. The obtained results confirmed biofilm formation by all the examined strains, except for the quantitative method with TTC, in which 7.7% of the strains did not have this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of the catheterized and non-catheterized patients were not statistically significant.

  4. "Per cell" normalization method for mRNA measurement by quantitative PCR and microarrays

    PubMed Central

    Kanno, Jun; Aisaki, Ken-ichi; Igarashi, Katsuhide; Nakatsu, Noriyuki; Ono, Atsushi; Kodama, Yukio; Nagao, Taku

    2006-01-01

    Background Transcriptome data from quantitative PCR (Q-PCR) and DNA microarrays are typically obtained from a fixed amount of RNA collected per sample. Therefore, variations in tissue cellularity and RNA yield across samples in an experimental series compromise accurate determination of the absolute level of each mRNA species per cell in any sample. Since mRNAs are copied from genomic DNA, the simplest way to express mRNA level would be as copy number per template DNA, or more practically, as copy number per cell. Results Here we report a method (designated the "Percellome" method) for normalizing the expression of mRNA values in biological samples. It provides a "per cell" readout in mRNA copy number and is applicable to both quantitative PCR (Q-PCR) and DNA microarray studies. The genomic DNA content of each sample homogenate was measured from a small aliquot to derive the number of cells in the sample. A cocktail of five external spike RNAs admixed in a dose-graded manner (dose-graded spike cocktail; GSC) was prepared and added to each homogenate in proportion to its DNA content. In this way, the spike mRNAs represented absolute copy numbers per cell in the sample. The signals from the five spike mRNAs were used as a dose-response standard curve for each sample, enabling us to convert all the signals measured to copy numbers per cell in an expression profile-independent manner. A series of samples was measured by Q-PCR and Affymetrix GeneChip microarrays using this Percellome method, and the results showed up to 90 % concordance. Conclusion Percellome data can be compared directly among samples and among different studies, and between different platforms, without further normalization. Therefore, "percellome" normalization can serve as a standard method for exchanging and comparing data across different platforms and among different laboratories. PMID:16571132
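
    A schematic of the normalization arithmetic: fit the dose-graded spike signals against their known copies per cell and use that per-sample curve to convert every probe signal into copies per cell. The sketch below assumes a log-log linear fit and invented spike values; the authors' exact curve form and spike doses may differ.

```python
import numpy as np

def percellome_normalize(spike_copies_per_cell, spike_signals, probe_signals):
    """Convert raw probe signals to mRNA copies per cell using a per-sample
    standard curve built from dose-graded spike RNAs (log-log linear fit here)."""
    slope, intercept = np.polyfit(np.log10(spike_copies_per_cell),
                                  np.log10(spike_signals), deg=1)
    # Invert the curve: copies = 10 ** ((log10(signal) - intercept) / slope)
    return 10 ** ((np.log10(np.asarray(probe_signals, dtype=float)) - intercept) / slope)

spike_copies = np.array([0.5, 2.5, 12.5, 62.5, 312.5])   # invented copies-per-cell doses
spike_signal = np.array([30, 160, 800, 3900, 20000])     # invented measured signals
print(percellome_normalize(spike_copies, spike_signal, [500, 5000]))
```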

  5. An Improved Flow Cytometry Method For Precise Quantitation Of Natural-Killer Cell Activity

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Nehlsen-Cannarella, Sandra; Sams, Clarence

    2006-01-01

    The ability to assess NK cell cytotoxicity using flow cytometry has been previously described and can serve as a powerful tool to evaluate effector immune function in the clinical setting. Previous methods used membrane-permeable dyes to identify target cells. The use of these dyes requires great care to achieve optimal staining and results in a broad spectral emission that can make multicolor cytometry difficult. Previous methods have also used negative staining (the elimination of target cells) to identify effector cells. This makes a precise quantitation of effector NK cells impossible due to the interfering presence of T and B lymphocytes, and makes the data highly subject to the variable levels of NK cells normally found in human peripheral blood. In this study, an improved version of the standard flow cytometry assay for NK activity is described that has several advantages over previous methods. Fluorescent antibody staining (CD45FITC) is used to positively identify target cells in place of membrane-permeable dyes. Fluorescent antibody staining of target cells is less labor intensive and more easily reproducible than membrane dyes. NK cells (true effector lymphocytes) are also positively identified by fluorescent antibody staining (CD56PE), allowing a simultaneous absolute count assessment of both NK cells and target cells. Dead cells are identified by membrane disruption using the DNA intercalating dye PI. Using this method, an exact NK:target ratio may be determined for each assessment, including quantitation of NK-target complexes. Back-immunoscatter gating may be used to track live vs. dead target cells via scatter properties. If desired, NK activity may then be normalized to standardized ratios for clinical comparisons between patients, making the determination of PBMC counts or NK cell percentages prior to testing unnecessary. This method provides an exact cytometric determination of NK activity that is highly reproducible and may be suitable for routine use in the
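
    One common way to express such flow-cytometric data is percent specific lysis corrected for spontaneous target-cell death. The formula below is the generic correction, not necessarily the exact readout used in this assay, and the numbers are illustrative.

```python
def percent_specific_lysis(dead_targets_test_pct, dead_targets_spontaneous_pct):
    """Specific lysis corrected for spontaneous target death:
    (test - spontaneous) / (100 - spontaneous) * 100."""
    return ((dead_targets_test_pct - dead_targets_spontaneous_pct)
            / (100.0 - dead_targets_spontaneous_pct) * 100.0)

# e.g., 38% PI-positive targets with effectors vs. 8% in the target-only control
print(round(percent_specific_lysis(38.0, 8.0), 1))   # -> 32.6
```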

  6. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006.

    PubMed

    Chen, Lin; Ray, Shonket; Keller, Brad M; Pertuz, Said; McDonald, Elizabeth S; Conant, Emily F; Kontos, Despina

    2016-09-01

    Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88-0.95; weighted κ = 0.83-0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76-0.92, P < .05). Conclusion Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density estimation
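
    The agreement statistic mentioned, Bland-Altman limits of agreement between density estimates at two dose levels, reduces to the mean difference ± 1.96 standard deviations of the paired differences. A small sketch with placeholder values:

```python
import numpy as np

def bland_altman_limits(a, b):
    """Mean difference (bias) and 95% limits of agreement for paired measurements."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Percent density from standard-dose vs. low-dose acquisitions (made-up values)
standard = [22.1, 35.4, 48.0, 15.2, 60.3]
low_dose = [21.5, 36.0, 47.1, 15.9, 59.0]
print(bland_altman_limits(standard, low_dose))
```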

  7. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006.

    PubMed

    Chen, Lin; Ray, Shonket; Keller, Brad M; Pertuz, Said; McDonald, Elizabeth S; Conant, Emily F; Kontos, Despina

    2016-09-01

    Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88-0.95; weighted κ = 0.83-0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76-0.92, P < .05). Conclusion Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density estimation

  8. Methods Used by Pre-Service Nigeria Certificate in Education Teachers in Solving Quantitative Problems in Chemistry

    ERIC Educational Resources Information Center

    Danjuma, Ibrahim Mohammed

    2011-01-01

    This paper reports part of the results of research on chemical problem solving behavior of pre-service teachers in Plateau and Northeastern states of Nigeria. Specifically, it examines and describes the methods used by 204 pre-service teachers in solving quantitative problems from four topics in chemistry. Namely, gas laws; electrolysis;…

  9. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy dispersed X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using Friedman test followed by Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded similar results with those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating similar results with those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results with both the SEM analysis and elemental mapping.
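
    The non-parametric comparison described (a Friedman test followed by pairwise Wilcoxon signed-rank tests with Bonferroni correction) is straightforward with SciPy; the scores below are fabricated placeholders, not the study's data.

```python
from scipy import stats

# ARI-style scores for the same 20 brackets under three assessment techniques (fabricated)
visual    = [2, 3, 1, 2, 3, 2, 1, 3, 2, 2, 3, 1, 2, 2, 3, 1, 2, 3, 2, 1]
sem_based = [2, 3, 2, 2, 3, 2, 2, 3, 2, 3, 3, 1, 2, 3, 3, 1, 2, 3, 2, 2]
elemental = [3, 3, 2, 3, 3, 2, 2, 3, 2, 3, 3, 2, 2, 3, 3, 2, 2, 3, 3, 2]

print(stats.friedmanchisquare(visual, sem_based, elemental))

alpha = 0.05 / 3   # Bonferroni correction for three pairwise comparisons
for label, other in [("visual vs SEM", sem_based), ("visual vs elemental", elemental)]:
    stat, p = stats.wilcoxon(visual, other)
    print(label, f"p = {p:.4f}", "significant" if p < alpha else "not significant")
```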

  10. Development of a new quantitative gas permeability method for dental implant-abutment connection tightness assessment

    PubMed Central

    2011-01-01

    Background Most dental implant systems are presently made of two pieces: the implant itself and the abutment. The connection tightness between these two pieces is a key point to prevent bacterial proliferation, tissue inflammation and bone loss. Leakage has previously been estimated by microbial, color tracer and endotoxin percolation. Methods A new nitrogen flow technique was developed for implant-abutment connection leakage measurement, adapted from a recent, sensitive, reproducible and quantitative method used to assess endodontic sealing. Results The results show very significant differences between various sealing and screwing conditions. The remaining flow was lower after key screwing compared to hand screwing (p = 0.03) and remained different from the negative test (p = 0.0004). The method reproducibility was very good, with a coefficient of variation of 1.29%. Conclusions The new gas flow method presented here therefore appears to be a simple and robust way to compare different implant systems. It allows successive measurements without disconnecting the abutment from the implant and should in particular be used to assess the behavior of the connection before and after mechanical stress. PMID:21492459

  11. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods.

    PubMed

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-15

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).
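
    As an illustration of the PLS step described above, the sketch below builds a toy calibration on synthetic two-component UV spectra with scikit-learn; the spectra, concentrations, and the choice of four latent variables are assumptions, not the study's data or settings.

```python
# Minimal sketch of PLS calibration on UV spectra (synthetic data): X holds
# absorbance spectra, conc holds paracetamol/tramadol concentrations.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 30, 200
conc = rng.uniform(0.1, 1.0, size=(n_samples, 2))        # [paracetamol, tramadol]
pure = rng.random((2, n_wavelengths))                     # hypothetical pure-component spectra
X = conc @ pure + rng.normal(0, 0.01, (n_samples, n_wavelengths))  # additive mixing + noise

pls = PLSRegression(n_components=4)
pred = cross_val_predict(pls, X, conc, cv=5)              # cross-validated predictions

recovery = 100 * pred.mean(axis=0) / conc.mean(axis=0)
print("mean recovery (%):", np.round(recovery, 1))
```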

  12. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods

    NASA Astrophysics Data System (ADS)

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-01

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).

  13. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    In order to develop image processing methods that are widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron- and nano-scale information which is characteristic of each rock type and its history. These could be used for mineral identification and studies of rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.

  14. Nanoparticle-mediated photothermal effect enables a new method for quantitative biochemical analysis using a thermometer.

    PubMed

    Fu, Guanglei; Sanjay, Sharma T; Dou, Maowei; Li, XiuJun

    2016-03-14

    A new biomolecular quantitation method, a nanoparticle-mediated photothermal bioassay that uses a common thermometer as the signal reader, was developed. Using an immunoassay as a proof of concept, iron oxide nanoparticles (NPs) captured in the sandwich-type assay system were transformed into a near-infrared (NIR) laser-driven photothermal agent, Prussian blue (PB) NPs, which acted as a photothermal probe to convert the assay signal into heat through the photothermal effect, thus allowing sensitive biomolecular quantitation using a thermometer. This is the first report of biomolecular quantitation using a thermometer and also serves as the first attempt to introduce the nanoparticle-mediated photothermal effect into bioassays.

  15. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to be 30-50 target molecules, which is close to the theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 ± 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples relative to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials.
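
    The sketch below only reproduces the arithmetic linking copy-number LOQ, DNA input, and genome size; the 1C genome masses are rough literature values and the resulting percentages will not exactly match the published 0.02-0.7% range, but the order of magnitude does.

```python
# Back-of-the-envelope check of the relative LOQ, assuming ~40 quantifiable target
# copies and 200 ng of genomic DNA per PCR.  The 1C genome masses are approximate
# literature values used only to illustrate the arithmetic.
AVG_LOQ_COPIES = 40            # between the 30-50 copies reported above
DNA_INPUT_NG = 200.0

genome_mass_pg = {"rice": 0.5, "maize": 2.7, "wheat": 17.0}   # approximate 1C values, pg

for plant, mass_pg in genome_mass_pg.items():
    total_copies = DNA_INPUT_NG * 1000.0 / mass_pg            # ng -> pg, then genome copies
    rel_loq_percent = 100.0 * AVG_LOQ_COPIES / total_copies
    print(f"{plant}: ~{total_copies:.0f} genome copies, relative LOQ ~{rel_loq_percent:.3f}%")
```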

  16. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  17. Quantitative imaging mass spectrometry of renal sulfatides: validation by classical mass spectrometric methods

    PubMed Central

    Marsching, Christian; Jennemann, Richard; Heilig, Raphael; Gröne, Hermann-Josef; Hopf, Carsten; Sandhoff, Roger

    2014-01-01

    Owing to its capability of discriminating subtle mass-altering structural differences such as double bonds or elongated acyl chains, MALDI-based imaging MS (IMS) has emerged as a powerful technique for analysis of lipid distribution in tissue at moderate spatial resolution of about 50 μm. However, it is still unknown if MS1-signals and ion intensity images correlate with the corresponding apparent lipid concentrations. Analyzing renal sulfated glycosphingolipids, sulfatides, we validate for the first time IMS-signal identities using corresponding sulfatide-deficient kidneys. To evaluate the extent of signal quenching effects interfering with lipid quantification, we surgically dissected the three major renal regions (papillae, medulla, and cortex) and systematically compared MALDI IMS of renal sulfatides with quantitative analyses of corresponding lipid extracts by on-target MALDI TOF-MS and by ultra-performance LC-ESI-(triple-quadrupole)tandem MS. Our results demonstrate a generally strong correlation (R2 > 0.9) between the local relative sulfatide signal intensity in MALDI IMS and absolute sulfatide quantities determined by the other two methods. However, high concentrations of sulfatides in the papillae and medulla result in an up to 4-fold signal suppression. In conclusion, our study suggests that MALDI IMS is useful for semi-quantitative dissection of relative local changes of sulfatides and possibly other lipids in tissue. PMID:25274613

  18. Assessment and application of quantitative schlieren methods: Calibrated color schlieren and background oriented schlieren

    NASA Astrophysics Data System (ADS)

    Elsinga, G. E.; van Oudheusden, B. W.; Scarano, F.; Watt, D. W.

    Two quantitative schlieren methods are assessed and compared: calibrated color schlieren (CCS) and background oriented schlieren (BOS). Both methods are capable of measuring the light deflection angle in two spatial directions, and hence the projected density gradient vector field. Spatial integration using the conjugate gradient method returns the projected density field. To assess the performance of CCS and BOS, density measurements of a two-dimensional benchmark flow (a Prandtl-Meyer expansion fan) are compared with the theoretical density field and with the density inferred from PIV velocity measurements. The performance of the methods is also evaluated a priori from a ray-tracing simulation of the experiment. The density measurements show good agreement with theory. Moreover, CCS and BOS return comparable results with respect to each other and with respect to the PIV measurements. BOS proves to be very sensitive to displacements of the wind tunnel during the experiment and requires a correction for this, making it necessary to apply extra boundary conditions in the integration procedure. Furthermore, spatial resolution can be a limiting factor for accurate measurements using BOS. CCS suffers from relatively high noise in the density gradient measurement due to camera noise and has a smaller dynamic range than BOS. Finally, the application of the two schlieren methods to a separated wake flow is demonstrated. Flow features such as shear layers and expansion and recompression waves are measured with both methods.

  19. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, qualitative and quantitative risk assessment methods for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fire ball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. Both methods can be applied in practice, and the choice between them depends on the basic data actually available for the gas pipelines and the precision requirements of the risk assessment.
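
    A minimal sketch of the qualitative index idea, with hypothetical sub-index scores and weights (the paper's actual index structure and weighting are not reproduced here):

```python
# Combine normalized sub-indices into a single qualitative risk value using
# assumed weights; all numbers are illustrative, not the paper's values.
causation, inherent, consequence = 0.6, 0.4, 0.7
weights = {"causation": 0.3, "inherent": 0.3, "consequence": 0.4}

risk_value = (weights["causation"] * causation
              + weights["inherent"] * inherent
              + weights["consequence"] * consequence)
print(f"qualitative risk value: {risk_value:.2f}")
```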

  20. Defining the knee joint flexion-extension axis for purposes of quantitative gait analysis: an evaluation of methods.

    PubMed

    Schache, Anthony G; Baker, Richard; Lamoreux, Larry W

    2006-08-01

    Minimising measurement variability associated with hip axial rotation and avoiding knee joint angle cross-talk are two fundamental objectives of any method used to define the knee joint flexion-extension axis for purposes of quantitative gait analysis. The aim of this experiment was to compare three different methods of defining this axis: the knee alignment device (KAD) method, a method based on the transepicondylar axis (TEA) and an alternative numerical method (Dynamic). The former two methods are common approaches that have been applied clinically in many quantitative gait analysis laboratories; the latter is an optimisation procedure. A cohort of 20 subjects performed three different functional tasks (normal gait; squat; non-weight bearing knee flexion) on repeated occasions. Three-dimensional hip and knee angles were computed using the three alternative methods of defining the knee joint flexion-extension axis. The repeatability of hip axial rotation measurements during normal gait was found to be significantly better for the Dynamic method (p<0.01). Furthermore, both the variance in the knee varus-valgus kinematic profile and the degree of knee joint angle cross-talk were smallest for the Dynamic method across all functional tasks. The Dynamic method therefore provided superior results in comparison to the KAD and TEA-based methods and thus represents an attractive solution for orientating the knee joint flexion-extension axis for purposes of quantitative gait analysis.

  1. The use of electromagnetic induction methods for establishing quantitative permafrost models in West Greenland

    NASA Astrophysics Data System (ADS)

    Ingeman-Nielsen, Thomas; Brandt, Inooraq

    2010-05-01

    The sedimentary settings at West Greenlandic town and infrastructural development sites are dominated by fine-grained marine deposits of late to post glacial origin. Prior to permafrost formation, these materials were leached by percolating precipitation, resulting in depletion of salts. Present day permafrost in these deposits is therefore very ice-rich, with ice contents approaching 50-70% vol. in some areas. Such formations are of great concern in building and construction projects in Greenland, as they lose strength and bearing capacity upon thaw. It is therefore of both technical and economic interest to develop methods to precisely investigate and determine parameters such as ice content and depth to bedrock in these areas. In terms of geophysical methods for near surface investigations, traditional methods such as Electrical Resistivity Tomography (ERT) and Refraction Seismics (RS) have generally been applied with success. The georadar method usually fails due to very limited penetration depth in the fine-grained materials, and Electromagnetic Induction (EMI) methods are seldom applicable for quantitative interpretation due to the very high resistivities causing low induced currents and thus small secondary fields. Nevertheless, in some areas of Greenland the marine sequence was exposed relatively late, and as a result the sediments may not be completely leached of salts. In such cases, layers with pore water salinity approaching that of sea water may be present below an upper layer of very ice rich permafrost. The saline pore water causes a freezing-point depression which results in technically unfrozen sediments at permafrost temperatures around -3 °C. Traditional ERT and VES measurements are severely affected by equivalency problems in these settings, practically prohibiting reasonable quantitative interpretation without constraining information. Such prior information may be obtained of course from boreholes, but equipment capable of drilling

  2. Does contraceptive treatment in wildlife result in side effects? A review of quantitative and anecdotal evidence.

    PubMed

    Gray, Meeghan E; Cameron, Elissa Z

    2010-01-01

    The efficacy of contraceptive treatments has been extensively tested, and several formulations are effective at reducing fertility in a range of species. However, before a contraceptive is used for population manipulation, it should be shown to have minimal impact on the behavior of individuals and populations, and these effects have received less attention. Potential side effects have been identified theoretically, and we reviewed published studies that have investigated side effects on the behavior and physiology of individuals or population-level effects, which provided mixed results. Physiological side effects were most prevalent. Most studies reported a lack of secondary effects, but were usually based on qualitative data or anecdotes. A meta-analysis of quantitative studies of side effects showed that secondary effects consistently occur across all categories and all contraceptive types. This contrasts with the qualitative studies, suggesting that anecdotal reports are insufficient to investigate secondary impacts of contraceptive treatment. We conclude that more research is needed to address fundamental questions about secondary effects of contraceptive treatment and that experiments are essential for drawing firm conclusions. In addition, researchers are missing a vital opportunity to use contraceptives as an experimental tool to test the influence of reproduction, sex and fertility on the behavior of wildlife species.

  3. Application of bias correction methods to improve the accuracy of quantitative radar rainfall in Korea

    NASA Astrophysics Data System (ADS)

    Lee, J.-K.; Kim, J.-H.; Suk, M.-K.

    2015-11-01

    There are many potential sources of bias in the radar rainfall estimation process. This study classified the biases in the rainfall estimation process into the reflectivity measurement bias and the rainfall estimation bias of the Quantitative Precipitation Estimation (QPE) model, and applied bias correction methods to improve the accuracy of the Radar-AWS Rainrate (RAR) calculation system operated by the Korea Meteorological Administration (KMA). To correct the Z bias, i.e., the reflectivity biases that occur when measuring rainfall, this study utilized a bias correction algorithm in which the reflectivity of the target single-pol radars is corrected against a reference dual-pol radar that has itself been corrected for hardware and software biases. The study then dealt with two post-processing methods, the Mean Field Bias Correction (MFBC) method and the Local Gauge Correction (LGC) method, to correct the rainfall estimation bias of the QPE model. The Z bias and rainfall estimation bias correction methods were applied to the RAR system, and the accuracy of the RAR system improved after the Z bias correction. Regarding rainfall types, although the accuracy for the Changma front and local torrential cases improved slightly, the accuracy for the typhoon cases in particular became worse than the existing results obtained without the Z bias correction. As a result of the rainfall estimation bias correction, the Z bias_LGC combination was superior to the MFBC method because the LGC method applies a different rainfall bias to each grid rainfall amount. Across rainfall types, the Z bias_LGC results showed that the rainfall estimates for all types were more accurate than with the Z bias correction alone and, in particular, the outcomes for the typhoon cases were vastly superior to the others.
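
    The following sketch shows only the core arithmetic of a mean field bias correction, i.e., a single gauge-to-radar ratio applied to the whole radar field; all rainfall values are hypothetical and this is not the RAR system's implementation.

```python
# Mean field bias (MFB): one multiplicative factor, the ratio of gauge to radar
# rainfall accumulated over all gauge sites, applied to the whole radar field.
import numpy as np

gauge_mm = np.array([12.0, 8.5, 20.1, 5.3])        # gauge accumulations at 4 sites
radar_mm = np.array([9.0, 7.2, 15.5, 4.1])         # collocated radar estimates

mfb = gauge_mm.sum() / radar_mm.sum()              # single field-wide correction factor
radar_field = np.array([[3.0, 7.5], [12.0, 0.4]])  # hypothetical gridded radar rain
corrected_field = mfb * radar_field                # LGC would instead vary the factor per grid cell

print(f"MFB factor: {mfb:.2f}")
print(corrected_field)
```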

  4. Meta-analysis of results from quantitative trait loci mapping studies on pig chromosome 4.

    PubMed

    Silva, K M; Bastiaansen, J W M; Knol, E F; Merks, J W M; Lopes, P S; Guimarães, S E F; van Arendonk, J A M

    2011-06-01

    Meta-analysis of results from multiple studies could lead to more precise quantitative trait loci (QTL) position estimates compared to the individual experiments. As the raw data from many different studies are not readily available, the use of results from published articles may be helpful. In this study, we performed a meta-analysis of QTL on chromosome 4 in pig, using data from 25 separate experiments. First, a meta-analysis was performed for individual traits: average daily gain and backfat thickness. Second, a meta-analysis was performed for the QTL of three traits affecting loin yield: loin eye area, carcass length and loin meat weight. Third, 78 QTL were selected from 20 traits that could be assigned to one of three broad categories: carcass, fatness or growth traits. For each analysis, the number of identified meta-QTL was smaller than the number of initial QTL. The reduction in the number of QTL ranged from 71% to 86% compared to the total number before the meta-analysis. In addition, the meta-analysis reduced the QTL confidence intervals by as much as 85% compared to individual QTL estimates. The reduction in the confidence interval was greater when a large number of independent QTL was included in the meta-analysis. Meta-QTL related to growth and fatness were found in the same region as the FAT1 region. Results indicate that the meta-analysis is an efficient strategy to estimate the number and refine the positions of QTL when QTL estimates are available from multiple populations and experiments. This strategy can be used to better target further studies such as the selection of candidate genes related to trait variation.

  5. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    SciTech Connect

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
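
    As a rough illustration of the image-analysis step (not the authors' algorithm), the sketch below thresholds a synthetic stained-coupon image with Otsu's method and reports the stained area fraction as a fouling metric.

```python
import numpy as np
from skimage import filters

# Synthetic stand-in for a photographed, stained coupon: bright background with a
# darker, "stained" circular patch plus noise.
rng = np.random.default_rng(6)
gray = 0.8 + 0.02 * rng.standard_normal((256, 256))
yy, xx = np.mgrid[:256, :256]
gray[(yy - 128) ** 2 + (xx - 128) ** 2 < 60 ** 2] -= 0.35   # stained region is darker

thresh = filters.threshold_otsu(gray)     # objective, operator-free threshold
biofilm_mask = gray < thresh              # stained (darker) pixels count as fouling

coverage = 100.0 * biofilm_mask.mean()
print(f"stained area fraction: {coverage:.1f}%")
```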

  6. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGES

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision of the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  7. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    PubMed

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms. PMID:26643074

  8. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    PubMed

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms.

  9. Establishment and assessment of two methods for quantitative detection of serum duck hepatitis B virus DNA

    PubMed Central

    Chen, Ya-Xi; Huang, Ai-Long; Qi, Zhen-Yuan; Guo, Shu-Hua

    2004-01-01

    AIM: To establish and assess methods for quantitative detection of serum duck hepatitis B virus (DHBV) DNA by quantitative membrane hybridization using a DHBV DNA probe labeled directly with alkaline phosphatase and by fluorescence quantitative PCR (qPCR). METHODS: Probes of DHBV DNA labeled directly with alkaline phosphatase and the chemiluminescent substrate CDP-star were used in this assay. DHBV DNA was detected by autoradiography, and the DNA dot blots were then scanned. In addition, three primers derived from the DHBV DNA S gene were designed. The semi-nested primer was labeled by AmpliSensor. A standard curve of the positive standards of DHBV DNA was established after asymmetric preamplification, semi-nested amplification and on-line detection. Results from 100 samples detected separately by the alkaline phosphatase direct-labeled DHBV DNA probe with dot-blot hybridization and by digoxigenin-labeled DHBV DNA probe hybridization were compared. Seventy samples of duck serum were tested by fluorescent qPCR and by the digoxigenin-labeled DHBV DNA probe dot-blot hybridization assay, and the correlation of the results was analysed. RESULTS: The sensitivity of the alkaline phosphatase direct-labeled DHBV DNA probe was 10 pg. The coincidence was 100% compared with the digoxigenin-labeled DHBV DNA probe assay. After 30 cycles, amplification products showed two bands of about 180 bp and 70 bp on 20 g/L agarose gel electrophoresis. The concentration of amplification products was in direct proportion to the initial concentration of positive standards. The detection index was in direct proportion to the quantity of amplification products accumulated in the current cycle. The initial concentration of positive standards was in inverse proportion to the number of cycles needed for sufficient quantities of amplification products. The correlation coefficient between the results of fluorescent qPCR and dot-blot hybridization was 0.97 (P < 0.01). CONCLUSION: Alkaline phosphatase direct-labeled DHBV DNA probe in dot-blot hybridization and fluorescent q

  10. Quantitative measurement of ultrasound pressure field by optical phase contrast method and acoustic holography

    NASA Astrophysics Data System (ADS)

    Oyama, Seiji; Yasuda, Jun; Hanayama, Hiroki; Yoshizawa, Shin; Umemura, Shin-ichiro

    2016-07-01

    A fast and accurate measurement of an ultrasound field under various exposure sequences is necessary to ensure the efficacy and safety of various ultrasound applications in medicine. The most common method used to measure an ultrasound pressure field, hydrophone scanning, requires a long scanning time and potentially disturbs the field. This may limit the efficiency of developing applications of ultrasound. In this study, an optical phase contrast method enabling fast and noninterfering measurements is proposed. In this method, the modulated phase of light caused by the focused ultrasound pressure field is measured. Then, a computed tomography (CT) algorithm is applied to quantitatively reconstruct the three-dimensional (3D) pressure field. For a high-intensity focused ultrasound field, a new approach that combines the optical phase contrast method and acoustic holography was attempted. First, the optical measurement of focused ultrasound was rapidly performed over the field near a transducer. Second, the nonlinear propagation of the measured ultrasound was simulated. The result of the new approach agreed well with the hydrophone measurement and improved on that of the phase contrast method alone with phase unwrapping.
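
    The tomographic step can be illustrated with a generic filtered back-projection on a synthetic phantom, as in the sketch below; the phantom simply stands in for a projected-phase sinogram and none of this reflects the authors' actual processing chain.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

field = rescale(shepp_logan_phantom(), 0.25, anti_aliasing=True)  # stand-in for a 2-D field slice
angles = np.linspace(0.0, 180.0, 90, endpoint=False)

sinogram = radon(field, theta=angles)            # plays the role of measured phase projections
reconstruction = iradon(sinogram, theta=angles)  # filtered back-projection (default ramp filter)

rms_err = np.sqrt(np.mean((reconstruction - field) ** 2))
print(f"RMS reconstruction error: {rms_err:.4f}")
```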

  11. Background estimation methods for quantitative x-ray fluorescence analysis of gold nanoparticles in biomedical applications

    NASA Astrophysics Data System (ADS)

    Ren, Liqiang; Wu, Di; Li, Yuhua; Chen, Wei R.; Liu, Hong

    2014-02-01

    Accurate background estimation to isolate the fluorescence signals is an important issue for quantitative X-ray fluorescence (XRF) analysis of gold nanoparticles (GNPs). Although a good estimate can be obtained experimentally by acquiring the background spectrum of a water solution, this inevitably requires an unnecessary second exposure in practice. Thus, several numerical methods such as trapezoidal shape estimation, interpolation by polynomial fitting and the SNIP (Statistics-sensitive Nonlinear Iterative Peak-clipping) algorithm are proposed to achieve this goal. This paper aims to evaluate the estimates calculated by these numerical methods, in terms of mean squared error (MSE), by comparing them with the estimate acquired experimentally. Four GNP/water solutions with concentrations from 0.0% to 1.0% by weight are prepared. Ten spectra are then acquired for each solution for further analysis, under identical conditions using a pencil-beam x-ray source and a single spectrometer. Finally, the experimental and numerical methods are performed on these spectra within the optimally determined energy window, and their statistical characteristics are analyzed and compared. These numerical background estimation methods, as well as the evaluation methods, can easily be extended to analyze the fluorescence signals of other nanoparticle biomarkers such as gadolinium, platinum and barium in multiple biomedical applications.
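
    One of the numerical methods named above, SNIP, can be sketched in a few lines; the window schedule, iteration count, and synthetic spectrum below are illustrative assumptions rather than the parameters used in the study.

```python
import numpy as np

def snip_background(counts, iterations=24):
    # LLS transform compresses the dynamic range before clipping.
    v = np.log(np.log(np.sqrt(counts + 1.0) + 1.0) + 1.0)
    for m in range(1, iterations + 1):
        clipped = v.copy()
        clipped[m:-m] = np.minimum(v[m:-m], 0.5 * (v[:-2 * m] + v[2 * m:]))
        v = clipped
    # Inverse LLS transform back to counts.
    return (np.exp(np.exp(v) - 1.0) - 1.0) ** 2 - 1.0

# Synthetic spectrum: smooth background plus two Gaussian fluorescence peaks.
x = np.arange(1024)
background = 200.0 * np.exp(-x / 400.0)
peaks = (500.0 * np.exp(-0.5 * ((x - 300) / 8.0) ** 2)
         + 300.0 * np.exp(-0.5 * ((x - 700) / 10.0) ** 2))
spectrum = np.random.default_rng(2).poisson(background + peaks).astype(float)

net_signal = spectrum - snip_background(spectrum)
print(f"estimated net counts around the first peak: {net_signal[280:320].sum():.0f}")
```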

  12. Calibration-curve-free quantitative PCR: a quantitative method for specific nucleic acid sequences without using calibration curves.

    PubMed

    Tani, Hidenori; Kanagawa, Takahiro; Morita, Nao; Kurata, Shinya; Nakamura, Kazunori; Tsuneda, Satoshi; Noda, Naohiro

    2007-10-01

    We have developed a simple quantitative method for specific nucleic acid sequences without using calibration curves. This method is based on the combined use of competitive polymerase chain reaction (PCR) and fluorescence quenching. We amplified a gene of interest (target) from DNA samples and an internal standard (competitor) with a sequence-specific fluorescent probe using PCR and measured the fluorescence intensities before and after PCR. The fluorescence of the probe is quenched on hybridization with the target by guanine bases, whereas the fluorescence is not quenched on hybridization with the competitor. Therefore, quench rate (i.e., fluorescence intensity after PCR divided by fluorescence intensity before PCR) is always proportional to the ratio of the target to the competitor. Consequently, we can calculate the ratio from quench rate without using a calibration curve and then calculate the initial copy number of the target from the ratio and the initial copy number of the competitor. We successfully quantified the copy number of a recombinant DNA of genetically modified (GM) soybean and estimated the GM soybean contents. This method will be particularly useful for rapid field tests of the specific gene contamination in samples.
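
    A minimal sketch of the quantitation arithmetic as stated above (quench rate taken as proportional to the target-to-competitor ratio); the proportionality constant, intensities, and competitor copy number are hypothetical.

```python
# Calibration-curve-free quantitation sketch: the quench rate (fluorescence after
# PCR divided by fluorescence before PCR) is treated as proportional to the
# target:competitor ratio, per the abstract; K is an assumed constant.
F_before = 1000.0                 # fluorescence intensity before PCR (a.u.)
F_after = 620.0                   # fluorescence intensity after PCR (a.u.)
quench_rate = F_after / F_before

K = 0.5                           # assumed proportionality constant of the probe system
competitor_copies = 1.0e4         # known initial competitor copies spiked into the reaction

target_to_competitor = quench_rate / K
target_copies = target_to_competitor * competitor_copies
print(f"estimated target copies: {target_copies:.2e}")
```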

  13. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    ERIC Educational Resources Information Center

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…

  14. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  15. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    ERIC Educational Resources Information Center

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a particular…

  16. Developing Investigative Entry Points: Exploring the Use of Quantitative Methods in English Education Research

    ERIC Educational Resources Information Center

    McGraner, Kristin L.; Robbins, Daniel

    2010-01-01

    Although many research questions in English education demand the use of qualitative methods, this paper will briefly explore how English education researchers and doctoral students may use statistics and quantitative methods to inform, complement, and/or deepen their inquiries. First, the authors will provide a general overview of the survey areas…

  17. Student Performance in a Quantitative Methods Course under Online and Face-to-Face Delivery

    ERIC Educational Resources Information Center

    Verhoeven, Penny; Wakeling, Victor

    2011-01-01

    In a study conducted at a large public university, the authors assessed, for an upper-division quantitative methods business core course, the impact of delivery method (online versus face-to-face) on the success rate (percentage of enrolled students earning a grade of A, B, or C in the course). The success rate of the 161 online students was 55.3%,…

  18. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses.

    PubMed

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated to the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by direct detection and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425

  19. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses

    PubMed Central

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated to the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by direct detection and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425

  20. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Dorssaeler, Alain Van; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-03-01

    This data article describes a controlled, spiked proteomic dataset for which the "ground truth" of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tool parameters, but also for testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in detail in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tools results for the detection of variant proteins with different absolute expression levels and fold change values.

  1. An efficient quantitation method of next-generation sequencing libraries by using MiSeq sequencer.

    PubMed

    Katsuoka, Fumiki; Yokozawa, Junji; Tsuda, Kaoru; Ito, Shin; Pan, Xiaoqing; Nagasaki, Masao; Yasuda, Jun; Yamamoto, Masayuki

    2014-12-01

    Library quantitation is a critical step to obtain high data output in Illumina HiSeq sequencers. Here, we introduce a library quantitation method that uses the Illumina MiSeq sequencer designated as quantitative MiSeq (qMiSeq). In this procedure, 96 dual-index libraries, including control samples, are denatured, pooled in equal volume, and sequenced by MiSeq. We found that relative concentration of each library can be determined based on the observed index ratio and can be used to determine HiSeq run condition for each library. Thus, qMiSeq provides an efficient way to quantitate a large number of libraries at a time.
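
    The sketch below illustrates the underlying idea with hypothetical index read counts: observed index fractions are treated as relative library concentrations and converted into re-pooling factors. It is not the authors' pipeline.

```python
# Relative library concentration from MiSeq index read counts, and the re-pooling
# factor that would equalize each library in the next (HiSeq) pool.
index_reads = {"lib01": 120_000, "lib02": 60_000, "lib03": 240_000}   # hypothetical counts

total = sum(index_reads.values())
rel_conc = {lib: n / total for lib, n in index_reads.items()}         # observed index fractions

target_fraction = 1.0 / len(index_reads)
pool_factor = {lib: target_fraction / f for lib, f in rel_conc.items()}

for lib in index_reads:
    print(f"{lib}: fraction {rel_conc[lib]:.2f}, re-pooling volume x{pool_factor[lib]:.2f}")
```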

  2. Quantitative fluctuation electron microscopy in the STEM: methods to identify, avoid, and correct for artifacts.

    PubMed

    Li, Tian T; Bogle, Stephanie N; Abelson, John R

    2014-10-01

    Fluctuation electron microscopy can reveal the nanoscale order in amorphous materials via the statistical variance in the scattering intensity as a function of position, scattering vector, and resolution. However, several sources of experimental artifacts can seriously affect the magnitude of the variance peaks. The use of a scanning transmission electron microscope for data collection affords a convenient means to check whether artifacts are present. As nanodiffraction patterns are collected serially, any spatial or temporal dependence of the scattering intensity across the series can easily be detected. We present examples of the major types of artifact and methods to correct the data or to avoid the problem experimentally. We also re-cast the statistical formalism used to identify sources of noise in view of the present results. The present work provides a basis on which to perform fluctuation electron microscopy with a high level of reliability and confidence in the quantitative magnitude of the data.
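
    For context, the fluctuation-microscopy statistic itself reduces to the normalized variance V(k) = <I^2(k)>/<I(k)>^2 - 1 over probe positions; the sketch below computes it on synthetic intensities and is not tied to the artifact-correction procedures discussed above.

```python
# Normalized variance of diffracted intensity over probe positions, one value per
# scattering-vector bin k.  Intensities here are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(3)
n_positions, n_k = 500, 128
I = rng.gamma(shape=4.0, scale=1.0, size=(n_positions, n_k))   # I(position, k)

V = I.var(axis=0) / I.mean(axis=0) ** 2    # identical to <I^2>/<I>^2 - 1 at each k
print("normalized variance, first k bins:", np.round(V[:5], 3))
```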

  3. A Novel Method of Quantitative Anterior Chamber Depth Estimation Using Temporal Perpendicular Digital Photography

    PubMed Central

    Zamir, Ehud; Kong, George Y.X.; Kowalski, Tanya; Coote, Michael; Ang, Ghee Soon

    2016-01-01

    Purpose We hypothesize that: (1) Anterior chamber depth (ACD) is correlated with the relative anteroposterior position of the pupillary image, as viewed from the temporal side. (2) Such a correlation may be used as a simple quantitative tool for estimation of ACD. Methods Two hundred sixty-six phakic eyes had lateral digital photographs taken from the temporal side, perpendicular to the visual axis, and underwent optical biometry (Nidek AL scanner). The relative anteroposterior position of the pupillary image was expressed using the ratio between: (1) lateral photographic temporal limbus to pupil distance (“E”) and (2) lateral photographic temporal limbus to cornea distance (“Z”). In the first chronological half of patients (Correlation Series), E:Z ratio (EZR) was correlated with optical biometric ACD. The correlation equation was then used to predict ACD in the second half of patients (Prediction Series) and compared to their biometric ACD for agreement analysis. Results A strong linear correlation was found between EZR and ACD, R = −0.91, R2 = 0.81. Bland-Altman analysis showed good agreement between predicted ACD using this method and the optical biometric ACD. The mean error was −0.013 mm (range −0.377 to 0.336 mm), standard deviation 0.166 mm. The 95% limits of agreement were ±0.33 mm. Conclusions Lateral digital photography and EZR calculation is a novel method to quantitatively estimate ACD, requiring minimal equipment and training. Translational Relevance EZ ratio may be employed in screening for angle closure glaucoma. It may also be helpful in outpatient medical clinic settings, where doctors need to judge the safety of topical or systemic pupil-dilating medications versus their risk of triggering acute angle closure glaucoma. Similarly, non ophthalmologists may use it to estimate the likelihood of acute angle closure glaucoma in emergency presentations. PMID:27540496
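
    A minimal sketch of the two-stage procedure (fit EZR against biometric ACD in a correlation series, then predict ACD from new EZR values); all numbers are synthetic and the fitted coefficients are not those of the study.

```python
import numpy as np

rng = np.random.default_rng(4)
ezr_train = rng.uniform(0.2, 0.8, 130)                          # correlation-series EZR values
acd_train = 4.2 - 2.5 * ezr_train + rng.normal(0, 0.15, 130)    # synthetic negative correlation

slope, intercept = np.polyfit(ezr_train, acd_train, 1)          # correlation-series fit

ezr_new = np.array([0.30, 0.55, 0.70])                          # prediction-series measurements
acd_pred = slope * ezr_new + intercept
print("predicted ACD (mm):", np.round(acd_pred, 2))
```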

  4. Comparative Application of PLS and PCR Methods to Simultaneous Quantitative Estimation and Simultaneous Dissolution Test of Zidovudine - Lamivudine Tablets.

    PubMed

    Üstündağ, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli

    2015-01-01

    In the development strategies of new drug products and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indication for the quantitative estimation of the efficiency and biopharmaceutical characteristics of drug substances. This forces scientists in the field to develop very powerful analytical methods to obtain more reliable, precise and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were developed for the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly encourage their use for the quality control, routine analysis and dissolution testing of marketed tablets containing the ZID and LAM drugs.

  5. A validated method for the quantitation of 1,1-difluoroethane using a gas in equilibrium method of calibration.

    PubMed

    Avella, Joseph; Lehrer, Michael; Zito, S William

    2008-10-01

    1,1-Difluoroethane (DFE), also known as Freon 152A, is a member of a class of compounds known as halogenated hydrocarbons. A number of these compounds have gained notoriety because of their ability to induce rapid onset of intoxication after inhalation exposure. Abuse of DFE has necessitated the development of methods for its detection and quantitation in postmortem and human performance specimens. Furthermore, methodologies applicable to research studies are required, as there have been limited toxicokinetic and toxicodynamic reports published on DFE. This paper describes a method for the quantitation of DFE using a gas chromatography-flame-ionization headspace technique that employs solventless standards for calibration. Two calibration curves using 0.5 mL whole blood calibrators were developed, ranging from 0.225-1.350 mg/L (curve A) and 9.0-180.0 mg/L (curve B). These were evaluated for linearity (0.9992 and 0.9995), limit of detection (0.018 mg/L), limit of quantitation (0.099 mg/L; recovery 111.9%, CV 9.92%), and upper limit of linearity (27,000.0 mg/L). The combined-curve recovery for a 98.0 mg/L DFE control prepared using an alternate technique was 102.2%, with a CV of 3.09%. No matrix interference was observed in DFE-enriched blood, urine or brain specimens, nor did analysis of variance detect any significant differences (alpha = 0.01) in the area under the curve of blood, urine or brain specimens at three identical DFE concentrations. The method is suitable for use in forensic laboratories because validation was performed on instrumentation routinely used in forensic labs and because of the ease with which the calibration range can be adjusted. Perhaps more importantly, it is also useful for research-oriented studies because the removal of solvent from standard preparation eliminates the possibility of solvent-induced changes to the gas/liquid partitioning of DFE or chromatographic interference due to the presence of solvent in specimens. PMID:19007521

  6. Methods for the Specific Detection and Quantitation of Amyloid-β Oligomers in Cerebrospinal Fluid.

    PubMed

    Schuster, Judith; Funke, Susanne Aileen

    2016-05-01

    Protein misfolding and aggregation are fundamental features of the majority of neurodegenerative diseases, such as Alzheimer's disease (AD), Parkinson's disease, frontotemporal dementia, and prion diseases. Proteinaceous deposits in the brain of the patient, e.g., amyloid plaques consisting of the amyloid-β (Aβ) peptide and tangles composed of tau protein, are the hallmarks of AD. Soluble oligomers of Aβ and tau play a fundamental role in disease progression, and specific detection and quantification of the respective oligomeric proteins in cerebrospinal fluid may provide presymptomatically detectable biomarkers, paving the way for early diagnosis or even prognosis. Several studies on the development of techniques for the specific detection of Aβ oligomers have been published, but some of the existing tools do not yet seem to be satisfactory, and the study results are contradictory. The detection of oligomers is challenging due to their polymorphous and unstable nature, their low concentration, and the presence of competing proteins and Aβ monomers in body fluids. Here, we present an overview of the current state of the development of methods for Aβ oligomer-specific detection and quantitation. The methods are divided into three subgroups: (i) enzyme-linked immunosorbent assays (ELISA), (ii) methods for single oligomer detection, and (iii) others, which are mainly biosensor-based methods. PMID:27163804

  7. Quantitative analysis of eugenol in clove extract by a validated HPLC method.

    PubMed

    Yun, So-Mi; Lee, Myoung-Heon; Lee, Kwang-Jick; Ku, Hyun-Ok; Son, Seong-Wan; Joo, Yi-Seok

    2010-01-01

    Clove (Eugenia caryophyllata) is a well-known medicinal plant used in Korea for diarrhea and digestive disorders and as an antiseptic. Eugenol is the main active ingredient of clove and has been chosen as a marker compound for the chemical evaluation or QC of clove. This paper reports the development and validation of an HPLC-diode array detection (DAD) method for the determination of eugenol in clove. HPLC separation was accomplished on an XTerra RP18 column (250 x 4.6 mm id, 5 microm) with an isocratic mobile phase of 60% methanol and DAD at 280 nm. Calibration graphs were linear with very good correlation coefficients (r2 > 0.9999) from 12.5 to 1000 ng/mL. The LOD was 0.81 ng/mL and the LOQ was 2.47 ng/mL. The method showed good intraday precision (%RSD 0.08-0.27%) and interday precision (%RSD 0.32-1.19%). The method was applied to the analysis of eugenol from clove cultivated in various countries (Indonesia, Singapore, and China). Quantitative analysis of the 15 clove samples showed that the content of eugenol varied significantly, ranging from 163 to 1049 ppb. Based on the results of this study, the HPLC method for determining eugenol is accurate enough to evaluate the quality and safety of clove.
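
    The calibration arithmetic behind figures like these can be sketched as follows; the peak areas are synthetic and the 3.3s/m and 10s/m convention for LOD/LOQ is an assumption about one common approach, not a claim about how the cited study computed its values.

```python
# Fit a linear calibration curve and derive LOD/LOQ from the residual standard
# deviation (s) and slope (m): LOD ~ 3.3*s/m, LOQ ~ 10*s/m.  Peak areas are synthetic.
import numpy as np

conc = np.array([12.5, 25, 50, 100, 250, 500, 1000], dtype=float)   # ng/mL standards
area = 1.8 * conc + 5.0 + np.random.default_rng(5).normal(0, 2.0, conc.size)

slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
s = resid.std(ddof=2)                      # residual SD of the regression

lod = 3.3 * s / slope
loq = 10.0 * s / slope
r = np.corrcoef(conc, area)[0, 1]
print(f"r^2={r**2:.5f}, LOD={lod:.2f} ng/mL, LOQ={loq:.2f} ng/mL")
```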

  8. Quantitative methods for reconstructing tissue biomechanical properties in optical coherence elastography: a comparison study

    PubMed Central

    Han, Zhaolong; Li, Jiasong; Singh, Manmohan; Wu, Chen; Liu, Chih-hao; Wang, Shang; Idugboe, Rita; Raghunathan, Raksha; Sudheendran, Narendran; Aglyamov, Salavat R.; Twa, Michael D.; Larin, Kirill V.

    2015-01-01

    We present a systematic analysis of the accuracy of five different methods for extracting the biomechanical properties of soft samples using optical coherence elastography (OCE). OCE is an emerging noninvasive technique that allows assessing the biomechanical properties of tissues with micrometer spatial resolution. However, in order to accurately extract biomechanical properties from OCE measurements, application of a proper mechanical model is required. In this study, we utilize tissue-mimicking phantoms with controlled elastic properties and investigate the feasibility of four available methods for reconstructing elasticity (Young's modulus) based on OCE measurements of an air-pulse induced elastic wave. The approaches are based on the shear wave equation (SWE), the surface wave equation (SuWE), the Rayleigh-Lamb frequency equation (RLFE), and the finite element method (FEM). Elasticity values were compared with uniaxial mechanical testing. The results show that the RLFE and the FEM are more robust in quantitatively assessing elasticity than the other simplified models. This study provides a foundation and reference for reconstructing the biomechanical properties of tissues from OCE data, which is important for the further development of noninvasive elastography methods. PMID:25860076
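
    For the simplest of the compared models, the shear wave relation, the elasticity estimate reduces to E ≈ 3ρc² for a nearly incompressible material; the sketch below applies it to hypothetical phantom numbers and does not reproduce the RLFE or FEM analyses.

```python
# Young's modulus from elastic-wave group velocity under the shear-wave
# assumption for a nearly incompressible medium: E ~ 3 * rho * c^2.
rho = 1000.0          # density, kg/m^3 (tissue-mimicking phantom, assumed)
distance_m = 2.0e-3   # propagation distance between two OCE measurement points
delay_s = 1.0e-3      # measured arrival-time delay of the air-pulse-induced wave

c = distance_m / delay_s          # group velocity, m/s
E = 3.0 * rho * c ** 2            # Young's modulus, Pa
print(f"wave speed {c:.2f} m/s -> E ~ {E / 1e3:.1f} kPa")
```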

  9. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    PubMed

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with results confirmed by GC-MS. The employed method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability precision were evaluated. The spiking levels for the trueness and precision experiments were 0.1, 0.5, and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method proved efficient for the extraction and determination of azoxystrobin residues in green beans and peas.
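
    For reference, mean recovery and repeatability (%RSD) at a given spiking level are simple ratios computed from replicate results; the sketch below uses hypothetical replicate values, not the study's data.

    ```python
    import numpy as np

    def recovery_and_rsd(measured, spiked):
        """Mean recovery (%) and relative standard deviation (%) for one
        spiking level; `measured` are replicate results in mg/kg."""
        measured = np.asarray(measured, dtype=float)
        rec = measured / spiked * 100.0
        return rec.mean(), rec.std(ddof=1) / rec.mean() * 100.0

    # Hypothetical replicate results for a sample spiked at 0.5 mg/kg
    mean_rec, rsd = recovery_and_rsd([0.44, 0.46, 0.43, 0.45, 0.47], spiked=0.5)
    print(f"mean recovery = {mean_rec:.1f}%, RSD = {rsd:.1f}%")
    ```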

  10. Comparative assessment of fluorescent transgene methods for quantitative imaging in human cells

    PubMed Central

    Mahen, Robert; Koch, Birgit; Wachsmuth, Malte; Politi, Antonio Z.; Perez-Gonzalez, Alexis; Mergenthaler, Julia; Cai, Yin; Ellenberg, Jan

    2014-01-01

    Fluorescence tagging of proteins is a widely used tool to study protein function and dynamics in live cells. However, the extent to which different mammalian transgene methods faithfully report on the properties of endogenous proteins has not been studied comparatively. Here we use quantitative live-cell imaging and single-molecule spectroscopy to analyze how different transgene systems affect imaging of the functional properties of the mitotic kinase Aurora B. We show that the transgene method fundamentally influences level and variability of expression and can severely compromise the ability to report on endogenous binding and localization parameters, providing a guide for quantitative imaging studies in mammalian cells. PMID:25232003

  11. Rapid method for protein quantitation by Bradford assay after elimination of the interference of polysorbate 80.

    PubMed

    Cheng, Yongfeng; Wei, Haiming; Sun, Rui; Tian, Zhigang; Zheng, Xiaodong

    2016-02-01

    The Bradford assay is one of the most common methods for measuring protein concentrations. However, some pharmaceutical excipients, such as detergents, interfere with the Bradford assay even at low concentrations. Protein precipitation can be used to overcome sample incompatibility with protein quantitation, but the rate of protein recovery after acetone precipitation is only about 70%. In this study, we found that sucrose not only increased the rate of protein recovery after 1 h of acetone precipitation but also did not interfere with the Bradford assay. We therefore developed a method for rapid protein quantitation in protein drugs even when they contain interfering substances.

  12. Comparison of Concentration Methods for Quantitative Detection of Sewage-Associated Viral Markers in Environmental Waters

    PubMed Central

    Harwood, V. J.; Gyawali, P.; Sidhu, J. P. S.; Toze, S.

    2015-01-01

    Pathogenic human viruses cause over half of gastroenteritis cases associated with recreational water use worldwide. They are relatively difficult to concentrate from environmental waters due to typically low concentrations and their small size. Although rapid enumeration of viruses by quantitative PCR (qPCR) has the potential to greatly improve water quality analysis and risk assessment, the upstream steps of capturing and recovering viruses from environmental water sources, along with removing PCR inhibitors from extracted nucleic acids, remain formidable barriers to routine use. Here, we compared the efficiency of virus recovery for three rapid methods of concentrating two microbial source tracking (MST) viral markers, human adenoviruses (HAdVs) and human polyomaviruses (HPyVs), from one-liter tap water and river water samples on HA membranes (90 mm in diameter). Samples were spiked with raw sewage, and viral adsorption to membranes was promoted by acidification (method A) or addition of MgCl2 (methods B and C). Viral nucleic acid was extracted directly from the membranes (method A), or viruses were eluted with NaOH and concentrated by centrifugal ultrafiltration (methods B and C). No inhibition of qPCR was observed for samples processed by method A, but inhibition occurred in river samples processed by methods B and C. Recovery efficiencies of HAdVs and HPyVs were ∼10-fold greater for method A (31 to 78%) than for methods B and C (2.4 to 12%). Further analysis of membranes from method B revealed that the majority of viruses were not eluted from the membrane, resulting in poor recovery. Modification of the originally published method A to include a larger-diameter membrane and a nucleic acid extraction kit that could accommodate the membrane resulted in a rapid virus concentration method with good recovery and a lack of inhibitory compounds. The frequently used strategy of viral adsorption with added cations (Mg2+) and elution with acid was inefficient and more prone to

  13. Infectious titres of sheep scrapie and bovine spongiform encephalopathy agents cannot be accurately predicted from quantitative laboratory test results.

    PubMed

    González, Lorenzo; Thorne, Leigh; Jeffrey, Martin; Martin, Stuart; Spiropoulos, John; Beck, Katy E; Lockey, Richard W; Vickery, Christopher M; Holder, Thomas; Terry, Linda

    2012-11-01

    It is widely accepted that abnormal forms of the prion protein (PrP) are the best surrogate marker for the infectious agent of prion diseases and, in practice, the detection of such disease-associated (PrP(d)) and/or protease-resistant (PrP(res)) forms of PrP is the cornerstone of diagnosis and surveillance of the transmissible spongiform encephalopathies (TSEs). Nevertheless, some studies question the consistent association between infectivity and abnormal PrP detection. To address this discrepancy, 11 brain samples of sheep affected with natural scrapie or experimental bovine spongiform encephalopathy were selected on the basis of the magnitude and predominant types of PrP(d) accumulation, as shown by immunohistochemical (IHC) examination; contra-lateral hemi-brain samples were inoculated at three different dilutions into transgenic mice overexpressing ovine PrP and were also subjected to quantitative analysis by three biochemical tests (BCTs). Six samples gave 'low' infectious titres (10⁶·⁵ to 10⁶·⁷ LD₅₀ g⁻¹) and five gave 'high titres' (10⁸·¹ to ≥ 10⁸·⁷ LD₅₀ g⁻¹) and, with the exception of the Western blot analysis, those two groups tended to correspond with samples with lower PrP(d)/PrP(res) results by IHC/BCTs. However, no statistical association could be confirmed due to high individual sample variability. It is concluded that although detection of abnormal forms of PrP by laboratory methods remains useful to confirm TSE infection, infectivity titres cannot be predicted from quantitative test results, at least for the TSE sources and host PRNP genotypes used in this study. Furthermore, the near inverse correlation between infectious titres and Western blot results (high protease pre-treatment) argues for a dissociation between infectivity and PrP(res).

  14. Simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and dead reckoning

    NASA Astrophysics Data System (ADS)

    Davey, Neil S.; Godil, Haris

    2013-05-01

    This article presents a comparative study between a well-known SLAM (Simultaneous Localization and Mapping) algorithm, called Gmapping, and a standard dead-reckoning algorithm; the study is based on experimental results of both approaches using a commercial skid-based turning robot, the P3DX. Five base-case scenarios were conducted to evaluate and test the effectiveness of both algorithms. The results show that SLAM outperformed dead reckoning in map-making accuracy in all scenarios but one; SLAM did not work well in a rapidly changing environment. Although the superiority of SLAM is not a surprising conclusion, the presented test method is valuable to professionals working in this area of mobile robotics, as it is highly practical and provides solid, valuable results. The novelty of this study lies in its simplicity. The simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and dead reckoning, and some applications using autonomous robots, are being patented by the authors in U.S. Patent Application Nos. 13/400,726 and 13/584,862.

  15. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods

    PubMed Central

    SAVAS, Selcuk; KAVRIK, Fevzi; KUCUKYILMAZ, Ebru

    2016-01-01

    ABSTRACT Objective The aim of this study was to evaluate the efficacy of a CPP-ACP containing fluoride varnish for remineralizing white spot lesions (WSLs) with four different quantitative methods. Material and Methods Four windows (3x3 mm) were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week) in an acidified gel system. The test material (MI Varnish) was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH), quantitative light-induced fluorescence-digital (QLF-D), energy-dispersive spectroscopy (EDS) and laser fluorescence (LF pen). The data were statistically analyzed (α=0.05). Results While the LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p<0.05), the difference between the 1- and 4-week measurements was not significant (p>0.05). With regard to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05). After the 1- and 4-week treatment periods, the calcium (Ca) and phosphate (P) concentrations and the Ca/P ratio were higher compared with those of the demineralized surfaces (p<0.05). Conclusion CPP-ACP containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use. PMID:27383699

  16. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-01

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed. PMID:26928571

  18. A multiplex lectin-channel monitoring method for human serum glycoproteins by quantitative mass spectrometry.

    PubMed

    Ahn, Yeong Hee; Ji, Eun Sun; Shin, Park Min; Kim, Kwang Hoe; Kim, Yong-Sam; Ko, Jeong Heon; Yoo, Jong Shin

    2012-02-01

    A mass profiling method and a multiple reaction monitoring (MRM)-based quantitative approach were used to analyze multiple lectin-captured fractions of human serum, prepared with different lectins such as Aleuria aurantia lectin (AAL), phytohemagglutinin-L(4) (L-PHA), concanavalin A (Con A), and Datura stramonium agglutinin (DSA), to quantitatively monitor protein glycosylation diversity. Each fraction, prepared by multiple lectin fractionation and tryptic digestion, was analyzed by 1-D LC-MS/MS. Semi-quantitative profiling showed that the list of glycoproteins identified from each lectin-captured fraction differs significantly depending on the lectin used. Thus, it was confirmed that multiplex lectin-channel monitoring (LCM) using multiple lectins is useful for investigating protein glycosylation diversity in a proteome sample. Based on the semi-quantitative mass profiling, target proteins showing lectin specificity among the lectin-captured fractions were selected and analyzed by the MRM-based method in triplicate for each lectin-captured fraction (average CV 7.9%). The MRM-based results for each lectin-captured fraction were similar to those obtained in the profiling experiments. The measured abundance of each target protein varied dramatically according to lectin specificity. The multiplex LCM approach with MRM-based analysis is useful for quantitatively monitoring target protein glycoforms selectively fractionated by multiple lectins. Thus, multiplex LCM, rather than a single lectin channel, allows protein glycosylation states to be examined in detail. PMID:22158852

  19. Quantitative determination of aflatoxin B1 concentration in acetonitrile by chemometric methods using terahertz spectroscopy.

    PubMed

    Ge, Hongyi; Jiang, Yuying; Lian, Feiyu; Zhang, Yuan; Xia, Shanhong

    2016-10-15

    Aflatoxins contaminate and colonize agricultural products, such as grain, and thereby potentially cause human liver carcinoma. Detection via conventional methods has proven to be time-consuming and complex. In this paper, the terahertz (THz) spectra of aflatoxin B1 in acetonitrile solutions with concentration ranges of 1-50 μg/ml and 1-50 μg/l are obtained and analyzed over the frequency range of 0.4-1.6 THz. Linear and nonlinear regression models are constructed to relate the absorption spectra and the concentrations of 160 samples using the partial least squares (PLS), principal component regression (PCR), support vector machine (SVM), and PCA-SVM methods. Our results indicate that the PLS and PCR models are more accurate for the concentration range of 1-50 μg/ml, whereas SVM and PCA-SVM are more accurate for the concentration range of 1-50 μg/l. Furthermore, ten samples of unknown concentration extracted from mildewed maize are analyzed quantitatively using these methods. PMID:27173565
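
    As a rough illustration of the chemometric comparison (PLS, PCR, SVM and PCA-SVM regression of concentration on absorption spectra), the sketch below fits all four model types with scikit-learn. The synthetic spectra, preprocessing and hyperparameters are invented for illustration and do not reproduce the study's THz measurements.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)

    # Synthetic stand-in for THz absorption spectra (0.4-1.6 THz, 200 points):
    # a concentration-scaled absorption band plus noise. Not the study's data.
    freq = np.linspace(0.4, 1.6, 200)
    conc = rng.uniform(1, 50, size=160)                      # e.g. ug/ml
    band = np.exp(-((freq - 1.0) / 0.15) ** 2)               # absorption feature
    X = conc[:, None] * band[None, :] + rng.normal(0, 0.05, (160, 200))
    X_tr, X_te, y_tr, y_te = train_test_split(X, conc, random_state=0)

    models = {
        "PLS": PLSRegression(n_components=5),
        "PCR": make_pipeline(PCA(n_components=5), LinearRegression()),
        "SVM": SVR(C=100.0),
        "PCA-SVM": make_pipeline(PCA(n_components=5), SVR(C=100.0)),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        print(f"{name:8s} R^2 = {r2_score(y_te, model.predict(X_te)):.3f}")
    ```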

  20. Effect of platform, reference material, and quantification model on enumeration of Enterococcus by quantitative PCR methods.

    PubMed

    Cao, Yiping; Sivaganesan, Mano; Kinzelman, Julie; Blackwood, A Denene; Noble, Rachel T; Haugland, Richard A; Griffith, John F; Weisberg, Stephen B

    2013-01-01

    Quantitative polymerase chain reaction (qPCR) is increasingly being used for the quantitative detection of fecal indicator bacteria in beach water. qPCR allows for same-day health warnings, and its application is being considered as an option for recreational water quality testing in the United States (USEPA, 2011. EPA-OW-2011-0466, FRL-9609-3, Notice of Availability of Draft Recreational Water Quality Criteria and Request for Scientific Views). However, transition of qPCR from a research tool to routine water quality testing requires information on how various method variations affect target enumeration. Here we compared qPCR performance and enumeration of enterococci in spiked and environmental water samples using three qPCR platforms (the Applied Biosystems StepOnePlus™, the Bio-Rad iQ™5 and the Cepheid SmartCycler® II), two reference materials (lyophilized cells and frozen cells on filters) and two comparative CT quantification models (ΔCT and ΔΔCT). Reference materials exerted the biggest influence, consistently affecting results by approximately 0.5 log(10) unit. Platform had the smallest effect, generally exerting <0.1 log(10) unit difference in final results. Quantification model led to small differences (0.04-0.2 log(10) unit) in this study with relatively uninhibited samples, but has the potential to cause as much as an 8-fold (0.9 log(10) unit) difference in potentially inhibitory samples. Our findings indicate the need for a certified and centralized source of reference materials and additional studies to assess the applicability of the quantification models in analyses of PCR-inhibitory samples.
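
    The two comparative CT models differ in how the target CT is normalized before comparison with the calibrator. As a rough illustration, the sketch below implements the widely used ΔΔCT calculation with a hypothetical calibrator of known cell content and an assumed 100% amplification efficiency; it is not necessarily the exact calibration model applied in this study.

    ```python
    def delta_delta_ct(ct_target_sample, ct_control_sample,
                       ct_target_calibrator, ct_control_calibrator,
                       calibrator_cells, efficiency=1.0):
        """Comparative CT (delta-delta CT) estimate of Enterococcus cell
        equivalents in a water sample.

        ct_control_* are CT values for a reference/control assay (e.g. a
        sample processing control); ct_*_calibrator come from the reference
        material of known cell content. Assumes equal amplification
        efficiency in all reactions (2-fold per cycle at efficiency=1.0)."""
        amp = 1.0 + efficiency
        dd_ct = ((ct_target_sample - ct_control_sample)
                 - (ct_target_calibrator - ct_control_calibrator))
        return calibrator_cells * amp ** (-dd_ct)

    # Hypothetical CT values for one sample against a calibrator containing
    # 1e4 cell equivalents (illustrative numbers only).
    print(delta_delta_ct(30.1, 24.8, 27.5, 25.0, calibrator_cells=1e4))
    ```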

  1. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  2. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  3. Qualitative and Quantitative Research Methods: Old Wine in New Bottles? On Understanding and Interpreting Educational Phenomena

    ERIC Educational Resources Information Center

    Smeyers, Paul

    2008-01-01

    Generally educational research is grounded in the empirical traditions of the social sciences (commonly called quantitative and qualitative methods) and is as such distinguished from other forms of scholarship such as theoretical, conceptual or methodological essays, critiques of research traditions and practices and those studies grounded in the…

  4. Improved GC/MS method for quantitation of n-Alkanes in plant and fecal material

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A gas chromatography-mass spectrometry (GC/MS) method for the quantitation of n-alkanes (carbon backbones ranging from 21 to 36 carbon atoms) in forage and fecal samples has been developed. Automated solid-liquid extraction using elevated temperature and pressure minimized extraction time to 30 min...

  5. The Use of Quantitative Methods as an Aid to Decision Making in Educational Administration.

    ERIC Educational Resources Information Center

    Alkin, Marvin C.

    Three quantitative methods are outlined, with suggestions for application to particular problem areas of educational administration: (1) The Leontief input-output analysis, incorporating a "transaction table" for displaying relationships between economic outputs and inputs, mainly applicable to budget analysis and planning; (2) linear programing,…

  6. Overcoming Methods Anxiety: Qualitative First, Quantitative Next, Frequent Feedback along the Way

    ERIC Educational Resources Information Center

    Bernstein, Jeffrey L.; Allen, Brooke Thomas

    2013-01-01

    Political Science research methods courses face two problems. First is what to cover, as there are too many techniques to explore in any one course. Second is dealing with student anxiety around quantitative material. We explore a novel way to approach these issues. Our students began by writing a qualitative paper. They followed with a term…

  7. Examination of Quantitative Methods Used in Early Intervention Research: Linkages with Recommended Practices.

    ERIC Educational Resources Information Center

    Snyder, Patricia; Thompson, Bruce; McLean, Mary E.; Smith, Barbara J.

    2002-01-01

    Findings are reported related to the research methods and statistical techniques used in 450 group quantitative studies examined by the Council for Exceptional Children's Division for Early Childhood Recommended Practices Project. Studies were analyzed across seven dimensions including sampling procedures, variable selection, variable definition,…

  8. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  9. Paradigms Lost and Pragmatism Regained: Methodological Implications of Combining Qualitative and Quantitative Methods

    ERIC Educational Resources Information Center

    Morgan, David L.

    2007-01-01

    This article examines several methodological issues associated with combining qualitative and quantitative methods by comparing the increasing interest in this topic with the earlier renewal of interest in qualitative research during the 1980s. The first section argues for the value of Kuhn's concept of paradigm shifts as a tool for examining…

  10. Integrating Quantitative and Qualitative Data in Mixed Methods Research--Challenges and Benefits

    ERIC Educational Resources Information Center

    Almalki, Sami

    2016-01-01

    This paper is concerned with investigating the integration of quantitative and qualitative data in mixed methods research and whether, in spite of its challenges, it can be of positive benefit to many investigative studies. The paper introduces the topic, defines the terms with which this subject deals and undertakes a literature review to outline…

  11. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  12. Development of a quantitative diagnostic method of estrogen receptor expression levels by immunohistochemistry using organic fluorescent material-assembled nanoparticles.

    PubMed

    Gonda, Kohsuke; Miyashita, Minoru; Watanabe, Mika; Takahashi, Yayoi; Goda, Hideki; Okada, Hisatake; Nakano, Yasushi; Tada, Hiroshi; Amari, Masakazu; Ohuchi, Noriaki

    2012-09-28

    The detection of estrogen receptors (ERs) by immunohistochemistry (IHC) using 3,3'-diaminobenzidine (DAB) is only a modest prognostic marker, but it is essential for the application of endocrine therapy, such as antiestrogen tamoxifen-based therapy. IHC using DAB is poorly quantitative because horseradish peroxidase (HRP) activity depends on reaction time, temperature and substrate concentration. In contrast, IHC using fluorescent material can be quantitative because the signal intensity is proportional to the intensity of the excitation light. However, the high level of tissue autofluorescence has impeded the development of quantitative fluorescence IHC. We developed organic fluorescent material (tetramethylrhodamine)-assembled nanoparticles for IHC. Tissue autofluorescence is comparable to the fluorescence intensity of quantum dots, which are the most representative fluorescent nanoparticles. The fluorescence intensity of our novel nanoparticles was 10.2-fold greater than that of quantum dots, and they did not bind non-specifically to breast cancer tissues owing to the polyethylene glycol chains coating their surfaces. The fluorescence intensity of our nanoparticles therefore significantly exceeded autofluorescence, producing a significantly higher signal-to-noise ratio in IHC-imaged cancer tissues than previous methods. Moreover, immunostaining data from our nanoparticle fluorescent IHC and IHC with DAB were compared in the same region of adjacent tissue sections to quantitatively compare the two methods. The results demonstrated that our nanoparticle staining analyzed a wide range of ER expression levels with higher accuracy and quantitative sensitivity than DAB staining. This enhancement in diagnostic accuracy and sensitivity for ERs using our immunostaining method will improve the prediction of responses to therapies that target ERs and progesterone receptors that are induced by a downstream ER signal. PMID

  13. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria for pulmonary impairment. In general, rib movement is assessed by fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique for quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). A bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images and assembled into a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to accurately quantify rib movements and distinguish them from those of other lung structures. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movement: limited rib movements appeared as reduced velocity vectors and left-right asymmetric distributions on the vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movement without additional radiation dose.
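
    The abstract does not describe how the local velocity vectors were computed; the sketch below shows one generic way to estimate a local displacement between consecutive bone-image frames by exhaustive block matching. The patch size, search range, frame interval and toy frames are all hypothetical.

    ```python
    import numpy as np

    def local_velocity(frame0, frame1, center, patch=16, search=8, dt=1/15):
        """Estimate the displacement (pixels/s) of a square patch centred at
        `center` between two consecutive bone-image frames by exhaustive
        block matching (minimum sum of squared differences)."""
        y, x = center
        h = patch // 2
        ref = frame0[y - h:y + h, x - h:x + h].astype(float)
        best, best_dy, best_dx = np.inf, 0, 0
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                cand = frame1[y - h + dy:y + h + dy,
                              x - h + dx:x + h + dx].astype(float)
                ssd = np.sum((ref - cand) ** 2)
                if ssd < best:
                    best, best_dy, best_dx = ssd, dy, dx
        return np.array([best_dx, best_dy]) / dt        # pixels per second

    # Toy frames: a bright 10x10 "rib" block shifted by (dy=2, dx=1).
    f0 = np.zeros((128, 128)); f0[60:70, 56:66] = 1.0
    f1 = np.zeros((128, 128)); f1[62:72, 57:67] = 1.0
    print(local_velocity(f0, f1, center=(64, 60)))       # -> [15. 30.] px/s
    ```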

  14. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    PubMed

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival.

  15. Quantitative measurement of acoustic pressure in the focal zone of acoustic lens-line focusing using the Schlieren method.

    PubMed

    Jiang, Xueping; Cheng, Qian; Xu, Zheng; Qian, Menglu; Han, Qingbang

    2016-04-01

    This paper proposes a theory and method for quantitative measurement of the focal spot size and acoustic pressure of an acoustic lens-line focusing ultrasonic (ALLFU) field using the Schlieren imaging technique. Using Fourier transformation, the relationship between the brightness of the Schlieren image and the acoustic pressure was derived. The ALLFU field was simulated using the finite element method and compared with the Schlieren acoustic field image. The focal spot size was measured using the Schlieren method. The acoustic pressure in the focal zone of the ALLFU field and the transducer transmitting voltage response were quantitatively determined by measuring the intensity of the diffracted light fringes. The results show that the brightness of the Schlieren image is a linear function of the acoustic intensity when the acousto-optic interaction length remains constant and the acoustic field is weak. PMID:27139646

  16. Quantitative methods for evaluating optical and frictional properties of cationic polymers.

    PubMed

    Wu, W; Alkema, J; Shay, G D; Basset, D R

    2001-01-01

    This paper presents three quantitative methods to examine gloss, opacity, and friction of cationic polymers. The adsorption of cationic polymers onto hair and skin can be regarded as a thin film coating. Therefore, optical and frictional properties of polymer films are of significant relevance to the applications of cationic polymers in hair care products. Such properties reflect the desirable hair condition attributes consumers seek in shampoo and conditioner products. Using these test methods, polyquaternium-10 and cationic guar samples of varying molecular weight and cationic substitution were compared. The effect of an anionic surfactant, sodium dodecyl sulfate (SDS), on polymer film properties was also investigated. Neat guar hydroxypropyl trimonium chloride imparts less friction than polyquaternium-10 but dulls the substrate employed in this study. The optical data show that polyquaternium-10 provides greater film clarity and gloss than cationic guars. In the presence of SDS, polyquaternium-10 also displays similar or lower friction than cationic guar. The comparative optical and frictional results are in good agreement with the visual assessment of the cationic polymer films. These results clearly demonstrate that polyquaternium-10 exhibits superior film properties in the forms of both neat polymer and polymer/surfactant complex. In addition, microscopic techniques such as scanning electron microscopy (SEM) and atomic force microscopy (AFM) provide powerful explanations for the differences noted between the two popular classes of cationic polymers. The test methods described in this paper can be utilized to differentiate the upper performance potential of cationic polymers. These objective and standardized test methods derived from the coatings industry are not affected by the variability of hair or the formulation complexity of end products. They can be useful tools in the product development process in quickly screening the relative performance of

  17. Electron paramagnetic resonance method for the quantitative assay of ketoconazole in pharmaceutical preparations.

    PubMed

    Morsy, Mohamed A; Sultan, Salah M; Dafalla, Hatim

    2009-08-15

    In this study, electron paramagnetic resonance (EPR) is used, for the first time, as an analytical tool for the quantitative assay of ketoconazole (KTZ) in drug formulations. The drug was characterized by the prominent signals of two radical species produced by its oxidation with 400 microg/mL cerium(IV) in 0.10 mol dm(-3) sulfuric acid. The EPR signal of the reaction mixture was measured in eight capillary tubes housed in a 4 mm EPR sample tube. Radical stability was investigated by acquiring multiple EPR scans of each KTZ sample solution at 2.5 min intervals after mixing. The decay plot of the radical species shows that the disappearance apparently follows zero-order kinetics. The zero-time intercept of the EPR signal amplitude, which should be proportional to the initial radical concentration, is linear in the sample concentration over the range of 100 to 400 microg/mL, with a correlation coefficient, r, of 0.999. The detection limit was determined to be 11.7 +/- 2.5 microg/mL. The newly adopted method was fully validated following the United States Pharmacopeia (USP) monograph protocol for both the generic and the proprietary forms. The method is very accurate, such that concentrations could be measured at a 99.9% confidence level. The method was also found to be suitable for the assay of KTZ in its tablet and cream pharmaceutical preparations, as no interference was encountered from excipients of the proprietary drugs. High specificity, simplicity, and rapidity are the merits of the present method compared with previously reported methods.
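
    As an illustration of the quantitation scheme described above (apparent zero-order decay extrapolated back to the mixing time, followed by a linear calibration of the zero-time amplitude against concentration), the sketch below uses hypothetical scan amplitudes and calibration standards rather than the reported data.

    ```python
    import numpy as np

    def zero_time_amplitude(times_min, amplitudes):
        """Linear (zero-order) fit of EPR signal amplitude vs. time since
        mixing; returns the extrapolated amplitude at t = 0."""
        slope, intercept = np.polyfit(times_min, amplitudes, 1)
        return intercept

    # Hypothetical multi-scan amplitudes recorded every 2.5 min for one
    # KTZ sample solution.
    t = np.array([2.5, 5.0, 7.5, 10.0, 12.5])
    a = np.array([940.0, 905.0, 868.0, 833.0, 798.0])
    a0 = zero_time_amplitude(t, a)

    # Hypothetical calibration: zero-time amplitudes of 100-400 ug/mL standards.
    conc = np.array([100.0, 200.0, 300.0, 400.0])
    a0_std = np.array([325.0, 652.0, 978.0, 1301.0])
    k, b = np.polyfit(conc, a0_std, 1)
    print(f"A(0) = {a0:.0f}, estimated conc = {(a0 - b) / k:.0f} ug/mL")
    ```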

  18. Linking multidimensional functional diversity to quantitative methods: a graphical hypothesis-evaluation framework.

    PubMed

    Boersma, Kate S; Dee, Laura E; Miller, Steve J; Bogan, Michael T; Lytle, David A; Gitelman, Alix I

    2016-03-01

    Functional trait analysis is an appealing approach to study differences among biological communities because traits determine species' responses to the environment and their impacts on ecosystem functioning. Despite a rapidly expanding quantitative literature, it remains challenging to conceptualize concurrent changes in multiple trait dimensions ("trait space") and select quantitative functional diversity methods to test hypotheses prior to analysis. To address this need, we present a widely applicable framework for visualizing ecological phenomena in trait space to guide the selection, application, and interpretation of quantitative functional diversity methods. We describe five hypotheses that represent general patterns of responses to disturbance in functional community ecology and then apply a formal decision process to determine appropriate quantitative methods to test ecological hypotheses. As a part of this process, we devise a new statistical approach to test for functional turnover among communities. Our combination of hypotheses and metrics can be applied broadly to address ecological questions across a range of systems and study designs. We illustrate the framework with a case study of disturbance in freshwater communities. This hypothesis-driven approach will increase the rigor and transparency of applied functional trait studies. PMID:27197386

  20. Accuracy, precision, and method detection limits of quantitative PCR for airborne bacteria and fungi.

    PubMed

    Hospodsky, Denina; Yamamoto, Naomichi; Peccia, Jordan

    2010-11-01

    Real-time quantitative PCR (qPCR) for rapid and specific enumeration of microbial agents is finding increased use in aerosol science. The goal of this study was to determine qPCR accuracy, precision, and method detection limits (MDLs) within the context of indoor and ambient aerosol samples. Escherichia coli and Bacillus atrophaeus vegetative bacterial cells and Aspergillus fumigatus fungal spores loaded onto aerosol filters were considered. Efficiencies associated with recovery of DNA from aerosol filters were low, and excluding these efficiencies in quantitative analysis led to underestimating the true aerosol concentration by 10 to 24 times. Precision near detection limits ranged from a 28% to 79% coefficient of variation (COV) for the three test organisms, and the majority of this variation was due to instrument repeatability. Depending on the organism and sampling filter material, precision results suggest that qPCR is useful for determining dissimilarity between two samples only if the true differences are greater than 1.3 to 3.2 times (95% confidence level at n = 7 replicates). For MDLs, qPCR was able to produce a positive response with 99% confidence from the DNA of five B. atrophaeus cells and less than one A. fumigatus spore. Overall MDL values that included sample processing efficiencies ranged from 2,000 to 3,000 B. atrophaeus cells per filter and 10 to 25 A. fumigatus spores per filter. Applying the concepts of accuracy, precision, and MDL to qPCR aerosol measurements demonstrates that sample processing efficiencies must be accounted for in order to accurately estimate bioaerosol exposure, provides guidance on the necessary statistical rigor required to understand significant differences among separate aerosol samples, and prevents undetected (i.e., nonquantifiable) values for true aerosol concentrations that may be significant.

  1. The quantitative assessment of the pre- and postoperative craniosynostosis using the methods of image analysis.

    PubMed

    Fabijańska, Anna; Węgliński, Tomasz

    2015-12-01

    This paper considers the problem of CT-based quantitative assessment of craniosynostosis before and after surgery. First, a fast and efficient brain segmentation approach is proposed. The algorithm is robust to discontinuities in the skull; as a result, it can be applied in both pre- and postoperative cases. Additionally, image processing and analysis algorithms are proposed for describing the disease based on CT scans. The proposed algorithms automate the determination of the standard linear indices used for assessment of craniosynostosis (i.e., the cephalic index CI and head circumference HC) and allow for planar and volumetric analyses, which have not been reported so far. Results of applying the introduced methods to sample craniosynostotic cases before and after surgery are presented and discussed. The results show that the proposed brain segmentation algorithm achieves high accuracy in both pre- and postoperative craniosynostosis, while the introduced planar and volumetric indices for describing the disease may help distinguish between types of the disease.
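
    The linear indices that the pipeline automates have simple definitions: the cephalic index is the maximum head width divided by the maximum head length, multiplied by 100. The sketch below computes CI from a binary head mask on an axial slice and approximates the head circumference with Ramanujan's ellipse-perimeter formula; it is a simplified stand-in, not the authors' algorithm.

    ```python
    import numpy as np

    def planar_indices(head_mask, pixel_mm=1.0):
        """Cephalic index and an ellipse-based head-circumference estimate from
        a binary axial mask (rows = anterior-posterior, cols = left-right)."""
        rows, cols = np.nonzero(head_mask)
        length = (rows.max() - rows.min() + 1) * pixel_mm    # occipitofrontal
        width = (cols.max() - cols.min() + 1) * pixel_mm     # biparietal
        ci = 100.0 * width / length
        # Ramanujan's approximation for the perimeter of an ellipse with the
        # measured axes, used here as a crude head-circumference surrogate.
        a, b = length / 2.0, width / 2.0
        hc = np.pi * (3 * (a + b) - np.sqrt((3 * a + b) * (a + 3 * b)))
        return ci, hc

    # Toy elliptical "head" mask, roughly 180 mm long and 140 mm wide at 1 mm/pixel.
    yy, xx = np.mgrid[0:220, 0:220]
    mask = ((yy - 110) / 90.0) ** 2 + ((xx - 110) / 70.0) ** 2 <= 1.0
    ci, hc = planar_indices(mask)
    print(f"CI = {ci:.1f}, estimated HC = {hc:.0f} mm")      # CI approx. 77.9 here
    ```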

  2. Reconciling incongruous qualitative and quantitative findings in mixed methods research: exemplars from research with drug using populations.

    PubMed

    Wagner, Karla D; Davidson, Peter J; Pollini, Robin A; Strathdee, Steffanie A; Washburn, Rachel; Palinkas, Lawrence A

    2012-01-01

    Mixed methods research is increasingly being promoted in the health sciences as a way to gain more comprehensive understandings of how social processes and individual behaviours shape human health. Mixed methods research most commonly combines qualitative and quantitative data collection and analysis strategies. Often, integrating findings from multiple methods is assumed to confirm or validate the findings from one method with the findings from another, seeking convergence or agreement between methods. Cases in which findings from different methods are congruous are generally thought of as ideal, whilst conflicting findings may, at first glance, appear problematic. However, the latter situation provides the opportunity for a process through which apparently discordant results are reconciled, potentially leading to new emergent understandings of complex social phenomena. This paper presents three case studies drawn from the authors' research on HIV risk amongst injection drug users in which mixed methods studies yielded apparently discrepant results. We use these case studies (involving injection drug users [IDUs] using a Needle/Syringe Exchange Program in Los Angeles, CA, USA; IDUs seeking to purchase needle/syringes at pharmacies in Tijuana, Mexico; and young street-based IDUs in San Francisco, CA, USA) to identify challenges associated with integrating findings from mixed methods projects, summarize lessons learned, and make recommendations for how to more successfully anticipate and manage the integration of findings. Despite the challenges inherent in reconciling apparently conflicting findings from qualitative and quantitative approaches, in keeping with others who have argued in favour of integrating mixed methods findings, we contend that such an undertaking has the potential to yield benefits that emerge only through the struggle to reconcile discrepant results and may provide a sum that is greater than the individual qualitative and quantitative parts.

  3. Reconciling incongruous qualitative and quantitative findings in mixed methods research: exemplars from research with drug using populations

    PubMed Central

    Wagner, Karla D.; Davidson, Peter J.; Pollini, Robin A.; Strathdee, Steffanie A.; Washburn, Rachel; Palinkas, Lawrence A.

    2011-01-01

    Mixed methods research is increasingly being promoted in the health sciences as a way to gain more comprehensive understandings of how social processes and individual behaviours shape human health. Mixed methods research most commonly combines qualitative and quantitative data collection and analysis strategies. Often, integrating findings from multiple methods is assumed to confirm or validate the findings from one method with the findings from another, seeking convergence or agreement between methods. Cases in which findings from different methods are congruous are generally thought of as ideal, while conflicting findings may, at first glance, appear problematic. However, the latter situation provides the opportunity for a process through which apparently discordant results are reconciled, potentially leading to new emergent understandings of complex social phenomena. This paper presents three case studies drawn from the authors’ research on HIV risk among injection drug users in which mixed methods studies yielded apparently discrepant results. We use these case studies (involving injection drug users [IDUs] using a needle/syringe exchange program in Los Angeles, California, USA; IDUs seeking to purchase needle/syringes at pharmacies in Tijuana, Mexico; and young street-based IDUs in San Francisco, CA, USA) to identify challenges associated with integrating findings from mixed methods projects, summarize lessons learned, and make recommendations for how to more successfully anticipate and manage the integration of findings. Despite the challenges inherent in reconciling apparently conflicting findings from qualitative and quantitative approaches, in keeping with others who have argued in favour of integrating mixed methods findings, we contend that such an undertaking has the potential to yield benefits that emerge only through the struggle to reconcile discrepant results and may provide a sum that is greater than the individual qualitative and quantitative

  4. Deep neural nets as a method for quantitative structure-activity relationships.

    PubMed

    Ma, Junshui; Sheridan, Robert P; Liaw, Andy; Dahl, George E; Svetnik, Vladimir

    2015-02-23

    Neural networks were widely used for quantitative structure-activity relationships (QSAR) in the 1990s. Because of various practical issues (e.g., slow on large problems, difficult to train, prone to overfitting), they were superseded by more robust methods such as support vector machines (SVM) and random forests (RF), which arose in the early 2000s. The last 10 years have witnessed a revival of neural networks in the machine learning community thanks to new methods for preventing overfitting, more efficient training algorithms, and advances in computer hardware. In particular, deep neural nets (DNNs), i.e., neural nets with more than one hidden layer, have found great success in many applications, such as computer vision and natural language processing. Here we show that DNNs can routinely make better prospective predictions than RF on a set of large, diverse QSAR data sets taken from Merck's drug discovery effort. The number of adjustable parameters needed for DNNs is fairly large, but our results show that it is not necessary to optimize them for individual data sets, and a single set of recommended parameters can achieve better performance than RF for most of the data sets we studied. The usefulness of the parameters is demonstrated on additional data sets not used in the calibration. Although training DNNs is still computationally intensive, using graphics processing units (GPUs) can make this issue manageable. PMID:25635324
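
    As a rough illustration of the model classes being compared (a neural network with more than one hidden layer versus a random forest on descriptor vectors), the sketch below trains both with scikit-learn on a synthetic data set. The descriptors, activity function and hyperparameters are invented and do not reproduce the Merck data sets or the paper's recommended DNN configuration.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Synthetic stand-in for a QSAR data set: 2000 compounds x 500 descriptors
    # with a noisy nonlinear "activity".
    X = rng.normal(size=(2000, 500))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=2000)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    dnn = MLPRegressor(hidden_layer_sizes=(400, 200), early_stopping=True,
                       max_iter=300, random_state=0)     # >1 hidden layer
    rf = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)

    for name, model in [("DNN", dnn), ("RF", rf)]:
        model.fit(X_tr, y_tr)
        print(f"{name}: test R^2 = {r2_score(y_te, model.predict(X_te)):.3f}")
    ```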

  5. Laser flare photometry: a noninvasive, objective, and quantitative method to measure intraocular inflammation.

    PubMed

    Tugal-Tutkun, Ilknur; Herbort, Carl P

    2010-10-01

    Aqueous flare and cells are the two inflammatory parameters of anterior chamber inflammation resulting from disruption of the blood-ocular barriers. When examined with the slit lamp, measurement of intraocular inflammation remains subjective with considerable intra- and interobserver variations. Laser flare cell photometry is an objective quantitative method that enables accurate measurement of these parameters with very high reproducibility. Laser flare photometry allows detection of subclinical alterations in the blood-ocular barriers, identifying subtle pathological changes that could not have been recorded otherwise. With the use of this method, it has been possible to compare the effect of different surgical techniques, surgical adjuncts, and anti-inflammatory medications on intraocular inflammation. Clinical studies of uveitis patients have shown that flare measurements by laser flare photometry allowed precise monitoring of well-defined uveitic entities and prediction of disease relapse. Relationships of laser flare photometry values with complications of uveitis and visual loss further indicate that flare measurement by laser flare photometry should be included in the routine follow-up of patients with uveitis.

  7. An Evaluation of Quantitative Methods of Determining the Degree of Melting Experienced by a Chondrule

    NASA Technical Reports Server (NTRS)

    Nettles, J. W.; Lofgren, G. E.; Carlson, W. D.; McSween, H. Y., Jr.

    2004-01-01

    Many workers have considered the degree to which partial melting occurred in the chondrules they have studied, and this has led to attempts to find reliable methods of determining the degree of melting. At least two quantitative methods have been used in the literature: a convolution index (CVI), which is the ratio of the perimeter of the chondrule as seen in thin section to the perimeter of a circle with the same area as the chondrule, and nominal grain size (NGS), which is the inverse square root of the number density of olivines and pyroxenes in a chondrule (again, as seen in thin section). We have evaluated both nominal grain size and the convolution index as melting indicators. Nominal grain size was measured on the results of a set of dynamic crystallization experiments previously described, in which aliquots of LEW97008 (L3.4) were heated to peak temperatures of 1250, 1350, 1370, and 1450 C, representing varying degrees of partial melting of the starting material. Nominal grain size should correlate with peak temperature (and therefore degree of partial melting) if it is a good melting indicator. The convolution index is not directly testable with these experiments because the experiments do not actually create chondrules (and therefore produce no outline on which to measure a CVI), so we had no means to directly test how well the CVI predicted different degrees of melting. Therefore, we discuss the use of the CVI measurement and support the discussion with X-ray computed tomography (CT) data.
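
    Both indicators reduce to simple formulas: the CVI compares the chondrule's outline perimeter with the perimeter of an equal-area circle, and the NGS is the inverse square root of the crystal number density. The sketch below evaluates them for hypothetical thin-section measurements.

    ```python
    import math

    def convolution_index(perimeter, area):
        """CVI: chondrule outline perimeter divided by the perimeter of a
        circle with the same cross-sectional area."""
        equivalent_circle_perimeter = 2.0 * math.sqrt(math.pi * area)
        return perimeter / equivalent_circle_perimeter

    def nominal_grain_size(n_crystals, area):
        """NGS: inverse square root of the number density of olivine and
        pyroxene grains seen in thin section (units of length)."""
        return 1.0 / math.sqrt(n_crystals / area)

    # Hypothetical thin-section measurements (mm and mm^2).
    print(f"CVI = {convolution_index(perimeter=3.9, area=1.0):.2f}")
    print(f"NGS = {nominal_grain_size(n_crystals=250, area=1.0):.3f} mm")
    ```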

  8. A validated LC-MS-MS method for simultaneous identification and quantitation of rodenticides in blood.

    PubMed

    Bidny, Sergei; Gago, Kim; David, Mark; Duong, Thanh; Albertyn, Desdemona; Gunja, Naren

    2015-04-01

    A rapid, highly sensitive and specific analytical method for the extraction, identification and quantification of nine rodenticides from whole blood has been developed and validated. Rodenticides commercially available in Australia include coumatetralyl, warfarin, brodifacoum, bromadiolone, difenacoum, flocoumafen, difethialone, diphacinone and chlorophacinone. A Waters ACQUITY UPLC TQD system operating in multiple reaction monitoring mode was used to conduct the analysis. Two ionization modes, ES+ and ES-, were examined to achieve optimal sensitivity and selectivity; detection by MS-MS used electrospray ionization in positive mode for difenacoum and brodifacoum and in negative mode for all other analytes. All analytes were extracted from 200 µL of whole blood with ethyl acetate and separated on a Waters ACQUITY UPLC BEH C18 column using gradient elution. Ammonium acetate (10 mM, pH 7.5) and methanol were used as mobile phases with a total run time of 8 min. Recoveries were between 70 and 105%, with limits of detection ranging from 0.5 to 1 ng/mL. The limit of quantitation was 2 ng/mL for all analytes. Calibration curves were linear within the range 2-200 ng/mL for all analytes, with coefficients of determination ≥0.98. The proposed liquid-liquid extraction method was applied successfully in a series of clinical investigations and forensic toxicological analyses. PMID:25595137

  9. Laser-induced Breakdown spectroscopy quantitative analysis method via adaptive analytical line selection and relevance vector machine regression model

    NASA Astrophysics Data System (ADS)

    Yang, Jianhong; Yi, Cancan; Xu, Jinwu; Ma, Xianghong

    2015-05-01

    A new LIBS quantitative analysis method based on adaptive analytical line selection and a relevance vector machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward to overcome the drawback of high dependency on a priori knowledge. Candidate analytical lines are automatically selected based on built-in characteristics of the spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines used as input variables of the regression model are determined adaptively according to the samples for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of a confidence interval from the probabilistic predictive distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples were carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and modeling robustness than methods based on partial least squares regression, artificial neural networks and a standard support vector machine.
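
    scikit-learn does not include a relevance vector machine, so the sketch below uses BayesianRidge as a stand-in Bayesian regression that likewise returns a predictive mean and standard deviation for each new spectrum; the line intensities and Cr contents are synthetic, not the certified steel data.

    ```python
    import numpy as np
    from sklearn.linear_model import BayesianRidge

    rng = np.random.default_rng(1)

    # Hypothetical intensities of adaptively selected Cr analytical lines for
    # 23 calibration samples (columns = lines), plus certified Cr contents.
    X = rng.uniform(0.1, 1.0, size=(23, 6))
    true_w = np.array([5.0, 3.0, 0.0, 8.0, 0.0, 1.5])
    y = X @ true_w + rng.normal(0, 0.2, size=23)         # wt.% Cr (synthetic)

    # BayesianRidge stands in for the RVM: both provide a predictive mean and
    # a standard deviation, from which a confidence interval can be reported.
    model = BayesianRidge().fit(X, y)
    x_new = rng.uniform(0.1, 1.0, size=(1, 6))
    mean, std = model.predict(x_new, return_std=True)
    print(f"predicted Cr = {mean[0]:.2f} wt.% (95% CI ± {1.96 * std[0]:.2f})")
    ```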

  10. A field- and laboratory-based quantitative analysis of alluvium: Relating analytical results to TIMS data

    NASA Technical Reports Server (NTRS)

    Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.

    1995-01-01

Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona, during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, which environmental variable or variables, in the absence of bedrock, are responsible for the spectral properties of the desert alluvial surface.

  11. Event-specific qualitative and quantitative PCR methods for the detection of genetically modified rapeseed Oxy-235.

    PubMed

    Wu, Gang; Wu, Yuhua; Xiao, Ling; Lu, Changming

    2008-10-01

Oxy-235 is an oxynil-tolerant genetically modified rapeseed approved for commercial planting in Canada. The aim of this study was to establish event-specific qualitative and quantitative detection methods for Oxy-235. Both the 5'- and 3'-junction sequences spanning the plant DNA and the integrated gene construct of the Oxy-235 event were isolated, sequenced and analyzed. A 1298-bp deletion of rapeseed genomic DNA, showing high similarity to an mRNA sequence of Arabidopsis thaliana, was found at the integration site of the insert DNA. Event-specific qualitative PCR methods were established, one producing a 105-bp product specific for the 5'-integration junction and the other a 124-bp product specific for the 3'-junction. The absolute detection limits of the qualitative PCR were 100 initial template copies for the 5'-junction and ten for the 3'-junction. Quantitative methods targeting both junction fragments were also developed. The limit of detection of the quantitative PCR analysis was ten initial template copies for either junction, while the limit of quantification was approximately 50 initial template copies. The established real-time PCR systems were tested with two mixed rapeseed samples of known Oxy-235 content and gave the expected results.

  12. Quantitative direct probe method for the detection of parvovirus B19.

    PubMed

    Boggino, H; Payne, D A

    2000-01-01

Parvovirus B19 infection is associated with anemia and spontaneous abortions. While many qualitative assays are available, few molecular-based quantitative methods have been described. This study reports the development and optimization of a quantitative direct-probe method for the detection of parvovirus B19 DNA. Different concentrations of RNA probes were used to identify the optimal conditions for hybridization to the target DNA. Detection of DNA was linear between concentrations of 2 ng/ml and 200 pg/ml. Because this method requires no enzymatic amplification, it is not susceptible to amplicon contamination or enzymatic inhibitors, and it can be applied to serum samples or paraffin-embedded tissue.

  13. Quantitative direct probe method for the detection of parvovirus B19.

    PubMed

    Boggino, H; Payne, D A

    2000-01-01

Parvovirus B19 infection is associated with anemia and spontaneous abortions. While many qualitative assays are available, few molecular-based quantitative methods have been described. This study reports the development and optimization of a quantitative direct-probe method for the detection of parvovirus B19 DNA. Different concentrations of RNA probes were used to identify the optimal conditions for hybridization to the target DNA. Detection of DNA was linear between concentrations of 2 ng/ml and 200 pg/ml. Because this method requires no enzymatic amplification, it is not susceptible to amplicon contamination or enzymatic inhibitors, and it can be applied to serum samples or paraffin-embedded tissue. PMID:10645984

  14. Laboratory and field validation of a Cry1Ab protein quantitation method for water.

    PubMed

    Strain, Katherine E; Whiting, Sara A; Lydy, Michael J

    2014-10-01

The widespread planting of crops expressing insecticidal proteins derived from the soil bacterium Bacillus thuringiensis (Bt) has given rise to concerns regarding potential exposure of non-target species. These proteins are released from the plant throughout the growing season into soil and surface runoff and may enter adjacent waterways via runoff, erosion, aerial deposition of particulates, or plant debris. It is crucial to be able to accurately quantify Bt protein concentrations in the environment to aid in risk analyses and decision making. Enzyme-linked immunosorbent assay (ELISA) is commonly used for quantitation of Bt proteins in the environment; however, there are no published methods detailing and validating the extraction and quantitation of Bt proteins in water. The objective of the current study was to optimize the extraction of a Bt protein, Cry1Ab, from three water matrices and validate the ELISA method for specificity, precision, accuracy, stability, and sensitivity. Recovery of the Cry1Ab protein was matrix-dependent and ranged from 40 to 88% in the validated matrices, with an overall method detection limit of 2.1 ng/L. Precision between plates and within a single plate was confirmed with a coefficient of variation less than 20%. The ELISA method was verified in field and laboratory samples, demonstrating the utility of the validated method. The implementation of a validated extraction and quantitation protocol adds consistency and reliability to field-collected data regarding transgenic products. PMID:25059137
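
    The validation statistics quoted above (recovery, coefficient of variation, method detection limit) follow from simple replicate arithmetic. The sketch below uses invented replicate data and assumes the common Student's-t definition of the method detection limit, which may differ from the authors' exact procedure.

        import numpy as np
        from scipy import stats

        # Hypothetical replicate measurements of a low-level Cry1Ab spike (ng/L).
        spiked_true = 10.0
        measured = np.array([8.1, 7.6, 8.9, 7.9, 8.4, 8.0, 8.7])

        recovery = measured.mean() / spiked_true * 100        # percent recovery
        cv = measured.std(ddof=1) / measured.mean() * 100     # coefficient of variation, %

        # Assumed MDL definition: t(0.99, n-1) times the replicate standard deviation.
        n = len(measured)
        mdl = stats.t.ppf(0.99, df=n - 1) * measured.std(ddof=1)

        print(f"recovery = {recovery:.1f}%, CV = {cv:.1f}%, MDL = {mdl:.2f} ng/L")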

  15. Quantitative proteomics: assessing the spectrum of in-gel protein detection methods

    PubMed Central

    Gauci, Victoria J.; Wright, Elise P.

    2010-01-01

Proteomics research relies heavily on visualization methods for the detection of proteins separated by polyacrylamide gel electrophoresis. Commonly used staining approaches involve colorimetric dyes such as Coomassie Brilliant Blue, fluorescent dyes including Sypro Ruby, newly developed reactive fluorophores, and a plethora of others. The most desired characteristic in selecting one stain over another is sensitivity, but this is far from the only important parameter. This review evaluates protein detection methods in terms of their quantitative attributes, including limit of detection (i.e., sensitivity), linear dynamic range, inter-protein variability, capacity for spot detection after 2D gel electrophoresis, and compatibility with subsequent mass spectrometric analyses. Unfortunately, many of these quantitative criteria are not routinely or consistently addressed by most of the studies published to date. We urge more rigorous routine characterization of stains and detection methodologies as a critical step toward systematically improving these important tools for quantitative proteomics. In addition, substantial improvements in detection technology, particularly over the last decade or so, emphasize the need to consider renewed characterization of existing stains; the quantitative stains we need, or at least the chemistries required for their future development, may well already exist. PMID:21686332

  16. Quantitative Evaluation of Peptide-Material Interactions by a Force Mapping Method: Guidelines for Surface Modification.

    PubMed

    Mochizuki, Masahito; Oguchi, Masahiro; Kim, Seong-Oh; Jackman, Joshua A; Ogawa, Tetsu; Lkhamsuren, Ganchimeg; Cho, Nam-Joon; Hayashi, Tomohiro

    2015-07-28

    Peptide coatings on material surfaces have demonstrated wide application across materials science and biotechnology, facilitating the development of nanobio interfaces through surface modification. A guiding motivation in the field is to engineer peptides with a high and selective binding affinity to target materials. Herein, we introduce a quantitative force mapping method in order to evaluate the binding affinity of peptides to various hydrophilic oxide materials by atomic force microscopy (AFM). Statistical analysis of adhesion forces and probabilities obtained on substrates with a materials contrast enabled us to simultaneously compare the peptide binding affinity to different materials. On the basis of the experimental results and corresponding theoretical analysis, we discuss the role of various interfacial forces in modulating the strength of peptide attachment to hydrophilic oxide solid supports as well as to gold. The results emphasize the precision and robustness of our approach to evaluating the adhesion strength of peptides to solid supports, thereby offering guidelines to improve the design and fabrication of peptide-coated materials.

  17. The effect of hydraulic loading on bioclogging in porous media: Quantitative results from tomographic imaging

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Davit, Y.; Connolly, J. M.; Gerlach, R.; Wood, B. D.; Wildenschild, D.

    2013-12-01

Biofilm growth in porous media is generally surface-attached and pore-filling. A direct result of biofilm formation is the clogging of pore space available for fluid transport, an effect that has come to be termed bioclogging. In physical experiments, bioclogging manifests as an increase in differential pressure across the specimen, and traditional investigations of bioclogging in 3D porous media have therefore relied on measurements of bulk differential pressure changes to evaluate changes in permeability or hydraulic conductivity. Because most porous media are opaque, visualization of bioclogging has been limited to 2D or pseudo-3D micromodels, and bioclogging models have consequently relied on parameters derived from 2D visualization experiments. Results from these studies have shown that even small changes in pore morphology associated with biofilm growth can significantly alter fluid hydrodynamics. Recent advances in biofilm imaging facilitate the investigation of biofilm growth and bioclogging in porous media through x-ray computed microtomography (CMT) and a functional contrast agent. We used barium sulfate as the contrast agent, a particle suspension that fills all pore space available to fluid flow. X-ray CMT with a barium sulfate contrast agent allows biofilm growth to be examined at the micron scale throughout experimental porous-media growth reactors. This method has been applied to investigate changes in macropore morphology associated with biofilm growth. Applied fluid flow rates correspond to initial Reynolds numbers ranging from 0.1 to 100. Results include direct comparison of changes in porosity and hydraulic conductivity calculated from differential pressure measurements versus images. In addition, parameters such as biofilm thickness, reactive surface area, and attachment surface area will be presented in order to help characterize

  18. Vascular Corrosion Casting: Review of Advantages and Limitations in the Application of Some Simple Quantitative Methods.

    PubMed

    Hossler, Fred E.; Douglas, John E.

    2001-05-01

Vascular corrosion casting has been used for about 40 years to produce replicas of normal and abnormal vasculature and microvasculature of various tissues and organs that could be viewed at the ultrastructural level. In combination with scanning electron microscopy (SEM), the primary application of corrosion casting has been to describe the morphology and anatomical distribution of blood vessels in these tissues. However, such replicas should also contain quantitative information about that vasculature. This report summarizes some simple quantitative applications of vascular corrosion casting. Casts were prepared by infusing Mercox resin or diluted Mercox resin into the vasculature. Surrounding tissues were removed with KOH, hot water, and formic acid, and the resulting dried casts were observed with routine SEM. The orientation, size, and frequency of vascular endothelial cells were determined from endothelial nuclear imprints on various cast surfaces. Vascular volumes of heart, lung, and avian salt gland were calculated using tissue and resin densities and weights. Changes in vascular volume and functional capillary density in an experimentally induced emphysema model were estimated from confocal images of casts. Clearly, corrosion casts lend themselves to quantitative analysis. However, because blood vessels differ in their compliance, in their responses to the toxicity of casting resins, and in their response to varying conditions of the corrosion casting procedure, it is prudent to use care in interpreting these quantitative data. Some of the applications and limitations of quantitative methodology with corrosion casts are reviewed here.

  19. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Zakharov, S.M.

    1997-01-01

The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation. © 1997 American Institute of Physics.
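
    A standard way to obtain a lower confidence limit on the probability of failure-free operation from pass/fail test data is the Clopper-Pearson (beta-quantile) bound. The sketch below is a generic illustration with invented test counts, not the TOPAZ-2 ground-development statistics or the authors' method.

        from scipy.stats import beta

        def lower_conf_limit(successes: int, trials: int, confidence: float = 0.95) -> float:
            """One-sided Clopper-Pearson lower bound on the success probability."""
            if successes == 0:
                return 0.0
            return float(beta.ppf(1.0 - confidence, successes, trials - successes + 1))

        # Hypothetical subsystem test record: 48 failure-free runs out of 50 trials.
        print(f"P(failure-free) >= {lower_conf_limit(48, 50):.3f} at 95% confidence")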

  20. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Zakharov, Sergei M.

    1997-01-10

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation.

  1. Apparatus and method for quantitatively evaluating total fissile and total fertile nuclide content in samples

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Cates, Michael R.; Franks, Larry A.

    1985-01-01

Simultaneous photon and neutron interrogation of samples for the quantitative determination of total fissile nuclide and total fertile nuclide material present is made possible by the use of an electron accelerator. Prompt and delayed neutrons produced from resulting induced fissions are counted using a single detection system and allow the resolution of the contributions from each interrogating flux leading in turn to the quantitative determination sought. Detection limits for ²³⁹Pu are estimated to be about 3 mg using prompt fission neutrons and about 6 mg using delayed neutrons.

  2. QUANTITATIVE EVALUATION OF ASR DETERIORATION LEVEL BASED ON SURVEY RESULT OF EXISTING STRUCTURE

    NASA Astrophysics Data System (ADS)

    Kawashima, Yasushi; Kosa, Kenji; Matsumoto, Shigeru; Miura, Masatsugu

The relationship between crack density and the compressive strength of core cylinders drilled from an actual structure damaged by ASR was investigated. The results showed that even when the crack density increased by about 1.0 m/m2, the compressive strength decreased by only 2 N/mm2. A new method is then proposed for estimating future compressive strength from the crack density accumulated to date. In addition, the decline in compressive strength was initially proportional to the ASR expansion, and the reason the curve subsequently flattens was examined. To do so, the ASR cracks that arose during loading tests were observed in detail on the cut plane after each cylindrical test specimen was sectioned longitudinally. The results showed that the proportion of the rupture line overlapping ASR cracks was low, and that the load is resisted by interlocking between coarse aggregate and concrete across the crack plane.

  3. Method Development and Validation of a Stability-Indicating RP-HPLC Method for the Quantitative Analysis of Dronedarone Hydrochloride in Pharmaceutical Tablets

    PubMed Central

    Dabhi, Batuk; Jadeja, Yashwantsinh; Patel, Madhavi; Jebaliya, Hetal; Karia, Denish; Shah, Anamik

    2013-01-01

A simple, precise, and accurate HPLC method has been developed and validated for the quantitative analysis of Dronedarone Hydrochloride in tablet form. An isocratic separation was achieved using a Waters Symmetry C8 (100 × 4.6 mm), 5 μm particle size column with a flow rate of 1 ml/min and UV detection at 290 nm. The mobile phase consisted of buffer: methanol (40:60 v/v) (buffer: 50 mM KH2PO4 + 1 ml triethylamine in 1 liter water, pH=2.5 adjusted with ortho-phosphoric acid). The method was validated for specificity, linearity, precision, accuracy, robustness, and solution stability. The specificity of the method was determined by assessing interference from the placebo and by stress testing the drug (forced degradation). The method was linear over the concentration range 20–80 μg/ml (r2 = 0.999) with a Limit of Detection (LOD) and Limit of Quantitation (LOQ) of 0.1 and 0.3 μg/ml respectively. The accuracy of the method was between 99.2% and 100.5%. The method was found to be robust and suitable for the quantitative analysis of Dronedarone Hydrochloride in a tablet formulation. Degradation products resulting from the stress studies did not interfere with the detection of Dronedarone Hydrochloride; the assay is thus stability-indicating. PMID:23641332

  4. Multi-Window Classical Least Squares Multivariate Calibration Methods for Quantitative ICP-AES Analyses

    SciTech Connect

    CHAMBERS,WILLIAM B.; HAALAND,DAVID M.; KEENAN,MICHAEL R.; MELGAARD,DAVID K.

    1999-10-01

The advent of inductively coupled plasma-atomic emission spectrometers (ICP-AES) equipped with charge-coupled-device (CCD) detector arrays allows the application of multivariate calibration methods to the quantitative analysis of spectral data. We have applied classical least squares (CLS) methods to the analysis of a variety of samples containing up to 12 elements plus an internal standard. The elements included in the calibration models were Ag, Al, As, Au, Cd, Cr, Cu, Fe, Ni, Pb, Pd, and Se. By performing the CLS analysis separately in each of 46 spectral windows and by pooling the CLS concentration results for each element in all windows in a statistically efficient manner, we have been able to significantly improve the accuracy and precision of the ICP-AES analyses relative to the univariate and single-window multivariate methods supplied with the spectrometer. This new multi-window CLS (MWCLS) approach simplifies the analyses by providing a single concentration determination for each element from all spectral windows. Thus, the analyst does not have to perform the tedious task of reviewing the results from each window in an attempt to decide the correct value among discrepant analyses in one or more windows for each element. Furthermore, it is not necessary to construct a spectral correction model for each window prior to calibration and analysis: when one or more interfering elements were present, the new MWCLS method was able to reduce prediction errors for a selected analyte by more than 2 orders of magnitude compared to the worst-case single-window multivariate and univariate predictions. The MWCLS detection limits in the presence of multiple interferences are 15 ng/g (i.e., 15 ppb) or better for each element. In addition, errors with the new method are only slightly inflated when only a single target element is included in the calibration (i.e., knowledge of all other elements is excluded during calibration). The MWCLS method is found to be vastly
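
    The core of the multi-window idea, as described, is a classical least-squares concentration estimate in each spectral window followed by statistically weighted pooling across windows. The sketch below illustrates one plausible pooling scheme (inverse-variance weighting) on synthetic spectra; the pure-component profiles, noise level, and weighting choice are assumptions, not the authors' implementation.

        import numpy as np

        rng = np.random.default_rng(1)

        n_windows, n_channels, n_elements = 5, 40, 3
        # Hypothetical pure-component emission profiles per window, shape
        # (window, channel, element), plus true concentrations to simulate spectra.
        K = rng.uniform(0, 1, size=(n_windows, n_channels, n_elements))
        c_true = np.array([0.5, 1.2, 0.3])

        estimates, variances = [], []
        for w in range(n_windows):
            y = K[w] @ c_true + rng.normal(0, 0.02, n_channels)   # measured window
            c_hat, res, _, _ = np.linalg.lstsq(K[w], y, rcond=None)
            # Per-window covariance of the CLS estimate: sigma^2 * (K^T K)^-1
            dof = n_channels - n_elements
            sigma2 = res[0] / dof if res.size else np.sum((y - K[w] @ c_hat) ** 2) / dof
            cov = sigma2 * np.linalg.inv(K[w].T @ K[w])
            estimates.append(c_hat)
            variances.append(np.diag(cov))

        estimates, variances = np.array(estimates), np.array(variances)
        weights = 1.0 / variances
        c_pooled = (weights * estimates).sum(axis=0) / weights.sum(axis=0)
        print("pooled concentrations:", np.round(c_pooled, 3))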

  5. Continuously growing rodent molars result from a predictable quantitative evolutionary change over 50 million years

    PubMed Central

    Mushegyan, Vagan; Eronen, Jussi T.; Lawing, A. Michelle; Sharir, Amnon; Janis, Christine; Jernvall, Jukka; Klein, Ophir D.

    2015-01-01

Summary: The fossil record is widely informative about evolution, but fossils are not systematically used to study the evolution of stem cell-driven renewal. Here, we examined evolution of the continuous growth (hypselodonty) of rodent molar teeth, which is fuelled by the presence of dental stem cells. We studied occurrences of 3500 North American rodent fossils, ranging from 50 million years ago (mya) to 2 mya. We examined changes in molar height to determine if evolution of hypselodonty shows distinct patterns in the fossil record, and we found that hypselodont taxa emerged through intermediate forms of increasing crown height. Next, we designed a Markov simulation model, which replicated molar height increases throughout the Cenozoic, and, moreover, evolution of hypselodonty. Thus, by extension, the retention of the adult stem-cell niche appears to be a predictable quantitative rather than a stochastic qualitative process. Our analyses predict that hypselodonty will eventually become the dominant phenotype. PMID:25921530

  6. A novel method for morphological pleomorphism and heterogeneity quantitative measurement: Named cell feature level co-occurrence matrix

    PubMed Central

    Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro

    2016-01-01

Background: Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological approach to pathology diagnosis, which connects these molecular data with clinical diagnosis, is still largely subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts, including pleomorphism and heterogeneity, remain largely non-quantitative, and their assessment depends on the pathologist's sense and experience. Methods and Results: In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity using a cell-level co-occurrence matrix. Our method is based on the widely used gray-level co-occurrence matrix (GLCM), in which relations between neighboring pixel intensity levels are captured in a co-occurrence matrix and then summarized by analysis functions such as Haralick features. In a pathological tissue image, each nucleus can be measured through image processing techniques, and each nucleus has its own measurable features such as size, roundness, contour length, and intra-nucleus texture data (GLCM is one such method). By analogy with GLCM, each nucleus in the tissue image corresponds to one pixel. The most important point in this approach is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, create the co-occurrence matrix, and apply Haralick feature functions; pleomorphism and heterogeneity are then determined quantitatively for each image. Because one pixel corresponds to one nucleus feature, we named our method the Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features. Conclusion: CFLCM is shown to be a useful quantitative method for pleomorphism
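
    A minimal sketch of the cell-feature-level co-occurrence idea: each nucleus contributes one quantized feature value (here, an invented nucleus area), neighborhoods are taken as the k nearest centroids, and Haralick-style statistics are read off the resulting matrix. The feature choice, quantization, and neighborhood definition are illustrative assumptions, not the authors' exact settings.

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical nuclei: centroid coordinates (px) and one feature (area, px^2).
        centroids = rng.uniform(0, 1000, size=(200, 2))
        areas = rng.normal(300, 60, size=200)

        levels = 8
        bins = np.quantile(areas, np.linspace(0, 1, levels + 1))
        quantized = np.clip(np.digitize(areas, bins[1:-1]), 0, levels - 1)

        # Neighborhood: the k nearest nuclei by centroid distance.
        k = 4
        d = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        neighbors = np.argsort(d, axis=1)[:, :k]

        # Cell-feature-level co-occurrence matrix.
        cooc = np.zeros((levels, levels))
        for i, nbrs in enumerate(neighbors):
            for j in nbrs:
                cooc[quantized[i], quantized[j]] += 1
        cooc /= cooc.sum()

        # Haralick-style statistics: contrast (heterogeneity) and homogeneity.
        ii, jj = np.indices(cooc.shape)
        contrast = np.sum(cooc * (ii - jj) ** 2)
        homogeneity = np.sum(cooc / (1.0 + np.abs(ii - jj)))
        print(f"contrast = {contrast:.3f}, homogeneity = {homogeneity:.3f}")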

  7. A novel method for morphological pleomorphism and heterogeneity quantitative measurement: Named cell feature level co-occurrence matrix

    PubMed Central

    Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro

    2016-01-01

Background: Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological approach to pathology diagnosis, which connects these molecular data with clinical diagnosis, is still largely subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts, including pleomorphism and heterogeneity, remain largely non-quantitative, and their assessment depends on the pathologist's sense and experience. Methods and Results: In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity using a cell-level co-occurrence matrix. Our method is based on the widely used gray-level co-occurrence matrix (GLCM), in which relations between neighboring pixel intensity levels are captured in a co-occurrence matrix and then summarized by analysis functions such as Haralick features. In a pathological tissue image, each nucleus can be measured through image processing techniques, and each nucleus has its own measurable features such as size, roundness, contour length, and intra-nucleus texture data (GLCM is one such method). By analogy with GLCM, each nucleus in the tissue image corresponds to one pixel. The most important point in this approach is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, create the co-occurrence matrix, and apply Haralick feature functions; pleomorphism and heterogeneity are then determined quantitatively for each image. Because one pixel corresponds to one nucleus feature, we named our method the Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features. Conclusion: CFLCM is shown to be a useful quantitative method for pleomorphism

  8. Qualitative and quantitative event-specific PCR detection methods for oxy-235 canola based on the 3' integration flanking sequence.

    PubMed

    Yang, Litao; Guo, Jinchao; Zhang, Haibo; Liu, Jia; Zhang, Dabing

    2008-03-26

As more genetically modified plant events are approved for commercialization worldwide, event-specific PCR has become the key method for genetically modified organism (GMO) identification and quantification. This study reveals the 3' flanking sequence of the exogenous integration of Oxy-235 canola using thermal asymmetric interlaced PCR (TAIL-PCR). On the basis of the revealed 3' flanking sequence, PCR primers and a TaqMan probe were designed, and qualitative and quantitative PCR assays were established for Oxy-235 canola. The specificity, limit of detection (LOD) and limit of quantification (LOQ) of these two PCR assays were validated: the relative LOD of the qualitative PCR assay was as low as 0.1%, while the absolute LOD and LOQ of the quantitative PCR assay were as low as 10 and 20 copies of canola genomic DNA, respectively. Furthermore, quantification of practical canola samples gave the expected results. All of these results indicate that the developed qualitative and quantitative PCR methods based on the revealed 3' integration flanking sequence are suitable for identification and quantification of GM canola Oxy-235.

  9. Comparison of reconstruction methods and quantitative accuracy in Siemens Inveon PET scanner

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su; Kang, Joo Hyun; Moo Lim, Sang

    2015-04-01

… concentrations for radioactivity. Our data collectively showed that the OSEM 2D reconstruction method provides quantitatively accurate reconstructed PET data.

  10. Comparison of analytical methods for quantitation of isolated porcine hepatocyte yields.

    PubMed

    Stegemann, J P; Raina, S; Nicholson, D T; Jimenez, P; Shah, L; Cain, S; Chandler, B; Pitkin, Z; Mullon, C; Custer, L

    2000-06-01

    As cell-based therapies receive approval for clinical evaluation and use, the development of reliable methods to quantify cell number and control the dose of therapy delivered is becoming increasingly important. An example is the determination of the number and volume of primary porcine hepatocytes used in an extracorporeal treatment for patients with liver disease. Conventional cell counting using optical microscopy was compared against two alternate methods to quantify isolated porcine hepatocytes: (1) automated cell counting using a commercially available particle characterization instrument, and (2) quantitation by cell mass. Methods were compared based on accuracy, precision, specificity, linear range, and ruggedness. The automated method delivered substantially improved accuracy, precision, and ruggedness when compared to the conventional optical method. It also provided valuable information about the size distribution of cell preparations, which often contained clumps of cells, and showed that processing steps such as cryopreservation can alter the size characteristics of a cell population. The automated method was also faster, and was well suited to use in a commercial manufacturing process. The mass-based method was simple and inexpensive, but suffered from nonlinearity at low cell concentrations. Automated cell quantitation using a commercially available particle characterization instrument proved to be the preferred method for obtaining accurate and consistent porcine hepatocyte counts in a timely manner. PMID:10941220

  11. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo).

    PubMed

    Li, Yi; Kim, Jong-Joo

    2015-07-01

The efficiency of genome-wide association analysis (GWAS) depends on the power of detection for quantitative trait loci (QTL) and the precision of QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle breed Hanwoo: a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA), and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb), and the steers and their sires were genotyped with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at a false discovery rate <0.01 on a chromosome-wide level, while for the latter model a cut-off was set such that the top five windows, each comprising 10 adjacent SNPs, were chosen as showing significant variation for the phenotype. Four major additive QTL showed high concordance across the three methods: 64.1 to 64.9 Mb on Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb on BTA14 for CWT, 0.5 to 1.5 Mb on BTA6 for BFT, and 26.3 to 33.4 Mb on BTA29 for BFT. Several candidate genes (i.e., glutamate receptor, ionotropic, AMPA 1 [GRIA1]; family with sequence similarity 110, member B [FAM110B]; and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions in which to further pinpoint DNA markers or causative genes.

  12. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo).

    PubMed

    Li, Yi; Kim, Jong-Joo

    2015-07-01

The efficiency of genome-wide association analysis (GWAS) depends on the power of detection for quantitative trait loci (QTL) and the precision of QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle breed Hanwoo: a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA), and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb), and the steers and their sires were genotyped with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at a false discovery rate <0.01 on a chromosome-wide level, while for the latter model a cut-off was set such that the top five windows, each comprising 10 adjacent SNPs, were chosen as showing significant variation for the phenotype. Four major additive QTL showed high concordance across the three methods: 64.1 to 64.9 Mb on Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb on BTA14 for CWT, 0.5 to 1.5 Mb on BTA6 for BFT, and 26.3 to 33.4 Mb on BTA29 for BFT. Several candidate genes (i.e., glutamate receptor, ionotropic, AMPA 1 [GRIA1]; family with sequence similarity 110, member B [FAM110B]; and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions in which to further pinpoint DNA markers or causative genes. PMID:26104396

  13. Novel method for quantitative assessment of physical workload of healthcare workers by a tetherless ergonomics workstation.

    PubMed

    Smith, Warren D; Alharbi, Kamal A; Dixon, Jeremy B; Reggad, Hind

    2012-01-01

    Healthcare workers are at risk of physical injury. Our laboratory has developed a tetherless ergonomics workstation that is suitable for studying physicians' and nurses' physical workloads in clinical settings. The workstation uses wearable sensors to record multiple channels of body orientation and muscle activity and wirelessly transmits them to a base station laptop computer for display, storage, and analysis. The ergonomics workstation generates long records of multi-channel data, so it is desired that the workstation automatically process these records and provide graphical and quantitative summaries of the physical workloads experienced by the healthcare workers. This paper describes a novel method of automated quantitative assessment of physical workload, termed joint cumulative amplitude-duration (JCAD) analysis, that has advantages over previous methods and illustrates its use in a comparison of the physical workloads of robotically-assisted surgery versus manual video-endoscopic surgery.

  14. Three-dimensional quantitative analysis of adhesive remnants and enamel loss resulting from debonding orthodontic molar tubes

    PubMed Central

    2014-01-01

Aims: To present a new method for direct, quantitative analysis of the enamel surface and to measure adhesive remnants and enamel loss resulting from debonding orthodontic molar tubes. Material and methods: The buccal surfaces of fifteen extracted human molars were directly scanned with an optical blue-light 3D scanner to the nearest 2 μm. After 20 s of etching, molar tubes were bonded and, after 24 h of storage in 0.9% saline, debonded. The 3D scanning was then repeated. Superimposition and comparison were performed, and shape alterations of the entire objects were analyzed using specialized computer software. Residual adhesive heights and enamel loss depths were obtained for the entire buccal surfaces, and residual adhesive volume and enamel loss volume were calculated for every tooth. Results: The maximum height of adhesive remaining on the enamel surface was 0.76 mm, and the volume on particular teeth ranged from 0.047 mm3 to 4.16 mm3. The median adhesive remnant volume was 0.988 mm3. Mean depths of enamel loss for particular teeth ranged from 0.0076 mm to 0.0416 mm, and the highest maximum depth of enamel loss was 0.207 mm. The median volume of enamel loss was 0.104 mm3 and the maximum volume was 1.484 mm3. Conclusions: Blue-light 3D scanning can provide direct, precise scans of the enamel surface, which can be superimposed in order to calculate shape alterations. Debonding molar tubes leaves a certain amount of adhesive remnant on the enamel; however, the interface fracture pattern varies between teeth, and areas of enamel loss are present as well. PMID:25208969

  15. Models and methods for quantitative analysis of surface-enhanced Raman spectra.

    PubMed

    Li, Shuo; Nyagilo, James O; Dave, Digant P; Gao, Jean

    2014-03-01

The quantitative analysis of surface-enhanced Raman spectra using scattering nanoparticles has shown potential for promising applications in in vivo molecular imaging. Diverse approaches have been used for quantitative analysis of Raman spectral information, which can be categorized as direct classical least squares models, full-spectrum multivariate calibration models, selected multivariate calibration models, and latent variable regression (LVR) models. However, the working principle of these methods in the Raman spectra application remains poorly understood, and a clear picture of the overall performance of each model is missing. Based on the characteristics of the Raman spectra, in this paper we first provide the theoretical foundation of the aforementioned commonly used models and show why the LVR models are more suitable for quantitative analysis of Raman spectra. Then, we demonstrate the fundamental connections and differences between different LVR methods, such as principal component regression, reduced-rank regression, partial least squares regression (PLSR), canonical correlation regression, and robust canonical analysis, by comparing their objective functions and constraints. We further show that PLSR is essentially a blend of multivariate calibration and feature extraction that relates concentrations of nanotags to spectrum intensity. These features (a.k.a. latent variables) serve two purposes: the best representation of the predictor matrix and correlation with the response matrix. These illustrations give a new understanding of traditional PLSR and explain why PLSR outperforms other methods in quantitative analysis of Raman spectra. In the end, all the methods are tested on Raman spectra datasets with different evaluation criteria to evaluate their performance.
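
    Partial least squares regression, singled out above, is available in scikit-learn; the sketch below fits synthetic two-component spectra to nanotag concentrations purely as an illustration of the workflow, not as a reproduction of the paper's datasets or evaluation criteria.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)

        # Hypothetical Raman spectra: 60 samples x 500 wavenumber channels,
        # generated as concentration-weighted reference spectra plus noise.
        n_samples, n_channels = 60, 500
        ref = rng.uniform(0, 1, size=(2, n_channels))          # two nanotag spectra
        conc = rng.uniform(0, 10, size=(n_samples, 2))         # nanotag concentrations
        X = conc @ ref + rng.normal(0, 0.05, size=(n_samples, n_channels))

        pls = PLSRegression(n_components=4)
        scores = cross_val_score(pls, X, conc, cv=5, scoring="r2")
        print("cross-validated R^2:", np.round(scores, 3))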

  16. Quantitative evaluation of periprosthetic infection by real-time polymerase chain reaction: a comparison with conventional methods.

    PubMed

    Miyamae, Yushi; Inaba, Yutaka; Kobayashi, Naomi; Choe, Hyonmin; Ike, Hiroyuki; Momose, Takako; Fujiwara, Shusuke; Saito, Tomoyuki

    2012-10-01

    Several recent studies have demonstrated the limited accuracy of conventional culture methods for diagnosing periprosthetic infections. We have applied real-time polymerase chain reaction (PCR) assays for the rapid identification of bacteria around implants and reported its utility. However, the capability of quantification is also a useful feature of this type of assay. The aim of our study was to validate the usefulness of quantitative analyses using real-time PCR of cases with clinical periprosthetic infections in comparison with more established tests, such as C-reactive protein (CRP) levels, microbiologic cultures, and histopathology. Fifty-six joints with suspected infections were reviewed retrospectively. A universal PCR assay was used to perform the quantitative analyses. The differences in the threshold cycles between clinical samples and a negative control (∆Ct) in each case were calculated. The results of the quantitative PCR assay were compared with CRP levels, microbiologic cultures, and histopathology. There was a significant correlation found between the CRP and ∆Ct values. There were also significant differences found in the ∆Ct values according to CRP levels, with higher CRP levels showing higher ∆Ct values. Similarly, there were significant differences in the ∆Ct measurements in our culture results and among our pathologic evaluations. We confirmed that quantification by universal PCR based on the ∆Ct correlated with preoperative CRP levels and was associated with the microbiologic culture results and pathologic severity. This quantification method may be valuable for assessing infection severity.
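
    The ΔCt used above is the difference in threshold cycles between a clinical sample and the negative control; assuming ideal (100%) amplification efficiency, which the abstract does not state, it corresponds to a 2^ΔCt fold difference in starting template. A minimal sketch with invented Ct values:

        # Hypothetical threshold cycles from a broad-range (universal) real-time PCR run.
        ct_negative_control = 35.2
        ct_samples = {"joint_A": 24.8, "joint_B": 31.0, "joint_C": 34.9}

        for name, ct in ct_samples.items():
            delta_ct = ct_negative_control - ct          # larger = more target DNA
            fold = 2.0 ** delta_ct                       # assumes 100% PCR efficiency
            print(f"{name}: delta-Ct = {delta_ct:.1f}, ~{fold:.0f}-fold above control")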

  17. Integrated multiplatform method for in vitro quantitative assessment of cellular uptake for fluorescent polymer nanoparticles

    NASA Astrophysics Data System (ADS)

    Ferrari, Raffaele; Lupi, Monica; Falcetta, Francesca; Bigini, Paolo; Paolella, Katia; Fiordaliso, Fabio; Bisighini, Cinzia; Salmona, Mario; D'Incalci, Maurizio; Morbidelli, Massimo; Moscatelli, Davide; Ubezio, Paolo

    2014-01-01

Studies of the cellular internalization of nanoparticles (NPs) play a paramount role in the design of efficient drug delivery systems, but they have so far lacked a robust experimental technique able to quantify NP uptake in terms of the number of NPs internalized in each cell. In this work we propose a novel method that provides a quantitative evaluation of fluorescent NP uptake by combining flow cytometry and plate fluorimetry with measurements of cell number. Single-cell fluorescence signals measured by flow cytometry were associated with the number of internalized NPs, exploiting the observed linearity between average flow cytometric fluorescence and overall plate fluorimeter measurements, together with prior calibration of the microplate reader with serial dilutions of NPs. This precise calibration was made possible by using biocompatible fluorescent NPs in the range of 20-300 nm with a narrow particle size distribution, functionalized with a covalently bonded dye, Rhodamine B, and synthesized via emulsion free-radical polymerization. We report the absolute number of NPs internalized in mouse mammary tumor cells (4T1) as a function of time for different NP dimensions and surface charges and at several exposure concentrations. The results indicate that 4T1 cells incorporated 10³-10⁴ polymer NPs in a short time, reaching an intracellular concentration 15 times higher than the external one.

  18. Localization and Quantitation of Chloroplast Enzymes and Light-Harvesting Components Using Immunocytochemical Methods

    PubMed Central

    Mustardy, Laszlo; Cunningham, Francis X.; Gantt, Elisabeth

    1990-01-01

Seven chloroplast proteins were localized in Porphyridium cruentum (ATCC 50161) by immunolabeling with colloidal gold on electron microscope sections of log phase cells grown under red, green, and white light. Ribulose bisphosphate carboxylase labeling occurred almost exclusively in the pyrenoid. The major apoproteins of photosystem I (56-64 kD) occurred mostly over the stromal thylakoid region and also appeared over the thylakoids passing through the pyrenoid. Labeling for photosystem II core components (D2 and a 45 kD Chl-binding protein), for phycobilisomes (allophycocyanin, and a 91 kD Lcm linker) and for ATP synthase (β subunit) were predominantly present in the thylakoid region but not in the pyrenoid region of the chloroplast. Red light cells had increased labeling per thylakoid length for polypeptides of photosystem II and of phycobilisomes, while photosystem I density decreased, compared to white light cells. Conversely, green light cells had a decreased density of photosystem II and phycobilisome polypeptides, while photosystem I density changed little compared with white light cells. A comparison of the immunogold labeling results with data from spectroscopic methods and from rocket immunoelectrophoresis indicates that it can provide a quantitative measure of the relative amounts of protein components as well as their localization in specific organellar compartments. PMID:16667706

  19. Localization and quantitation of chloroplast enzymes and light-harvesting components using immunocytochemical methods

    SciTech Connect

    Mustardy, L.; Cunningham, F.X Jr.; Gantt, E. )

    1990-09-01

Seven chloroplast proteins were localized in Porphyridium cruentum (ATCC 50161) by immunolabeling with colloidal gold on electron microscope sections of log phase cells grown under red, green, and white light. Ribulose bisphosphate carboxylase labeling occurred almost exclusively in the pyrenoid. The major apoproteins of photosystem I (56-64 kD) occurred mostly over the stromal thylakoid region and also appeared over the thylakoids passing through the pyrenoid. Labeling for photosystem II core components (D2 and a 45 kD Chl-binding protein), for phycobilisomes (allophycocyanin, and a 91 kD LCM linker) and for ATP synthase (β subunit) was predominantly present in the thylakoid region but not in the pyrenoid region of the chloroplast. Red light cells had increased labeling per thylakoid length for polypeptides of photosystem II and of phycobilisomes, while photosystem I density decreased, compared to white light cells. Conversely, green light cells had a decreased density of photosystem II and phycobilisome polypeptides, while photosystem I density changed little compared with white light cells. A comparison of the immunogold labeling results with data from spectroscopic methods and from rocket immunoelectrophoresis indicates that it can provide a quantitative measure of the relative amounts of protein components as well as their localization in specific organellar compartments.

  20. A quantitative and standardized robotic method for the evaluation of arm proprioception after stroke.

    PubMed

    Simo, Lucia S; Ghez, Claude; Botzer, Lior; Scheidt, Robert A

    2011-01-01

    Stroke often results in both motor and sensory deficits, which may interact in the manifested functional impairment. Proprioception is known to play important roles in the planning and control of limb posture and movement; however, the impact of proprioceptive deficits on motor function has been difficult to elucidate due in part to the qualitative nature of available clinical tests. We present a quantitative and standardized method for evaluating proprioception in tasks directly relevant to those used to assess motor function. Using a robotic manipulandum that exerted controlled displacements of the hand, stroke participants were evaluated, and compared with a control group, in their ability to detect such displacements in a 2-alternative, forced-choice paradigm. A psychometric function parameterized the decision process underlying the detection of the hand displacements. The shape of this function was determined by a signal detection threshold and by the variability of the response about this threshold. Our automatic procedure differentiates between participants with and without proprioceptive deficits and quantifies functional proprioceptive sensation on a magnitude scale that is meaningful for ongoing studies of degraded motor function in comparable horizontal movements.
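
    The psychometric function described above can be fitted to two-alternative forced-choice detection data. The sketch below assumes a cumulative-Gaussian form with a small fixed lapse rate and uses invented detection proportions, so it is an illustration of the general approach rather than the authors' analysis code.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import norm

        def psychometric(x, threshold, sigma, lapse=0.02):
            """2AFC detection probability: 0.5 guessing floor rising toward ~1."""
            return 0.5 + (0.5 - lapse) * norm.cdf(x, loc=threshold, scale=sigma)

        # Hypothetical hand-displacement amplitudes (mm) and detection proportions.
        amplitude = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
        p_detect = np.array([0.52, 0.55, 0.70, 0.88, 0.97, 0.99])

        (threshold, sigma), _ = curve_fit(
            lambda x, t, s: psychometric(x, t, s), amplitude, p_detect, p0=[3.0, 2.0]
        )
        print(f"detection threshold ~ {threshold:.2f} mm, variability ~ {sigma:.2f} mm")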

  1. Methods for the quantitation of abscisic acid and its precursors from plant tissues.

    PubMed

    Duffield, P H; Netting, A G

    2001-02-15

Methods are given for the quantitation of the plant stress hormone, abscisic acid (ABA), and its two metabolic precursors, ketone and enolate, that are applicable to all species tested so far. The plant extract is homogenized at neutral pH, hexane-soluble neutrals are extracted and discarded, and then the free ABA and other organic acids are extracted as ion pairs. The remaining aqueous phase is acidified, allowed to stand, neutralized, and extracted to give the ABA ex ketone fraction and then the aqueous phase is treated with base and again extracted to give the ABA ex enolate fraction. Each of these three fractions, free ABA, ABA ex ketone, and ABA ex enolate, along with a deuteriated internal standard, [side-chain-²H₄]ABA, is then derivatized with pentafluorobenzyl bromide and purified on an automated sample preparation system. The resulting pentafluorobenzyl abscisate samples are then quantified using electron capture negative ionization mass spectrometry with methane as the reagent gas. Using these procedures free ABA, and ABA from its precursors, can be quantified at the level of 100 fg on column. If a large volume injector is used so that the total sample is injected it should be possible to quantify ABA and its precursors in the parts per billion range on a few milligrams of plant tissue.

  2. a Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis.

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

Available from UMI in association with The British Library. Quantitative crystal structure analyses have been carried out on small-molecule crystals using synchrotron radiation and the Laue method. A variety of single-crystal structure determinations and associated refinements are used and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C₁₀H₉ON₄Br·H₂O) has been determined, first using monochromatic Mo Kα radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements showed R-factors of 4.97 and 14.0% for the Mo Kα and Laue data, respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide (C₁₀H₁₁ClFNO) has been determined using the same crystal throughout for X-ray monochromatic analyses (Mo Kα and Cu Kα) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably; the R-factors (on F) were 6.23, 6.45 and 8.19% for the Mo Kα, Cu Kα and Laue data sets, respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenylprop-2-en-1-one (C₂₅H₂₀N₂O₂) has been determined using the synchrotron Laue method. The results compare very well with Mo Kα monochromatic data; the R-factors (on F) were 4.60 and 5.29% for the Mo Kα and Laue analyses, respectively. The Laue method is assessed in locating the 20 hydrogen atoms in this structure. The structure analysis of the benzil compound ((C₆H₅CO)₂) is carried out using the synchrotron Laue method, first at room temperature and then at low temperature (−114 °C). The structure shows R-factors (on F) of 13.06% and 6.85% for the two data sets, respectively. The synchrotron Laue method was used to collect data for ergocalciferol (Vitamin D₂). The same crystal was also used to record oscillation

  3. A Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

Quantitative crystal structure analyses have been carried out on small-molecule crystals using synchrotron radiation and the Laue method. A variety of single-crystal structure determinations and associated refinements are used and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C₁₀H₉ON₄Br·H₂O) has been determined, first using monochromatic Mo Kα radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements showed R-factors of 4.97 and 14.0% for the Mo Kα and Laue data, respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide (C₁₀H₁₁ClFNO) has been determined using the same crystal throughout for X-ray monochromatic analyses (Mo Kα and Cu Kα) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably; the R-factors (on F) were 6.23, 6.45 and 8.19% for the Mo Kα, Cu Kα and Laue data sets, respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenylprop-2-en-1-one (C₂₅H₂₀N₂O₂) has been determined using the synchrotron Laue method. The results compare very well with Mo Kα monochromatic data; the R-factors (on F) were 4.60 and 5.29% for the Mo Kα and Laue analyses, respectively. The Laue method is assessed in locating the 20 hydrogen atoms in this structure. The structure analysis of the benzil compound ((C₆H₅CO)₂) is carried out using the synchrotron Laue method, first at room temperature and then at low temperature. The structure shows R-factors (on F) of 13.06% and 6.85% for the two data sets, respectively. The synchrotron Laue method was used to collect data for ergocalciferol (Vitamin D₂). The same crystal was also used to record oscillation data with the synchrotron radiation monochromatic beam. A new

  4. Quantitative method of analyzing the interaction of slightly selective radioligands with multiple receptor subtypes

    SciTech Connect

    McGonigle, P.; Neve, K.A.; Molinoff, P.B.

    1986-10-01

Subclasses of receptors exist for most neurotransmitters. Frequently, two subtypes of receptors coexist in the same tissue, and in some cases they mediate the same physiological response. In tissues with two classes of binding sites for a given hormone, an estimate of the proportion of each class of binding sites is obtained by inhibiting the binding of a single concentration of a radioligand with a selective unlabeled ligand. Accurate estimates of the density of each class of receptors will only be obtained, however, if the radioligand is entirely nonselective. Selectivity of just 2- to 3-fold can markedly influence the results of subtype analysis. The conclusion that a radioligand is nonselective is usually based on the results of a saturation binding curve: if Scatchard analysis results in a linear plot, the radioligand is judged to be nonselective. Scatchard analysis, however, cannot distinguish between a radioligand that is nonselective and one that is slightly selective. The use of a slightly selective radioligand can lead to errors of 50% or more, depending on the concentration of the radioligand relative to the Kd values of the two classes of sites. A new method has been developed that can quantitate 2- to 3-fold differences in the affinity of two distinct classes of binding sites for a radioligand. This approach requires that a series of inhibition experiments with a selective unlabeled ligand be performed in the presence of increasing concentrations of the radioligand. Analysis of the resulting inhibition curves, using the mathematical modeling program MLAB on the PROPHET system, yields accurate estimates of the density of each class of receptor as well as the affinity of each receptor for the labeled and unlabeled ligands. This approach was used to determine whether ¹²⁵I-iodopindolol shows selectivity for beta-1 or beta-2 adrenergic receptors.
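
    Fitting competition (inhibition) curves to a two-site binding model is the computational core of such subtype analyses. The sketch below uses a generic two-site model and scipy rather than the MLAB/PROPHET software named above, with simulated data standing in for real binding measurements.

        import numpy as np
        from scipy.optimize import curve_fit

        def two_site(log_inhibitor, frac1, logIC50_1, logIC50_2, top=100.0, bottom=0.0):
            """Percent specific binding for competition at two receptor subtypes."""
            span = top - bottom
            site1 = frac1 / (1.0 + 10 ** (log_inhibitor - logIC50_1))
            site2 = (1.0 - frac1) / (1.0 + 10 ** (log_inhibitor - logIC50_2))
            return bottom + span * (site1 + site2)

        rng = np.random.default_rng(4)
        log_conc = np.linspace(-10, -4, 15)                 # log10 [competitor], M
        y_true = two_site(log_conc, 0.4, -8.5, -6.0)
        y_obs = y_true + rng.normal(0, 2.0, size=log_conc.size)

        popt, _ = curve_fit(two_site, log_conc, y_obs, p0=[0.5, -8.0, -6.5])
        frac1, logIC50_1, logIC50_2 = popt
        print(f"site-1 fraction = {frac1:.2f}, "
              f"IC50s = 10^{logIC50_1:.2f} M and 10^{logIC50_2:.2f} M")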

  5. A method of quantitative risk assessment for transmission pipeline carrying natural gas.

    PubMed

    Jo, Young-Do; Ahn, Bum Jong

    2005-08-31

Regulatory authorities in many countries are moving away from prescriptive approaches to keeping natural gas pipelines safe. As an alternative, risk management based on quantitative assessment is being considered to improve the level of safety. This paper focuses on the development of a simplified method for the quantitative risk assessment of natural gas pipelines and introduces the parameters of fatal length and cumulative fatal length. The fatal length is defined as the integrated fatality along the pipeline associated with hypothetical accidents. The cumulative fatal length is defined as the section of pipeline in which an accident leads to N or more fatalities. These parameters can be estimated easily using pipeline geometry and population density information from a Geographic Information System (GIS). To demonstrate the proposed method, individual and societal risks for a sample pipeline have been estimated from the historical data of the European Gas Pipeline Incident Data Group and BG Transco. With currently acceptable criteria for individual risk taken into account, the minimum proximity of the pipeline to occupied buildings is approximately proportional to the square root of the operating pressure of the pipeline. The proposed method of quantitative risk assessment may be useful for risk management during the planning and building stages of a new pipeline, and for modification of a buried pipeline.
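
    The fatal length defined above is the fatality probability integrated along the pipeline for a hypothetical accident. The sketch below integrates an assumed lethality-versus-distance profile numerically; the hazard model and all numbers are placeholders, not the paper's consequence equations.

        import numpy as np

        dx = 1.0                                   # integration step along the pipeline, m
        x = np.arange(0.0, 10_000.0, dx)           # a 10 km pipeline section

        def lethality(distance_m, hazard_radius=200.0):
            # Placeholder lethality-vs-distance model: probability that an accident
            # at this pipeline position is fatal to a person at the receptor.
            return np.exp(-(distance_m / hazard_radius) ** 2)

        receptor_position = 4_000.0                # e.g. an occupied building at 4 km
        fatality_profile = lethality(np.abs(x - receptor_position))

        # Fatal length: fatality probability integrated over pipeline position.
        fatal_length_m = fatality_profile.sum() * dx
        print(f"fatal length ~ {fatal_length_m:.0f} m of pipeline")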

  6. A method for estimating and removing streaking artifacts in quantitative susceptibility mapping.

    PubMed

    Li, Wei; Wang, Nian; Yu, Fang; Han, Hui; Cao, Wei; Romero, Rebecca; Tantiwongkosi, Bundhit; Duong, Timothy Q; Liu, Chunlei

    2015-03-01

    Quantitative susceptibility mapping (QSM) is a novel MRI method for quantifying tissue magnetic properties. In the brain, it reflects the molecular composition and microstructure of the local tissue. However, susceptibility maps reconstructed from single-orientation data still suffer from streaking artifacts that obscure structural details and small lesions. We have developed a general method for estimating streaking artifacts and subtracting them from susceptibility maps. Specifically, this method uses a sparse linear equation and least-squares (LSQR)-algorithm-based method to derive an initial estimate of magnetic susceptibility, a fast quantitative susceptibility mapping method to estimate the susceptibility boundaries, and an iterative approach to estimate the susceptibility artifact from ill-conditioned k-space regions only. With a fixed set of parameters for the initial susceptibility estimation and the subsequent streaking artifact estimation and removal, the method provides an unbiased estimate of tissue susceptibility with negligible streaking artifacts, as compared to multi-orientation QSM reconstruction. This method allows for improved delineation of white matter lesions in patients with multiple sclerosis and of small structures of the human brain with excellent anatomical detail. The proposed methodology can be extended to other existing QSM algorithms.
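
    For orientation, the streaking originates in the near-zero region of the k-space dipole kernel. The sketch below builds that kernel and masks the ill-conditioned region around the magic-angle cone, which is the region to which artifact estimation is restricted; the grid size, voxel size, and threshold are arbitrary illustrative choices, not the paper's settings.

```python
import numpy as np

def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0), b0_dir=(0.0, 0.0, 1.0)):
    """Standard k-space dipole kernel D(k) = 1/3 - (k.B0)^2 / |k|^2."""
    freqs = [np.fft.fftfreq(n, d=v) for n, v in zip(shape, voxel_size)]
    KX, KY, KZ = np.meshgrid(*freqs, indexing="ij")
    k2 = KX**2 + KY**2 + KZ**2
    kdotb = KX * b0_dir[0] + KY * b0_dir[1] + KZ * b0_dir[2]
    with np.errstate(divide="ignore", invalid="ignore"):
        D = 1.0 / 3.0 - kdotb**2 / k2
    D[k2 == 0] = 0.0
    return D

D = dipole_kernel((64, 64, 64))
threshold = 0.1                               # assumed cutoff, not the paper's value
ill_conditioned = np.abs(D) < threshold       # region around the magic-angle cone
print(f"ill-conditioned fraction of k-space: {ill_conditioned.mean():.1%}")
```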

  7. Development of a quantitative diagnostic method of estrogen receptor expression levels by immunohistochemistry using organic fluorescent material-assembled nanoparticles

    SciTech Connect

    Gonda, Kohsuke; Miyashita, Minoru; Watanabe, Mika; Takahashi, Yayoi; Goda, Hideki; Okada, Hisatake; Nakano, Yasushi; Tada, Hiroshi; Amari, Masakazu; Ohuchi, Noriaki

    2012-09-28

    quantitatively examine the two methods. The results demonstrated that our nanoparticle staining analyzed a wide range of ER expression levels with higher accuracy and quantitative sensitivity than DAB staining. This enhancement in the diagnostic accuracy and sensitivity for ERs using our immunostaining method will improve the prediction of responses to therapies that target ERs and progesterone receptors that are induced by a downstream ER signal.

  8. FTIR quantitative analysis of calcium carbonate (calcite) and silica (quartz) mixtures using the constant ratio method. Application to geological samples.

    PubMed

    Reig, F Bosch; Adelantado, J V Gimeno; Moya Moreno, M C M

    2002-10-16

    A methodology for quantifying calcium carbonate (875 and 712 cm(-1)) and silica (798 and 779 cm(-1)) by FTIR spectroscopy applying the constant ratio method is proposed. The method is applied to the quantitative analysis of calcite and quartz in geological samples. It uses potassium ferricyanide (2115 cm(-1)) as the standard, with samples prepared as potassium bromide pellets, and gives statistically satisfactory results (relative standard deviation less than 5%). The influence of particle size (pulverised samples) on the accuracy of the results obtained by FTIR spectroscopy with the constant ratio method has been studied; the granulometric study suggests that homogeneity of particle size is important. An absorbance correction method is also proposed to correct spectral interferences. Because most spectra are characterised by a large number of absorbance peaks, spectral interferences may occur so that some peaks cannot be used for quantitative analysis. These interferences can be evaluated and eliminated with the proposed method, allowing the affected peaks to be used in the analysis.
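
    The constant-ratio calculation itself is compact: the ratio of an analyte band absorbance to the internal-standard band absorbance is calibrated against known mass ratios and then inverted for unknown pellets. The sketch below shows that arithmetic with invented calibration numbers; it is not the paper's data.

```python
# All numbers below are invented for illustration.
import numpy as np

# Calibration pellets: known calcite/ferricyanide mass ratios and measured
# A(875 cm-1)/A(2115 cm-1) absorbance ratios.
mass_ratio = np.array([0.25, 0.5, 1.0, 2.0, 4.0])
abs_ratio = np.array([0.13, 0.27, 0.52, 1.05, 2.10])
slope, intercept = np.polyfit(mass_ratio, abs_ratio, 1)

# Unknown pellet prepared with a known mass of internal standard.
measured_abs_ratio = 0.80
m_standard_mg = 2.0
m_calcite_mg = (measured_abs_ratio - intercept) / slope * m_standard_mg
print(f"estimated calcite in pellet: {m_calcite_mg:.2f} mg")
```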

  9. Perceived future career prospects in general practice: quantitative results from questionnaire surveys of UK doctors

    PubMed Central

    Lambert, Trevor W; Smith, Fay; Goldacre, Michael J

    2016-01-01

    Background There are more studies of current job satisfaction among GPs than of their views about their future career prospects, although both are relevant to commitment to careers in general practice. Aim To report on the views of GPs compared with clinicians in other specialties about their future career prospects. Design and setting Questionnaire surveys were sent to UK medical doctors who graduated in selected years between 1974 and 2008. Method Questionnaires were sent to the doctors at different times after graduation, ranging from 3 to 24 years. Results Based on the latest survey of each graduation year of the 20 940 responders, 66.2% of GPs and 74.2% of hospital doctors were positive about their prospects and 9.7% and 8.3%, respectively, were negative. However, with increasing time since graduation and increasing levels of seniority, GPs became less positive about their prospects; by contrast, over time, surgeons became more positive. Three to five years after graduation, 86.3% of those training in general practice were positive about their prospects compared with 52.9% of surgical trainees; in surveys conducted 12–24 years after graduation, 60.2% of GPs and 76.6% of surgeons were positive about their prospects. Conclusion GPs held broadly positive views of their career prospects, as did other doctors. However, there was an increase in negativity with increasing time since graduation that was not seen in hospital doctors. Research into the causes of this negativity and policy measures to ameliorate it would contribute to the continued commitment of GPs and may help to reduce attrition. PMID:27578813

  10. A new quantitative method for the non-invasive documentation of morphological damage in paintings using RTI surface normals.

    PubMed

    Manfredi, Marcello; Bearman, Greg; Williamson, Greg; Kronkright, Dale; Doehne, Eric; Jacobs, Megan; Marengo, Emilio

    2014-07-09

    In this paper we propose a reliable surface imaging method for the non-invasive detection of morphological changes in paintings. Usually, the evaluation and quantification of changes and defects rely mostly on optical and subjective assessment, through comparison of the previous and subsequent states of conservation and by means of condition reports. Using quantitative Reflectance Transformation Imaging (RTI) we obtain detailed information on the geometry and morphology of the painting surface with a fast, precise and non-invasive method. Accurate and quantitative measurements of deterioration were acquired after the painting experienced artificial damage. Morphological changes were documented using normal vector images, while the intensity map succeeded in highlighting, quantifying and describing the physical changes. We estimate that the technique can detect morphological damage slightly smaller than 0.3 mm, which would be difficult to detect by eye given the size of the painting. This non-invasive tool could be very useful, for example, to examine paintings and artwork before they travel on loan or during a restoration. The method lends itself to automated analysis of large images and datasets. Quantitative RTI thus eases the transition of extending human vision into the realm of measuring change over time.
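
    The normal-vector images at the core of this approach can be obtained by a per-pixel least-squares photometric-stereo solve over the RTI captures. The snippet below is a minimal Lambertian sketch on synthetic data; it is not the authors' RTI software, and the light directions and images are made up.

```python
import numpy as np

def surface_normals(images, light_dirs):
    """images: (K, H, W) intensities; light_dirs: (K, 3) unit light vectors.
    Solves light_dirs @ g = I per pixel; g = albedo * normal."""
    K, H, W = images.shape
    I = images.reshape(K, -1)
    G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)   # shape (3, H*W)
    albedo = np.linalg.norm(G, axis=0) + 1e-12
    return (G / albedo).T.reshape(H, W, 3)

# Synthetic check: 8 known light directions over a flat, upward-facing patch
rng = np.random.default_rng(1)
L = rng.normal(size=(8, 3))
L[:, 2] = np.abs(L[:, 2]) + 0.5                # lights above the surface
L /= np.linalg.norm(L, axis=1, keepdims=True)
true_normal = np.array([0.0, 0.0, 1.0])
shading = np.clip(L @ true_normal, 0.0, None)  # Lambertian intensities
images = shading[:, None, None] * np.ones((8, 16, 16))
N = surface_normals(images, L)
print("recovered normal at one pixel:", np.round(N[0, 0], 3))   # ~ [0, 0, 1]
```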

  11. The Isotope-Coded Affinity Tag Method for Quantitative Protein Profile Comparison and Relative Quantitation of Cysteine Redox Modifications.

    PubMed

    Chan, James Chun Yip; Zhou, Lei; Chan, Eric Chun Yong

    2015-11-02

    The isotope-coded affinity tag (ICAT) technique has been applied to measure pairwise changes in protein expression through differential stable isotopic labeling of proteins or peptides followed by identification and quantification using a mass spectrometer. Changes in protein expression are observed when the identical peptide from each of two biological conditions is identified and a difference is detected in the measurements comparing the peptide labeled with the heavy isotope to the one with a normal isotopic distribution. This approach allows the simultaneous comparison of the expression of many proteins between two different biological states (e.g., yeast grown on galactose versus glucose, or normal versus cancer cells). Due to the cysteine-specificity of the ICAT reagents, the ICAT technique has also been applied to perform relative quantitation of cysteine redox modifications such as oxidation and nitrosylation. This unit describes both protein quantitation and profiling of cysteine redox modifications using the ICAT technique.

  12. Post-reconstruction non-local means filtering methods using CT side information for quantitative SPECT

    NASA Astrophysics Data System (ADS)

    Chun, Se Young; Fessler, Jeffrey A.; Dewaraja, Yuni K.

    2013-09-01

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose-response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation-maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration numbers while preserving image details, so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of CT-guided NLM filtering to errors in the CT information. NLM CT-S and NLM CT-H yielded RC values comparable to unfiltered images while substantially reducing RMSE. NLM CT-S changed RC by -2.7% to 2.6% compared to no filtering, and NLM CT-H yielded up to a 6% decrease in RC, while other methods yielded lower RCs
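
    As a toy illustration of the filtering idea (not any of the specific NLM CT-B/M/S/H variants), the sketch below applies a 2D non-local means filter whose patch weights can also incorporate a co-registered anatomical image standing in for the CT side information; the phantom and all parameters are arbitrary.

```python
import numpy as np

def nlm_filter(img, side=None, patch=3, search=7, h=0.5, h_side=0.3):
    """Weights are exp(-d_img/h^2) and, if side is given, also
    exp(-d_side/h_side^2), where d are mean squared patch differences."""
    r, s = patch // 2, search // 2
    pad = r + s
    p_img = np.pad(img, pad, mode="reflect")
    p_side = np.pad(side, pad, mode="reflect") if side is not None else None
    out = np.zeros_like(img, dtype=float)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pad, j + pad
            ref = p_img[ci - r:ci + r + 1, cj - r:cj + r + 1]
            ref_s = p_side[ci - r:ci + r + 1, cj - r:cj + r + 1] if side is not None else None
            wsum, acc = 0.0, 0.0
            for di in range(-s, s + 1):
                for dj in range(-s, s + 1):
                    ni, nj = ci + di, cj + dj
                    cand = p_img[ni - r:ni + r + 1, nj - r:nj + r + 1]
                    w = np.exp(-np.mean((ref - cand) ** 2) / h ** 2)
                    if side is not None:
                        cand_s = p_side[ni - r:ni + r + 1, nj - r:nj + r + 1]
                        w *= np.exp(-np.mean((ref_s - cand_s) ** 2) / h_side ** 2)
                    wsum += w
                    acc += w * p_img[ni, nj]
            out[i, j] = acc / wsum
    return out

rng = np.random.default_rng(0)
truth = np.zeros((32, 32)); truth[8:24, 8:24] = 1.0     # "tumor" activity
ct = truth.copy()                                        # ideal side information
noisy = truth + rng.normal(0.0, 0.3, truth.shape)
den = nlm_filter(noisy, side=ct)
print(f"RMSE noisy: {np.sqrt(np.mean((noisy - truth)**2)):.3f}, "
      f"RMSE NLM+side: {np.sqrt(np.mean((den - truth)**2)):.3f}")
```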

  13. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    ERIC Educational Resources Information Center

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  14. Response monitoring using quantitative ultrasound methods and supervised dictionary learning in locally advanced breast cancer

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Fung, Brandon; Tadayyon, Hadi; Tran, William T.; Czarnota, Gregory J.

    2016-03-01

    A non-invasive computer-aided-theragnosis (CAT) system was developed for the early assessment of responses to neoadjuvant chemotherapy in patients with locally advanced breast cancer. The CAT system was based on quantitative ultrasound spectroscopy methods comprising several modules including feature extraction, a metric to measure the dissimilarity between "pre-" and "mid-treatment" scans, and a supervised learning algorithm for the classification of patients into responders and non-responders. One major requirement for the successful design of a high-performance CAT system is to accurately measure the changes in parametric maps before treatment onset and during the course of treatment. To this end, a unified framework based on the Hilbert-Schmidt independence criterion (HSIC) was used for the design of the feature extraction from parametric maps and the dissimilarity measure between the "pre-" and "mid-treatment" scans. For the feature extraction, HSIC was used to design a supervised dictionary learning (SDL) method by maximizing the dependency between the scans taken from "pre-" and "mid-treatment" with "dummy labels" given to the scans. For the dissimilarity measure, an HSIC-based metric was employed to effectively measure the changes in parametric maps as an indication of treatment effectiveness. The HSIC-based feature extraction and dissimilarity measure used a kernel function to nonlinearly transform input vectors into a higher dimensional feature space and computed the population means in the new space, where enhanced group separability is ideally obtained. The results of the classification using the developed CAT system indicated an improvement in performance compared to a CAT system using basic intensity-histogram features.
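
    Since HSIC is central to both the feature extraction and the dissimilarity measure, a generic biased empirical HSIC estimate with Gaussian kernels is sketched below on random data; it illustrates the quantity being computed, not the authors' CAT pipeline.

```python
# Generic (biased) empirical HSIC with Gaussian kernels; data are random.
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC estimate between samples X (n,p) and Y (n,q)."""
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    L = gaussian_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                 # e.g., "pre-treatment" features
Y_dep = X @ rng.normal(size=(5, 3)) + 0.1 * rng.normal(size=(100, 3))
Y_ind = rng.normal(size=(100, 3))
print(f"HSIC dependent:   {hsic(X, Y_dep):.4f}")
print(f"HSIC independent: {hsic(X, Y_ind):.4f}")
```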

  15. Depth determination for shallow teleseismic earthquakes Methods and results

    SciTech Connect

    Stein, S.; Wiens, D.A.

    1986-11-01

    Contemporary methods used to determine depths of moderate-sized shallow teleseismic earthquakes are described. These include techniques based on surface wave spectra, and methods which estimate focal depth from the waveforms of body waves. The advantages of different methods and their limitations are discussed, and significant results for plate tectonics, obtained in the last five years by the application of these methods, are presented. 119 references.

  16. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    NASA Astrophysics Data System (ADS)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and overwhelming related healthcare costs. In recent work, we have developed a multi-spectral, frequency-domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over purely ultrasonic or optical methods, as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of ultrasound propagation. These characteristics include both the Speed of Sound (SOS) and the Broadband Ultrasonic Attenuation (BUA). To test the method's quantitative predictions, we constructed a combined ultrasound and photoacoustic setup. Here, we present the dual-modality system experimentally and compare the two modalities on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into bone structure and functional status.
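
    For orientation, SOS is typically extracted from the time-of-flight difference between a reference path and the through-sample path, and BUA as the slope of the frequency-dependent attenuation over a low-megahertz band. The sketch below shows that arithmetic on synthetic through-transmission signals; the sampling rate, pulse, thickness, and material values are assumptions, not the authors' processing chain.

```python
import numpy as np

fs = 20e6                                    # sampling rate, Hz (assumed)
t = np.arange(0, 200e-6, 1 / fs)
L = 0.02                                     # sample thickness, m (assumed)
c_water, c_bone = 1500.0, 1800.0             # assumed speeds of sound, m/s
true_bua = 20.0                              # assumed attenuation slope, dB/MHz

def probe_pulse(t0, f0=0.5e6, tau=1e-6):
    return np.exp(-((t - t0) / tau) ** 2) * np.sin(2 * np.pi * f0 * (t - t0))

# Reference (water-path) signal; the through-sample signal arrives earlier
# (bone is faster than water) and is attenuated linearly with frequency.
ref = probe_pulse(100e-6)
REF = np.fft.rfft(ref)
f = np.fft.rfftfreq(t.size, 1 / fs)
delay = L / c_bone - L / c_water             # seconds, negative
SMP = REF * 10 ** (-true_bua * (f / 1e6) / 20) * np.exp(-2j * np.pi * f * delay)
smp = np.fft.irfft(SMP, n=t.size)

# Speed of sound from the time-of-flight difference (cross-correlation peak)
lag = np.argmax(np.correlate(smp, ref, "full")) - (t.size - 1)
dt = -lag / fs                               # = L/c_water - L/c_bone
sos = L / (L / c_water - dt)
print(f"estimated SOS: {sos:.0f} m/s")       # ~1800

# BUA: slope of the log-spectral ratio (dB) vs frequency over 0.2-0.6 MHz
att_db = 20 * np.log10(np.abs(REF) / (np.abs(SMP) + 1e-12))
band = (f > 0.2e6) & (f < 0.6e6)
bua = np.polyfit(f[band] / 1e6, att_db[band], 1)[0]
print(f"estimated BUA: {bua:.1f} dB/MHz")    # ~20
```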

  17. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background:  Nonrandomized studies typically cannot account for confounding from unmeasured factors.  Method:  A method is presented that exploits the recently-identified phenomenon of  “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors.  Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure.  Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results:  Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met.  Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations:  Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions:  To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is

  18. [Study of infrared spectroscopy quantitative analysis method for methane gas based on data mining].

    PubMed

    Zhang, Ai-Ju

    2013-10-01

    Monitoring of methane gas is one of the important factors affecting coal mine safety, and online real-time monitoring of methane is used for mine safety protection. To improve the accuracy of the analysis model, the present paper uses infrared spectroscopy to study a quantitative gas-analysis algorithm. By applying data mining techniques to the multi-component infrared quantitative analysis algorithm, it was found that a cluster-analysis partial least squares algorithm is clearly superior in accuracy to partial least squares alone. In addition, to reduce the influence of errors in individual calibration samples on model accuracy, cluster analysis was used for data preprocessing; this denoising step was found to improve the analysis accuracy.
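
    A generic way to realize the cluster-then-PLS strategy is to group the calibration spectra with k-means and fit a separate partial least squares model inside each cluster. The sketch below does this with scikit-learn on synthetic spectra; it is an illustration of the strategy, not the paper's algorithm or data.

```python
# Synthetic illustration of clusterwise PLS calibration (not the paper's code).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n, p = 200, 60                               # spectra x wavenumber channels
conc = rng.uniform(0.0, 1.0, n)              # methane concentration, arbitrary units
group = rng.integers(0, 2, n)                # two latent measurement conditions
base = np.where(group[:, None] == 0,
                np.linspace(0, 1, p), np.linspace(1, 0, p))
signature = 0.5 * np.exp(-np.linspace(-3, 3, p) ** 2)
X = base + conc[:, None] * signature + 0.02 * rng.normal(size=(n, p))

# 1) cluster the spectra, 2) fit one PLS model per cluster
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
models = {c: PLSRegression(n_components=3).fit(X[km.labels_ == c],
                                               conc[km.labels_ == c])
          for c in range(2)}

# Predict a new spectrum with the model of its nearest cluster
x_new = X[:1]
c_new = km.predict(x_new)[0]
pred = models[c_new].predict(x_new).ravel()[0]
print(f"cluster {c_new}, predicted concentration: {pred:.2f} (true {conc[0]:.2f})")
```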

  19. Are three generations of quantitative molecular methods sufficient in medical virology? Brief review.

    PubMed

    Clementi, Massimo; Bagnarelli, Patrizia

    2015-10-01

    In the last two decades, the development of quantitative molecular methods has characterized the evolution of clinical virology more than any other methodological advancement. Using these methods, many studies have efficiently addressed in vivo the role of viral load, viral replication activity, and viral transcriptional profiles as correlates of disease outcome and progression, and have highlighted the pathophysiology of important virus diseases of humans. Furthermore, these studies have contributed to a better understanding of virus-host interactions and have sharply revolutionized research strategies in basic and medical virology. In addition, and importantly from a medical point of view, quantitative methods have provided a rationale for therapeutic intervention and therapy monitoring in medically important viral diseases. Despite the advances in technology and the development of three generations of molecular methods within the last two decades (competitive PCR, real-time PCR, and digital PCR), great challenges still remain for viral testing, related not only to standardization, accuracy, and precision, but also to selection of the best molecular targets for clinical use and to the identification of thresholds for risk stratification and therapeutic decisions. Future research directions, novel methods and technical improvements could be important to address these challenges.

  20. A quantitative ¹H nuclear magnetic resonance (qHNMR) method for assessing the purity of iridoids and secoiridoids.

    PubMed

    Li, Zeyun; Welbeck, Edward; Yang, Li; He, Chunyong; Hu, Haijun; Song, Ming; Bi, Kaishun; Wang, Zhengtao

    2015-01-01

    This paper presents a quantitative (1)H nuclear magnetic resonance (qHNMR) method for assessing the purity of iridoids and secoiridoids. The method was fully validated, including specificity, linearity, accuracy, precision, reproducibility, and robustness. To optimize the experimental conditions, several acquisition parameters were investigated, including the relaxation delay (D1), number of scans (NS) and power level (PL1). The quantification was based on the area ratios of H-3 from the analytes relative to the aromatic protons from 1,4-dinitrobenzene (internal standard), with methanol-d4 as solvent. Five iridoids and secoiridoids (sweroside, swertiamarin, gentiopicroside, geniposide, genipin) were analyzed. Furthermore, the results were validated by high performance liquid chromatography with ultraviolet detection (HPLC-UV). It can be concluded that the qHNMR method is simple, rapid, and accurate, providing a reliable and superior method for assessing the purity of iridoids and secoiridoids.
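
    The purity arithmetic behind an internal-standard qHNMR assay is the standard relation between integral ratios, proton counts, molar masses, and weighed masses. The sketch below applies it with invented integrals and masses (only the molar mass and proton count of 1,4-dinitrobenzene are real values); it is not the authors' calculation sheet.

```python
# Standard internal-standard qHNMR purity relation; numbers are illustrative
# except the molar mass (168.11 g/mol) and proton count (4) of the standard.
def qhnmr_purity(I_a, I_s, N_a, N_s, M_a, M_s, m_a_mg, m_s_mg, P_s=1.0):
    """Purity of the analyte from integral ratio I_a/I_s, proton counts N,
    molar masses M (g/mol), weighed masses m (mg), and standard purity P_s."""
    return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s_mg / m_a_mg) * P_s

# Hypothetical sweroside assay against 1,4-dinitrobenzene (4 aromatic H)
purity = qhnmr_purity(I_a=0.57, I_s=1.00, N_a=1, N_s=4,
                      M_a=358.34, M_s=168.11, m_a_mg=10.0, m_s_mg=2.0,
                      P_s=0.999)
print(f"estimated purity: {purity:.1%}")      # ~97%
```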

  1. Evaluation of a High Intensity Focused Ultrasound-Immobilized Trypsin Digestion and 18O-Labeling Method for Quantitative Proteomics

    SciTech Connect

    Lopez-Ferrer, Daniel; Hixson, Kim K.; Smallwood, Heather S.; Squier, Thomas C.; Petritis, Konstantinos; Smith, Richard D.

    2009-08-01

    A new method that uses immobilized trypsin concomitant with ultrasonic irradiation results in ultra-rapid digestion and thorough 18O labeling for quantitative protein comparisons. The reproducible and highly efficient method provided effective digestions in <1 min and minimized the amount of enzyme required compared to traditional methods. This method was demonstrated for digestion of both simple and complex protein mixtures, including bovine serum albumin, a global proteome extract from bacteria Shewanella oneidensis, and mouse plasma, as well as for the labeling of complex protein mixtures, which validated the application of this method for differential proteomic measurements. This approach is simple, reproducible, cost effective, and rapid, and thus well-suited for automation.

  2. Trace iodine quantitation in biological samples by mass spectrometric methods: the optimum internal standard.

    PubMed

    Dyke, Jason V; Dasgupta, Purnendu K; Kirk, Andrea B

    2009-07-15

    Accurate quantitation of iodine in biological samples is essential for studies of nutrition and medicine, as well as for epidemiological studies for monitoring intake of this essential nutrient. Despite the importance of accurate measurement, a standardized method for iodine analysis of biological samples is yet to be established. We have evaluated the effectiveness of (72)Ge, (115)In, and (129)I as internal standards for measurement of iodine in milk and urine samples by inductively coupled plasma mass spectrometry (ICP-MS) and of (35)Cl(18)O(4)(-), (129)I(-), and 2-chlorobenzenesulfonate (2-CBS) as internal standards for ion chromatography-tandem mass spectrometry (IC-MS/MS). We found recovery of iodine to be markedly low when IC-MS/MS was used without an internal standard. Percent recovery was similarly low using (35)Cl(18)O(4) as an internal standard for milk and unpredictable when used for urine. 2-Chlorobenzenesulfonate provided accurate recovery of iodine from milk, but overestimated iodine in urine samples by as much as a factor of 2. Percent recovery of iodine from milk and urine using ICP-MS without an internal standard was approximately 120%. Use of (115)In predicted approximately 60% of known values for both milk and urine samples. (72)Ge provided reasonable and consistent percent recovery for iodine in milk samples (approximately 108%) but resulted in approximately 80% recovery of iodine from urine. Use of (129)I as an internal standard resulted in excellent recovery of iodine from both milk and urine samples using either IC-MS/MS or ICP-MS.
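
    As a reminder of how an internal standard enters the arithmetic, the sketch below computes an internal-standard-corrected concentration against a one-point calibration and the resulting percent recovery for a spiked sample; all signal values and concentrations are invented.

```python
# Illustrative internal-standard correction and percent-recovery calculation.
def corrected_concentration(analyte_signal, is_signal,
                            analyte_signal_cal, is_signal_cal, conc_cal):
    """One-point calibration using the analyte/internal-standard signal ratio."""
    return (analyte_signal / is_signal) / (analyte_signal_cal / is_signal_cal) * conc_cal

# Calibration standard: 50 ng/mL iodide with the internal standard added
measured = corrected_concentration(analyte_signal=8.2e4, is_signal=1.9e5,
                                   analyte_signal_cal=9.0e4, is_signal_cal=2.0e5,
                                   conc_cal=50.0)
spiked_known = 45.0                           # ng/mL added to the sample
print(f"measured: {measured:.1f} ng/mL, recovery: {100 * measured / spiked_known:.0f}%")
```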

  3. Quantitative analysis of gene expression in fixed colorectal carcinoma samples as a method for biomarker validation

    PubMed Central

    OSTASIEWICZ, BEATA; OSTASIEWICZ, PAWEŁ; DUŚ-SZACHNIEWICZ, KAMILA; OSTASIEWICZ, KATARZYNA; ZIÓŁKOWSKI, PIOTR

    2016-01-01

    Biomarkers have been described as the future of oncology. Modern proteomics provides an invaluable tool for near-whole-proteome screening for proteins expressed differently in neoplastic vs. healthy tissues. However, in order to select the most promising biomarkers, an independent method of validation is required. The aim of the current study was to propose a methodology for the validation of biomarkers. Due to material availability, the majority of large-scale biomarker studies are performed using formalin-fixed paraffin-embedded (FFPE) tissues; therefore, these were selected for use in the current study. A total of 10 genes were selected from what have been previously described as the most promising candidate biomarkers, and the expression levels were analyzed with reverse transcription-quantitative polymerase chain reaction (RT-qPCR) using calibrator-normalized relative quantification with efficiency correction. For 6/10 analyzed genes, the results were consistent with the proteomic data; for the remaining four genes, the results were inconclusive. The upregulation of karyopherin α 2 (KPNA2) and chromosome segregation 1-like (CSE1L) in colorectal carcinoma, in addition to downregulation of chloride channel accessory 1 (CLCA1), fatty acid binding protein 1 (FABP1), sodium channel, voltage gated, type VII α subunit (SCN7A) and solute carrier family 26 (anion exchanger), member 3 (SLC26A3) was confirmed. With the combined use of proteomic and genetic tools, it was reported, for the first time to the best of our knowledge, that SCN7A was downregulated in colorectal carcinoma at mRNA and protein levels. It had been previously suggested that the remaining five genes served an important role in colorectal carcinogenesis; however, the current study provided strong evidence to support their use as biomarkers. Thus, it was concluded that combination of RT-qPCR with proteomics offers a powerful methodology for biomarker identification, which can be used to analyze
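
    Calibrator-normalized relative quantification with efficiency correction follows the familiar ratio formula; the snippet below shows that calculation with invented Cq values and amplification efficiencies. It states the general (Pfaffl-style) formula, not the authors' instrument software.

```python
# Efficiency-corrected, calibrator-normalized relative quantification.
# All Cq values and efficiencies below are invented for illustration.
def relative_expression(e_target, cq_target_cal, cq_target_sample,
                        e_ref, cq_ref_cal, cq_ref_sample):
    """Pfaffl-style ratio: target normalized to a reference gene and to a
    calibrator sample. E = 2.0 corresponds to 100% amplification efficiency."""
    return (e_target ** (cq_target_cal - cq_target_sample)
            / e_ref ** (cq_ref_cal - cq_ref_sample))

ratio = relative_expression(e_target=1.95, cq_target_cal=27.4, cq_target_sample=24.1,
                            e_ref=1.90, cq_ref_cal=22.0, cq_ref_sample=21.7)
print(f"tumor/normal expression ratio: {ratio:.2f}")   # ~7.5
```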

  4. Quantitative elastography of skin and skin lesion using phase-sensitive OCT (PhS-OCT) and surface wave method

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang

    2012-01-01

    This paper presents the combination of a phase-sensitive optical coherence tomography (PhS-OCT) imaging system and the surface wave method to achieve quantitative evaluation and elastography of the mechanical properties of in vivo human skin. PhS-OCT measures the surface acoustic waves (SAWs) generated by impulse stimulation from a home-made shaker and provides B-frame images of the sample. The surface wave phase velocity dispersion curves were calculated, from which the elasticity of the different skin layers was determined. Combining phase velocities from two adjacent locations generates a quantitative elastogram of the sample. The experimental results agree well with theoretical expectations and may offer potential use in clinical situations.
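
    The step from measured surface waves to elasticity can be summarized as: estimate the phase velocity at each frequency from the phase difference between two detection positions, then convert it to a Young's modulus through the Rayleigh-wave approximation. The sketch below shows that arithmetic with assumed values for density, Poisson's ratio, and the measured phase difference; it is not the PhS-OCT processing code.

```python
# Assumed values throughout; illustrates the conversion from SAW measurements
# to an elasticity estimate, not the authors' processing chain.
import numpy as np

def phase_velocity(freq_hz, delta_phase_rad, delta_x_m):
    """c = 2*pi*f*dx / dphi from the phase difference between two positions."""
    return 2 * np.pi * freq_hz * delta_x_m / delta_phase_rad

def youngs_modulus_from_rayleigh(c_r, density=1000.0, poisson=0.48):
    """Invert the Rayleigh-wave approximation c_R = c_s*(0.87+1.12v)/(1+v)."""
    c_s = c_r * (1 + poisson) / (0.87 + 1.12 * poisson)
    shear_modulus = density * c_s ** 2
    return 2 * shear_modulus * (1 + poisson)

c = phase_velocity(freq_hz=2000.0, delta_phase_rad=2.5, delta_x_m=1e-3)
E = youngs_modulus_from_rayleigh(c)
print(f"phase velocity: {c:.1f} m/s, Young's modulus: {E / 1e3:.0f} kPa")
```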

  5. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Pavlov, Konstantin A.

    1997-01-10

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing.

  6. Improved methods for capture, extraction, and quantitative assay of environmental DNA from Asian bigheaded carp (Hypophthalmichthys spp.).

    PubMed

    Turner, Cameron R; Miller, Derryl J; Coyne, Kathryn J; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species.

  8. A simple regression-based method to map quantitative trait loci underlying function-valued phenotypes.

    PubMed

    Kwak, Il-Youp; Moore, Candace R; Spalding, Edgar P; Broman, Karl W

    2014-08-01

    Most statistical methods for quantitative trait loci (QTL) mapping focus on a single phenotype. However, multiple phenotypes are commonly measured, and recent technological advances have greatly simplified the automated acquisition of numerous phenotypes, including function-valued phenotypes, such as growth measured over time. While methods exist for QTL mapping with function-valued phenotypes, they are generally computationally intensive and focus on single-QTL models. We propose two simple, fast methods that maintain high power and precision and are amenable to extensions with multiple-QTL models using a penalized likelihood approach. After identifying multiple QTL by these approaches, we can view the function-valued QTL effects to provide a deeper understanding of the underlying processes. Our methods have been implemented as a package for R, funqtl. PMID:24931408

  9. [Application and Integration of Qualitative and Quantitative Research Methods in Intervention Studies in Rehabilitation Research].

    PubMed

    Wirtz, M A; Strohmer, J

    2016-06-01

    In order to develop and evaluate interventions in rehabilitation research a wide range of empirical research methods may be adopted. Qualitative research methods emphasize the relevance of an open research focus and a natural proximity to research objects. Accordingly, using qualitative methods special benefits may arise if researchers strive to identify and organize unknown information aspects (inductive purpose). Particularly, quantitative research methods require a high degree of standardization and transparency of the research process. Furthermore, a clear definition of efficacy and effectiveness exists (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes and quality criteria. Hence, specific aspects have to be regarded if researchers aim to select or combine those approaches in order to ensure an optimal gain in knowledge.

  11. The estimation of the measurement results with using statistical methods

    NASA Astrophysics Data System (ADS)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A number of international standards and guides describe various statistical methods that are applied for the management, control and improvement of processes, with the purpose of analyzing technical measurement results. An analysis of these international standards and guides on statistical methods for the estimation of measurement results, together with recommendations for their application in laboratories, is presented. To carry out the analysis of the standards and guides, cause-and-effect Ishikawa diagrams concerning the application of statistical methods for the estimation of measurement results were constructed.

  12. A powerful latent variable method for detecting and characterizing gene-based gene-gene interaction on multiple quantitative traits

    PubMed Central

    2013-01-01

    Background When thinking quantitatively about complex diseases, there are at least three statistical strategies for analyzing gene-gene interaction: SNP-by-SNP interaction on a single trait, gene-gene interaction (where each gene can involve multiple SNPs) on a single trait, and gene-gene interaction on multiple traits. The third is the most general for dissecting the genetic mechanism underlying complex diseases that involve multiple quantitative traits. In this paper, we developed a novel statistic for this strategy by modifying Partial Least Squares Path Modeling (PLSPM), called the mPLSPM statistic. Results Simulation studies indicated that the mPLSPM statistic was powerful and outperformed the principal component analysis (PCA)-based linear regression method. Application to real data in the EPIC-Norfolk GWAS sub-cohort showed suggestive interaction (γ) between the TMEM18 and BDNF genes on two composite body shape scores (γ = 0.047 and γ = 0.058, with P = 0.021 and P = 0.005) and on BMI (γ = 0.043, P = 0.034). This suggested that these scores (synthetic latent traits) were more suitable than single traits for capturing the obesity-related genetic interaction effect between genes. Conclusions The proposed mPLSPM statistic is a valid and powerful gene-based method for detecting gene-gene interaction on multiple quantitative phenotypes. PMID:24059907

  13. A method of quantitative characterization for the component of C/C composites based on the PLM video

    NASA Astrophysics Data System (ADS)

    Li, Y. X.; Qi, L. H.; Song, Y. S.; Li, H. J.

    2016-07-01

    PLM video is used for studying the microstructure of C/C composites because it contains structure and motion information at the same time. This means that PLM video can provide more comprehensive microstructure features of C/C composites, which can then be quantitatively characterized by image processing. However, several unavoidable displacements, introduced during image acquisition, still exist in the PLM video. Therefore, an image registration method based on phase correlation was put forward to correct the displacements, and quantitative characterization of the components was achieved by combining image fusion and threshold segmentation based on the PLM video of C/C composites. Specifically, the PLM video was first decomposed into a frame sequence. A series of processing steps was then carried out, including selecting frames at equal intervals, segmenting the static and dynamic regions, and correcting the relative displacements between adjacent frames. The result of the image registration was verified through image fusion, indicating that the proposed method eliminates the displacements effectively. Finally, image processing operations were used to segment the components and calculate their fractions, so that quantitative characterization was achieved.
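
    The registration step rests on standard phase correlation: the normalized cross-power spectrum of two frames has a sharp inverse-FFT peak at their relative displacement. The sketch below demonstrates this on a synthetic pair of frames; it is a generic implementation, not the authors' code.

```python
import numpy as np

def phase_correlation_shift(frame_a, frame_b):
    """Return the (row, col) shift d such that frame_b ~ np.roll(frame_a, d)."""
    FA = np.fft.fft2(frame_a)
    FB = np.fft.fft2(frame_b)
    cross_power = FB * np.conj(FA)
    cross_power /= np.abs(cross_power) + 1e-12        # keep phase only
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(0)
frame_a = rng.random((128, 128))
frame_b = np.roll(frame_a, shift=(5, -3), axis=(0, 1))   # known displacement
print(phase_correlation_shift(frame_a, frame_b))          # -> (5, -3)
```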

  14. Multiresidue method for the quantitation of 20 pesticides in aquatic products.

    PubMed

    Cho, Ha Ra; Park, Jun Seo; Kim, Junghyun; Han, Sang Beom; Choi, Yong Seok

    2015-12-01

    As the consumption of aquatic products increased, the need for regulation of pesticide residues in aquatic products also emerged. Thus, in this study, a scheduled multiple reaction monitoring (sMRM) method employing a novel extraction and purification step based on QuEChERS with EDTA was developed for the simultaneous quantitation of 20 pesticides (alachlor, aldicarb, carbofuran, diazinon, dimethoate, dimethomorph, ethoprophos, ferimzone, fluridone, hexaconazole, iprobenfos, malathion, methidathion, methiocarb, phenthoate, phosalone, phosmet, phosphamidon, pirimicarb, and simazine) in aquatic products. Additionally, the present method was validated with respect to specificity, linearity (r ≥ 0.980), sensitivity (limit of quantitation (LOQ) ≤ 5 ng/g), relative standard deviation (1.0% ≤ RSD ≤ 19.4%), and recovery (60.1% ≤ recovery ≤ 117.9%). Finally, the validated method was applied for the determination of the 20 pesticide residues in eel and shrimp purchased from local food markets. In the present study, QuEChERS with EDTA was successfully expanded to residual pesticide analysis for the first time. The present method could contribute to the rapid and successful establishment of the positive list system in South Korea. PMID:26466578

  15. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, the proper use of statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analyses of clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences was discussed based on data from RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  16. The quantitative and qualitative recovery of Campylobacter from raw poultry using USDA and Health Canada methods.

    PubMed

    Sproston, E L; Carrillo, C D; Boulter-Bitzer, J

    2014-12-01

    Harmonisation of methods between Canadian government agencies is essential to accurately assess and compare the prevalence and concentrations of Campylobacter present on retail poultry intended for human consumption. The standard qualitative procedure used by Health Canada differs from that used by the USDA for both quantitative and qualitative methods. A comparison of three methods was performed on raw poultry samples obtained from an abattoir to determine if one method is superior to the others in isolating Campylobacter from chicken carcass rinses. The average percentages of positive samples were 34.72% (95% CI, 29.2-40.2), 39.24% (95% CI, 33.6-44.9) and 39.93% (95% CI, 34.3-45.6) for the US direct plating method, the US enrichment method and the Health Canada enrichment method, respectively. Overall there were significant differences when comparing either of the enrichment methods to the direct plating method using McNemar's chi-squared test. On comparison of weekly data (Fisher's exact test), direct plating was inferior to the enrichment methods on only a single occasion. Direct plating is important for enumeration and establishing the concentration of Campylobacter present on raw poultry. However, enrichment methods are also vital to identify positive samples where concentrations are below the detection limit for direct plating.

  17. Comparison of two quantitative fit-test methods using N95 filtering facepiece respirators.

    PubMed

    Sietsema, Margaret; Brosseau, Lisa M

    2016-08-01

    Current regulations require annual fit testing before an employee can wear a respirator during work activities. The goal of this research is to determine whether respirator fit measured with two TSI Portacount instruments simultaneously sampling ambient particle concentrations inside and outside of the respirator facepiece is similar to fit measured during an ambient aerosol condensation nuclei counter quantitative fit test. Sixteen subjects (ten female; six male) were recruited for a range of facial sizes. Each subject donned an N95 filtering facepiece respirator, completed two fit tests in random order (ambient aerosol condensation nuclei counter quantitative fit test and two-instrument real-time fit test) without removing or adjusting the respirator between tests. Fit tests were compared using Spearman's rank correlation coefficients. The real-time two-instrument method fit factors were similar to those measured with the single-instrument quantitative fit test. The first four exercises were highly correlated (r > 0.7) between the two protocols. Respirator fit was altered during the talking or grimace exercise, both of which involve facial movements that could dislodge the facepiece. Our analyses suggest that the new real-time two-instrument methodology can be used in future studies to evaluate fit before and during work activities.

  18. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Atencio, J.D.

    1982-03-31

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify (233)U, (235)U and (239)Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as (240)Pu, (244)Cm and (252)Cf, and the spontaneous alpha particle emitter (241)Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether permanent low-level burial is appropriate for the waste sample.

  19. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Atencio, James D.

    1984-01-01

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify (233)U, (235)U and (239)Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as (240)Pu, (244)Cm and (252)Cf, and the spontaneous alpha particle emitter (241)Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.

  1. Summary of the workshop on issues in risk assessment: quantitative methods for developmental toxicology.

    PubMed

    Mattison, D R; Sandler, J D

    1994-08-01

    This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Products Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.

  2. Broad-spectrum detection and quantitation methods of Soil-borne cereal mosaic virus isolates.

    PubMed

    Vaïanopoulos, Céline; Legrève, Anne; Moreau, Virginie; Bragard, Claude

    2009-08-01

    A broad-spectrum reverse transcription-polymerase chain reaction (RT-PCR) protocol was developed for detecting Soil-borne cereal mosaic virus (SBCMV) isolates, responsible for mosaic diseases in Europe, using primers targeting the highly conserved 3'-untranslated region of RNA-1 and RNA-2 of SBCMV. The 3'-end region is a privileged target for the detection of a wide range of isolates because of its sequence conservation, its tRNA-like structure, its major role in viral replication, and the signal amplification provided by the presence of numerous genomic and subgenomic RNAs. The primers were also designed for virus quantitation using real-time RT-PCR with SYBR-Green chemistry. No cross-reaction with Wheat spindle streak mosaic virus, frequently associated with SBCMV, was observed. The use of RT-PCR and real-time quantitative RT-PCR allowed more sensitive detection and quantitation of SBCMV than ELISA. The methods enabled European isolates of SBCMV from Belgium, France, Germany, Italy and the UK to be detected and quantified. Real-time RT-PCR represents a new tool for comparing soil inoculum potential as well as cultivar resistance to SBCMV.

  3. Comparative Evaluation of Four Real-Time PCR Methods for the Quantitative Detection of Epstein-Barr Virus from Whole Blood Specimens.

    PubMed

    Buelow, Daelynn; Sun, Yilun; Tang, Li; Gu, Zhengming; Pounds, Stanley; Hayden, Randall

    2016-07-01

    Monitoring of Epstein-Barr virus (EBV) load in immunocompromised patients has become integral to their care. An increasing number of reagents are available for quantitative detection of EBV; however, there are few published comparative data. Four real-time PCR systems (one using laboratory-developed reagents and three using analyte-specific reagents) were compared with one another for detection of EBV from whole blood. Whole blood specimens seeded with EBV were used to determine quantitative linearity, analytical measurement range, lower limit of detection, and coefficient of variation (CV) for each assay. Retrospective testing of 198 clinical samples was performed in parallel with all methods; results were compared to determine relative quantitative and qualitative performance. All assays showed similar performance. No significant difference was found in limit of detection (3.12-3.49 log10 copies/mL; P = 0.37). A strong qualitative correlation was seen with all assays that used clinical samples (positive detection rates of 89.5%-95.8%). Quantitative correlation of clinical samples across assays was also seen in pairwise regression analysis, with R(2) ranging from 0.83 to 0.95. Normalizing clinical sample results to IU/mL did not alter the quantitative correlation between assays. Quantitative EBV detection by real-time PCR can be performed over a wide linear dynamic range, using three different commercially available reagents and laboratory-developed methods. EBV was detected with comparable sensitivity and quantitative correlation for all assays.

  4. Modeling Bone Surface Morphology: A Fully Quantitative Method for Age-at-Death Estimation Using the Pubic Symphysis.

    PubMed

    Slice, Dennis E; Algee-Hewitt, Bridget F B

    2015-07-01

    The pubic symphysis is widely used in age estimation for the adult skeleton. Standard practice requires the visual comparison of surface morphology against criteria representing predefined phases and the estimation of case-specific age from an age range associated with the chosen phase. Known problems of method and observer error necessitate alternative tools to quantify age-related change in pubic morphology. This paper presents an objective, fully quantitative method for estimating age-at-death from the skeleton, which exploits a variance-based score of surface complexity computed from vertices obtained from a scanner sampling the pubic symphysis. For laser scans from 41 modern American male skeletons, this method produces results that are significantly associated with known age-at-death (RMSE = 17.15 years). Chronological age is therefore predicted at least as well with this robust, objective, and fully quantitative method as with prevailing phase-aging systems. This method contributes to forensic casework by responding to medico-legal expectations for evidence standards.
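
    One simple variance-based surface score is the variance of vertex residuals about a best-fit plane, which decreases as the billowy young symphyseal face flattens with age. The sketch below computes such a score on synthetic point clouds; whether this matches the authors' exact score is not asserted, and the data are invented.

```python
# Schematic only: a simple variance-based surface score (variance of
# residuals about a best-fit plane); not necessarily the authors' exact score.
import numpy as np

def plane_residual_variance(vertices):
    """vertices: (n, 3) array of scanned surface points."""
    centered = vertices - vertices.mean(axis=0)
    # The plane normal is the direction of least variance (last right
    # singular vector); residuals are projections onto that normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    residuals = centered @ vt[-1]
    return residuals.var()

# Synthetic demonstration: a "young", billowy surface vs. an "old", flat one
rng = np.random.default_rng(0)
x, y = rng.uniform(-1, 1, (2, 2000))
young = np.column_stack([x, y, 0.3 * np.sin(4 * x) + 0.02 * rng.normal(size=x.size)])
old = np.column_stack([x, y, 0.02 * rng.normal(size=x.size)])
print(f"young-like score: {plane_residual_variance(young):.4f}")
print(f"old-like score:   {plane_residual_variance(old):.4f}")
```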

  5. Exploring the use of storytelling in quantitative research fields using a multiple case study method

    NASA Astrophysics Data System (ADS)

    Matthews, Lori N. Hamlet

    The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work, and my personal experience with the subject matter was also used as a source of data according to the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight about why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes that emerged from this study and were consistent with the existing literature are: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts, and topics for further research are presented as well.

  6. Immunologic methods for quantitative estimation of small peptides and their application to bradykinin.

    PubMed

    Anumula, K R; Schulz, R; Back, N

    1990-12-31

    A simple strategy was developed for the immunologic quantitative determination of small, biologically active peptides utilizing bradykinin (BK) as the model peptide prototype. Methods were developed for the preparation of a peptide-carrier complex suitable for immunization and for immobilization of peptides onto the plastic surface of enzyme-linked immunosorbent assay (ELISA) plates. An avidin-bound biotinylated peptide complex was used for raising peptide antibodies with high titers (1:4000) in the rabbit. The peptide BK was coupled to synthetic polymeric carriers poly-D-lysine (PL) and poly-D-lysine-succinylated (PLS) via the BK carboxy and amino terminus, respectively, with the aid of a water soluble carbodiimide. These carriers with antigen peptide side chains as well as avidin-biotinyl-peptide complexes were efficient surface immobilizing reagents for microwell plastic plates used in the detection of kinins by ELISA. Monoclonal antibodies reacted competitively with kinins in plates coated with either PL-BK or PLS-BK. In contrast, rabbit (polyclonal) antibodies reacted specifically in the plates coated with PLS-BK but only a non-specific reaction could be obtained with the PL-BK coated plates (i.e., could not be displaced with BK). Based on results using synthetic BK analogues, the carboxy terminal half of the BK molecule appears to be the stronger antigenic determinant in both mouse and rabbit systems. The polyclonal antibodies demonstrated a greater affinity to bradykinin compared to the monoclonal antibodies. Their use improved the sensitivity of the ELISA for kinin determination by one order of magnitude. Kinin levels determined in plasma tryptic digests by ELISA with the polyclonal antibodies and PLS-BK system were in agreement with published values.

  7. Development and application of quantitative methods for monitoring dermal and inhalation exposure to propiconazole.

    PubMed

    Flack, Sheila; Goktepe, Ipek; Ball, Louise M; Nylander-French, Leena A

    2008-03-01

    Quantitative methods to measure dermal and inhalation exposure to the fungicide propiconazole were developed in the laboratory and applied in the occupational exposure setting for monitoring five farm workers' exposure during pesticide preparation and application to peach crops. Dermal exposure was measured with tape-strips applied to the skin, and the amount of propiconazole was normalized to keratin content in the tape-strip. Inhalation exposure was measured with an OVS tube placed in the worker's breathing-zone during pesticide handling. Samples were analyzed by GC-MS in EI+ mode (limit of detection 6 pg microl(-1)). Dermal exposure ranged from non-detectable to 32.1 +/- 22.6 ng per microg keratin while breathing-zone concentrations varied from 0.2 to 2.2 microg m(-3). A positive correlation was observed between breathing-zone concentrations and ambient air temperature (r2 = 0.87, p < 0.01). Breathing-zone concentrations did not correlate with dermal exposure levels (r2 = 0.11, p = 0.52). Propiconazole levels were below limit of detection when rubber gloves, coveralls, and full-face mask were used. The total-body propiconazole dose, determined for each worker by summing the estimated dermal dose and inhalation dose, ranged from 0.01 to 12 microg per kg body weight per day. Our results show that tape-stripping of the skin and the OVS can be effectively utilized to measure dermal and inhalation exposure to propiconazole, respectively, and that the dermal route of exposure contributed substantially more to the total dose than the inhalation route.
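
    A minimal sketch of the final dose aggregation described above, assuming a fixed breathing rate and a dermal dose already inferred from the tape-strip measurements. All numbers, the breathing-rate default, and the absorption assumptions are hypothetical placeholders rather than the authors' exposure model.

```python
# Illustrative (hypothetical) total-dose aggregation: an inhalation estimate
# from a breathing-zone concentration plus a dermal estimate, both normalized
# to body weight. Values are placeholders, not measurements from the study.
breathing_zone_ug_m3 = 1.5      # measured air concentration, ug/m3
inhalation_rate_m3_h = 1.25     # assumed moderate-work breathing rate
exposure_hours = 4.0
dermal_dose_ug = 20.0           # assumed dose inferred from tape-strip data
body_weight_kg = 70.0

inhalation_dose_ug = breathing_zone_ug_m3 * inhalation_rate_m3_h * exposure_hours
total_dose_ug_per_kg = (inhalation_dose_ug + dermal_dose_ug) / body_weight_kg
print(f"total dose = {total_dose_ug_per_kg:.2f} ug propiconazole per kg bw per day")
```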

  8. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR is its accurate function with external calibration, the lack of any requirement for identical reference materials, a high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
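
    As an illustration of the kind of calculation qHNMR rests on, the sketch below implements the common internal-calibration relation between signal integrals, proton counts, molar masses, and calibrant mass. The function and example values are illustrative assumptions; the review itself covers several calibration variants, including external calibration.

```python
def qhnmr_mass(i_analyte, i_cal, n_analyte, n_cal, m_analyte, m_cal, mass_cal_mg):
    """Mass of analyte from signal integrals (i), proton counts (n) of the
    quantified signals, molar masses (m, g/mol), and the weighed calibrant
    mass (mg). Standard internal-calibration qHNMR relation; external
    calibration adds correction terms that are omitted here."""
    return (i_analyte / i_cal) * (n_cal / n_analyte) * (m_analyte / m_cal) * mass_cal_mg

# Hypothetical example: a natural product quantified against dimethyl sulfone.
print(f"{qhnmr_mass(0.31, 1.00, 1, 6, 354.4, 94.13, 2.50):.1f} mg analyte")
```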

  9. A comparison of quantitative methods for clinical imaging with hyperpolarized 13C‐pyruvate

    PubMed Central

    Daniels, Charlie J.; McLean, Mary A.; Schulte, Rolf F.; Robb, Fraser J.; Gill, Andrew B.; McGlashan, Nicholas; Graves, Martin J.; Schwaiger, Markus; Lomas, David J.; Brindle, Kevin M.

    2016-01-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized 13C‐labelled molecules, such as the conversion of [1‐13C]pyruvate to [1‐13C]lactate, to be dynamically and non‐invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model‐free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two‐way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time‐to‐peak and the lactate‐to‐pyruvate area under the curve ratio were simple model‐free approaches that accurately represented the full reaction, with the time‐to‐peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized 13C data. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd. PMID:27414749

  10. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    PubMed

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data.
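
    A minimal sketch of the two model-free measures named above (lactate time-to-peak and the lactate-to-pyruvate area-under-the-curve ratio), computed on hypothetical dynamic curves. The signal shapes and timing are invented for illustration and do not reproduce the study's acquisitions.

```python
import numpy as np

# Model-free metrics on hypothetical dynamic signal curves.
t = np.arange(0.0, 60.0, 3.0)                    # s, acquisition times
pyruvate = 100.0 * np.exp(-t / 30.0)             # toy pyruvate signal
lactate = 40.0 * (t / 20.0) * np.exp(-t / 20.0)  # toy lactate signal

def auc(y, x):
    # Trapezoidal area under the curve.
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

time_to_peak = t[np.argmax(lactate)]
ratio = auc(lactate, t) / auc(pyruvate, t)
print(f"lactate time-to-peak = {time_to_peak:.0f} s, lactate/pyruvate AUC ratio = {ratio:.2f}")
```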

  11. Development and evaluation of event-specific quantitative PCR method for genetically modified soybean A2704-12.

    PubMed

    Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, A2704-12. During the plant transformation, DNA fragments derived from pUC19 plasmid were integrated in A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared with both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we then constructed a plasmid using pBR322. The conversion factor (C(f)), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined C(f) values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSD(R)), and the determined bias and RSD(R) values for the method were each less than 20%. These results suggest that the developed method would be suitable for practical analyses for the detection and quantification of A2704-12.

  12. Development and evaluation of event-specific quantitative PCR method for genetically modified soybean A2704-12.

    PubMed

    Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, A2704-12. During the plant transformation, DNA fragments derived from pUC19 plasmid were integrated in A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared with both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we then constructed a plasmid using pBR322. The conversion factor (C(f)), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined C(f) values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSD(R)), and the determined bias and RSD(R) values for the method were each less than 20%. These results suggest that the developed method would be suitable for practical analyses for the detection and quantification of A2704-12. PMID:21515963
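
    The sketch below illustrates how a conversion factor such as the reported C(f) = 0.98 is typically applied: the event-specific to taxon-specific copy-number ratio measured by real-time PCR is divided by C(f) to give a GMO percentage. The copy numbers and the exact formula are assumptions for illustration, not the paper's worked data.

```python
def gmo_percent(event_copies: float, taxon_copies: float, cf: float = 0.98) -> float:
    """GMO amount (%) from event-specific and taxon-specific copy numbers.

    Follows the usual real-time PCR scheme in which the measured copy-number
    ratio is divided by an experimentally determined conversion factor (Cf);
    Cf = 0.98 is the value reported above, the copy numbers are hypothetical."""
    return (event_copies / taxon_copies) / cf * 100.0

print(f"estimated GMO content: {gmo_percent(52, 10500):.2f} %")
```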

  13. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, Leon J.; Cremers, David A.

    1985-01-01

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is demonstrated. Analysis time is significantly shortened compared with the usual chemical techniques of analysis.

  14. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, L.J.; Cremers, D.A.

    1982-09-07

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids are described. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is shown. Analysis time is significantly shortened compared with the usual chemical techniques of analysis.

  15. Quantitative methods for the analysis of protein phosphorylation in drug development.

    PubMed

    Olive, D Michael

    2004-10-01

    Most signal transduction and cell signaling pathways are mediated by protein kinases. Protein kinases have emerged as important cellular regulatory proteins in many aspects of neoplasia. Protein kinase inhibitors offer the opportunity to target diseases such as cancer with chemotherapeutic agents specific for the causative molecular defect. In order to identify possible targets and assess kinase inhibitors, quantitative methods for analyzing protein phosphorylation have been developed. This review examines some of the current formats used for quantifying kinase function for drug development. PMID:15966829

  16. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part I.

    PubMed

    Tavakol, Mohsen; Sandars, John

    2014-09-01

    Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.

  17. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part II.

    PubMed

    Tavakol, Mohsen; Sandars, John

    2014-10-01

    Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.

  18. Simple methods for the qualitative identification and quantitative determination of macrolide antibiotics.

    PubMed

    Danielson, N D; Holeman, J A; Bristol, D C; Kirzner, D H

    1993-02-01

    Pyrolysis-gas chromatography is shown to be a rapid straightforward method for the qualitative differentiation of the macrolide antibiotics erythromycin, oleandomycin, troleandomycin, spiramycin and tylosin. Organic salts do not interfere and identification of erythromycin and troleandomycin in commercial products is viable. Spectrophotometric quantitation of these same five antibiotics after reaction with concentrated sulphuric acid is studied at about 470 nm. Reaction conditions such as acid concentration, time and temperature are provided. The sugar moieties of the antibiotics are proposed as the reactive sites. Detection limits are about 0.2-1.0 microg ml-1 [corrected] and analysis of pharmaceutical products should be possible.

  19. A Method to Prioritize Quantitative Traits and Individuals for Sequencing in Family-Based Studies

    PubMed Central

    Shah, Kaanan P.; Douglas, Julie A.

    2013-01-01

    Owing to recent advances in DNA sequencing, it is now technically feasible to evaluate the contribution of rare variation to complex traits and diseases. However, it is still cost prohibitive to sequence the whole genome (or exome) of all individuals in each study. For quantitative traits, one strategy to reduce cost is to sequence individuals in the tails of the trait distribution. However, the next challenge becomes how to prioritize traits and individuals for sequencing since individuals are often characterized for dozens of medically relevant traits. In this article, we describe a new method, the Rare Variant Kinship Test (RVKT), which leverages relationship information in family-based studies to identify quantitative traits that are likely influenced by rare variants. Conditional on nuclear families and extended pedigrees, we evaluate the power of the RVKT via simulation. Not unexpectedly, the power of our method depends strongly on effect size, and to a lesser extent, on the frequency of the rare variant and the number and type of relationships in the sample. As an illustration, we also apply our method to data from two genetic studies in the Old Order Amish, a founder population with extensive genealogical records. Remarkably, we implicate the presence of a rare variant that lowers fasting triglyceride levels in the Heredity and Phenotype Intervention (HAPI) Heart study (p = 0.044), consistent with the presence of a previously identified null mutation in the APOC3 gene that lowers fasting triglyceride levels in HAPI Heart study participants. PMID:23626830

  20. A quantitative solid-state Raman spectroscopic method for control of fungicides.

    PubMed

    Ivanova, Bojidarka; Spiteller, Michael

    2012-07-21

    A new analytical procedure using solid-state Raman spectroscopy within the THz-region for the quantitative determination of mixtures of different conformations of trifloxystrobin (EE, EZ, ZE and ZZ), tebuconazole (1), and propiconazole (2) as an effective method for the fungicide product quality monitoring programmes and control has been developed and validated. The obtained quantities were controlled independently by the validated hybrid HPLC electrospray ionization (ESI) tandem mass spectrometric (MS) and matrix-assisted laser desorption/ionization (MALDI) MS methods in the condensed phase. The quantitative dependences were obtained on the twenty binary mixtures of the analytes and were further tested on the three trade fungicide products, containing mixtures of trifloxystrobin-tebuconazole and trifloxystrobin-propiconazole, as an emulsifiable concentrate or water-soluble granules of the active ingredients. The present methods provided sufficient sensitivity as reflected by the metrologic quantities, evaluating the concentration limit of detection (LOD) and quantification (LOQ), linear limit (LL), measurement accuracy and precision, true quantity value, trueness of measurement and more. PMID:22679621
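
    The abstract lists LOD and LOQ among the reported metrological quantities without giving the estimator used; the sketch below shows one common ICH-style estimate from the residual standard deviation and slope of a calibration line, with hypothetical concentrations and band areas.

```python
# Common ICH-style estimates of detection and quantitation limits from a
# calibration line; shown for illustration only (the paper does not state
# which of the accepted LOD/LOQ estimators it used). Values are hypothetical.
import numpy as np
from scipy.stats import linregress

conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])           # e.g. mg/mL
signal = np.array([12.1, 24.8, 49.5, 98.0, 199.3])   # e.g. Raman band area

fit = linregress(conc, signal)
residual_sd = np.std(signal - (fit.intercept + fit.slope * conc), ddof=2)
print(f"LOD = {3.3 * residual_sd / fit.slope:.3f}, LOQ = {10 * residual_sd / fit.slope:.3f}")
```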

  1. Quantitative imaging of volcanic plumes — Results, needs, and future trends

    USGS Publications Warehouse

    Platt, Ulrich; Lübcke, Peter; Kuhn, Jonas; Bobrowski, Nicole; Prata, Fred; Burton, Mike; Kern, Christoph

    2015-01-01

    Recent technology allows two-dimensional “imaging” of trace gas distributions in plumes. In contrast to older, one-dimensional remote sensing techniques, which are only capable of measuring total column densities, the new imaging methods give insight into details of transport and mixing processes as well as chemical transformation within plumes. We give an overview of gas imaging techniques already being applied at volcanoes (SO2 cameras, imaging DOAS, FT-IR imaging), present techniques where first field experiments were conducted (LED-LIDAR, tomographic mapping), and describe some techniques where only theoretical studies with application to volcanology exist (e.g. Fabry–Pérot Imaging, Gas Correlation Spectroscopy, bi-static LIDAR). Finally, we discuss current needs and future trends in imaging technology.

  2. A quantitative autoradiographic method for the measurement of local rates of brain protein synthesis

    SciTech Connect

    Dwyer, B.E.; Donatoni, P.; Wasterlain, C.G.

    1982-05-01

    We have developed a new method for measuring local rates of brain protein synthesis in vivo. It combines the intraperitoneal injection of a large dose of low specific activity amino acid with quantitative autoradiography. This method has several advantages: 1) It is ideally suited for young or small animals or where immobilizing an animal is undesirable. 2) The amino acid injection "floods" amino acid pools so that errors in estimating precursor specific activity, which is especially important in pathological conditions, are minimized. 3) The method provides for the use of a radioautographic internal standard in which valine incorporation is measured directly. Internal standards from experimental animals correct for tissue protein content and self-absorption of radiation in tissue sections which could vary under experimental conditions.

  3. [Quantitative determination of morphine in opium powder by addition and correlation method using capillary electrophoresis].

    PubMed

    Sun, Guo-xiang; Miao, Ju-ru; Wang, Yu; Sun, Yu-qing

    2002-01-01

    The morphine in opium powder has been quantitatively determined by the addition and correlation method (ACM), in which capillary zone electrophoresis was applied, and the average recovery was 100.6%. The relative standard deviation (RSD) of migration time was not more than 2.4%, the RSD of relative migration time was not more than 1.1%, and the RSD of the relative area was not more than 0.51%. For comparison, the determination was also performed by the calibration curve method with an internal standard. The content of morphine in opium powder determined by ACM was the same as that determined by the calibration curve method with an internal standard. The study shows that ACM is simple, quick and accurate.

  4. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Davis, H. B.

    2015-12-01

    The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method--the Quantitative Collaborative Impact Analysis Method--how it works, and the successes we've had with it in the NASA Astrobiology education community.

  5. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by spectroscopic techniques such as 1H NMR and IR, and by physicochemical studies using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for the quantitative HPLC determination. The impurities were detected by a newly developed gradient, reverse-phase high-performance liquid chromatographic (HPLC) method. The system suitability of HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions to demonstrate the power of the newly developed HPLC method.

  6. A Pyrosequencing Assay for the Quantitative Methylation Analysis of GALR1 in Endometrial Samples: Preliminary Results

    PubMed Central

    Kottaridi, Christine; Koureas, Nikolaos; Margari, Niki; Terzakis, Emmanouil; Bilirakis, Evripidis; Pappas, Asimakis; Chrelias, Charalampos; Spathis, Aris; Aga, Evangelia; Pouliakis, Abraham; Panayiotides, Ioannis; Karakitsos, Petros

    2015-01-01

    Endometrial cancer is the most common malignancy of the female genital tract while aberrant DNA methylation seems to play a critical role in endometrial carcinogenesis. Galanin's expression has been involved in many cancers. We developed a new pyrosequencing assay that quantifies DNA methylation of galanin's receptor-1 (GALR1). In this study, the preliminary results indicate that pyrosequencing methylation analysis of the GALR1 promoter can be a useful ancillary marker to cytology, as it can successfully predict the histological status. This marker has the potential to lead towards better management of women with endometrial lesions and eventually reduce unnecessary interventions. In addition it can provide early warning for women with a negative cytological result. PMID:26504828

  7. A method for estimating the effective number of loci affecting a quantitative character.

    PubMed

    Slatkin, Montgomery

    2013-11-01

    A likelihood method is introduced that jointly estimates the number of loci and the additive effect of alleles that account for the genetic variance of a normally distributed quantitative character in a randomly mating population. The method assumes that measurements of the character are available from one or both parents and an arbitrary number of full siblings. The method uses the fact, first recognized by Karl Pearson in 1904, that the variance of a character among offspring depends on both the parental phenotypes and on the number of loci. Simulations show that the method performs well provided that data from a sufficient number of families (on the order of thousands) are available. This method assumes that the loci are in Hardy-Weinberg and linkage equilibrium but does not assume anything about the linkage relationships. It performs equally well if all loci are on the same non-recombining chromosome provided they are in linkage equilibrium. The method can be adapted to take account of loci already identified as being associated with the character of interest. In that case, the method estimates the number of loci not already known to affect the character. The method applied to measurements of crown-rump length in 281 family trios in a captive colony of African green monkeys (Chlorocebus aethiops sabaeus) estimates the number of loci to be 112 and the additive effect to be 0.26 cm. A parametric bootstrap analysis shows that a rough confidence interval has a lower bound of 14 loci. PMID:23973416
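
    The sketch below is a toy simulation of the Pearson observation the method builds on: with the additive genetic variance held fixed, the trait variance among full siblings of extreme-phenotype parents depends on the number of underlying loci. It is an illustration of the principle only, not the likelihood estimator described in the paper; the model (biallelic loci, allele frequency 0.5, purely additive effects) and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_sib_variance(n_loci, v_additive=1.0, n_families=20000, n_sibs=6):
    """Mean within-family trait variance among sibs whose parents both fall in
    the top decile of the trait, under a toy additive model."""
    a = np.sqrt(2.0 * v_additive / n_loci)             # per-allele effect, freq 0.5
    mothers = rng.integers(0, 2, (n_families, n_loci, 2))
    fathers = rng.integers(0, 2, (n_families, n_loci, 2))
    pheno_m = a * mothers.sum(axis=(1, 2))
    pheno_f = a * fathers.sum(axis=(1, 2))
    cut = np.quantile(pheno_m, 0.9)                     # keep "extreme" families only
    keep = (pheno_m >= cut) & (pheno_f >= cut)
    mothers, fathers = mothers[keep], fathers[keep]

    def gamete(parents):
        # Each offspring receives one randomly chosen allele per locus.
        pick = rng.integers(0, 2, parents.shape[:2])
        return np.take_along_axis(parents, pick[..., None], axis=2)[..., 0]

    sibs = np.stack([a * (gamete(mothers) + gamete(fathers)).sum(axis=1)
                     for _ in range(n_sibs)], axis=1)
    return sibs.var(axis=1, ddof=1).mean()

for n in (2, 10, 100):
    print(f"{n:3d} loci -> mean within-family variance {mean_sib_variance(n):.3f}")
```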

  8. Validation of the method of quantitative phase analysis by X-ray diffraction in API: case of Tibolone

    NASA Astrophysics Data System (ADS)

    Silva, R. P.; Ambrósio, M. F. S.; Epprecht, E. K.; Avillez, R. R.; Achete, C. A.; Kuznetsov, A.; Visentin, L. C.

    2016-07-01

    In this study, different structural and microstructural models applied to X-ray analysis of powder diffraction data of polymorphic mixtures of known concentrations of Tibolone were investigated. The X-ray data obtained in different diffraction instruments were analysed via the Rietveld method using the same analytical models. The results of quantitative phase analysis show that regardless of the instrument used, the values of the calculated concentrations follow the same systematics with respect to the final errors. The strategy to select a specific analytical model that leads to lower measurement errors is here presented.

  9. Perception of mobbing during the study: results of a national quantitative research among Slovenian midwifery students.

    PubMed

    Došler, Anita Jug; Skubic, Metka; Mivšek, Ana Polona

    2014-09-01

    Mobbing, defined as sustained harassment among workers, in particular towards subordinates, merits investigation. This study aims to investigate Slovenian midwifery students' (2nd and 3rd year students of midwifery at the Faculty for Health Studies Ljubljana; the single educational institution for midwives in Slovenia) perception of mobbing, since management of acceptable behavioural interrelationships in the midwifery profession forms already during the study, through professional socialization. A descriptive, causal non-experimental method with a questionnaire was used. Basic descriptive statistics and measures for calculating statistical significance were carried out with SPSS software, version 20.0. All necessary ethical measures were taken into consideration during the study to protect participants. The results revealed that several participants experienced mobbing during the study (82.3%); 58.8% of them during their practical training and 23.5% from midwifery teachers. Students are often anxious and nervous in the face of clinical settings (60.8%) or before faculty commitments (exams, presentations etc.) (41.2%). Many of them (40.4%) estimate that mobbing affected their health. They did not show effective strategies to solve relationship problems. According to the findings, everyone involved in midwifery education, but above all students, should be provided with more knowledge and skills on successful management of conflict situations. PMID:25420387

  10. Perception of mobbing during the study: results of a national quantitative research among Slovenian midwifery students.

    PubMed

    Došler, Anita Jug; Skubic, Metka; Mivšek, Ana Polona

    2014-09-01

    Mobbing, defined as sustained harassment among workers, in particular towards subordinates, merits investigation. This study aims to investigate Slovenian midwifery students' (2nd and 3rd year students of midwifery at the Faculty for Health Studies Ljubljana; the single educational institution for midwives in Slovenia) perception of mobbing, since management of acceptable behavioural interrelationships in the midwifery profession forms already during the study, through professional socialization. A descriptive, causal non-experimental method with a questionnaire was used. Basic descriptive statistics and measures for calculating statistical significance were carried out with SPSS software, version 20.0. All necessary ethical measures were taken into consideration during the study to protect participants. The results revealed that several participants experienced mobbing during the study (82.3%); 58.8% of them during their practical training and 23.5% from midwifery teachers. Students are often anxious and nervous in the face of clinical settings (60.8%) or before faculty commitments (exams, presentations etc.) (41.2%). Many of them (40.4%) estimate that mobbing affected their health. They did not show effective strategies to solve relationship problems. According to the findings, everyone involved in midwifery education, but above all students, should be provided with more knowledge and skills on successful management of conflict situations. PMID:25507371

  11. Quantitative analysis of in situ hybridization methods for the detection of actin gene expression.

    PubMed Central

    Lawrence, J B; Singer, R H

    1985-01-01

    We have implemented an efficient, quantitative approach for the optimization of in situ hybridization using double-stranded recombinant DNA probes. The model system studied was actin mRNA expression in chicken embryonic muscle cultures. Actin and control (pBR322) probes were nick-translated with 32P-labeled nucleotides, hybridized to cells grown on coverslips, and quantitated in a scintillation counter. Cellular RNA retention was monitored via the incorporation of 3H-uridine into RNA prior to cell fixation. Over a thousand samples were analyzed, and among the technical variables examined were the fixation protocol, proteolytic cell pretreatment, the time course of hybridization, saturation kinetics, hybridization efficiency, and effect of probe size on hybridization and network formation. Results have allowed us to develop a reproducible in situ hybridization methodology which is simpler and less destructive to cellular RNA and morphology than other protocols. Moreover, this technique is highly sensitive and efficient in detection of cellular RNAs. Lastly, the rapid quantitative approach used for this analysis is valuable in itself as a potential alternative to filter or solution hybridizations. PMID:3889842

  12. The Quantitative Ideas and Methods in Assessment of Four Properties of Chinese Medicinal Herbs.

    PubMed

    Fu, Jialei; Pang, Jingxiang; Zhao, Xiaolei; Han, Jinxiang

    2015-04-01

    The purpose of this review is to summarize and reflect on the current status and problems of the research on the properties of Chinese medicinal herbs. Hot, warm, cold, and cool are the four properties/natures of Chinese medicinal herbs. They are defined based on the interaction between the herbs and the human body. How to quantitatively assess the therapeutic effect of Chinese medicinal herbs based on the theoretical system of Traditional Chinese medicine (TCM) remains a challenge. Previous studies on the topic from several perspectives have been presented. Results and problems were discussed. New ideas based on the technology of biophoton radiation detection are proposed. With the development of biophoton detection technology, detection and characterization of human biophoton emission has led to its potential applications in TCM. The possibility of using the biophoton analysis system to study the interaction of Chinese medicinal herbs with the human body and to quantitatively determine the effect of the Chinese medicinal herbs is entirely consistent with the holistic concept of TCM theory. The statistical entropy of electromagnetic radiations from the biological systems can characterize the four properties of Chinese medicinal herbs, and the spectrum can characterize their meridian tropism. Therefore, we hypothesize that by the use of a biophoton analysis system, the four properties and meridian tropism of Chinese medicinal herbs can be quantitatively expressed.

  13. Quantitative Assessment of the CCMC's Experimental Real-time SWMF-Geospace Results

    NASA Astrophysics Data System (ADS)

    Liemohn, Michael; Ganushkina, Natalia; De Zeeuw, Darren; Welling, Daniel; Toth, Gabor; Ilie, Raluca; Gombosi, Tamas; van der Holst, Bart; Kuznetsova, Maria; Maddox, Marlo; Rastaetter, Lutz

    2016-04-01

    Experimental real-time simulations of the Space Weather Modeling Framework (SWMF) are conducted at the Community Coordinated Modeling Center (CCMC), with results available there (http://ccmc.gsfc.nasa.gov/realtime.php), through the CCMC Integrated Space Weather Analysis (iSWA) site (http://iswa.ccmc.gsfc.nasa.gov/IswaSystemWebApp/), and the Michigan SWMF site (http://csem.engin.umich.edu/realtime). Presently, two configurations of the SWMF are running in real time at CCMC, both focusing on the geospace modules, using the BATS-R-US magnetohydrodynamic model, the Ridley Ionosphere Model, and with and without the Rice Convection Model for inner magnetospheric drift physics. While both have been running for several years, nearly continuous results are available since July 2015. Dst from the model output is compared against the Kyoto real-time Dst, in particular the daily minimum value of Dst to quantify the ability of the model to capture storms. Contingency tables are presented, showing that the run with the inner magnetosphere model is much better at reproducing storm-time values. For disturbances with a minimum Dst lower than -50 nT, this version yields a probability of event detection of 0.86 and a Heidke Skill Score of 0.60. In the other version of the SWMF, without the inner magnetospheric module included, the modeled Dst never dropped below -50 nT during the examined epoch.
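
    The probability of event detection and the Heidke Skill Score quoted above are standard 2x2 contingency-table statistics; the sketch below shows how they are computed. The event counts are hypothetical, since the abstract reports the scores but not the underlying table.

```python
def pod_and_hss(hits: int, false_alarms: int, misses: int, correct_negatives: int):
    """Probability of detection and Heidke Skill Score for a 2x2 contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    pod = a / (a + c)
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return pod, hss

# Hypothetical daily-minimum-Dst event counts (threshold Dst < -50 nT); the
# abstract reports POD = 0.86 and HSS = 0.60 but not the underlying table.
print(pod_and_hss(hits=30, false_alarms=12, misses=5, correct_negatives=150))
```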

  14. Malignant gliomas: current perspectives in diagnosis, treatment, and early response assessment using advanced quantitative imaging methods.

    PubMed

    Ahmed, Rafay; Oborski, Matthew J; Hwang, Misun; Lieberman, Frank S; Mountz, James M

    2014-01-01

    Malignant gliomas consist of glioblastomas, anaplastic astrocytomas, anaplastic oligodendrogliomas and anaplastic oligoastrocytomas, and some less common tumors such as anaplastic ependymomas and anaplastic gangliogliomas. Malignant gliomas have high morbidity and mortality. Even with optimal treatment, median survival is only 12-15 months for glioblastomas and 2-5 years for anaplastic gliomas. However, recent advances in imaging and quantitative analysis of image data have led to earlier diagnosis of tumors and tumor response to therapy, providing oncologists with a greater time window for therapy management. In addition, improved understanding of tumor biology, genetics, and resistance mechanisms has enhanced surgical techniques, chemotherapy methods, and radiotherapy administration. After proper diagnosis and institution of appropriate therapy, there is now a vital need for quantitative methods that can sensitively detect malignant glioma response to therapy at early follow-up times, when changes in management of nonresponders can have their greatest effect. Currently, response is largely evaluated by measuring magnetic resonance contrast and size change, but this approach does not take into account the key biologic steps that precede tumor size reduction. Molecular imaging is ideally suited to measuring early response by quantifying cellular metabolism, proliferation, and apoptosis, activities altered early in treatment. We expect that successful integration of quantitative imaging biomarker assessment into the early phase of clinical trials could provide a novel approach for testing new therapies, and importantly, for facilitating patient management, sparing patients from weeks or months of toxicity and ineffective treatment. This review will present an overview of epidemiology, molecular pathogenesis and current advances in diagnoses, and management of malignant gliomas.

  15. A fast semi-quantitative method for Plutonium determination in an alpine firn/ice core

    NASA Astrophysics Data System (ADS)

    Gabrieli, J.; Cozzi, G.; Vallelonga, P.; Schwikowski, M.; Sigl, M.; Boutron, C.; Barbante, C.

    2009-04-01

    deposition decreased very sharply reaching a minimum in 1967. The third period (1967-1975) is characterized by irregular Pu profiles with smaller peaks (about 20-30% compared to the 1964 peak) which could be due to French and Chinese tests. Comparison with the Pu profiles obtained from the Col du Dome and Belukha ice cores by AMS (Accelerator Mass Spectrometry) shows very good agreement. Considering the semi-quantitative method and the analytical uncertainty, the results are also quantitatively comparable. However, the Pu concentrations at Colle Gnifetti are normally 2-3 times greater than in Col du Dome. This could be explained by different air mass transport or, more likely, different accumulation rates at each site.

  16. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development.

  17. Design and Performance Considerations for the Quantitative Measurement of HEU Residues Resulting from 99Mo Production

    SciTech Connect

    McElroy, Robert Dennis; Chapman, Jeffrey Allen; Bogard, James S; Belian, Anthony P

    2011-01-01

    Molybdenum-99 is produced by the irradiation of high-enriched uranium (HEU) resulting in the accumulation of large quantities of HEU residues. In general, these residues are not recycled but are either disposed of or stored in containers with surface exposure rates as high as 100 R/h. The 235U content of these waste containers must be quantified for both accountability and waste disposal purposes. The challenges of quantifying such difficult-to-assay materials are discussed, along with performance estimates for each of several potential assay options. In particular, the design and performance of a High Activity Active Well Coincidence Counting (HA-AWCC) system designed and built specifically for these irradiated HEU waste materials are presented.

  18. Quantitative analysis of toxic and essential elements in human hair. Clinical validity of results.

    PubMed

    Kosanovic, Melita; Jokanovic, Milan

    2011-03-01

    Over the last three decades, there has been an increasing awareness of environmental and occupational exposures to toxic or potentially toxic trace elements. The evolution of biological monitoring includes knowledge of kinetics of toxic and/or essential elements and adverse health effects related to their exposure. The debate over whether hair is a valid sample for biomonitoring is still attracting the attention of analysts, health care professionals, and environmentalists. Although researchers have found many correlations of essential elements to diseases, metabolic disorders, environmental exposures, and nutritional status, opponents of the concept of hair analysis object that hair samples are unreliable due to the influence of external factors. This review discusses the validity of hair as a sample for biomonitoring of essential and toxic elements, with emphasis on pre-analytical, analytical, and post-analytical factors influencing results.

  19. Perspectives of Speech-Language Pathologists on the Use of Telepractice in Schools: Quantitative Survey Results

    PubMed Central

    Tucker, Janice K.

    2012-01-01

    This research surveyed 170 school-based speech-language pathologists (SLPs) in one northeastern state, with only 1.8% reporting telepractice use in school-settings. These results were consistent with two ASHA surveys (2002; 2011) that reported limited use of telepractice for school-based speech-language pathology. In the present study, willingness to use telepractice was inversely related to age, perhaps because younger members of the profession are more accustomed to using technology. Overall, respondents were concerned about the validity of assessments administered via telepractice; whether clinicians can adequately establish rapport with clients via telepractice; and if therapy conducted via telepractice can be as effective as in-person speech-language therapy. Most respondents indicated the need to establish procedures and guidelines for school-based telepractice programs. PMID:25945204

  20. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    NASA Technical Reports Server (NTRS)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.
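
    A minimal PLS sketch in the spirit of the comparison above, fitting a multi-element (PLS2) model to synthetic spectra with scikit-learn. The synthetic data, component count, and score metric are placeholders, not the authors' calibration models or LIBS spectra.

```python
# Minimal PLS2 sketch: several oxide concentrations predicted at once from
# synthetic "spectra". Not the authors' calibration; all data are invented.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_samples, n_channels, n_oxides = 200, 500, 3
concentrations = rng.uniform(0, 50, size=(n_samples, n_oxides))        # wt.%
loadings = rng.normal(0, 1, size=(n_oxides, n_channels))               # fake emission lines
spectra = concentrations @ loadings + rng.normal(0, 0.5, size=(n_samples, n_channels))

X_train, X_test, y_train, y_test = train_test_split(spectra, concentrations, random_state=0)
pls = PLSRegression(n_components=10).fit(X_train, y_train)
print("R^2 on held-out samples:", round(pls.score(X_test, y_test), 3))
```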

  1. Setting health research priorities using the CHNRI method: VI. Quantitative properties of human collective opinion

    PubMed Central

    Yoshida, Sachiyo; Rudan, Igor; Cousens, Simon

    2016-01-01

    Introduction Crowdsourcing has become an increasingly important tool to address many problems – from government elections in democracies, stock market prices, to modern online tools such as TripAdvisor or Internet Movie Database (IMDB). The CHNRI method (the acronym for the Child Health and Nutrition Research Initiative) for setting health research priorities has crowdsourcing as the major component, which it uses to generate, assess and prioritize between many competing health research ideas. Methods We conducted a series of analyses using data from a group of 91 scorers to explore the quantitative properties of their collective opinion. We were interested in the stability of their collective opinion as the sample size increases from 15 to 90. From a pool of 91 scorers who took part in a previous CHNRI exercise, we used sampling with replacement to generate multiple random samples of different size. First, for each sample generated, we identified the top 20 ranked research ideas, among 205 that were proposed and scored, and calculated the concordance with the ranking generated by the 91 original scorers. Second, we used rank correlation coefficients to compare the ranks assigned to all 205 proposed research ideas when samples of different size are used. We also analysed the original pool of 91 scorers to look for evidence of scoring variations based on scorers' characteristics. Results The sample sizes investigated ranged from 15 to 90. The concordance for the top 20 scored research ideas increased with sample sizes up to about 55 experts. At this point, the median level of concordance stabilized at 15/20 top ranked questions (75%), with the interquartile range also generally stable (14–16). There was little further increase in overlap when the sample size increased from 55 to 90. When analysing the ranking of all 205 ideas, the rank correlation coefficient increased as the sample size increased, with a median correlation of 0.95 reached at the sample size
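
    The resampling idea described in the Methods can be illustrated in a few lines: draw scorers with replacement, re-rank the ideas from the subsample means, and measure the overlap of the top 20 with the full-panel ranking. The score matrix below is random and purely hypothetical, so it will not reproduce the reported concordance values.

```python
import numpy as np

rng = np.random.default_rng(3)
n_scorers, n_ideas = 91, 205
scores = rng.random((n_scorers, n_ideas))            # hypothetical per-scorer scores

full_top20 = set(np.argsort(-scores.mean(axis=0))[:20])

def top20_concordance(sample_size: int, n_draws: int = 200) -> float:
    overlaps = []
    for _ in range(n_draws):
        pick = rng.integers(0, n_scorers, sample_size)   # sample scorers with replacement
        top = set(np.argsort(-scores[pick].mean(axis=0))[:20])
        overlaps.append(len(top & full_top20))
    return float(np.median(overlaps))

for m in (15, 35, 55, 90):
    print(f"sample of {m:2d} scorers -> median overlap {top20_concordance(m):.0f}/20")
```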

  2. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of the several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature identification related diagnosis. Therefore, a quantitative and simple distortion evaluation method is imperative for both the endoscopic industry and medical device regulatory agencies. However, no such method is available yet. While the image correction techniques are rather mature, they heavily depend on computational power to process multidimensional image data based on complex mathematical models, which makes them difficult to understand. Some commonly used distortion evaluation methods, such as the picture height distortion (DPH) or radial distortion (DRAD), are either too simple to accurately describe the distortion or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion. Based on the method, we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning in the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, the method can facilitate endoscopic technology to market and potentially be adopted in an international endoscope standard.

  3. Quantitative Results from Shockless Compression Experiments on Solids to Multi-Megabar Pressure

    NASA Astrophysics Data System (ADS)

    Davis, Jean-Paul; Brown, Justin; Knudson, Marcus; Lemke, Raymond

    2015-03-01

    Quasi-isentropic, shockless ramp-wave experiments promise accurate equation-of-state (EOS) data in the solid phase at relatively low temperatures and multi-megabar pressures. In this range of pressure, isothermal diamond-anvil techniques have limited pressure accuracy due to reliance on theoretical EOS of calibration standards, thus accurate quasi-isentropic compression data would help immensely in constraining EOS models. Multi-megabar shockless compression experiments using the Z Machine at Sandia as a magnetic drive with stripline targets continue to be performed on a number of solids. New developments will be presented in the design and analysis of these experiments, including topics such as 2-D and magneto-hydrodynamic (MHD) effects and the use of LiF windows. Results will be presented for tantalum and/or gold metals, with comparisons to independently developed EOS. * Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  4. A method for the simultaneous identification and quantitation of five superwarfarin rodenticides in human serum.

    PubMed

    Kuijpers, E A; den Hartigh, J; Savelkoul, T J; de Wolff, F A

    1995-01-01

    A high-performance liquid chromatographic method with ultraviolet (UV) and fluorescence detection was developed for the analysis of one indandione and four hydroxycoumarin anticoagulant rodenticides in human serum. The superwarfarin rodenticides, chlorophacinone, bromadiolone, difenacoum, brodifacoum, and difethialone, can be identified and quantitated simultaneously with this method. After adding a buffer (pH 5.5), the anticoagulants were extracted from serum with chloroform-acetone. The organic phase was separated and evaporated to dryness, and the residue was subjected to chromatographic analysis. The anticoagulants were separated by reversed-phase chromatography and detected by UV absorption at 285 nm and by fluorescence at an excitation wavelength of 265 nm and an emission wavelength of 400 nm. Extraction efficiencies from 55 to 131% were obtained. The within-run precision ranged from 2.0 to 7.1% for UV detection and from 0.0 to 4.8% for fluorescence detection. Between-run precision ranged from 1.3 to 16.0% for UV detection and from 1.8 to 9.0% for fluorescence detection. The anticoagulants can be quantitated at serum concentrations down to 3-12 ng/mL for fluorescence detection and down to 20-75 ng/mL for UV detection. No interferences were observed with the related compounds warfarin and vitamin K1. PMID:8577178

  5. Quantitative assessment of apatite formation via a biomimetic method using quartz crystal microbalance.

    PubMed

    Tanahashi, M; Kokubo, T; Matsuda, T

    1996-06-01

    Quantitative assessment of hydroxyapatite formation on a gold surface via the biomimetic method, composed of a nucleation step in a simulated body fluid (SBF) containing glass powders and a subsequent apatite growth step in glass powder-free SBF, was made using a quartz crystal microbalance (QCM) technique. The frequency change of the QCM linearly increased with increasing soaking time, and largely depended on the nucleation period. The growth rates, defined as daily increase in thickness, increased monotonically with an increasing nucleation period of up to 96 h, thereafter being constant at 2.0 microns/day. The growth rate of the apatite layer increased with increasing temperature of the SBF: 0.9, 2.0, and 3.8 microns/day at 25, 37, and 50 degrees C, respectively. The Arrhenius-type activation energy for the growth of apatite was 47.3 kJ/mol. The QCM method was found to be a very powerful tool for quantitative, in situ measurement of precipitation and growth of apatite in real time.
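
    As a quick consistency check on the reported activation energy, a linear Arrhenius fit of ln(rate) against 1/T for the three growth rates quoted above (0.9, 2.0 and 3.8 microns/day at 25, 37 and 50 degrees C) gives a value close to the stated 47.3 kJ/mol; the small difference reflects rounding of the reported rates.

```python
import numpy as np

# Arrhenius fit to the three growth rates reported above.
R = 8.314                                     # J mol^-1 K^-1
temps_K = np.array([25.0, 37.0, 50.0]) + 273.15
rates = np.array([0.9, 2.0, 3.8])             # microns/day

slope, _ = np.polyfit(1.0 / temps_K, np.log(rates), 1)
print(f"apparent activation energy = {-slope * R / 1000:.1f} kJ/mol")
```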

  6. Excised porcine skin experimental systems to validate quantitative microdialysis methods for determination of drugs in skin after topical application.

    PubMed

    Seki, Toshinobu; Wang, Aiping; Yuan, Dan; Saso, Yuko; Hosoya, Osamu; Chono, Sumio; Morimoto, Kazuhiro

    2004-11-24

    Microdialysis is useful as a method to evaluate the disposition of drugs in the skin to design improved transdermal delivery systems (TDDSs). In this study, quantitative microdialysis methods were validated in excised porcine skin experimental systems in vitro. Flurbiprofen (FP), used as a model drug, showed high affinity for the skin tissues in equilibrium states between the medium and skin. The membrane clearances of FP for permeation through the membrane of a dialysis fiber placed in the skin (CL(m in S)) were lower than that in the medium. The adsorption of components in the skin to the membrane surface of the dialysis fiber and accumulation of FP near the dialysis fiber are the most likely reasons for this. When CL(m in S) was used to predict the extracellular FP concentration in skin (C(T)), the value obtained was lower than that expected from the FP concentration in the medium on the dermis side, which should be equal to C(T) at equilibrium. In the zero net flux (ZNF) method, in which the concentration difference of perfusate (DeltaC) between the inflow and outflow were used to obtain C(T), the predicted C(T) was similar to the expected value. In an in vitro skin permeation experiment, the ZNF method was used for the prediction of C(T) near the dialysis fiber. The predicted C(T) was over 10 times higher than the FP concentration in the medium on the dermis side, suggesting a concentration gradient in the dermis. Although the ZNF method is good for predicting the C(T) in skin, the mass balance has to be considered for the quantitative evaluation of the skin permeation of drugs. In this study, the effect of the mass transfer of FP from the perfusate to the skin on the cumulative amount of FP passing through the skin was relatively low because of the use of suitable solutions as perfusate. The perfusion conditions and schedules should be designed carefully for quantitative evaluations using the ZNF method. These results provide useful information for the in
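
    A minimal sketch of the zero net flux (ZNF) calculation referred to above: the probe is perfused at several known concentrations, the gain or loss across the membrane is regressed against the inflow concentration, and the x-intercept (where net flux is zero) estimates the extracellular concentration C(T). The concentrations below are hypothetical, not the flurbiprofen data from the study.

```python
import numpy as np
from scipy.stats import linregress

c_in = np.array([0.0, 0.5, 1.0, 2.0, 4.0])        # ug/mL in the perfusate (hypothetical)
c_out = np.array([0.42, 0.78, 1.12, 1.75, 3.10])  # ug/mL in the dialysate (hypothetical)

fit = linregress(c_in, c_out - c_in)              # net gain vs inflow concentration
c_T = -fit.intercept / fit.slope                  # x-intercept of the regression
print(f"estimated C(T) = {c_T:.2f} ug/mL, in vivo recovery = {-fit.slope:.2f}")
```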

  7. Age-related changes in rat cerebellar basket cells: a quantitative study using unbiased stereological methods

    PubMed Central

    HENRIQUE, RUI M. F.; ROCHA, EDUARDO; REIS, ALCINDA; MARCOS, RICARDO; OLIVEIRA, MARIA H.; SILVA, MARIA W.; MONTEIRO, ROGÉRIO A. F.

    2001-01-01

    Cortical cerebellar basket cells are stable postmitotic cells; hence, they are liable to undergo age-related changes. Since the cerebellum is vital for postural control, equilibrium and motor coordination, we aimed to determine the quantitative morphological changes in those interneurons with the ageing process, using unbiased techniques. Material from the cerebellar cortex (Crus I and Crus II) was collected from female rats aged 2, 6, 9, 12, 15, 18, 21 and 24 mo (5 animals per age group), fixed by intracardiac perfusion, and processed for transmission electron microscopy, using conventional techniques. Serial semithin sections were obtained (5 blocks from each rat), enabling the determination of the number-weighted mean nuclear volume (by the nucleator method). On ultrathin sections, 25 cell profiles from each animal were photographed. The volume density of the nucleus, ground substance, mitochondria, Golgi apparatus (Golgi) and dense bodies (DB), and the mean surface density of the rough endoplasmic reticulum (RER) were determined, by point counting, using a morphometric grid. The mean total volumes of the soma and organelles and the mean total surface area of the RER [s̄N (RER)] were then calculated. The results were analysed with 1-way ANOVA; post hoc pairwise comparisons of group means were performed using the Newman-Keuls test. The relation between age and each of the parameters was studied by regression analysis. Significant age-related changes were observed for the mean volumes of the soma, ground substance, Golgi, DB, and s̄N (RER). Positive linear trends were found for the mean volumes of the ground substance, Golgi, and DB; a negative linear trend was found for the s̄N (RER). These results indicate that rat cerebellar basket cells undergo important age-related changes. The significant decrease in the s̄N (RER) may be responsible for a reduction in the rate of protein synthesis. Additionally, it may be implicated in a cascade of events

  8. Quantitative analysis of gene expression in fixed colorectal carcinoma samples as a method for biomarker validation.

    PubMed

    Ostasiewicz, Beata; Ostasiewicz, Paweł; Duś-Szachniewicz, Kamila; Ostasiewicz, Katarzyna; Ziółkowski, Piotr

    2016-06-01

    Biomarkers have been described as the future of oncology. Modern proteomics provide an invaluable tool for near-whole proteome screening for proteins expressed differently in neoplastic vs. healthy tissues. However, in order to select the most promising biomarkers, an independent method of validation is required. The aim of the current study was to propose a methodology for the validation of biomarkers. Due to material availability, the majority of large-scale biomarker studies are performed using formalin-fixed paraffin-embedded (FFPE) tissues; therefore, these were selected for use in the current study. A total of 10 genes were selected from those previously described as the most promising candidate biomarkers, and their expression levels were analyzed with reverse transcription-quantitative polymerase chain reaction (RT-qPCR) using calibrator-normalized relative quantification with efficiency correction. For 6/10 analyzed genes, the results were consistent with the proteomic data; for the remaining four genes, the results were inconclusive. The upregulation of karyopherin α 2 (KPNA2) and chromosome segregation 1-like (CSE1L) in colorectal carcinoma, in addition to downregulation of chloride channel accessory 1 (CLCA1), fatty acid binding protein 1 (FABP1), sodium channel, voltage gated, type VII α subunit (SCN7A) and solute carrier family 26 (anion exchanger), member 3 (SLC26A3), was confirmed. With the combined use of proteomic and genetic tools, it was reported, for the first time to the best of our knowledge, that SCN7A was downregulated in colorectal carcinoma at the mRNA and protein levels. It had been previously suggested that the remaining five genes served an important role in colorectal carcinogenesis; however, the current study provided strong evidence to support their use as biomarkers. Thus, it was concluded that the combination of RT-qPCR with proteomics offers a powerful methodology for biomarker identification, which
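
    Calibrator-normalized relative quantification with efficiency correction is typically computed as a Pfaffl-type ratio of efficiency-corrected fold changes of the target and reference genes, each taken relative to the calibrator sample. The sketch below shows that arithmetic with hypothetical quantification-cycle (Cq) values and efficiencies; it illustrates the general formula rather than the authors' exact analysis pipeline.

    # Efficiency-corrected relative quantification (Pfaffl-type ratio):
    # ratio = E_target**(Cq_target_calibrator - Cq_target_sample)
    #       / E_ref**(Cq_ref_calibrator - Cq_ref_sample)
    # E = 2.0 corresponds to 100% amplification efficiency.

    def relative_expression(e_target, cq_target_cal, cq_target_sample,
                            e_ref, cq_ref_cal, cq_ref_sample):
        return (e_target ** (cq_target_cal - cq_target_sample)
                / e_ref ** (cq_ref_cal - cq_ref_sample))

    # Hypothetical values: the target gene amplifies ~3 cycles earlier in tumour
    # tissue than in the calibrator, the reference gene is essentially unchanged.
    ratio = relative_expression(1.95, 27.0, 24.0, 2.0, 22.0, 21.8)
    print(f"Fold change vs calibrator: {ratio:.2f}")  # ~6.5-fold upregulation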

  9. Simple saponification method for the quantitative determination of carotenoids in green vegetables.

    PubMed

    Larsen, Erik; Christensen, Lars P

    2005-08-24

    A simple, reliable, and gentle saponification method for the quantitative determination of carotenoids in green vegetables was developed. The method involves an extraction procedure with acetone and the selective removal of the chlorophylls and esterified fatty acids from the organic phase using a strongly basic resin (Ambersep 900 OH). Extracts from common green vegetables (beans, broccoli, green bell pepper, chive, lettuce, parsley, peas, and spinach) were analyzed by high-performance liquid chromatography (HPLC) for their content of major carotenoids before and after action of Ambersep 900 OH. The mean recovery percentages for most carotenoids [(all-E)-violaxanthin, (all-E)-lutein epoxide, (all-E)-lutein, neolutein A, and (all-E)-beta-carotene] after saponification of the vegetable extracts with Ambersep 900 OH were close to 100% (99-104%), while the mean recovery percentages of (9'Z)-neoxanthin increased to 119% and that of (all-E)-neoxanthin and neolutein B decreased to 90% and 72%, respectively.

  10. A Rapid and Quantitative Flow Cytometry Method for the Analysis of Membrane Disruptive Antimicrobial Activity

    PubMed Central

    O’Brien-Simpson, Neil M.; Pantarat, Namfon; Attard, Troy J.; Walsh, Katrina A.; Reynolds, Eric C.

    2016-01-01

    We describe a microbial flow cytometry method that quantifies within 3 hours antimicrobial peptide (AMP) activity, termed Minimum Membrane Disruptive Concentration (MDC). Increasing peptide concentration positively correlates with the extent of bacterial membrane disruption and the calculated MDC is equivalent to its MBC. The activity of AMPs representing three different membranolytic modes of action could be determined for a range of Gram positive and negative bacteria, including the ESKAPE pathogens, E. coli and MRSA. By using the MDC50 concentration of the parent AMP, the method provides high-throughput, quantitative screening of AMP analogues. A unique feature of the MDC assay is that it directly measures peptide/bacteria interactions and lysed cell numbers rather than bacteria survival as with MIC and MBC assays. With the threat of multi-drug resistant bacteria, this high-throughput MDC assay has the potential to aid in the development of novel antimicrobials that target bacteria with improved efficacy. PMID:26986223

  11. [Stich-plaque test--an economic method for quantitative determination of viruses].

    PubMed

    Waschke, K; Stäber, H

    1976-01-01

    An economic method for the quantitative assay of viruses is presented. In this "cannula stick-plaque test" (German abbreviation SPT), samples of viruses, geometrically diluted and taken up by a cannula, are inoculated by sticking into monolayer cell cultures overlaid with agar medium. A plaque-like CPE detectable by neutral red staining develops in the area of the inoculation. The frequency of this CPE formation depends on the concentration of viruses in the inoculated dilution. This dose-response relationship allows calculation of the ID50. In this way it is possible to carry out a titration involving 6 dilutions and 10 inoculations per dilution using only 3 common Petri dishes (6 cm in diameter). The sensitivity, accuracy, and reproducibility of this method are described and discussed.
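
    The abstract does not say which ID50 estimator was used; the Spearman-Karber method is one common choice for endpoint-dilution designs of exactly this shape (a fixed number of inoculations per dilution). The sketch below applies it to invented SPT counts, so both the data and the choice of estimator are assumptions for illustration only.

    import numpy as np

    # Hypothetical SPT readout: a ten-fold dilution series, 10 stick inoculations per
    # dilution, and the number of inoculation sites showing a plaque-like CPE.
    log10_dilution = np.array([-1.0, -2.0, -3.0, -4.0, -5.0, -6.0])
    positives = np.array([10, 10, 8, 4, 1, 0])
    n_per_dilution = 10

    # Order by increasing virus dose (most dilute first) so the response runs 0 -> 1,
    # then apply the Spearman-Karber estimator of the 50% endpoint.
    log_dose = log10_dilution[::-1]
    p = (positives / n_per_dilution)[::-1]
    log_id50 = np.sum((log_dose[:-1] + log_dose[1:]) / 2.0 * np.diff(p))
    print(f"ID50 endpoint at a 10**{log_id50:.2f} dilution "
          f"(titre ~ 10**{-log_id50:.2f} ID50 per inoculum)")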

  12. [Stick-plaque test--an economic method of quantitative determination of viruses].

    PubMed

    Waschke, K; Stäber, H

    1976-01-01

    An economic method for the quantitative assay of viruses is presented. In this "cannula stick-plaque test" (German abbreviation SPT), samples of viruses, geometrically diluted and taken up by a cannula, are inoculated by sticking into monolayer cell cultures overlaid with agar medium. A plaque-like CPE detectable by neutral red staining develops in the area of the inoculation. The frequency of this CPE formation depends on the concentration of viruses in the inoculated dilution. This dose-response relationship allows calculation of the ID50. In this way it is possible to carry out a titration involving 6 dilutions and 10 inoculations per dilution using only 3 common Petri dishes (6 cm in diameter). The sensitivity, accuracy, and reproducibility of this method are described and discussed.

  13. A Rapid and Quantitative Flow Cytometry Method for the Analysis of Membrane Disruptive Antimicrobial Activity.

    PubMed

    O'Brien-Simpson, Neil M; Pantarat, Namfon; Attard, Troy J; Walsh, Katrina A; Reynolds, Eric C

    2016-01-01

    We describe a microbial flow cytometry method that quantifies within 3 hours antimicrobial peptide (AMP) activity, termed Minimum Membrane Disruptive Concentration (MDC). Increasing peptide concentration positively correlates with the extent of bacterial membrane disruption and the calculated MDC is equivalent to its MBC. The activity of AMPs representing three different membranolytic modes of action could be determined for a range of Gram positive and negative bacteria, including the ESKAPE pathogens, E. coli and MRSA. By using the MDC50 concentration of the parent AMP, the method provides high-throughput, quantitative screening of AMP analogues. A unique feature of the MDC assay is that it directly measures peptide/bacteria interactions and lysed cell numbers rather than bacteria survival as with MIC and MBC assays. With the threat of multi-drug resistant bacteria, this high-throughput MDC assay has the potential to aid in the development of novel antimicrobials that target bacteria with improved efficacy. PMID:26986223

  14. A quantitative method for measurement of HL-60 cell apoptosis based on diffraction imaging flow cytometry technique

    PubMed Central

    Yang, Xu; Feng, Yuanming; Liu, Yahui; Zhang, Ning; Lin, Wang; Sa, Yu; Hu, Xin-Hua

    2014-01-01

    A quantitative method for measurement of apoptosis in HL-60 cells based on the polarization diffraction imaging flow cytometry technique is presented in this paper. Through a comparative study with existing methods and the analysis of diffraction images by a gray level co-occurrence matrix (GLCM) algorithm, we found that 4 GLCM parameters, contrast (CON), cluster shade (CLS), correlation (COR) and dissimilarity (DIS), exhibit high sensitivity to the apoptotic rate. It was further demonstrated that the CLS parameter correlates significantly (R2 = 0.899) with the degree of nuclear fragmentation, and the other three parameters showed very good correlations (R2 ranging from 0.69 to 0.90). These results demonstrate that the new method has the capability for rapid and accurate extraction of morphological features to quantify cellular apoptosis without the need for cell staining. PMID:25071957
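
    The four texture parameters named above are standard grey level co-occurrence matrix statistics. The sketch below computes them for a single image, using a recent scikit-image for contrast, correlation and dissimilarity and a hand-written sum for cluster shade (which scikit-image does not provide); the distance, angle and grey-level settings are assumptions, and the random image merely stands in for a diffraction image.

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_features(image_8bit, distance=1, angle=0.0, levels=256):
        """Contrast, correlation, dissimilarity (via scikit-image) and cluster shade
        (computed by hand) for one grey-scale image."""
        glcm = graycomatrix(image_8bit, distances=[distance], angles=[angle],
                            levels=levels, symmetric=True, normed=True)
        p = glcm[:, :, 0, 0]                      # normalized co-occurrence matrix
        con = graycoprops(glcm, "contrast")[0, 0]
        cor = graycoprops(glcm, "correlation")[0, 0]
        dis = graycoprops(glcm, "dissimilarity")[0, 0]

        i, j = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
        mu_i = np.sum(i * p)
        mu_j = np.sum(j * p)
        cls = np.sum(((i + j - mu_i - mu_j) ** 3) * p)   # cluster shade
        return {"CON": con, "CLS": cls, "COR": cor, "DIS": dis}

    # Example with a synthetic 8-bit image standing in for a diffraction image.
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
    print(glcm_features(img))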

  15. Characterization of thermal desorption instrumentation with a direct liquid deposition calibration method for trace 2,4,6-trinitrotoluene quantitation.

    PubMed

    Field, Christopher R; Giordano, Braden C; Rogers, Duane A; Lubrano, Adam L; Rose-Pehrsson, Susan L

    2012-03-01

    The use of thermal desorption systems for the analysis of trace vapors typically requires establishing a calibration curve from vapors generated with a permeation tube. The slow equilibration time of permeation tubes causes such an approach to become laborious when covering a wide dynamic range. Furthermore, many analytes of interest, such as explosives, are not available as permeation tubes. A method for easily and effectively establishing calibration curves for explosive vapor samples via direct deposition of standard solutions on thermal desorption tubes was investigated. The various components of the thermal desorption system were compared to a standard split/splitless inlet. Calibration curves using the direct liquid deposition method with a thermal desorption unit coupled to a cryo-focusing inlet were compared to a standard split/splitless inlet, and a statistical difference was observed but does not eliminate or deter the use of the direct liquid deposition method for obtaining quantitative results for explosive vapors.

  16. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Pavlov, K.A.

    1997-01-01

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing. © 1997 American Institute of Physics.

  17. A computational quantitative structure-activity relationship study of carbamate anticonvulsants using quantum pharmacological methods.

    PubMed

    Knight, J L; Weaver, D F

    1998-10-01

    A pattern recognition quantitative structure-activity relationship (QSAR) study has been performed to determine the molecular features of carbamate anticonvulsants which influence biological activity. Although carbamates, such as felbamate, have been used to treat epilepsy, their mechanisms of efficacy and toxicity are not completely understood. Quantum and classical mechanics calculations have been exploited to describe 46 carbamate drugs. Employing a principal component analysis and multiple linear regression calculations, five crucial structural descriptors were identified which directly relate to the bioactivity of the carbamate family. With the resulting mathematical model, the biological activity of carbamate analogues can be predicted with 85-90% accuracy.
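
    The analysis chain described above, principal component analysis of the descriptor matrix followed by multiple linear regression of activity on the leading components, can be sketched in a few lines; the descriptor matrix and activities below are random stand-ins, not the 46 carbamates or their quantum-chemical descriptors.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Hypothetical stand-in data: 46 compounds x 12 quantum/classical descriptors
    # and a measured anticonvulsant activity for each compound.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(46, 12))
    y = X[:, 0] * 1.5 - X[:, 3] * 0.8 + rng.normal(scale=0.3, size=46)

    # Principal component analysis to compress the descriptors, then multiple
    # linear regression of activity on the leading components.
    model = make_pipeline(StandardScaler(), PCA(n_components=5), LinearRegression())
    model.fit(X, y)
    print(f"R^2 on the training compounds: {model.score(X, y):.2f}")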

  18. Development of a rapid method for the quantitative determination of deoxynivalenol using Quenchbody.

    PubMed

    Yoshinari, Tomoya; Ohashi, Hiroyuki; Abe, Ryoji; Kaigome, Rena; Ohkawa, Hideo; Sugita-Konishi, Yoshiko

    2015-08-12

    Quenchbody (Q-body) is a novel fluorescent biosensor based on the antigen-dependent removal of a quenching effect on a fluorophore attached to antibody domains. In order to develop a method using Q-body for the quantitative determination of deoxynivalenol (DON), a trichothecene mycotoxin produced by some Fusarium species, anti-DON Q-body was synthesized from the sequence information of a monoclonal antibody specific to DON. When the purified anti-DON Q-body was mixed with DON, a dose-dependent increase in the fluorescence intensity was observed and the detection range was between 0.0003 and 3 mg L(-1). The coefficients of variation were 7.9% at 0.003 mg L(-1), 5.0% at 0.03 mg L(-1) and 13.7% at 0.3 mg L(-1), respectively. The limit of detection was 0.006 mg L(-1) for DON in wheat. The Q-body showed an antigen-dependent fluorescence enhancement even in the presence of wheat extracts. To validate the analytical method using Q-body, a spike-and-recovery experiment was performed using four spiked wheat samples. The recoveries were in the range of 94.9-100.2%. The concentrations of DON in twenty-one naturally contaminated wheat samples were quantitated by the Q-body method, LC-MS/MS and an immunochromatographic assay kit. The LC-MS/MS analysis showed that the levels of DON contamination in the samples were between 0.001 and 2.68 mg kg(-1). The concentrations of DON quantitated by LC-MS/MS were more strongly correlated with those using the Q-body method (R(2) = 0.9760) than the immunochromatographic assay kit (R(2) = 0.8824). These data indicate that the Q-body system for the determination of DON in wheat samples was successfully developed and Q-body is expected to have a range of applications in the field of food safety.

  19. Assessment of Riboflavin as a Tracer Substance: Comparison of a Qualitative to a Quantitative Method of Riboflavin Measurement

    PubMed Central

    Herron, Abigail J.; Mariani, John J.; Pavlicova, Martina; Parinello, Christina M.; Bold, Krysten W.; Levin, Frances R.; Nunes, Edward V.; Sullivan, Maria A.; Raby, Wilfred N.; Bisaga, Adam

    2013-01-01

    Background Noncompliance with medications may have major impacts on outcomes measured in research, potentially distorting the validity of controlled clinical trials. Riboflavin is frequently used in trials as a marker of adherence. It can be combined with study medication and is excreted in urine where it fluoresces under UV light. This study compares qualitative visual inspection of fluorescence to quantitative fluorometric analysis of riboflavin concentration in its ability to detect the presence of riboflavin in urine. Methods Twenty-four volunteers received 0 mg, 25 mg, and 50 mg doses of riboflavin under single-blind conditions, with 20 also receiving a 100 mg dose. Five serial urine samples were collected over the following 36 hours. Quantitative measurement of riboflavin by fluorometric analysis and qualitative assessment of each sample using visual inspection were performed. Results The overall false positive rate for qualitative assessment was 53%. For quantitative assessment, a riboflavin concentration of 900 ng/mL was established to classify positive samples. More than 80% of samples were positive 2 to 24 hours following ingestion of 25 mg and 50 mg, and less than 80% were positive at 36 hours. At least 95% of observations for the 100 mg dose were above 900 ng/mL at all timepoints. Conclusions Quantitative fluorometric assessment is superior to qualitative visual inspection alone in determining medication adherence. The combination of 25–50 mg of daily riboflavin and a cut-off level of 900 ng/mL allows for the acceptable sensitivity of missing detection of non-compliant participants while preserving a high level of power to detect all cases of medication compliance. PMID:22921475

  20. Development and validation of a LC-MS method for quantitation of ergot alkaloids in lateral saphenous vein tissue

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A liquid chromatography-mass spectrometry (LC/MS) method for simultaneous quantitation of seven ergot alkaloids (lysergic acid, ergonovine, ergovaline, ergocornine, ergotamine, ergocryptine and ergocrystine) in vascular tissue was developed and validated. Reverse-phase chromatography, coupled to an...

  1. Initial Results of an MDO Method Evaluation Study

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley MDO method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. In the first phase of the study, three MDO methods were implemented in the SIGHT framework and used to solve a set of ten relatively simple problems. In this paper, we comment on the general considerations for conducting method evaluation studies and report some initial results obtained to date. In particular, although the results are not conclusive because of the small initial test set, other formulations, optimality conditions, and sensitivity of solutions to various perturbations. Optimization algorithms are used to solve a particular MDO formulation. It is then appropriate to speak of local convergence rates and of global convergence properties of an optimization algorithm applied to a specific formulation. An analogous distinction exists in the field of partial differential equations. On the one hand, equations are analyzed in terms of regularity, well-posedness, and the existence and uniqueness of solutions. On the other, one considers numerous algorithms for solving differential equations. The area of MDO methods studies MDO formulations combined with optimization algorithms, although at times the distinction is blurred. It is important to

  2. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.

  3. A Dilute-and-Shoot LC-MS Method for Quantitating Opioids in Oral Fluid.

    PubMed

    Enders, Jeffrey R; McIntire, Gregory L

    2015-10-01

    Opioid testing represents a dominant share of the market in pain management clinical testing facilities. Testing of this drug class in oral fluid (OF) has begun to rise in popularity. OF analysis has traditionally required extensive clean-up protocols and sample concentration, which can be avoided. This work highlights the use of a fast, 'dilute-and-shoot' method that performs no considerable sample manipulation. A quantitative method for the determination of eight common opioids and associated metabolites (codeine, morphine, hydrocodone, hydromorphone, norhydrocodone, oxycodone, noroxycodone and oxymorphone) in OF is described herein. OF sample is diluted 10-fold in methanol/water and then analyzed using an Agilent chromatographic stack coupled with an AB SCIEX 4500. The method has a 2.2-min LC gradient and a cycle time of 2.9 min. In contrast to most published methods of this particular type, this method uses no sample clean-up or concentration and has a considerably faster LC gradient, making it ideal for very high-throughput laboratories. Importantly, the method requires only 100 μL of sample and is diluted 10-fold prior to injection to help with instrument viability. Baseline separation of all isobaric opioids listed above was achieved on a phenyl-hexyl column. The validated calibration range for this method is 2.5-1,000 ng/mL. This 'dilute-and-shoot' method removes the unnecessary, costly and time-consuming extraction steps found in traditional methods and still surpasses all analytical requirements. PMID:26378142

  4. Real time quantitative phase microscopy based on single-shot transport of intensity equation (ssTIE) method

    NASA Astrophysics Data System (ADS)

    Yu, Wei; Tian, Xiaolin; He, Xiaoliang; Song, Xiaojun; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-08-01

    Microscopy based on the transport of intensity equation provides quantitative phase distributions, which opens another perspective for cellular observation. However, it requires multi-focal image capture, and the mechanical or electrical scanning involved limits its real-time capability in sample detection. Here, in order to overcome this restriction, real-time quantitative phase microscopy based on the single-shot transport of intensity equation (ssTIE) method is proposed. A programmed phase mask is designed to realize simultaneous multi-focal image recording without any scanning; thus, phase distributions can be quantitatively retrieved in real time. It is believed that the proposed method can be applied in various biological and medical applications, especially live cell imaging.
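
    For a roughly uniform in-focus intensity I0, the transport of intensity equation reduces to a Poisson equation for the phase, laplacian(phi) = -(2*pi/lambda) * (dI/dz) / I0, which can be inverted with FFTs once dI/dz is estimated from two defocused images (such as the two simultaneously recorded multi-focal images). The sketch below is a generic TIE solver under that uniform-intensity assumption, with placeholder arrays and parameter values; it is not the authors' reconstruction code.

    import numpy as np

    def tie_phase(i_minus, i_plus, dz, wavelength, pixel_size, i0=1.0, eps=1e-9):
        """Recover phase from two symmetrically defocused images via the TIE,
        assuming a roughly uniform in-focus intensity i0."""
        didz = (i_plus - i_minus) / (2.0 * dz)           # axial intensity derivative
        k = 2.0 * np.pi / wavelength

        ny, nx = didz.shape
        fx = np.fft.fftfreq(nx, d=pixel_size)
        fy = np.fft.fftfreq(ny, d=pixel_size)
        fxx, fyy = np.meshgrid(fx, fy)
        lap = -4.0 * np.pi ** 2 * (fxx ** 2 + fyy ** 2)  # Fourier symbol of the Laplacian

        rhs = -k * didz / i0                             # Poisson equation: lap(phi) = rhs
        phi_hat = np.fft.fft2(rhs) / (lap - eps)         # regularized inverse Laplacian
        phi_hat[0, 0] = 0.0                              # drop the undetermined mean phase
        return np.real(np.fft.ifft2(phi_hat))

    # Usage with placeholder arrays (replace with the two multi-focal images):
    img_minus = np.ones((256, 256))
    img_plus = np.ones((256, 256))
    phase = tie_phase(img_minus, img_plus, dz=2e-6, wavelength=633e-9, pixel_size=6.5e-6)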

  5. 4D Seismic Monitoring at the Ketzin Pilot Site during five years of storage - Results and Quantitative Assessment

    NASA Astrophysics Data System (ADS)

    Lüth, Stefan; Ivanova, Alexandra; Ivandic, Monika; Götz, Julia

    2015-04-01

    The Ketzin pilot site for geological CO2 storage was operative between June 2008 and August 2013. In this period, 67 kt of CO2 were injected (Martens et al., this conference). Repeated 3D seismic monitoring surveys were performed before and during CO2 injection. A third repeat survey, providing data from the post-injection phase, is currently being prepared for the autumn of 2015. The large-scale 3D surface seismic measurements have been complemented by other geophysical and geochemical monitoring methods, among which are high-resolution seismic surface-downhole observations. These observations concentrate on the reservoir area in the vicinity of the injection well and provide high-resolution images as well as data for petrophysical quantification of the CO2 distribution in the reservoir. The Ketzin pilot site is a saline aquifer site in an onshore environment, which poses specific challenges for reliable monitoring of the injected CO2. Although much effort was made to keep acquisition conditions as identical as possible, a high degree of repeatability noise was observed, mainly due to varying weather conditions and to variations in the acquisition geometries for logistical reasons. Nevertheless, time-lapse processing succeeded in generating 3D time-lapse data sets which could be interpreted in terms of CO2-storage-related amplitude variations in the depth range of the storage reservoir. The time-lapse seismic data, pulsed-neutron-gamma logging results (saturation), and petrophysical core measurements were interpreted together in order to estimate the amount of injected carbon dioxide imaged by the seismic repeat data. For the first repeat survey, the estimated mass summed to 20.5 kt, approximately 7% less than what had been injected by then. For the second repeat survey, the estimated mass was approximately 10-15% less than what had been injected. The deviations may be explained by several factors

  6. Evaluating Multiple Prevention Programs: Methods, Results, and Lessons Learned

    ERIC Educational Resources Information Center

    Adler-Baeder, Francesca; Kerpelman, Jennifer; Griffin, Melody M.; Schramm, David G.

    2010-01-01

    Extension faculty and agents/educators are increasingly collaborating with local and state agencies to provide and evaluate multiple, distinct programs, yet there is limited information about measuring outcomes and combining results across similar program types. This article explicates the methods and outcomes of a state-level evaluation of…

  7. Advances in statistical methods to map quantitative trait loci in outbred populations.

    PubMed

    Hoeschele, I; Uimari, P; Grignola, F E; Zhang, Q; Gage, K M

    1997-11-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown.
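
    One of the simple, computationally inexpensive approaches mentioned above, regressing squared phenotypic differences of relative pairs on their estimated identity-by-descent sharing at a locus, is the classic Haseman-Elston regression; linkage to a QTL shows up as a negative slope. The sketch below illustrates the idea on simulated sib-pair data (all values are invented).

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n_pairs = 500

    # Simulated sib pairs: estimated IBD sharing at a marker (0, 0.5 or 1) and a
    # squared phenotypic difference that shrinks as sharing increases (linked QTL).
    ibd = rng.choice([0.0, 0.5, 1.0], size=n_pairs, p=[0.25, 0.5, 0.25])
    sq_diff = 2.0 - 1.2 * ibd + rng.normal(scale=0.8, size=n_pairs) ** 2

    # Haseman-Elston: regress squared differences on IBD proportion; evidence for
    # linkage is a significantly negative slope (tested one-sided in practice).
    slope, intercept, r, p_two_sided, se = stats.linregress(ibd, sq_diff)
    p_one_sided = p_two_sided / 2 if slope < 0 else 1 - p_two_sided / 2
    print(f"slope = {slope:.2f}, one-sided p = {p_one_sided:.3g}")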

  8. Evaluation of a quantitative fit testing method for N95 filtering facepiece respirators.

    PubMed

    Janssen, Larry; Luinenburg, Michael D; Mullins, Haskell E; Danisch, Susan G; Nelson, Thomas J

    2003-01-01

    A method for performing quantitative fit tests (QNFT) with N95 filtering facepiece respirators was developed by earlier investigators. The method employs a simple clamping device to allow the penetration of submicron aerosols through N95 filter media to be measured. The measured value is subtracted from total penetration, with the assumption that the remaining penetration represents faceseal leakage. The developers have used the clamp to assess respirator performance. This study evaluated the clamp's ability to measure filter penetration and determine fit factors. In Phase 1, subjects were quantitatively fit-tested with elastomeric half-facepiece respirators using both generated and ambient aerosols. QNFT were done with each aerosol with both P100 and N95 filters without disturbing the facepiece. In Phase 2 of the study elastomeric half facepieces were sealed to subjects' faces to eliminate faceseal leakage. Ambient aerosol QNFT were performed with P100 and N95 filters without disturbing the facepiece. In both phases the clamp was used to measure N95 filter penetration, which was then subtracted from total penetration for the N95 QNFT. It was hypothesized that N95 fit factors corrected for filter penetration would equal the P100 fit factors. Mean corrected N95 fit factors were significantly different from the P100 fit factors in each phase of the study. In addition, there was essentially no correlation between corrected N95 fit factors and P100 fit factors. It was concluded that the clamp method should not be used to fit-test N95 filtering facepieces or otherwise assess respirator performance. PMID:12908863
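
    The correction evaluated above works on penetration fractions: the filter penetration measured with the clamp is subtracted from the total penetration measured during the fit test, the remainder is attributed to faceseal leakage, and its reciprocal gives the corrected fit factor. A minimal sketch of that arithmetic, with made-up penetration values, follows.

    def corrected_fit_factor(total_penetration, filter_penetration):
        """Fit factor after subtracting filter penetration from total penetration;
        the remaining penetration is attributed to faceseal leakage."""
        faceseal_leakage = total_penetration - filter_penetration
        if faceseal_leakage <= 0:
            raise ValueError("filter penetration exceeds total penetration")
        return 1.0 / faceseal_leakage

    # Hypothetical N95 test: 3% of the challenge aerosol penetrates in total,
    # of which 2% is measured (with the clamp) to pass through the filter media.
    print(f"Uncorrected fit factor: {1.0 / 0.03:.0f}")
    print(f"Corrected fit factor:   {corrected_fit_factor(0.03, 0.02):.0f}")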

  9. Evaluation of residual antibacterial potency in antibiotic production wastewater using a real-time quantitative method.

    PubMed

    Zhang, Hong; Zhang, Yu; Yang, Min; Liu, Miaomiao

    2015-11-01

    While antibiotic pollution has attracted considerable attention due to its potential in promoting the dissemination of antibiotic resistance genes in the environment, the antibiotic activity of their related substances has been neglected, which may underestimate the environmental impacts of antibiotic wastewater discharge. In this study, a real-time quantitative approach was established to evaluate the residual antibacterial potency of antibiotics and related substances in antibiotic production wastewater (APW) by comparing the growth of a standard bacterial strain (Staphylococcus aureus) in tested water samples with a standard reference substance (e.g. oxytetracycline). Antibiotic equivalent quantity (EQ) was used to express antibacterial potency, which made it possible to assess the contribution of each compound to the antibiotic activity in APW. The real-time quantitative method showed better repeatability (Relative Standard Deviation, RSD 1.08%) compared with the conventional fixed growth time method (RSD 5.62-11.29%). And its quantification limits ranged from 0.20 to 24.00 μg L(-1), depending on the antibiotic. We applied the developed method to analyze the residual potency of water samples from four APW treatment systems, and confirmed a significant contribution from antibiotic transformation products to potent antibacterial activity. Specifically, neospiramycin, a major transformation product of spiramycin, was found to contribute 13.15-22.89% of residual potency in spiramycin production wastewater. In addition, some unknown related substances with antimicrobial activity were indicated in the effluent. This developed approach will be effective for the management of antibacterial potency discharge from antibiotic wastewater and other waste streams.
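
    Expressing residual potency as an antibiotic equivalent quantity (EQ) amounts to reading a sample's growth-inhibition response off a standard curve built with the reference antibiotic (oxytetracycline in the example above). The sketch below shows one plausible way to do that by interpolating on a log-concentration standard curve; the inhibition values, concentrations and function names are hypothetical, and the authors' exact real-time calculation may differ.

    import numpy as np

    # Standard curve: growth inhibition of S. aureus (relative to an antibiotic-free
    # control) at known oxytetracycline concentrations (ug/L). Values are made up.
    std_conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])
    std_inhibition = np.array([0.05, 0.15, 0.35, 0.60, 0.82, 0.95])

    def antibiotic_equivalent(sample_inhibition):
        """Interpolate the sample's inhibition on the log10(concentration) axis to
        express its potency as an oxytetracycline-equivalent concentration."""
        log_eq = np.interp(sample_inhibition, std_inhibition, np.log10(std_conc))
        return 10.0 ** log_eq

    # A wastewater sample inhibiting growth by 50% maps to its EQ as follows:
    print(f"EQ: {antibiotic_equivalent(0.50):.1f} ug oxytetracycline-equivalents per L")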

  10. A quantitative method for zoning of protected areas and its spatial ecological implications.

    PubMed

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

    Zoning is a key prescriptive tool for administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and we evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. Main innovations involve a quadratic function of distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e. a solution can only be obtained in non-polynomial time [Nemhausser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. For that purpose, we applied a simulated annealing heuristic implemented as a FORTRAN language routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; non-reciprocal land use weightings contributed to better representing management decisions, and influenced mainly the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many zonation scheme alternatives with minimum cost, time and effort. This ability provides a new tool to
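
    The zoning model described above is a quadratic assignment-type problem: each land unit is given one use, and the objective combines non-reciprocal (non-symmetric) weights between uses with a quadratic function of the distance between units, minimized here by simulated annealing. The sketch below is a deliberately simplified, hypothetical version of that idea (a small grid, invented weights, no connectivity constraint); it is not the authors' model or code.

    import numpy as np

    rng = np.random.default_rng(3)

    n_units, n_uses = 25, 3                      # 5x5 grid of land units, 3 zone types
    xy = np.array([(i // 5, i % 5) for i in range(n_units)], dtype=float)
    dist2 = ((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1)   # squared distances

    # Non-symmetric compatibility weights between uses (e.g. core, buffer, public):
    # W[a, b] is the cost of use b sitting near use a (made-up numbers).
    W = np.array([[0.0, 1.0, 5.0],
                  [0.5, 0.0, 1.0],
                  [4.0, 1.0, 0.0]])

    def cost(assign):
        # Quadratic-assignment style objective: use-pair weight damped by squared distance.
        w = W[assign[:, None], assign[None, :]]
        return float(np.sum(w / (1.0 + dist2)))

    assign = rng.integers(0, n_uses, size=n_units)
    cur_cost = cost(assign)
    best, best_cost = assign.copy(), cur_cost
    temp = 5.0
    for step in range(20000):
        cand = assign.copy()
        cand[rng.integers(n_units)] = rng.integers(n_uses)     # flip one unit's use
        cand_cost = cost(cand)
        delta = cand_cost - cur_cost
        if delta < 0 or rng.random() < np.exp(-delta / temp):  # Metropolis acceptance
            assign, cur_cost = cand, cand_cost
            if cur_cost < best_cost:
                best, best_cost = assign.copy(), cur_cost
        temp *= 0.9995                                          # geometric cooling
    print(best.reshape(5, 5), best_cost)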

  11. Evaluation of residual antibacterial potency in antibiotic production wastewater using a real-time quantitative method.

    PubMed

    Zhang, Hong; Zhang, Yu; Yang, Min; Liu, Miaomiao

    2015-11-01

    While antibiotic pollution has attracted considerable attention due to its potential in promoting the dissemination of antibiotic resistance genes in the environment, the antibiotic activity of their related substances has been neglected, which may underestimate the environmental impacts of antibiotic wastewater discharge. In this study, a real-time quantitative approach was established to evaluate the residual antibacterial potency of antibiotics and related substances in antibiotic production wastewater (APW) by comparing the growth of a standard bacterial strain (Staphylococcus aureus) in tested water samples with a standard reference substance (e.g. oxytetracycline). Antibiotic equivalent quantity (EQ) was used to express antibacterial potency, which made it possible to assess the contribution of each compound to the antibiotic activity in APW. The real-time quantitative method showed better repeatability (Relative Standard Deviation, RSD 1.08%) compared with the conventional fixed growth time method (RSD 5.62-11.29%). And its quantification limits ranged from 0.20 to 24.00 μg L(-1), depending on the antibiotic. We applied the developed method to analyze the residual potency of water samples from four APW treatment systems, and confirmed a significant contribution from antibiotic transformation products to potent antibacterial activity. Specifically, neospiramycin, a major transformation product of spiramycin, was found to contribute 13.15-22.89% of residual potency in spiramycin production wastewater. In addition, some unknown related substances with antimicrobial activity were indicated in the effluent. This developed approach will be effective for the management of antibacterial potency discharge from antibiotic wastewater and other waste streams. PMID:26395288

  12. From "weight of evidence" to quantitative data integration using multicriteria decision analysis and Bayesian methods.

    PubMed

    Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas

    2015-01-01

    "Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multi criteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE towards a quantitative direction that includes Bayesian and multi criteria decision analysis.

  13. From "weight of evidence" to quantitative data integration using multicriteria decision analysis and Bayesian methods.

    PubMed

    Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas

    2015-01-01

    "Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multi criteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE towards a quantitative direction that includes Bayesian and multi criteria decision analysis. PMID:25592482

  14. Introduction of a method for quantitative evaluation of spontaneous motor activity development with age in infants.

    PubMed

    Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter

    2012-04-01

    Coordination between perception and action is required to interact with the environment successfully. This is already trained by very young infants who perform spontaneous movements to learn how their body interacts with the environment. The strategies used by the infants for this purpose change with age. Therefore, very early progresses in action control made by the infants can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced, which allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The introduced methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies like infantile cerebral palsy influence development of motor activity significantly. Since the introduced methodology is objective and quantitative, it is suitable to monitor how newborns train their cognitive processes, which will enable them to cope with their environment by motor interaction.

  15. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research.

  16. Characterization of a method for quantitating food consumption for mutation assays in Drosophila

    SciTech Connect

    Thompson, E.D.; Reeder, B.A.; Bruce, R.D. )

    1991-01-01

    Quantitation of food consumption is necessary when determining mutation responses to multiple chemical exposures in the sex-linked recessive lethal assay in Drosophila. One method proposed for quantitating food consumption by Drosophila is to measure the incorporation of 14C-leucine into the flies during the feeding period. Three sources of variation in the technique of Thompson and Reeder have been identified and characterized. First, the amount of food consumed by individual flies differed by almost 30% in a 24 hr feeding period. Second, the variability from vial to vial (each containing multiple flies) was around 15%. Finally, the amount of food consumed in identical feeding experiments performed over the course of 1 year varied nearly 2-fold. The use of chemical consumption values in place of exposure levels provided a better means of expressing the combined mutagenic response. In addition, the kinetics of food consumption over a 3 day feeding period for exposures to cyclophosphamide which produce lethality were compared to non-lethal exposures. Extensive characterization of lethality induced by exposures to cyclophosphamide demonstrate that the lethality is most likely due to starvation, not chemical toxicity.

  17. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  18. Simple, Rapid and Inexpensive Quantitative Fluorescent PCR Method for Detection of Microdeletion and Microduplication Syndromes

    PubMed Central

    Stofanko, Martin; Gonçalves-Dornelas, Higgor; Cunha, Pricila Silva; Pena, Heloísa B.; Vianna-Morgante, Angela M.; Pena, Sérgio Danilo Junho

    2013-01-01

    Because of economic limitations, the cost-effective diagnosis of patients affected with rare microdeletion or microduplication syndromes is a challenge in developing countries. Here we report a sensitive, rapid, and affordable detection method that we have called Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR). Our procedure is based on the finding of genomic regions with high homology to segments of the critical microdeletion/microduplication region. PCR amplification of both using the same primer pair, establishes competitive kinetics and relative quantification of amplicons, as happens in microsatellite-based Quantitative Fluorescence PCR. We used patients with two common microdeletion syndromes, the Williams-Beuren syndrome (7q11.23 microdeletion) and the 22q11.2 microdeletion syndromes and discovered that MQF-PCR could detect both with 100% sensitivity and 100% specificity. Additionally, we demonstrated that the same principle could be reliably used for detection of microduplication syndromes, by using patients with the Lubs (MECP2 duplication) syndrome and the 17q11.2 microduplication involving the NF1 gene. We propose that MQF-PCR is a useful procedure for laboratory confirmation of the clinical diagnosis of microdeletion/microduplication syndromes, ideally suited for use in developing countries, but having general applicability as well. PMID:23620743
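
    At analysis time, the competitive amplification underlying MQF-PCR reduces to comparing the fluorescent peak areas of the test amplicon and its paralogous reference amplicon: a ratio near 1 suggests normal dosage, a ratio near 0.5 a deletion, and a ratio near 1.5 a duplication. The sketch below shows such a ratio test; the peak areas and decision cutoffs are illustrative, not the published thresholds.

    def dosage_call(test_peak_area, reference_peak_area,
                    del_cutoff=0.7, dup_cutoff=1.3):
        """Classify copy number from the ratio of competitively amplified peak areas.
        Cutoffs are illustrative, not the published decision thresholds."""
        ratio = test_peak_area / reference_peak_area
        if ratio < del_cutoff:
            return ratio, "microdeletion suspected"
        if ratio > dup_cutoff:
            return ratio, "microduplication suspected"
        return ratio, "normal dosage"

    # Hypothetical capillary-electrophoresis peak areas for a 22q11.2 assay.
    print(dosage_call(5200.0, 10100.0))   # ratio ~0.51 -> deletion pattern
    print(dosage_call(9800.0, 10100.0))   # ratio ~0.97 -> normal pattern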

  19. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method

    PubMed Central

    Yang, Ganglong; Xu, Zhipeng; Lu, Wei; Li, Xiang; Sun, Chengwen; Guo, Jia; Xue, Peng; Guan, Feng

    2015-01-01

    The best way to increase patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low grade nonmuscle invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography LTQ Orbitrap mass spectrometry. Among 3721 unique identified and annotated proteins in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer. PMID:26230496

  20. A novel image-based quantitative method for the characterization of NETosis

    PubMed Central

    Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.

    2015-01-01

    NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex-vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624

  1. [THE COMPARATIVE ANALYSIS OF RESULTS OF DETECTION OF CARCINOGENIC TYPES OF HUMAN PAPILLOMA VIRUS BY QUALITATIVE AND QUANTITATIVE TESTS].

    PubMed

    Kuzmenko, E T; Labigina, A V; Leshenko, O Ya; Rusanov, D N; Kuzmenko, V V; Fedko, L P; Pak, I P

    2015-05-01

    Screening results (n = 3208; sexually active individuals aged 18 to 59 years) were analyzed to detect oncogenic types of human papilloma virus using qualitative (1150 females and 720 males) and quantitative (real-time polymerase chain reaction; 843 females and 115 males) techniques. High-oncogenic-risk human papilloma virus was detected in 65% and 68.4% of females and in 48.6% and 53% of males, respectively. Among the 12 types of human papilloma virus, type 16 was the most frequently detected, irrespective of gender and of the technique used. In females, the detection rate of human papilloma virus 16 was 18.3% (n = 280) with qualitative tests and 14.9% (n = 126; p ≤ 0.05) with quantitative tests. In males, the detection rate of human papilloma virus 16 was 8.3% (n = 60) with qualitative tests and 12.2% (n = 14; p ≥ 0.05) with quantitative tests. With qualitative tests, detection rates of the remaining oncogenic types of human papilloma virus ranged from 3.4% to 8.4% in females and from 1.8% to 5.9% in males. With quantitative tests in females, human papilloma virus with high viral load was found in 68.4%, with medium viral load in 2.85% (n = 24), and with low viral load in 0.24% (n = 2). With quantitative tests in males, oncogenic types of human papilloma virus were detected in 53%, and in all of these cases a high viral load was established. In females, most oncogenic types of human papilloma virus (except types 31, 39 and 59) were detected significantly more often than in males.

  2. Setting health research priorities using the CHNRI method: V. Quantitative properties of human collective knowledge

    PubMed Central

    Rudan, Igor; Yoshida, Sachiyo; Wazny, Kerri; Chan, Kit Yee; Cousens, Simon

    2016-01-01

    Introduction The CHNRI method for setting health research priorities has crowdsourcing as the major component. It uses the collective opinion of a group of experts to generate, assess and prioritize between many competing health research ideas. It is difficult to compare the accuracy of human individual and collective opinions in predicting uncertain future outcomes before the outcomes are known. However, this limitation does not apply to existing knowledge, which is an important component underlying opinion. In this paper, we report several experiments to explore the quantitative properties of human collective knowledge and discuss their relevance to the CHNRI method. Methods We conducted a series of experiments in groups of about 160 (range: 122–175) undergraduate Year 2 medical students to compare their collective knowledge to their individual knowledge. We asked them to answer 10 questions on each of the following: (i) an area in which they have a degree of expertise (undergraduate Year 1 medical curriculum); (ii) an area in which they likely have some knowledge (general knowledge); and (iii) an area in which they are not expected to have any knowledge (astronomy). We also presented them with 20 pairs of well–known celebrities and asked them to identify the older person of the pair. In all these experiments our goal was to examine how the collective answer compares to the distribution of students’ individual answers. Results When answering the questions in their own area of expertise, the collective answer (the median) was in the top 20.83% of the most accurate individual responses; in general knowledge, it was in the top 11.93%; and in an area with no expertise, the group answer was in the top 7.02%. However, the collective answer based on mean values fared much worse, ranging from top 75.60% to top 95.91%. Also, when confronted with guessing the older of the two celebrities, the collective response was correct in 18/20 cases (90%), while the 8 most
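
    The comparison reported above, i.e. where the group's collective answer ranks within the distribution of individual answers, is straightforward to reproduce for any numeric question with a known true answer; the sketch below computes, for the median and the mean, the percentage of individuals who were more accurate than the collective answer. The question, sample size and responses are invented stand-ins, not the study data.

    import numpy as np

    def collective_rank(answers, truth):
        """For the median and mean of the individual answers, report the percentage
        of individual answers that are closer to the truth than the collective answer."""
        answers = np.asarray(answers, dtype=float)
        errors = np.abs(answers - truth)
        out = {}
        for name, collective in (("median", np.median(answers)), ("mean", np.mean(answers))):
            beaten_by = np.mean(errors < abs(collective - truth)) * 100.0
            out[name] = (collective, beaten_by)
        return out

    # Hypothetical question ("how many bones in the adult human body?", truth = 206)
    # answered by 160 students.
    rng = np.random.default_rng(4)
    answers = rng.normal(210, 25, size=160).round()
    for name, (value, pct) in collective_rank(answers, 206).items():
        print(f"{name}: {value:.0f} (only {pct:.1f}% of individuals were more accurate)")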

  3. Genetic variation in flowering time induces phenological assortative mating: quantitative genetic methods applied to Brassica rapa.

    PubMed

    Weis, Arthur E; Kossler, Tanya M

    2004-06-01

    It has been argued from first principles that plants mate assortatively by flowering time. However, there have been very few studies of phenological assortative mating, perhaps because current methods to infer paternal phenotype are difficult to apply to natural populations. Two methods are presented to estimate the phenotypic correlation between mates-the quantitative genetic metric for assortative mating-for phenological traits. The first method uses individual flowering schedules to estimate mating probabilities for every potential pairing in a sample. These probabilities are then incorporated into a weighted phenotypic correlation between all potential mates and thus yield a prospective estimate based on mating opportunities. The correlation between mates can also be estimated retrospectively by comparing the regression of offspring phenotype over one parent, which is inflated by assortative mating, to the regression over mid-parent, which is not. In a demonstration experiment with Brassica rapa, the prospective correlation between flowering times (days from germination to anthesis) of pollen recipients and their potential donors was 0.58. The retrospective estimate of this correlation strongly agreed with the prospective estimate. The prospective method is easily employed in field studies that explore the effect of phenological assortative mating on selection response and population differentiation.

  4. Spectrophotometric Method for Quantitative Determination of Cefixime in Bulk and Pharmaceutical Preparation Using Ferroin Complex

    NASA Astrophysics Data System (ADS)

    Naeem Khan, M.; Qayum, A.; Ur Rehman, U.; Gulab, H.; Idrees, M.

    2015-09-01

    A method was developed for the quantitative determination of cefixime in bulk and pharmaceutical preparations using the ferroin complex. The method is based on the oxidation of cefixime with Fe(III) in acidic medium. The Fe(II) formed reacts with 1,10-phenanthroline, and the resulting ferroin complex is measured spectrophotometrically at 510 nm against a reagent blank. Beer's law was obeyed over the concentration range 0.2-10 μg/ml, with a correlation coefficient of 0.993. The molar absorptivity was calculated to be 1.375×10⁵ L/(mol·cm). The limit of detection (LOD) and limit of quantification (LOQ) were found to be 0.030 and 0.101 μg/ml, respectively. The proposed method shows good reproducibility, with a relative standard deviation of 5.28% (n = 6). The developed method was validated statistically through a recovery study and successfully applied to the determination of cefixime in bulk powder and pharmaceutical formulations without interference from common excipients. Recoveries ranged from 98.00 to 102.05% for the pure form and from 97.83 to 102.50% for pharmaceutical preparations.
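
    As a rough illustration of the calibration arithmetic behind such an assay, the sketch below fits a Beer's-law line to hypothetical absorbance readings and applies the common 3.3σ/slope and 10σ/slope conventions for LOD and LOQ. The abstract does not state how its limits were derived, so the conventions and all numbers here are assumptions.

      import numpy as np

      # Hypothetical calibration data: absorbance at 510 nm vs. cefixime (ug/mL)
      conc = np.array([0.2, 1.0, 2.0, 5.0, 8.0, 10.0])
      absorbance = np.array([0.021, 0.105, 0.212, 0.518, 0.835, 1.040])

      slope, intercept = np.polyfit(conc, absorbance, 1)
      residual_sd = np.std(absorbance - (slope * conc + intercept), ddof=2)

      lod = 3.3 * residual_sd / slope    # common ICH-style convention (assumed)
      loq = 10.0 * residual_sd / slope

      sample_abs = 0.430                 # hypothetical sample reading
      print(f"sample concentration ~ {(sample_abs - intercept) / slope:.2f} ug/mL")
      print(f"LOD ~ {lod:.3f} ug/mL, LOQ ~ {loq:.3f} ug/mL")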

  5. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    NASA Astrophysics Data System (ADS)

    Ryan, C. G.; Laird, J. S.; Fisher, L. A.; Kirkham, R.; Moorhead, G. F.

    2015-11-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data, both for off-line analysis and in real time embedded in a data acquisition system. Initially, it assumes uniform sample composition and background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to account for chemical concentration gradients, differential absorption effects, and pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, together with DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates, through examples and Monte Carlo simulations, how it better tracks spatially complex composition and background shape while still benefiting from the speed of DA.
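
    Treating each pixel as an admixture of end-member terms amounts to a per-pixel unmixing step. The sketch below shows a generic non-negative least-squares version of that idea; it is not the GeoPIXE implementation, and the array shapes and names are assumptions.

      import numpy as np
      from scipy.optimize import nnls

      def unmix_pixels(pixel_spectra, end_members):
          """Express each pixel as a non-negative mixture of end-member spectra.

          pixel_spectra: (n_pixels, n_channels) array of per-pixel spectra.
          end_members:   (n_members, n_channels) array of end-member spectra.
          Returns an (n_pixels, n_members) array of mixing weights.
          """
          A = end_members.T                       # channels x members
          weights = np.empty((pixel_spectra.shape[0], end_members.shape[0]))
          for i, spectrum in enumerate(pixel_spectra):
              weights[i], _ = nnls(A, spectrum)   # per-pixel admixture
          return weights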

  6. Some methods of obtaining quantitative structure-activity relationships for quantities of environmental interest

    SciTech Connect

    Charton, M.

    1985-09-01

    Methods are described for obtaining quantitative structure-activity relationships (QSAR) for the estimation of quantities of environmental interest. Toxicities of alkylamines and of alkyl alkanoates are well correlated by the alkyl bioactivity branching (ABB) equation. Narcotic activities of 1,1-disubstituted ethylenes are correlated by the intermolecular forces bioactivity (IMF) equation. When the data set has a limited number of substituents in equivalent positions, the group number (GN) equation, derivable from the IMF equation, can be used for correlation. It has been successfully applied to aqueous solubilities, 1-octanol-water partition coefficients, bioaccumulation factors, and ecological magnifications for organochlorine compounds. A combination of the omega method for combining data sets for different organisms with the GN equation has been used to correlate toxicities of organochlorine insecticides in two species of fish. Toxicities of carbamates have been correlated by a combination of the zeta method and the IMFB equation. The ABB and GN equations are particularly useful in that they generally do not require parameter tables and the parameters they use are error-free. The methods presented here, as shown by the examples given, should make it possible to establish a collection of QSAR for toxicities, bioaccumulation factors, aqueous solubilities, partition coefficients, and other properties of sets of compounds of environmental interest. 29 references.

  7. A simple, quantitative method using alginate gel to determine rat colonic tumor volume in vivo.

    PubMed

    Irving, Amy A; Young, Lindsay B; Pleiman, Jennifer K; Konrath, Michael J; Marzella, Blake; Nonte, Michael; Cacciatore, Justin; Ford, Madeline R; Clipson, Linda; Amos-Landgraf, James M; Dove, William F

    2014-04-01

    Many studies of the response of colonic tumors to therapeutics use tumor multiplicity as the endpoint to determine the effectiveness of the agent. These studies can be greatly enhanced by accurate measurements of tumor volume. Here we present a quantitative method to easily and accurately determine colonic tumor volume. This approach uses a biocompatible alginate to create a negative mold of a tumor-bearing colon; this mold is then used to make positive casts of dental stone that replicate the shape of each original tumor. The weight of the dental stone cast correlates highly with the weight of the dissected tumors. After refinement of the technique, overall error in tumor volume was 16.9% ± 7.9% and includes error from both the alginate and dental stone procedures. Because this technique is limited to molding of tumors in the colon, we utilized the Apc(Pirc/+) rat, which has a propensity for developing colonic tumors that reflect the location of the majority of human intestinal tumors. We have successfully used the described method to determine tumor volumes ranging from 4 to 196 mm³. Alginate molding combined with dental stone casting is a facile method for determining tumor volume in vivo without costly equipment or knowledge of analytic software. This broadly accessible method creates the opportunity to objectively study colonic tumors over time in living animals in conjunction with other experiments and without transferring animals from the facility where they are maintained.

  8. A Simple, Quantitative Method Using Alginate Gel to Determine Rat Colonic Tumor Volume In Vivo

    PubMed Central

    Irving, Amy A; Young, Lindsay B; Pleiman, Jennifer K; Konrath, Michael J; Marzella, Blake; Nonte, Michael; Cacciatore, Justin; Ford, Madeline R; Clipson, Linda; Amos-Landgraf, James M; Dove, William F

    2014-01-01

    Many studies of the response of colonic tumors to therapeutics use tumor multiplicity as the endpoint to determine the effectiveness of the agent. These studies can be greatly enhanced by accurate measurements of tumor volume. Here we present a quantitative method to easily and accurately determine colonic tumor volume. This approach uses a biocompatible alginate to create a negative mold of a tumor-bearing colon; this mold is then used to make positive casts of dental stone that replicate the shape of each original tumor. The weight of the dental stone cast correlates highly with the weight of the dissected tumors. After refinement of the technique, overall error in tumor volume was 16.9% ± 7.9% and includes error from both the alginate and dental stone procedures. Because this technique is limited to molding of tumors in the colon, we utilized the Apc(Pirc/+) rat, which has a propensity for developing colonic tumors that reflect the location of the majority of human intestinal tumors. We have successfully used the described method to determine tumor volumes ranging from 4 to 196 mm³. Alginate molding combined with dental stone casting is a facile method for determining tumor volume in vivo without costly equipment or knowledge of analytic software. This broadly accessible method creates the opportunity to objectively study colonic tumors over time in living animals in conjunction with other experiments and without transferring animals from the facility where they are maintained. PMID:24674588

  9. Quantitative radiochemical method for determination of major sources of natural radioactivity in ores and minerals

    USGS Publications Warehouse

    Rosholt, J.N.

    1954-01-01

    When an ore sample contains radioactivity other than that attributable to the uranium series in equilibrium, a quantitative analysis of the other emitters must be made in order to determine the source of this activity. Thorium-232, radon-222, and lead-210 have been determined by isolation and subsequent activity analysis of some of their short-lived daughter products. The sulfides of bismuth and polonium are precipitated out of solutions of thorium or uranium ores, and the α-particle activity of polonium-214, polonium-212, and polonium-210 is determined by scintillation-counting techniques. Polonium-214 activity is used to determine radon-222, polonium-212 activity for thorium-232, and polonium-210 for lead-210. The development of these methods of radiochemical analysis will facilitate the rapid determination of some of the major sources of natural radioactivity.

  10. Estimation methods for monthly humidity from dynamical downscaling data for quantitative assessments of climate change impacts

    NASA Astrophysics Data System (ADS)

    Ueyama, Hideki

    2012-07-01

    Methods are proposed for estimating monthly relative humidity and wet-bulb temperature from observations and dynamically downscaled output of a general circulation model coupled with a regional climate model (RCM), for quantitative assessment of climate change impacts. The water vapor pressure estimation model developed was a regression model on monthly saturation water vapor pressure, using minimum air temperature as a variable. The monthly minimum air temperature correction model for RCM bias was developed by stepwise multiple regression analysis, using the difference in monthly minimum air temperature between observations and RCM output as the dependent variable and geographic factors as independent variables. The wet-bulb temperature was estimated from the estimated water vapor pressure and from air temperature and atmospheric pressure at ground level, both corrected for RCM bias. Root mean square errors decreased considerably in August.
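
    For a concrete starting point, the sketch below computes saturation vapor pressure with a Magnus-type formula and relative humidity from an actual vapor pressure. The coefficients are a commonly used Magnus parameter set, not the regression developed in the paper, and the input values are hypothetical.

      import math

      def saturation_vapor_pressure_hpa(t_celsius):
          """Magnus-type approximation of saturation vapor pressure (hPa)."""
          return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

      def relative_humidity(vapor_pressure_hpa, t_celsius):
          """Relative humidity (%) from actual vapor pressure and temperature."""
          return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure_hpa(t_celsius)

      # Hypothetical monthly values: e = 18 hPa at a mean temperature of 22 C
      print(f"RH ~ {relative_humidity(18.0, 22.0):.0f}%")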

  11. Prediction of activated carbon adsorption capacities for organic vapors using quantitative structure-activity relationship methods

    SciTech Connect

    Nirmalakhandan, N. N.; Speece, R. E.

    1993-08-01

    Quantitative structure-activity relationship (QSAR) methods were used to develop models to estimate and predict activated carbon adsorption capacities for organic vapors. Literature isothermal data from two sources for 22 organic contaminants on six different carbons were merged to form a training set of 75 data points. Two different QSAR approaches were evaluated: the molecular connectivity approach and the linear solvation energy relationship approach. The QSAR model developed in this study using the molecular connectivity approach was able to fit the experimental data with r = 0.96 and standard error of 0.09. The utility of the model was demonstrated by using predicted k values to calculate adsorption capacities of 12 chemicals on two different carbons and comparing them with experimentally determined values. 9 refs., 1 fig., 3 tabs.
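
    A generic least-squares QSAR fit that reports r and the standard error, the two statistics quoted above, can be sketched as follows. The descriptor values and adsorption capacities are hypothetical, and this is not the molecular connectivity model itself.

      import numpy as np

      def fit_qsar(descriptors, response):
          """Ordinary least-squares QSAR fit; returns coefficients, r, and SE."""
          X = np.column_stack([np.ones(len(response)), descriptors])
          coef, *_ = np.linalg.lstsq(X, response, rcond=None)
          predicted = X @ coef
          residuals = response - predicted
          r = np.corrcoef(predicted, response)[0, 1]
          se = np.sqrt(residuals @ residuals / (len(response) - X.shape[1]))
          return coef, r, se

      # Hypothetical training set: one connectivity index per compound
      chi = np.array([[2.0], [2.5], [3.0], [3.5], [4.0], [4.5]])
      log_capacity = np.array([-1.10, -0.80, -0.55, -0.30, -0.05, 0.20])
      coef, r, se = fit_qsar(chi, log_capacity)
      print(f"r = {r:.3f}, SE = {se:.3f}")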

  12. A method for quantitative determination of furanocoumarins in capsules and tablets of phytochemical preparations.

    PubMed

    Cardoso, Claudia Andréa Lima; Pires, Adriana Elias; Honda, Neli Kika

    2006-04-01

    A method for sample preparation and analysis by high-performance liquid chromatography with UV detection (HPLC-UV) was developed for the analysis of psoralen, bergapten, and 5-[3-(4,5-dihydro-5,5-dimethyl-4-oxo-2-furanyl)-butoxy]-7H-furo[3,2-g][1]benzopyran-7-one in capsules and tablets employed in Brazil for certain illnesses. The linearity, accuracy, and inter- and intra-day precision of the procedure were evaluated. Analytical curves for the furanocoumarins were linear in the range of 1.0-50.0 microg/ml. Recoveries of the furanocoumarins in the products analyzed were 97.3-99.5%, and the coefficient of variation for the quantitative analysis of the furanocoumarins was under 5%. For the inter-equipment study, gas chromatography (GC) was employed. PMID:16595942
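
    The recovery and precision figures reported for such a procedure reduce to simple arithmetic on replicate determinations; a short Python sketch with hypothetical replicate values is shown below.

      import numpy as np

      def percent_recovery(measured, spiked):
          """Mean recovery (%) of spiked amounts."""
          return 100.0 * np.mean(np.asarray(measured) / np.asarray(spiked))

      def coefficient_of_variation(values):
          """Relative standard deviation (%) of replicate determinations."""
          values = np.asarray(values, dtype=float)
          return 100.0 * values.std(ddof=1) / values.mean()

      # Hypothetical replicate results for psoralen in a tablet extract (ug/mL)
      replicates = [10.2, 9.9, 10.1, 10.4, 9.8, 10.0]
      print(f"CV = {coefficient_of_variation(replicates):.1f}%")
      print(f"recovery = {percent_recovery([9.8, 10.1, 9.9], [10.0, 10.0, 10.0]):.1f}%")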

  13. Rapid and Inexpensive Screening of Genomic Copy Number Variations Using a Novel Quantitative Fluorescent PCR Method

    PubMed Central

    Han, Joan C.; Elsea, Sarah H.; Pena, Heloísa B.; Pena, Sérgio Danilo Junho

    2013-01-01

    Detection of human microdeletion and microduplication syndromes poses a significant burden on public healthcare systems in developing countries. With genome-wide diagnostic assays frequently inaccessible, targeted low-cost PCR-based approaches are preferred. However, their reproducibility depends on equally efficient amplification using a number of target and control primers. To address this, the recently described technique of Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR) was shown to reliably detect four human syndromes by quantifying DNA amplification in an internally controlled PCR reaction. Here, we confirm its utility in the detection of eight human microdeletion syndromes, including the more common WAGR, Smith-Magenis, and Potocki-Lupski syndromes, with 100% sensitivity and 100% specificity. We present the selection, design, and performance evaluation of detection primers using a variety of approaches. We conclude that MQF-PCR is an easily adaptable method for the detection of human pathological chromosomal aberrations. PMID:24288428
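
    MQF-PCR calls gene dosage from the relative amplification of target and control amplicons. The sketch below shows one simple way such a ratio could be interpreted; the 0.5/1.0/1.5 expectations and the cutoffs are stated as assumptions for illustration, not as the published algorithm.

      def call_copy_number(target_signal, control_signal,
                           del_cutoff=0.75, dup_cutoff=1.25):
          """Classify a locus from the target/control fluorescence ratio.

          Cutoffs are illustrative: a heterozygous deletion is expected near
          0.5, normal dosage near 1.0, and a duplication near 1.5.
          """
          ratio = target_signal / control_signal
          if ratio < del_cutoff:
              return ratio, "deletion"
          if ratio > dup_cutoff:
              return ratio, "duplication"
          return ratio, "normal"

      print(call_copy_number(5200.0, 10100.0))   # hypothetical peak areas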

  14. Methods for quantitative evaluation of dynamics of repair proteins within irradiated cells

    NASA Astrophysics Data System (ADS)

    Hable, V.; Dollinger, G.; Greubel, C.; Hauptner, A.; Krücken, R.; Dietzel, S.; Cremer, T.; Drexler, G. A.; Friedl, A. A.; Löwe, R.

    2006-04-01

    Living HeLa cells are irradiated in a targeted manner with single 100 MeV oxygen ions using the superconducting ion microprobe SNAKE, the Superconducting Nanoscope for Applied Nuclear (=Kern-) Physics Experiments, at the Munich 14 MV tandem accelerator. Various proteins involved directly or indirectly in repair processes accumulate as clusters (so-called foci) at DNA double-strand breaks (DSBs) induced by the ions. The spatiotemporal dynamics of the foci formed by the phosphorylated histone γ-H2AX are studied. For this purpose, cells are irradiated in line patterns. The γ-H2AX is visualized under the fluorescence microscope using immunofluorescence techniques. Quantitative analysis methods are developed to evaluate the microscopic images in order to analyze the movement of the foci and changes in their size.

  15. Quantitative test method for evaluation of anti-fingerprint property of coated surfaces

    NASA Astrophysics Data System (ADS)

    Wu, Linda Y. L.; Ngian, S. K.; Chen, Z.; Xuan, D. T. T.

    2011-01-01

    An artificial fingerprint liquid is formulated from artificial sweat, hydroxyl-terminated polydimethylsiloxane, and a solvent for direct determination of the anti-fingerprint property of a coated surface. A range of smooth and rough surfaces with different anti-fingerprint (AF) properties was fabricated by sol-gel technology, on which AF liquid contact angles, artificial fingerprints, and real human fingerprints (HF) were measured and correlated. It is shown that a surface with an AF liquid contact angle above 87° is fingerprint-free. This provides an objective and quantitative test method for determining the anti-fingerprint property of coated surfaces. It is also concluded that the AF property can be achieved on smooth and optically clear surfaces. Deep porous structures are more favorable than bumpy structures for oleophobic and AF properties.

  16. Comparison of Myocardial Perfusion Estimates From Dynamic Contrast-Enhanced Magnetic Resonance Imaging With Four Quantitative Analysis Methods

    PubMed Central

    Pack, Nathan A.; DiBella, Edward V. R.

    2012-01-01

    Dynamic contrast-enhanced MRI has been used to quantify myocardial perfusion in recent years. Published results have varied widely, possibly depending on the method used to analyze the dynamic perfusion data. Here, four quantitative analysis methods (two-compartment modeling, Fermi function modeling, model-independent analysis, and Patlak plot analysis) were implemented and compared for quantifying myocardial perfusion. Dynamic contrast-enhanced MRI data were acquired in 20 human subjects at rest with low-dose (0.019 ± 0.005 mmol/kg) bolus injections of gadolinium. Fourteen of these subjects were also imaged at adenosine stress (0.021 ± 0.005 mmol/kg). Aggregate rest perfusion estimates were not significantly different among the four analysis methods. At stress, perfusion estimates were not significantly different among two-compartment modeling, model-independent analysis, and Patlak plot analysis. Stress estimates from the Fermi model were significantly higher (~20%) than those from the other three methods. Myocardial perfusion reserve values were not significantly different among the four methods. Model-independent analysis resulted in the lowest model curve-fit errors. When more than just the first pass of data was analyzed, perfusion estimates from two-compartment modeling and model-independent analysis did not change significantly, unlike results from Fermi function modeling. PMID:20577976
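
    Of the four approaches, Patlak plot analysis is the most compact to illustrate: the tissue-to-plasma concentration ratio is regressed against the cumulative plasma integral normalized by the plasma concentration, and the slope estimates the uptake constant. A minimal sketch follows; the masking rule for low plasma values is an assumption, not taken from the study.

      import numpy as np

      def patlak_fit(t, c_tissue, c_plasma):
          """Patlak analysis of a tissue curve and an arterial input function.

          Regresses c_tissue/c_plasma on cumulative-integral(c_plasma)/c_plasma;
          the slope estimates the uptake constant Ki, the intercept a volume term.
          """
          t = np.asarray(t, dtype=float)
          ct = np.asarray(c_tissue, dtype=float)
          cp = np.asarray(c_plasma, dtype=float)
          areas = np.diff(t) * 0.5 * (cp[1:] + cp[:-1])      # trapezoid rule
          cum_cp = np.concatenate(([0.0], np.cumsum(areas)))
          mask = cp > 0.05 * cp.max()   # avoid dividing by near-zero plasma values
          x = cum_cp[mask] / cp[mask]
          y = ct[mask] / cp[mask]
          slope, intercept = np.polyfit(x, y, 1)
          return slope, intercept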

  17. Cerenkov radiation imaging as a method for quantitative measurements of beta particles in a microfluidic chip

    PubMed Central

    Cho, Jennifer S; Taschereau, Richard; Olma, Sebastian; Liu, Kan; Chen, Yi-Chun; Shen, Clifton K-F; van Dam, R Michael; Chatziioannou, Arion F.

    2009-01-01

    It has been observed that microfluidic chips used for synthesizing ¹⁸F-labeled compounds demonstrate visible light emission without nearby scintillators or fluorescent materials. The origin of the light was investigated and found to be consistent with the emission characteristics from Cerenkov radiation. Since ¹⁸F decays through the emission of high-energy positrons, the energy threshold for beta particles, i.e., electrons or positrons, to generate Cerenkov radiation was calculated for water and polydimethylsiloxane (PDMS), the most commonly used polymer-based material for microfluidic chips. Beta particles emitted from ¹⁸F have a continuous energy spectrum, with a maximum energy that exceeds this energy threshold for both water and PDMS. In addition, the spectral characteristics of the emitted light from ¹⁸F in distilled water were also measured, yielding a broad distribution from 300 nm to 700 nm, with higher intensity at shorter wavelengths. A photograph of the ¹⁸F solution showed a bluish-white light emitted from the solution, further suggesting Cerenkov radiation. In this study, the feasibility of using this Cerenkov light emission as a method for quantitative measurements of the radioactivity within the microfluidic chip in situ was evaluated. A detector previously developed for imaging microfluidic platforms was used. The detector consisted of a charge coupled device (CCD) optically coupled to a lens. The system spatial resolution, minimum detectable activity and dynamic range were evaluated. In addition, a calibration of Cerenkov signal versus activity concentration in the microfluidic chip was determined. This novel method of Cerenkov radiation measurements will provide researchers with a simple yet robust quantitative imaging tool for microfluidic applications utilizing beta particles. PMID:19847018
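
    The kinetic-energy threshold mentioned above follows from the requirement that the particle travel faster than the phase velocity of light in the medium (β > 1/n), giving T = mₑc²(1/√(1 − 1/n²) − 1). A short worked computation is given below; the refractive index used for PDMS is an assumed typical value. With a ¹⁸F positron endpoint energy of roughly 634 keV, a substantial fraction of the emitted positrons lies above both thresholds, consistent with the observed emission.

      import math

      M_E_C2_KEV = 511.0   # electron rest energy in keV

      def cerenkov_threshold_kev(refractive_index):
          """Kinetic-energy threshold for a beta particle to emit Cerenkov light."""
          n = refractive_index
          gamma = 1.0 / math.sqrt(1.0 - 1.0 / n**2)
          return M_E_C2_KEV * (gamma - 1.0)

      print(f"water (n = 1.33): ~{cerenkov_threshold_kev(1.33):.0f} keV")
      print(f"PDMS (n = 1.41, assumed): ~{cerenkov_threshold_kev(1.41):.0f} keV")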

  18. A quantitative method for defining high-arched palate using the Tcof1(+/-) mutant mouse as a model.

    PubMed

    Conley, Zachary R; Hague, Molly; Kurosaka, Hiroshi; Dixon, Jill; Dixon, Michael J; Trainor, Paul A

    2016-07-15

    The palate functions as the roof of the mouth in mammals, separating the oral and nasal cavities. Its complex embryonic development and assembly pose unique susceptibilities to intrinsic and extrinsic disruptions. Such disruptions may cause failure of the developing palatal shelves to fuse along the midline, resulting in a cleft. In other cases the palate may fuse at an arch, resulting in a vaulted oral cavity, termed high-arched palate. There are many models available for studying the pathogenesis of cleft palate but a relative paucity for high-arched palate. One condition exhibiting either cleft palate or high-arched palate is Treacher Collins syndrome, a congenital disorder characterized by numerous craniofacial anomalies. We quantitatively analyzed palatal perturbations in the Tcof1(+/-) mouse model of Treacher Collins syndrome, which phenocopies the condition in humans. We discovered that 46% of Tcof1(+/-) mutant embryos and newborn pups exhibit either soft clefts or full clefts. In addition, 17% of Tcof1(+/-) mutants were found to exhibit high-arched palate, defined as two sigma above the corresponding wild-type population mean for height-based and angle-based arch measurements. Furthermore, palatal shelf length and shelf width were decreased in all Tcof1(+/-) mutant embryos and pups compared with controls. Interestingly, these phenotypes were subsequently ameliorated through genetic inhibition of p53. The results of our study therefore provide a simple, reproducible, and quantitative method for investigating models of high-arched palate.
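
    The two-sigma criterion used above to define high-arched palate is straightforward to apply; a minimal sketch with hypothetical measurements is shown below.

      import numpy as np

      def is_high_arched(mutant_value, wild_type_values, n_sigma=2.0):
          """Flag a palate measurement exceeding the wild-type mean by n sigma."""
          wt = np.asarray(wild_type_values, dtype=float)
          return mutant_value > wt.mean() + n_sigma * wt.std(ddof=1)

      # Hypothetical arch-height measurements (mm)
      wild_type = [1.02, 0.98, 1.05, 1.00, 0.97, 1.03, 1.01]
      print(is_high_arched(1.22, wild_type))   # True if above the 2-sigma cutoff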

  19. A convenient method for the quantitative determination of elemental sulfur in coal by HPLC analysis of perchloroethylene extracts

    USGS Publications Warehouse

    Buchanan, D.H.; Coombs, K.J.; Murphy, P.M.; Chaven, C.

    1993-01-01

    A convenient method for the quantitative determination of elemental sulfur in coal is described. Elemental sulfur is extracted from the coal with hot perchloroethylene (PCE) (tetrachloroethene, C2Cl4) and quantitatively determined by HPLC analysis on a C18 reverse-phase column using UV detection. Calibration solutions were prepared from sublimed sulfur. Results of the quantitative HPLC analyses agreed with those of a chemical/spectroscopic analysis. The HPLC method was found to be linear over the concentration range of 6 × 10⁻⁴ to 2 × 10⁻² g/L. The lower detection limit was 4 × 10⁻⁴ g/L, which for a 20 g coal sample is equivalent to 0.0006% by weight of coal. Since elemental sulfur is known to react slowly with hydrocarbons at the temperature of boiling PCE, standard solutions of sulfur in PCE were heated with coals from the Argonne Premium Coal Sample program. Pseudo-first-order uptake of sulfur by the coals was observed over several weeks of heating. For the Illinois No. 6 premium coal, the rate constant for sulfur uptake was 9.7 × 10⁻⁷ s⁻¹, too small for retrograde reactions between solubilized sulfur and coal to cause a significant loss of the elemental sulfur isolated during the analytical extraction. No elemental sulfur was produced when the following pure compounds were heated to reflux in PCE for up to 1 week: benzyl sulfide, octyl sulfide, thiane, thiophene, benzothiophene, dibenzothiophene, sulfuric acid, or ferrous sulfate. A slurry of mineral pyrite in PCE contained elemental sulfur, which increased in concentration with heating time. © 1993 American Chemical Society.
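
    The conclusion that retrograde sulfur uptake is negligible during the extraction can be checked from the reported rate constant; the extraction time below is an assumed value used only for illustration.

      import math

      k = 9.7e-7          # s^-1, reported pseudo-first-order rate constant
      t = 4 * 3600.0      # assume a 4-hour hot-PCE extraction (illustrative)

      fraction_lost = 1.0 - math.exp(-k * t)
      print(f"fraction of elemental sulfur consumed: {100 * fraction_lost:.2f}%")
      # ~1.4% over 4 h, small relative to the reported analytical uncertainty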

  20. ADvanced IMage Algebra (ADIMA): a novel method for depicting multiple sclerosis lesion heterogeneity, as demonstrated by quantitative MRI

    PubMed Central

    Tozer, Daniel J; Schmierer, Klaus; Chard, Declan T; Anderson, Valerie M; Altmann, Daniel R; Miller, David H; Wheeler-Kingshott, Claudia AM

    2013-01-01

    Background: There are modest correlations between multiple sclerosis (MS) disability and white matter lesion (WML) volumes, as measured by T2-weighted (T2w) magnetic resonance imaging (MRI) scans (T2-WML). This may partly reflect pathological heterogeneity in WMLs, which is not apparent on T2w scans. Objective: To determine if ADvanced IMage Algebra (ADIMA), a novel MRI post-processing method, can reveal WML heterogeneity from proton-density weighted (PDw) and T2w images. Methods: We obtained conventional PDw and T2w images from 10 patients with relapsing–remitting MS (RRMS) and ADIMA images were calculated from these. We classified all WML into bright (ADIMA-b) and dark (ADIMA-d) sub-regions, which were segmented. We obtained conventional T2-WML and T1-WML volumes for comparison, as well as the following quantitative magnetic resonance parameters: magnetisation transfer ratio (MTR), T1 and T2. Also, we assessed the reproducibility of the segmentation for ADIMA-b, ADIMA-d and T2-WML. Results: Our study’s ADIMA-derived volumes correlated with conventional lesion volumes (p < 0.05). ADIMA-b exhibited higher T1 and T2, and lower MTR than the T2-WML (p < 0.001). Despite the similarity in T1 values between ADIMA-b and T1-WML, these regions were only partly overlapping with each other. ADIMA-d exhibited quantitative characteristics similar to T2-WML; however, they were only partly overlapping. Mean intra- and inter-observer coefficients of variation for ADIMA-b, ADIMA-d and T2-WML volumes were all < 6% and < 10%, respectively. Conclusion: ADIMA enabled the simple classification of WML into two groups having different quantitative magnetic resonance properties, which can be reproducibly distinguished. PMID:23037551