Science.gov

Sample records for quantitative methods results

  1. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for

  3. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  4. Quantitative results from the focusing schlieren technique

    NASA Technical Reports Server (NTRS)

    Cook, S. P.; Chokani, Ndaona

    1993-01-01

    An iterative theoretical approach to obtain quantitative density data from the focusing schlieren technique is proposed. The approach is based on an approximate model of the focusing action in a focusing schlieren system and an estimation of an appropriate focal plane thickness. The approach is incorporated in a computer program, and results obtained from a supersonic wind tunnel experiment are evaluated by comparison with CFD data. The density distributions compared favorably with CFD predictions. However, improvements to the system are required to reduce noise in the data, to better specify the depth of focus, and to refine the modeling of the focusing action.

  5. Automated Quantitative Nuclear Cardiology Methods.

    PubMed

    Motwani, Manish; Berman, Daniel S; Germano, Guido; Slomka, Piotr

    2016-02-01

    Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps, and estimate global and local measures of stress/rest perfusion, all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This article briefly reviews these techniques, highlights several challenges, and discusses the latest developments. PMID:26590779

  6. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each of them can influence the quantitative results significantly and thus should be performed with great care. In addition, the total sample amount or concentration of metabolites can differ significantly from one sample to another, so it is critical to reduce or eliminate the effect of total sample amount variation on the quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing it. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics whenever the analyzed samples have significant variations in total sample amounts. PMID:26763302
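
    As a concrete illustration of the kind of normalization this review surveys, the sketch below shows two widely used schemes (total-sum and probabilistic quotient normalization). It is a minimal sketch under assumed conventions; the function names and the toy intensity matrix are illustrative, not taken from the review.

      # Illustrative sketch: two common sample normalization schemes for a
      # metabolite intensity matrix X (rows = samples, columns = metabolites).
      import numpy as np

      def total_sum_normalize(X):
          """Scale each sample so that its intensities sum to 1."""
          X = np.asarray(X, dtype=float)
          return X / X.sum(axis=1, keepdims=True)

      def pqn_normalize(X):
          """Probabilistic quotient normalization against the median reference spectrum."""
          X = np.asarray(X, dtype=float)
          reference = np.median(X, axis=0)             # median intensity per metabolite
          dilution = np.median(X / reference, axis=1)  # one dilution factor per sample
          return X / dilution[:, None]

      X = np.array([[100.0, 50.0, 10.0],
                    [210.0, 95.0, 22.0]])              # second sample roughly 2x more concentrated
      print(pqn_normalize(X))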

  7. Quantitative Methods in Psychology: Inevitable and Useless

    PubMed Central

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question, not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic, which is based on Aristotelian thinking, and associative-quantitative, which is based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause–effect relationships between the events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199

  8. Quantitative Statistical Methods for Image Quality Assessment

    PubMed Central

    Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng

    2013-01-01

    Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148
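
    As a pointer to the kind of closed-form expressions reviewed above, a commonly cited fixed-point approximation for penalized-likelihood reconstruction (in the Fessler-Rogers style; the notation here is mine, not the paper's) expresses the local impulse response and covariance as:

      % Fixed-point approximations (assumed notation): F is the Fisher information
      % of the likelihood, R the Hessian of the roughness penalty with weight beta,
      % and e_j a unit vector at voxel j.
      \mathrm{LIR}_j \approx \left[ F + \beta R \right]^{-1} F\, e_j ,
      \qquad
      \operatorname{Cov}(\hat{x}) \approx \left[ F + \beta R \right]^{-1} F \left[ F + \beta R \right]^{-1} .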

  9. A quantitative method to evaluate neutralizer toxicity against Acanthamoeba castellanii.

    PubMed Central

    Buck, S L; Rosenthal, R A

    1996-01-01

    A standard methodology for quantitatively evaluating neutralizer toxicity against Acanthamoeba castellanii does not exist. The objective of this study was to provide a quantitative method for evaluating neutralizer toxicity against A. castellanii. Two methods were evaluated. A quantitative microtiter method for enumerating A. castellanii was evaluated by a 50% lethal dose endpoint method. The microtiter method was compared with the hemacytometer count method. A method for determining the toxicity of neutralizers for antimicrobial agents to A. castellanii was also evaluated. The toxicity to A. castellanii of Dey-Engley neutralizing broth was compared with Page's saline. The microtiter viable cell counts were lower than predicted by the hemacytometer counts. However, the microtiter method gives more reliable counts of viable cells. Dey-Engley neutralizing medium was not toxic to A. castellanii. The method presented gives consistent, reliable results and is simple compared with previous methods. PMID:8795247

  10. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  11. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  12. Four quantitative EMG methods and their individual parameter diagnostic value.

    PubMed

    Kurca, E; Drobný, M

    2000-12-01

    The use of quantitative electromyography (EMG) in daily clinical practice can remove the influence of the electromyographer's subjective judgement on needle EMG results. The aim of our study was to compare the diagnostic efficiency of those quantitative EMG methods that have found more consistent application in routine neurologic practice. We investigated 35 healthy subjects and 59 patients with two basic types of neuromuscular disorders (neuropathies and myopathies) by means of four quantitative EMG methods: 1--modified Buchthal's low threshold MUAPs (motor unit action potentials) analysis; 2--Dorfman and McGill's limited decomposition of the interference EMG pattern; 3--spectral analysis of the interference EMG pattern; 4--turns-amplitude analysis of the interference EMG pattern. In the analysis of results, 95% confidence intervals of the parameters were calculated by the method of Campbell and Gardner, and the difference among the three subject groups (controls, neuropathies, myopathies) was evaluated by multidimensional statistics (Hotelling's T2 test) using all tested parameters of the four quantitative EMG methods simultaneously. The modified Buchthal's low threshold MUAPs analysis was the most effective method in detecting neuropathy and myopathy, with area as the best discriminating parameter. The diagnostic power in neuropathies may be increased using selected quantitative EMG methods or combinations of their individual parameters. Several aspects of the applied quantitative EMG methods and of the statistical analysis of the acquired data are discussed. PMID:11155536

  13. Quantitative laser-induced breakdown spectroscopy data using peak area step-wise regression analysis: an alternative method for interpretation of Mars science laboratory results

    SciTech Connect

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Dyar, Melinda D; Schafer, Martha W; Tucker, Jonathan M

    2008-01-01

    The ChemCam instrument on the Mars Science Laboratory (MSL) will include a laser-induced breakdown spectrometer (LIBS) to quantify major and minor elemental compositions. The traditional analytical chemistry approach to calibration curves for these data regresses a single diagnostic peak area against concentration for each element. This approach contrasts with a new multivariate method in which elemental concentrations are predicted by step-wise multiple regression analysis based on areas of a specific set of diagnostic peaks for each element. The method is tested on LIBS data from igneous and metamorphosed rocks. Between 4 and 13 partial regression coefficients are needed to describe each elemental abundance accurately (i.e., with a regression line of R² > 0.9995 for the relationship between predicted and measured elemental concentration) for all major and minor elements studied. Validation plots suggest that the method is limited at present by the small data set, and will work best for prediction of concentration when a wide variety of compositions and rock types has been analyzed.
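
    To make the multivariate idea concrete, the sketch below regresses an element's concentration on several diagnostic peak areas with a forward (step-wise style) selection of predictors. The peak areas and concentrations are synthetic placeholders, not ChemCam/LIBS data, and the use of scikit-learn is my own choice for illustration.

      # Step-wise style multiple regression of concentration on peak areas (synthetic data).
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.feature_selection import SequentialFeatureSelector

      rng = np.random.default_rng(0)
      peak_areas = rng.uniform(0.0, 1.0, size=(40, 8))      # 40 spectra, 8 candidate peaks
      concentration = (5.0 * peak_areas[:, 0] + 2.0 * peak_areas[:, 3]
                       - 1.5 * peak_areas[:, 5] + rng.normal(0.0, 0.05, 40))

      selector = SequentialFeatureSelector(LinearRegression(),
                                           n_features_to_select=3, direction="forward")
      selector.fit(peak_areas, concentration)
      kept = selector.get_support()
      model = LinearRegression().fit(peak_areas[:, kept], concentration)
      print("selected peaks:", np.flatnonzero(kept))
      print("R^2:", model.score(peak_areas[:, kept], concentration))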

  14. Protein staining methods in quantitative cytochemistry.

    PubMed

    Tas, J; van der Ploeg, M; Mitchell, J P; Cohn, N S

    1980-08-01

    The chemical action and practical application of the Naphthol Yellow S, Alkaline Fast Green, Coomassie Brilliant Blue, Dinitrofluorobenzene and some lesser known protein staining methods have been surveyed with respect to their potentialities for quantitative cytochemical analyses. None of the dyes can be said to bind to any specific protein or group of proteins, but each may be used to analyse the presence of one or more particular amino acid residues. For the cytophotometric measurement of the 'total protein content' of individual cells and cell organelles the covalent binding Dinitrofluorobenzene and the electrostatic binding Naphthol Yellow S can properly be used. Fast Green FCF, applied at alkaline pH, binds electrostatically to the basic amino acid side chains of strongly basic proteins only but not in a quantitative (stoichiometrical) way. Coomassie Brilliant Blue, recently introduced to protein cytochemistry, may be useful for quantitative purposes. The combined Feulgen-Pararosaniline(SO2)/Naphthol Yellow S and Dinitrofluorobenzene/Feulgen-Pararosaniline(SO2) methods enable the simultaneous cytophotometric analysis at two different wavelengths for protein and DNA within the same microscopical preparation. PMID:6157816

  15. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    PubMed

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-01

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data using both controlled experiments with known quantitative differences for specific proteins used as standards as well as "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and are straightforward in their application. PMID:26321463

  16. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial counts, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method for the statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. PMID:26456251

  17. Method for quantitative analysis of flocculation performance.

    PubMed

    Tse, Ian C; Swetland, Karen; Weber-Shirk, Monroe L; Lion, Leonard W

    2011-05-01

    The sedimentation rate and the post-sedimentation residual turbidity of flocculated suspensions are properties central to the design and operation of unit processes following flocculation in a water treatment plant. A method for comparing flocculation performance based on these two properties is described. The flocculation residual turbidity analyzer (FReTA) records the turbidity of flocculent suspensions undergoing quiescent settling. The fixed distance across which flocs must travel to clear the measurement volume allows sedimentation velocity distributions of the flocculent suspension to be calculated from the raw turbidity data. By fitting the transformed turbidity data with a modified gamma distribution, the mean and variance of sedimentation velocity can be obtained along with the residual turbidity after a period of settling. This new analysis method can be used to quantitatively compare how differences in flocculator operating conditions affect the sedimentation velocity distribution of flocs as well as the post-sedimentation residual turbidity. PMID:21497877
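
    The sketch below shows one plausible way to fit a settling curve of this kind: residual turbidity plus the fraction of flocs whose gamma-distributed sedimentation velocity has not yet cleared an assumed measurement depth. The parameterization, the settling depth H, and the synthetic data are my assumptions, not the exact FReTA model.

      # Fit a quiescent-settling turbidity curve with a gamma velocity distribution (illustrative).
      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.stats import gamma

      H = 0.05  # assumed settling distance (m) across the measurement volume

      def turbidity_model(t, tau0, tau_res, shape, scale):
          v_crit = H / t  # flocs slower than this have not yet settled out at time t
          return tau_res + (tau0 - tau_res) * gamma.cdf(v_crit, shape, scale=scale)

      t = np.linspace(10, 1800, 200)  # seconds
      observed = turbidity_model(t, 120.0, 8.0, 2.0, 2e-4) \
                 + np.random.default_rng(1).normal(0.0, 1.0, t.size)

      (tau0, tau_res, shape, scale), _ = curve_fit(turbidity_model, t, observed,
                                                   p0=[100.0, 5.0, 1.5, 1e-4], maxfev=10000)
      print("residual turbidity:", tau_res)
      print("mean sedimentation velocity (m/s):", shape * scale)
      print("velocity variance:", shape * scale ** 2)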

  18. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  19. Research in Distance Education: Methods and Results.

    ERIC Educational Resources Information Center

    McIsaac, Marina Stock; And Others

    This paper reports research in progress. The purpose of this study was to examine the growing variety of research emerging in distance education and to suggest a method for synthesizing the results. Over 60 articles, representing both quantitative and qualitative studies published in major journals during the past two years, have been abstracted

  20. Sparse methods for Quantitative Susceptibility Mapping

    NASA Astrophysics Data System (ADS)

    Bilgic, Berkin; Chatnuntawech, Itthi; Langkammer, Christian; Setsompop, Kawin

    2015-09-01

    Quantitative Susceptibility Mapping (QSM) aims to estimate the tissue susceptibility distribution that gives rise to subtle changes in the main magnetic field, which are captured by the image phase in a gradient echo (GRE) experiment. The underlying susceptibility distribution is related to the acquired tissue phase through an ill-posed linear system. To facilitate its inversion, spatial regularization that imposes sparsity or smoothness assumptions can be employed. This paper focuses on efficient algorithms for regularized QSM reconstruction. Fast solvers that enforce sparsity under Total Variation (TV) and Total Generalized Variation (TGV) constraints are developed using Alternating Direction Method of Multipliers (ADMM). Through variable splitting that permits closed-form iterations, the computation efficiency of these solvers are dramatically improved. An alternative approach to improve the conditioning of the ill-posed inversion is to acquire multiple GRE volumes at different head orientations relative to the main magnetic field. The phase information from such multi-orientation acquisition can be combined to yield exquisite susceptibility maps and obviate the need for regularized reconstruction, albeit at the cost of increased data acquisition time.
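
    For reference, the regularized inversion discussed above is commonly posed as the following optimization problem (a standard formulation with notation of my choosing, not copied from the paper):

      % TV-regularized QSM: phi is the unwrapped, background-removed tissue phase,
      % F the 3-D Fourier transform, D the dipole kernel in k-space, W a data
      % weighting, and lambda the regularization weight. ADMM splits the problem
      % with an auxiliary variable z = grad(chi) so each sub-step has a closed form.
      \hat{\chi} = \arg\min_{\chi}\;
      \tfrac{1}{2}\,\bigl\| W \bigl( F^{H} D F \chi - \phi \bigr) \bigr\|_2^2
      + \lambda \,\mathrm{TV}(\chi)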

  1. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
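
    A minimal sketch of a hierarchical weighted average is shown below. The criteria tree, weights and scores are invented for illustration; the report defines its own criteria and weighting scheme.

      # Collapse a tree of weighted criteria into one score per design alternative.
      def weighted_score(node):
          """A node is either a leaf {"score": value} or
          {"children": {name: (weight, subnode), ...}} with weights summing to 1."""
          if "score" in node:
              return node["score"]
          return sum(weight * weighted_score(child)
                     for weight, child in node["children"].values())

      alternative_a = {"children": {
          "performance": (0.5, {"children": {
              "throughput": (0.6, {"score": 8}),
              "latency":    (0.4, {"score": 6}),
          }}),
          "cost":        (0.3, {"score": 4}),
          "reliability": (0.2, {"score": 9}),
      }}

      print(weighted_score(alternative_a))  # 0.5*(0.6*8 + 0.4*6) + 0.3*4 + 0.2*9 = 6.6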

  2. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial assaying glucose strips is presented. (MVL)

  3. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.

  4. Novel quantitative methods for the determination of biomaterial cytotoxicity.

    PubMed

    Smith, M D; Barbenel, J C; Courtney, J M; Grant, M H

    1992-03-01

    Two novel methods for the determination of biomaterial cytotoxicity using cell culture are presented. The methods combine a standardized protocol for producing extracts from medical devices with either the established MTT assay or a new fluorimetric assay. The suitability of both methods for evaluating the toxicity of candidate materials was demonstrated by resolution of the differences in the toxic effects of serial dilutions of a PVC extract on BHK21 and HT1080 cells. The tests yield highly reproducible, quantitative results and can be applied to materials in the usual physical forms applicable to artificial organs. PMID:1521905

  5. Quantitative MR imaging in fracture dating-Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

    For exact age determination of bone fractures in a forensic context (e.g. in cases of child abuse), improved knowledge of the time course of the healing process and use of non-invasive modern imaging technology is of high importance. To date, fracture dating is based on radiographic methods by determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined for this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11 ♂: 6; median age 34±15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or the collar bone. Both qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area by defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7 ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895±607ms), which decreased over time to a value of 1094±182ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115±80ms) and decreased to 73±33ms within 21 days after the fracture event. After that time point, no significant changes could be detected for T2. MTR remained constant at 35.5±8.0% over time. The study shows that the quantitative assessment of T1 and T2 behaviour over time in the fractured region enables the generation of a novel model allowing for an objective age determination of a fracture. PMID:26890805

  6. ASSESSING QUANTITATIVE RESULTS IN ACCRETION SIMULATIONS: FROM LOCAL TO GLOBAL

    SciTech Connect

    Hawley, John F.; Guan, Xiaoyue; Krolik, Julian H.

    2011-09-01

    Discretized numerical simulations are a powerful tool for the investigation of nonlinear MHD turbulence in accretion disks. However, confidence in their quantitative predictions requires a demonstration that further refinement of the spatial grid scale would not result in any significant change. This has yet to be accomplished, particularly for global disk simulations. In this paper, we combine data from previously published stratified shearing box simulations and new global disk simulations to calibrate several quantitative diagnostics by which one can estimate progress toward numerical convergence of the magnetic field. Using these diagnostics, we find that the established criterion for an adequate numerical description of linear growth of the magneto-rotational instability (the number of cells across a wavelength of the fastest-growing vertical wavenumber mode) can be extended to a criterion for the adequate description of nonlinear MHD disk turbulence, but the standard required is more stringent. We also find that azimuthal resolution, which has received little attention in previous studies, can significantly affect the evolution of the poloidal magnetic field. We further analyze the comparative resolution requirements of a small sample of initial magnetic field geometries; not surprisingly, more complicated initial field geometries require higher spatial resolution. Otherwise, they tend to evolve to qualitatively similar states if evolved for sufficient time. Applying our quantitative resolution criteria to a sample of previously published global simulations, we find that, with perhaps a single exception, they are significantly underresolved, and therefore underestimate the magnetic turbulence and resulting stress levels throughout the accretion flow.

  7. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    PubMed Central

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the image quantitative analysis of plant cortical microtubules so far. In this paper, Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the image preprocessing of the original microtubule image. And then Intrinsic Mode Function 1 (IMF1) image obtained by decomposition was selected to do the texture analysis based on Grey-Level Cooccurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was utilized to distinguish different images of Arabidopsis microtubules. The results showed that the effect of BEMD algorithm on edge preserving accompanied with noise reduction was positive, and the geometrical characteristic of the texture was obvious. Four texture parameters extracted by GLCM perfectly reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the image quantitative analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies references for other similar studies. PMID:24744684
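
    The GLCM texture step described above can be reproduced in outline with standard tools; the sketch below uses scikit-image's graycomatrix/graycoprops on a random placeholder image and omits the BEMD preprocessing, so it illustrates the texture-parameter extraction only, not the paper's full pipeline.

      # Extract GLCM texture parameters from an (illustrative) grayscale image.
      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      image = np.random.default_rng(0).integers(0, 256, size=(128, 128), dtype=np.uint8)

      glcm = graycomatrix(image, distances=[1], angles=[0, np.pi / 2],
                          levels=256, symmetric=True, normed=True)

      for prop in ("contrast", "correlation", "energy", "homogeneity"):
          print(prop, graycoprops(glcm, prop).mean())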

  8. A quantitative method for determining the robustness of complex networks

    NASA Astrophysics Data System (ADS)

    Qin, Jun; Wu, Hongrun; Tong, Xiaonian; Zheng, Bojin

    2013-06-01

    Most current studies estimate the invulnerability of complex networks using a qualitative method that analyzes the decay rate of network performance. This method results in confusion over the invulnerability of various types of complex networks. By normalizing network performance and defining a baseline, this paper defines the invulnerability index as the integral of the normalized network performance curve minus the baseline. This quantitative method seeks to measure network invulnerability under both edge and node attacks and provides a definition on the distinguishment of the robustness and fragility of networks. To demonstrate the proposed method, three small-world networks were selected as test beds. The simulation results indicate that the proposed invulnerability index can effectively and accurately quantify network resilience and can deal with both the node and edge attacks. The index can provide a valuable reference for determining network invulnerability in future research.
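
    A rough sketch of such an index is given below: a normalized performance curve (here, the relative size of the largest connected component under a highest-degree node attack) is integrated over the fraction of removed nodes and the baseline is subtracted. The performance measure, attack order and baseline are illustrative assumptions rather than the paper's exact definitions.

      # Invulnerability-index style calculation for a small-world test network.
      import numpy as np
      import networkx as nx

      def invulnerability_index(G, baseline=0.0):
          G = G.copy()
          n0 = G.number_of_nodes()
          performance = [1.0]
          while G.number_of_nodes() > 1:
              node = max(G.degree, key=lambda kv: kv[1])[0]   # highest-degree node
              G.remove_node(node)
              giant = max((len(c) for c in nx.connected_components(G)), default=0)
              performance.append(giant / n0)
          removed_fraction = np.linspace(0.0, 1.0, len(performance))
          return np.trapz(np.array(performance) - baseline, removed_fraction)

      G = nx.watts_strogatz_graph(200, 6, 0.1, seed=1)
      print(invulnerability_index(G))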

  9. A quantitative method for measuring the quality of history matches

    SciTech Connect

    Shaw, T.S.; Knapp, R.M.

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for such a purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
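
    The abstract does not spell out the three indices, so the functions below are generic stand-ins: correlation for shape conformity, mean error for bias, and root-mean-square error for magnitude of deviation. They illustrate the structure of the comparison, not the paper's exact definitions.

      # Three simple quality-of-match indices for simulated vs. historical curves.
      import numpy as np

      def shape_conformity(simulated, observed):
          return np.corrcoef(simulated, observed)[0, 1]

      def bias_error(simulated, observed):
          return np.mean(simulated - observed)

      def deviation_magnitude(simulated, observed):
          return np.sqrt(np.mean((simulated - observed) ** 2))

      observed = np.array([100.0, 95.0, 88.0, 80.0, 74.0])   # e.g. historical production rates
      simulated = np.array([98.0, 96.0, 85.0, 82.0, 71.0])
      print(shape_conformity(simulated, observed),
            bias_error(simulated, observed),
            deviation_magnitude(simulated, observed))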

  10. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    PubMed Central

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  11. Quantitative methods for ecological network analysis.

    PubMed

    Ulanowicz, Robert E

    2004-12-01

    The analysis of networks of ecological trophic transfers is a useful complement to simulation modeling in the quest for understanding whole-ecosystem dynamics. Trophic networks can be studied in quantitative and systematic fashion at several levels. Indirect relationships between any two individual taxa in an ecosystem, which often differ in either nature or magnitude from their direct influences, can be assayed using techniques from linear algebra. The same mathematics can also be employed to ascertain where along the trophic continuum any individual taxon is operating, or to map the web of connections into a virtual linear chain that summarizes trophodynamic performance by the system. Backtracking algorithms with pruning have been written which identify pathways for the recycle of materials and energy within the system. The pattern of such cycling often reveals modes of control or types of functions exhibited by various groups of taxa. The performance of the system as a whole at processing material and energy can be quantified using information theory. In particular, the complexity of process interactions can be parsed into separate terms that distinguish organized, efficient performance from the capacity for further development and recovery from disturbance. Finally, the sensitivities of the information-theoretic system indices appear to identify the dynamical bottlenecks in ecosystem functioning. PMID:15556474

  12. Machine Learning methods for Quantitative Radiomic Biomarkers

    PubMed Central

    Parmar, Chintan; Grossmann, Patrick; Bussink, Johan; Lambin, Philippe; Aerts, Hugo J. W. L.

    2015-01-01

    Radiomics extracts and mines large numbers of medical imaging features quantifying tumor phenotypic characteristics. Highly accurate and reliable machine-learning approaches can drive the success of radiomic applications in clinical care. In this radiomic study, fourteen feature selection methods and twelve classification methods were examined in terms of their performance and stability for predicting overall survival. A total of 440 radiomic features were extracted from pre-treatment computed tomography (CT) images of 464 lung cancer patients. To ensure the unbiased evaluation of different machine-learning methods, publicly available implementations along with reported parameter configurations were used. Furthermore, we used two independent radiomic cohorts for training (n = 310 patients) and validation (n = 154 patients). We identified that the Wilcoxon test based feature selection method WLCX (stability = 0.84 ± 0.05, AUC = 0.65 ± 0.02) and the random forest classification method RF (RSD = 3.52%, AUC = 0.66 ± 0.03) had the highest prognostic performance with high stability against data perturbation. Our variability analysis indicated that the choice of classification method is the most dominant source of performance variation (34.21% of total variance). Identification of optimal machine-learning methods for radiomic applications is a crucial step towards stable and clinically relevant radiomic biomarkers, providing a non-invasive way of quantifying and monitoring tumor-phenotypic characteristics in clinical practice. PMID:26278466
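
    The two best-performing choices named above can be sketched with off-the-shelf tools: a Wilcoxon rank-sum feature ranking followed by a random forest evaluated by AUC. The data below are synthetic and the implementation choices (scipy/scikit-learn, number of selected features) are mine, not the study's configuration.

      # Wilcoxon-test feature selection + random forest classification (synthetic data).
      import numpy as np
      from scipy.stats import ranksums
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 100))           # 300 patients x 100 radiomic features
      y = rng.integers(0, 2, size=300)          # synthetic survival label
      X[:, :5] += y[:, None] * 0.8              # make the first five features informative

      pvals = np.array([ranksums(X[y == 0, j], X[y == 1, j]).pvalue for j in range(X.shape[1])])
      selected = np.argsort(pvals)[:10]         # keep the ten most discriminative features

      X_tr, X_te, y_tr, y_te = train_test_split(X[:, selected], y, test_size=0.3, random_state=0)
      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
      print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))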

  13. [Addition internal standard method in chromatographic quantitative analysis].

    PubMed

    Zheng, Y J; Kang, Y; Feng, Y Z; Zhang, R; Zhang, W B

    2001-09-01

    The internal standard method is a conventional chromatographic quantitation method that requires one or several internal standards to be added. The internal standard component must not be contained in the sample and needs good separation from the sample components. In many cases selecting an internal standard is not convenient, or is even precluded by the separation of the components. In this paper, we combine the internal standard method and the addition method to form a new chromatographic quantitation method, named the addition internal standard method. The principles of the addition internal standard method are suitable not only for chromatographic quantitation but also for polarography and related techniques. The related theory and foundation of the method are defined. The operation steps and the conditions suitable for the method are discussed. The advantages and disadvantages of this method are explained in detail. PMID:12545448
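
    One plausible way to write down the combined idea (my reconstruction from the abstract, not the authors' derivation): with a linear detector response and an internal-standard level that stays fixed, the analyte-to-internal-standard area ratio measured before (R1) and after (R2) spiking a known analyte amount determines the unknown concentration.

      % Assumed formulation: k is the relative response factor, C_x the unknown
      % analyte concentration, C_is the internal-standard concentration, and
      % \Delta C the known added analyte concentration.
      R_1 = k\,\frac{C_x}{C_{is}}, \qquad
      R_2 = k\,\frac{C_x + \Delta C}{C_{is}}
      \;\;\Longrightarrow\;\;
      C_x = \frac{\Delta C}{R_2/R_1 - 1}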

  14. Quantitative Hydrocarbon Energies from the PMO Method.

    ERIC Educational Resources Information Center

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  15. Research radiometric calibration quantitative transfer methods between internal and external

    NASA Astrophysics Data System (ADS)

    Guo, Ju Guang; Ma, Yong hui; Zhang, Guang; Yang, Zhi hui

    2015-10-01

    This paper proposes a method for transferring radiometric calibration between internal and external references for an infrared radiation characteristics quantitative measuring system, and establishes a theoretical model of the corresponding radiative transfer. In engineering practice, the method allows a relatively simple and effective calibration over the partial (half) optical path to stand in for the complex and difficult radiometric calibration of the whole optical path. It thereby provides an effective basis for further quantitative measurement of target radiation characteristics with ground-based infrared quantitative measuring systems.

  16. Advanced Quantitative Methods in Counseling Psychology: Synthesis.

    ERIC Educational Resources Information Center

    Ellis, Michael V.; Chartrand, Judy M.

    1999-01-01

    The individual best qualified to design an empirical study is one who is most familiar with the nature of the phenomenon and research area; most familiar with the possible alternative methods for designing and analyzing the study; and most capable of evaluating the potential advantages and disadvantages of the alternatives. (Author)

  17. Review of Quantitative Software Reliability Methods

    SciTech Connect

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of digital systems using dynamic PRA methods. These efforts, documented in NUREG/CR-6901, NUREG/CR-6942, and NUREG/CR-6985, included a functional representation of the system's software but did not explicitly address failure modes caused by software defects or by inadequate design requirements. An important identified research need is to establish a commonly accepted basis for incorporating the behavior of software into digital I&C system reliability models for use in PRAs. To address this need, BNL is exploring the inclusion of software failures into the reliability models of digital I&C systems, such that their contribution to the risk of the associated NPP can be assessed.

  18. Chemoenzymatic method for glycomics: Isolation, identification, and quantitation.

    PubMed

    Yang, Shuang; Rubin, Abigail; Eshghi, Shadi Toghi; Zhang, Hui

    2016-01-01

    Over the past decade, considerable progress has been made with respect to the analytical methods for analysis of glycans from biological sources. Regardless of the specific methods that are used, glycan analysis includes isolation, identification, and quantitation. Derivatization is indispensable for improving their identification. Derivatization of glycans can be performed by permethylation or carbodiimide coupling/esterification. By introducing a fluorophore or chromophore at their reducing end, glycans can be separated by electrophoresis or chromatography. The fluorogenically labeled glycans can be quantitated using fluorescent detection. Recently developed solid-phase approaches, such as glycoprotein immobilization for glycan extraction and on-tissue glycan mass spectrometry imaging, demonstrate advantages over methods performed in solution. Derivatization of sialic acids is favorably implemented on the solid support using carbodiimide coupling, and the released glycans can be further modified at the reducing end or permethylated for quantitative analysis. In this review, methods for glycan isolation, identification, and quantitation are discussed. PMID:26390280

  19. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, Frank A.

    1982-01-01

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  20. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, F.A.

    1980-12-12

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  1. Fluorometric method of quantitative cell mutagenesis

    SciTech Connect

    Dolbeare, F.A.

    1982-08-17

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  2. Resolving Conflicting Results from Quantitative and Qualitative Methodologies: A Case Study of a Magnet School.

    ERIC Educational Resources Information Center

    Pink, William T.

    This study adds to the effective schools debate a criticism of existing criteria, based on standardized achievement test profiles, for targeting schools for intervention programs. Qualitative and quantitative methods of gathering data yield conflicting assessments of whether the magnet elementary school studied needs intervention.

  3. African Primary Care Research: Quantitative analysis and presentation of results

    PubMed Central

    Ogunbanjo, Gboyega A.

    2014-01-01

    Abstract This article is part of a series on Primary Care Research Methods. The article describes types of continuous and categorical data, how to capture data in a spreadsheet, how to use descriptive and inferential statistics and, finally, gives advice on how to present the results in text, figures and tables. The article intends to help Master's level students with writing the data analysis section of their research proposal and presenting their results in their final research report. PMID:26245435

  4. Quantitative methods for the analysis of zoosporic fungi.

    PubMed

    Marano, Agostina V; Gleason, Frank H; Bärlocher, Felix; Pires-Zottarelli, Carmen L A; Lilje, Osu; Schmidt, Steve K; Rasconi, Serena; Kagami, Maiko; Barrera, Marcelo D; Sime-Ngando, Télesphore; Boussiba, Sammy; de Souza, José I; Edwards, Joan E

    2012-04-01

    Quantitative estimations of zoosporic fungi in the environment have historically received little attention, primarily due to methodological challenges and their complex life cycles. Conventional methods for quantitative analysis of zoosporic fungi to date have mainly relied on direct observation and baiting techniques, with subsequent fungal identification in the laboratory using morphological characteristics. Although these methods are still fundamentally useful, there has been an increasing preference for quantitative microscopic methods based on staining with fluorescent dyes, as well as the use of hybridization probes. More recently, however, PCR-based methods for profiling and quantification (semi-quantitative and absolute) have proven to be rapid and accurate diagnostic tools for assessing zoosporic fungal assemblages in environmental samples. Further application of next generation sequencing technologies will not only advance our quantitative understanding of zoosporic fungal ecology, but also of their function, through the analysis of their genomes and gene expression as resources and databases expand in the future. Nevertheless, it is still necessary to complement these molecular-based approaches with cultivation-based methods in order to gain a fuller quantitative understanding of the ecological and physiological roles of zoosporic fungi. PMID:22360942

  5. University Students' Orientation to Qualitative and Quantitative Research Methods.

    ERIC Educational Resources Information Center

    Murtonen, Mari

    This study aimed to determine whether different orientations toward qualitative and quantitative methods can be found among students. Data were collected over three years from students in different research methodology courses. There were 195 Finnish students and 122 U.S. students who answered a questionnaire about the appreciation of research methods

  6. Applying Quantitative Genetic Methods to Primate Social Behavior

    PubMed Central

    Brent, Lauren J. N.

    2013-01-01

    Increasingly, behavioral ecologists have applied quantitative genetic methods to investigate the evolution of behaviors in wild animal populations. The promise of quantitative genetics in unmanaged populations opens the door for simultaneous analysis of inheritance, phenotypic plasticity, and patterns of selection on behavioral phenotypes all within the same study. In this article, we describe how quantitative genetic techniques provide studies of the evolution of behavior with information that is unique and valuable. We outline technical obstacles for applying quantitative genetic techniques that are of particular relevance to studies of behavior in primates, especially those living in noncaptive populations (e.g., the need for pedigree information and non-Gaussian phenotypes), and demonstrate how many of these barriers are now surmountable. We illustrate this by applying recent quantitative genetic methods to spatial proximity data, a simple and widely collected primate social behavior, from adult rhesus macaques on Cayo Santiago. Our analysis shows that proximity measures are consistent across repeated measurements on individuals (repeatable) and that kin have similar mean measurements (heritable). Quantitative genetics may hold lessons of considerable importance for studies of primate behavior, even those without a specific genetic focus. PMID:24659839

  7. [Teaching quantitative methods in public health: the EHESP experience].

    PubMed

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools necessary for the practice of future professionals. What then should be the minimum quantitative sciences training, common to all future public health professionals? By comparing the teaching models developed in Columbia University and those in the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis. PMID:25629671

  8. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software where the results were used to obtain corrected images. Corrected images based on the ML approach and the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality and assurance of performance during clinical use of endoscopic technologies.

  9. Methods to quantitate videocapsule endoscopy images in celiac disease.

    PubMed

    Ciaccio, Edward J; Tennyson, Christina A; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2014-01-01

    In this work, bioengineering methods that can be used to quantitatively analyze videocapsule endoscopy images that have been acquired from celiac patients versus controls are described. For videocapsule endoscopic analysis, each patient swallows a capsule which contains an imaging device and light source. In celiac and control patients, images are acquired and analyzed at the level of the small intestine. The data used for videocapsule analysis consisted of high resolution images of dimension 576 × 576 pixels, acquired twice per second. The goal of the quantitative analysis is to detect abnormality in celiac patient images as compared with controls. Several types of abnormality can exist at the level of the small intestine in celiac patients. In untreated patients, and often even after treatment with a gluten-free diet, there can be villous atrophy, as well as presence of fissures and a mottled appearance. To detect and discern these abnormalities, several methods of statistical and structural feature extraction and selection are described. It was found that there is a significantly greater variation in image texture and average brightness level in celiac patients as compared with controls (p < 0.05). Celiac patients have a longer dominant period as compared with controls, averaging 6.4 ± 2.6 seconds versus 4.7 ± 1.6 seconds in controls (p = 0.001). This suggests that overall motility is slower in the celiac patients. Furthermore, the mean number of villous protrusions per image was found to be 402.2 ± 15.0 in celiac patients versus 420.8 ± 24.0 in control patients (p < 0.001). The average protrusion width was 14.66 ± 1.04 pixels in celiacs versus 13.91 ± 1.47 pixels in controls (p = 0.01). The mean protrusion height was 3.10 ± 0.26 grayscale levels for celiacs versus 2.70 ± 0.43 grayscale levels for controls (p < 0.001). Thus celiac patients tended to have fewer protrusions, and these were more varied in shape, tending to be blunted, as compared with controls, which more often had fine, uniform protrusions. A variety of computerized methods are now available to quantitate videocapsule images for comparison of celiac versus control patients. Since these methods are based on computer algorithms, they can be automated and there is no variation in the results due to observer bias. These methods readily lend themselves to automation, so that it may be possible to map the entire small intestine for presence of abnormality in real-time. It is also possible to develop an automated, quantitative clinical score which can be displayed with real-time update during the procedure. This would be useful to determine progress in celiac patients on a gluten-free diet, and to better understand the properties of the healing process in these patients. PMID:25226886
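
    One of the measures mentioned above, the dominant period, can be estimated from the mean-brightness time series of the frames (acquired at 2 Hz) with a simple spectral peak search, as sketched below. The series is synthetic and the exact definition used by the authors may differ.

      # Estimate the dominant period of a (synthetic) mean-brightness time series.
      import numpy as np

      fs = 2.0                                   # videocapsule frame rate, Hz
      t = np.arange(0, 600, 1 / fs)              # ten minutes of frames
      brightness = (100 + 5 * np.sin(2 * np.pi * t / 6.4)
                    + np.random.default_rng(0).normal(0.0, 1.0, t.size))

      spectrum = np.abs(np.fft.rfft(brightness - brightness.mean())) ** 2
      freqs = np.fft.rfftfreq(brightness.size, d=1 / fs)
      dominant_freq = freqs[1:][np.argmax(spectrum[1:])]   # skip the DC bin
      print("dominant period (s):", 1 / dominant_freq)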

  10. A Quantitative Assessment Method for Ascaris Eggs on Hands

    PubMed Central

    Jeandron, Aurelie; Ensink, Jeroen H. J.; Thamsborg, Stig M.; Dalsgaard, Anders; Sengupta, Mita E.

    2014-01-01

    The importance of hands in the transmission of soil transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt] and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed that validate the hand washing method under field conditions, e.g. including people of different age, lower levels of contamination and various levels of hand cleanliness. PMID:24802859

  11. A quantitative assessment method for Ascaris eggs on hands.

    PubMed

    Jeandron, Aurelie; Ensink, Jeroen H J; Thamsborg, Stig M; Dalsgaard, Anders; Sengupta, Mita E

    2014-01-01

    The importance of hands in the transmission of soil transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt] and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed that validate the hand washing method under field conditions, e.g. including people of different age, lower levels of contamination and various levels of hand cleanliness. PMID:24802859

  12. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost-benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
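
    As a concrete illustration of the engineering-economics metrics mentioned above, the sketch below computes a net present value and a levelized cost of energy for a hypothetical project; the cash flows, discount rate and constant-output assumption are mine, not the article's.

        def npv(rate, cashflows):
            """Net present value of yearly cash flows, with year 0 first."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

        def lcoe(capital_cost, annual_om_cost, annual_energy_kwh, rate, years):
            """Levelized cost of energy: discounted costs divided by discounted energy."""
            costs = [capital_cost] + [annual_om_cost] * years
            energy = [0.0] + [annual_energy_kwh] * years
            return npv(rate, costs) / npv(rate, energy)

        # Hypothetical plant: $1.5M up front, $20k/yr O&M, 2 GWh/yr for 25 years, 5% discount rate
        print(f"LCOE = ${lcoe(1_500_000, 20_000, 2_000_000, 0.05, 25):.3f}/kWh")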

  13. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  14. Integrating quantitative and qualitative methods to study multifetal pregnancy reduction.

    PubMed

    McKinney, M; Leary, K

    1999-03-01

    This study integrates quantitative and qualitative research methods to examine the psychologic repercussions of multifetal pregnancy reduction, a recently developed reproductive technology. Two theoretical vantage points, descriptive psychiatry and psychoanalytic theory, were used to understand the emotional impact of the medical intervention, which involves aborting some but not all of the fetuses in a multifetal pregnancy. Quantitative analysis of diagnostic interviews indicated that women who underwent pregnancy reductions were at no greater risk than controls for developing depressive disorder. Although multifetal pregnancy reduction posed no apparent mental health risk, women experienced it as stressful and distressing. Women's responses were organized and understood via qualitative analyses based on six contemporary psychoanalytic perspectives: drive theory, ego psychology, object relations theory, self-psychology, interpersonal viewpoints, and developmental concepts. Some of the practical and philosophic implications of qualitative and quantitative strategies are considered. PMID:10100139

  15. Trojan Horse Method: Recent Results

    SciTech Connect

    Pizzone, R. G.; Spitaleri, C.

    2008-01-24

    Owing to the presence of the Coulomb barrier at astrophysically relevant kinetic energies, it is very difficult, or sometimes impossible, to measure astrophysical reaction rates in the laboratory. This is why different indirect techniques are being used along with direct measurements. The THM is a unique indirect technique that allows one to measure astrophysical rearrangement reactions down to astrophysically relevant energies. The basic principle and a review of the main applications of the Trojan Horse Method are presented. The applications aiming at the extraction of the bare S{sub b}(E) astrophysical factor and electron screening potentials U{sub e} for several two-body processes are discussed.

  16. Trojan Horse Method: Recent Results

    NASA Astrophysics Data System (ADS)

    Pizzone, R. G.; Spitaleri, C.

    2008-01-01

    Owing to the presence of the Coulomb barrier at astrophysically relevant kinetic energies, it is very difficult, or sometimes impossible, to measure astrophysical reaction rates in the laboratory. This is why different indirect techniques are being used along with direct measurements. The THM is a unique indirect technique that allows one to measure astrophysical rearrangement reactions down to astrophysically relevant energies. The basic principle and a review of the main applications of the Trojan Horse Method are presented. The applications aiming at the extraction of the bare Sb(E) astrophysical factor and electron screening potentials Ue for several two-body processes are discussed.

  17. A Novel Targeted Learning Method for Quantitative Trait Loci Mapping

    PubMed Central

    Wang, Hui; Zhang, Zhongyang; Rose, Sherri; van der Laan, Mark

    2014-01-01

    We present a novel semiparametric method for quantitative trait loci (QTL) mapping in experimental crosses. Conventional genetic mapping methods typically assume parametric models with Gaussian errors and obtain parameter estimates through maximum-likelihood estimation. In contrast with univariate regression and interval-mapping methods, our model requires fewer assumptions and also accommodates various machine-learning algorithms. Estimation is performed with targeted maximum-likelihood learning methods. We demonstrate our semiparametric targeted learning approach in a simulation study and a well-studied barley data set. PMID:25258376

  18. Dynamic contrast-enhanced CT in suspected lung cancer: quantitative results

    PubMed Central

    Madsen, H H; Nellemann, H M; Rasmussen, T R; Thygesen, J; Hager, H; Andersen, N T; Rasmussen, F

    2013-01-01

    Objectives: To examine whether dynamic contrast-enhanced CT (DCE-CT) could be used to characterise and safely distinguish between malignant and benign lung tumours in patients with suspected lung cancer. Methods: Using a quantitative approach to DCE-CT, two separate sets of regions of interest (ROIs) in tissues were placed in each tumour: large ROIs over the entire tumour and small ROIs over the maximally perfused parts of the tumour. Using mathematical modelling techniques and dedicated perfusion software, this yielded a plethora of results. Results: First, because of their non-normal distribution, DCE-CT measurements must be analysed using log scale data transformation. Second, there were highly significant differences between large ROI and small ROI measurements (p<0.001). Thus, the ROI method used in a given study should always be specified in advance. Third, neither quantitative parameters (blood flow and blood volume) nor semi-quantitative parameters (peak enhancement) could be used to distinguish between malignant and benign tumours, irrespective of whether large or small ROIs were used for quantification (p ≥ 0.13 for large ROIs). Conclusion: A quantitative approach to DCE-CT is not a clinically usable method for characterising lung tumours. PMID:24029629

  19. Quantitative method of measuring cancer cell urokinase and metastatic potential

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  20. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it has been widely recognized that aberrant glycosylation in a glycoprotein may be involved in the progression of a certain disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. PMID:24889823

  1. Analytical methods for quantitation of prenylated flavonoids from hops

    PubMed Central

    Nikolić, Dejan; van Breemen, Richard B.

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106

  2. Analytical methods for quantitation of prenylated flavonoids from hops.

    PubMed

    Nikolić, Dejan; van Breemen, Richard B

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106

  3. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    NASA Astrophysics Data System (ADS)

    Michalska, J.; Chmiela, B.

    2014-03-01

    The purpose of the research was to work out the qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. For qualitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Quantitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared.

  4. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis method, and the metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. PMID:24919831

  5. Gap analysis: Concepts, methods, and recent results

    USGS Publications Warehouse

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  6. Quantitative imaging of volcanic plumes - Results, needs, and future trends

    USGS Publications Warehouse

    Platt, Ulrich; Lübcke, Peter; Kuhn, Jonas; Bobrowski, Nicole; Prata, Fred; Burton, Mike; Kern, Christoph

    2015-01-01

    Recent technology allows two-dimensional imaging of trace gas distributions in plumes. In contrast to older, one-dimensional remote sensing techniques, that are only capable of measuring total column densities, the new imaging methods give insight into details of transport and mixing processes as well as chemical transformation within plumes. We give an overview of gas imaging techniques already being applied at volcanoes (SO2 cameras, imaging DOAS, FT-IR imaging), present techniques where first field experiments were conducted (LED-LIDAR, tomographic mapping), and describe some techniques where only theoretical studies with application to volcanology exist (e.g. Fabry-Pérot Imaging, Gas Correlation Spectroscopy, bi-static LIDAR). Finally, we discuss current needs and future trends in imaging technology.

  7. Quantitative imaging of volcanic plumes - Results, needs, and future trends

    NASA Astrophysics Data System (ADS)

    Platt, Ulrich; Lübcke, Peter; Kuhn, Jonas; Bobrowski, Nicole; Prata, Fred; Burton, Mike; Kern, Christoph

    2015-07-01

    Recent technology allows two-dimensional "imaging" of trace gas distributions in plumes. In contrast to older, one-dimensional remote sensing techniques, that are only capable of measuring total column densities, the new imaging methods give insight into details of transport and mixing processes as well as chemical transformation within plumes. We give an overview of gas imaging techniques already being applied at volcanoes (SO2 cameras, imaging DOAS, FT-IR imaging), present techniques where first field experiments were conducted (LED-LIDAR, tomographic mapping), and describe some techniques where only theoretical studies with application to volcanology exist (e.g. Fabry-Pérot Imaging, Gas Correlation Spectroscopy, bi-static LIDAR). Finally, we discuss current needs and future trends in imaging technology.

  8. Quantitative results of stellar evolution and pulsation theories.

    NASA Technical Reports Server (NTRS)

    Fricke, K.; Stobie, R. S.; Strittmatter, P. A.

    1971-01-01

    The discrepancy between the masses of Cepheid variables deduced from evolution theory and pulsation theory is examined. The effect of input physics on evolutionary tracks is first discussed; in particular, changes in the opacity are considered. The sensitivity of pulsation masses to opacity changes and to the ascribed values of luminosity and effective temperature are then analyzed. The Cepheid mass discrepancy is discussed in the light of the results already obtained. Other astronomical evidence, including the mass-luminosity relation for main sequence stars, the solar neutrino flux, and cluster ages are also considered in an attempt to determine the most likely source of error in the event that substantial mass loss has not occurred.

  9. Method for a quantitative investigation of the frozen flow hypothesis

    PubMed

    Schock; Spillar

    2000-09-01

    We present a technique to test the frozen flow hypothesis quantitatively, using data from wave-front sensors such as those found in adaptive optics systems. Detailed treatments of the theoretical background of the method and of the error analysis are presented. Analyzing data from the 1.5-m and 3.5-m telescopes at the Starfire Optical Range, we find that the frozen flow hypothesis is an accurate description of the temporal development of atmospheric turbulence on time scales of the order of 1-10 ms but that significant deviations from the frozen flow behavior are present for longer time scales. PMID:10975375

  10. Biological characteristics of crucian by quantitative inspection method

    NASA Astrophysics Data System (ADS)

    Chu, Mengqi

    2015-04-01

    Through a quantitative inspection method, the biological characteristics of the crucian carp were preliminarily researched. The crucian carp (Carassius auratus) belongs to the order Cypriniformes and the family Cyprinidae; it is a mainly plant-eating, omnivorous, gregarious fish with selective, ranked feeding behavior. Crucian carp are widely distributed, and perennial waters all over the country produce them. Indicators of the crucian carp were determined in the experiment to understand the growth and reproduction of crucian carp in this area. Using the measured data (such as scale length, scale size and annulus diameter) and the related growth functions, the growth of crucian carp in any one year was calculated. Egg shape, color and weight were used to determine maturity, and the mean egg diameter per 20 eggs and the number of eggs per 0.5 grams were used to calculate the relative and absolute fecundity of the fish. The measured crucian carp were females at puberty. Based on the relation between scale diameter and body length, a linear relationship was obtained: y = 1.530x + 3.0649. From the data, fecundity is closely related to age: the older the fish, the more mature the gonad development and the greater the number of eggs. In addition, absolute fecundity increases with the pituitary gland. Through quantitative inspection of the bait organisms ingested by crucian carp, the main foods, secondary foods and incidental foods were revealed, giving an understanding of the degree to which crucian carp prefer various bait organisms. Fish fecundity increases with weight gain; it is characteristic of the species and population, and is at the same time influenced by the age, body length and body weight of the individual, environmental conditions (especially nutritional conditions), breeding habits, spawning times and egg size. This series of studies of the biological characteristics of the crucian carp provides an ecological basis for specific plans for local crucian carp feeding, breeding, proliferation, fishing, resource protection and management.

  11. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
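
    The FOSM combination of input covariance and model sensitivity amounts to propagating C_y = J C_x J^T, where J holds the sensitivities. The sketch below is a minimal, hypothetical illustration of that step only; the Jacobian and covariances are made up, and this is not the authors' MODFLOW-2000 workflow.

        import numpy as np

        def fosm_output_covariance(jacobian, input_cov):
            """First-order second-moment propagation: C_y = J C_x J^T."""
            J = np.asarray(jacobian)
            Cx = np.asarray(input_cov)
            return J @ Cx @ J.T

        # Hypothetical example: 2 outputs (heads), 3 inputs (hydraulic conductivities)
        J = np.array([[0.8, 0.1, 0.05],
                      [0.2, 0.6, 0.30]])
        Cx = np.diag([0.04, 0.09, 0.01])   # input-parameter variances
        print(fosm_output_covariance(J, Cx))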

  12. A quantitative method for clustering size distributions of elements

    NASA Astrophysics Data System (ADS)

    Dillner, Ann M.; Schauer, James J.; Christensen, William F.; Cass, Glen R.

    A quantitative method was developed to group similarly shaped size distributions of particle-phase elements in order to ascertain sources of the elements. This method was developed and applied using data from two sites in Houston, TX; one site surrounded by refineries, chemical plants and vehicular and commercial shipping traffic, and the other site, 25 miles inland surrounded by residences, light industrial facilities and vehicular traffic. Twenty-four hour size-segregated (0.056 < Dp (particle diameter) < 1.8 μm) particulate matter samples were collected during five days in August 2000. ICP-MS was used to quantify 32 elements with concentrations as low as a few picograms per cubic meter. Concentrations of particulate matter mass, sulfate and organic carbon at the two sites were often not significantly different from each other and had smooth unimodal size distributions indicating the regional nature of these species. Element concentrations varied widely across events and sites and often showed sharp peaks at particle diameters between 0.1 and 0.3 μm and in the ultrafine mode (Dp < 0.1 μm), which suggested that the sources of these elements were local, high-temperature processes. Elements were quantitatively grouped together in each event using Ward's Method to cluster normalized size distributions of all elements. Cluster analysis provided groups of elements with similar size distributions that were attributed to sources such as automobile catalysts, fluid catalytic cracking unit catalysts, fuel oil burning, a coal-fired power plant, and high-temperature metal working. The clustered elements were generally attributed to different sources at the two sites during each sampling day indicating the diversity of local sources that impact heavy metals concentrations in the region.
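
    A minimal sketch of the clustering step is shown below; the element size distributions are invented and the shape-normalization choice is an assumption, but the grouping uses Ward's method as named in the abstract.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # rows = elements, columns = size bins; values are concentrations per bin (hypothetical)
        raw = np.array([[0.2, 1.5, 3.0, 0.8],    # "element A"
                        [0.3, 1.4, 2.8, 0.9],    # "element B", similar shape to A
                        [2.5, 0.6, 0.2, 0.1]])   # "element C", different shape

        # Normalize each distribution so clustering compares shape rather than magnitude
        norm = raw / raw.sum(axis=1, keepdims=True)

        Z = linkage(norm, method="ward")            # hierarchical clustering, Ward's criterion
        labels = fcluster(Z, t=2, criterion="maxclust")
        print(labels)                               # e.g. A and B grouped together, C separate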

  13. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100× magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula N = (Bt × 100) / At, where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples. PMID:24190861
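
    The percentage formula above can be evaluated directly; the sketch below simply sums hypothetical AutoCAD-measured surfaces and computes N.

        def bleeding_percentage(sample_surfaces, bleeding_surfaces):
            """N = (Bt x 100) / At, with Bt the summed bleeding surfaces and At the summed sample surfaces."""
            return sum(bleeding_surfaces) * 100.0 / sum(sample_surfaces)

        # Hypothetical areas measured with the "polylines" function, in arbitrary units
        print(bleeding_percentage(sample_surfaces=[12.4, 11.8, 13.1], bleeding_surfaces=[1.2, 0.9, 1.6]))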

  14. [Study on the multivariate quantitative analysis method for steel alloy elements using LIBS].

    PubMed

    Gu, Yan-hong; Li, Ying; Tian, Ye; Lu, Yuan

    2014-08-01

    Quantitative analysis of steel alloys was carried out using laser induced breakdown spectroscopy (LIBS), taking into account the complex matrix effects in steel alloy samples. The laser-induced plasma was generated by a Q-switched Nd:YAG laser operating at 1064 nm with a pulse width of 10 ns and a repetition frequency of 10 Hz. The LIBS signal was coupled to an echelle spectrometer and recorded by a highly sensitive ICCD detector. To obtain the best experimental conditions, some parameters, such as the detection delay, the ICCD integration gate width and the detection position relative to the sample surface, were optimized. The experimental results showed that the optimum detection delay time was 1.5 μs, the optimal integration gate width was 2 μs and the best detection position was 1.5 mm below the alloy sample's surface. The samples used in the experiments were ten standard steel alloy samples and two unknown steel alloy samples. The quantitative analysis was investigated with the optimum experimental parameters. The elements Cr and Ni in the steel alloy samples were taken as the detection targets. The analysis was carried out with methods based on conventional univariate quantitative analysis, multiple linear regression and partial least squares (PLS), respectively. It turned out that the correlation coefficients of the calibration curves are not very high in the conventional univariate calibration method, and the analysis results for the two predicted samples had unsatisfactory relative errors. Thus the conventional univariate quantitative analysis method cannot effectively serve the purpose of quantitative analysis for multi-component, complex-matrix steel alloy samples. With the multiple linear regression method, the analysis accuracy was improved effectively. The method based on partial least squares (PLS) turned out to be the best of the three quantitative analysis methods applied. Based on PLS, the correlation coefficient of the calibration curve for Cr is 0.981 and that for Ni is 0.995. The concentrations of Cr and Ni in the two target samples were determined using the PLS calibration method, and the relative errors for the two unknown steel alloy samples are lower than 6.62% and 1.49%, respectively. The obtained results showed that in the quantitative analysis of steel alloys, the matrix effect can be reduced effectively and the quantitative analysis accuracy improved by the PLS calibration method. PMID:25508749
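
    A minimal sketch of a PLS calibration of the kind described above is given below; the spectra and concentrations are synthetic and the train/test split is an assumption, so this is an illustration of the technique rather than the paper's procedure.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(10, 200))                  # stand-in for LIBS emission spectra
        y = 5.0 + 2.0 * X[:, 20] + 0.5 * X[:, 55]       # stand-in for "Cr concentration (%)"

        pls = PLSRegression(n_components=3)
        pls.fit(X[:8], y[:8])                           # calibrate on the standard samples
        pred = pls.predict(X[8:]).ravel()               # predict the two "unknown" samples
        print(pred, np.abs(pred - y[8:]) / y[8:] * 100) # predictions and relative errors (%)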

  15. [Study on the multivariate quantitative analysis method for steel alloy elements using LIBS].

    PubMed

    Gu, Yan-hong; Li, Ying; Tian, Ye; Lu, Yuan

    2014-08-01

    Quantitative analysis of steel alloys was carried out using laser induced breakdown spectroscopy (LIBS), taking into account the complex matrix effects in steel alloy samples. The laser-induced plasma was generated by a Q-switched Nd:YAG laser operating at 1064 nm with a pulse width of 10 ns and a repetition frequency of 10 Hz. The LIBS signal was coupled to an echelle spectrometer and recorded by a highly sensitive ICCD detector. To obtain the best experimental conditions, some parameters, such as the detection delay, the ICCD integration gate width and the detection position relative to the sample surface, were optimized. The experimental results showed that the optimum detection delay time was 1.5 μs, the optimal integration gate width was 2 μs and the best detection position was 1.5 mm below the alloy sample's surface. The samples used in the experiments were ten standard steel alloy samples and two unknown steel alloy samples. The quantitative analysis was investigated with the optimum experimental parameters. The elements Cr and Ni in the steel alloy samples were taken as the detection targets. The analysis was carried out with methods based on conventional univariate quantitative analysis, multiple linear regression and partial least squares (PLS), respectively. It turned out that the correlation coefficients of the calibration curves are not very high in the conventional univariate calibration method, and the analysis results for the two predicted samples had unsatisfactory relative errors. Thus the conventional univariate quantitative analysis method cannot effectively serve the purpose of quantitative analysis for multi-component, complex-matrix steel alloy samples. With the multiple linear regression method, the analysis accuracy was improved effectively. The method based on partial least squares (PLS) turned out to be the best of the three quantitative analysis methods applied. Based on PLS, the correlation coefficient of the calibration curve for Cr is 0.981 and that for Ni is 0.995. The concentrations of Cr and Ni in the two target samples were determined using the PLS calibration method, and the relative errors for the two unknown steel alloy samples are lower than 6.62% and 1.49%, respectively. The obtained results showed that in the quantitative analysis of steel alloys, the matrix effect can be reduced effectively and the quantitative analysis accuracy improved by the PLS calibration method. PMID:25474970

  16. Methods for Quantitative Interpretation of Retarding Field Analyzer Data

    SciTech Connect

    Calvey, J.R.; Crittenden, J.A.; Dugan, G.F.; Palmer, M.A.; Furman, M.; Harkay, K.

    2011-03-28

    Over the course of the CesrTA program at Cornell, over 30 Retarding Field Analyzers (RFAs) have been installed in the CESR storage ring, and a great deal of data has been taken with them. These devices measure the local electron cloud density and energy distribution, and can be used to evaluate the efficacy of different cloud mitigation techniques. Obtaining a quantitative understanding of RFA data requires use of cloud simulation programs, as well as a detailed model of the detector itself. In a drift region, the RFA can be modeled by postprocessing the output of a simulation code, and one can obtain best fit values for important simulation parameters with a chi-square minimization method.
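
    A minimal sketch of the chi-square minimization step is given below; the model, data and uncertainties are hypothetical stand-ins, not the RFA postprocessing used at CesrTA.

        import numpy as np
        from scipy.optimize import minimize

        def model(x, a, b):
            return a * np.exp(-b * x)       # stand-in for a postprocessed simulation output

        x = np.linspace(0.0, 5.0, 20)
        data = model(x, 2.0, 0.7) + np.random.default_rng(1).normal(scale=0.05, size=x.size)
        sigma = np.full_like(x, 0.05)       # measurement uncertainties

        def chi2(params):
            a, b = params
            return np.sum(((data - model(x, a, b)) / sigma) ** 2)

        fit = minimize(chi2, x0=[1.0, 1.0])
        print(fit.x)                        # best-fit values of (a, b)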

  17. Implementation of a quantitative Foucault knife-edge method by means of isophotometry

    NASA Astrophysics Data System (ADS)

    Zhevlakov, A. P.; Zatsepina, M. E.; Kirillovskii, V. K.

    2014-06-01

    A detailed description of the stages of computer processing of the shadowgrams during implementation of a modern quantitative Foucault knife-edge method is presented. The map of wave-front aberrations introduced by errors of an optical surface or a system is shown, along with the results of calculating the set of required image-quality characteristics.

  18. Complementarity as a Program Evaluation Strategy: A Focus on Qualitative and Quantitative Methods.

    ERIC Educational Resources Information Center

    Lafleur, Clay

    Use of complementarity as a deliberate and necessary program evaluation strategy is discussed. Quantitative and qualitative approaches are viewed as complementary and can be integrated into a single study. The synergy that results from using complementary methods in a single study seems to enhance understanding and interpretation. A review of the

  19. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    ERIC Educational Resources Information Center

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for

  20. Complementarity as a Program Evaluation Strategy: A Focus on Qualitative and Quantitative Methods.

    ERIC Educational Resources Information Center

    Lafleur, Clay

    Use of complementarity as a deliberate and necessary program evaluation strategy is discussed. Quantitative and qualitative approaches are viewed as complementary and can be integrated into a single study. The synergy that results from using complementary methods in a single study seems to enhance understanding and interpretation. A review of the…

  1. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  2. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    ERIC Educational Resources Information Center

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  3. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  4. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    SciTech Connect

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be for inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict both in the past, and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempt at modeling conflict as a result of system level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  5. Understanding youth: using qualitative methods to verify quantitative community indicators.

    PubMed

    Makhoul, Jihad; Nakkash, Rima

    2009-01-01

    Community- and individual-level data were collected from interviews with 1,294 boys and girls, 13 to 19 years old, in three impoverished urban communities of Beirut. Univariate analyses of variables provide quantitative indicators of adolescents' lives and communities. Researchers including the authors, interested in using these indicators to plan for community interventions with youth in the Palestinian refugee camp, discuss the pertinent results with youth from the camp in six focus groups. The authors find that many indicators misrepresent the situation of youth in the camp. For example, adolescents may have underreported cigarette and argileh (water pipe) smoking (8.3% and 22.4%, respectively) because of the lack of social desirability of these behaviors; other questions may have been misunderstood, such as perceived health and health compared to others. Also, important issues for them such as drug abuse, violence, and school problems were not asked. Implications for intervention research are discussed. PMID:17971480

  6. Acceptance sampling methods for sample results verification

    SciTech Connect

    Jesse, C.A.

    1993-06-01

    This report proposes a statistical sampling method for use during the sample results verification portion of the validation of data packages. In particular, this method was derived specifically for the validation of data packages for metals target analyte analysis performed under United States Environmental Protection Agency Contract Laboratory Program protocols, where sample results verification can be quite time consuming. The purpose of such a statistical method is to provide options in addition to the "all or nothing" options that currently exist for sample results verification. The proposed method allows the amount of data validated during the sample results verification process to be based on a balance between risks and the cost of inspection.
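
    As a generic illustration of trading inspection effort against risk, the sketch below evaluates the operating characteristic of a simple single acceptance-sampling plan; the plan parameters are hypothetical and not taken from the report.

        from math import comb

        def accept_probability(n, c, defect_rate):
            """P(accept) = P(at most c nonconforming results in a sample of n), binomial model."""
            return sum(comb(n, k) * defect_rate**k * (1 - defect_rate)**(n - k)
                       for k in range(c + 1))

        # Inspect 20 results, accept the package if at most 1 fails verification
        for p in (0.01, 0.05, 0.10, 0.20):
            print(p, round(accept_probability(n=20, c=1, defect_rate=p), 3))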

  7. A Method for Designing Instrument-Free Quantitative Immunoassays.

    PubMed

    Lathwal, Shefali; Sikes, Hadley D

    2016-03-15

    Colorimetric readouts are widely used in point-of-care diagnostic immunoassays to indicate either the presence or the absence of an analyte. For a variety of reasons, it is more difficult to quantify rather than simply detect an analyte using a colorimetric test. We report a method for designing, with minimal iteration, a quantitative immunoassay that can be interpreted objectively by a simple count of the number of spots visible to the unaided eye. We combined a method called polymerization-based amplification (PBA) with a series of microscale features containing a decreasing surface density of capture molecules, and the central focus of the study is understanding how the choice of surface densities impacts performance. Using a model pair of antibodies, we have shown that our design approach does not depend on measurement of equilibrium and kinetic binding parameters and can provide a dynamic working range of 3 orders of magnitude (70 pM to 70 nM) for visual quantification. PMID:26878154

  8. A processing method enabling the use of peak height for accurate and precise proton NMR quantitation.

    PubMed

    Hays, Patrick A; Thompson, Robert A

    2009-10-01

    In NMR, peak area quantitation is the most common method used because the area under a peak or peak group is proportional to the number of nuclei at those frequencies. Peak height quantitation has not enjoyed as much utility because of poor precision and linearity resulting from inconsistent peak shapes and peak widths (measured at half height). By using a post-acquisition processing method employing a Gaussian or line-broadening (exponential decay) apodization (i.e. weighting function) to normalize the shape and width of the internal standard (ISTD) peak, the heights in an analyte calibration spectrum can be compared to the analyte peaks in a sample spectrum, resulting in accurate and precise quantitative results. Peak height results compared favorably with 'clean' peak area results for several hundred illicit samples of methamphetamine HCl, cocaine HCl, and heroin HCl of varying composition and purity. Using peak height and peak area results together can enhance confidence in the reported purity value, a major advantage in high-throughput, automated quantitative analyses. PMID:19548253
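
    A minimal sketch of exponential line-broadening apodization on a synthetic free-induction decay is shown below; the acquisition parameters are assumptions, and the snippet only illustrates the weighting step, not the authors' full processing method.

        import numpy as np

        sw, n = 5000.0, 4096                       # spectral width (Hz) and number of points
        t = np.arange(n) / sw                      # acquisition time axis (s)
        fid = np.exp(2j * np.pi * 150.0 * t) * np.exp(-t / 0.8)   # one resonance, T2* ~ 0.8 s

        lb = 2.0                                   # line broadening (Hz)
        apodized = fid * np.exp(-np.pi * lb * t)   # exponential (Lorentzian) weighting

        spectrum = np.fft.fftshift(np.fft.fft(apodized))
        print(np.abs(spectrum).max())              # peak height after line-shape normalization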

  9. Legionella in water samples: how can you interpret the results obtained by quantitative PCR?

    PubMed

    Ditommaso, Savina; Ricciardi, Elisa; Giacomuzzi, Monica; Arauco Rivera, Susan R; Zotti, Carla M

    2015-02-01

    Evaluation of the potential risk associated with Legionella has traditionally been determined from culture-based methods. Quantitative polymerase chain reaction (qPCR) is an alternative tool that offers rapid, sensitive and specific detection of Legionella in environmental water samples. In this study we compare the results obtained by conventional qPCR (iQ-Check™ Quanti Legionella spp.; Bio-Rad) and by culture method on artificial samples prepared in Page's saline by addition of Legionella pneumophila serogroup 1 (ATCC 33152), and we analyse the selective quantification of viable Legionella cells by the qPCR-PMA method. The amount of Legionella DNA (GU) determined by qPCR was 28-fold higher than the load detected by culture (CFU). Applying the qPCR combined with PMA treatment we obtained a reduction of 98.5% of the qPCR signal from dead cells. We observed a dissimilarity in the ability of PMA to suppress the PCR signal in samples with different amounts of bacteria: the effective elimination of detection signals by PMA depended on the concentration of GU and increasing amounts of cells resulted in higher values of reduction. Using the results from this study we created an algorithm to facilitate the interpretation of viable cell level estimation with qPCR-PMA. PMID:25241149

  10. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A summary of a new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is described. Results of analysis of four extracts representing three PHE types from algae including cryptomonad and cyanophyte types are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship of concentration and fluorescence units that may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristics of the open ocean) is provided.

  11. A new method for the quantitative analysis of endodontic microleakage.

    PubMed

    Haïkel, Y; Wittenmeyer, W; Bateman, G; Bentaleb, A; Allemann, C

    1999-03-01

    The aim of this in vitro study was to evaluate the apical seal obtained with three commonly used root canal sealing cements: Sealapex, AH Plus or Topseal, and Sealite, using a new method based on the quantitative analysis of 125I-radiolabeled lysozyme penetration. One hundred thirteen teeth with straight single root canals were instrumented to master apical point #25/30. These were divided into three groups: (i) negative control (4 roots) covered with two layers of nail polish, (ii) test group (105 roots) obturated by laterally condensed gutta-percha with the three cements; and (iii) positive control (4 roots) obturated without cement. The groups were then immersed in 125I lysozyme solution for a period of 1, 7, 14, or 28 days. After removal, six sections of 0.8 mm length each were made of each root with a fine diamond wire. Each section was analyzed for activity by a gamma counter, corrected for decay, and used to quantify protein penetration. Leakage was high in the positive control and almost negligible in the negative control. AH Plus (Topseal) and Sealapex showed similar leakage behavior over time, with AH Plus (Topseal) performing better. Sealite showed acceptable leakage up until day 14, after which a large increase occurred, presumably due to three-dimensional instability. PMID:10321181

  12. Quantitative Methods for Comparing Different Polyline Stream Network Models

    SciTech Connect

    Danny L. Anderson; Daniel P. Ames; Ping Yang

    2014-04-01

    Two techniques for exploring relative horizontal accuracy of complex linear spatial features are described and sample source code (pseudo code) is presented for this purpose. The first technique, relative sinuosity, is presented as a measure of the complexity or detail of a polyline network in comparison to a reference network. We term the second technique longitudinal root mean squared error (LRMSE) and present it as a means for quantitatively assessing the horizontal variance between two polyline data sets representing digitized (reference) and derived stream and river networks. Both relative sinuosity and LRMSE are shown to be suitable measures of horizontal stream network accuracy for assessing quality and variation in linear features. Both techniques have been used in two recent investigations involving extracting of hydrographic features from LiDAR elevation data. One confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes yielded better stream network delineations, based on sinuosity and LRMSE, when using LiDAR-derived DEMs. The other demonstrated a new method of delineating stream channels directly from LiDAR point clouds, without the intermediate step of deriving a DEM, showing that the direct delineation from LiDAR point clouds yielded an excellent and much better match, as indicated by the LRMSE.
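
    The paper presents its own pseudo code; the sketch below is an independent, simplified illustration of sinuosity and a relative-sinuosity comparison for single polylines, with made-up coordinates.

        import math

        def path_length(polyline):
            return sum(math.dist(p, q) for p, q in zip(polyline, polyline[1:]))

        def sinuosity(polyline):
            """Along-path length divided by the straight-line distance between endpoints."""
            straight = math.dist(polyline[0], polyline[-1])
            return path_length(polyline) / straight if straight > 0 else float("inf")

        derived   = [(0, 0), (1, 0.2), (2, -0.1), (3, 0)]   # e.g. LiDAR-derived reach
        reference = [(0, 0), (1, 0.4), (2, -0.3), (3, 0)]   # e.g. digitized reference reach
        print(sinuosity(derived) / sinuosity(reference))    # relative sinuosity (~1 means similar detail)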

  13. [Quantitative methods of cancer risk assessment in exposure to chemicals].

    PubMed

    Szymczak, Wiesław

    2009-01-01

    This is a methodology paper - it contains a review of different quantitative risk assessment methods and their comparison. Two aspects of cancer risk modeling are discussed here: 1. When there is only one effective dose. Two models were compared in this evaluation: one proposed by the Dutch Expert Committee on Occupational Standards and the other a classical two-stage model. It was taken into account that in both models the animals were exposed for less than two years. An exposure period and a study period of animals were considered in the Dutch methodology. If we use as an exposure measure the average lifespan dose estimated with different coefficients of exposure time in an experiment, we get two different dose-response models, and each of them will create a different human risk model. There is no criterion that would let us assess which of them is better. 2. There are many models used in the BenchMark Dose (BMD) method, but there is no criterion that allows us to choose the best model objectively. In this paper a classical two-stage model and three BMD models (two-stage, Weibull and linear) were fit to particular data. Very small differences between all the models were noticed. The differences were insignificant because of uncertainties in the risk modeling. The possibility of choosing one model from a larger set of models is the greatest benefit of this comparison. If the examined chemical is a genotoxic carcinogen, nothing more is needed than to estimate the threshold value. PMID:19746890
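
    As an illustration of the classical two-stage model mentioned above, the sketch below evaluates a multistage-type dose-response function and the corresponding extra risk over background; the coefficients are hypothetical and no benchmark-dose fitting is performed.

        import math

        def two_stage_response(dose, q0, q1, q2):
            """P(d) = 1 - exp(-(q0 + q1*d + q2*d^2)), the classical two-stage form."""
            return 1.0 - math.exp(-(q0 + q1 * dose + q2 * dose ** 2))

        def extra_risk(dose, q0, q1, q2):
            p0 = two_stage_response(0.0, q0, q1, q2)
            return (two_stage_response(dose, q0, q1, q2) - p0) / (1.0 - p0)

        # A benchmark dose would be the dose giving a chosen extra risk (e.g. 10%)
        for d in (0.0, 0.5, 1.0, 2.0):
            print(d, round(extra_risk(d, q0=0.02, q1=0.15, q2=0.03), 4))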

  14. How Many Proteins are Missed in Quantitative Proteomics Based on MS/MS Sequencing Methods?

    PubMed Central

    Mulvey, Claire; Thur, Bettina; Crawford, Mark; Godovac-Zimmermann, Jasminka

    2014-01-01

    Current bottom-up quantitative proteomics methods based on MS/MS sequencing of peptides are shown to be strongly dependent on sample preparation. Using cytosolic proteins from MCF-7 breast cancer cells, it is shown that protein pre-fractionation based on pI and MW is more effective than pre-fractionation using only MW in increasing the number of observed proteins (947 vs. 704 proteins) and the number of spectral counts per protein. Combination of MS data from the different pre-fractionation methods results in further improvements (1238 proteins). We discuss that at present the main limitation on quantitation by MS/MS sequencing is not MS sensitivity and protein abundance, but rather extensive peptide overlap and limited MS/MS sequencing throughput, and that this favors internally calibrated methods such as SILAC, ICAT or ITRAQ over spectral counting methods in attempts to drastically improve proteome coverage of biological samples. PMID:25729266

  15. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective way to evaluate these systems is to compare their performance in the end task required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, this gold standard is very rarely known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of a gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  16. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder.

    PubMed

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
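    The diagnostic figures quoted above follow directly from a 2x2 confusion matrix; a minimal sketch of that arithmetic (the counts below are placeholders, not the study's raw data):

        # Sensitivity, specificity, PPV, NPV and accuracy from a 2x2 confusion matrix.
        def diagnostic_metrics(tp, fp, fn, tn):
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
                "accuracy": (tp + tn) / (tp + fp + fn + tn),
            }

        print(diagnostic_metrics(tp=120, fp=60, fn=80, tn=140))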

  17. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456

  18. Evaluation of ROI methods for quantitative FDOPA PET images

    SciTech Connect

    Yu, D.C.; Lin, K.P.; Yang, J.; Huang, S.C.

    1995-12-31

    The accuracy of region-of-interest (ROI) definition methods for FDOPA PET studies was investigated. An MRI-based ROI method and a manually defined ROI method were compared using a computer-simulated brain phantom and four real PET studies. The results indicate that the discrepancy between the MRI-based and manually defined ROIs is small (≤5%) at different head orientations and different noise levels. The VOI is not sensitive to orientation, and the mid-plane ROI is also fairly reliable.

  19. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626
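    As a hedged sketch of the figure of merit named above (the abstract does not spell out the expression): in RWT-style models, each method m is assumed to produce estimates that are linearly related to the unknown true values, and precision is summarized by the ratio of the noise standard deviation to the slope, roughly

        \hat{a}_{p,m} = b_m a_p + c_m + \epsilon_{p,m}, \qquad \epsilon_{p,m} \sim \mathcal{N}(0, \sigma_m^2), \qquad \mathrm{NSR}_m = \sigma_m / b_m,

    where a_p is the true value for patient (or VOI) p; a smaller NSR indicates a more precise method.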

  20. Quantitative biomechanical comparison of ankle fracture casting methods.

    PubMed

    Shipman, Alastair; Alsousou, Joseph; Keene, David J; Dyson, Igor N; Lamb, Sarah E; Willett, Keith M; Thompson, Mark S

    2015-06-01

    The incidence of ankle fractures is increasing rapidly due to the ageing demographic. In older patients with compromised distal circulation, conservative treatment of fractures may be indicated. High rates of malunion and complications due to skin fragility motivate the design of novel casting systems, but biomechanical stability requirements are poorly defined. This article presents the first quantitative study of ankle cast stability and hypothesises that a newly proposed close contact cast (CCC) system provides similar biomechanical stability to standard casts (SC). Two adult mannequin legs transected at the malleoli, one incorporating an inflatable model of tissue swelling, were stabilised with casts applied by an experienced surgeon. They were cyclically loaded in torsion, measuring applied rotation angle and resulting torque. CCC stiffness was equal to or greater than that of SC in two measures of ankle cast resistance to torsion. The effect of swelling reduction at the ankle site was significantly greater on CCC than on SC. The data support the hypothesis that CCC provides similar biomechanical stability to SC and therefore also the clinical use of CCC. They suggest that more frequent re-application of CCC is likely required to maintain stability following resolution of swelling at the injury site. PMID:25719278
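    A hedged sketch of the stiffness measure described (torsional resistance taken as the slope of the torque versus rotation-angle curve), assuming the cyclic loading data are available as arrays; the numbers are synthetic, not the study's measurements.

        # Torsional stiffness as the slope of a torque vs. rotation-angle curve,
        # estimated by a least-squares line fit. Data are synthetic.
        import numpy as np

        angle_deg = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])   # applied rotation
        torque_nm = np.array([0.0, 0.9, 2.1, 3.0, 4.2, 5.0])   # measured torque

        slope, intercept = np.polyfit(angle_deg, torque_nm, 1)
        print(f"stiffness = {slope:.2f} N·m per degree")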

  1. Advanced quantitative magnetic nondestructive evaluation methods - Theory and experiment

    NASA Technical Reports Server (NTRS)

    Barton, J. R.; Kusenberger, F. N.; Beissner, R. E.; Matzkanin, G. A.

    1979-01-01

    The paper reviews the scale of fatigue crack phenomena in relation to the size detection capabilities of nondestructive evaluation methods. An assessment of several features of fatigue in relation to the inspection of ball and roller bearings suggested the use of magnetic methods; magnetic domain phenomena, including the interaction of domains and inclusions and the influence of stress and magnetic field on domains, are discussed. Experimental results indicate that simplified calculations can be used to predict many features of these results, although predictions from analytic models based on finite element computer analysis do not agree with respect to certain features. Experimental data obtained on rod-type fatigue specimens, which relate magnetic measurements to crack opening displacement, crack volume and crack depth, should provide methods for improved crack characterization in relation to fracture mechanics and life prediction.

  2. Qualitative and quantitative PCR methods for detection of three lines of genetically modified potatoes.

    PubMed

    Rho, Jae Kyun; Lee, Theresa; Jung, Soon-Il; Kim, Tae-San; Park, Yong-Hwan; Kim, Young-Mi

    2004-06-01

    Qualitative and quantitative polymerase chain reaction (PCR) methods have been developed for the detection of genetically modified (GM) potatoes. The combination of specific primers for amplification of the promoter region of the Cry3A gene, the potato leafroll virus replicase gene, and the potato virus Y coat protein gene allows identification of each line of the NewLeaf, NewLeaf Y, and NewLeaf Plus GM potatoes. A multiplex PCR method was also established for the simple and rapid detection of the three lines of GM potato in a mixed sample. For further quantitative detection, a real-time PCR method was developed. This method features the use of a standard plasmid as a reference molecule. The standard plasmid contains both a specific region of the transgene Cry3A and an endogenous UDP-glucose pyrophosphorylase gene of the potato. Test samples containing 0.5, 1, 3, and 5% GM potatoes were quantified by this method. At the 3.0% level of each line of GM potato, the relative standard deviations ranged from 6.0 to 19.6%. These results show that the above PCR methods are applicable to the detection of GM potatoes quantitatively as well as qualitatively. PMID:15161181
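    A minimal sketch of the kind of calculation such a plasmid-calibrated real-time PCR assay implies (the standard-curve parameters and Ct values are hypothetical; the paper's actual calibration data are not given in the abstract):

        # Relative GM content from real-time PCR: copy numbers of the transgene
        # and of an endogenous reference gene are read off plasmid standard
        # curves and expressed as a ratio. All values are hypothetical.
        def copies_from_ct(ct, slope, intercept):
            """Standard curve: Ct = slope * log10(copies) + intercept."""
            return 10 ** ((ct - intercept) / slope)

        transgene = copies_from_ct(ct=28.1, slope=-3.32, intercept=39.0)
        endogene = copies_from_ct(ct=24.9, slope=-3.35, intercept=38.5)

        gm_percent = 100.0 * transgene / endogene   # copy-number basis
        print(f"GM content = {gm_percent:.2f} %")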

  3. A quantitative analytical method to test for salt effects on giant unilamellar vesicles.

    PubMed

    Hadorn, Maik; Boenzli, Eva; Hotz, Peter Eggenberger

    2011-01-01

    Today, free-standing membranes, i.e. liposomes and vesicles, are used in a multitude of applications, e.g. as drug delivery devices and artificial cell models. Because current laboratory techniques do not allow handling of large sample sizes, systematic and quantitative studies on the impact of different effectors, e.g. electrolytes, are limited. In this work, we evaluated the Hofmeister effects of ten alkali metal halides on giant unilamellar vesicles made of palmitoyloleoylphosphatidylcholine for a large sample size by combining the highly parallel water-in-oil emulsion transfer vesicle preparation method with automatic haemocytometry. We found that this new quantitative screening method is highly reliable and consistent with previously reported results. Thus, this method may provide a significant methodological advance in analysis of effects on free-standing model membranes. PMID:22355683

  4. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but little of this work has focused on the quantitative performance of the technique. In order to demonstrate the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total-error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to the reference method (UHPLC). PMID:24513349

  5. A gas chromatography-mass spectrometry method for the quantitation of clobenzorex.

    PubMed

    Cody, J T; Valtier, S

    1999-01-01

    Drugs metabolized to amphetamine or methamphetamine are potentially significant concerns in the interpretation of amphetamine-positive urine drug-testing results. One of these compounds, clobenzorex, is an anorectic drug that is available in many countries. Clobenzorex (2-chlorobenzylamphetamine) is metabolized to amphetamine by the body and excreted in the urine. Following administration, the parent compound was detectable for a shorter time than the metabolite amphetamine, which could be detected for days. Because of the potential complication posed to the interpretation of amphetamine-positive drug tests following administration of this drug, the viability of a current amphetamine procedure using liquid-liquid extraction and conversion to the heptafluorobutyryl derivative followed by gas chromatography-mass spectrometry (GC-MS) analysis was evaluated for identification and quantitation of clobenzorex. Qualitative identification of the drug was relatively straightforward. Quantitative analysis proved to be a far more challenging process. Several compounds were evaluated for use as the internal standard in this method, including methamphetamine-d11, fenfluramine, benzphetamine, and diphenylamine. Results using these compounds proved to be less than satisfactory because of poor reproducibility of the quantitative values. Because of its chromatographic properties, which are similar to those of the parent drug, the compound 3-chlorobenzylamphetamine (3-Cl-clobenzorex) was evaluated in this study as the internal standard for the quantitation of clobenzorex. Precision studies showed 3-Cl-clobenzorex to produce accurate and reliable quantitative results (within-run relative standard deviations [RSDs] < 6.1%, between-run RSDs < 6.0%). The limits of detection and quantitation for this assay were determined to be 1 ng/mL for clobenzorex. PMID:10595847
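    A hedged sketch of internal-standard quantitation as typically applied in GC-MS assays of this kind (the calibration values below are invented for illustration; 3-Cl-clobenzorex is the internal standard named in the abstract):

        # Internal-standard quantitation: the analyte/IS peak-area ratio of each
        # calibrator is regressed against concentration, and unknowns are read
        # back off that line. All numbers are illustrative.
        import numpy as np

        cal_conc = np.array([1, 5, 10, 50, 100])               # ng/mL clobenzorex
        cal_ratio = np.array([0.021, 0.10, 0.21, 1.02, 2.05])  # area(analyte)/area(IS)

        slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

        unknown_ratio = 0.57
        print((unknown_ratio - intercept) / slope, "ng/mL")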

  6. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of

  7. Methods and Challenges in Quantitative Imaging Biomarker Development

    PubMed Central

    Abramson, Richard G.; Burton, Kirsteen R.; Yu, John-Paul J.; Scalzetti, Ernest M.; Yankeelov, Thomas E.; Rosenkrantz, Andrew B.; Mendiratta-Lala, Mishal; Bartholmai, Brian J.; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M.

    2014-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This manuscript, drafted by the Association of University Radiologists (AUR) Radiology Research Alliance (RRA) Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field. PMID:25481515

  8. Methods and challenges in quantitative imaging biomarker development.

    PubMed

    Abramson, Richard G; Burton, Kirsteen R; Yu, John-Paul J; Scalzetti, Ernest M; Yankeelov, Thomas E; Rosenkrantz, Andrew B; Mendiratta-Lala, Mishal; Bartholmai, Brian J; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M

    2015-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This article, drafted by the Association of University Radiologists Radiology Research Alliance Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field. PMID:25481515

  9. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    NASA Astrophysics Data System (ADS)

    Hayashi, K.

    2014-12-01

    The near surface is the region of day-to-day human activity on the Earth and is exposed to natural phenomena that sometimes cause disasters. This presentation covers a broad spectrum of geotechnical and geohazard approaches to mitigating disasters and conserving the natural environment using geophysical methods, and emphasizes the contribution of geophysics to such issues. The presentation focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than on the theoretical details of a particular technique. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, prevention of floods triggered by torrential rain, environmental conservation, and studies of the effects of global warming. Among the geophysical techniques, the active and passive surface-wave, refraction and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access and databases, are also discussed. The presentation discusses the application of geophysical methods to engineering investigations from a non-uniqueness point of view and introduces the concepts of "integrated" and "quantitative". Most geophysical analyses are essentially non-unique, and it is very difficult to obtain unique and reliable engineering solutions from only one geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods, that is, an integrated approach to geophysics. The result of a geophysical method is also generally vague: "here is a high-velocity layer, it may be bedrock", "this low-resistivity section may contain clayey soils". Such vague, qualitative and subjective interpretation is of limited value in general engineering design work; engineers need more quantitative information. In order to apply geophysical methods to engineering design work, quantitative interpretation is therefore very important. The presentation introduces several case studies from different countries around the world (Fig. 2) from the integrated and quantitative points of view.

  10. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize. PMID:23470871

  11. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    SciTech Connect

    Kiefel, Denis; Stoessel, Rainer, E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability over large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro X-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization capabilities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  12. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    NASA Astrophysics Data System (ADS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-01

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability over large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro X-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization capabilities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  13. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

    When combined with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither of these two methods can provide elastography over the full skin depth in current systems. This paper presents a feasibility study of an optical coherence elastography method which combines both vibration and SAW in order to give the quantitative mechanical properties of skin tissue over the full depth range, including epidermis, dermis and subcutaneous fat. Experiments are carried out on layered tissue-mimicking phantoms and in vivo on human forearm and palm skin. A ring actuator generates vibration while a line actuator is used to excite SAWs. A PhS-OCT system is employed to provide ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that, by combining the vibration and SAW methods, the bulk mechanical properties of the full skin can be quantitatively measured and elastography can be obtained with a sensing depth from ~0 mm to ~4 mm. This method is promising for clinical application where the quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.
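    As a hedged sketch of the quantitative step implied above (the abstract does not give the formula): SAW-based elastography commonly converts the measured surface-wave phase velocity c_R into a Young's modulus via the Rayleigh-wave approximation for an isotropic, homogeneous medium,

        c_R \approx \frac{0.87 + 1.12\,\nu}{1+\nu}\sqrt{\frac{E}{2\rho(1+\nu)}}
        \quad\Longrightarrow\quad
        E \approx \frac{2\rho\,(1+\nu)^3}{(0.87+1.12\,\nu)^2}\, c_R^2,

    with ρ the tissue density and ν Poisson's ratio (close to 0.5 for nearly incompressible soft tissue); depth sectioning comes from the frequency dependence (dispersion) of c_R.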

  14. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  15. An Uneasy Alliance: Combining Qualitative and Quantitative Research Methods.

    ERIC Educational Resources Information Center

    Buchanan, David R.

    1992-01-01

    In a study of the relationship between moral reasoning and teenage drug use, problems arose in an attempt to reduce qualitative data to a quantitative format: (1) making analytic sense of singular and universal responses; (2) the mistaken logical inference that each pattern of judgment should have behavioral indicators; and (3) construction and…

  16. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems

  17. Four-Point Bending as a Method for Quantitatively Evaluating Spinal Arthrodesis in a Rat Model

    PubMed Central

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-01-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague–Dawley rat spines after single-level posterolateral fusion procedures at L4–L5. Segments were classified as ‘not fused,’ ‘restricted motion,’ or ‘fused’ by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4–L5 motion segment, and stiffness was measured as the slope of the moment–displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery. PMID:25730756
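    A hedged sketch of the stiffness calculation described (slope of the moment-displacement curve from 4-point bending); the span length and data are invented for illustration, not taken from the study.

        # Flexural stiffness from 4-point bending: in the constant-moment region
        # the applied moment is M = F * a / 2 (F = total load, a = distance from
        # a support to the nearer inner loading point); stiffness is the slope
        # of moment vs. displacement. All numbers are illustrative.
        import numpy as np

        a_mm = 6.0                                      # support-to-loading-point distance
        load_n = np.array([0.0, 0.5, 1.0, 1.5, 2.0])    # total applied load
        disp_mm = np.array([0.0, 0.04, 0.09, 0.13, 0.18])

        moment_nmm = load_n * a_mm / 2.0
        stiffness = np.polyfit(disp_mm, moment_nmm, 1)[0]
        print(f"stiffness = {stiffness:.1f} N·mm per mm")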

  18. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using the scanning electron microscope equipped with an energy dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but the discrimination power of this technique is limited due to low sensitivity to trace elements and difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and Particle Induced X-ray Emission (μ-PIXE), together with Elastic Backscattering Spectrometry (EBS), is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, the GSR particles analyzed were all of the Pb, Ba, Sb type. The precision of the method is assessed. The grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here, with a confidence >99%, irrespective of the firearm or population of particles selected. PMID:23775063

  19. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
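    A minimal sketch of the prediction sub-task described above, using k-nearest-neighbour regression on already-reduced spectral features; the data are synthetic stand-ins, not the cocaine dataset from the paper.

        # k-NN regression for estimating a component's concentration from a
        # Raman spectrum after feature selection (faked here by using three
        # channels whose intensities scale with concentration). Synthetic data.
        import numpy as np
        from sklearn.neighbors import KNeighborsRegressor

        rng = np.random.default_rng(0)
        conc = rng.uniform(0.0, 1.0, size=40)                        # known concentrations
        spectra = np.outer(conc, [1.0, 0.6, 0.3]) + rng.normal(0, 0.02, (40, 3))

        model = KNeighborsRegressor(n_neighbors=3).fit(spectra, conc)
        print(model.predict(np.array([[0.5, 0.3, 0.15]]))[0])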

  20. Quantitative assessment of gene expression network module-validation methods

    PubMed Central

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To display the framework progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets, and partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas discrepant contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The Gray area simulated study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  1. [Application of calibration curve method and partial least squares regression analysis to quantitative analysis of nephrite samples using XRF].

    PubMed

    Liu, Song; Su, Bo-min; Li, Qing-hui; Gan, Fu-xi

    2015-01-01

    The authors sought a method for quantitative analysis using pXRF without solid bulk stone/jade reference samples. Twenty-four nephrite samples were selected: 17 were calibration samples and the other 7 were test samples. All the nephrite samples were analyzed quantitatively by proton induced X-ray emission spectroscopy (PIXE). Based on the PIXE results for the calibration samples, calibration curves were created for the components/elements of interest and used to analyze the test samples quantitatively; qualitative spectra of all nephrite samples were then obtained by pXRF. Using the PIXE results and the qualitative spectra of the calibration samples, the partial least squares (PLS) method was applied for quantitative analysis of the test samples. Finally, the results for the test samples obtained by the calibration curve method, the PLS method and PIXE were compared with each other, and the accuracy of the calibration curve method and the PLS method was estimated. The results indicate that the PLS method is a viable alternative for quantitative analysis of stone/jade samples. PMID:25993858
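    A minimal sketch of the PLS step described above, with synthetic spectra standing in for the pXRF data (the number of latent components and all values are illustrative):

        # Partial least squares regression: calibrate on samples with known
        # (PIXE-derived) contents, then predict the test samples from their
        # pXRF spectra. All data are synthetic placeholders.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        X_cal = rng.normal(size=(17, 200))                   # 17 calibration spectra
        y_cal = X_cal[:, :3] @ np.array([0.5, 0.3, 0.2])     # "known" contents

        pls = PLSRegression(n_components=3).fit(X_cal, y_cal)

        X_test = rng.normal(size=(7, 200))                   # 7 test spectra
        print(pls.predict(X_test).ravel())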

  2. Quantitative Analysis and Validation of Method Using HPTLC

    NASA Astrophysics Data System (ADS)

    Dhandhukia, Pinakin C.; Thakker, Janki N.

    High-performance thin-layer chromatography is an emerging alternative analytical technique to conventional column chromatography because of its simplicity, rapidity, accuracy, robustness, and cost effectiveness. The choice from a vast array of supporting matrices and solvent systems allows separation of almost all types of analytes except volatiles. The first step of robust method development for routine quantification is to check the stability of the analyte during the various steps of chromatographic development, followed by preparation of calibration curves. Thereafter, various validation aspects of the analysis, namely peak purity, linearity and range, precision, limit of detection, limit of quantification, robustness, and accuracy, have to be measured.

  3. Performance analysis of quantitative phase retrieval method in Zernike phase contrast X-ray microscopy

    NASA Astrophysics Data System (ADS)

    Heng, Chen; Kun, Gao; Da-Jiang, Wang; Li, Song; Zhi-Li, Wang

    2016-02-01

    Since the invention of the Zernike phase contrast method in 1930, it has been widely used in optical microscopy and, more recently, in X-ray microscopy. Because the image contrast is a mixture of absorption and phase information, we recently proposed and demonstrated a method for quantitative phase retrieval in Zernike phase contrast X-ray microscopy. In this contribution, we analyze the performance of this method at different photon energies. Intensity images of PMMA samples are simulated at 2.5 keV and 6.2 keV, and phase retrieval is performed using the proposed method. The results demonstrate that the proposed phase retrieval method is applicable over a wide energy range. For weakly absorbing features, the optimal photon energy is 2.5 keV from the point of view of image contrast and accuracy of phase retrieval. On the other hand, in the case of strongly absorbing objects, a higher photon energy is preferred to reduce the error of phase retrieval. These results can be used as guidelines for performing quantitative phase retrieval in Zernike phase contrast X-ray microscopy with the proposed method. Supported by the State Key Project for Fundamental Research (2012CB825801), National Natural Science Foundation of China (11475170, 11205157 and 11179004) and Anhui Provincial Natural Science Foundation (1508085MA20).

  4. Quantitative electromechanical impedance method for nondestructive testing based on a piezoelectric bimorph cantilever

    NASA Astrophysics Data System (ADS)

    Fu, Ji; Tan, Chi; Li, Faxin

    2015-06-01

    The electromechanical impedance (EMI) method, which holds great promise in structural health monitoring (SHM), is usually treated as a qualitative method. In this work, we proposed a quantitative EMI method based on a piezoelectric bimorph cantilever using the sample's local contact stiffness (LCS) as the identification parameter for nondestructive testing (NDT). Firstly, the equivalent circuit of the contact vibration system was established and the analytical relationship between the cantilever's contact resonance frequency and the LCS was obtained. As the LCS is sensitive to typical defects such as voids and delamination, the proposed EMI method can then be used for NDT. To verify the equivalent circuit model, two piezoelectric bimorph cantilevers were fabricated and their free resonance frequencies were measured and compared with theoretical predictions. It was found that the stiff cantilever's EMI can be well predicted by the equivalent circuit model while the soft cantilever's cannot. Then, both cantilevers were assembled into a homemade NDT system using a three-axis motorized stage for LCS scanning. Testing results on a specimen with a prefabricated defect showed that the defect could be clearly reproduced in the LCS image, indicating the validity of the quantitative EMI method for NDT. It was found that the single-frequency mode of the EMI method can also be used for NDT, which is faster but not quantitative. Finally, several issues relating to the practical application of the NDT method were discussed. The proposed EMI-based NDT method offers a simple and rapid solution for damage evaluation in engineering structures and may also shed some light on EMI-based SHM.

  5. Quantitative evaluation of peptide-extraction methods by HPLC-triple-quad MS-MS.

    PubMed

    Du, Yan; Wu, Dapeng; Wu, Qian; Guan, Yafeng

    2015-02-01

    In this study, the efficiency of five peptide-extraction methods (acetonitrile (ACN) precipitation, ultrafiltration, C18 solid-phase extraction (SPE), dispersed SPE with mesoporous carbon CMK-3, and dispersed SPE with mesoporous silica MCM-41) was quantitatively investigated. With 28 tryptic peptides as target analytes, these methods were evaluated on the basis of recovery and reproducibility by using high-performance liquid chromatography-triple-quad tandem mass spectrometry in selected-reaction-monitoring mode. Because of the distinct extraction mechanisms of the methods, their preferences for extracting peptides of different properties were revealed to be quite different, usually depending on the pI values or hydrophobicity of the peptides. When the target peptides were spiked into bovine serum albumin (BSA) solution, the extraction efficiency of all the methods except ACN precipitation changed significantly. The binding of BSA to the target peptides and nonspecific adsorption on the adsorbents were believed to be the ways in which BSA affected the extraction behavior. When spiked into plasma, the performance of all five methods deteriorated substantially, with the number of peptides having recoveries exceeding 70% being 15 for ACN precipitation, and none for the other methods. Finally, the methods were evaluated in terms of the number of identified peptides for extraction of endogenous plasma peptides. Only ultrafiltration and CMK-3 dispersed SPE performed differently from the quantitative results with target peptides, and the wider distribution of the properties of endogenous peptides was believed to be the main reason. PMID:25542575

  6. Automatic segmentation method of striatum regions in quantitative susceptibility mapping images

    NASA Astrophysics Data System (ADS)

    Murakawa, Saki; Uchiyama, Yoshikazu; Hirai, Toshinori

    2015-03-01

    Abnormal accumulation of brain iron has been detected in various neurodegenerative diseases. Quantitative susceptibility mapping (QSM) is a novel contrast mechanism in magnetic resonance (MR) imaging and enables the quantitative analysis of local tissue susceptibility properties. Therefore, automatic segmentation tools for brain regions on QSM images would be helpful for radiologists' quantitative analysis in various neurodegenerative diseases. The purpose of this study was to develop an automatic segmentation and classification method for striatum regions on QSM images. Our image database consisted of 22 QSM images obtained from healthy volunteers. These images were acquired on a 3.0 T MR scanner. The voxel size was 0.9×0.9×2 mm. The matrix size of each slice image was 256×256 pixels. In our computerized method, a template matching technique was first used for the detection of a slice image containing the striatum regions. An image registration technique was subsequently employed for the classification of striatum regions in consideration of the anatomical knowledge. After the image registration, the voxels in the target image corresponding to striatum regions in the reference image were classified into three striatum regions, i.e., head of the caudate nucleus, putamen, and globus pallidus. The experimental results indicated that 100% (21/21) of the slice images containing striatum regions were detected accurately. The subjective evaluation of the classification results indicated that 20 (95.2%) of 21 showed good or adequate quality. Our computerized method would be useful for the quantitative analysis of Parkinson's disease in QSM images.
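    A minimal sketch of the slice-detection step described above (template matching by normalized cross-correlation); the arrays are random placeholders, not QSM data, and the real method also involves the registration-based classification that follows.

        # Pick the axial slice whose normalized cross-correlation with a
        # striatum-level template is highest. Placeholder arrays only.
        import numpy as np

        def ncc(a, b):
            a = (a - a.mean()) / (a.std() + 1e-9)
            b = (b - b.mean()) / (b.std() + 1e-9)
            return float((a * b).mean())

        volume = np.random.rand(60, 256, 256)     # stand-in QSM volume (slice, y, x)
        template = np.random.rand(256, 256)       # stand-in striatum-level template

        best = max(range(volume.shape[0]), key=lambda k: ncc(volume[k], template))
        print("selected slice index:", best)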

  7. Difficulties Experienced by Education and Sociology Students in Quantitative Methods Courses.

    ERIC Educational Resources Information Center

    Murtonen, Mari; Lehtinen, Erno

    2003-01-01

    Examined difficulties Finnish university students experienced in learning quantitative methods. Education and sociology students rated different topics on the basis of their difficulty. Overall, students considered statistics and quantitative methods more difficult than other domains. They tended to polarize academic subjects into "easier"

  8. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this combined method in terms of quantitative accuracy using the realistic 3D NCAT phantom and an activity distribution obtained from patient studies. We compared the accuracy of organ activity estimates in images reconstructed with and without addition of downscatter compensation from projections with and without downscatter contamination. Results: We observed that the proposed method provided substantial improvements in accuracy compared to no downscatter compensation and had accuracies comparable to reconstructions from projections without downscatter contamination. Conclusions: The results demonstrate that the proposed model-based downscatter compensation method is effective and may have a role in quantitative 131I imaging. PMID:21815394

  9. [A quantitative method and case analysis for assessing water health].

    PubMed

    Li, Yu-Feng; Liu, Hong-Yu; Hao, Jing-Feng; Zheng, Nan; Cao, Xiao

    2012-02-01

    Water is the basis of a wetland, and the degree of water health directly determines the functioning of the wetland ecosystem. A theory and method for assessing water health were established in this study. Water health includes both water status health and water process health. Water status health is reflected by hydrologic condition and water quality; water process health is calculated from water elasticity, water stability and resilience. Xixi Wetland Park was taken as the case study. The results indicated that: 1) seasonal variations in water health were apparent in Xixi Wetland; the water health index was highest (46.36), corresponding to a sub-healthy condition, in summer, and lowest (37.35), corresponding to an unhealthy condition, in winter; 2) the degree of water health differed clearly between ponds and creeks, with a water health index of 42.72 in ponds, higher than that of creeks (37.99); 3) water in Xixi Wetland Park as a whole was sub-healthy, with a water health index of 53.80. Based on these results, enhancing the water health of creeks in winter is an effective measure to improve water health in Xixi Wetland Park. PMID:22509566

  10. On the quantitative method for measurement and analysis of the fine structure of Fraunhofer line profiles

    NASA Astrophysics Data System (ADS)

    Kuli-Zade, D. M.

    The methods of measurement and analysis of the fine structure of weak and moderate Fraunhofer line profiles are considered. The digital spectral materials were obtained using rapid-scanning, high-dispersion and high-resolution double monochromators. The asymmetry-coefficient method, the bisector method and a new quantitative method proposed by the author are discussed. The new physical quantities of differential, integral, residual and relative asymmetry are introduced for the first time. These quantitative values permit investigation of the dependence of asymmetry on microscopic (atomic) and macroscopic (photospheric) quantities. It is shown that the integral profile asymmetries grow appreciably with increasing line equivalent width. The average effective depths of formation of the Fraunhofer lines used in the photosphere of the Sun are determined. It is shown that the integral and residual asymmetries of the line profiles decrease noticeably with increasing effective formation depth of the lines, in good agreement with the results on the intensity dependence of asymmetry. The above-mentioned methods are critically compared and the advantages of the author's method are shown. A computer program for calculating the line-profile asymmetry parameters has been developed.

  11. Quantitative investigation into methods for evaluating neocortical slice viability

    PubMed Central

    2013-01-01

    Background: In cortical and hippocampal brain slice experiments, the viability of processed tissue is usually judged by the amplitude of extracellularly recorded seizure-like event (SLE) activity. Surprisingly, the suitability of this approach for evaluating slice quality has not been objectively studied. Furthermore, a method for gauging the viability of quiescent tissue, in which SLE activity is intentionally suppressed, has not been documented. In this study we undertook to address both of these matters using the zero-magnesium SLE model in neocortical slices. Methods: Using zero-magnesium SLE activity as the output parameter, we investigated: 1) changes in the pattern (amplitude, frequency and length) of SLE activity as slice health either deteriorated or was compromised by altering the preparation methodology; and 2) in quiescent tissue, whether the triggering of high-frequency field activity following electrode insertion predicted subsequent development of SLE activity and hence slice viability. Results: SLE amplitude was the single most important variable correlating with slice viability, with a value less than 50 μV indicative of tissue unlikely to be able to sustain population activity for more than 30-60 minutes. In quiescent slices, an increase in high-frequency field activity immediately after electrode insertion predicted the development of SLE activity in 100% of cases. Furthermore, the magnitude of the increase in spectral power correlated with the amplitude of succeeding SLE activity (R² 40.9%, p

  12. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    SciTech Connect

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.

  13. Quantitative estimation of poikilocytosis by the coherent optical method

    NASA Astrophysics Data System (ADS)

    Safonova, Larisa P.; Samorodov, Andrey V.; Spiridonov, Igor N.

    2000-05-01

    An investigation of the necessity and the required reliability of the determination of poikilocytosis in hematology has shown that existing techniques suffer from serious shortcomings. To determine the deviation of erythrocyte shape from the normal (rounded) one in blood smears, it is expedient to use an integrative estimate. An algorithm based on the correlation between erythrocyte morphological parameters and properties of the spatial-frequency spectrum of the blood smear is suggested. In the course of analytical and experimental research, an integrative form parameter (IFP) was proposed which characterizes the increase of the relative concentration of cells with changed form above 5% and the predominating type of poikilocytes. An algorithm for statistically reliable estimation of the IFP on standard stained blood smears has been developed. To provide a quantitative characterization of the morphological features of cells, a form vector has been proposed, and its validity for differentiating poikilocytes was shown.

  14. Quantitative assessment of contact and non-contact lateral force calibration methods for atomic force microscopy.

    PubMed

    Tran Khac, Bien Cuong; Chung, Koo-Hyun

    2016-02-01

    Atomic Force Microscopy (AFM) has been widely used for measuring friction force at the nano-scale. However, one of the key challenges faced by AFM researchers is to calibrate an AFM system to interpret a lateral force signal as a quantifiable force. In this study, five rectangular cantilevers were used to quantitatively compare three different lateral force calibration methods to demonstrate the legitimacy and to establish confidence in the quantitative integrity of the proposed methods. The Flat-Wedge method is based on a variation of the lateral output on a surface with flat and changing slopes, the Multi-Load Pivot method is based on taking pivot measurements at several locations along the cantilever length, and the Lateral AFM Thermal-Sader method is based on determining the optical lever sensitivity from the thermal noise spectrum of the first torsional mode with a known torsional spring constant from the Sader method. The results of the calibration using the Flat-Wedge and Multi-Load Pivot methods were found to be consistent within experimental uncertainties, and the experimental uncertainties of the two methods were found to be less than 15%. However, the lateral force sensitivity determined by the Lateral AFM Thermal-Sader method was found to be 8-29% smaller than those obtained from the other two methods. This discrepancy decreased to 3-19% when the torsional mode correction factor for an ideal cantilever was used, which suggests that the torsional mode correction should be taken into account to establish confidence in Lateral AFM Thermal-Sader method. PMID:26624514

  15. Quantitative Thin-Layer Chromatographic Method for Determination of Amantadine Hydrochloride

    PubMed Central

    Askal, Hassan F.; Khedr, Alaa S.; Darwish, Ibrahim A.; Mahmoud, Ramadan M.

    2008-01-01

    A simple and accurate thin-layer chromatographic (TLC) method for quantitative determination of amantadine hydrochloride (AMD) was developed and validated. The method employed TLC aluminum plates pre-coated with silica gel 60F-254 as a stationary phase. The solvent system used for development consisted of n-hexane-methanol-diethylamine (80:40:5, v/v/v). The separated spots were visualized as brown spots after spraying with modified Dragendorff's reagent solution. Amantadine hydrochloride was subjected to accelerated stress conditions: boiling, acid and alkaline hydrolysis, oxidation, and irradiation with ultraviolet light. The drug was found to be stable under all the investigated stress conditions. The method was validated for linearity, limits of detection (LOD) and quantitation (LOQ), precision, robustness, selectivity and accuracy. The optical densities of the separated spots were found to be linear with the amount of AMD in the range of 5-40 µg/spot with a good correlation coefficient (r = 0.9994). The LOD and LOQ values were 0.72 and 2.38 µg/spot, respectively. Statistical analysis proved that the method is repeatable and accurate for the determination of AMD. The method, in terms of its sensitivity, accuracy, precision, and robustness, met the International Conference on Harmonization/Food and Drug Administration regulatory requirements. The proposed TLC method was successfully applied for the determination of AMD in bulk and capsules with good accuracy and precision; the label claim percentages were 99.0 ± 1.0%. The results obtained by the proposed TLC method were comparable with those obtained by the official method. The proposed method is more advantageous than previously published chromatographic methods because it involves the simplest chromatographic technique, TLC. In addition, the method relies on inexpensive equipment (a scanner and software) and no critical derivatizing reagent, thus maximizing the ability of laboratories worldwide to analyze samples of AMD. PMID:23675083

  16. Quantitative Laser Diffraction Method for the Assessment of Protein Subvisible Particles

    PubMed Central

    Totoki, Shinichiro; Yamamoto, Gaku; Tsumoto, Kouhei; Uchiyama, Susumu; Fukui, Kiichi

    2015-01-01

    Laser diffraction (LD) has been recognized as a method for estimating particle size distribution. Here, a recently developed quantitative LD (qLD) system, which is an LD method with extensive deconvolution analysis, was employed for the quantitative assessment of protein particle sizes, especially aimed at the quantification of 0.2-10 μm diameter subvisible particles (SVPs). The qLD accurately estimated concentration distributions for silica beads with diameters ranging from 0.2 to 10 μm, which have refractive indices similar to that of protein particles. The linearity of concentration for micrometer-diameter silica beads was confirmed in the presence of a fixed concentration of submicrometer-diameter beads. Similarly, submicrometer-diameter silica beads could be quantified in the presence of micrometer-diameter beads. Subsequently, stir- and heat-stressed intravenous immunoglobulins were evaluated using the qLD, in which the experimentally determined refractive index of protein particles was used in the deconvolution analysis. The results showed that the concentration distributions of protein particles in the SVP size range differ for the two stresses. The number concentration of the protein particles estimated using the qLD agreed well with that obtained using flow microscopy. This work demonstrates that qLD can be used for quantitative estimation of protein aggregates in the SVP size range. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association J Pharm Sci 104:618-626, 2015 PMID:25449441

  17. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions. PMID:19216674

  18. A quantitative differentiation method for acrylic fibers by infrared spectroscopy.

    PubMed

    Causin, Valerio; Marega, Carla; Schiavone, Sergio; Marigo, Antonio

    2005-07-16

    Absorbance peak areas of nitrile (2240 cm(-1)), carbonyl (1730 cm(-1)) and CH (1370 cm(-1)) groups were obtained for 48 colorless acrylic fibers by infrared (IR) microspectroscopy. The carbonyl signal, related to the comonomers most commonly used in acrylic fibers, was ratioed against the nitrile and CH bands, which pertain to the backbone of the polymer chains. The ratios A1730/A2240 and A1730/A1370, a relative measure of the comonomer content in the fiber, were used to differentiate the samples. A decrease in the crystallinity of the fibers was noted with increasing comonomer content. Relative standard deviations (R.S.D.) of the ratios were 1% and 3% for repeated analyses on the same location and along the length of the same single fiber, respectively. When different fibers of the same sample were examined, results were reproducible within 6%. This simple method can greatly enhance the evidential value of colorless acrylic fibers, as it is able to discriminate them and thus help the Court to better assess their significance. PMID:15939143
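    The discriminating quantities in this record are simple band-area ratios. Below is a minimal sketch (not the authors' code) of how such ratios and their relative standard deviations could be computed from a spectrum stored as wavenumber/absorbance arrays; the integration windows around 2240, 1730 and 1370 cm(-1) and the absence of baseline correction are assumptions.

```python
import numpy as np

def band_area(wavenumbers, absorbance, lo, hi):
    """Trapezoidal area of the absorbance curve over one wavenumber window
    (no baseline correction -- an assumption, not the paper's protocol)."""
    mask = (wavenumbers >= lo) & (wavenumbers <= hi)
    x, y = wavenumbers[mask], absorbance[mask]
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def comonomer_ratios(wavenumbers, absorbance):
    """Return the two discriminating ratios A1730/A2240 and A1730/A1370."""
    a_2240 = band_area(wavenumbers, absorbance, 2220, 2260)  # nitrile (backbone)
    a_1730 = band_area(wavenumbers, absorbance, 1710, 1750)  # carbonyl (comonomer)
    a_1370 = band_area(wavenumbers, absorbance, 1350, 1390)  # CH (backbone)
    return a_1730 / a_2240, a_1730 / a_1370

def relative_std_percent(values):
    """R.S.D. (%) of repeated ratio measurements on a fiber."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()
```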

  19. Link-based quantitative methods to identify differentially coexpressed genes and gene Pairs

    PubMed Central

    2011-01-01

    Background Differential coexpression analysis (DCEA) is increasingly used for investigating the global transcriptional mechanisms underlying phenotypic changes. Current DCEA methods mostly adopt a gene connectivity-based strategy to estimate differential coexpression, which is characterized by comparing the numbers of gene neighbors in different coexpression networks. Although it simplifies the calculation, this strategy mixes up the identities of a gene's different coexpression neighbors and fails to distinguish significant differential coexpression changes from trivial ones. In particular, correlation reversal is easily missed, although it probably indicates remarkable biological significance. Results We developed two link-based quantitative methods, DCp and DCe, to identify differentially coexpressed genes and gene pairs (links). By uniquely exploiting the quantitative coexpression change of each gene pair in the coexpression networks, both methods proved superior to currently popular methods in simulation studies. Re-mining of a publicly available type 2 diabetes (T2D) expression dataset from the perspective of differential coexpression analysis led to discoveries beyond those from differential expression analysis. Conclusions This work points out a critical weakness of current popular DCEA methods and proposes two link-based DCEA algorithms that will contribute to the development of DCEA and help extend it to a broader spectrum of applications. PMID:21806838

  20. Geothermal investigations in Nebraska: methods and results

    SciTech Connect

    Gosnold, W.D. Jr.; Eversoll, D.A.; Carlson, M.P.; Ruscetta, C.A.; Foley, D.

    1981-05-01

    At the inception of the geothermal resource assessment program in Nebraska there was some skepticism about the existence of any geothermal resources within the state. Now, after two years of study and collaboration with other workers in the geothermal field, it has been found that about two-thirds of the state has access to a potential low-temperature resource. The resource consists of warm water in laterally extensive aquifers which are overlain by thick (> 1 km) sections of low-thermal-conductivity sediments. For most of the resource area the high temperatures in the aquifers result from high temperature gradients in the overlying shales. However, in the north-central and far western parts of the state there is evidence for convective heat flow due to updip water flow in the aquifers. The success of the program has resulted from the synthesis of heat flow and temperature gradient measurements with stratigraphic and lithologic data. The methods used and the results obtained during the study are described.

  1. An experimental method for quantitatively evaluating the elemental processes of indoor radioactive aerosol behavior.

    PubMed

    Yamazawa, H; Yamada, S; Xu, Y; Hirao, S; Moriizumi, J

    2015-11-01

    An experimental method for quantitatively evaluating the elemental processes governing the indoor behaviour of naturally occurring radioactive aerosols was proposed. This method utilises the transient response of aerosol concentrations to an artificial change in the aerosol removal rate caused by turning an air purifier on and off. It was shown that the indoor-outdoor exchange rate and the indoor deposition rate could be estimated from continuous measurements of outdoor and indoor aerosol number concentrations combined with the method proposed in this study. Although the scatter of the estimated parameters is relatively large, both methods gave consistent results. It was also found that the size distribution of radioactive aerosol particles, and hence the activity median aerodynamic diameter, remained largely unaffected by the operation of the air purifier, implying the predominance of the exchange and deposition processes over other processes that change the size distribution, such as size growth by coagulation and the size dependence of deposition. PMID:25935006

  2. An ECL-PCR method for quantitative detection of point mutation

    NASA Astrophysics Data System (ADS)

    Zhu, Debin; Xing, Da; Shen, Xingyan; Chen, Qun; Liu, Jinfeng

    2005-04-01

    A new method for the identification of point mutations was proposed. Polymerase chain reaction (PCR) amplification of a sequence from genomic DNA was followed by digestion with a restriction enzyme that cuts only the wild-type amplicon containing its recognition site. Reaction products were detected by an electrochemiluminescence (ECL) assay after adsorption of the resulting DNA duplexes to the solid phase. One strand of the PCR products carries biotin to be bound to a streptavidin-coated microbead for sample selection. The other strand carries Ru(bpy)3(2+) (TBR), which reacts with tripropylamine (TPA) to emit light for ECL detection. The method was applied to detect a specific point mutation in the H-ras oncogene in the T24 cell line. The results show that the detection limit for the H-ras amplicon is 100 fmol and the linear range spans more than 3 orders of magnitude, thus making quantitative analysis possible. The genotype can be clearly discriminated. The results of the study suggest that ECL-PCR is a feasible quantitative method for safe, sensitive and rapid detection of point mutations in human genes.

  3. Methods for quantitatively determining fault slip using fault separation

    NASA Astrophysics Data System (ADS)

    Xu, S.-S.; Velasquillo-Martínez, L. G.; Grajales-Nishimura, J. M.; Murillo-Muñetón, G.; Nieto-Samaniego, A. F.

    2007-10-01

    Fault slip and fault separation are generally not equal to each other; however, they are geometrically related. The fault slip (S) is a vector with a magnitude, a direction, and a sense of movement. In this paper, a series of approaches is introduced to estimate quantitatively the magnitude and direction of fault slip using fault separations. For the calculation, the known factors are the pitch of the slip lineations, the pitch of a cutoff, and the dip separation (Smd) or the strike separation (Smh) for one marker. The two main purposes of this work are: (1) to analyze the relationship between fault slip and fault separation when the slickenside lineations of a fault are known; and (2) to estimate the slip direction when the parameters Smd or Smh and the cutoff pitch for two non-parallel markers at a place (e.g., a point) are known. We tested the approaches using an example from a mainly strike-slip fault at East Quantoxhead, United Kingdom, and another example from the Jordan Field, Ector County, Texas. We also estimated the relative errors of the apparent heave of the normal faults from the Sierra de San Miguelito, central Mexico.

  4. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test.

    PubMed

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as "gold standard" for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  5. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  6. Quantitative Analysis of Single Particle Trajectories: Mean Maximal Excursion Method

    PubMed Central

    Tejedor, Vincent; Bénichou, Olivier; Voituriez, Raphael; Jungmann, Ralf; Simmel, Friedrich; Selhuber-Unkel, Christine; Oddershede, Lene B.; Metzler, Ralf

    2010-01-01

    An increasing number of experimental studies employ single particle tracking to probe the physical environment in complex systems. We here propose and discuss what we believe are new methods to analyze the time series of the particle traces, in particular, for subdiffusion phenomena. We discuss the statistical properties of mean maximal excursions (MMEs), i.e., the maximal distance covered by a test particle up to time t. Compared to traditional methods focusing on the mean-squared displacement we show that the MME analysis performs better in the determination of the anomalous diffusion exponent. We also demonstrate that combination of regular moments with moments of the MME method provides additional criteria to determine the exact physical nature of the underlying stochastic subdiffusion processes. We put the methods to test using experimental data as well as simulated time series from different models for normal and anomalous dynamics such as diffusion on fractals, continuous time random walks, and fractional Brownian motion. PMID:20371337
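    As a rough illustration of the MME idea, the sketch below computes the mean maximal excursion and the time-averaged MSD for a single simulated 2D trajectory and fits power-law exponents on a log-log scale. The function names and the crude polyfit estimator are illustrative assumptions, not the authors' estimator.

```python
import numpy as np

def mme(traj):
    """Maximal excursion curve: the largest distance from the starting point
    reached up to each time step, for a trajectory of shape (T, d)."""
    disp = np.linalg.norm(traj[1:] - traj[0], axis=1)
    return np.maximum.accumulate(disp)

def msd(traj):
    """Time-averaged mean-squared displacement for lag times 1..T-1."""
    T = len(traj)
    return np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1))
                     for lag in range(1, T)])

def power_law_exponent(curve, dt=1.0):
    """Crude log-log fit y ~ t**alpha (illustrative only)."""
    t = dt * np.arange(1, len(curve) + 1)
    slope, _ = np.polyfit(np.log(t), np.log(curve), 1)
    return slope

# Ordinary Brownian motion as a sanity check: MSD ~ t, MME ~ t**0.5
rng = np.random.default_rng(0)
traj = np.cumsum(rng.normal(size=(5000, 2)), axis=0)
print(power_law_exponent(msd(traj)))       # approx. 1 for normal diffusion
print(2 * power_law_exponent(mme(traj)))   # MME grows ~ t**(alpha/2), so double the slope
```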

  7. Semi-quantitative method to estimate levels of Campylobacter

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Introduction: Research projects utilizing live animals and/or systems often require reliable, accurate quantification of Campylobacter following treatments. Even with marker strains, conventional methods designed to quantify are labor and material intensive requiring either serial dilutions or MPN ...

  8. Reconstruction-classification method for quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Malone, Emma; Powell, Samuel; Cox, Ben T.; Arridge, Simon

    2015-12-01

    We propose a combined reconstruction-classification method for simultaneously recovering absorption and scattering in turbid media from images of absorbed optical energy. This method exploits knowledge that optical parameters are determined by a limited number of classes to iteratively improve their estimate. Numerical experiments show that the proposed approach allows for accurate recovery of absorption and scattering in two and three dimensions, and delivers superior image quality with respect to traditional reconstruction-only approaches.

  9. Reconstruction-classification method for quantitative photoacoustic tomography.

    PubMed

    Malone, Emma; Powell, Samuel; Cox, Ben T; Arridge, Simon

    2015-12-01

    We propose a combined reconstruction-classification method for simultaneously recovering absorption and scattering in turbid media from images of absorbed optical energy. This method exploits knowledge that optical parameters are determined by a limited number of classes to iteratively improve their estimate. Numerical experiments show that the proposed approach allows for accurate recovery of absorption and scattering in two and three dimensions, and delivers superior image quality with respect to traditional reconstruction-only approaches. PMID:26662815

  10. A method and fortran program for quantitative sampling in paleontology

    USGS Publications Warehouse

    Tipper, J.C.

    1976-01-01

    The Unit Sampling Method is a binomial sampling method applicable to the study of fauna preserved in rocks too well cemented to be disaggregated. Preliminary estimates of the probability of detecting each group in a single sampling unit can be converted to estimates of the group's volumetric abundance by means of correction curves obtained by a computer simulation technique. This paper describes the technique and gives the FORTRAN program. © 1976.
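    A minimal sketch of the bookkeeping implied by the abstract (the original FORTRAN program is not reproduced here): a per-unit detection probability is estimated for a faunal group and converted to volumetric abundance by interpolating a simulated correction curve. The curve values below are placeholders, not results from the paper.

```python
import numpy as np

def detection_probability(detections, n_units):
    """Binomial point estimate of the per-unit detection probability of one group."""
    return detections / n_units

def abundance_from_probability(p_detect, curve_p, curve_abundance):
    """Invert a monotone correction curve (detection probability -> volumetric
    abundance) by linear interpolation between simulated points."""
    return float(np.interp(p_detect, curve_p, curve_abundance))

# Placeholder correction curve; in practice it comes from the simulation technique
curve_p = np.array([0.0, 0.2, 0.5, 0.8, 0.95, 1.0])
curve_abundance = np.array([0.0, 0.01, 0.05, 0.15, 0.30, 0.60])

p = detection_probability(detections=12, n_units=40)
print(abundance_from_probability(p, curve_p, curve_abundance))
```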

  11. University Students' Research Orientations: Do Negative Attitudes Exist toward Quantitative Methods?

    ERIC Educational Resources Information Center

    Murtonen, Mari

    2005-01-01

    This paper examines university social science and education students' views of research methodology, especially asking whether a negative research orientation towards quantitative methods exists. Finnish (n = 196) and US (n = 122) students answered a questionnaire concerning their views on quantitative, qualitative, empirical, and theoretical

  12. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectra of 39 tooth samples, classified by International Caries Detection and Assessment System levels, were investigated at 405 nm excitation. The major differences among the different caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. A spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of the color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) of our fluorescence color imaging system. Results showed that the spectral parameter and the image component ratio present a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, fluorescence images of caries were obtained experimentally, and the corresponding image component ratio distribution was compared with the classification result. A method to determine numerical grades of caries using a fluorescence imaging system was thus proposed. This method can be applied to similar imaging systems.
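    The grading rule reported above maps an image component ratio onto four caries classes. A minimal sketch follows, assuming the ratio is averaged over a region of interest of an RGB fluorescence image; the averaging choice is an assumption, while the thresholds are those reported in the abstract.

```python
import numpy as np

def image_component_ratio(rgb):
    """Mean R/(G+B) over an RGB float image of shape (H, W, 3); averaging over the
    whole field is an assumption -- a region of interest could be used instead."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return float(np.mean(r / (g + b + 1e-9)))  # small epsilon avoids division by zero

def grade_caries(ratio):
    """Map the ratio onto the four grades reported in the abstract."""
    if ratio < 0.66:
        return "sound"
    if ratio < 1.06:
        return "early decay"
    if ratio < 1.62:
        return "established decay"
    return "severe decay"
```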

  13. A general method for the quantitative assessment of mineral pigments.

    PubMed

    Zurita Ares, M C; Fernández, J M

    2016-01-01

    A general method for the estimation of mineral pigment contents in different bases has been proposed that uses a single set of calibration curves (one for each pigment) calculated for a white standard base, so that elaborating calibration patterns for each base used is not necessary. The method can be used in different bases, and its validity has even been proved in strongly tinted bases. The method consists of a novel procedure that combines diffuse reflectance spectroscopy, second derivatives and the Kubelka-Munk function. This technique has proved to be at least one order of magnitude more sensitive than X-ray diffraction for colored compounds, since it allowed the determination of the pigment amount in colored samples containing 0.5 wt% of pigment that was not detected by X-ray diffraction. The method can be used to estimate the concentration of mineral pigments in a wide variety of either natural or artificial materials, since it does not require the calculation of a pattern for each pigment in every base. This fact could have important industrial consequences, as the proposed method would be more convenient, faster and cheaper. PMID:26695268
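    To make the combination of diffuse reflectance, the Kubelka-Munk function and second derivatives concrete, here is a hedged sketch: reflectance is transformed with F(R) = (1 - R)^2 / (2R), a numerical second derivative is taken, and the amplitude at a pigment-specific band is read against a calibration line built on the white standard base. The band position and the linear calibration are assumptions, not values from the paper.

```python
import numpy as np

def kubelka_munk(reflectance):
    """Kubelka-Munk remission function F(R) = (1 - R)^2 / (2R) for R in (0, 1]."""
    R = np.asarray(reflectance, dtype=float)
    return (1.0 - R) ** 2 / (2.0 * R)

def second_derivative(y, wavelengths):
    """Numerical second derivative of a spectrum with respect to wavelength."""
    return np.gradient(np.gradient(y, wavelengths), wavelengths)

def pigment_content(reflectance, wavelengths, band_nm, calibration):
    """Read the second-derivative amplitude of F(R) at a characteristic band and
    convert it to a pigment content with a (slope, intercept) calibration line
    built on the white standard base (hypothetical parameters)."""
    d2 = second_derivative(kubelka_munk(reflectance), wavelengths)
    idx = int(np.argmin(np.abs(wavelengths - band_nm)))
    slope, intercept = calibration
    return slope * d2[idx] + intercept
```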

  14. Quantitative interpretation of mineral hyperspectral images based on principal component analysis and independent component analysis methods.

    PubMed

    Jiang, Xiping; Jiang, Yu; Wu, Fang; Wu, Fenghuang

    2014-01-01

    Interpretation of mineral hyperspectral images provides large amounts of high-dimensional data, which is often complicated by mixed pixels. The quantitative interpretation of hyperspectral images is known to be extremely difficult when three types of information are unknown, namely, the number of pure pixels, the spectrum of pure pixels, and the mixing matrix. The problem is made even more complex by the disturbance of noise. The key to interpreting abstract mineral component information, i.e., pixel unmixing and abundance inversion, is how to effectively reduce noise, dimension, and redundancy. A three-step procedure is developed in this study for quantitative interpretation of hyperspectral images. First, the principal component analysis (PCA) method can be used to process the pixel spectrum matrix and keep characteristic vectors with larger eigenvalues. This can effectively reduce the noise and redundancy, which facilitates the abstraction of major component information. Second, the independent component analysis (ICA) method can be used to identify and unmix the pixels based on the linear mixed model. Third, the pure-pixel spectrums can be normalized for abundance inversion, which gives the abundance of each pure pixel. In numerical experiments, both simulation data and actual data were used to demonstrate the performance of our three-step procedure. Under simulation data, the results of our procedure were compared with theoretical values. Under the actual data measured from core hyperspectral images, the results obtained through our algorithm are compared with those of similar software (Mineral Spectral Analysis 1.0, Nanjing Institute of Geology and Mineral Resources). The comparisons show that our method is effective and can provide reference for quantitative interpretation of hyperspectral images. PMID:24694708
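    A compact sketch of the three-step PCA/ICA procedure described above, using scikit-learn; the component counts, the back-projection of ICA sources to band space, and the simple clipped least-squares abundance step are simplifications and assumptions, not the authors' exact algorithm.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

def unmix_hyperspectral(pixels, n_endmembers):
    """pixels: (n_pixels, n_bands) spectra.  Step 1: PCA for noise/dimension
    reduction; step 2: ICA for unmixing; step 3: abundance normalization."""
    # Step 1: keep the components with the largest eigenvalues
    pca = PCA(n_components=n_endmembers)
    reduced = pca.fit_transform(pixels)
    # Step 2: ICA on the reduced data; mixing_ maps sources back to PCA space
    ica = FastICA(n_components=n_endmembers, random_state=0, max_iter=1000)
    ica.fit(reduced)
    # Candidate endmember spectra back in band space (an approximate reconstruction)
    endmembers = ica.mixing_.T @ pca.components_ + pca.mean_
    # Step 3: least-squares abundances per pixel, clipped and normalized to sum to one
    coeffs, *_ = np.linalg.lstsq(endmembers.T, pixels.T, rcond=None)
    abundances = np.clip(coeffs.T, 0.0, None)
    abundances /= abundances.sum(axis=1, keepdims=True) + 1e-12
    return endmembers, abundances
```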

  15. The fundamentals and applications of phase field method in quantitative microstructural modeling

    NASA Astrophysics Data System (ADS)

    Shen, Chen

    The key to predicting and therefore controlling properties of materials is the knowledge of microstructure. As computer modeling and simulation become an important part of materials science and engineering, there is an ever-increasing demand for quantitative models that are able to handle microstructures of realistic complexity at length and time scales of practical interest. The phase field approach has become the method of choice for modeling complicated microstructural evolutions during various phase transformations, grain growth and plastic deformation. Using gradient thermodynamics of non-uniform systems and Langevin dynamics, the method characterizes arbitrary microstructures and their spatial-temporal evolution with field variables, and is capable of simulating microstructures and their evolution under various realistic conditions. However, the adoption of the phase field method in practical applications has been slow because current phase field microstructure modeling is qualitative in nature. In this thesis, recent efforts in developing the phase field method for quantitative microstructure modeling are presented. This includes extension of the phase field method to situations where nucleation, growth and coarsening occur concurrently, incorporation of anisotropic elastic energy into the nucleation activation energy, and comparison of phase field kinetics for diffusion-controlled phase transformations with Johnson-Mehl-Avrami-Kolmogorov (JMAK) theory. The most recent extensions of the phase field method to modeling dislocation networks, dislocation core structures and partial dislocations, and dislocation interactions with gamma/gamma' microstructures in superalloys are also presented. The length scale limitations and practical approaches to increase simulation length scales for quantitative modeling are discussed for a quite general category of phase field applications. These extensions enable various new understandings of microstructure. For example, coherent precipitates are found to behave similarly to dislocations and grain boundaries, causing solute segregation, correlated nucleation and an autocatalytic effect. The overall kinetics in diffusion-controlled precipitation agrees with the JMAK prediction only at early stages; due to soft impingement and the Gibbs-Thomson effect the later kinetics can deviate considerably. The new formulations of the crystalline energy and gradient energy in the phase field model of dislocations allow the study of complex dislocation structures, including networks and dissociated nodes, in a self-consistent way. The introduction of gamma-surfaces for constituent phases enables treating dislocation motion in a multi-phase microstructure within one model. Finally, the discussion of the length scale clarifies the applicability of the conventional approaches for increasing simulation length scales, and their respective consequences for the quantitative results.
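    For readers unfamiliar with the approach, the following is a generic one-dimensional Allen-Cahn relaxation, illustrating only the basic idea of evolving a field variable from gradient thermodynamics; it is not one of the quantitative models developed in the thesis, and all parameter values are arbitrary.

```python
import numpy as np

def allen_cahn_1d(n=256, steps=5000, dx=1.0, dt=0.05, mobility=1.0, kappa=1.0):
    """Generic 1D Allen-Cahn relaxation d(phi)/dt = -M * (df/dphi - kappa * d2phi/dx2)
    with a double-well free energy f = 0.25 * (phi^2 - 1)^2 and periodic boundaries.
    Illustration of the field-variable evolution idea only."""
    phi = np.where(np.arange(n) < n // 2, 1.0, -1.0)          # a single diffuse interface
    phi += 0.01 * np.random.default_rng(0).normal(size=n)     # small perturbation
    for _ in range(steps):
        lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
        dfdphi = phi**3 - phi                                  # derivative of the double well
        phi -= dt * mobility * (dfdphi - kappa * lap)          # explicit Euler update
    return phi

print(allen_cahn_1d()[::32])  # coarse view of the relaxed profile
```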

  16. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, D.A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  17. Quantitative Decomposition of Dynamics of Mathematical Cell Models: Method and Application to Ventricular Myocyte Models.

    PubMed

    Shimayoshi, Takao; Cha, Chae Young; Amano, Akira

    2015-01-01

    Mathematical cell models are effective tools to understand cellular physiological functions precisely. For detailed analysis of model dynamics in order to investigate how much each component affects cellular behaviour, mathematical approaches are essential. This article presents a numerical analysis technique, applicable to any complicated cell model formulated as a system of ordinary differential equations, to quantitatively evaluate the contributions of the respective model components to the model dynamics in the intact situation. The present technique employs a novel mathematical index for decomposed dynamics with respect to each differential variable, along with a concept named the instantaneous equilibrium point, which represents the trend of a model variable at some instant. This article also illustrates applications of the method to comprehensive myocardial cell models for analysing the mechanisms of action potential generation and the calcium transient. The analysis results exhibit quantitative contributions of individual channel gating mechanisms and ion exchanger activities to membrane repolarization, and of calcium fluxes and buffers to the rise and fall of the cytosolic calcium level. These analyses quantitatively explicate the principles of the model, which leads to a better understanding of cellular dynamics. PMID:26091413

  18. Quantitative Decomposition of Dynamics of Mathematical Cell Models: Method and Application to Ventricular Myocyte Models

    PubMed Central

    Shimayoshi, Takao; Cha, Chae Young; Amano, Akira

    2015-01-01

    Mathematical cell models are effective tools to understand cellular physiological functions precisely. For detailed analysis of model dynamics in order to investigate how much each component affects cellular behaviour, mathematical approaches are essential. This article presents a numerical analysis technique, applicable to any complicated cell model formulated as a system of ordinary differential equations, to quantitatively evaluate the contributions of the respective model components to the model dynamics in the intact situation. The present technique employs a novel mathematical index for decomposed dynamics with respect to each differential variable, along with a concept named the instantaneous equilibrium point, which represents the trend of a model variable at some instant. This article also illustrates applications of the method to comprehensive myocardial cell models for analysing the mechanisms of action potential generation and the calcium transient. The analysis results exhibit quantitative contributions of individual channel gating mechanisms and ion exchanger activities to membrane repolarization, and of calcium fluxes and buffers to the rise and fall of the cytosolic calcium level. These analyses quantitatively explicate the principles of the model, which leads to a better understanding of cellular dynamics. PMID:26091413

  19. Quantifying disability: data, methods and results.

    PubMed Central

    Murray, C. J.; Lopez, A. D.

    1994-01-01

    Conventional methods for collecting, analysing and disseminating data and information on disability in populations have relied on cross-sectional censuses and surveys which measure prevalence in a given period. While this may be relevant for defining the extent and demographic pattern of disabilities in a population, and thus indicating the need for rehabilitative services, prevention requires detailed information on the underlying diseases and injuries that cause disabilities. The Global Burden of Disease methodology described in this paper provides a mechanism for quantifying the health consequences of the years of life lived with disabilities by first estimating the age-sex-specific incidence rates of underlying conditions, and then mapping these to a single disability index which collectively reflects the probability of progressing to a disability, the duration of life lived with the disability, and the approximate severity of the disability in terms of activity restriction. Detailed estimates of the number of disability-adjusted life years (DALYs) lived are provided in this paper, for eight geographical regions. The results should be useful to those concerned with planning health services for the disabled and, more particularly, with determining policies to prevent the underlying conditions which give rise to serious disabling sequelae. PMID:8062403
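    The years-lived-with-disability component described above is, at its core, incidence multiplied by average duration and a severity weight. A minimal sketch with hypothetical numbers follows; the full Global Burden of Disease methodology also applies age weighting and discounting, which are omitted here.

```python
def years_lived_with_disability(incident_cases, duration_years, disability_weight):
    """YLD for one condition in one age-sex group: new cases x average duration x
    severity weight in [0, 1].  Age weighting and discounting are omitted."""
    return incident_cases * duration_years * disability_weight

# Hypothetical example: 1,000 incident cases lasting 4 years with weight 0.2
print(years_lived_with_disability(1_000, 4.0, 0.2))  # 800 YLDs
```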

  20. MODIS Radiometric Calibration Program, Methods and Results

    NASA Technical Reports Server (NTRS)

    Xiong, Xiaoxiong; Guenther, Bruce; Angal, Amit; Barnes, William; Salomonson, Vincent; Sun, Junqiang; Wenny, Brian

    2012-01-01

    As a key instrument for NASA's Earth Observing System (EOS), the Moderate Resolution Imaging Spectroradiometer (MODIS) has made significant contributions to the remote sensing community with its unprecedented amount of data products continuously generated from its observations and freely distributed to users worldwide. MODIS observations, covering spectral regions from visible (VIS) to long-wave infrared (LWIR), have enabled a broad range of research activities and applications for studies of the earth's interactive system of land, oceans, and atmosphere. In addition to extensive pre-launch measurements, developed to characterize sensor performance, MODIS carries a set of on-board calibrators (OBC) that can be used to track on-orbit changes of various sensor characteristics. Most importantly, dedicated and continuous calibration efforts have been made to maintain sensor data quality. This paper provides an overview of the MODIS calibration program, on-orbit calibration activities, methods, and performance. Key calibration results and lessons learned from the MODIS calibration effort are also presented in this paper.

  1. Magnetic ligation method for quantitative detection of microRNAs.

    PubMed

    Liong, Monty; Im, Hyungsoon; Majmudar, Maulik D; Aguirre, Aaron D; Sebas, Matthew; Lee, Hakho; Weissleder, Ralph

    2014-07-01

    A magnetic ligation method is utilized for the detection of microRNAs among a complex biological background without polymerase chain reaction or nucleotide modification. The sandwich probes assay can be adapted to analyze a panel of microRNAs associated with cardiovascular diseases in heart tissue samples. PMID:24532323

  2. Magnetic Ligation Method for Quantitative Detection of MicroRNAs

    PubMed Central

    Liong, Monty; Im, Hyungsoon; Majmudar, Maulik D.; Aguirre, Aaron D.; Sebas, Matthew; Lee, Hakho; Weissleder, Ralph

    2014-01-01

    A magnetic ligation method is utilized for the detection of microRNAs amongst a complex biological background without polymerase chain reaction or nucleotide modification. The sandwich probes assay can be adapted to analyze a panel of microRNAs associated with cardiovascular diseases in heart tissue samples. PMID:24532323

  3. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, Edward F. (Gaithersburg, MD); Keller, Richard A. (Los Alamos, NM); Apel, Charles T. (Los Alamos, NM)

    1983-01-01

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions.

  4. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1983-09-06

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions. 6 figs.

  5. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1981-02-25

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules or ions.

  6. Selection methods in forage breeding: a quantitative appraisal

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Forage breeding can be extraordinarily complex because of the number of species, perenniality, mode of reproduction, mating system, and the genetic correlation for some traits evaluated in spaced plants vs. performance under cultivation. Aiming to compare eight forage breeding methods for direct sel...

  7. QUANTITATIVE METHODS FOR CROSS-SPECIES MAPPING (CSM)

    EPA Science Inventory

    Cross species extrapolation will be defined as prediction from one species to another without empirical verification. Cross species mapping (CSM) is the same except empirical verification is performed. CSM may be viewed as validation of methods for extrapolation. Algorithms for CSM...

  8. Quantitative method for in vitro matrigel invasiveness measurement through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Rey, Juan A; Dotor, Javier; Castresana, Javier S

    2014-10-01

    The determination of cell invasion by matrigel assay is usually evaluated by counting the cells able to pass through a porous membrane and attach themselves to the other side, or by indirect quantification of an eluted cell-specific staining dye by means of optical density measurement. This paper describes a quantitative analytical imaging approach for determining the invasiveness of tumor cells using a simple method based on image processing with the public domain software ImageJ. Images obtained by direct capture are split into color channels, and the red-channel image is used to measure the area that cells cover in the picture. To overcome the several disadvantages of classical cell invasion determinations, we propose this method because it generates more accurate and sensitive determinations and could be a reasonable option for improving the quality of the results. The proposed cost-effective alternative method is based on this simple and robust software, which is affordable worldwide. PMID:24990701
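    A hedged sketch of the image-analysis step described above (not the published ImageJ workflow): the red channel of a photographed membrane is thresholded and the covered area fraction reported. Otsu thresholding, and the assumption that stained cells are the brighter phase in the red channel, are illustrative choices; depending on the stain the comparison may need to be inverted.

```python
import numpy as np
from skimage import io, filters

def invaded_area_fraction(image_path):
    """Fraction of the field covered by cells, from the red channel of an RGB image."""
    rgb = io.imread(image_path)
    red = rgb[..., 0].astype(float)
    mask = red > filters.threshold_otsu(red)   # use '<' if cells are darker than background
    return float(mask.mean())                  # covered pixels / total pixels
```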

  9. Systematic Comparison and Validation of Quantitative Real-Time PCR Methods for the Quantitation of Adeno-Associated Viral Products.

    PubMed

    Werling, Natalie Jayne; Satkunanathan, Stifani; Thorpe, Robin; Zhao, Yuan

    2015-06-01

    Adeno-associated viral (AAV) vectors show great promise for gene therapy because of their excellent safety profile; however, development of robust dose-determining assays for AAV has presented a significant challenge. With the ultimate goal of future harmonization and standardization of AAV dose-determination assays, we systematically analyzed the influence of key variables, including the sample preparation procedure, the choice of primers and real-time quantitative PCR (qPCR) target sequences, and the conformation of the calibration DNA, on the qPCR quantitation of AAV products. Our results emphasize the importance of qPCR primer design and sample preparation and demonstrate the need for extensive characterization, rigorous controls, and the use of reference materials in clinical dose determination. PMID:25953194

  10. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    PubMed

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs, and the results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. The method is simple and fast, and it can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of the spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms. Graphical abstract: comparison of biofilm quantification from cell density and from image analysis (error bars: standard deviation of three independent samples); inset photographs show the effect of staining. PMID:26643074

  11. Methods for Acquisition of Quantitative Data from Confocal Images of Gene Expression in situ

    PubMed Central

    Surkova, S. Yu.; Myasnikova, E. M.; Kozlov, K. N.; Samsonova, A. A.; Reinitz, J.; Samsonova, M. G.

    2009-01-01

    In this review, we summarize original methods for the extraction of quantitative information from confocal images of gene-expression patterns. These methods include image segmentation, the extraction of quantitative numerical data on gene expression, and the removal of background signal and spatial registration. Finally, it is possible to construct a spatiotemporal atlas of gene expression from individual images recorded at each developmental stage. Initially all methods were developed to extract quantitative numerical information from confocal images of segmentation gene expression in Drosophila melanogaster. The application of these methods to Drosophila images makes it possible to reveal new mechanisms in the formation of segmentation gene expression domains, as well as to construct a quantitative atlas of segmentation gene expression. Most image processing procedures can be easily adapted to process a wide range of biological images. PMID:19343098

  12. Compatibility of Qualitative and Quantitative Methods: Studying Child Sexual Abuse in America.

    ERIC Educational Resources Information Center

    Phelan, Patricia

    1987-01-01

    Illustrates how the combined use of qualitative and quantitative methods was necessary in obtaining a clearer understanding of the process of incest in American society. Argues that the exclusive use of one methodology would have obscured important information. (FMW)

  13. Composition and quantitation of microalgal lipids by ERETIC 1H NMR method.

    PubMed

    Nuzzo, Genoveffa; Gallo, Carmela; d'Ippolito, Giuliana; Cutignano, Adele; Sardo, Angela; Fontana, Angelo

    2013-10-01

    Accurate characterization of biomass constituents is a crucial aspect of research in the biotechnological application of natural products. Here we report an efficient, fast and reproducible method for the identification and quantitation of fatty acids and complex lipids (triacylglycerols, glycolipids, phospholipids) in microalgae under investigation for the development of functional health products (probiotics, food ingredients, drugs, etc.) or third-generation biofuels. The procedure consists of extraction of the biological matrix by a modified Folch method and direct analysis of the resulting material by proton nuclear magnetic resonance (1H NMR). The protocol uses a reference electronic signal as an external standard (ERETIC method) and allows assessment of the total lipid content, degree of saturation, and class distribution both in high-throughput screening of algal collections and in metabolic analysis during genetic or culturing studies. As proof of concept, the methodology was applied to the analysis of three microalgal species (Thalassiosira weissflogii, Cyclotella cryptica and Nannochloropsis salina) which differ drastically in the qualitative and quantitative composition of their fatty acid-based lipids. PMID:24084790
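    As an illustration of how an ERETIC reference converts integrals to concentrations, here is a minimal sketch based on the general ERETIC principle (an electronic signal calibrated to an equivalent concentration, with integrals scaled by the number of contributing protons); the variable names and the simple proportionality are assumptions, not the exact protocol of the paper.

```python
def eretic_concentration(analyte_integral, eretic_integral,
                         eretic_equivalent_conc_mM, n_protons):
    """Analyte concentration from a 1H NMR spectrum with an ERETIC reference,
    assuming a simple proportionality (hypothetical parameter names)."""
    return (analyte_integral / eretic_integral) * eretic_equivalent_conc_mM / n_protons

# Hypothetical example: a signal from 4 equivalent protons against an ERETIC peak
# calibrated to a 1.0 mM equivalent concentration
print(eretic_concentration(analyte_integral=8.4, eretic_integral=2.1,
                           eretic_equivalent_conc_mM=1.0, n_protons=4))
```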

  14. Quantitative method for measurement of the Goos-Hänchen effect based on source divergence considerations

    SciTech Connect

    Gray, Jeffrey F.; Puri, Ashok

    2007-06-15

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hänchen effect based upon the real-world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hänchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near the critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green's function is defined which can be used to determine the effective transfer function of the near-critical-angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hänchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10^-6, comparable with recent results reported in the literature.
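    For reference, the inverse-square-root behaviour mentioned above is already visible in the classical Artmann expression for the lateral shift of an s-polarized plane wave under total internal reflection, sketched below; this is the textbook plane-wave formula, not the divergence-corrected model developed in the paper.

```python
import numpy as np

def gh_shift_te(theta_deg, wavelength, n1, n2):
    """Artmann lateral Goos-Haenchen shift for TE (s) polarization under total
    internal reflection: D = lambda * sin(theta) / (pi * sqrt(n1^2 sin^2(theta) - n2^2)).
    It diverges as an inverse square root as theta approaches the critical angle."""
    theta = np.radians(theta_deg)
    return wavelength * np.sin(theta) / (
        np.pi * np.sqrt((n1 * np.sin(theta)) ** 2 - n2 ** 2))

# Example: glass-air interface, HeNe wavelength, just above the critical angle (~41.8 deg)
print(gh_shift_te(theta_deg=42.5, wavelength=632.8e-9, n1=1.5, n2=1.0))  # shift in metres
```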

  15. A Bead-Based Method for Multiplexed Identification and Quantitation of DNA Sequences Using Flow Cytometry

    PubMed Central

    Spiro, Alexander; Lowe, Mary; Brown, Drew

    2000-01-01

    A new multiplexed, bead-based method which utilizes nucleic acid hybridizations on the surface of microscopic polystyrene spheres to identify specific sequences in heterogeneous mixtures of DNA sequences is described. The method consists of three elements: beads (5.6-µm diameter) with oligomer capture probes attached to the surface, three fluorophores for multiplexed detection, and flow cytometry instrumentation. Two fluorophores are impregnated within each bead in varying amounts to create different bead types, each associated with a unique probe. The third fluorophore is a reporter. Following capture of fluorescent cDNA sequences from environmental samples, the beads are analyzed by flow cytometric techniques which yield a signal intensity for each capture probe proportional to the amount of target sequences in the analyte. In this study, a direct hybrid capture assay was developed and evaluated with regard to sequence discrimination and quantitation of abundances. The target sequences (628 to 728 bp in length) were obtained from the 16S/23S intergenic spacer region of microorganisms collected from polluted groundwater at the nuclear waste site in Hanford, Wash. A fluorescence standard consisting of beads with a known number of fluorescent DNA molecules on the surface was developed, and the resolution, sensitivity, and lower detection limit for measuring abundances were determined. The results were compared with those of a DNA microarray using the same sequences. The bead method exhibited far superior sequence discrimination and possesses features which facilitate accurate quantitation. PMID:11010868

  16. A bead-based method for multiplexed identification and quantitation of DNA sequences using flow cytometry.

    PubMed

    Spiro, A; Lowe, M; Brown, D

    2000-10-01

    A new multiplexed, bead-based method which utilizes nucleic acid hybridizations on the surface of microscopic polystyrene spheres to identify specific sequences in heterogeneous mixtures of DNA sequences is described. The method consists of three elements: beads (5.6-microm diameter) with oligomer capture probes attached to the surface, three fluorophores for multiplexed detection, and flow cytometry instrumentation. Two fluorophores are impregnated within each bead in varying amounts to create different bead types, each associated with a unique probe. The third fluorophore is a reporter. Following capture of fluorescent cDNA sequences from environmental samples, the beads are analyzed by flow cytometric techniques which yield a signal intensity for each capture probe proportional to the amount of target sequences in the analyte. In this study, a direct hybrid capture assay was developed and evaluated with regard to sequence discrimination and quantitation of abundances. The target sequences (628 to 728 bp in length) were obtained from the 16S/23S intergenic spacer region of microorganisms collected from polluted groundwater at the nuclear waste site in Hanford, Wash. A fluorescence standard consisting of beads with a known number of fluorescent DNA molecules on the surface was developed, and the resolution, sensitivity, and lower detection limit for measuring abundances were determined. The results were compared with those of a DNA microarray using the same sequences. The bead method exhibited far superior sequence discrimination and possesses features which facilitate accurate quantitation. PMID:11010868

  17. Quantitative Trait Locus Mapping Methods for Diversity Outbred Mice

    PubMed Central

    Gatti, Daniel M.; Svenson, Karen L.; Shabalin, Andrey; Wu, Long-Yang; Valdar, William; Simecek, Petr; Goodwin, Neal; Cheng, Riyan; Pomp, Daniel; Palmer, Abraham; Chesler, Elissa J.; Broman, Karl W.; Churchill, Gary A.

    2014-01-01

    Genetic mapping studies in the mouse and other model organisms are used to search for genes underlying complex phenotypes. Traditional genetic mapping studies that employ single-generation crosses have poor mapping resolution and limit discovery to loci that are polymorphic between the two parental strains. Multiparent outbreeding populations address these shortcomings by increasing the density of recombination events and introducing allelic variants from multiple founder strains. However, multiparent crosses present new analytical challenges and require specialized software to take full advantage of these benefits. Each animal in an outbreeding population is genetically unique and must be genotyped using a high-density marker set; regression models for mapping must accommodate multiple founder alleles, and complex breeding designs give rise to polygenic covariance among related animals that must be accounted for in mapping analysis. The Diversity Outbred (DO) mice combine the genetic diversity of eight founder strains in a multigenerational breeding design that has been maintained for >16 generations. The large population size and randomized mating ensure the long-term genetic stability of this population. We present a complete analytical pipeline for genetic mapping in DO mice, including algorithms for probabilistic reconstruction of founder haplotypes from genotyping array intensity data, and mapping methods that accommodate multiple founder haplotypes and account for relatedness among animals. Power analysis suggests that studies with as few as 200 DO mice can detect loci with large effects, but loci that account for <5% of trait variance may require a sample size of up to 1000 animals. The methods described here are implemented in the freely available R package DOQTL. PMID:25237114

  18. Quantitative trait locus mapping methods for diversity outbred mice.

    PubMed

    Gatti, Daniel M; Svenson, Karen L; Shabalin, Andrey; Wu, Long-Yang; Valdar, William; Simecek, Petr; Goodwin, Neal; Cheng, Riyan; Pomp, Daniel; Palmer, Abraham; Chesler, Elissa J; Broman, Karl W; Churchill, Gary A

    2014-09-01

    Genetic mapping studies in the mouse and other model organisms are used to search for genes underlying complex phenotypes. Traditional genetic mapping studies that employ single-generation crosses have poor mapping resolution and limit discovery to loci that are polymorphic between the two parental strains. Multiparent outbreeding populations address these shortcomings by increasing the density of recombination events and introducing allelic variants from multiple founder strains. However, multiparent crosses present new analytical challenges and require specialized software to take full advantage of these benefits. Each animal in an outbreeding population is genetically unique and must be genotyped using a high-density marker set; regression models for mapping must accommodate multiple founder alleles, and complex breeding designs give rise to polygenic covariance among related animals that must be accounted for in mapping analysis. The Diversity Outbred (DO) mice combine the genetic diversity of eight founder strains in a multigenerational breeding design that has been maintained for >16 generations. The large population size and randomized mating ensure the long-term genetic stability of this population. We present a complete analytical pipeline for genetic mapping in DO mice, including algorithms for probabilistic reconstruction of founder haplotypes from genotyping array intensity data, and mapping methods that accommodate multiple founder haplotypes and account for relatedness among animals. Power analysis suggests that studies with as few as 200 DO mice can detect loci with large effects, but loci that account for <5% of trait variance may require a sample size of up to 1000 animals. The methods described here are implemented in the freely available R package DOQTL. PMID:25237114

  19. Measurable impact of RNA quality on gene expression results from quantitative PCR

    PubMed Central

    Vermeulen, Joëlle; De Preter, Katleen; Lefever, Steve; Nuytens, Justine; De Vloed, Fanny; Derveaux, Stefaan; Hellemans, Jan; Speleman, Frank; Vandesompele, Jo

    2011-01-01

    Compromised RNA quality is suggested to lead to unreliable results in gene expression studies. Therefore, assessment of RNA integrity and purity is deemed essential prior to including samples in the analytical pipeline. This may be of particular importance when diagnostic, prognostic or therapeutic conclusions depend on such analyses. In this study, the comparative value of six RNA quality parameters was determined using a large panel of 740 primary tumour samples for which real-time quantitative PCR gene expression results were available. The tested parameters comprise the microfluidic capillary electrophoresis based 18S/28S rRNA ratio and RNA Quality Index value, the HPRT1 5'-3' difference in quantification cycle (Cq) and HPRT1 3' Cq value based on a 5'/3' ratio mRNA integrity assay, the Cq value of expressed Alu repeat sequences and a normalization factor based on the mean expression level of four reference genes. Upon establishment of an innovative analytical framework to assess the impact of RNA quality, we observed a measurable impact of RNA quality on the variation of the reference genes, on the significance of differential expression of prognostic marker genes between two cancer patient risk groups, and on risk classification performance using a multigene signature. This study forms the basis for further rational assessment of reverse transcription quantitative PCR based results in relation to RNA quality. PMID:21317187

  20. Measurable impact of RNA quality on gene expression results from quantitative PCR.

    PubMed

    Vermeulen, Joëlle; De Preter, Katleen; Lefever, Steve; Nuytens, Justine; De Vloed, Fanny; Derveaux, Stefaan; Hellemans, Jan; Speleman, Frank; Vandesompele, Jo

    2011-05-01

    Compromised RNA quality is suggested to lead to unreliable results in gene expression studies. Therefore, assessment of RNA integrity and purity is deemed essential prior to including samples in the analytical pipeline. This may be of particular importance when diagnostic, prognostic or therapeutic conclusions depend on such analyses. In this study, the comparative value of six RNA quality parameters was determined using a large panel of 740 primary tumour samples for which real-time quantitative PCR gene expression results were available. The tested parameters comprise the microfluidic capillary electrophoresis based 18S/28S rRNA ratio and RNA Quality Index value, the HPRT1 5'-3' difference in quantification cycle (Cq) and HPRT1 3' Cq value based on a 5'/3' ratio mRNA integrity assay, the Cq value of expressed Alu repeat sequences and a normalization factor based on the mean expression level of four reference genes. Upon establishment of an innovative analytical framework to assess the impact of RNA quality, we observed a measurable impact of RNA quality on the variation of the reference genes, on the significance of differential expression of prognostic marker genes between two cancer patient risk groups, and on risk classification performance using a multigene signature. This study forms the basis for further rational assessment of reverse transcription quantitative PCR based results in relation to RNA quality. PMID:21317187
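
    A minimal sketch of two of the parameters named above, using hypothetical Cq values: the HPRT1 5'-3' difference in quantification cycle, and a normalization factor taken here simply as the arithmetic mean Cq of four reference genes (an assumption; the study's exact normalization procedure may differ).

      def hprt1_delta_cq(cq_5prime, cq_3prime):
          # 5'-3' difference in quantification cycle; larger values point to more degraded RNA
          return cq_3prime - cq_5prime

      def reference_normalization_factor(reference_cqs):
          # summary of reference-gene expression; here just the mean Cq (illustrative assumption)
          return sum(reference_cqs) / len(reference_cqs)

      sample = {"HPRT1_5p": 24.1, "HPRT1_3p": 26.8, "refs": [22.3, 23.1, 21.8, 22.6]}
      print("HPRT1 delta Cq:", hprt1_delta_cq(sample["HPRT1_5p"], sample["HPRT1_3p"]))
      print("normalization factor (mean ref Cq):", reference_normalization_factor(sample["refs"]))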

  1. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages.

    PubMed

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction yet preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (by as much as two-fold) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and "naked" unicellular organisms indicates that our method could obtain the intact DNA of organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance. PMID:26218575

  2. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages

    PubMed Central

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction yet preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (by as much as two-fold) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and “naked” unicellular organisms indicates that our method could obtain the intact DNA of organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance. PMID:26218575

  3. Methods for quantitative analysis of trabecular bone structure.

    PubMed

    Cortet, B; Colin, D; Dubois, P; Delcambre, B; Marchandise, X

    1995-12-01

    Bone mineral density accounts for 70% to 80% of the mechanical resistance of bone but is unrelated to bone tissue structure. The vertebral fracture risk increases with advancing age irrespective of whether or not bone mineral density decreases, suggesting that changes in bone microarchitecture contribute significantly to the development of osteoporosis. In contrast to bone mass, bone architecture is difficult to evaluate. Among the various methods developed to investigate bone structure, biomechanical studies are of limited value since they are done on cadaver bones. Measurement of microarchitectural parameters (e.g., mean trabecular thickness, density and separation) in bone specimens obtained by needle biopsy is the gold-standard technique. Parameters reflecting trabecular interconnections (e.g., total number of nodes and free ends) can also be measured on needle biopsy specimens. New techniques of as yet unproven validity include star volume and trabecular bone pattern factor measurement. Noninvasive techniques capable of supplying qualitative information about bone tissue are also under study. Ultrasonography can theoretically provide data on bone microarchitecture but has not yet been proven useful in clinical practice. Statistical, structural, or fractal analysis techniques can be used to evaluate bone texture on digitized roentgenograms, computed tomography sections, or magnetic resonance imaging displays; although this approach holds great promise, it is still under evaluation and has not yet been compared with histomorphometry. Lastly, the apparent relaxation time of bone marrow determined using magnetic resonance imaging may also provide information on bone structure. PMID:8869221

  4. New methods for quantitative and qualitative facial studies: an overview.

    PubMed

    Thomas, I T; Hintz, R J; Frias, J L

    1989-01-01

    The clinical study of birth defects has traditionally followed the Gestalt approach, with a trend, in recent years, toward more objective delineation. Data collection, however, has been largely restricted to measurements from X-rays and anthropometry. In other fields, new techniques are being applied that capitalize on the use of modern computer technology. One such technique is that of remote sensing, of which photogrammetry is a branch. Cartographers, surveyors and engineers, using specially designed cameras, have applied geometrical techniques to locate points on an object precisely. These techniques, in their long-range application, have become part of our industrial technology and have assumed great importance with the development of satellite-borne surveillance systems. The close-range application of similar techniques has the potential for extremely accurate clinical measurement. We are currently evaluating the application of remote sensing to facial measurement using three conventional 35 mm still cameras. The subject is photographed in front of a carefully measured grid, and digitization is then carried out on 35-mm slides: specific landmarks on the cranioface are identified, along with points on the background grid and the four corners of the slide frame, and are registered as x-y coordinates by a digitizer. These coordinates are then converted into precise locations in object space. The technique is capable of producing measurements to within 1/100th of an inch. We suggest that remote sensing methods such as this may well be of great value in the study of congenital malformations. PMID:2677039

  5. A novel method for quantitative analysis of acetylacetone and ethyl acetoacetate by fluorine-19 nuclear magnetic spectroscopy.

    PubMed

    Zhou, Lulin; Li, Cheng; Weng, Xinchu

    2016-03-01

    A new method utilizing NMR spectra was developed for structural and quantitative analysis of the enol forms of acetylacetone and ethyl acetoacetate. Acetylacetone and ethyl acetoacetate were determined by 19F NMR upon derivatisation with p-fluorobenzoyl chloride. The base-catalyzed derivatives formed by reaction of acetylacetone and ethyl acetoacetate with p-fluorobenzoyl chloride were analyzed by 1H and 13C NMR spectroscopy. E and Z configurations of acetylacetone and ethyl acetoacetate were separated and purified by thin layer chromatography. In addition, the ability of 19F NMR for quantitative analysis of acetylacetone by integration of the appropriate signals of the derivatives was tested and compared. The results further confirmed the enol forms of acetylacetone and ethyl acetoacetate and the feasibility of the 19F NMR method. This method can potentially be used to characterize E and Z isomers and to quantitatively analyze the E/Z ratio of β-diketone and β-ketoester homologues. Copyright 2015 John Wiley & Sons, Ltd. PMID:26521683

  6. Immunochemical methods for quantitation of vitamin B6. Technical report

    SciTech Connect

    Brandon, D.L.; Corse, J.W.

    1981-09-30

    A procedure is described which proposes schemes for determining the total of all B6 vitamins in acid-hydrolyzed samples utilizing a radio-immunoassay (RIA) or an enzyme-immunoassay (EIA). Sample preparation is similar for both RIA and EIA. Two specific antibodies (antipyridoxine and antipyridoxamine) are employed to determine pyridoxine and pyridoxamine; a portion of the sample is reduced with sodium borohydride, and pyridoxal is determined by the difference between pyridoxine measured before and after reduction. The results indicate that two procedures have been developed which are selective for pyridoxamine (the fluorescent enzyme immunoassay and the spin immunoassay) and one assay which is equally sensitive to pyridoxine and pyridoxamine (the radio-immunoassay).

  7. A practical and sensitive method of quantitating lymphangiogenesis in vivo.

    PubMed

    Majumder, Mousumi; Xin, Xiping; Lala, Peeyush K

    2013-07-01

    To address the inadequacy of current assays, we developed a directed in vivo lymphangiogenesis assay (DIVLA) by modifying an established directed in vivo angiogenesis assay. Silicon tubes (angioreactors) were implanted in the dorsal flanks of nude mice. Tubes contained either growth factor-reduced basement membrane extract (BME)-alone (negative control) or BME-containing vascular endothelial growth factor (VEGF)-D (positive control for lymphangiogenesis) or FGF-2/VEGF-A (positive control for angiogenesis) or a high VEGF-D-expressing breast cancer cell line MDA-MB-468LN (468LN), or VEGF-D-silenced 468LN. Lymphangiogenesis was detected superficially with Evans Blue dye tracing and measured in the cellular contents of angioreactors by multiple approaches: lymphatic vessel endothelial hyaluronan receptor-1 (Lyve1) protein (immunofluorescence) and mRNA (qPCR) expression and a visual scoring of lymphatic vs blood capillaries with dual Lyve1 (or PROX-1 or Podoplanin)/Cd31 immunostaining in cryosections. Lymphangiogenesis was absent with BME, high with VEGF-D or VEGF-D-producing 468LN cells and low with VEGF-D-silenced 468LN. Angiogenesis was absent with BME, high with FGF-2/VEGF-A, moderate with 468LN or VEGF-D and low with VEGF-D-silenced 468LN. The method was reproduced in a syngeneic murine C3L5 tumor model in C3H/HeJ mice with dual Lyve1/Cd31 immunostaining. Thus, DIVLA presents a practical and sensitive assay of lymphangiogenesis, validated with multiple approaches and markers. It is highly suited to identifying pro- and anti-lymphangiogenic agents, as well as shared or distinct mechanisms regulating lymphangiogenesis vs angiogenesis, and is widely applicable to research in vascular/tumor biology. PMID:23711825

  8. Bird community as an indicator of biodiversity: results from quantitative surveys in Brazil.

    PubMed

    Vielliard, J M

    2000-09-01

    This short review presents the results obtained in several localities of Brazil on the composition of forest bird communities. Data have been collected since the late 1980s, after we introduced a new methodology of quantitative survey based on acoustic identification and unlimited-radius point censuses. Although these data are still scattered, they show uniquely precise and coherently comparative patterns of composition of forest bird communities. Our methodology has the advantage of being absolutely non-disturbing, highly efficient in the field and immediately processed. Results confirm that the structure of a bird community is a good indicator of biodiversity, particularly useful where biodiversity is high. Many of these data are available only in unpublished dissertations and abstracts of congress communications, or are still being analysed. A cooperative program is needed to promote new surveys and publish their results, as a contribution to measuring and monitoring biodiversity, especially in complex endangered habitats. PMID:11028097

  9. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling

    PubMed Central

    2012-01-01

    Mass Spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol. PMID:23176383
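
    A toy sketch of the general workflow the abstract outlines (normalization followed by per-protein testing for differential abundance), using simulated log2 reporter-ion intensities rather than the case-study iTRAQ data, and a plain t-test with Benjamini-Hochberg correction in place of the authors' specific models.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n_proteins = 500
      # simulated log2 reporter intensities: two channels per condition (4-plex style)
      data = rng.normal(20.0, 1.0, size=(n_proteins, 4))
      data[:25, 2:] += 1.0                                   # 25 truly up-regulated proteins

      # normalization: align channel medians to remove loading differences
      data -= np.median(data, axis=0, keepdims=True)

      # per-protein test for differential abundance between the two conditions
      t, p = stats.ttest_ind(data[:, :2], data[:, 2:], axis=1)

      # Benjamini-Hochberg adjusted p-values
      order = np.argsort(p)
      q_sorted = p[order] * n_proteins / (np.arange(n_proteins) + 1)
      q_sorted = np.minimum.accumulate(q_sorted[::-1])[::-1]
      q = np.empty_like(q_sorted)
      q[order] = np.clip(q_sorted, 0, 1)
      print("proteins called differential at q < 0.05:", int((q < 0.05).sum()))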

  10. A quantitative method for photovoltaic encapsulation system optimization

    NASA Technical Reports Server (NTRS)

    Garcia, A., III; Minning, C. P.; Cuddihy, E. F.

    1981-01-01

    It is pointed out that the design of encapsulation systems for flat plate photovoltaic modules requires the fulfillment of conflicting design requirements. An investigation was conducted with the objective to find an approach which will make it possible to determine a system with optimum characteristics. The results of the thermal, optical, structural, and electrical isolation analyses performed in the investigation indicate the major factors in the design of terrestrial photovoltaic modules. For defect-free materials, minimum encapsulation thicknesses are determined primarily by structural considerations. Cell temperature is not strongly affected by encapsulant thickness or thermal conductivity. The emissivity of module surfaces exerts a significant influence on cell temperature. Encapsulants should be elastomeric, and ribs are required on substrate modules. Aluminum is unsuitable as a substrate material. Antireflection coating is required on cell surfaces.

  11. Intentional Movement Performance Ability (IMPA): a method for robot-aided quantitative assessment of motor function.

    PubMed

    Shin, Sung Yul; Kim, Jung Yoon; Lee, Sanghyeop; Lee, Junwon; Kim, Seung-Jong; Kim, ChangHwan

    2013-06-01

    The purpose of this paper is to propose a new assessment method for evaluating the motor function of patients who are suffering from physical weakness after stroke, incomplete spinal cord injury (iSCI) or other diseases. In this work, we use a robotic device to obtain information about the interaction that occurs between the patient and the robot, and use it as a measure for assessing the patients. The Intentional Movement Performance Ability (IMPA) is defined as the root mean square of the interactive torque while the subject performs a given periodic movement with the robot. IMPA is proposed to quantitatively determine the level of the subject's impaired motor function. The method is indirectly tested by asking healthy subjects to lift a barbell to disturb their motor function. The experimental results show that the IMPA has the potential to provide proper information about the subject's motor function level. PMID:24187313
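
    Since the abstract defines IMPA as the root mean square of the interactive torque recorded during the periodic movement, a direct sketch is straightforward; the torque trace below is simulated and its units and amplitudes are assumptions.

      import numpy as np

      def impa(interactive_torque):
          # root mean square of the interactive torque measured between patient and robot
          tau = np.asarray(interactive_torque, dtype=float)
          return float(np.sqrt(np.mean(tau ** 2)))

      # simulated torque trace for one periodic movement trial (assumed units: N*m)
      t = np.linspace(0.0, 10.0, 1000)
      torque = 0.8 * np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
      print("IMPA:", round(impa(torque), 3))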

  12. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  13. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such

  14. Spatial Access Priority Mapping (SAPM) with Fishers: A Quantitative GIS Method for Participatory Planning

    PubMed Central

    Yates, Katherine L.; Schoeman, David S.

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers’ spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers’ willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process in a transparent, quantitative way. PMID:23874623

  15. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040
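
    The group comparisons above rely on chi-square tests of component proportions; the sketch below shows the computation on a 2x2 table of made-up counts (not the study's data).

      from scipy.stats import chi2_contingency

      # hypothetical counts of quantitative methodological components present vs. absent
      # rows: mixed methods articles, quantitative articles (illustrative numbers only)
      table = [[90, 320],
               [700, 790]]
      chi2, p, dof, expected = chi2_contingency(table, correction=False)
      print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")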

  16. Dynamic and Quantitative Method of Analyzing Service Consistency Evolution Based on Extended Hierarchical Finite State Automata

    PubMed Central

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in a service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties for the usage of traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depicts the overall changing processes of service consistency states. Service consistency evolution algorithms (SCEAs) based on EHS-FSA are also developed to quantitatively assess these impact factors. Experimental results show that bad reusability (17.93% on average) is the biggest influential factor, noncomposition of atomic services (13.12%) is the second biggest, and service version confusion (1.2%) is the smallest. Compared with previous qualitative analyses, the SCEAs show good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA. PMID:24772033

  17. Proteus mirabilis biofilm - Qualitative and quantitative colorimetric methods-based evaluation

    PubMed Central

    Kwiecińska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Więckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is currently the topic of numerous research studies worldwide. In this study, the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated between 2011 and 2012 from urine samples of patients in the clinics of the Dr Antoni Jurasz University Hospital No. 1 in Bydgoszcz was used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The results confirmed biofilm formation by all the examined strains, except in the quantitative method with TTC, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods are able to form biofilm on the surfaces of both biomaterials tested, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in biofilm-forming ability observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant. PMID:25763050

  18. A new method for fast quantitative mapping of absolute water content in vivo.

    PubMed

    Neeb, H; Zilles, K; Shah, N J

    2006-07-01

    The presence of brain edema, in its various forms, is an accompanying feature of many diseased states. Although the localized occurrence of brain edema may be demonstrated with MRI, the quantitative determination of absolute water content, an aspect that could play an important role in the objective evaluation of the dynamics of brain edema and the monitoring of the efficiency of treatment, is much more demanding. We present a method for the localized and quantitative measurement of absolute water content based on the combination of two fast multi-slice and multi-time point sequences QUTE and TAPIR for mapping the T(2)* and T(1) relaxation times, respectively. Incorporation of corrections for local B(1) field miscalibrations, temperature differences between the subject and a reference probe placed in the FOV, receiver profile inhomogeneities and T(1) saturation effects are included and allow the determination of water content with anatomical resolution and a precision >98%. The method was validated in phantom studies and was applied to the localized in vivo measurement of water content in a group of normal individuals and a patient with brain tumor. The results demonstrate that in vivo measurement of regional absolute water content is possible in clinically relevant measurement times with a statistical and systematic measurement error of <2%. PMID:16650780

  19. The Quantitative Methods Boot Camp: Teaching Quantitative Thinking and Computing Skills to Graduate Students in the Life Sciences

    PubMed Central

    Stefan, Melanie I.; Gutlerner, Johanna L.; Born, Richard T.; Springer, Michael

    2015-01-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a “boot camp” in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students’ engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  20. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    PubMed

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  1. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    PubMed

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. Key advantages of FIA-MS over chromatography-MS are fast analysis (typical run time <60 s) and method simplicity, and FIA-MS offers high-throughput without compromising sensitivity, precision and accuracy as much as ambient MS techniques. Consequently, FIA-MS is increasingly becoming recognized as a suitable technique for applications where quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis such as salting out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4 (+) QuEChERS, and (3) the need to improve efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical capability and sample analysis throughput suitable for broad applications in life sciences, agricultural chemistry, consumer safety, and beyond. Graphical abstract Position of FIA-MS relative to chromatography-MS and ambient MS in terms of analytical figures of merit and sample analysis throughput. PMID:26670771

  2. An Improved Flow Cytometry Method For Precise Quantitation Of Natural-Killer Cell Activity

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Nehlsen-Cannarella, Sandra; Sams, Clarence

    2006-01-01

    The ability to assess NK cell cytotoxicity using flow cytometry has been previously described and can serve as a powerful tool to evaluate effector immune function in the clinical setting. Previous methods used membrane-permeable dyes to identify target cells. The use of these dyes requires great care to achieve optimal staining and results in a broad spectral emission that can make multicolor cytometry difficult. Previous methods have also used negative staining (the elimination of target cells) to identify effector cells. This makes a precise quantitation of effector NK cells impossible due to the interfering presence of T and B lymphocytes, and makes the data highly sensitive to the variable levels of NK cells normally found in human peripheral blood. In this study an improved version of the standard flow cytometry assay for NK activity is described that has several advantages over previous methods. Fluorescent antibody staining (CD45-FITC) is used to positively identify target cells in place of membrane-permeable dyes. Fluorescent antibody staining of target cells is less labor intensive and more easily reproducible than membrane dyes. NK cells (true effector lymphocytes) are also positively identified by fluorescent antibody staining (CD56-PE), allowing a simultaneous absolute count assessment of both NK cells and target cells. Dead cells are identified by membrane disruption using the DNA intercalating dye PI. Using this method, an exact NK:target ratio may be determined for each assessment, including quantitation of NK-target complexes. Back-immunoscatter gating may be used to track live vs. dead target cells via scatter properties. If desired, NK activity may then be normalized to standardized ratios for clinical comparisons between patients, making the determination of PBMC counts or NK cell percentages prior to testing unnecessary. This method provides an exact cytometric determination of NK activity that is highly reproducible and may be suitable for routine use in the clinical setting.
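
    A small sketch of the kind of summary this assay enables, using hypothetical absolute counts; the abstract does not state an exact activity formula, so the spontaneous-death correction and all numbers below are illustrative assumptions.

      def nk_summary(nk_count, target_count, dead_targets, spontaneous_dead_fraction=0.05):
          # effector-to-target ratio from absolute counts of NK cells and target cells
          et_ratio = nk_count / target_count
          # fraction of target cells scored dead, corrected for an assumed spontaneous-death rate
          lysis = dead_targets / target_count - spontaneous_dead_fraction
          return et_ratio, max(lysis, 0.0)

      et, lysis = nk_summary(nk_count=12000, target_count=10000, dead_targets=3200)
      print(f"E:T ratio = {et:.2f}, specific lysis = {100 * lysis:.1f}%")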

  3. Methods Used by Pre-Service Nigeria Certificate in Education Teachers in Solving Quantitative Problems in Chemistry

    ERIC Educational Resources Information Center

    Danjuma, Ibrahim Mohammed

    2011-01-01

    This paper reports part of the results of research on chemical problem solving behavior of pre-service teachers in Plateau and Northeastern states of Nigeria. Specifically, it examines and describes the methods used by 204 pre-service teachers in solving quantitative problems from four topics in chemistry. Namely, gas laws; electrolysis;…

  4. Methods Used by Pre-Service Nigeria Certificate in Education Teachers in Solving Quantitative Problems in Chemistry

    ERIC Educational Resources Information Center

    Danjuma, Ibrahim Mohammed

    2011-01-01

    This paper reports part of the results of research on chemical problem solving behavior of pre-service teachers in Plateau and Northeastern states of Nigeria. Specifically, it examines and describes the methods used by 204 pre-service teachers in solving quantitative problems from four topics in chemistry. Namely, gas laws; electrolysis;

  5. Quantitative accuracy analysis of the discontinuous Galerkin method for seismic wave propagation

    NASA Astrophysics Data System (ADS)

    Käser, Martin; Hermann, Verena; Puente, Josep de la

    2008-06-01

    We present a quantitative accuracy analysis of the Discontinuous Galerkin Finite-Element method for the simulation of seismic wave propagation on tetrahedral meshes. Several parameters are responsible for the accuracy of results, such as the chosen approximation order, the spatial discretization, that is, number of elements per wavelength, and the propagation distance of the waves due to numerical dispersion and dissipation. As error norm we choose the time-frequency representation of the envelope and phase misfit of seismograms to assess the accuracy of the resulting seismograms since this provides the time evolution of the spectral content and allows for the clear separation of amplitude and phase errors obtained by the numerical method. Our results can be directly used to set up the necessary modelling parameters for practical applications, such as the minimum approximation order for a given mesh spacing to reach a desired accuracy. Finally, we apply our results to the well-acknowledged LOH.1 and LOH.3 problems of the SPICE Code Validation project, including heterogeneous material and the free surface boundary condition, and compare our solutions with those of other methods. In general, we want to stress the increasing importance of certain standard procedures to facilitate future code validations and comparisons of results in the community of numerical seismology.

  6. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods.

    PubMed

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-15

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%). PMID:26774813
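
    A minimal PLS sketch in the spirit of the study (plain PLS only, no genetic-algorithm wavelength selection), using simulated Gaussian absorption bands in place of the real UV spectra; the band positions, concentration ranges, and noise level are assumptions. Recovery and relative standard error are computed as summary figures of merit, as in the abstract.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(2)
      wavelengths = np.linspace(220, 320, 101)

      def spectrum(c_par, c_tra):
          # toy additive spectra for paracetamol and tramadol (band positions are assumptions)
          band_par = np.exp(-((wavelengths - 243.0) / 10.0) ** 2)
          band_tra = np.exp(-((wavelengths - 272.0) / 12.0) ** 2)
          return c_par * band_par + c_tra * band_tra + rng.normal(0, 0.002, wavelengths.size)

      conc = rng.uniform(0.2, 1.0, size=(40, 2))             # calibration concentrations
      X = np.array([spectrum(*c) for c in conc])
      pls = PLSRegression(n_components=4).fit(X, conc)

      true = np.array([[0.65, 0.40]])                        # synthetic "unknown" mixture
      pred = pls.predict(np.array([spectrum(*true[0])]))
      recovery = 100.0 * pred / true
      rse = 100.0 * np.sqrt(np.mean(((pred - true) / true) ** 2))
      print("recovery (%):", np.round(recovery, 1), "RSE (%):", round(float(rse), 2))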

  7. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods

    NASA Astrophysics Data System (ADS)

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-01

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).

  8. Statistical Methods for Summarizing Independent Correlational Results.

    ERIC Educational Resources Information Center

    Viana, Marlos A. G.

    1980-01-01

    Statistical techniques for summarizing results from independent correlational studies are presented. The case in which only the sample correlation coefficients are available and the case in which the original paired data are available are both considered. (Author/JKS)
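
    The abstract does not spell out the estimators; one common way to summarize independent sample correlations when only the r values and sample sizes are available is to average Fisher z-transformed correlations, as sketched below with made-up inputs.

      import numpy as np

      def pooled_correlation(r, n):
          # weight each Fisher z-transformed correlation by n_i - 3, then back-transform
          r, n = np.asarray(r, float), np.asarray(n, float)
          z = np.arctanh(r)
          w = n - 3.0
          z_bar = np.sum(w * z) / np.sum(w)
          se = 1.0 / np.sqrt(np.sum(w))
          ci = np.tanh([z_bar - 1.96 * se, z_bar + 1.96 * se])
          return float(np.tanh(z_bar)), ci

      r_pooled, ci = pooled_correlation([0.42, 0.31, 0.55], [60, 45, 120])
      print("pooled r:", round(r_pooled, 3), "95% CI:", np.round(ci, 3))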

  9. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  10. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to reach 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using Bt176 corn containing test samples and applying Bt176 specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 ± 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials. PMID:11767156
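
    The genome-size dependence of the LOQ quoted above follows from simple arithmetic: 200 ng of genomic DNA contains far fewer haploid genome copies for a large genome than for a small one. The sketch below uses assumed 1C values (roughly 0.5 pg for rice and 17 pg for wheat) and an assumed LOQ of 40 target copies, so it reproduces only the order of magnitude of the reported limits, not the exact 0.02% and 0.7% figures.

      def loq_percent(loq_copies, dna_ng, genome_pg):
          # percentage GMO corresponding to the copy-number LOQ for a reaction with dna_ng of plant DNA
          genome_copies = dna_ng * 1000.0 / genome_pg        # 1 ng = 1000 pg
          return 100.0 * loq_copies / genome_copies

      for plant, genome_pg in [("rice", 0.5), ("wheat", 17.0)]:
          print(plant, round(loq_percent(loq_copies=40, dna_ng=200, genome_pg=genome_pg), 3), "%")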

  11. New facility design and work method for the quantitative fit testing laboratory. Master's thesis

    SciTech Connect

    Ward, G.F.

    1989-05-01

    The United States Air Force School of Aerospace Medicine (USAFSAM) tests the quantitative fit of masks which are worn by military personnel during nuclear, biological, and chemical warfare. Subjects are placed in a Dynatech-Frontier Fit Testing Chamber, salt air is fed into the chamber, and samples of air are drawn from the mask and the chamber. The ratio of salt air outside the mask to salt air inside the mask is called the quantitative fit factor. A motion-time study was conducted to evaluate the efficiency of the layout and work method presently used in the laboratory. A link analysis was done to determine equipment priorities, and the link data and design guidelines were used to develop three proposed laboratory designs. The proposals were evaluated by projecting the time and motion efficiency, and the energy expended working in each design. Also evaluated were the lengths of the equipment links for each proposal, and each proposal's adherence to design guidelines. A mock-up was built of the best design proposal, and a second motion-time study was run. Results showed that with the new laboratory and work procedures, the USAFSAM analyst could test 116 more subjects per year than are currently tested. Finally, the results of a questionnaire given to the analyst indicated that user acceptance of the work area improved with the new design.

  12. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequences analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fire ball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the actual basic data of the gas pipelines and the precision requirements of the risk assessment. PMID:21402442

  13. The use of electromagnetic induction methods for establishing quantitative permafrost models in West Greenland

    NASA Astrophysics Data System (ADS)

    Ingeman-Nielsen, Thomas; Brandt, Inooraq

    2010-05-01

    The sedimentary settings at West Greenlandic town and infrastructural development sites are dominated by fine-grained marine deposits of late to post glacial origin. Prior to permafrost formation, these materials were leached by percolating precipitation, resulting in depletion of salts. Present day permafrost in these deposits is therefore very ice-rich, with ice contents approaching 50-70% vol. in some areas. Such formations are of great concern in building and construction projects in Greenland, as they lose strength and bearing capacity upon thaw. It is therefore of both technical and economic interest to develop methods to precisely investigate and determine parameters such as ice content and depth to bedrock in these areas. In terms of geophysical methods for near surface investigations, traditional methods such as Electrical Resistivity Tomography (ERT) and Refraction Seismics (RS) have generally been applied with success. The Georadar method usually fails due to very limited penetration depth in the fine-grained materials, and Electromagnetic Induction (EMI) methods are seldom applicable for quantitative interpretation due to the very high resistivities causing low induced currents and thus small secondary fields. Nevertheless, in some areas of Greenland the marine sequence was exposed relatively late, and as a result the sediments may not be completely leached of salts. In such cases, layers with pore water salinity approaching that of sea water may be present below an upper layer of very ice rich permafrost. The saline pore water causes a freezing-point depression which results in technically unfrozen sediments at permafrost temperatures around -3 °C. Traditional ERT and VES measurements are severely affected by equivalency problems in these settings, practically prohibiting reasonable quantitative interpretation without constraining information. Such prior information may of course be obtained from boreholes, but equipment capable of drilling permafrozen sediments is generally not available in Greenland, and mobilization costs are therefore considerable, thus limiting the use of geotechnical borings to larger infrastructure and construction projects. To overcome these problems, we have tested the use of shallow Transient ElectroMagnetic (TEM) measurements to provide constraints in terms of depth to and resistivity of the conductive saline layer. We have tested such a setup at two field sites in the Ilulissat area (mid-west Greenland), one with available borehole information (site A), the second without (site C). VES and TEM soundings were collected at each site and the respective data sets subsequently inverted using a mutually constrained inversion scheme. At site A, the TEM measurements (20 x 20 m square loop, in-loop configuration) show substantial and repeatable negative amplitude segments, and therefore it has not presently been possible to provide a quantitative interpretation for this location. Negative segments are typically a sign of Induced Polarization or cultural effects. Forward modeling based on inversion of the VES data constrained with borehole information has indicated that IP effects could indeed be the cause of the observed anomaly, although such effects are not normally expected in permafrost or saline deposits. Data from site C have shown that jointly inverting the TEM and VES measurements does provide well determined estimates for all layer parameters except the thickness of the active layer and the resistivity of the bedrock. The active layer thickness may be easily probed to provide prior information on this parameter, and the bedrock resistivity is of limited interest in technical applications. Although no confirming borehole information is available at this site, these results indicate that joint or mutually constrained inversion of TEM and VES data is feasible and that this setup may provide a fast and cost effective method for establishing quantitative interpretations of permafrost structure in partly saline conditions.

  14. Evaluation of methods for quantitating erythrocyte antibodies and description of a new method using horseradish peroxidase-labelled antiglobulin.

    PubMed

    Greenwalt, T J; Steane, E A

    1980-01-01

    Concentrates of anti-D and rabbit anti-human globulin (AHG) prepared by standard elution and ammonium sulphate precipitation methods were labelled with 125I and 131I and horseradish peroxidase (HRP), respectively. The number of anti-D molecules attached per rbc ranged from 12,500 to 21,300 for the various phenotypes studied and the label of D-negative rbc never exceeded 3.4% of these values. With 131I-AHG the uptake by control D-negative cells averaged 13% of the uptake by D-positive cells. It was also found that the average ratio of AHG molecules reacting with each IgG molecule was between 2.6 and 3.3 in free solution regardless of the label but was between 3.4 and 7.5 on rbc and ghosts with 131I-AHG and 3 or lower with HRP-AHG. A colorimetric procedure for quantitating IgG antibodies on rbc ghosts is described using HRP-labelled-AHG and o-dianisidine as the hydrogen donor. The method is very sensitive and useful for the detection of coating antibodies but cannot be used for precise quantitation because about 50% of the IgG molecules on rbc are lost in preparing the ghosts. An AutoAnalyzer method for estimating the number of antibodies attached to red cells is briefly described. Direct measurements of numbers of antigen receptors with radiolabeled specific antibodies gave the most reproducible results. Labelled rabbit AHG was not as good because the ratio of AHG to IgG varied. The AutoAnalyzer method may prove useful because of its convenience. PMID:7019023

  15. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer’s Disease: Results from the DIAN Study Group

    PubMed Central

    Su, Yi; Blazey, Tyler M.; Owen, Christopher J.; Christensen, Jon J.; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C.; Ances, Beau M.; Snyder, Abraham Z.; Cash, Lisa A.; Koeppe, Robert A.; Klunk, William E.; Galasko, Douglas; Brickman, Adam M.; McDade, Eric; Ringman, John M.; Thompson, Paul M.; Saykin, Andrew J.; Ghetti, Bernardino; Sperling, Reisa A.; Johnson, Keith A.; Salloway, Stephen P.; Schofield, Peter R.; Masters, Colin L.; Villemagne, Victor L.; Fox, Nick C.; Förster, Stefan; Chen, Kewei; Reiman, Eric M.; Xiong, Chengjie; Marcus, Daniel S.; Weiner, Michael W.; Morris, John C.; Bateman, Randall J.; Benzinger, Tammie L. S.

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations to the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer’s Network (DIAN), an autosomal dominant Alzheimer’s disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improves sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in the detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although it may add cost and time. Several technical variations to amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer’s disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959
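
    As a minimal sketch of the reference-region step described above, the snippet below computes a standardized uptake value ratio (SUVR) from regional tracer retention values. The region names and numeric values are illustrative assumptions, not the DIAN pipeline or its data; partial volume correction is assumed to have been applied upstream.

    ```python
    import numpy as np

    def suvr(regional_uptake, target_regions, reference_region):
        """Standardized uptake value ratio: mean target retention / reference retention."""
        target = np.mean([regional_uptake[r] for r in target_regions])
        return target / regional_uptake[reference_region]

    # Hypothetical regional PiB retention values (arbitrary units).
    uptake = {"precuneus": 2.1, "prefrontal": 1.9, "gyrus_rectus": 1.7,
              "cerebellar_cortex": 1.0, "brainstem": 1.05}

    # SUVR of a cortical composite relative to the brainstem reference region.
    print(suvr(uptake, ["precuneus", "prefrontal", "gyrus_rectus"], "brainstem"))
    ```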

  16. Application of bias correction methods to improve the accuracy of quantitative radar rainfall in Korea

    NASA Astrophysics Data System (ADS)

    Lee, J.-K.; Kim, J.-H.; Suk, M.-K.

    2015-11-01

    There are many potential sources of bias in the radar rainfall estimation process. This study classified these biases into the reflectivity measurement bias and the rainfall estimation bias of the Quantitative Precipitation Estimation (QPE) model, and applied bias correction methods to improve the accuracy of the Radar-AWS Rainrate (RAR) calculation system operated by the Korea Meteorological Administration (KMA). For the Z bias, i.e., the reflectivity bias introduced when measuring rainfall, this study utilized a bias correction algorithm in which the reflectivity of the target single-pol radars is corrected against a reference dual-pol radar whose hardware and software biases have been corrected. The study then dealt with two post-processing methods, the Mean Field Bias Correction (MFBC) method and the Local Gauge Correction (LGC) method, to correct the rainfall estimation bias of the QPE model. The Z bias and rainfall estimation bias correction methods were applied to the RAR system, and the accuracy of the RAR system improved after correcting the Z bias. By rainfall type, the accuracy for the Changma-front and local torrential cases was slightly improved, whereas without the Z bias correction the accuracy of the typhoon cases in particular was worse than the existing results. For the rainfall estimation bias correction, the Z bias with LGC (Z bias_LGC) was clearly superior to the MFBC method, because the LGC method applies a different rainfall bias to each grid rainfall amount. By rainfall type, the Z bias_LGC results were more accurate than the Z bias correction alone for all types, and the outcomes for the typhoon cases in particular were vastly superior to the others.
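
    As a minimal sketch of the mean field bias idea referenced above, the snippet below derives a single multiplicative factor from collocated gauge/radar accumulations and applies it to the radar QPE grid (the LGC method would instead compute a spatially varying factor per grid cell). All values and variable names are illustrative, not KMA data.

    ```python
    import numpy as np

    def mean_field_bias(gauge, radar, eps=1e-6):
        """Single multiplicative factor: total gauge rainfall / total collocated radar rainfall."""
        return np.sum(gauge) / max(np.sum(radar), eps)

    # Collocated hourly accumulations (mm) at gauge locations (illustrative values).
    gauge = np.array([4.2, 7.9, 1.3, 10.5])
    radar = np.array([3.1, 6.0, 1.0, 8.2])

    bias = mean_field_bias(gauge, radar)
    radar_field = np.array([[2.0, 3.5], [0.7, 5.1]])   # radar QPE grid (mm)
    corrected_field = bias * radar_field                # MFB-corrected grid
    print(bias, corrected_field)
    ```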

  17. A simple quantitative method to study protein-lipopolysaccharide interactions by using liquid crystals.

    PubMed

    Das, Dibyendu; Sidiq, Sumyra; Pal, Santanu Kumar

    2015-03-16

    The interaction of proteins with endotoxins has divergent effects on lipopolysaccharide (LPS)-induced responses, which serve as a basis for many clinical and therapeutic applications. It is, therefore, important to understand these interactions from both theoretical and practical points of view. This paper advances the design of liquid crystal (LC)-based stimuli-responsive soft materials for quantitative measurements of LPS-protein binding events through interfacial ordering transitions. Micrometer-thick films of LCs undergo easily visualized ordering transitions in response to proteins at LPS-aqueous interfaces of the LCs. The optical response of the LC changes from dark to bright after aqueous solutions of hemoglobin (Hb), bovine serum albumin (BSA), and lysozyme (LZM) are brought into contact with an LPS-laden aqueous-LC interface. The effects of interactions of different proteins with LPS are also observed to cause the response of the LC to vary significantly from one protein to another; this indicates that manipulation of the protein-LPS binding affinity can provide the basis for a general, facile method to tune the LPS-induced responses of the LCs to interfacial phenomena. By measuring the optical retardation of the 4'-pentyl-4-cyanobiphenyl (5CB) LC, the binding affinity of the proteins (Hb, BSA, and LZM) towards LPS that leads to different orientational behavior at the aqueous interfaces of the LCs can be determined. The interaction of proteins with the LPS-laden monolayer is highest for LPS-Hb, followed by LPS-BSA, and least for LPS-LZM; this is consistent with the order of their binding constants (LPS-Hb>LPS-BSA>LPS-LZM). The results presented herein pave the way for quantitative and multiplexed measurements of LPS-protein binding events and reveal the potential of the LC system to be used as a quantitative, stimuli-responsive soft material. PMID:25572441

  18. Application of quality control planning methods for the improvement of a quantitative molecular assay.

    PubMed

    Shahsiah, Reza; Nili, Fatemeh; Ardalan, Farid Azmoudeh; Pourgholi, Fatemeh; Borumand, Mohammad Ali

    2013-11-01

    Hepatitis B virus (HBV) DNA measurement has an important role in the diagnosis and management of patients with chronic HBV infection. In cases of chronic hepatitis B, clinical decisions are based on either the absolute HBV DNA level or the relative change in HBV DNA level. To produce high-quality and comparable results, assay performance characteristics must be verified and statistical quality control methods must be planned. In this study, systematic and random error values in an assay of plasma HBV DNA were determined. Performance of the method was examined by employing a normalized operational process specifications (OPSpecs) chart. The systematic errors at low and high control levels were 0.33 and 0.22 log(IU/mL), respectively. At both levels, the standard deviation (SD) of the assay was 0.17 log(IU/mL). In addition, a single rule of 1(2.5SD) — rejection when a control measurement exceeds the mean ± 2.5 SD — with 2 control measurements was selected as a candidate quality control method. The assay performed well and was acceptable for clinical use. Further improvement may be attained by switching to automated purification methods. In this study, the well-established discipline of statistical quality control was applied to a real-time quantitative PCR. It was concluded that by employing statistical quality control (QC) methods, which utilize long-term controls, critical changes in the measurement system could be detected. PMID:23933079
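
    A minimal sketch of evaluating a single control rule of the 1(2.5SD) type against long-term control limits is shown below. The control statistics and run values are illustrative (only the SD of 0.17 log(IU/mL) is taken from the abstract); this is not the authors' OPSpecs-based planning procedure itself.

    ```python
    def qc_violation(measurements, mean, sd, k=2.5):
        """Flag a run if any control measurement falls outside mean ± k·SD."""
        return any(abs(x - mean) > k * sd for x in measurements)

    # Long-term control statistics in log(IU/mL); mean is illustrative, SD = 0.17 as reported.
    low_mean, low_sd = 3.0, 0.17
    run_controls = [3.1, 3.5]   # two control measurements for the current run

    print(qc_violation(run_controls, low_mean, low_sd))   # True -> reject/investigate the run
    ```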

  19. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    ERIC Educational Resources Information Center

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a particular…

  20. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    ERIC Educational Resources Information Center

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must

  1. Developing Investigative Entry Points: Exploring the Use of Quantitative Methods in English Education Research

    ERIC Educational Resources Information Center

    McGraner, Kristin L.; Robbins, Daniel

    2010-01-01

    Although many research questions in English education demand the use of qualitative methods, this paper will briefly explore how English education researchers and doctoral students may use statistics and quantitative methods to inform, complement, and/or deepen their inquiries. First, the authors will provide a general overview of the survey areas

  2. A method for the quantitative determination of crystalline phases by X-ray

    NASA Technical Reports Server (NTRS)

    Petzenhauser, I.; Jaeger, P.

    1988-01-01

    A mineral analysis method is described for rapid quantitative determination of crystalline substances in those cases in which the sample is present in pure form or in a mixture of known composition. With this method there is no need for prior chemical analysis.

  3. Comparison of Overlap Methods for Quantitatively Synthesizing Single-Subject Data

    ERIC Educational Resources Information Center

    Wolery, Mark; Busick, Matthew; Reichow, Brian; Barton, Erin E.

    2010-01-01

    Four overlap methods for quantitatively synthesizing single-subject data were compared to visual analysts' judgments. The overlap methods were percentage of nonoverlapping data, pairwise data overlap squared, percentage of data exceeding the median, and percentage of data exceeding a median trend. Visual analysts made judgments about 160 A-B data

  4. Student Performance in a Quantitative Methods Course under Online and Face-to-Face Delivery

    ERIC Educational Resources Information Center

    Verhoeven, Penny; Wakeling, Victor

    2011-01-01

    In a study conducted at a large public university, the authors assessed, for an upper-division quantitative methods business core course, the impact of delivery method (online versus face-to-face) on the success rate (percentage of enrolled students earning a grade of A, B, or C in the course). The success rate of the 161 online students was 55.3%,…

  5. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    ERIC Educational Resources Information Center

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…

  6. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  7. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods

    PubMed Central

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Dorssaeler, Alain Van; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2015-01-01

    This data article describes a controlled, spiked proteomic dataset for which the “ground truth” of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tool parameters, but also for testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in detail in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tool results for the detection of variant proteins with different absolute expression levels and fold change values. PMID:26862574

  8. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Dorssaeler, Alain Van; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-03-01

    This data article describes a controlled, spiked proteomic dataset for which the "ground truth" of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tool parameters, but also for testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in detail in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tool results for the detection of variant proteins with different absolute expression levels and fold change values. PMID:26862574

  9. An effective method for the quantitative detection of porcine endogenous retrovirus in pig tissues.

    PubMed

    Zhang, Peng; Yu, Ping; Wang, Wei; Zhang, Li; Li, Shengfu; Bu, Hong

    2010-05-01

    Xenotransplantation shows great promise for providing a virtually limitless supply of cells, tissues, and organs for a variety of therapeutic procedures. However, the potential of porcine endogenous retrovirus (PERV) as a human-tropic pathogen, particularly as a public health risk, is a major concern for xenotransplantation. This study focused on the detection of copy number in various tissues and organs of the Banna Minipig Inbred (BMI) line from 2006 to 2007 at West China Hospital, Sichuan University. Real-time quantitative polymerase chain reaction (SYBR Green I) was performed in this study. The results showed that the pol gene had the highest copy number in tissues compared with gag, envA, and envB. Our experiment offers a rapid and accurate method for determining the copy number in various tissues and is especially suitable for the selection of tissues or organs in future clinical xenotransplantation. PMID:20108128

  10. Comparative Application of PLS and PCR Methods to Simultaneous Quantitative Estimation and Simultaneous Dissolution Test of Zidovudine - Lamivudine Tablets.

    PubMed

    Üstündağ, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli

    2015-01-01

    In the development of new and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indicator for the quantitative estimation of the efficiency and biopharmaceutical characteristics of drug substances. This drives scientists in the field to develop powerful analytical methods that give more reliable, precise and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were developed for the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly encourage their use for quality control, routine analysis and dissolution testing of marketed tablets containing ZID and LAM. PMID:26085428
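
    The snippet below is a minimal sketch of the two chemometric calibrations named in the abstract, PLS and PCR, applied to multicomponent spectra so that both analytes are predicted simultaneously. The spectra and concentration arrays are random placeholders, not the authors' calibration set, and the component counts are assumptions.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    # X: absorbance spectra of calibration mixtures (n_samples x n_wavelengths)
    # Y: known concentrations of the two drugs (n_samples x 2), e.g. [ZID, LAM]
    rng = np.random.default_rng(0)
    X = rng.random((20, 150))          # placeholder spectra
    Y = rng.random((20, 2)) * 30       # placeholder concentrations

    pls = PLSRegression(n_components=3).fit(X, Y)
    pcr = make_pipeline(PCA(n_components=3), LinearRegression()).fit(X, Y)

    unknown = rng.random((1, 150))     # spectrum of a dissolution sample
    print(pls.predict(unknown), pcr.predict(unknown))
    ```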

  11. Evaluation of methods for oligonucleotide array data via quantitative real-time PCR

    PubMed Central

    Qin, Li-Xuan; Beyer, Richard P; Hudson, Francesca N; Linford, Nancy J; Morris, Daryl E; Kerr, Kathleen F

    2006-01-01

    Background There are currently many different methods for processing and summarizing probe-level data from Affymetrix oligonucleotide arrays. It is of great interest to validate these methods and identify those that are most effective. There is no single best way to do this validation, and a variety of approaches is needed. Moreover, gene expression data are collected to answer a variety of scientific questions, and the same method may not be best for all questions. Only a handful of validation studies have been done so far, most of which rely on spike-in datasets and focus on the question of detecting differential expression. Here we seek methods that excel at estimating relative expression. We evaluate methods by identifying those that give the strongest linear association between expression measurements by array and the "gold-standard" assay. Quantitative reverse-transcription polymerase chain reaction (qRT-PCR) is generally considered the "gold-standard" assay for measuring gene expression by biologists and is often used to confirm findings from microarray data. Here we use qRT-PCR measurements to validate methods for the components of processing oligo array data: background adjustment, normalization, mismatch adjustment, and probeset summary. An advantage of our approach over spike-in studies is that methods are validated on a real dataset that was collected to address a scientific question. Results We initially identify three of six popular methods that consistently produced the best agreement between oligo array and RT-PCR data for medium- and high-intensity genes. The three methods are generally known as MAS5, gcRMA, and the dChip mismatch mode. For medium- and high-intensity genes, we identified use of data from mismatch probes (as in MAS5 and dChip mismatch) and a sequence-based method of background adjustment (as in gcRMA) as the most important factors in methods' performances. However, we found poor reliability for methods using mismatch probes for low-intensity genes, which is in agreement with previous studies. Conclusion We advocate use of sequence-based background adjustment in lieu of mismatch adjustment to achieve the best results across the intensity spectrum. No method of normalization or probeset summary showed any consistent advantages. PMID:16417622
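
    A minimal sketch of the evaluation criterion described above: for each processing pipeline, score the linear association (Pearson correlation) between array expression measures and the qRT-PCR "gold standard" across genes. The log-expression values below are placeholders, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    def association_score(array_measure, qrtpcr_measure):
        """Pearson correlation between array and qRT-PCR log-expression across genes."""
        r, _ = pearsonr(array_measure, qrtpcr_measure)
        return r

    # Placeholder log-expression values for the same genes from two pipelines vs. qRT-PCR.
    qpcr  = np.array([1.2, 2.5, 3.1, 0.4, 2.0])
    mas5  = np.array([1.0, 2.7, 2.9, 0.6, 1.8])
    gcrma = np.array([1.1, 2.4, 3.3, 0.2, 2.1])
    print(association_score(mas5, qpcr), association_score(gcrma, qpcr))
    ```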

  12. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    In order to develop image processing that is widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron and nano-scale information which is characteristic of each rock type and its history. These could be used for mineral identification and studies in rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.

  13. Quantitative analysis and efficiency study of PSD methods for a LaBr3:Ce detector

    NASA Astrophysics Data System (ADS)

    Zeng, Ming; Cang, Jirong; Zeng, Zhi; Yue, Xiaoguang; Cheng, Jianping; Liu, Yinong; Ma, Hao; Li, Junli

    2016-03-01

    The LaBr3:Ce scintillator has been widely studied for nuclear spectroscopy because of its optimal energy resolution (<3% @ 662 keV) and time resolution (~300 ps). Despite these promising properties, the intrinsic radiation background of LaBr3:Ce is a critical issue, and pulse shape discrimination (PSD) has been shown to be an efficient potential method to suppress the alpha background from 227Ac. In this paper, the charge comparison method (CCM) for alpha and gamma discrimination in LaBr3:Ce is quantitatively analysed and compared with two other typical PSD methods using digital pulse processing. The algorithm parameters and discrimination efficiency are calculated for each method. Moreover, for the CCM, the correlation between the CCM feature value distribution and the total charge (energy) is studied, and a fitting equation for the correlation is inferred and experimentally verified. Using the equations, an energy-dependent threshold can be chosen to optimize the discrimination efficiency. Additionally, the experimental results show a potential application in low-activity high-energy γ measurement by suppressing the alpha background.
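
    A minimal sketch of the charge comparison method with an energy-dependent cut follows. The PSD feature is the ratio of the charge in a delayed ("tail") window to the total charge in the integration gate; the window positions and the threshold curve below are illustrative assumptions, not the fitted equation from the paper, and the sign of the cut depends on the detector response.

    ```python
    import numpy as np

    def ccm_feature(pulse, gate_start, tail_start, gate_end):
        """Charge comparison feature: tail charge / total charge over the integration gate."""
        total = np.sum(pulse[gate_start:gate_end])
        tail = np.sum(pulse[tail_start:gate_end])
        return tail / total

    # Illustrative energy-dependent threshold; in practice the curve is fitted to the
    # feature-vs-energy distribution of the detector.
    threshold = lambda energy_kev: 0.12 + 5.0 / np.sqrt(energy_kev)

    def classify(pulse, energy_kev, gate_start=0, tail_start=40, gate_end=200):
        feat = ccm_feature(pulse, gate_start, tail_start, gate_end)
        return "alpha-like" if feat > threshold(energy_kev) else "gamma-like"
    ```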

  14. Intracranial aneurysm segmentation in 3D CT angiography: method and quantitative validation

    NASA Astrophysics Data System (ADS)

    Firouzian, Azadeh; Manniesing, R.; Flach, Z. H.; Risselada, R.; van Kooten, F.; Sturkenboom, M. C. J. M.; van der Lugt, A.; Niessen, W. J.

    2010-03-01

    Accurately quantifying aneurysm shape parameters is of clinical importance, as it is an important factor in choosing the right treatment modality (i.e. coiling or clipping), in predicting rupture risk and operative risk and for pre-surgical planning. The first step in aneurysm quantification is to segment it from other structures that are present in the image. As manual segmentation is a tedious procedure and prone to inter- and intra-observer variability, there is a need for an automated method which is accurate and reproducible. In this paper a novel semi-automated method for segmenting aneurysms in Computed Tomography Angiography (CTA) data based on Geodesic Active Contours is presented and quantitatively evaluated. Three different image features are used to steer the level set to the boundary of the aneurysm, namely intensity, gradient magnitude and variance in intensity. The method requires minimum user interaction, i.e. clicking a single seed point inside the aneurysm which is used to estimate the vessel intensity distribution and to initialize the level set. The results show that the developed method is reproducible, and performs in the range of interobserver variability in terms of accuracy.

  15. Quantitative methods for reconstructing tissue biomechanical properties in optical coherence elastography: a comparison study

    NASA Astrophysics Data System (ADS)

    Han, Zhaolong; Li, Jiasong; Singh, Manmohan; Wu, Chen; Liu, Chih-hao; Wang, Shang; Idugboe, Rita; Raghunathan, Raksha; Sudheendran, Narendran; Aglyamov, Salavat R.; Twa, Michael D.; Larin, Kirill V.

    2015-05-01

    We present a systematic analysis of the accuracy of five different methods for extracting the biomechanical properties of soft samples using optical coherence elastography (OCE). OCE is an emerging noninvasive technique, which allows assessment of biomechanical properties of tissues with micrometer spatial resolution. However, in order to accurately extract biomechanical properties from OCE measurements, application of a proper mechanical model is required. In this study, we utilize tissue-mimicking phantoms with controlled elastic properties and investigate the feasibility of the available methods for reconstructing elasticity (Young's modulus) based on OCE measurements of an air-pulse induced elastic wave. The approaches are based on the shear wave equation (SWE), the surface wave equation (SuWE), the Rayleigh-Lamb frequency equation (RLFE), and the finite element method (FEM). Elasticity values were compared with uniaxial mechanical testing. The results show that the RLFE and the FEM are more robust in quantitatively assessing elasticity than the other simplified models. This study provides a foundation and reference for reconstructing the biomechanical properties of tissues from OCE data, which is important for the further development of noninvasive elastography methods.
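
    As a minimal sketch of the simplest of the listed models, the shear wave relation gives Young's modulus from the measured wave speed and sample density, assuming a nearly incompressible material (Poisson's ratio close to 0.5). The numbers are illustrative phantom-like values, not the paper's measurements.

    ```python
    def youngs_modulus_swe(density, wave_speed, poisson=0.499):
        """Shear-wave estimate of Young's modulus: E = 2 * rho * (1 + nu) * c^2 (Pa)."""
        return 2.0 * density * (1.0 + poisson) * wave_speed ** 2

    # Illustrative tissue-mimicking phantom values: density ~1000 kg/m^3, wave speed ~2 m/s.
    print(youngs_modulus_swe(1000.0, 2.0))   # ≈ 12 kPa
    ```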

  16. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    PubMed

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The employed method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability (precision) were assessed. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has been proven to be efficient for the extraction and determination of azoxystrobin residues in green beans and peas. PMID:25842334

  17. Comparison of Concentration Methods for Quantitative Detection of Sewage-Associated Viral Markers in Environmental Waters

    PubMed Central

    Harwood, V. J.; Gyawali, P.; Sidhu, J. P. S.; Toze, S.

    2015-01-01

    Pathogenic human viruses cause over half of gastroenteritis cases associated with recreational water use worldwide. They are relatively difficult to concentrate from environmental waters due to typically low concentrations and their small size. Although rapid enumeration of viruses by quantitative PCR (qPCR) has the potential to greatly improve water quality analysis and risk assessment, the upstream steps of capturing and recovering viruses from environmental water sources along with removing PCR inhibitors from extracted nucleic acids remain formidable barriers to routine use. Here, we compared the efficiency of virus recovery for three rapid methods of concentrating two microbial source tracking (MST) viral markers, human adenoviruses (HAdVs) and polyomaviruses (HPyVs), from one-liter tap water and river water samples on HA membranes (90 mm in diameter). Samples were spiked with raw sewage, and viral adsorption to membranes was promoted by acidification (method A) or addition of MgCl2 (methods B and C). Viral nucleic acid was extracted directly from membranes (method A), or viruses were eluted with NaOH and concentrated by centrifugal ultrafiltration (methods B and C). No inhibition of qPCR was observed for samples processed by method A, but inhibition occurred in river samples processed by B and C. Recovery efficiencies of HAdVs and HPyVs were ∼10-fold greater for method A (31 to 78%) than for methods B and C (2.4 to 12%). Further analysis of membranes from method B revealed that the majority of viruses were not eluted from the membrane, resulting in poor recovery. The modification of the originally published method A to include a larger diameter membrane and a nucleic acid extraction kit that could accommodate the membrane resulted in a rapid virus concentration method with good recovery and lack of inhibitory compounds. The frequently used strategy of viral adsorption with added cations (Mg2+) and elution with acid was inefficient and more prone to inhibition, and will result in underestimation of the prevalence and concentrations of HAdV and HPyV markers in environmental waters. PMID:25576614

  18. Comparison of concentration methods for quantitative detection of sewage-associated viral markers in environmental waters.

    PubMed

    Ahmed, W; Harwood, V J; Gyawali, P; Sidhu, J P S; Toze, S

    2015-03-01

    Pathogenic human viruses cause over half of gastroenteritis cases associated with recreational water use worldwide. They are relatively difficult to concentrate from environmental waters due to typically low concentrations and their small size. Although rapid enumeration of viruses by quantitative PCR (qPCR) has the potential to greatly improve water quality analysis and risk assessment, the upstream steps of capturing and recovering viruses from environmental water sources along with removing PCR inhibitors from extracted nucleic acids remain formidable barriers to routine use. Here, we compared the efficiency of virus recovery for three rapid methods of concentrating two microbial source tracking (MST) viral markers, human adenoviruses (HAdVs) and polyomaviruses (HPyVs), from one-liter tap water and river water samples on HA membranes (90 mm in diameter). Samples were spiked with raw sewage, and viral adsorption to membranes was promoted by acidification (method A) or addition of MgCl2 (methods B and C). Viral nucleic acid was extracted directly from membranes (method A), or viruses were eluted with NaOH and concentrated by centrifugal ultrafiltration (methods B and C). No inhibition of qPCR was observed for samples processed by method A, but inhibition occurred in river samples processed by B and C. Recovery efficiencies of HAdVs and HPyVs were ~10-fold greater for method A (31 to 78%) than for methods B and C (2.4 to 12%). Further analysis of membranes from method B revealed that the majority of viruses were not eluted from the membrane, resulting in poor recovery. The modification of the originally published method A to include a larger diameter membrane and a nucleic acid extraction kit that could accommodate the membrane resulted in a rapid virus concentration method with good recovery and lack of inhibitory compounds. The frequently used strategy of viral adsorption with added cations (Mg(2+)) and elution with acid was inefficient and more prone to inhibition, and will result in underestimation of the prevalence and concentrations of HAdV and HPyV markers in environmental waters. PMID:25576614

  19. Evaluation of quantitative methods for the determination of polyphenols in algal extracts.

    PubMed

    Parys, Sabine; Rosenbaum, Anne; Kehraus, Stefan; Reher, Gerrit; Glombitza, Karl-Werner; König, Gabriele M

    2007-12-01

    Marine brown algae such as Ascophyllum nodosum and Fucus vesiculosus accumulate polyphenols composed of phloroglucinol units. These compounds are of ecological importance and, due to their antioxidative activity, of pharmacological value as well. In this study four methods for the quantitative determination of phlorotannins are compared: spectrophotometric determinations using Folin-Ciocalteu's phenol reagent or 2,4-dimethoxybenzaldehyde (DMBA), quantitative (1)H NMR spectroscopy (qHNMR), and gravimetric measurements. On the basis of the relative standard deviation and the F-test, the determinations using Folin-Ciocalteu's phenol reagent and qHNMR proved to be the most reliable and precise methods. PMID:18052031
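
    The snippet below is a minimal sketch of the precision comparison named in the abstract: the relative standard deviation of replicate determinations and an F-test comparing the variances of two methods. The replicate values are placeholders, not the study's measurements.

    ```python
    import numpy as np
    from scipy import stats

    def rsd(values):
        """Relative standard deviation (%) of replicate determinations."""
        return 100.0 * np.std(values, ddof=1) / np.mean(values)

    def f_test(values_a, values_b):
        """Two-sided F-test comparing the variances of two methods."""
        var_a, var_b = np.var(values_a, ddof=1), np.var(values_b, ddof=1)
        f = max(var_a, var_b) / min(var_a, var_b)
        dfn = len(values_a) - 1 if var_a >= var_b else len(values_b) - 1
        dfd = len(values_b) - 1 if var_a >= var_b else len(values_a) - 1
        p = 2.0 * stats.f.sf(f, dfn, dfd)
        return f, min(p, 1.0)

    # Placeholder replicate phlorotannin contents (% dry weight) from two methods.
    folin = [5.1, 5.3, 5.0, 5.2]
    qhnmr = [5.4, 5.0, 5.6, 4.9]
    print(rsd(folin), rsd(qhnmr), f_test(folin, qhnmr))
    ```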

  20. Rapid method for protein quantitation by Bradford assay after elimination of the interference of polysorbate 80.

    PubMed

    Cheng, Yongfeng; Wei, Haiming; Sun, Rui; Tian, Zhigang; Zheng, Xiaodong

    2016-02-01

    Bradford assay is one of the most common methods for measuring protein concentrations. However, some pharmaceutical excipients, such as detergents, interfere with the Bradford assay even at low concentrations. Protein precipitation can be used to overcome sample incompatibility with protein quantitation, but the rate of protein recovery after acetone precipitation is only about 70%. In this study, we found that sucrose not only increased the rate of protein recovery after 1 h acetone precipitation, but also did not interfere with the Bradford assay. We therefore developed a method for rapid protein quantitation in protein drugs even when they contain interfering substances. PMID:26545323

  1. Comparative assessment of fluorescent transgene methods for quantitative imaging in human cells.

    PubMed

    Mahen, Robert; Koch, Birgit; Wachsmuth, Malte; Politi, Antonio Z; Perez-Gonzalez, Alexis; Mergenthaler, Julia; Cai, Yin; Ellenberg, Jan

    2014-11-01

    Fluorescence tagging of proteins is a widely used tool to study protein function and dynamics in live cells. However, the extent to which different mammalian transgene methods faithfully report on the properties of endogenous proteins has not been studied comparatively. Here we use quantitative live-cell imaging and single-molecule spectroscopy to analyze how different transgene systems affect imaging of the functional properties of the mitotic kinase Aurora B. We show that the transgene method fundamentally influences level and variability of expression and can severely compromise the ability to report on endogenous binding and localization parameters, providing a guide for quantitative imaging studies in mammalian cells. PMID:25232003

  2. Simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and dead reckoning

    NASA Astrophysics Data System (ADS)

    Davey, Neil S.; Godil, Haris

    2013-05-01

    This article presents a comparative study between a well-known SLAM (Simultaneous Localization and Mapping) algorithm, called Gmapping, and a standard Dead-Reckoning algorithm; the study is based on experimental results of both approaches by using a commercial skid-based turning robot, P3DX. Five main base-case scenarios are conducted to evaluate and test the effectiveness of both algorithms. The results show that SLAM outperformed the Dead Reckoning in terms of map-making accuracy in all scenarios but one, since SLAM did not work well in a rapidly changing environment. Although the main conclusion about the excellence of SLAM is not surprising, the presented test method is valuable to professionals working in this area of mobile robots, as it is highly practical, and provides solid and valuable results. The novelty of this study lies in its simplicity. The simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and Dead Reckoning and some applications using autonomous robots are being patented by the authors in U.S. Patent Application Nos. 13/400,726 and 13/584,862.

  3. A quantitative study of motion estimation methods on 4D cardiac gated SPECT reconstruction

    PubMed Central

    Qi, Wenyuan; Yang, Yongyi; Niu, Xiaofeng; King, Michael A.

    2012-01-01

    Purpose: Motion-compensated temporal processing can have a major impact on improving the image quality in gated cardiac single photon emission computed tomography (SPECT). In this work, we investigate the effect of different optical flow estimation methods for motion-compensated temporal processing in gated SPECT. In particular, we explore whether better motion estimation can substantially improve reconstructed image quality, and how the estimated motion would compare to the ideal case of known motion in terms of reconstruction. Methods: We consider the following three methods for obtaining the image motion in 4D reconstruction: (1) the Horn-Schunck optical flow equation (OFE) method, (2) a recently developed periodic OFE method, and (3) known cardiac motion derived from the NURBS-based cardiac-torso (NCAT) phantom. The periodic OFE method is used to exploit the inherent periodic nature in cardiac gated images. In this method, the optical flow in a sequence is modeled by a Fourier harmonic representation, which is then estimated from the image data. We study the impact of temporal processing on 4D reconstructions when the image motion is obtained with the different methods above. For quantitative evaluation, we use simulated imaging with multiple noise realizations from the NCAT phantom, where different patient geometry and lesion sizes are also considered. To quantify the reconstruction results, we use the following measures of reconstruction accuracy and defect detection in the myocardium: (1) overall error level in the myocardium, (2) regional accuracy of the left ventricle (LV) wall, (3) accuracy of regional time activity curves of the LV, and (4) perfusion defect detectability with a channelized Hotelling observer (CHO). In addition, we also examine the effect of noise on the distortion in the reconstructed LV wall shape by detecting its contours. As a preliminary demonstration, these methods are also tested on two sets of clinical acquisitions. Results: For the different quantitative measures considered, the periodic OFE further improved the reconstruction accuracy of the myocardium compared to OFE in 4D reconstruction; its improvement in reconstruction almost matched that of the known motion. Specifically, the overall mean-squared error in the myocardium was reduced by over 20% with periodic OFE; with noise level fixed at 10%, the regional bias on the LV was reduced from 20% (OFE) to 14% (periodic OFE), compared to 11% by the known motion. In addition, the CHO results show that there was also improvement in lesion detectability with the periodic OFE. The regional time activity curves obtained with the periodic OFE were also observed to be more consistent with the reference; in addition, the contours of the reconstructed LV wall with the periodic OFE were demonstrated to show a lesser degree of variation among different noise realizations. Such improvements were also consistent with the results obtained from the clinical acquisitions. Conclusions: Use of improved optical flow estimation can further improve the accuracy of reconstructed images in 4D. The periodic OFE method not only can achieve improvements over the traditional OFE, but also can almost match that of the known motion in terms of the several quality measures considered. PMID:22894443
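
    For reference, the snippet below is a minimal 2-D sketch of the classical Horn-Schunck optical flow estimator named above, i.e., the textbook iteration between two frames; it is not the authors' 4-D periodic variant, and the smoothness weight and iteration count are assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
        """Classical Horn-Schunck optical flow between two frames (dense u, v fields)."""
        im1, im2 = im1.astype(float), im2.astype(float)
        # Spatial and temporal derivatives (simple finite differences averaged over both frames).
        kx = np.array([[-1, 1], [-1, 1]]) * 0.25
        ky = np.array([[-1, -1], [1, 1]]) * 0.25
        Ix = convolve(im1, kx) + convolve(im2, kx)
        Iy = convolve(im1, ky) + convolve(im2, ky)
        It = convolve(im2 - im1, np.full((2, 2), 0.25))
        # Neighborhood-average kernel used in the iterative update.
        avg = np.array([[1, 2, 1], [2, 0, 2], [1, 2, 1]]) / 12.0
        u = np.zeros_like(im1)
        v = np.zeros_like(im1)
        for _ in range(n_iter):
            u_avg, v_avg = convolve(u, avg), convolve(v, avg)
            common = (Ix * u_avg + Iy * v_avg + It) / (alpha ** 2 + Ix ** 2 + Iy ** 2)
            u = u_avg - Ix * common
            v = v_avg - Iy * common
        return u, v
    ```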

  4. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-03-23

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed. PMID:26928571

  5. Quantitative evaluation of besifloxacin ophthalmic suspension by HPLC, application to bioassay method and cytotoxicity studies.

    PubMed

    Costa, Márcia C N; Barden, Amanda T; Andrade, Juliana M M; Oppe, Tércio P; Schapoval, Elfrides E S

    2014-02-01

    Besifloxacin (BSF) is a synthetic chiral fluoroquinolone developed for the topical treatment of ophthalmic infections. The present study reports the development and validation of a microbiological assay, applying the cylinder-plate method, for determination of BSF in ophthalmic suspension. To assess this methodology, the development and validation of the method was performed for the quantification of BSF by high performance liquid chromatography (HPLC). The HPLC method showed specificity, linearity in the range of 20-80 µg mL(-1) (r=0.9998), precision, accuracy and robustness. The microbiological method is based on the inhibitory effect of BSF upon the strain of Staphylococcus epidermidis ATCC 12228 used as a test microorganism. The bioassay validation method yielded excellent results and included linearity, precision, accuracy, robustness and selectivity. The assay results were treated statistically by analysis of variance (ANOVA) and were found to be linear (r=0.9974) in the range of 0.5-2.0 µg mL(-1), precise (inter-assay: RSD=0.84), accurate (101.4%), specific and robust. The bioassay and the previously validated high performance liquid chromatographic (HPLC) method were compared using Student's t test, which indicated that there was no statistically significant difference between these two methods. These results confirm that the proposed microbiological method can be used for routine analysis and the quantitative determination of BSF in an ophthalmic suspension. A preliminary stability study during the HPLC validation was performed and demonstrated that BSF is unstable under UV conditions. The photodegradation kinetics of BSF in water showed a first-order reaction for the drug product (ophthalmic suspension) and a second-order reaction for the reference standard (RS) under UVA light. UVA degraded samples of BSF were also studied in order to determine the preliminary in vitro cytotoxicity against mononuclear cells. The results indicated that BSF does not alter the cell membrane and has been considered non-toxic to human mononuclear cells in the experimental conditions tested. PMID:24401427

  6. Quantitative evaluation of linear and nonlinear methods characterizing interdependencies between brain signals.

    PubMed

    Ansari-Asl, Karim; Senhadji, Lotfi; Bellanger, Jean-Jacques; Wendling, Fabrice

    2006-09-01

    Brain functional connectivity can be characterized by the temporal evolution of correlation between signals recorded from spatially-distributed regions. It is aimed at explaining how different brain areas interact within networks involved during normal (as in cognitive tasks) or pathological (as in epilepsy) situations. Numerous techniques were introduced for assessing this connectivity. Recently, some efforts were made to compare methods performances but mainly qualitatively and for a special application. In this paper, we go further and propose a comprehensive comparison of different classes of methods (linear and nonlinear regressions, phase synchronization, and generalized synchronization) based on various simulation models. For this purpose, quantitative criteria are used: in addition to mean square error under null hypothesis (independence between two signals) and mean variance computed over all values of coupling degree in each model, we provide a criterion for comparing performances. Results show that the performances of the compared methods are highly dependent on the hypothesis regarding the underlying model for the generation of the signals. Moreover, none of them outperforms the others in all cases and the performance hierarchy is model dependent. PMID:17025676

  7. Quantitative comparison of reconstruction methods for intra-voxel fiber recovery from diffusion MRI.

    PubMed

    Daducci, Alessandro; Canales-Rodríguez, Erick Jorge; Descoteaux, Maxime; Garyfallidis, Eleftherios; Gur, Yaniv; Lin, Ying-Chia; Mani, Merry; Merlet, Sylvain; Paquette, Michael; Ramirez-Manzanares, Alonso; Reisert, Marco; Reis Rodrigues, Paulo; Sepehrband, Farshid; Caruyer, Emmanuel; Choupan, Jeiran; Deriche, Rachid; Jacob, Mathews; Menegaz, Gloria; Prčkovska, Vesna; Rivera, Mariano; Wiaux, Yves; Thiran, Jean-Philippe

    2014-02-01

    Validation is arguably the bottleneck in the diffusion magnetic resonance imaging (MRI) community. This paper evaluates and compares 20 algorithms for recovering the local intra-voxel fiber structure from diffusion MRI data and is based on the results of the "HARDI reconstruction challenge" organized in the context of the "ISBI 2012" conference. Evaluated methods encompass a mixture of classical techniques well known in the literature such as diffusion tensor, Q-Ball and diffusion spectrum imaging, algorithms inspired by the recent theory of compressed sensing and also brand new approaches proposed for the first time at this contest. To quantitatively compare the methods under controlled conditions, two datasets with known ground-truth were synthetically generated and two main criteria were used to evaluate the quality of the reconstructions in every voxel: correct assessment of the number of fiber populations and angular accuracy in their orientation. This comparative study investigates the behavior of every algorithm with varying experimental conditions and highlights strengths and weaknesses of each approach. This information can be useful not only for enhancing current algorithms and develop the next generation of reconstruction methods, but also to assist physicians in the choice of the most adequate technique for their studies. PMID:24132007
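
    A minimal sketch of the angular-accuracy criterion described above follows: the error is the smallest angle between an estimated fiber direction and any ground-truth direction, treating fibers as axes (sign-invariant). The example vectors are illustrative, not challenge data.

    ```python
    import numpy as np

    def angular_error_deg(estimated, ground_truth):
        """Smallest angle (degrees) between an estimated fiber axis and any true axis."""
        e = estimated / np.linalg.norm(estimated)
        angles = []
        for g in ground_truth:
            g = g / np.linalg.norm(g)
            cos = np.clip(abs(np.dot(e, g)), 0.0, 1.0)   # |dot|: fibers are axes, not vectors
            angles.append(np.degrees(np.arccos(cos)))
        return min(angles)

    true_fibers = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
    print(angular_error_deg(np.array([0.96, 0.05, 0.0]), true_fibers))   # ≈ 3 degrees
    ```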

  8. Quantitative evaluation of linear and nonlinear methods characterizing interdependencies between brain signals

    PubMed Central

    Ansari-Asl, Karim; Senhadji, Lotfi; Bellanger, Jean-Jacques; Wendling, Fabrice

    2006-01-01

    Brain functional connectivity can be characterized by the temporal evolution of correlation between signals recorded from spatially-distributed regions. It is aimed at explaining how different brain areas interact within networks involved during normal (as in cognitive tasks) or pathological (as in epilepsy) situations. Numerous techniques were introduced for assessing this connectivity. Recently, some efforts were made to compare methods performances but mainly qualitatively and for a special application. In this paper, we go further and propose a comprehensive comparison of different classes of methods (linear and nonlinear regressions, phase synchronization (PS), and generalized synchronization (GS)) based on various simulation models. For this purpose, quantitative criteria are used: in addition to mean square error (MSE) under null hypothesis (independence between two signals) and mean variance (MV) computed over all values of coupling degree in each model, we introduce a new criterion for comparing performances. Results show that the performances of the compared methods are highly dependent on the hypothesis regarding the underlying model for the generation of the signals. Moreover, none of them outperforms the others in all cases and the performance hierarchy is model-dependent. PMID:17025676

  9. Quantitative fluorescence method for continuous measurement of DNA hybridization kinetics using a fluorescent intercalator.

    PubMed

    Yguerabide, J; Ceballos, A

    1995-07-01

    We present a quantitative fluorescence method for continuous measurement of DNA or RNA hybridization (including renaturation) kinetics using a fluorescent DNA intercalator. The method has high sensitivity and can be used with reaction volumes as small as 1 microliter and amounts of DNA around 1 ng. The method is based on the observations that (i) for the usual hybridization conditions, intercalators such as ethidium bromide bind (intercalate) to double-stranded DNA (dsDNA) but not single-stranded DNA or RNA and (ii) there is a large increase in fluorescence intensity when intercalators such as ethidium bromide bind to dsDNA. In this application, the intercalator can be considered as a quantitative indicator of dsDNA concentration. When a small amount of intercalator is added to a hybridizing solution, the fluorescence intensity of the intercalators increases with increase in dsDNA. The hybridization reaction can thus be monitored by continuously recording fluorescence intensity vs time. Because the amount of intercalator bound to dsDNA is not necessarily proportional to dsDNA concentration, the time-dependent fluorescence intensity graph is not identical to the kinetic graph [dsDNA] vs t. However, the fluorescence intensity vs time graph can easily be converted to the true [dsDNA] vs t graph by means of an experimental calibration graph of fluorescence intensity vs [dsDNA]. This calibration graph is obtained in a separate experiment using samples containing known amounts of dsDNA in the ethidium bromide buffer used in the kinetic measurement. We present results of experimental tests of the intercalator technique using ethidium bromide as an intercalator and DNA from Escherichia coli and lambda-phage and Poly(I)-Poly(C) RNA hybrids. These DNA and RNA samples have Cot1/2 values that cover a range of 10(6). Our experimental results show that (i) the kinetics of hybridization are not significantly perturbed by the intercalator at concentrations where no more than 10% of the binding sites on DNA or RNA hybrids are occupied, (ii) the kinetic graphs obtained by the intercalator fluorescence method and corrected with the calibration graph agree with kinetic graphs obtained by optical absorbance measurements at 260 nm, and (iii) the intercalator technique can be used in the different salt environments often used to increase the velocity of the hybridization reaction and at the hybridization temperatures (35-75 degrees C) normally used to minimize nonspecific hybridization. PMID:8572297
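
    A minimal sketch of the conversion step described above follows: the recorded fluorescence-intensity-vs-time trace is mapped to a [dsDNA]-vs-time curve through the separately measured calibration graph. The calibration points and trace below are placeholders, not the paper's data, and simple monotone interpolation is assumed.

    ```python
    import numpy as np

    # Calibration graph: fluorescence intensity measured for known dsDNA amounts (ng/µL).
    cal_dsdna = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
    cal_intensity = np.array([2.0, 55.0, 100.0, 180.0, 310.0])   # placeholder, non-linear

    def dsdna_from_intensity(intensity_trace):
        """Map a fluorescence-vs-time trace to [dsDNA]-vs-time via the calibration curve."""
        # np.interp needs increasing x values; intensity increases monotonically with dsDNA here.
        return np.interp(intensity_trace, cal_intensity, cal_dsdna)

    time = np.linspace(0.0, 600.0, 7)   # seconds
    intensity = np.array([2.0, 40.0, 90.0, 140.0, 200.0, 260.0, 300.0])
    print(np.column_stack([time, dsdna_from_intensity(intensity)]))
    ```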

  10. A multiplex lectin-channel monitoring method for human serum glycoproteins by quantitative mass spectrometry.

    PubMed

    Ahn, Yeong Hee; Ji, Eun Sun; Shin, Park Min; Kim, Kwang Hoe; Kim, Yong-Sam; Ko, Jeong Heon; Yoo, Jong Shin

    2012-02-01

    A mass profiling method and a multiple reaction monitoring (MRM)-based quantitative approach were used to analyze multiple lectin-captured fractions of human serum using different lectins such as Aleuria aurantia lectin (AAL), phytohemagglutinin-L(4) (L-PHA), concanavalin A (Con A), and Datura stramonium agglutinin (DSA) to quantitatively monitor protein glycosylation diversity. Each fraction, prepared by multiple lectin fractionation and tryptic digestion, was analyzed by 1-D LC-MS/MS. Semi-quantitative profiling showed that the list of glycoproteins identified from each lectin-captured fraction differs significantly according to the lectin used. Thus, it was confirmed that multiplex lectin-channel monitoring (LCM) using multiple lectins is useful for investigating protein glycosylation diversity in a proteome sample. Based on the semi-quantitative mass profiling, target proteins showing lectin specificity among the lectin-captured fractions were selected and analyzed by the MRM-based method in triplicate using each lectin-captured fraction (average CV 7.9%). The MRM-based analyses for each lectin-captured fraction were similar to those obtained by the profiling experiments. The abundance of each target protein varied dramatically, based on lectin specificity. The multiplex LCM approach using MRM-based analyses is useful for quantitatively monitoring target protein glycoforms selectively fractionated by multiple lectins. Thus, through multiplex rather than single-lectin LCM, protein glycosylation states can be examined in detail. PMID:22158852

  11. Meta-analysis of results from quantitative trait loci mapping studies on pig chromosome 4.

    PubMed

    Silva, K M; Bastiaansen, J W M; Knol, E F; Merks, J W M; Lopes, P S; Guimarães, S E F; van Arendonk, J A M

    2011-06-01

    Meta-analysis of results from multiple studies could lead to more precise quantitative trait loci (QTL) position estimates compared to the individual experiments. As the raw data from many different studies are not readily available, the use of results from published articles may be helpful. In this study, we performed a meta-analysis of QTL on chromosome 4 in pig, using data from 25 separate experiments. First, a meta-analysis was performed for individual traits: average daily gain and backfat thickness. Second, a meta-analysis was performed for the QTL of three traits affecting loin yield: loin eye area, carcass length and loin meat weight. Third, 78 QTL were selected from 20 traits that could be assigned to one of three broad categories: carcass, fatness or growth traits. For each analysis, the number of identified meta-QTL was smaller than the number of initial QTL. The reduction in the number of QTL ranged from 71% to 86% compared to the total number before the meta-analysis. In addition, the meta-analysis reduced the QTL confidence intervals by as much as 85% compared to individual QTL estimates. The reduction in the confidence interval was greater when a large number of independent QTL was included in the meta-analysis. Meta-QTL related to growth and fatness were found in the same region as the FAT1 region. Results indicate that the meta-analysis is an efficient strategy to estimate the number and refine the positions of QTL when QTL estimates are available from multiple populations and experiments. This strategy can be used to better target further studies such as the selection of candidate genes related to trait variation. PMID:21198696

  12. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  13. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  14. The Usefulness of Qualitative and Quantitative Methods: Studying Incest in America.

    ERIC Educational Resources Information Center

    Phelan, Patricia

    The purpose of this study is to illustrate how the combined use of qualitative and quantitative methods was necessary to obtain, within this society, a clearer understanding of incest. The paper opens with a report of studies carried out on natural-father and stepfather incestuous families, and this opens up the issue of the appropriateness of…

  15. Qualitative and Quantitative Research Methods: Old Wine in New Bottles? On Understanding and Interpreting Educational Phenomena

    ERIC Educational Resources Information Center

    Smeyers, Paul

    2008-01-01

    Generally educational research is grounded in the empirical traditions of the social sciences (commonly called quantitative and qualitative methods) and is as such distinguished from other forms of scholarship such as theoretical, conceptual or methodological essays, critiques of research traditions and practices and those studies grounded in the…

  16. A GC-FID method for quantitative analysis of N,N-carbonyldiimidazole.

    PubMed

    Lee, Claire; Mangion, Ian

    2016-03-20

    N,N-Carbonyldiimidazole (CDI), a common synthetic reagent used in commercial scale pharmaceutical synthesis, is known to be sensitive to hydrolysis from ambient moisture. This liability demands a simple, robust analytical method to quantitatively determine reagent quality to ensure reproducible performance in chemical reactions. This work describes a protocol for a rapid GC-FID based analysis of CDI. PMID:26773533

  17. A simple method for quantitative diagnosis of small hive beetles, Aethina tumida, in the field

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Here we present a simple and fast method for quantitative diagnosis of small hive beetles (= SHB) in honeybee field colonies using corrugated plastic diagnostic-strips. In Australia, we evaluated its efficacy by comparing the number of lured SHB with the total number of beetles in the hives. The d...

  18. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  2. Paradigms Lost and Pragmatism Regained: Methodological Implications of Combining Qualitative and Quantitative Methods

    ERIC Educational Resources Information Center

    Morgan, David L.

    2007-01-01

    This article examines several methodological issues associated with combining qualitative and quantitative methods by comparing the increasing interest in this topic with the earlier renewal of interest in qualitative research during the 1980s. The first section argues for the value of Kuhn's concept of paradigm shifts as a tool for examining

  3. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  5. Potential Guidelines for Conducting and Reporting Environmental Education Research: Quantitative Methods of Inquiry.

    ERIC Educational Resources Information Center

    Smith-Sebasto, N. J.

    2001-01-01

    Presents potential guidelines for conducting and reporting environmental education research using quantitative methods of inquiry that were developed during a 10-hour (1-1/2 day) workshop sponsored by the North American Commission on Environmental Education Research during the 1998 annual meeting of the North American Association for Environmental…

  6. Improved GC/MS method for quantitation of n-Alkanes in plant and fecal material

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A gas chromatography-mass spectrometry (GC/MS) method for the quantitation of n-alkanes (carbon backbones ranging from 21 to 36 carbon atoms) in forage and fecal samples has been developed. Automated solid-liquid extraction using elevated temperature and pressure minimized extraction time to 30 min...

  7. Examination of Quantitative Methods Used in Early Intervention Research: Linkages with Recommended Practices.

    ERIC Educational Resources Information Center

    Snyder, Patricia; Thompson, Bruce; McLean, Mary E.; Smith, Barbara J.

    2002-01-01

    Findings are reported related to the research methods and statistical techniques used in 450 group quantitative studies examined by the Council for Exceptional Children's Division for Early Childhood Recommended Practices Project. Studies were analyzed across seven dimensions including sampling procedures, variable selection, variable definition,…

  8. [Development of a quantitative analysis method for determination of the alkaloid cytisine in Spartium junceum L. growing in Georgia].

    PubMed

    Iavich, P A; Churadze, L I; Suladze, T Sh; Rukhadze, T A

    2011-12-01

    The aim of the research was to develop a method for the quantitative determination of cytisine in Spartium junceum L., using the above-ground parts of the plant. The analytical method was built around a three-phase extraction; the best results were obtained with the system chopped raw material - aqueous ammonia solution - chloroform, in which the alkaloids are extracted almost entirely from the plant and pass into the chloroform phase. The results were evaluated by validation, and a method for the determination of cytisine in the raw product was proposed. The method comprises the following steps: extraction of the raw material, separation and evaporation of the chloroform phase, transfer of the solids into methanol, chromatographic separation of cytisine, and its quantitation by spectrophotometry. The method is reproducible, has the required accuracy, and is easy to perform (less than 9 hours per analysis). PMID:22306505

  9. Quantitative methods for evaluating optical and frictional properties of cationic polymers.

    PubMed

    Wu, W; Alkema, J; Shay, G D; Basset, D R

    2001-01-01

    This paper presents three quantitative methods to examine gloss, opacity, and friction of cationic polymers. The adsorption of cationic polymers onto hair and skin can be regarded as a thin film coating. Therefore, optical and frictional properties of polymer films are of significant relevance to the applications of cationic polymers in hair care products. Such properties reflect the desirable hair condition attributes consumers seek in shampoo and conditioner products. Using these test methods, polyquaternium-10 and cationic guar samples of varying molecular weight and cationic substitution were compared. The effect of an anionic surfactant, sodium dodecyl sulfate (SDS), on polymer film properties was also investigated. Neat guar hydroxypropyl trimonium chloride imparts less friction than polyquaternium-10 but dulls the substrate employed in this study. The optical data show that polyquaternium-10 provides greater film clarity and gloss than cationic guars. In the presence of SDS, polyquaternium-10 also displays similar or lower friction than cationic guar. The comparative optical and frictional results are in good agreement with the visual assessment of the cationic polymer films. These results clearly demonstrate that polyquaternium-10 exhibits superior film properties in the forms of both neat polymer and polymer/surfactant complex. In addition, microscopic techniques such as scanning electron microscopy (SEM) and atomic force microscopy (AFM) provide powerful explanations for the differences noted between the two popular classes of cationic polymers. The test methods described in this paper can be utilized to differentiate the upper performance potential of cationic polymers. These objective and standardized test methods derived from the coatings industry are not affected by the variability of hair or the formulation complexity of end products. They can be useful tools in the product development process in quickly screening the relative performance of different polymers. PMID:11382843

  10. Bayesian data augmentation methods for the synthesis of qualitative and quantitative research findings

    PubMed Central

    Crandell, Jamie L.; Voils, Corrine I.; Chang, YunKyung; Sandelowski, Margarete

    2010-01-01

    The possible utility of Bayesian methods for the synthesis of qualitative and quantitative research has been repeatedly suggested but insufficiently investigated. In this project, we developed and used a Bayesian method for synthesis, with the goal of identifying factors that influence adherence to HIV medication regimens. We investigated the effect of 10 factors on adherence. Recognizing that not all factors were examined in all studies, we considered standard methods for dealing with missing data and chose a Bayesian data augmentation method. We were able to summarize, rank, and compare the effects of each of the 10 factors on medication adherence. This is a promising methodological development in the synthesis of qualitative and quantitative research. PMID:21572970

  11. Development of a quantitative method to measure vision in children with chronic cortical visual impairment.

    PubMed Central

    Good, W V

    2001-01-01

    PURPOSE: Cortical visual impairment (CVI) is the most common cause of bilateral vision impairment in children in Western countries. Better quantitative tools for measuring vision are needed to assess these children, to allow measurement of their visual deficit, and to monitor their response to treatment and rehabilitation. The author performed a series of experiments to assess the use of the sweep visual evoked potential (VEP) as a quantitative tool for measuring vision in CVI. METHODS: The first experiment was a reliability measure (test/retest) of VEP grating acuity thresholds of 23 children with CVI. To validate the VEP procedure, VEP grating acuity was compared to a clinical measure of vision, the Huo scale, and to a psychophysical measure of vision, the Teller Acuity Card procedure. Finally, the sweep VEP was tested as a tool for defining optimal luminance conditions for grating acuity in 13 children with CVI, by measuring grating thresholds under 2 different luminance conditions: 50 and 100 candela per square meter (cd/m2). RESULTS: Retest thresholds were similar to original thresholds (r2 = 0.662; P = .003, 1-tailed t test). Grating VEP measures correlate significantly with the clinical index (r2 = 0.63; P = .00004). Teller acuity measurements are also similar to VEP measures in children (r2 = 0.64; P = .0005) but show lower acuities compared to the VEP for children with particularly low vision. Finally, 3 of 13 children tested under 2 background luminance conditions showed paradoxical improvement in grating threshold with dimmer luminance. CONCLUSIONS: The sweep VEP tool is a reliable and valid means for measuring grating acuity in children with CVI. The tool also shows promise as a means of determining the optimal visual environment for children with CVI. PMID:11797314

  12. Accuracy, Precision, and Method Detection Limits of Quantitative PCR for Airborne Bacteria and Fungi ?

    PubMed Central

    Hospodsky, Denina; Yamamoto, Naomichi; Peccia, Jordan

    2010-01-01

    Real-time quantitative PCR (qPCR) for rapid and specific enumeration of microbial agents is finding increased use in aerosol science. The goal of this study was to determine qPCR accuracy, precision, and method detection limits (MDLs) within the context of indoor and ambient aerosol samples. Escherichia coli and Bacillus atrophaeus vegetative bacterial cells and Aspergillus fumigatus fungal spores loaded onto aerosol filters were considered. Efficiencies associated with recovery of DNA from aerosol filters were low, and excluding these efficiencies in quantitative analysis led to underestimating the true aerosol concentration by 10 to 24 times. Precision near detection limits ranged from a 28% to 79% coefficient of variation (COV) for the three test organisms, and the majority of this variation was due to instrument repeatability. Depending on the organism and sampling filter material, precision results suggest that qPCR is useful for determining dissimilarity between two samples only if the true differences are greater than 1.3 to 3.2 times (95% confidence level at n = 7 replicates). For MDLs, qPCR was able to produce a positive response with 99% confidence from the DNA of five B. atrophaeus cells and less than one A. fumigatus spore. Overall MDL values that included sample processing efficiencies ranged from 2,000 to 3,000 B. atrophaeus cells per filter and 10 to 25 A. fumigatus spores per filter. Applying the concepts of accuracy, precision, and MDL to qPCR aerosol measurements demonstrates that sample processing efficiencies must be accounted for in order to accurately estimate bioaerosol exposure, provides guidance on the necessary statistical rigor required to understand significant differences among separate aerosol samples, and prevents undetected (i.e., nonquantifiable) values for true aerosol concentrations that may be significant. PMID:20817798
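
    A minimal sketch of the bookkeeping the abstract emphasizes, under stated assumptions: a qPCR estimate is corrected for filter-recovery and extraction efficiencies (ignoring them understates the true aerosol concentration), and the coefficient of variation is computed from replicate measurements. All numbers and names are hypothetical.

        import numpy as np

        def efficiency_corrected_concentration(qpcr_copies, recovery_eff, extraction_eff, air_volume_m3):
            """Bioaerosol concentration (copies per m^3) corrected for sample-processing losses."""
            return qpcr_copies / (recovery_eff * extraction_eff * air_volume_m3)

        replicates = np.array([820.0, 1150.0, 990.0, 1300.0, 760.0, 1080.0, 940.0])  # copies per filter
        cov = replicates.std(ddof=1) / replicates.mean() * 100.0
        print(f"COV across n={replicates.size} replicates: {cov:.0f}%")
        print("corrected concentration (copies/m^3):",
              efficiency_corrected_concentration(replicates.mean(), recovery_eff=0.08,
                                                 extraction_eff=0.7, air_volume_m3=1.5))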

  13. The quantitative assessment of the pre- and postoperative craniosynostosis using the methods of image analysis.

    PubMed

    Fabijańska, Anna; Węgliński, Tomasz

    2015-12-01

    This paper considers the problem of CT-based quantitative assessment of craniosynostosis before and after surgery. First, a fast and efficient brain segmentation approach is proposed. The algorithm is robust to discontinuities of the skull; as a result, it can be applied in both pre- and postoperative cases. Additionally, image processing and analysis algorithms are proposed for describing the disease based on CT scans. The proposed algorithms automate determination of the standard linear indices used for assessment of craniosynostosis (i.e., the cephalic index CI and head circumference HC) and allow planar and volumetric analyses which have not previously been reported. Results of applying the introduced methods to sample craniosynostotic cases before and after surgery are presented and discussed. They show that the proposed brain segmentation algorithm achieves high accuracy when applied both pre- and postoperatively, while the introduced planar and volumetric indices for describing the disease may help distinguish between types of the disease. PMID:26143078

  14. Precision of dehydroascorbic acid quantitation with the use of the subtraction method--validation of HPLC-DAD method for determination of total vitamin C in food.

    PubMed

    Mazurek, Artur; Jamroz, Jerzy

    2015-04-15

    In food analysis, a method for determination of vitamin C should enable measurement of the total content of ascorbic acid (AA) and dehydroascorbic acid (DHAA), because both chemical forms exhibit biological activity. The aim of the work was to confirm the applicability of an HPLC-DAD method for analysis of the total content of vitamin C (TC) and of ascorbic acid in various types of food by determination of validation parameters such as selectivity, precision, accuracy, linearity, and limits of detection and quantitation. The results showed that the method applied for determination of TC and AA was selective, linear, and precise. The precision of DHAA determination by the subtraction method was also evaluated. It was revealed that the results of DHAA determination obtained by the subtraction method were not precise, which follows directly from the assumption of this method and the principles of uncertainty propagation. The proposed chromatographic method can be recommended for routine determinations of total vitamin C in various foods. PMID:25466057
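
    The imprecision attributed to the subtraction method follows directly from uncertainty propagation: if DHAA is obtained as TC minus AA, its absolute uncertainty combines the uncertainties of the two measured quantities, so the relative uncertainty becomes very large whenever DHAA is a small fraction of TC. A minimal sketch with hypothetical values:

        import math

        def dhaa_by_subtraction(tc, u_tc, aa, u_aa):
            """DHAA = TC - AA with propagated standard uncertainty (independent errors assumed)."""
            dhaa = tc - aa
            u_dhaa = math.sqrt(u_tc ** 2 + u_aa ** 2)
            return dhaa, u_dhaa

        # Hypothetical contents in mg/100 g: TC and AA are each measured to ~3% precision
        dhaa, u = dhaa_by_subtraction(tc=50.0, u_tc=1.5, aa=46.0, u_aa=1.4)
        print(f"DHAA = {dhaa:.1f} +/- {u:.1f} mg/100 g  ({100 * u / dhaa:.0f}% relative uncertainty)")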

  15. Real-time PCR methods for independent quantitation of TTV and TLMV.

    PubMed

    Moen, Eva M; Sleboda, Jowita; Grinde, Bjørn

    2002-06-01

    There is considerable interest in the possible clinical effects of the human circoviruses TT virus (TTV) and TTV-like mini virus (TLMV). Most people appear to have at least one of these viruses replicating actively in their bodies, so mere correlation of the presence of virus with disease states is probably less informative than a quantitative analysis of viraemia. Real-time PCR based methods, with either SYBR Green or a TaqMan probe, designed to selectively quantitate TTV and TLMV are described. The suggested TaqMan-based protocols were suitable for quantitation of viruses in the range of 10(2)-10(9) copies/ml of sample and proved, by sequencing of PCR products, to be specific for each of the two viruses. PMID:12020793

  16. An Evaluation of Quantitative Methods of Determining the Degree of Melting Experienced by a Chondrule

    NASA Technical Reports Server (NTRS)

    Nettles, J. W.; Lofgren, G. E.; Carlson, W. D.; McSween, H. Y., Jr.

    2004-01-01

    Many workers have considered the degree to which partial melting occurred in chondrules they have studied, and this has led to attempts to find reliable methods of determining the degree of melting. At least two quantitative methods have been used in the literature: a convolution index (CVI), which is a ratio of the perimeter of the chondrule as seen in thin section divided by the perimeter of a circle with the same area as the chondrule, and nominal grain size (NGS), which is the inverse square root of the number density of olivines and pyroxenes in a chondrule (again, as seen in thin section). We have evaluated both nominal grain size and convolution index as melting indicators. Nominal grain size was measured on the results of a set of dynamic crystallization experiments previously described, where aliquots of LEW97008(L3.4) were heated to peak temperatures of 1250, 1350, 1370, and 1450 C, representing varying degrees of partial melting of the starting material. Nominal grain size numbers should correlate with peak temperature (and therefore degree of partial melting) if it is a good melting indicator. The convolution index is not directly testable with these experiments because the experiments do not actually create chondrules (and therefore they have no outline on which to measure a CVI). Thus we had no means to directly test how well the CVI predicted different degrees of melting. Therefore, we discuss the use of the CVI measurement and support the discussion with X-ray Computed Tomography (CT) data.
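
    A minimal sketch of the two indices as defined above, assuming the measurements come from a thin-section image (input values are hypothetical): CVI is the chondrule perimeter divided by the perimeter of a circle of equal area, and NGS is the inverse square root of the areal number density of olivine and pyroxene grains.

        import math

        def convolution_index(perimeter_mm, area_mm2):
            """CVI = chondrule perimeter / perimeter of a circle with the same area."""
            equivalent_circle_perimeter = 2.0 * math.sqrt(math.pi * area_mm2)
            return perimeter_mm / equivalent_circle_perimeter

        def nominal_grain_size(n_grains, area_mm2):
            """NGS = 1 / sqrt(number density of olivine + pyroxene grains in thin section)."""
            return 1.0 / math.sqrt(n_grains / area_mm2)

        print(f"CVI = {convolution_index(perimeter_mm=4.1, area_mm2=1.2):.2f}")
        print(f"NGS = {nominal_grain_size(n_grains=85, area_mm2=1.2):.3f} mm")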

  17. A validated LC-MS-MS method for simultaneous identification and quantitation of rodenticides in blood.

    PubMed

    Bidny, Sergei; Gago, Kim; David, Mark; Duong, Thanh; Albertyn, Desdemona; Gunja, Naren

    2015-04-01

    A rapid, highly sensitive and specific analytical method for the extraction, identification and quantification of nine rodenticides from whole blood has been developed and validated. Commercially available rodenticides in Australia include coumatetralyl, warfarin, brodifacoum, bromadiolone, difenacoum, flocoumafen, difethialone, diphacinone and chlorophacinone. A Waters ACQUITY UPLC TQD system operating in multiple reaction monitoring mode was used to conduct the analysis. Two different ionization techniques, ES+ and ES-, were examined to achieve optimal sensitivity and selectivity resulting in detection by MS-MS using electrospray ionization in positive mode for difenacoum and brodifacoum and in negative mode for all other analytes. All analytes were extracted from 200 μL of whole blood with ethylacetate and separated on a Waters ACQUITY UPLC BEH-C18 column using gradient elution. Ammonium acetate (10 mM, pH 7.5) and methanol were used as mobile phases with a total run time of 8 min. Recoveries were between 70 and 105% with limits of detection ranging from 0.5 to 1 ng/mL. The limit of quantitation was 2 ng/mL for all analytes. Calibration curves were linear within the range 2-200 ng/mL for all analytes with the coefficient of determination ≥0.98. The application of the proposed method using liquid-liquid extraction in a series of clinical investigations and forensic toxicological analyses was successful. PMID:25595137

  18. Identification and quantitation method for nonylphenol and lower oligomer nonylphenol ethoxylates in fish tissues.

    PubMed

    Snyder, S A; Keith, T L; Naylor, C G; Staples, C A; Giesy, J P

    2001-09-01

    Substantial research is currently focused on the toxicological effects of alkylphenol ethoxylates (APEs) and alkylphenols (APs) on aquatic animals. Considerable data are available on the concentrations of APEs and APs in river systems in the United States; however, few if any data are available on the tissue concentrations of fish living in these rivers. A reliable method for the analysis of nonylphenol (NP) and lower oligomer nonylphenol ethoxylates (NPE1-3) in fish tissues has been developed. Nonylphenol and NPE1-3 were extracted from fish tissues using extractive steam distillation. Normal phase high-performance liquid chromatography (HPLC) was used as a cleanup step prior to analysis by gas chromatography with mass selective detection (GC/MSD) using selected ion monitoring. Optimization of this technique resulted in consistent recoveries in excess of 70%, with the exception of NPE3 (17%). Method detection limits (MDLs) and limits of quantitation using the technique range from 3 to 20 and 5 to 29 ng/g wet weight, respectively. Nonylphenol and NPE1 were detected in subsamples (n = 6) of a single common carp captured in the Las Vegas Bay of Lake Mead (NV, USA) at average concentrations of 184+/-4 ng/g and 242+/-9 ng/g wet weight, respectively. Nonylphenol ethoxylates were not detected in the carp collected at Lake Mead. PMID:11521811

  19. Deep neural nets as a method for quantitative structure-activity relationships.

    PubMed

    Ma, Junshui; Sheridan, Robert P; Liaw, Andy; Dahl, George E; Svetnik, Vladimir

    2015-02-23

    Neural networks were widely used for quantitative structure-activity relationships (QSAR) in the 1990s. Because of various practical issues (e.g., slow on large problems, difficult to train, prone to overfitting), they were superseded by more robust methods like support vector machine (SVM) and random forest (RF), which arose in the early 2000s. The last 10 years have witnessed a revival of neural networks in the machine learning community thanks to new methods for preventing overfitting, more efficient training algorithms, and advancements in computer hardware. In particular, deep neural nets (DNNs), i.e. neural nets with more than one hidden layer, have found great success in many applications, such as computer vision and natural language processing. Here we show that DNNs can routinely make better prospective predictions than RF on a set of large diverse QSAR data sets that are taken from Merck's drug discovery effort. The number of adjustable parameters needed for DNNs is fairly large, but our results show that it is not necessary to optimize them for individual data sets, and a single set of recommended parameters can achieve better performance than RF for most of the data sets we studied. The usefulness of the parameters is demonstrated on additional data sets not used in the calibration. Although training DNNs is still computationally intensive, using graphical processing units (GPUs) can make this issue manageable. PMID:25635324
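
    As a loose illustration of the kind of comparison described above (not the authors' Merck data sets, descriptors, or recommended hyperparameters), the sketch below fits a small multi-layer neural network and a random forest to the same synthetic descriptor matrix with scikit-learn.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 100))                          # stand-in molecular descriptors
        y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=2000)    # stand-in activity values

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

        dnn = MLPRegressor(hidden_layer_sizes=(256, 128), early_stopping=True,
                           max_iter=300, random_state=0).fit(X_tr, y_tr)
        rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

        print("neural net R^2:   ", round(dnn.score(X_te, y_te), 3))
        print("random forest R^2:", round(rf.score(X_te, y_te), 3))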

  20. Laser-induced Breakdown spectroscopy quantitative analysis method via adaptive analytical line selection and relevance vector machine regression model

    NASA Astrophysics Data System (ADS)

    Yang, Jianhong; Yi, Cancan; Xu, Jinwu; Ma, Xianghong

    2015-05-01

    A new LIBS quantitative analysis method based on adaptive analytical line selection and a Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines which will be used as input variables of the regression model are determined adaptively according to the samples used for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of probabilistic confidence intervals, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples have been carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness compared with methods based on partial least squares regression, artificial neural networks and the standard support vector machine.
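
    The sketch below illustrates the two stages under stated assumptions: analytical lines are selected adaptively from built-in spectral features, and a sparse Bayesian regressor maps line intensities to concentration with a predictive uncertainty. scikit-learn has no RVM implementation, so ARDRegression is used here as a related sparse Bayesian stand-in, and all spectra and concentrations are synthetic placeholders.

        import numpy as np
        from scipy.signal import find_peaks
        from sklearn.linear_model import ARDRegression

        def select_analytical_lines(mean_spectrum, n_lines=20):
            """Adaptively pick candidate analytical lines from built-in spectral features
            (peak prominence and width), without relying on a priori line tables."""
            peaks, props = find_peaks(mean_spectrum, prominence=0.0, width=1)
            strongest = np.argsort(props["prominences"])[::-1][:n_lines]
            return peaks[strongest]

        # Hypothetical training spectra (one row per certified standard) and Cr concentrations
        rng = np.random.default_rng(1)
        spectra = rng.random((23, 4096))
        cr_conc = rng.uniform(0.1, 2.0, size=23)

        lines = select_analytical_lines(spectra.mean(axis=0))
        model = ARDRegression().fit(spectra[:, lines], cr_conc)   # sparse Bayesian stand-in for RVM
        pred, pred_std = model.predict(spectra[:, lines], return_std=True)
        print("predicted Cr wt% with uncertainty:", pred[0], "+/-", pred_std[0])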

  1. Quantitative 1H-NMR Method for the Determination of Tadalafil in Bulk Drugs and its Tablets.

    PubMed

    Yang, Qingyun; Qiu, Hui; Guo, Wei; Wang, Dongmei; Zhou, Xingning; Xue, Dan; Zhang, Jinlan; Wu, Song; Wang, Yinghong

    2015-01-01

    A simple, rapid, accurate, and selective quantitative nuclear magnetic resonance method for the determination of tadalafil in bulk drugs and its tablets was established and evaluated. Spectra were obtained in dimethylsulfoxide-d6 using 2,4-dinitrotoluene as the internal standard. In this study, the method's linearity, range, limit of quantification, stability, precision, and accuracy were validated. The results were consistent with those obtained from high-performance liquid chromatography analysis. Thus, the proposed method is a useful and practical tool for the determination of tadalafil in bulk drugs and its tablets. PMID:26147583
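
    Quantitative 1H-NMR assays of this kind generally compute the analyte content from the ratio of integrated signal areas against the internal standard. The sketch below encodes that standard relation; the integrals, proton counts, and weights shown are hypothetical and are not the paper's data.

        def qnmr_purity(area_analyte, area_std, n_h_analyte, n_h_std,
                        mw_analyte, mw_std, mass_std_mg, mass_sample_mg, purity_std=1.0):
            """Analyte content from relative 1H integrals versus an internal standard."""
            return (area_analyte / area_std) * (n_h_std / n_h_analyte) \
                * (mw_analyte / mw_std) * (mass_std_mg / mass_sample_mg) * purity_std

        # Hypothetical example: tadalafil (MW 389.4) against 2,4-dinitrotoluene (MW 182.1)
        p = qnmr_purity(area_analyte=1.02, area_std=1.00, n_h_analyte=1, n_h_std=1,
                        mw_analyte=389.4, mw_std=182.1, mass_std_mg=10.0, mass_sample_mg=21.5)
        print(f"assay result: {100 * p:.1f}% (w/w)")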

  2. Nanoparticle-mediated photothermal effect enables a new method for quantitative biochemical analysis using a thermometer

    NASA Astrophysics Data System (ADS)

    Fu, Guanglei; Sanjay, Sharma T.; Dou, Maowei; Li, Xiujun

    2016-03-01

    A new biomolecular quantitation method, nanoparticle-mediated photothermal bioassay, using a common thermometer as the signal reader was developed. Using an immunoassay as a proof of concept, iron oxide nanoparticles (NPs) captured in the sandwich-type assay system were transformed into a near-infrared (NIR) laser-driven photothermal agent, Prussian blue (PB) NPs, which acted as a photothermal probe to convert the assay signal into heat through the photothermal effect, thus allowing sensitive biomolecular quantitation using a thermometer. This is the first report of biomolecular quantitation using a thermometer and also serves as the first attempt to introduce the nanoparticle-mediated photothermal effect for bioassays.

  3. Quantitative evaluation of registration methods for atlas-based diffuse optical tomography

    NASA Astrophysics Data System (ADS)

    Wu, Xue; Eggebrecht, Adam T.; Culver, Joseph P.; Zhan, Yuxuan; Basevi, Hector; Dehghani, Hamid

    2013-06-01

    In Diffuse Optical Tomography (DOT), an atlas-based model can be used as an alternative to a subject-specific anatomical model for recovery of brain activity. The main step in generating the atlas-based subject model is the registration of the atlas model to the subject head; the accuracy of the DOT then relies on the accuracy of the registration method. In this work, 11 registration methods are quantitatively evaluated. The registration method using the EEG 10/20 system with 19 landmarks and a non-iterative point-to-point algorithm provides approximately 1.4 mm surface error and is considered the most efficient registration method.

  4. A quantitative cell modeling and wound-healing analysis based on the Electric Cell-substrate Impedance Sensing (ECIS) method.

    PubMed

    Yang, Jen Ming; Chen, Szi-Wen; Yang, Jhe-Hao; Hsu, Chih-Chin; Wang, Jong-Shyan

    2016-02-01

    In this paper, a quantitative modeling and wound-healing analysis of fibroblast and human keratinocyte cells is presented. Our study was conducted using a continuous cellular impedance monitoring technique, dubbed Electric Cell-substrate Impedance Sensing (ECIS). In a previous work, we constructed a mathematical model for quantitatively analyzing cultured cell growth using the time series data directly derived by ECIS. In this study, the applicability of our model to keratinocyte cell growth modeling was assessed first. In addition, an electrical "wound-healing" assay was used as a means to evaluate the healing process of keratinocyte cells at a variety of pressures. Two innovative, newly defined indicators, dubbed cell power and cell electroactivity, respectively, were developed for quantitatively characterizing the biophysical behavior of cells. We then employed the wavelet transform method to perform a multi-scale analysis so that the cell power and cell electroactivity across multiple observational time scales could be captured. Numerical results indicated that our model can fit the data measured from the keratinocyte cell culture well for cell growth modeling. Also, the results produced by our quantitative analysis showed that the wound healing process was fastest at a negative pressure of 125 mmHg, which agrees with the qualitative analysis results reported in previous works. PMID:26773459

  5. Laboratory and field validation of a Cry1Ab protein quantitation method for water.

    PubMed

    Strain, Katherine E; Whiting, Sara A; Lydy, Michael J

    2014-10-01

    The widespread planting of crops expressing insecticidal proteins derived from the soil bacterium Bacillus thuringiensis (Bt) has given rise to concerns regarding potential exposure to non-target species. These proteins are released from the plant throughout the growing season into soil and surface runoff and may enter adjacent waterways as runoff, erosion, aerial deposition of particulates, or plant debris. It is crucial to be able to accurately quantify Bt protein concentrations in the environment to aid in risk analyses and decision making. Enzyme-linked immunosorbent assay (ELISA) is commonly used for quantitation of Bt proteins in the environment; however, there are no published methods detailing and validating the extraction and quantitation of Bt proteins in water. The objective of the current study was to optimize the extraction of a Bt protein, Cry1Ab, from three water matrices and validate the ELISA method for specificity, precision, accuracy, stability, and sensitivity. Recovery of the Cry1Ab protein was matrix-dependent and ranged from 40 to 88% in the validated matrices, with an overall method detection limit of 2.1 ng/L. Precision among two plates and within a single plate was confirmed with a coefficient of variation less than 20%. The ELISA method was verified in field and laboratory samples, demonstrating the utility of the validated method. The implementation of a validated extraction and quantitation protocol adds consistency and reliability to field-collected data regarding transgenic products. PMID:25059137

  6. Pleistocene Lake Bonneville and Eberswalde Crater of Mars: Quantitative Methods for Recognizing Poorly Developed Lacustrine Shorelines

    NASA Astrophysics Data System (ADS)

    Jewell, P. W.

    2014-12-01

    The ability to quantify shoreline features on Earth has been aided by advances in acquisition of high-resolution topography through laser imaging and photogrammetry. Well-defined and well-documented features such as the Bonneville, Provo, and Stansbury shorelines of Late Pleistocene Lake Bonneville are recognizable to the untrained eye and easily mappable on aerial photos. The continuity and correlation of lesser shorelines must rely on quantitative algorithms for processing high-resolution data in order to gain widespread scientific acceptance. Using Savitzky-Golay filters and the geomorphic methods and criteria described by Hare et al. [2001], minor, transgressive, erosional shorelines of Lake Bonneville have been identified and correlated across the basin with varying degrees of statistical confidence. Results solve one of the key paradoxes of Lake Bonneville first described by G. K. Gilbert in the late 19th century and point the way for understanding climatically driven oscillations of the Last Glacial Maximum in the Great Basin of the United States. Similar techniques have been applied to the Eberswalde Crater area of Mars using HiRISE DEMs (1 m horizontal resolution) where a paleolake is hypothesized to have existed. Results illustrate the challenges of identifying shorelines where long-term aeolian processes have degraded the shorelines and field validation is not possible. The work illustrates the promises and challenges of identifying remnants of a global ocean elsewhere on the red planet.
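
    A minimal sketch of the filtering step mentioned above: a Savitzky-Golay filter applied to a shore-normal elevation profile yields smoothed slope and curvature, whose extrema are candidate breaks-in-slope. The profile, window, and threshold are hypothetical, and the full geomorphic criteria of Hare et al. [2001] are not reproduced.

        import numpy as np
        from scipy.signal import savgol_filter

        # Hypothetical shore-normal elevation profile sampled every 1 m
        x = np.arange(0.0, 500.0, 1.0)
        elev = 1550.0 + 0.05 * x + 2.0 * np.tanh((x - 250.0) / 10.0)   # subtle bench near x = 250 m

        slope = savgol_filter(elev, window_length=21, polyorder=3, deriv=1, delta=1.0)
        curvature = savgol_filter(elev, window_length=21, polyorder=3, deriv=2, delta=1.0)

        # Flag candidate shoreline positions where the smoothed curvature is anomalously strong
        candidates = x[np.abs(curvature) > 3.0 * np.std(curvature)]
        print("max smoothed slope:", slope.max())
        print("candidate break-in-slope positions (m):", candidates)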

  7. Quantitative proteomics: assessing the spectrum of in-gel protein detection methods

    PubMed Central

    Gauci, Victoria J.; Wright, Elise P.

    2010-01-01

    Proteomics research relies heavily on visualization methods for detection of proteins separated by polyacrylamide gel electrophoresis. Commonly used staining approaches involve colorimetric dyes such as Coomassie Brilliant Blue, fluorescent dyes including Sypro Ruby, newly developed reactive fluorophores, as well as a plethora of others. The most desired characteristic in selecting one stain over another is sensitivity, but this is far from the only important parameter. This review evaluates protein detection methods in terms of their quantitative attributes, including limit of detection (i.e., sensitivity), linear dynamic range, inter-protein variability, capacity for spot detection after 2D gel electrophoresis, and compatibility with subsequent mass spectrometric analyses. Unfortunately, many of these quantitative criteria are not routinely or consistently addressed by most of the studies published to date. We would urge more rigorous routine characterization of stains and detection methodologies as a critical approach to systematically improving these critically important tools for quantitative proteomics. In addition, substantial improvements in detection technology, particularly over the last decade or so, emphasize the need to consider renewed characterization of existing stains; the quantitative stains we need, or at least the chemistries required for their future development, may well already exist. PMID:21686332

  8. Method Development and Validation of a Stability-Indicating RP-HPLC Method for the Quantitative Analysis of Dronedarone Hydrochloride in Pharmaceutical Tablets

    PubMed Central

    Dabhi, Batuk; Jadeja, Yashwantsinh; Patel, Madhavi; Jebaliya, Hetal; Karia, Denish; Shah, Anamik

    2013-01-01

    A simple, precise, and accurate HPLC method has been developed and validated for the quantitative analysis of Dronedarone Hydrochloride in tablet form. An isocratic separation was achieved using a Waters Symmetry C8 (100 × 4.6 mm), 5 μm particle size column with a flow rate of 1 ml/min and UV detector at 290 nm. The mobile phase consisted of buffer: methanol (40:60 v/v) (buffer: 50 mM KH2PO4 + 1 ml triethylamine in 1 liter water, pH=2.5 adjusted with ortho-phosphoric acid). The method was validated for specificity, linearity, precision, accuracy, robustness, and solution stability. The specificity of the method was determined by assessing interference from the placebo and by stress testing the drug (forced degradation). The method was linear over the concentration range 20–80 μg/ml (r2 = 0.999) with a Limit of Detection (LOD) and Limit of Quantitation (LOQ) of 0.1 and 0.3 μg/ml respectively. The accuracy of the method was between 99.2–100.5%. The method was found to be robust and suitable for the quantitative analysis of Dronedarone Hydrochloride in a tablet formulation. Degradation products resulting from the stress studies did not interfere with the detection of Dronedarone Hydrochloride so the assay is thus stability-indicating. PMID:23641332

  9. Improved Methodical Approach for Quantitative BRET Analysis of G Protein Coupled Receptor Dimerization

    PubMed Central

    Szalai, Bence; Hoffmann, Péter; Prokop, Susanne; Erdélyi, László; Várnai, Péter; Hunyady, László

    2014-01-01

    G Protein Coupled Receptors (GPCR) can form dimers or higher-order oligomers, a process which can remarkably influence the physiological and pharmacological function of these receptors. Quantitative Bioluminescence Resonance Energy Transfer (qBRET) measurements are the gold standard to prove the direct physical interaction between the protomers of presumed GPCR dimers. For the correct interpretation of these experiments, the expression of the energy donor Renilla luciferase labeled receptor has to be maintained constant, which is hard to achieve in expression systems. To analyze the effects of non-constant donor expression on qBRET curves, we performed Monte Carlo simulations. Our results show that the decrease of donor expression can lead to saturating qBRET curves even if the interaction between donor and acceptor labeled receptors is non-specific, leading to false interpretation of the dimerization state. We suggest here a new approach to the analysis of qBRET data, in which the BRET ratio is plotted as a function of the acceptor labeled receptor expression at various donor receptor expression levels. With this method, we were able to distinguish between dimerization and non-specific interaction when the results of classical qBRET experiments were ambiguous. The simulation results were confirmed experimentally using a rapamycin-inducible heterodimerization system. We used this new method to investigate the dimerization of various GPCRs, and our data have confirmed the homodimerization of V2 vasopressin and CaSR calcium sensing receptors, whereas our data argue against the heterodimerization of these receptors with other studied GPCRs, including type I and II angiotensin, β2 adrenergic and CB1 cannabinoid receptors. PMID:25329164

  10. Multi-Window Classical Least Squares Multivariate Calibration Methods for Quantitative ICP-AES Analyses

    SciTech Connect

    CHAMBERS,WILLIAM B.; HAALAND,DAVID M.; KEENAN,MICHAEL R.; MELGAARD,DAVID K.

    1999-10-01

    The advent of inductively coupled plasma-atomic emission spectrometers (ICP-AES) equipped with charge-coupled-device (CCD) detector arrays allows the application of multivariate calibration methods to the quantitative analysis of spectral data. We have applied classical least squares (CLS) methods to the analysis of a variety of samples containing up to 12 elements plus an internal standard. The elements included in the calibration models were Ag, Al, As, Au, Cd, Cr, Cu, Fe, Ni, Pb, Pd, and Se. By performing the CLS analysis separately in each of 46 spectral windows and by pooling the CLS concentration results for each element in all windows in a statistically efficient manner, we have been able to significantly improve the accuracy and precision of the ICP-AES analyses relative to the univariate and single-window multivariate methods supplied with the spectrometer. This new multi-window CLS (MWCLS) approach simplifies the analyses by providing a single concentration determination for each element from all spectral windows. Thus, the analyst does not have to perform the tedious task of reviewing the results from each window in an attempt to decide the correct value among discrepant analyses in one or more windows for each element. Furthermore, it is not necessary to construct a spectral correction model for each window prior to calibration and analysis: When one or more interfering elements were present, the new MWCLS method was able to reduce prediction errors for a selected analyte by more than 2 orders of magnitude compared to the worst case single-window multivariate and univariate predictions. The MWCLS detection limits in the presence of multiple interferences are 15 ng/g (i.e., 15 ppb) or better for each element. In addition, errors with the new method are only slightly inflated when only a single target element is included in the calibration (i.e., knowledge of all other elements is excluded during calibration). The MWCLS method is found to be vastly superior to partial least squares (PLS) in this case of limited numbers of calibration samples.
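
    A simplified sketch of the two-step idea, assuming known pure-component spectra: solve the classical least squares model a = K c in each spectral window, then pool the per-window concentration estimates with inverse-variance weights. The windows, spectra, and weighting are hypothetical and much simpler than the published MWCLS procedure.

        import numpy as np

        def cls_concentrations(K, a):
            """Classical least squares: measured spectrum a = K @ c, solved for concentrations c."""
            c, _, _, _ = np.linalg.lstsq(K, a, rcond=None)
            resid_var = np.sum((a - K @ c) ** 2) / max(len(a) - len(c), 1)
            return c, resid_var

        def pool_windows(estimates, variances):
            """Inverse-variance pooling of per-window concentration estimates."""
            w = 1.0 / np.asarray(variances)
            return np.sum(w[:, None] * np.asarray(estimates), axis=0) / np.sum(w)

        rng = np.random.default_rng(0)
        true_c = np.array([0.8, 0.2])                 # two hypothetical element concentrations
        per_window = []
        for _ in range(3):                            # three hypothetical spectral windows
            K = rng.random((50, 2))                   # pure-component spectra in this window
            a = K @ true_c + 0.01 * rng.normal(size=50)
            per_window.append(cls_concentrations(K, a))
        print("pooled concentrations:",
              pool_windows([c for c, v in per_window], [v for c, v in per_window]))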

  11. Apparatus and method for quantitatively evaluating total fissile and total fertile nuclide content in samples

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Cates, Michael R.; Franks, Larry A.

    1985-01-01

    Simultaneous photon and neutron interrogation of samples for the quantitative determination of total fissile nuclide and total fertile nuclide material present is made possible by the use of an electron accelerator. Prompt and delayed neutrons produced from resulting induced fissions are counted using a single detection system and allow the resolution of the contributions from each interrogating flux, leading in turn to the quantitative determination sought. Detection limits for 239Pu are estimated to be about 3 mg using prompt fission neutrons and about 6 mg using delayed neutrons.

  12. Raman spectroscopy provides a rapid, non-invasive method for quantitation of starch in live, unicellular microalgae.

    PubMed

    Ji, Yuetong; He, Yuehui; Cui, Yanbin; Wang, Tingting; Wang, Yun; Li, Yuanguang; Huang, Wei E; Xu, Jian

    2014-12-01

    Conventional methods for quantitation of starch content in cells generally involve starch extraction steps and are usually labor intensive, thus a rapid and non-invasive method will be valuable. Using the starch-producing unicellular microalga Chlamydomonas reinhardtii as a model, we employed a customized Raman spectrometer to capture the Raman spectra of individual single cells under distinct culture conditions and along various growth stages. The results revealed a nearly linear correlation (R(2) = 0.9893) between the signal intensity at 478 cm(-1) and the starch content of the cells. We validated the specific correlation by showing that the starch-associated Raman peaks were eliminated in a mutant strain where the AGPase (ADP-glucose pyrophosphorylase) gene was disrupted and consequentially the biosynthesis of starch blocked. Furthermore, the method was validated in an industrial algal strain of Chlorella pyrenoidosa. This is the first demonstration of starch quantitation in individual live cells. Compared to existing cellular starch quantitation methods, this single-cell Raman spectra-based approach is rapid, label-free, non-invasive, culture-independent, low-cost, and potentially able to simultaneously track multiple metabolites in individual live cells, therefore should enable many new applications. PMID:24906189
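
    The calibration implied by the reported correlation is an ordinary linear fit of the 478 cm-1 band intensity against starch content, which can then be inverted for an unknown cell. The sketch below uses hypothetical numbers rather than the paper's measurements.

        import numpy as np

        # Hypothetical training data: single-cell Raman intensity at 478 cm^-1 vs. starch content (% dry weight)
        intensity_478 = np.array([120.0, 310.0, 480.0, 650.0, 820.0, 1010.0])
        starch_pct = np.array([2.1, 8.0, 14.2, 19.8, 26.5, 32.0])

        slope, intercept = np.polyfit(intensity_478, starch_pct, deg=1)
        r2 = np.corrcoef(intensity_478, starch_pct)[0, 1] ** 2
        print(f"fit: starch% = {slope:.4f} * I478 + {intercept:.2f}  (R^2 = {r2:.4f})")

        # Predict the starch content of a new cell from its 478 cm^-1 intensity
        print("predicted starch (% dry weight):", round(slope * 560.0 + intercept, 1))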

  13. [Comparison of the results of quantitative polymerase chain reaction for cytomegalovirus among laboratories].

    PubMed

    Mori, Takehiko; Kato, Jun; Yamane, Akiko; Aisa, Yoshinobu; Nakazato, Tomonori; Okamoto, Shinichiro

    2011-04-01

    While cytomegalovirus (CMV) antigenemia assay is generally used for the monitoring of CMV reactivation/diseases, plasma real-time polymerase chain reaction (PCR) can also be a promising methodology and it can be performed by commercially available laboratories. In this study, the results of plasma real-time PCR performed by three laboratories were compared with those of CMV antigenemia. In CMV antigenemia-positive patients (N=11), the results of PCR were positive in all cases examined by two laboratories, while one laboratory yielded positive results only in 6 patients. One of the 2 CMV antigenemia-negative patients yielded positive PCR results at two laboratories, but one laboratory did not show positive results. A per-sample analysis showed that, of 84 CMV antigenemia-negative samples, PCR yielded negative results in 44 samples, positive results in 11 samples at three laboratories and at one or two laboratories in the remaining samples. Of the 36 CMV antigenemia-positive samples, PCR yielded positive results in 21 samples at three laboratories and positive results in the remaining samples at one or two laboratories. Although copy number of CMV-DNA evaluated by PCR at 3 laboratories showed a significant correlation with CMV antigenemia values, median copy number of CMV-DNA showed significant differences among the laboratories. Based on these findings, it is suggested that plasma real-time PCR has a higher sensitivity than CMV antigenemia in detecting CMV reactivation. However, the results varied significantly among the laboratories, suggesting that standardization of the methods is warranted. PMID:21566406

  14. Interferences of suspended clay fraction in protein quantitation by several determination methods.

    PubMed

    Lozzi, I; Pucci, A; Pantani, O L; D'Acqui, L P; Calamai, L

    2008-05-01

    Seven current methods of protein quantitation, Bradford (standard, micro, and 590/450 nm ratio), Lowry, bicinchoninic acid (BCA), UV spectrophotometry at 280 nm, and Quant-iT fluorescence-based determination, were compared with regard to their susceptibility to interferences due to the presence of suspended and not easily detectable clay particles. Bovine serum albumin (BSA) and Na-Wyoming montmorillonite were selected as model protein and reference clay, respectively. Protein-clay suspension mixtures were freshly prepared for each assay to simulate supernatants not completely centrifuged in batch sorption/kinetic experiments. Seven fixed increasing levels of clay (0.0, 0.00725, 0.0145, 0.029, 0.058, 0.145, 0.435 mg ml(-1)) were mixed with different levels of BSA in an appropriate range for each assay. To ascertain the interfering effect of different levels of clay, the theoretical concentrations of BSA were plotted against the estimated BSA concentrations of the samples, as obtained from the calibration curve of each method. A correct quantitation of the BSA concentration not influenced by clay would be described by a regression line with slope (b) not significantly different from 1 and an intercept (a) not significantly different from zero. At the lowest clay levels (0.00725 mg ml(-1)) a significant interference was evident for Bradford micro, Bradford 590/450, UV, and fluorescence. The three methods (Bradford standard, Lowry, and BCA) that seemed to show the better performances in the presence of clay after this first screening step also underwent an ANCOVA analysis, with the measured BSA concentrations as dependent variable and the clay concentrations as covariate. The Bradford standard and BCA methods were affected by a clay-dependent interference on BSA quantitation. The Lowry assay was the only method that gave correct estimates of BSA concentrations in the presence of any of the clay levels tested. PMID:18314004

  15. Comparison of reconstruction methods and quantitative accuracy in Siemens Inveon PET scanner

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su; Kang, Joo Hyun; Moo Lim, Sang

    2015-04-01

    PET reconstruction is key to the quantification of PET data. To our knowledge, no comparative study of reconstruction methods has been performed to date. In this study, we compared reconstruction methods with various filters in terms of their spatial resolution, non-uniformities (NU), recovery coefficients (RCs), and spillover ratios (SORs). In addition, the linearity between measured and true radioactivity concentrations was assessed. A Siemens Inveon PET scanner was used in this study. Spatial resolution was measured according to the NEMA standard using a 1 mm3 18F point source. Image quality was assessed in terms of NU, RC and SOR. To measure the effect of reconstruction algorithms and filters, data were reconstructed using filtered back-projection (FBP), the 3D reprojection algorithm (3DRP), ordered subset expectation maximization 2D (OSEM 2D), and maximum a posteriori (MAP) reconstruction with various filters or smoothing factors (β). To assess the linearity of reconstructed radioactivity, an image quality phantom filled with 18F was scanned and reconstructed using FBP, OSEM and MAP (β = 1.5 and 5 × 10-5). The highest achievable volumetric resolution was 2.31 mm3 and the highest RCs were obtained when OSEM 2D was used. SOR was 4.87% for air and 3.97% for water when OSEM 2D reconstruction was used. The measured radioactivity of the reconstructed image was proportional to the injected radioactivity below 16 MBq/ml when FBP or OSEM 2D reconstruction methods were used. By contrast, when the MAP reconstruction method was used, the activity of the reconstructed image increased proportionally regardless of the amount of injected radioactivity. When OSEM 2D or FBP were used, the measured radioactivity concentration was reduced by 53% compared with the true injected radioactivity for radioactivity <16 MBq/ml. The OSEM 2D reconstruction method provides the highest achievable volumetric resolution and the highest RCs among all the tested methods and yields a linear relation between measured and true concentrations for radioactivity below 16 MBq/ml. Our data collectively showed that the OSEM 2D reconstruction method provides quantitatively accurate reconstructed PET data.
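
    The image-quality figures quoted above follow standard definitions along the lines of NEMA NU 4: the recovery coefficient is measured ROI activity over true activity, the spillover ratio is the mean in a cold (air or water) insert over the mean in the uniform hot region, and non-uniformity is typically expressed through the ROI percent standard deviation. A minimal sketch with hypothetical ROI statistics:

        def recovery_coefficient(mean_roi_kbq_ml, true_kbq_ml):
            """RC = measured ROI activity concentration / true activity concentration."""
            return mean_roi_kbq_ml / true_kbq_ml

        def spillover_ratio(mean_cold_insert, mean_uniform_region):
            """SOR = mean in the cold (air or water) insert / mean in the uniform hot region."""
            return mean_cold_insert / mean_uniform_region

        def percent_std(mean_uniform_region, sd_uniform_region):
            """Non-uniformity expressed as percent standard deviation of the uniform region."""
            return 100.0 * sd_uniform_region / mean_uniform_region

        # Hypothetical ROI statistics from a reconstructed image-quality phantom (kBq/ml)
        print("RC  :", round(recovery_coefficient(84.0, 100.0), 2))
        print("SOR :", round(spillover_ratio(4.0, 82.0) * 100.0, 1), "%")
        print("%STD:", round(percent_std(82.0, 4.9), 1), "%")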

  16. A method for operative quantitative interpretation of multispectral images of biological tissues

    NASA Astrophysics Data System (ADS)

    Lisenko, S. A.; Kugeiko, M. M.

    2013-10-01

    A method for operative retrieval of spatial distributions of biophysical parameters of a biological tissue by using a multispectral image of it has been developed. The method is based on multiple regressions between linearly independent components of the diffuse reflection spectrum of the tissue and unknown parameters. Possibilities of the method are illustrated by an example of determining biophysical parameters of the skin (concentrations of melanin, hemoglobin and bilirubin, blood oxygenation, and scattering coefficient of the tissue). Examples of quantitative interpretation of the experimental data are presented.

  17. Radial period extraction method employing frequency measurement for quantitative collimation testing

    NASA Astrophysics Data System (ADS)

    Li, Sikun; Wang, Xiangzhao

    2016-01-01

    A radial period extraction method employing frequency measurement is proposed for quantitative collimation testing using spiral gratings. The radial period of the difference-frequency fringe is treated as a measure of the collimation condition. A frequency measurement technique based on the wavelet transform and a statistical approach is presented to extract the radial period directly from the amplitude-transmittance spiral fringe. A basic constraint for setting the parameters of the wavelet is introduced, and a strict mathematical demonstration is given. The method outperforms methods employing phase measurement in terms of precision, stability and noise immunity.

  18. Development of a Strain-Specific Molecular Method for Quantitating Individual Campylobacter Strains in Mixed Populations?

    PubMed Central

    Elvers, Karen T.; Helps, Christopher R.; Wassenaar, Trudy M.; Allen, Vivien M.; Newell, Diane G.

    2008-01-01

    The identification of sites resulting in cross-contamination of poultry flocks in the abattoir and determination of the survival and persistence of campylobacters at these sites are essential for the development of intervention strategies aimed at reducing the microbial burden on poultry at retail. A novel molecule-based method, using strain- and genus-specific oligonucleotide probes, was developed to detect and enumerate specific campylobacter strains in mixed populations. Strain-specific oligonucleotide probes were designed for the short variable regions (SVR) of the flaA gene in individual Campylobacter jejuni strains. A 16S rRNA Campylobacter genus-specific probe was also used. Both types of probes were used to investigate populations of campylobacters by colony lift hybridization. The specificity and proof of principle of the method were tested using strains with closely related SVR sequences and mixtures of these strains. Colony lifts of campylobacters were hybridized sequentially with up to two labeled strain-specific probes, followed by the generic 16S rRNA probe. SVR probes were highly specific, differentiating down to 1 nucleotide in the target sequence, and were sufficiently sensitive to detect colonies of a single strain in a mixed population. The 16S rRNA probe detected all Campylobacter spp. tested but not closely related species, such as Arcobacter skirrowii and Helicobacter pullorum. Preliminary field studies demonstrated the application of this technique to target strains isolated from poultry transport crate wash tank water. This method is quantitative, sensitive, and highly specific and allows the identification and enumeration of selected strains among all of the campylobacters in environmental samples. PMID:18281428

  19. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo).

    PubMed

    Li, Yi; Kim, Jong-Joo

    2015-07-01

    The efficiency of genome-wide association analysis (GWAS) depends on power of detection for quantitative trait loci (QTL) and precision for QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle, Hanwoo: a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA), and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). Also the genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model, such that the top five windows, each of which comprised 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods had high concordance, found at 64.1 to 64.9 Mb on Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb on BTA14 for CWT, 0.5 to 1.5 Mb on BTA6 for BFT, and 26.3 to 33.4 Mb on BTA29 for BFT. Several candidate genes (i.e. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our result suggests that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions. PMID:26104396

  20. Hepatitis C Virus RNA Real-Time Quantitative RT-PCR Method Based on a New Primer Design Strategy.

    PubMed

    Chen, Lida; Li, Wenli; Zhang, Kuo; Zhang, Rui; Lu, Tian; Hao, Mingju; Jia, Tingting; Sun, Yu; Lin, Guigao; Wang, Lunan; Li, Jinming

    2016-01-01

    Viral nucleic acids are unstable when improperly collected, handled, and stored, resulting in decreased sensitivity of currently available commercial quantitative nucleic acid testing kits. Using known unstable hepatitis C virus RNA, we developed a quantitative RT-PCR method based on a new primer design strategy to reduce the impact of nucleic acid instability on nucleic acid testing. The performance of the method was evaluated for linearity, limit of detection, precision, specificity, and agreement with commercial hepatitis C virus assays. Its clinical application was compared to that of two commercial kits, Cobas AmpliPrep/Cobas TaqMan (CAP/CTM) and Kehua. The quantitative RT-PCR method delivered a good performance, with a linearity of R² = 0.99, a total limit of detection (genotypes 1 to 6) of 42.6 IU/mL (95% CI, 32.84 to 67.76 IU/mL), a CV of 1.06% to 3.34%, a specificity of 100%, and a high concordance with the CAP/CTM assay (R² = 0.97), with a mean ± SD value of -0.06 ± 1.96 log IU/mL (range, -0.38 to 0.25 log IU/mL). The method was superior to commercial assays in detecting unstable hepatitis C virus RNA (P < 0.05). This quantitative RT-PCR method can effectively eliminate the influence of RNA instability on nucleic acid testing. The principle of the primer design strategy may be applied to the detection of other RNA or DNA viruses. PMID:26612712

  1. Application of quantitative 1H-NMR method to determination of paeoniflorin in Paeoniae radix.

    PubMed

    Tanaka, Rie; Yamazaki, Marina; Hasada, Keiko; Nagatsu, Akito

    2013-07-01

    A quantitative (1)H-NMR method (qHNMR) was used to measure paeoniflorin content in Paeoniae radix (dried root of Paeonia lactiflora), of which paeoniflorin is a major component. The purity of paeoniflorin was calculated from the ratio of the intensity of the H-9 signal at δ 5.78 ppm of paeoniflorin to that of a hexamethyldisilane (HMD) signal at 0 ppm. The concentration of HMD was corrected with SI traceability by using bisphenol A of certified reference material (CRM) grade. The paeoniflorin content in 2 separate samples of Paeoniae radix was determined by qHNMR and was found to be 2.15 and 2.45%. We demonstrated that this method is useful for quantitative analysis of crude drugs. PMID:23081682
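
    For readers unfamiliar with the arithmetic behind qHNMR, the sketch below applies the standard internal-standard relation (signal integral ratio scaled by proton counts, molar masses, weighed masses, and standard purity) to purely hypothetical numbers; it is not the paper's raw data.

```python
# Illustrative qHNMR calculation (hypothetical numbers, standard qHNMR relation;
# not the paper's reported raw data).
I_analyte, N_analyte = 0.05, 1          # integral and proton count of the paeoniflorin H-9 signal
I_std,     N_std     = 3.45, 18         # integral and proton count of the HMD reference signal
M_analyte, M_std     = 480.46, 146.38   # molar masses, g/mol (paeoniflorin, hexamethyldisilane)
m_std, m_sample      = 1.2e-3, 50.0e-3  # weighed masses, g (internal standard, dried-root sample)
purity_std           = 0.999            # HMD concentration corrected against a CRM

content = (I_analyte / I_std) * (N_std / N_analyte) \
          * (M_analyte / M_std) * (m_std / m_sample) * purity_std * 100
print(f"paeoniflorin content: {content:.2f} % (w/w)")
```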

  2. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.

  3. A Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis.

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

    Available from UMI in association with The British Library. Quantitative crystal structure analyses have been carried out on small molecule crystals using synchrotron radiation and the Laue method. A variety of single crystal structure determinations and associated refinements are used and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C10H9ON4Br·H2O) has been determined, first using monochromatic Mo Kα radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements showed an R-factor of 4.97 and 14.0% for the Mo Kα and Laue data, respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide (C10H11ClFNO) has been determined using the same crystal throughout for X-ray monochromatic analyses (Mo Kα and Cu Kα) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably. The R-factors (on F) were 6.23, 6.45 and 8.19% for the Mo Kα, Cu Kα and Laue data sets, respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenyl-prop-2-en-1-one (C25H20N2O2) has been determined using the synchrotron Laue method. The results compare very well with Mo Kα monochromatic data. The R-factors (on F) were 4.60 and 5.29% for the Mo Kα and Laue analyses, respectively. The Laue method is assessed in locating the 20 hydrogen atoms in this structure. The structure analysis of the benzil compound ((C6H5CO)2) was carried out using the synchrotron Laue method, first at room temperature and then at low temperature (-114 °C). The structure shows an R-factor (on F) of 13.06% and 6.85% for each data set, respectively. The synchrotron Laue method was used to collect data for ergocalciferol (vitamin D2). The same crystal was also used to record oscillation data with the synchrotron radiation monochromatic beam. A new molecular structure of dinitrato(N,N'-dimethylethylenediamine)copper(II) has been determined using Mo Kα radiation on a four-circle diffractometer. The refinement resulted in an R-factor (on F) of 4.06%.
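
    The figure of merit quoted throughout this abstract is the conventional crystallographic R-factor. A minimal sketch of that calculation, using invented structure-factor values, is:

```python
# R-factor (on F) used to compare Laue and monochromatic refinements:
# R = sum(| |Fo| - |Fc| |) / sum(|Fo|), shown here on toy data.
import numpy as np

F_obs  = np.array([120.0, 85.3, 40.1, 210.7, 15.2])   # hypothetical observed structure factors
F_calc = np.array([118.5, 88.0, 38.9, 205.1, 16.0])   # hypothetical calculated structure factors

R = np.sum(np.abs(np.abs(F_obs) - np.abs(F_calc))) / np.sum(np.abs(F_obs))
print(f"R-factor (on F): {100 * R:.2f} %")
```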

  4. A Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

    Quantitative crystal structure analyses have been carried out on small molecule crystals using synchrotron radiation and the Laue method. A variety of single crystal structure determinations and associated refinements are used and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C10H9ON4Br·H2O) has been determined, first using monochromatic Mo Kα radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements showed an R-factor of 4.97 and 14.0% for the Mo Kα and Laue data, respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide (C10H11ClFNO) has been determined using the same crystal throughout for X-ray monochromatic analyses (Mo Kα and Cu Kα) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably. The R-factors (on F) were 6.23, 6.45 and 8.19% for the Mo Kα, Cu Kα and Laue data sets, respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenyl-prop-2-en-1-one (C25H20N2O2) has been determined using the synchrotron Laue method. The results compare very well with Mo Kα monochromatic data. The R-factors (on F) were 4.60 and 5.29% for the Mo Kα and Laue analyses, respectively. The Laue method is assessed in locating the 20 hydrogen atoms in this structure. The structure analysis of the benzil compound ((C6H5CO)2) was carried out using the synchrotron Laue method, first at room temperature and then at low temperature. The structure shows an R-factor (on F) of 13.06% and 6.85% for each data set, respectively. The synchrotron Laue method was used to collect data for ergocalciferol (vitamin D2). The same crystal was also used to record oscillation data with the synchrotron radiation monochromatic beam. A new molecular structure of dinitrato(N,N'-dimethylethylenediamine)copper(II) has been determined using Mo Kα radiation on a four-circle diffractometer. The refinement resulted in an R-factor (on F) of 4.06%.

  5. Quantitative methods for genome-scale analysis of in situ hybridization and correlation with microarray data

    PubMed Central

    Lee, Chang-Kyu; Sunkin, Susan M; Kuan, Chihchau; Thompson, Carol L; Pathak, Sayan; Ng, Lydia; Lau, Chris; Fischer, Shanna; Mortrud, Marty; Slaughterbeck, Cliff; Jones, Allan; Lein, Ed; Hawrylycz, Michael

    2008-01-01

    With the emergence of genome-wide colorimetric in situ hybridization (ISH) data sets such as the Allen Brain Atlas, it is important to understand the relationship between this gene expression modality and those derived from more quantitative based technologies. This study introduces a novel method for standardized relative quantification of colorimetric ISH signal that enables a large-scale cross-platform expression level comparison of ISH with two publicly available microarray brain data sources. PMID:18234097

  6. Localization and quantitation of chloroplast enzymes and light-harvesting components using immunocytochemical methods

    SciTech Connect

    Mustardy, L.; Cunningham, F.X. Jr.; Gantt, E.

    1990-09-01

    Seven chloroplast proteins were localized in Porphyridium cruentum (ATCC 50161) by immunolabeling with colloidal gold on electron microscope sections of log phase cells grown under red, green, and white light. Ribulose bisphosphate carboxylase labeling occurred almost exclusively in the pyrenoid. The major apoproteins of photosystem I (56-64 kD) occurred mostly over the stromal thylakoid region and also appeared over the thylakoids passing through the pyrenoid. Labeling for photosystem II core components (D2 and a 45 kD Chl-binding protein), for phycobilisomes (allophycocyanin, and a 91 kD LCM linker) and for ATP synthase (β subunit) was predominantly present in the thylakoid region but not in the pyrenoid region of the chloroplast. Red light cells had increased labeling per thylakoid length for polypeptides of photosystem II and of phycobilisomes, while photosystem I density decreased, compared to white light cells. Conversely, green light cells had a decreased density of photosystem II and phycobilisome polypeptides, while photosystem I density changed little compared with white light cells. A comparison of the immunogold labeling results with data from spectroscopic methods and from rocket immunoelectrophoresis indicates that it can provide a quantitative measure of the relative amounts of protein components as well as their localization in specific organellar compartments.

  7. Integrated multiplatform method for in vitro quantitative assessment of cellular uptake for fluorescent polymer nanoparticles

    NASA Astrophysics Data System (ADS)

    Ferrari, Raffaele; Lupi, Monica; Falcetta, Francesca; Bigini, Paolo; Paolella, Katia; Fiordaliso, Fabio; Bisighini, Cinzia; Salmona, Mario; D'Incalci, Maurizio; Morbidelli, Massimo; Moscatelli, Davide; Ubezio, Paolo

    2014-01-01

    Studies of cellular internalization of nanoparticles (NPs) play a paramount role for the design of efficient drug delivery systems, but so far they lack a robust experimental technique able to quantify the NP uptake in terms of number of NPs internalized in each cell. In this work we propose a novel method which provides a quantitative evaluation of fluorescent NP uptake by combining flow cytometry and plate fluorimetry with measurements of number of cells. Single cell fluorescence signals measured by flow cytometry were associated with the number of internalized NPs, exploiting the observed linearity between average flow cytometric fluorescence and overall plate fluorimeter measures, and previous calibration of the microplate reader with serial dilutions of NPs. This precise calibration has been made possible by using biocompatible fluorescent NPs in the range of 20-300 nm with a narrow particle size distribution, functionalized with a covalently bonded dye, Rhodamine B, and synthesized via emulsion free-radical polymerization. We report the absolute number of NPs internalized in mouse mammary tumor cells (4T1) as a function of time for different NP dimensions and surface charges and at several exposure concentrations. The obtained results indicate that 4T1 cells incorporated 10^3-10^4 polymer NPs in a short time, reaching an intracellular concentration 15 times higher than the external one.
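
    A minimal sketch of the calibration logic described above, with entirely hypothetical numbers: a linear plate-reader calibration from NP serial dilutions converts a bulk fluorescence signal into a total NP count, which a parallel cell count turns into NPs per cell.

```python
# Minimal sketch (hypothetical values) of converting bulk fluorescence to NPs per cell.
import numpy as np

# Calibration: known NP numbers vs. plate-reader fluorescence (arbitrary units)
np_numbers   = np.array([1e9, 2e9, 4e9, 8e9])
fluorescence = np.array([110.0, 215.0, 440.0, 870.0])
slope, intercept = np.polyfit(fluorescence, np_numbers, 1)   # linear calibration

# Cell sample: bulk plate-reader signal and cell count from a parallel measurement
sample_signal = 320.0
n_cells       = 2.5e5
total_nps     = slope * sample_signal + intercept
print(f"~{total_nps / n_cells:.0f} NPs per cell")            # order 10^3-10^4, as in the study
```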

  8. Quantitative method of analyzing the interaction of slightly selective radioligands with multiple receptor subtypes

    SciTech Connect

    McGonigle, P.; Neve, K.A.; Molinoff, P.B.

    1986-10-01

    Subclasses of receptors exist for most neurotransmitters. Frequently, two subtypes of receptors coexist in the same tissue and, in some cases, they mediate the same physiological response. In tissues with two classes of binding sites for a given hormone, an estimate of the proportion of each class of binding sites is obtained by inhibiting the binding of a single concentration of a radioligand with a selective unlabeled ligand. Accurate estimates of the density of each class of receptors will only be obtained, however, if the radioligand is entirely nonselective. Selectivity of just 2- to 3-fold can markedly influence the results of subtype analysis. The conclusion that a radioligand is nonselective is usually based on the results of a saturation binding curve. If Scatchard analysis results in a linear plot, the radioligand is nonselective. Scatchard analysis cannot distinguish between a radioligand that is nonselective and one that is slightly selective. The use of a slightly selective radioligand can lead to errors of 50% or more, depending on the concentration of the radioligand relative to the Kd values of the two classes of sites. A new method has been developed that can be used to quantitate 2- to 3-fold differences in the affinity of two distinct classes of binding sites for a radioligand. This approach requires that a series of inhibition experiments with a selective unlabeled ligand be performed in the presence of increasing concentrations of the radioligand. Analysis of the resulting inhibition curves, utilizing the mathematical modeling program MLAB on the PROPHET system, yields accurate estimates of the density of each class of receptor as well as the affinity of each receptor for the labeled and unlabeled ligands. This approach was used to determine whether 125I-iodopindolol shows selectivity for beta 1- or beta 2-adrenergic receptors.
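
    The sketch below illustrates the general idea of fitting inhibition curves measured at several radioligand concentrations to a two-site competitive binding model. It uses SciPy rather than the MLAB/PROPHET software named in the abstract, and the simulated data and all parameter values are hypothetical.

```python
# Minimal sketch (not the original MLAB/PROPHET analysis): simultaneous fit of
# inhibition curves at several radioligand concentrations to a two-site model.
import numpy as np
from scipy.optimize import curve_fit

def two_site(X, R1, R2, Kd1, Kd2, Ki1, Ki2):
    L, I = X                                 # radioligand and inhibitor concentrations (nM)
    b1 = R1 * L / (L + Kd1 * (1 + I / Ki1))  # binding to site 1
    b2 = R2 * L / (L + Kd2 * (1 + I / Ki2))  # binding to site 2
    return b1 + b2

rng = np.random.default_rng(1)
L = np.repeat([0.05, 0.2, 0.8], 12)                         # three radioligand concentrations
I = np.tile(np.logspace(-2, 3, 12), 3)                      # inhibitor dilution series
true = dict(R1=100, R2=50, Kd1=0.1, Kd2=0.3, Ki1=1.0, Ki2=100.0)
bound = two_site((L, I), **true) + rng.normal(0, 1.0, L.size)  # simulated binding data

popt, _ = curve_fit(two_site, (L, I), bound,
                    p0=[80, 80, 0.2, 0.2, 10, 10], maxfev=20000)
print(dict(zip(["R1", "R2", "Kd1", "Kd2", "Ki1", "Ki2"], np.round(popt, 3))))
```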

  9. Development of a quantitative diagnostic method of estrogen receptor expression levels by immunohistochemistry using organic fluorescent material-assembled nanoparticles

    SciTech Connect

    Gonda, Kohsuke; Miyashita, Minoru; Watanabe, Mika; Takahashi, Yayoi; Goda, Hideki; Okada, Hisatake; Nakano, Yasushi; Tada, Hiroshi; Amari, Masakazu; Ohuchi, Noriaki; Department of Surgical Oncology, Graduate School of Medicine, Tohoku University, Seiryo-machi, Aoba-ku, Sendai 980-8574

    2012-09-28

    Highlights: • Organic fluorescent material-assembled nanoparticles for IHC were prepared. • New nanoparticle fluorescent intensity was 10.2-fold greater than Qdot655. • Nanoparticle staining analyzed a wide range of ER expression levels in tissue. • Nanoparticle staining enhanced the quantitative sensitivity for ER diagnosis. -- Abstract: The detection of estrogen receptors (ERs) by immunohistochemistry (IHC) using 3,3'-diaminobenzidine (DAB) is slightly weak as a prognostic marker, but it is essential to the application of endocrine therapy, such as antiestrogen tamoxifen-based therapy. IHC using DAB is a poor quantitative method because horseradish peroxidase (HRP) activity depends on reaction time, temperature and substrate concentration. However, IHC using fluorescent material provides an effective method to quantitatively use IHC because the signal intensity is proportional to the intensity of the photon excitation energy. However, the high level of autofluorescence has impeded the development of quantitative IHC using fluorescence. We developed organic fluorescent material (tetramethylrhodamine)-assembled nanoparticles for IHC. Tissue autofluorescence is comparable to the fluorescence intensity of quantum dots, which are the most representative fluorescent nanoparticles. The fluorescent intensity of our novel nanoparticles was 10.2-fold greater than quantum dots, and they did not bind non-specifically to breast cancer tissues due to the polyethylene glycol chain that coated their surfaces. Therefore, the fluorescent intensity of our nanoparticles significantly exceeded autofluorescence, which produced a significantly higher signal-to-noise ratio on IHC-imaged cancer tissues than previous methods. Moreover, immunostaining data from our nanoparticle fluorescent IHC and IHC with DAB were compared in the same region of adjacent tissue sections to quantitatively examine the two methods. The results demonstrated that our nanoparticle staining analyzed a wide range of ER expression levels with higher accuracy and quantitative sensitivity than DAB staining. This enhancement in the diagnostic accuracy and sensitivity for ERs using our immunostaining method will improve the prediction of responses to therapies that target ERs and progesterone receptors that are induced by a downstream ER signal.

  10. Test Characteristics of Urinary Biomarkers Depend on Quantitation Method in Acute Kidney Injury

    PubMed Central

    Md Ralib, Azrina; Pickering, John W.; Shaw, Geoffrey M.; Devarajan, Prasad; Edelstein, Charles L.; Bonventre, Joseph V.

    2012-01-01

    The concentration of urine influences the concentration of urinary biomarkers of AKI. Whether normalization to urinary creatinine concentration, as commonly performed to quantitate albuminuria, is the best method to account for variations in urinary biomarker concentration among patients in the intensive care unit is unknown. Here, we compared the diagnostic and prognostic performance of three methods of biomarker quantitation: absolute concentration, biomarker normalized to urinary creatinine concentration, and biomarker excretion rate. We measured urinary concentrations of alkaline phosphatase, γ-glutamyl transpeptidase, cystatin C, neutrophil gelatinase–associated lipocalin, kidney injury molecule–1, and IL-18 in 528 patients on admission and after 12 and 24 hours. Absolute concentration best diagnosed AKI on admission, but normalized concentrations best predicted death, dialysis, or subsequent development of AKI. Excretion rate on admission did not diagnose or predict outcomes better than either absolute or normalized concentration. Estimated 24-hour biomarker excretion associated with AKI severity, and for neutrophil gelatinase–associated lipocalin and cystatin C, with poorer survival. In summary, normalization to urinary creatinine concentration improves the prediction of incipient AKI and outcome but provides no advantage in diagnosing established AKI. The ideal method for quantitating biomarkers of urinary AKI depends on the outcome of interest. PMID:22095948
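
    A minimal sketch of the comparison performed in the study, on synthetic data: the same biomarker concentration is expressed as an absolute value, normalized to urinary creatinine, and converted to an excretion rate, and each variant is scored against AKI status by ROC AUC. Distributions and effect sizes are invented.

```python
# Minimal sketch (synthetic data) of the three biomarker quantitation methods.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 200
aki  = rng.integers(0, 2, n)                                  # 1 = AKI, 0 = no AKI (synthetic)
conc = rng.lognormal(mean=2 + 0.8 * aki, sigma=0.7, size=n)   # biomarker concentration, ng/mL
ucr  = rng.lognormal(mean=0.0, sigma=0.5, size=n)             # urinary creatinine, mmol/L
flow = rng.lognormal(mean=4.0, sigma=0.4, size=n)             # urine flow, mL/h

absolute   = conc
normalized = conc / ucr                                       # per mmol creatinine
excretion  = conc * flow                                      # estimated excretion rate, ng/h

for name, x in [("absolute", absolute), ("normalized", normalized), ("excretion rate", excretion)]:
    print(f"{name:>14}: AUC = {roc_auc_score(aki, x):.2f}")
```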

  11. A method of quantitative risk assessment for transmission pipeline carrying natural gas.

    PubMed

    Jo, Young-Do; Ahn, Bum Jong

    2005-08-31

    Regulatory authorities in many countries are moving away from prescriptive approaches for keeping natural gas pipelines safe. As an alternative, risk management based on a quantitative assessment is being considered to improve the level of safety. This paper focuses on the development of a simplified method for the quantitative risk assessment for natural gas pipelines and introduces parameters of fatal length and cumulative fatal length. The fatal length is defined as the integrated fatality along the pipeline associated with hypothetical accidents. The cumulative fatal length is defined as the section of pipeline in which an accident leads to N or more fatalities. These parameters can be estimated easily by using pipeline geometry and population density information from a geographic information system (GIS). To demonstrate the proposed method, individual and societal risks for a sample pipeline have been estimated from the historical data of the European Gas Pipeline Incident Data Group and BG Transco. With currently acceptable criteria taken into account for individual risk, the minimum proximity of the pipeline to occupied buildings is approximately proportional to the square root of the operating pressure of the pipeline. The proposed method of quantitative risk assessment may be useful for risk management during the planning and building stages of a new pipeline, and modification of a buried pipeline. PMID:15913887
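
    A toy illustration of the fatal-length idea (hypothetical failure rate and a simple step lethality model, not the paper's consequence equations): the fatality probability at a fixed receptor is integrated over position along the pipeline and multiplied by the failure rate to give an individual risk.

```python
# Toy fatal-length and individual-risk calculation (all values hypothetical).
import numpy as np

failure_rate = 3e-4            # pipeline failures per km per year (hypothetical)
h = 50.0                       # perpendicular distance from receptor to pipeline, m
r_fatal = 200.0                # hypothetical 100%-lethality radius of a rupture fire, m

x = np.linspace(-2000, 2000, 4001)           # position along the pipeline, m
dx = x[1] - x[0]
dist = np.hypot(x, h)                        # distance from each pipe segment to the receptor
lethality = (dist <= r_fatal).astype(float)  # step lethality model for simplicity

fatal_length_km = lethality.sum() * dx / 1000.0
individual_risk = failure_rate * fatal_length_km
print(f"fatal length = {fatal_length_km:.3f} km, individual risk = {individual_risk:.2e} /yr")
```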

  12. Task 4.4 - development of supercritical fluid extraction methods for the quantitation of sulfur forms in coal

    SciTech Connect

    Timpe, R.C.

    1995-04-01

    Development of advanced fuel forms depends on having reliable quantitative methods for their analysis. Determination of the true chemical forms of sulfur in coal is necessary to develop more effective methods to reduce sulfur content. Past work at the Energy & Environmental Research Center (EERC) indicates that sulfur chemistry has broad implications in combustion, gasification, pyrolysis, liquefaction, and coal-cleaning processes. Current analytical methods are inadequate for accurately measuring sulfur forms in coal. This task was concerned with developing methods to quantitate and identify major sulfur forms in coal based on direct measurement (as opposed to present techniques based on indirect measurement and difference values). The focus was on the forms that were least understood and for which the analytical methods have been the poorest, i.e., organic and elemental sulfur. Improved measurement techniques for sulfatic and pyritic sulfur also need to be developed. A secondary goal was to understand the interconversion of sulfur forms in coal during thermal processing. EERC has developed the first reliable analytical method for extracting and quantitating elemental sulfur from coal (1). This method has demonstrated that elemental sulfur can account for very little or as much as one-third of the so-called organic sulfur fraction. This method has disproved the generally accepted idea that elemental sulfur is associated with the organic fraction. A paper reporting the results obtained on this subject entitled "Determination of Elemental Sulfur in Coal by Supercritical Fluid Extraction and Gas Chromatography with Atomic Emission Detection" was published in Fuel (A).

  13. Quantitative measurements of optical absorption in CVD-grown ZnS with the phase-shift photothermal method

    NASA Astrophysics Data System (ADS)

    Luk'yanov, A. Yu.; Tyukaev, R. V.; Pogorelko, A. A.; Gavrishchuk, E. M.; Pereskokov, A. A.

    2003-01-01

    A technique of measuring weak bulk optical absorption at a wavelength of 10.6 μm in polished plates made of CVD-grown polycrystalline ZnS was developed on the basis of the phase-shift photothermal method. The results obtained are in good agreement with the data on the absorption of the samples under investigation, calculated from calorimetric and transmittance measurements. The possibility of highly sensitive local quantitative measurements of the optical-absorption coefficient using the phase-shift photothermal method is demonstrated.

  14. A New Quantitative Method for the Non-Invasive Documentation of Morphological Damage in Paintings Using RTI Surface Normals

    PubMed Central

    Manfredi, Marcello; Bearman, Greg; Williamson, Greg; Kronkright, Dale; Doehne, Eric; Jacobs, Megan; Marengo, Emilio

    2014-01-01

    In this paper we propose a reliable surface imaging method for the non-invasive detection of morphological changes in paintings. Usually, the evaluation and quantification of changes and defects results mostly from an optical and subjective assessment, through the comparison of the previous and subsequent state of conservation and by means of condition reports. Using quantitative Reflectance Transformation Imaging (RTI) we obtain detailed information on the geometry and morphology of the painting surface with a fast, precise and non-invasive method. Accurate and quantitative measurements of deterioration were acquired after the painting experienced artificial damage. Morphological changes were documented using normal vector images while the intensity map succeeded in highlighting, quantifying and describing the physical changes. We estimate that the technique can detect a morphological damage slightly smaller than 0.3 mm, which would be difficult to detect with the eye, considering the painting size. This non-invasive tool could be very useful, for example, to examine paintings and artwork before they travel on loan or during a restoration. The method lends itself to automated analysis of large images and datasets. Quantitative RTI thus eases the transition of extending human vision into the realm of measuring change over time. PMID:25010699
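
    A minimal sketch of how per-pixel change can be quantified from two RTI normal maps, here with synthetic normals and a simulated local deformation; the study's own processing pipeline may differ in detail.

```python
# Minimal sketch: per-pixel angular difference between surface-normal maps
# captured before and after damage (synthetic data).
import numpy as np

def unit(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

h, w = 64, 64
before = unit(np.dstack([np.zeros((h, w)), np.zeros((h, w)), np.ones((h, w))]))  # flat surface
after = before.copy()
after[20:30, 20:30] = unit(np.array([0.2, 0.0, 1.0]))        # simulated local deformation

cosang = np.clip(np.sum(before * after, axis=-1), -1.0, 1.0)
change_deg = np.degrees(np.arccos(cosang))                   # per-pixel angular change map
print(f"max angular change: {change_deg.max():.1f} deg over {np.count_nonzero(change_deg > 1)} px")
```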

  15. A new quantitative method for the non-invasive documentation of morphological damage in paintings using RTI surface normals.

    PubMed

    Manfredi, Marcello; Bearman, Greg; Williamson, Greg; Kronkright, Dale; Doehne, Eric; Jacobs, Megan; Marengo, Emilio

    2014-01-01

    In this paper we propose a reliable surface imaging method for the non-invasive detection of morphological changes in paintings. Usually, the evaluation and quantification of changes and defects results mostly from an optical and subjective assessment, through the comparison of the previous and subsequent state of conservation and by means of condition reports. Using quantitative Reflectance Transformation Imaging (RTI) we obtain detailed information on the geometry and morphology of the painting surface with a fast, precise and non-invasive method. Accurate and quantitative measurements of deterioration were acquired after the painting experienced artificial damage. Morphological changes were documented using normal vector images while the intensity map succeeded in highlighting, quantifying and describing the physical changes. We estimate that the technique can detect a morphological damage slightly smaller than 0.3 mm, which would be difficult to detect with the eye, considering the painting size. This non-invasive tool could be very useful, for example, to examine paintings and artwork before they travel on loan or during a restoration. The method lends itself to automated analysis of large images and datasets. Quantitative RTI thus eases the transition of extending human vision into the realm of measuring change over time. PMID:25010699

  16. Quantitative ultrasound of cortical bone in the femoral neck predicts femur strength: results of a pilot study.

    PubMed

    Grimal, Quentin; Grondin, Julien; Guérard, Sandra; Barkmann, Reinhard; Engelke, Klaus; Glüer, Claus-C; Laugier, Pascal

    2013-02-01

    A significant risk of femoral neck (FN) fracture exists for men and women with an areal bone mineral density (aBMD) higher than the osteoporotic range, as measured with dual-energy X-ray absorptiometry (DXA). Separately measuring the cortical and trabecular FN compartments and combining the results would likely be a critical aspect of enhancing the diagnostic capabilities of a new technique. Because the cortical shell determines a large part of FN strength, a novel quantitative ultrasound (QUS) technique that probes the FN cortical compartment was implemented. The sensitivity of the method to variations of FN cortical properties and FN strength was tested. Nine femurs (women, mean age 83 years) were subjected to QUS to measure the through-transmission time-of-flight (TOF) at the FN, and to mechanical tests to assess strength. Quantitative computed tomography (QCT) scans were performed to enable analysis of the dependence of TOF on bone parameters. DXA was also performed for reference. An ultrasound wave propagating circumferentially in the cortical shell was measured in all specimens. Its TOF was not influenced by the properties of the trabecular compartment. Averaged TOF for nine FN measurement positions/orientations was significantly correlated with strength (R² = 0.79) and FN cortical QCT variables: total BMD (R² = 0.54); regional BMD in the inferoanterior (R² = 0.90) and superoanterior (R² = 0.57) quadrants; and moment of inertia (R² = 0.71). The results of this study demonstrate that QUS can perform a targeted measurement of the FN cortical compartment. Because the method involves mechanical guided waves, the QUS variable is related to the geometric and material properties of the cortical shell (cortical thickness, tissue elasticity, and porosity). This work opens the way to a multimodal QUS assessment of the proximal femur, combining our approach targeting the cortical shell with the existing modality sensitive to the trabecular compartment. In vivo feasibility of our approach has to be confirmed with experimental data in patients. PMID:22915370

  17. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    ERIC Educational Resources Information Center

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  18. A field- and laboratory-based quantitative analysis of alluvium: Relating analytical results to TIMS data

    NASA Technical Reports Server (NTRS)

    Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.

    1995-01-01

    Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, what environmental variable(s), in the absence of bedrock, is/are responsible for influencing the spectral properties of the desert alluvial surface.
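
    The decorrelation stretch applied to the TIMS bands can be sketched as follows on synthetic, correlated band data: rotate to principal components, equalize their variances, and rotate back. This is a generic implementation, not the exact processing chain used for the 1994 data.

```python
# Minimal sketch of a decorrelation stretch on synthetic multiband data.
import numpy as np

rng = np.random.default_rng(3)
h, w, bands = 100, 100, 6
cov_true = 0.5 * np.ones((bands, bands)) + 0.5 * np.eye(bands)      # correlated bands
img = rng.multivariate_normal(np.zeros(bands), cov_true, size=(h, w))

flat = img.reshape(-1, bands)
mean = flat.mean(axis=0)
cov = np.cov(flat - mean, rowvar=False)
evals, evecs = np.linalg.eigh(cov)

# whiten along principal axes, then rotate back and restore a common variance
T = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T * np.sqrt(evals.mean())
stretched = ((flat - mean) @ T + mean).reshape(h, w, bands)
print(np.round(np.corrcoef(stretched.reshape(-1, bands), rowvar=False), 2))  # ~identity
```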

  19. Researchers' views on return of incidental genomic research results: qualitative and quantitative findings

    PubMed Central

    Klitzman, Robert; Appelbaum, Paul S.; Fyer, Abby; Martinez, Josue; Buquez, Brigitte; Wynn, Julia; Waldman, Cameron R.; Phelan, Jo; Parens, Erik; Chung, Wendy K.

    2013-01-01

    Purpose Comprehensive genomic analysis including exome and genome sequencing is increasingly being utilized in research studies, leading to the generation of incidental genetic findings. It is unclear how researchers plan to deal with incidental genetic findings. Methods We conducted a survey of the practices and attitudes of 234 members of the US genetic research community and performed qualitative semistructured interviews with 28 genomic researchers to understand their views and experiences with incidental genetic research findings. Results We found that 12% of the researchers had returned incidental genetic findings, and an additional 28% planned to do so. A large majority of researchers (95%) believe that incidental findings for highly penetrant disorders with immediate medical implications should be offered to research participants. However, there was no consensus on returning incidental results for other conditions varying in penetrance and medical actionability. Researchers raised concerns that the return of incidental findings would impose significant burdens on research and could potentially have deleterious effects on research participants if not performed well. Researchers identified assistance needed to enable effective, accurate return of incidental findings. Conclusion The majority of the researchers believe that research participants should have the option to receive at least some incidental genetic research results. PMID:23807616

  20. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background:  Nonrandomized studies typically cannot account for confounding from unmeasured factors.  Method:  A method is presented that exploits the recently-identified phenomenon of  “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors.  Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure.  Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results:  Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met.  Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations:  Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions:  To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward.  The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
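
    One plausible reading of the arithmetic described above, reduced to a toy calculation with hypothetical numbers (and ignoring the adjustment for any outcome association of the introduced variable), is sketched below; the paper itself should be consulted for the exact formulation.

```python
# Toy ACCE-style arithmetic (hypothetical numbers, one plausible reading of the
# description above; not the author's validated formula).
base_estimate      = 1.30   # treatment effect estimate from propensity score model 1
amplified_estimate = 1.42   # estimate from model 2, which adds a strong predictor of exposure
amplification      = 1.60   # estimated factor by which residual confounding is amplified in model 2

# Change in the estimate attributed to amplified residual confounding, scaled by
# the extra amplification introduced between the two models:
residual_confounding = (amplified_estimate - base_estimate) / (amplification - 1.0)
print(f"estimated residual confounding: {residual_confounding:.2f} (on the estimate scale)")
```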

  1. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    NASA Astrophysics Data System (ADS)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and entails overwhelming related healthcare costs. In recent works, we have developed a multi-spectral, frequency domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over pure ultrasonic or optical methods as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of the ultrasound propagation. These characteristics include both the speed of sound (SOS) and the broadband ultrasonic attenuation (BUA). To test the method's quantitative predictions, we have constructed a combined ultrasound and photoacoustic setup. Here, we experimentally present this dual-modality system and compare the two methods on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into the bone structure and functional status.
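
    A minimal sketch of the two ultrasonic quantities named above, computed from synthetic through-transmission signals: SOS from the time-of-flight difference against a water reference (substitution method) and BUA from the slope of the attenuation spectrum. Signal shapes, thickness, and band limits are all hypothetical.

```python
# Minimal sketch: SOS and BUA from synthetic through-transmission signals.
import numpy as np

fs = 10e6                                     # sampling rate, Hz
t = np.arange(0, 200e-6, 1 / fs)
d = 0.03                                      # sample thickness, m
c_water = 1480.0                              # reference (water) speed of sound, m/s

def pulse(t0, amp, width, f0):                # toy broadband pulse
    return amp * np.exp(-((t - t0) / width) ** 2) * np.sin(2 * np.pi * f0 * (t - t0))

ref    = pulse(60e-6, 1.0, 2e-6, 0.5e6)       # through water only
sample = pulse(54e-6, 0.3, 3e-6, 0.4e6)       # through water + specimen: faster, attenuated

dt = (np.argmax(ref) - np.argmax(sample)) / fs            # time-of-flight difference, s
sos = 1.0 / (1.0 / c_water - dt / d)                      # substitution method

freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 0.2e6) & (freqs < 0.6e6)
ref_spec = np.abs(np.fft.rfft(ref))[band]
sam_spec = np.abs(np.fft.rfft(sample))[band]
att_db = 20 * np.log10(ref_spec / sam_spec)               # attenuation spectrum in the band
bua = np.polyfit(freqs[band] / 1e6, att_db, 1)[0]         # slope, dB/MHz
print(f"SOS ~ {sos:.0f} m/s, BUA ~ {bua:.1f} dB/MHz")
```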

  2. Evaluation of a High Intensity Focused Ultrasound-Immobilized Trypsin Digestion and 18O-Labeling Method for Quantitative Proteomics

    SciTech Connect

    Lopez-Ferrer, Daniel; Hixson, Kim K.; Smallwood, Heather S.; Squier, Thomas C.; Petritis, Konstantinos; Smith, Richard D.

    2009-08-01

    A new method that uses immobilized trypsin concomitant with ultrasonic irradiation results in ultra-rapid digestion and thorough 18O labeling for quantitative protein comparisons. The reproducible and highly efficient method provided effective digestions in <1 min and minimized the amount of enzyme required compared to traditional methods. This method was demonstrated for digestion of both simple and complex protein mixtures, including bovine serum albumin, a global proteome extract from bacteria Shewanella oneidensis, and mouse plasma, as well as for the labeling of complex protein mixtures, which validated the application of this method for differential proteomic measurements. This approach is simple, reproducible, cost effective, and rapid, and thus well-suited for automation.

  3. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Zakharov, Sergei M.

    1997-01-10

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation.
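
    As an illustration of what a lower confidence limit on the probability of failure-free operation looks like, the sketch below uses the standard one-sided binomial (Clopper-Pearson) bound with invented test counts; the authors' actual reliability model is not described in this abstract and may differ.

```python
# Illustrative lower confidence limit on reliability from pass/fail test data
# (hypothetical counts; standard Clopper-Pearson bound, not the authors' model).
from scipy.stats import beta

def reliability_lower_bound(n_tests, n_failures, confidence=0.95):
    successes = n_tests - n_failures
    if successes == 0:
        return 0.0
    return beta.ppf(1 - confidence, successes, n_failures + 1)

for n, f in [(25, 0), (25, 1), (100, 2)]:          # hypothetical ground-test outcomes
    print(f"n={n:3d}, failures={f}: R_lower(95%) = {reliability_lower_bound(n, f):.3f}")
```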

  4. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Zakharov, S.M.

    1997-01-01

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation. © 1997 American Institute of Physics.

  5. The effect of hydraulic loading on bioclogging in porous media: Quantitative results from tomographic imaging

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Davit, Y.; Connolly, J. M.; Gerlach, R.; Wood, B. D.; Wildenschild, D.

    2013-12-01

    Biofilm growth in porous media is generally surface-attached and pore-filling. A direct result of biofilm formation is the clogging of pore space available for fluid transport. This clogging effect has come to be termed bioclogging. In physical experiments, bioclogging manifests as an increase in differential pressure across experimental specimens, and traditional investigations of bioclogging in 3D porous media have included measurements of bulk differential pressure changes in order to evaluate changes in permeability or hydraulic conductivity. Due to the opaque nature of most types of porous media, visualization of bioclogging has been limited to the use of 2D or pseudo-3D micromodels. As a result, bioclogging models have relied on parameters derived from 2D visualization experiments. Results from these studies have shown that even small changes in pore morphology associated with biofilm growth can significantly alter fluid hydrodynamics. Recent advances in biofilm imaging facilitate the investigation of biofilm growth and bioclogging in porous media through the implementation of x-ray computed microtomography (CMT) and a functional contrast agent. We used barium sulfate as the contrast agent, a particle suspension that fills all pore space available to fluid flow. Utilization of x-ray CMT with a barium sulfate contrast agent facilitates the examination of biofilm growth at the micron scale throughout experimental porous media growth reactors. This method has been applied to investigate changes in macropore morphology associated with biofilm growth. Applied fluid flow rates correspond to initial Reynolds numbers ranging from 0.1 to 100. Results include direct comparison of measured changes in porosity and hydraulic conductivity as calculated from differential pressure measurements versus images. In addition, parameters such as biofilm thickness, reactive surface area, and attachment surface area will be presented in order to help characterize biofilm structure at each of the investigated flow rates.

  6. Quantitative PCR method for evaluating freshness of whiting (Merlangius merlangus) and plaice (Pleuronectes platessa).

    PubMed

    Duflos, Guillaume; Theraulaz, Laurence; Giordano, Gerard; Mejean, Vincent; Malle, Pierre

    2010-07-01

    We have developed a method for rapid quantification of fish spoilage bacteria based on quantitative PCR with degenerate oligonucleotides that hybridize on the torA gene coding for trimethylamine N-oxide reductase, one of the major bacterial enzymes in fish spoilage. To show the utility of this gene, we used conventional PCR with DNA extracts from whiting (Merlangius merlangus) and plaice (Pleuronectes platessa) stored in ice. Quantitative PCR showed that the number of copies of the torA gene, i.e., the number of spoilage bacteria, increases with length of storage. This approach can therefore be used to evaluate freshness for the two fish species studied (whiting and plaice). PMID:20615351
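
    A minimal sketch of absolute qPCR quantification from a standard curve, with invented Cq values: known torA copy-number standards define the line, and sample Cq values are converted to copy numbers from it.

```python
# Minimal sketch: absolute quantification from a qPCR standard curve (synthetic values).
import numpy as np

std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])        # copies per reaction (standards)
std_cq     = np.array([33.1, 29.8, 26.4, 23.1, 19.7])   # measured Cq of the standards (hypothetical)

slope, intercept = np.polyfit(np.log10(std_copies), std_cq, 1)
efficiency = 10 ** (-1 / slope) - 1                      # amplification efficiency
print(f"slope = {slope:.2f}, efficiency = {100 * efficiency:.0f} %")

sample_cq = np.array([27.5, 24.2])                       # hypothetical samples, e.g. early vs. late storage
copies = 10 ** ((sample_cq - intercept) / slope)
print("estimated torA copies per reaction:", np.round(copies, 0))
```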

  7. Quantitative and qualitative methods in UK health research: then, now and...?

    PubMed

    McPherson, K; Leydon, G

    2002-09-01

    This paper examines the current status of qualitative and quantitative research in the context of UK (public) health research in cancer. It is proposed that barren competition between qualitative and quantitative methods is inevitable, but that effective synergy between them continues to be essential to research excellence. The perceived methodological utility, with respect to understanding residual uncertainties, can account for the status accorded various research techniques and these will help to explain shifts witnessed in recent years and contribute towards an understanding of what can be realistically expected in terms of future progress. It is argued that the methodological debate, though familiar to many, is worthy of rearticulation in the context of cancer research where the psychosocial aspects of living with a cancer and the related complexity of providing appropriate cancer care are being addressed across Europe, as evidenced in recent directions in policy and research. PMID:12296843

  8. Using quantitative and qualitative data in health services research - what happens when mixed method findings conflict? [ISRCTN61522618]

    PubMed Central

    Moffatt, Suzanne; White, Martin; Mackintosh, Joan; Howel, Denise

    2006-01-01

    Background In this methodological paper we document the interpretation of a mixed methods study and outline an approach to dealing with apparent discrepancies between qualitative and quantitative research data in a pilot study evaluating whether welfare rights advice has an impact on health and social outcomes among a population aged 60 and over. Methods Quantitative and qualitative data were collected contemporaneously. Quantitative data were collected from 126 men and women aged over 60 within a randomised controlled trial. Participants received a full welfare benefits assessment which successfully identified additional financial and non-financial resources for 60% of them. A range of demographic, health and social outcome measures were assessed at baseline, 6, 12 and 24 month follow up. Qualitative data were collected from a sub-sample of 25 participants purposively selected to take part in individual interviews to examine the perceived impact of welfare rights advice. Results Separate analysis of the quantitative and qualitative data revealed discrepant findings. The quantitative data showed little evidence of significant differences of a size that would be of practical or clinical interest, suggesting that the intervention had no impact on these outcome measures. The qualitative data suggested wide-ranging impacts, indicating that the intervention had a positive effect. Six ways of further exploring these data were considered: (i) treating the methods as fundamentally different; (ii) exploring the methodological rigour of each component; (iii) exploring dataset comparability; (iv) collecting further data and making further comparisons; (v) exploring the process of the intervention; and (vi) exploring whether the outcomes of the two components match. Conclusion The study demonstrates how using mixed methods can lead to different and sometimes conflicting accounts and, using this six step approach, how such discrepancies can be harnessed to interrogate each dataset more fully. Not only does this enhance the robustness of the study, it may lead to different conclusions from those that would have been drawn through relying on one method alone and demonstrates the value of collecting both types of data within a single study. More widespread use of mixed methods in trials of complex interventions is likely to enhance the overall quality of the evidence base. PMID:16524479

  9. Quantitative elastography of skin and skin lesion using phase-sensitive OCT (PhS-OCT) and surface wave method

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang

    2012-01-01

    This paper presents the combination of a phase sensitive optical coherence tomography (PhS-OCT) imaging system and the surface wave method to achieve quantitative evaluation and elastography of the mechanical properties of in vivo human skin. PhS-OCT measures the surface acoustic waves (SAWs) generated by impulse stimulation from a home-made shaker and provides B-frame images of the sample. The surface wave phase velocity dispersion curves were calculated, from which the elasticity of different skin layers was determined. Combining phase velocities from two adjacent locations generates a quantitative elastogram of the sample. The experimental results agree well with theoretical expectations and may offer potential use in clinical situations.

  10. Path Integrals and Exotic Options: Methods and Numerical Results

    NASA Astrophysics Data System (ADS)

    Bormetti, G.; Montagna, G.; Moreni, N.; Nicrosini, O.

    2005-09-01

    In the framework of Black-Scholes-Merton model of financial derivatives, a path integral approach to option pricing is presented. A general formula to price path dependent options on multidimensional and correlated underlying assets is obtained and implemented by means of various flexible and efficient algorithms. As an example, we detail the case of Asian call options. The numerical results are compared with those obtained with other procedures used in quantitative finance and found to be in good agreement. In particular, when pricing at the money (ATM) and out of the money (OTM) options, path integral exhibits competitive performances.
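
    For comparison purposes, the sketch below prices the same kind of instrument (an arithmetic-average Asian call under Black-Scholes dynamics) with plain Monte Carlo simulation. This is a conventional numerical benchmark, not the paper's path-integral algorithm, and all parameter values are hypothetical.

```python
# Monte Carlo pricing of an arithmetic-average Asian call (hypothetical parameters).
import numpy as np

rng = np.random.default_rng(4)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_steps, n_paths = 252, 100_000
dt = T / n_steps

z = rng.standard_normal((n_paths, n_steps))
log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
S = S0 * np.exp(log_paths)                         # simulated price paths

payoff = np.maximum(S.mean(axis=1) - K, 0.0)       # arithmetic-average Asian call payoff
price = np.exp(-r * T) * payoff.mean()
stderr = np.exp(-r * T) * payoff.std(ddof=1) / np.sqrt(n_paths)
print(f"Asian call price ~ {price:.3f} +/- {stderr:.3f}")
```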

  11. Three-dimensional quantitative analysis of adhesive remnants and enamel loss resulting from debonding orthodontic molar tubes

    PubMed Central

    2014-01-01

    Aims Presenting a new method for direct, quantitative analysis of the enamel surface, and measurement of adhesive remnants and enamel loss resulting from debonding molar tubes. Material and methods Buccal surfaces of fifteen extracted human molars were directly scanned with an optical blue-light 3D scanner to the nearest 2 μm. After 20 s of etching, molar tubes were bonded and, after 24 h of storage in 0.9% saline, debonded. Then 3D scanning was repeated. Superimposition and comparison were performed, and shape alterations of the entire objects were analyzed using specialized computer software. Residual adhesive heights as well as enamel loss depths were obtained for the entire buccal surfaces. Residual adhesive volume and enamel loss volume were calculated for every tooth. Results The maximum height of adhesive remaining on the enamel surface was 0.76 mm, and the volume on particular teeth ranged from 0.047 mm3 to 4.16 mm3. The median adhesive remnant volume was 0.988 mm3. Mean depths of enamel loss for particular teeth ranged from 0.0076 mm to 0.0416 mm. The highest maximum depth of enamel loss was 0.207 mm. The median volume of enamel loss was 0.104 mm3 and the maximum volume was 1.484 mm3. Conclusions Blue-light 3D scanning is able to provide direct precise scans of the enamel surface, which can be superimposed in order to calculate shape alterations. Debonding molar tubes leaves a certain amount of adhesive remnants on the enamel; however, the interface fracture pattern varies for particular teeth, and areas of enamel loss are present as well. PMID:25208969

  12. Improved methods for capture, extraction, and quantitative assay of environmental DNA from Asian bigheaded carp (Hypophthalmichthys spp.).

    PubMed

    Turner, Cameron R; Miller, Derryl J; Coyne, Kathryn J; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207

  13. Improved Methods for Capture, Extraction, and Quantitative Assay of Environmental DNA from Asian Bigheaded Carp (Hypophthalmichthys spp.)

    PubMed Central

    Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207

  14. Continuously growing rodent molars result from a predictable quantitative evolutionary change over 50 million years.

    PubMed

    Tapaltsyan, Vagan; Eronen, Jussi T; Lawing, A Michelle; Sharir, Amnon; Janis, Christine; Jernvall, Jukka; Klein, Ophir D

    2015-05-01

    The fossil record is widely informative about evolution, but fossils are not systematically used to study the evolution of stem-cell-driven renewal. Here, we examined evolution of the continuous growth (hypselodonty) of rodent molar teeth, which is fuelled by the presence of dental stem cells. We studied occurrences of 3,500 North American rodent fossils, ranging from 50 million years ago (mya) to 2 mya. We examined changes in molar height to determine whether evolution of hypselodonty shows distinct patterns in the fossil record, and we found that hypselodont taxa emerged through intermediate forms of increasing crown height. Next, we designed a Markov simulation model, which replicated molar height increases throughout the Cenozoic and, moreover, evolution of hypselodonty. Thus, by extension, the retention of the adult stem cell niche appears to be a predictable quantitative rather than a stochastic qualitative process. Our analyses predict that hypselodonty will eventually become the dominant phenotype. PMID:25921530

  15. A simple regression-based method to map quantitative trait loci underlying function-valued phenotypes.

    PubMed

    Kwak, Il-Youp; Moore, Candace R; Spalding, Edgar P; Broman, Karl W

    2014-08-01

    Most statistical methods for quantitative trait loci (QTL) mapping focus on a single phenotype. However, multiple phenotypes are commonly measured, and recent technological advances have greatly simplified the automated acquisition of numerous phenotypes, including function-valued phenotypes, such as growth measured over time. While methods exist for QTL mapping with function-valued phenotypes, they are generally computationally intensive and focus on single-QTL models. We propose two simple, fast methods that maintain high power and precision and are amenable to extensions with multiple-QTL models using a penalized likelihood approach. After identifying multiple QTL by these approaches, we can view the function-valued QTL effects to provide a deeper understanding of the underlying processes. Our methods have been implemented as a package for R, funqtl. PMID:24931408
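
    The funqtl package itself is an R implementation; purely to illustrate the underlying idea of a regression-based approach (reduce each function-valued phenotype to a summary, then scan markers by regression), here is a small self-contained Python sketch on simulated data. The marker count, sample size, and effect placement are arbitrary, and this is not the authors' method.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_ind, n_markers, n_times = 200, 50, 10

    # Simulated genotypes (0/1/2 copies) and growth curves measured at n_times points.
    geno = rng.integers(0, 3, size=(n_ind, n_markers))
    times = np.linspace(0.0, 1.0, n_times)
    signal = 1.0 + 0.3 * geno[:, [20]] * times          # marker 20 changes the growth slope
    pheno = signal + rng.normal(scale=0.5, size=(n_ind, n_times))

    # Step 1: collapse each function-valued phenotype to a scalar summary (the fitted slope).
    slopes = np.polyfit(times, pheno.T, deg=1)[0]

    # Step 2: single-marker regression scan; report the strongest association.
    scores = [-np.log10(stats.linregress(geno[:, m], slopes).pvalue) for m in range(n_markers)]
    print("strongest marker:", int(np.argmax(scores)), "-log10(p):", round(max(scores), 1))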

  16. Quantitative determination of pertechnetate by electrochemical methods. Electrochemistry of technetium radiopharmaceutical analogs

    SciTech Connect

    Lewis, J.Y.

    1983-01-01

    Electroanalytical methods are described for obtaining quantitative information about pertechnetate in various matrices and qualitative information about analogs to commercially used 99mTc skeletal imaging agents. An anodic stripping voltammetry method for determining pertechnetate is described that addresses the problem of multiple asymmetrical stripping waves occurring at solid electrodes. Lower detection limits are needed, however, for determining pertechnetate in two matrices of specific interest, i.e., 99Mo/99mTc generator eluents and environmental samples. A detailed discussion of a search for a well-behaved pertechnetate reduction wave is followed by a description of the liquid chromatography/reductive electrochemical detection (LCEC) method developed for determining pertechnetate. The development of electroanalytical methods suitable for characterizing technetium methylene diphosphonate and technetium hydroxyethylidene reaction mixtures prepared by reduction of pertechnetate with NaBH4 is presented.

  17. A quantitative method for the evaluation of three-dimensional structure of temporal bone pneumatization

    PubMed Central

    Hill, Cheryl A.; Richtsmeier, Joan T.

    2010-01-01

    Temporal bone pneumatization has been included in lists of characters used in phylogenetic analyses of human evolution. While studies suggest that the extent of pneumatization has decreased over the course of human evolution, little is known about the processes underlying these changes or their significance. In short, reasons for the observed reduction and the potential reorganization within pneumatized spaces are unknown. Technological limitations have limited previous analyses of pneumatization in extant and fossil species to qualitative observations of the extent of temporal bone pneumatization. In this paper, we introduce a novel application of quantitative methods developed for the study of trabecular bone to the analysis of pneumatized spaces of the temporal bone. This method utilizes high-resolution X-ray computed tomography (HRXCT) images and quantitative software to estimate three-dimensional parameters (bone volume fractions, anisotropy, and trabecular thickness) of bone structure within defined units of pneumatized spaces. We apply this approach in an analysis of temporal bones of diverse but related primate species, Gorilla gorilla, Pan troglodytes, Homo sapiens, and Papio hamadryas anubis, to illustrate the potential of these methods. In demonstrating the utility of these methods, we show that there are interspecific differences in the bone structure of pneumatized spaces, perhaps reflecting changes in the localized growth dynamics, location of muscle attachments, encephalization, or basicranial flexion. PMID:18715622

  18. Development of a HPLC Method for the Quantitative Determination of Capsaicin in Collagen Sponge

    PubMed Central

    Guo, Chun-Lian; Chen, Hong-Ying; Cui, Bi-Ling; Chen, Yu-Huan; Zhou, Yan-Fang; Peng, Xin-Sheng; Wang, Qin

    2015-01-01

    Controlling the concentration of drugs in pharmaceutical products is essential to patient safety. In this study, a simple and sensitive HPLC method was developed to quantitatively analyze capsaicin in collagen sponge. Capsaicin was extracted from the sponge for 30 min by ultrasonic extraction with methanol as the solvent. The chromatographic separation used an isocratic mobile phase of acetonitrile-water (70:30) at a flow rate of 1 mL/min with detection at 280 nm. Capsaicin was successfully separated with good linearity (regression equation A = 9.7182C + 0.8547; R2 = 1.0) and excellent recovery (99.72%). The mean capsaicin concentration in collagen sponge was 49.32 mg/g (RSD = 1.30%; n = 3). In conclusion, the ultrasonic extraction method is simple and the extraction efficiency is high. The HPLC assay has excellent sensitivity and specificity and is a convenient method for capsaicin detection in collagen sponge. This is the first report of the quantitative analysis of capsaicin in collagen sponge. PMID:26612986
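
    As a worked illustration of how a calibration line such as the one reported above (A = 9.7182C + 0.8547) is inverted to back-calculate a content in mg/g, the sketch below uses a hypothetical peak area, extract volume, and sponge mass; only the slope and intercept come from the abstract, and the unit assumptions are stated in the comments.

    def capsaicin_conc(peak_area, slope=9.7182, intercept=0.8547):
        """Invert the reported calibration line A = slope*C + intercept to recover C."""
        return (peak_area - intercept) / slope

    # Hypothetical measured peak area and work-up parameters (not from the paper).
    area = 120.0
    conc_in_extract = capsaicin_conc(area)            # assumed to be in ug/mL of extract
    extract_volume_ml, sponge_mass_g = 10.0, 0.025    # assumed extraction volume and sample mass
    content_mg_per_g = conc_in_extract * extract_volume_ml / 1000 / sponge_mass_g
    print(f"extract concentration: {conc_in_extract:.2f} ug/mL")
    print(f"capsaicin content: {content_mg_per_g:.2f} mg/g sponge")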

  19. The quantitative and qualitative recovery of Campylobacter from raw poultry using USDA and Health Canada methods.

    PubMed

    Sproston, E L; Carrillo, C D; Boulter-Bitzer, J

    2014-12-01

    Harmonisation of methods between Canadian government agencies is essential to accurately assess and compare the prevalence and concentrations present on retail poultry intended for human consumption. The standard qualitative procedure used by Health Canada differs from that used by the USDA for both quantitative and qualitative methods. A comparison of three methods was performed on raw poultry samples obtained from an abattoir to determine if one method is superior to the others in isolating Campylobacter from chicken carcass rinses. The average percentage of positive samples was 34.72% (95% CI, 29.2-40.2), 39.24% (95% CI, 33.6-44.9), and 39.93% (95% CI, 34.3-45.6) for the US direct plating method, the US enrichment method, and the Health Canada enrichment method, respectively. Overall there were significant differences when comparing either of the enrichment methods to the direct plating method using McNemar's chi-squared test. On comparison of weekly data (Fisher's exact test), direct plating was only inferior to the enrichment methods on a single occasion. Direct plating is important for enumeration and establishing the concentration of Campylobacter present on raw poultry. However, enrichment methods are also vital to identify positive samples where concentrations are below the detection limit for direct plating. PMID:25084671

  20. Broad-spectrum detection and quantitation methods of Soil-borne cereal mosaic virus isolates.

    PubMed

    Vaïanopoulos, Céline; Legrève, Anne; Moreau, Virginie; Bragard, Claude

    2009-08-01

    A broad-spectrum reverse transcription-polymerase chain reaction (RT-PCR) protocol was developed for detecting Soil-borne cereal mosaic virus (SBCMV) isolates, responsible for mosaic diseases in Europe, using primers targeting the highly conserved 3'-untranslated region of RNA-1 and RNA-2 of SBCMV. The 3'-end region is a privileged target for the detection of a wide range of isolates because of its sequence conservation, its tRNA-like structure, its major role in viral replication, and the signal amplification provided by the presence of numerous genomic and subgenomic RNAs. The primers were also designed for virus quantitation using real-time RT-PCR with SYBR-Green chemistry. No cross-reaction with Wheat spindle streak mosaic virus, frequently associated with SBCMV, was observed. The use of RT-PCR and real-time quantitative RT-PCR allowed a more sensitive detection and quantitation of SBCMV to be made than was the case with ELISA. The methods enabled European isolates of SBCMV from Belgium, France, Germany, Italy and the UK to be detected and quantified. Real-time RT-PCR represents a new tool for comparing soil inoculum potential as well as cultivar resistance to SBCMV. PMID:19490978

  1. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Atencio, J.D.

    1982-03-31

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify 233U, 235U and 239Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as 240Pu, 244Cm and 252Cf, and the spontaneous alpha particle emitter 241Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether permanent low-level burial is appropriate for the waste sample.

  2. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Atencio, James D.

    1984-01-01

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify 233U, 235U and 239Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as 240Pu, 244Cm and 252Cf, and the spontaneous alpha particle emitter 241Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.

  3. Quantitative PCR methods for RNA and DNA in marine sediments: maximizing yield while overcoming inhibition.

    PubMed

    Lloyd, Karen G; Macgregor, Barbara J; Teske, Andreas

    2010-04-01

    For accurate quantification of DNA and RNA from environmental samples, yield loss during nucleic acid purification has to be minimized. Quantitative PCR (qPCR) and reverse transcription (RT)-qPCR require a trade-off between maximizing yield and removing inhibitors. We compared DNA and RNA yield and suitability for quantitative SYBR Green PCR and RT-PCR using the UltraClean and PowerSoil extraction kits and a bead-beating protocol with phenol/chloroform extraction steps. Purification methods included silica-column-based procedures from the MoBio kits, RNeasy MinElute, WizardPlus miniprep columns, and an acrylamide gel extraction. DNA and RNA purification with WizardPlus and RNeasy, respectively, led to significant losses of nucleic acids and archaeal 16S rRNA or 16S rRNA gene, as measured with RiboGreen or PicoGreen, and RT-qPCR or qPCR. Extraction and purification of DNA with the MoBio DNA UltraClean and DNA PowerSoil kits also decreased the yields slightly, relative to gel purification, in all sediments, except those from the deep sea in the Gulf of Mexico. Organic matter in humic-rich sediments may bind to these silica columns, reducing their nucleic acid-loading capacity. Purification with gel extraction cleans up organic-rich sediment samples sufficiently for quantitative analysis while avoiding the yield loss associated with commonly used silica columns. PMID:20059545

  4. Quantitative thin-layer chromatographic method of analysis of azithromycin in pure and capsule forms.

    PubMed

    Khedr, Alaa; Sheha, Mahmoud

    2003-01-01

    A validated stability-indicating thin-layer chromatographic (TLC) method for the analysis of azithromycin (AZT) in bulk and capsule forms is developed. Both AZT potential impurity and degradation products can be selectively and accurately estimated in both raw material and product on one precoated silica-gel TLC plate 60F254. The development system used is n-hexane-ethyl acetate-diethylamine (75:25:10, v/v/v). The separated bands are detected as brown to brownish red spots after spraying with modified Dragendorff's solution. The Rf values of AZT, azaerythromycin A, and the three degradation products are 0.54, 0.35, 0.40, 0.20, and 0.12, respectively. The optical densities of the separated spots are found to be linear in proportion to the amount used. The stress testing of AZT shows that azaerythromycin A is the major impurity and degradation product, accompanied by three other unknown degradation products. The stability of AZT is studied under accelerated conditions in order to provide a rapid indication of differences that might result from a change in the manufacturing process or source of the sample. The forced degradation conditions include the effect of heat, moisture, light, acid-base hydrolysis, sonication, and oxidation. The compatibility of AZT with the excipients used is also studied in the presence and absence of moisture. The amounts of AZT and azaerythromycin A are calculated from the corresponding linear calibration curve; however, the amounts of any other generated or detected unknown impurities are calculated as if they were AZT. This method shows enough selectivity, sensitivity, accuracy, precision, linearity range, and robustness to satisfy Food and Drug Administration/International Conference on Harmonisation regulatory requirements. The method developed can also be used for the purity testing of AZT raw material and capsules, content uniformity testing, dissolution testing, and stability testing of AZT capsules. The potential impurity profiles of both active AZT material and capsule forms are found comparable. The linear range of AZT is between 5 and 30 mcg/spot with a limit of quantitation of 2 mcg/spot. The intra-assay relative standard deviation percentage is not more than 0.54%, and the day-to-day variation is not more than 0.86%, calculated on the amounts of AZT RS recovered using different TLC plates. PMID:12597590

  5. A Dilute-and-Shoot LC-MS Method for Quantitating Opioids in Oral Fluid.

    PubMed

    Enders, Jeffrey R; McIntire, Gregory L

    2015-10-01

    Opioid testing represents a dominant share of the market in pain management clinical testing facilities. Testing of this drug class in oral fluid (OF) has begun to rise in popularity. OF analysis has traditionally required extensive clean-up protocols and sample concentration, which can be avoided. This work highlights the use of a fast, 'dilute-and-shoot' method that performs no considerable sample manipulation. A quantitative method for the determination of eight common opioids and associated metabolites (codeine, morphine, hydrocodone, hydromorphone, norhydrocodone, oxycodone, noroxycodone and oxymorphone) in OF is described herein. OF sample is diluted 10-fold in methanol/water and then analyzed using an Agilent chromatographic stack coupled with an AB SCIEX 4500. The method has a 2.2-min LC gradient and a cycle time of 2.9 min. In contrast to most published methods of this particular type, this method uses no sample clean-up or concentration and has a considerably faster LC gradient, making it ideal for very high-throughput laboratories. Importantly, the method requires only 100 µL of sample, which is diluted 10-fold prior to injection to help with instrument viability. Baseline separation of all isobaric opioids listed above was achieved on a phenyl-hexyl column. The validated calibration range for this method is 2.5-1,000 ng/mL. This 'dilute-and-shoot' method removes the unnecessary, costly and time-consuming extraction steps found in traditional methods and still surpasses all analytical requirements. PMID:26378142
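
    A trivial but frequently needed step in a dilute-and-shoot workflow is the dilution-factor correction and a check against the validated calibration range. The sketch below assumes the 2.5-1,000 ng/mL range refers to neat oral fluid and that calibrators are prepared in the same way as samples; the instrument readings are invented.

    DILUTION_FACTOR = 10                 # 10-fold dilution in methanol/water, as described
    CAL_RANGE_NG_ML = (2.5, 1000.0)      # validated range, assumed to refer to neat oral fluid

    def oral_fluid_conc(measured_ng_ml):
        """Back-calculate the neat oral-fluid concentration from the diluted injection.

        Assumes the instrument reports the concentration in the diluted sample and
        that calibrators were prepared the same way as the samples.
        """
        return measured_ng_ml * DILUTION_FACTOR

    for measured in (0.4, 12.0, 150.0):  # hypothetical instrument readings (ng/mL)
        neat = oral_fluid_conc(measured)
        in_range = CAL_RANGE_NG_ML[0] <= neat <= CAL_RANGE_NG_ML[1]
        print(f"measured {measured:>6.1f} ng/mL -> oral fluid {neat:>7.1f} ng/mL, "
              f"{'within' if in_range else 'outside'} the validated range")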

  6. Development and evaluation of event-specific quantitative PCR method for genetically modified soybean A2704-12.

    PubMed

    Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, A2704-12. During the plant transformation, DNA fragments derived from pUC19 plasmid were integrated in A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared with both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we then constructed a plasmid using pBR322. The conversion factor (C(f)), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined C(f) values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSD(R)), and the determined bias and RSD(R) values for the method were each less than 20%. These results suggest that the developed method would be suitable for practical analyses for the detection and quantification of A2704-12. PMID:21515963
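
    The conversion factor C(f) reported above is used, in the commonly applied relation for event-specific GMO quantification, to turn a ratio of event copies to endogenous taxon-gene copies into a GMO percentage. The sketch below uses that generic relation with the reported C(f) = 0.98; the copy numbers are hypothetical, and the exact normalization used by the authors may differ.

    C_F = 0.98  # conversion factor reported for the A2704-12 assay

    def gmo_percent(event_copies, taxon_copies, cf=C_F):
        """Commonly used relation for event-specific GMO quantification:
        GMO % = (event copy number / endogenous taxon-gene copy number) / Cf * 100.
        """
        return (event_copies / taxon_copies) / cf * 100

    # Hypothetical copy numbers derived from standard curves (not data from the paper).
    print(f"estimated GMO content: {gmo_percent(480, 100_000):.2f}%")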

  7. A Robust Method for Quantitative High-throughput Analysis of Proteomes by 18O Labeling*

    PubMed Central

    Bonzon-Kulichenko, Elena; Pérez-Hernández, Daniel; Núñez, Estefanía; Martínez-Acedo, Pablo; Navarro, Pedro; Trevisan-Herraz, Marco; del Carmen Ramos, María; Sierra, Saleta; Martínez-Martínez, Sara; Ruiz-Meana, Marisol; Miró-Casas, Elizabeth; García-Dorado, David; Redondo, Juan Miguel; Burgos, Javier S.; Vázquez, Jesús

    2011-01-01

    MS-based quantitative proteomics plays an increasingly important role in biological and medical research and the development of these techniques remains one of the most important challenges in mass spectrometry. Numerous stable isotope labeling approaches have been proposed. However, and particularly in the case of 18O-labeling, a standard protocol of general applicability is still lacking, and statistical issues associated with these methods remain to be investigated. In this work we present an improved high-throughput quantitative proteomics method based on whole proteome concentration by SDS-PAGE, optimized in-gel digestion, peptide 18O-labeling, and separation by off-gel isoelectric focusing followed by liquid chromatography-LIT-MS. We demonstrate that the off-gel technique is fully compatible with 18O peptide labeling in any pH range. A recently developed statistical model indicated that partial digestions and methionine oxidation do not alter protein quantification and that variances at the scan, peptide, and protein levels are stable and reproducible in a variety of proteomes of different origin. We have also analyzed the dynamic range of quantification and demonstrated the practical utility of the method by detecting expression changes in a model of activation of Jurkat T-cells. Our protocol provides a general approach to perform quantitative proteomics by 18O-labeling in high-throughput studies, with the added value that it has a validated statistical model for the null hypothesis. To the best of our knowledge, this is the first report where a general protocol for stable isotope labeling is tested in practice using a collection of samples and analyzed at this degree of statistical detail. PMID:20807836

  8. Investigation of a diffuse optical measurements-assisted quantitative photoacoustic tomographic method in reflection geometry

    NASA Astrophysics Data System (ADS)

    Xu, Chen; Kumavor, Patrick D.; Aguirre, Andres; Zhu, Quing

    2012-06-01

    Photoacoustic tomography provides the distribution of absorbed optical energy density, which is the product of the optical absorption coefficient and the optical fluence distribution. We report the experimental investigation of a novel fitting procedure that quantitatively determines the optical absorption coefficient of chromophores. The experimental setup consisted of a hybrid system combining a 64-channel photoacoustic imaging system with a frequency-domain diffuse optical measurement system. The fitting procedure included a complete photoacoustic forward model and an analytical solution for a target chromophore using the diffusion approximation. The fitting procedure combines the information from the photoacoustic image and the background information from the diffuse optical measurements to minimize the difference between the photoacoustic measurements and the forward-model data and so recover the target absorption coefficient quantitatively. 1-cm-cube phantom absorbers of high and low contrasts were imaged at depths of up to 3.0 cm. The fitted absorption coefficient results were at least 80% of their true values. The sensitivities of this fitting procedure to target location, target radius, and background optical properties were also investigated. We found that this fitting procedure was most sensitive to the accurate determination of the target radius and depth. A blood sample in a thin tube of radius 0.58 mm, simulating a blood vessel, was also studied. The photoacoustic images and fitted absorption coefficients are presented. These results demonstrate the clinical potential of this fitting procedure to quantitatively characterize small lesions in breast imaging.

  9. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR is its accurate function with external calibration, the lack of any requirement for identical reference materials, a high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996

  10. Alternative NMR method for quantitative determination of acyl positional distribution in triacylglycerols and related compounds.

    PubMed

    Simova, S; Ivanova, G; Spassov, S L

    2003-12-01

    High-resolution 13C NMR spectroscopy has been used to analyze the positional distribution of fatty acids in model triacylglycerols. A novel method for quantitative determination of the positional distribution of unsaturated chains in triacylglycerols simultaneously with the ratio of saturated/unsaturated acyl chains has been proposed, utilizing the chemical shift differences of the aliphatic atoms C4, C5, and C6. The use of HSQC-TOCSY spectra allows unequivocal proof of the position of the unsaturated chain as well as complete assignment of the 13C NMR signals in tripalmitin. PMID:14623452

  11. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, L.J.; Cremers, D.A.

    1982-09-07

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids are described. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is shown. Significant shortening of analysis time is achieved compared with the usual chemical techniques of analysis.

  12. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, Leon J. (Los Alamos, NM); Cremers, David A. (Los Alamos, NM)

    1985-01-01

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is demonstrated. Significant shortening of analysis time is achieved compared with the usual chemical techniques of analysis.

  13. A simple method for the separation and quantitation of radiolabeled thyroid hormones in thyroxine clearance studies

    SciTech Connect

    Grossman, S.J.

    1990-11-01

    A method was developed to facilitate the separation and quantitation of radiolabeled thyroxine in plasma for thyroxine clearance studies. Following intravenous injection of radioactive thyroxine, the radiolabeled thyroid hormones were isolated from plasma protein and polar metabolites by solid phase extraction on a C18 sorbent bed. The individual thyroid hormones were then separated by ion-pair reversed phase chromatography and sequentially eluted through a UV detector and radiochromatographic detector. The radioactivity of individual radiolabeled thyroid hormones was corrected for recovery of carrier as determined from UV absorbance. The recoveries of thyroxine and 3,5,3'-triiodothyronine (T3) were 96% and 101%, respectively.

  14. A method to prioritize quantitative traits and individuals for sequencing in family-based studies.

    PubMed

    Shah, Kaanan P; Douglas, Julie A

    2013-01-01

    Owing to recent advances in DNA sequencing, it is now technically feasible to evaluate the contribution of rare variation to complex traits and diseases. However, it is still cost prohibitive to sequence the whole genome (or exome) of all individuals in each study. For quantitative traits, one strategy to reduce cost is to sequence individuals in the tails of the trait distribution. However, the next challenge becomes how to prioritize traits and individuals for sequencing since individuals are often characterized for dozens of medically relevant traits. In this article, we describe a new method, the Rare Variant Kinship Test (RVKT), which leverages relationship information in family-based studies to identify quantitative traits that are likely influenced by rare variants. Conditional on nuclear families and extended pedigrees, we evaluate the power of the RVKT via simulation. Not unexpectedly, the power of our method depends strongly on effect size, and to a lesser extent, on the frequency of the rare variant and the number and type of relationships in the sample. As an illustration, we also apply our method to data from two genetic studies in the Old Order Amish, a founder population with extensive genealogical records. Remarkably, we implicate the presence of a rare variant that lowers fasting triglyceride levels in the Heredity and Phenotype Intervention (HAPI) Heart study (p = 0.044), consistent with the presence of a previously identified null mutation in the APOC3 gene that lowers fasting triglyceride levels in HAPI Heart study participants. PMID:23626830

  15. The strengths and weaknesses of quantitative and qualitative research: what method for nursing?

    PubMed

    Carr, L T

    1994-10-01

    The overall purpose of research for any profession is to discover the truth of the discipline. This paper examines the controversy over the methods by which truth is obtained, by examining the differences and similarities between quantitative and qualitative research. The historically negative bias against qualitative research is discussed, as well as the strengths and weaknesses of both approaches, with issues highlighted by reference to nursing research. Consideration is given to issues of sampling; the relationship between the researcher and subject; methodologies and collated data; validity; reliability, and ethical dilemmas. The author identifies that neither approach is superior to the other; qualitative research appears invaluable for the exploration of subjective experiences of patients and nurses, and quantitative methods facilitate the discovery of quantifiable information. Combining the strengths of both approaches in triangulation, if time and money permit, is also proposed as a valuable means of discovering the truth about nursing. It is argued that if nursing scholars limit themselves to one method of enquiry, restrictions will be placed on the development of nursing knowledge. PMID:7822608

  16. A quantitative solid-state Raman spectroscopic method for control of fungicides.

    PubMed

    Ivanova, Bojidarka; Spiteller, Michael

    2012-07-21

    A new analytical procedure using solid-state Raman spectroscopy within the THz region for the quantitative determination of mixtures of different conformations of trifloxystrobin (EE, EZ, ZE and ZZ), tebuconazole (1), and propiconazole (2), as an effective method for fungicide product quality monitoring and control programmes, has been developed and validated. The obtained quantities were controlled independently by validated hybrid HPLC electrospray ionization (ESI) tandem mass spectrometric (MS) and matrix-assisted laser desorption/ionization (MALDI) MS methods in the condensed phase. The quantitative dependences were obtained for twenty binary mixtures of the analytes and were further tested on three trade fungicide products, containing mixtures of trifloxystrobin-tebuconazole and trifloxystrobin-propiconazole, as an emulsifiable concentrate or water-soluble granules of the active ingredients. The present methods provided sufficient sensitivity as reflected by the metrological quantities evaluated, including the concentration limit of detection (LOD) and quantification (LOQ), linear limit (LL), measurement accuracy and precision, true quantity value, and trueness of measurement. PMID:22679621

  17. A quantitative autoradiographic method for the measurement of local rates of brain protein synthesis

    SciTech Connect

    Dwyer, B.E.; Donatoni, P.; Wasterlain, C.G.

    1982-05-01

    We have developed a new method for measuring local rates of brain protein synthesis in vivo. It combines the intraperitoneal injection of a large dose of low specific activity amino acid with quantitative autoradiography. This method has several advantages: 1) It is ideally suited for young or small animals or where immobilizing an animal is undesirable. 2) The amino acid injection "floods" amino acid pools so that errors in estimating precursor specific activity, which is especially important in pathological conditions, are minimized. 3) The method provides for the use of a radioautographic internal standard in which valine incorporation is measured directly. Internal standards from experimental animals correct for tissue protein content and self-absorption of radiation in tissue sections which could vary under experimental conditions.

  18. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities were identified by LC-MS in the atorvastatin calcium drug substance. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, together with physicochemical studies, using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for the quantitative HPLC determination. These impurities were detected by a newly developed gradient reverse-phase high-performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions to demonstrate the power of the newly developed HPLC method.

  19. Quantitative Detection Method of Hydroxyapatite Nanoparticles Based on Eu(3+) Fluorescent Labeling in Vitro and in Vivo.

    PubMed

    Xie, Yunfei; Perera, Thalagalage Shalika Harshani; Li, Fang; Han, Yingchao; Yin, Meizhen

    2015-11-01

    One major challenge for the application of hydroxyapatite nanoparticles (nHAP) in nanomedicine is their quantitative detection. Herein, we developed a quantitative detection method for nHAP based on Eu(3+) fluorescent labeling via a simple chemical coprecipitation method. Trace amounts of nHAP in cells and tissues can be quantitatively detected on the basis of the fluorescent quantitative determination of Eu(3+) ions in the nHAP crystal lattice. The lowest concentration of Eu(3+) ions that can be quantitatively detected is 0.5 nM using DELFIA enhancement solution. This methodology is broadly applicable for studying the tissue distribution and metabolism of nHAP in vivo. PMID:26495748

  20. A method for estimating the effective number of loci affecting a quantitative character.

    PubMed

    Slatkin, Montgomery

    2013-11-01

    A likelihood method is introduced that jointly estimates the number of loci and the additive effect of alleles that account for the genetic variance of a normally distributed quantitative character in a randomly mating population. The method assumes that measurements of the character are available from one or both parents and an arbitrary number of full siblings. The method uses the fact, first recognized by Karl Pearson in 1904, that the variance of a character among offspring depends on both the parental phenotypes and on the number of loci. Simulations show that the method performs well provided that data from a sufficient number of families (on the order of thousands) are available. This method assumes that the loci are in Hardy-Weinberg and linkage equilibrium but does not assume anything about the linkage relationships. It performs equally well if all loci are on the same non-recombining chromosome provided they are in linkage equilibrium. The method can be adapted to take account of loci already identified as being associated with the character of interest. In that case, the method estimates the number of loci not already known to affect the character. The method applied to measurements of crown-rump length in 281 family trios in a captive colony of African green monkeys (Chlorocebus aethiops sabaeus) estimates the number of loci to be 112 and the additive effect to be 0.26 cm. A parametric bootstrap analysis shows that a rough confidence interval has a lower bound of 14 loci. PMID:23973416
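
    Pearson's observation underlying this method can be illustrated with a toy simulation: when the total additive variance is held fixed, the average within-family variance barely changes with the number of loci, but its spread from family to family (which depends on the parental genotypes and is the signal the likelihood method exploits) shrinks as loci are added. The simulation below is illustrative only, with arbitrary family and sibship sizes, and is not the authors' estimator.

    import numpy as np

    rng = np.random.default_rng(1)

    def family_variances(n_loci, n_families=3000, n_sibs=8, p=0.5):
        """Within-family variances of an additive trait controlled by n_loci loci.

        Effect sizes are scaled so the population additive variance is 1 for any n_loci.
        """
        a = 1.0 / np.sqrt(2 * n_loci * p * (1 - p))
        out = []
        for _ in range(n_families):
            mother = rng.random((2, n_loci)) < p          # two haplotypes per parent
            father = rng.random((2, n_loci)) < p
            sibs = []
            for _ in range(n_sibs):
                gam_m = np.where(rng.random(n_loci) < 0.5, mother[0], mother[1])
                gam_f = np.where(rng.random(n_loci) < 0.5, father[0], father[1])
                sibs.append(a * (gam_m.sum() + gam_f.sum()))
            out.append(np.var(sibs, ddof=1))
        return np.asarray(out)

    # Mean sib variance stays near 0.5; its family-to-family spread shrinks with more loci.
    for n in (1, 5, 20, 100):
        v = family_variances(n)
        print(f"{n:>3} loci: mean sib variance {v.mean():.3f}, family-to-family SD {v.std():.3f}")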

  1. Quantitative carbide analysis using the Rietveld method for 2.25Cr-1Mo-0.25V steel

    SciTech Connect

    Zhang Yongtao; Han Haibo; Miao Lede; Zhang Hanqian; Li Jinfu

    2009-09-15

    It is usually difficult to quantitatively determine the mass fraction of each type of precipitate in steels using transmission electron microscopy and traditional X-ray powder diffraction analysis methods. In this paper the Rietveld full-pattern fitting algorithm was employed to calculate the relative mass fractions of the precipitates in 2.25Cr-1Mo-0.25V steel. The results show that the fractions of the MC, M7C3 and M23C6 carbides can be evaluated precisely and relatively quickly. In addition, it was found that the fine MC phase dissolved into the matrix with prolonged tempering.

  2. "Do I Need Research Skills in Working Life?": University Students' Motivation and Difficulties in Quantitative Methods Courses

    ERIC Educational Resources Information Center

    Murtonen, Mari; Olkinuora, Erkki; Tynjala, Paivi; Lehtinen, Erno

    2008-01-01

    This study explored university students' views of whether they will need research skills in their future work in relation to their approaches to learning, situational orientations on a learning situation of quantitative methods, and difficulties experienced in quantitative research courses. Education and psychology students in both Finland (N =

  3. "Do I Need Research Skills in Working Life?": University Students' Motivation and Difficulties in Quantitative Methods Courses

    ERIC Educational Resources Information Center

    Murtonen, Mari; Olkinuora, Erkki; Tynjala, Paivi; Lehtinen, Erno

    2008-01-01

    This study explored university students' views of whether they will need research skills in their future work in relation to their approaches to learning, situational orientations on a learning situation of quantitative methods, and difficulties experienced in quantitative research courses. Education and psychology students in both Finland (N =…

  4. Depth determination for shallow teleseismic earthquakes - Methods and results

    NASA Technical Reports Server (NTRS)

    Stein, Seth; Wiens, Douglas A.

    1986-01-01

    Contemporary methods used to determine depths of moderate-sized shallow teleseismic earthquakes are described. These include techniques based on surface wave spectra, and methods which estimate focal depth from the waveforms of body waves. The advantages of different methods and their limitations are discussed, and significant results for plate tectonics, obtained in the last five years by the application of these methods, are presented.

  5. Depth determination for shallow teleseismic earthquakes - Methods and results

    SciTech Connect

    Stein, S.; Wiens, D.A.

    1986-11-01

    Contemporary methods used to determine depths of moderate-sized shallow teleseismic earthquakes are described. These include techniques based on surface wave spectra, and methods which estimate focal depth from the waveforms of body waves. The advantages of different methods and their limitations are discussed, and significant results for plate tectonics, obtained in the last five years by the application of these methods, are presented. 119 references.

  6. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. PMID:26243267

  7. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    NASA Technical Reports Server (NTRS)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.
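
    Partial least squares is the workhorse of the comparison above; as a minimal sketch of the PLS1 idea (predicting one element at a time from a full spectrum), the code below fits scikit-learn's PLSRegression to synthetic spectra. The spectra, abundances, and component count are invented stand-ins, not the ChemCam-like dataset described in the abstract.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)

    # Synthetic stand-in for LIBS spectra: samples x wavelength channels, with one
    # element abundance encoded linearly in a few emission "lines" plus noise.
    n_samples, n_channels = 229, 2048
    abundance = rng.uniform(0, 50, n_samples)                 # e.g. wt% of one oxide
    lines = rng.choice(n_channels, size=5, replace=False)
    spectra = rng.normal(scale=0.05, size=(n_samples, n_channels))
    spectra[:, lines] += np.outer(abundance, rng.uniform(0.5, 1.5, 5)) / 50

    X_train, X_test, y_train, y_test = train_test_split(
        spectra, abundance, test_size=0.25, random_state=0)

    # PLS1: predict a single element from the full spectrum.
    pls = PLSRegression(n_components=8)
    pls.fit(X_train, y_train)
    pred = pls.predict(X_test).ravel()
    rmse = np.sqrt(np.mean((pred - y_test) ** 2))
    print(f"RMSE on held-out synthetic spectra: {rmse:.2f} wt%")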

  8. [A multistage metrologic approach to the problem of evaluation of the results of the quantitative enzyme immunoassay determination of antitoxic antibodies].

    PubMed

    Petrov, V F; Nikolaeva, A M; Ivanova, V A; Shobukhova, T S; Tumanova, G M

    1998-01-01

    To evaluate a kit for enzyme immunoassay (ELISA), a metrological approach was used: the total error of the method for the quantitative determination of antibodies, regarded as a multistage process, was determined as the accumulation of the errors made in measurements at the different stages. The proposed algorithm made it possible to certify the positive control serum, to determine the total error of the measurements, and to identify a linear section within a limited range of values on the calibration graph. This yielded well-grounded results, comparable with the results of the neutralization reaction, with good correlation (0.9). PMID:9662797

  9. Methods for quantitative detection of antibody-induced complement activation on red blood cells.

    PubMed

    Meulenbroek, Elisabeth M; Wouters, Diana; Zeerleder, Sacha

    2014-01-01

    Antibodies against red blood cells (RBCs) can lead to complement activation resulting in an accelerated clearance via complement receptors in the liver (extravascular hemolysis) or leading to intravascular lysis of RBCs. Alloantibodies (e.g. ABO) or autoantibodies to RBC antigens (as seen in autoimmune hemolytic anemia, AIHA) leading to complement activation are potentially harmful and can be - especially when leading to intravascular lysis - fatal(1). Currently, complement activation due to (auto)-antibodies on RBCs is assessed in vitro by using the Coombs test reflecting complement deposition on RBC or by a nonquantitative hemolytic assay reflecting RBC lysis(1-4). However, to assess the efficacy of complement inhibitors, it is mandatory to have quantitative techniques. Here we describe two such techniques. First, an assay to detect C3 and C4 deposition on red blood cells that is induced by antibodies in patient serum is presented. For this, FACS analysis is used with fluorescently labeled anti-C3 or anti-C4 antibodies. Next, a quantitative hemolytic assay is described. In this assay, complement-mediated hemolysis induced by patient serum is measured making use of spectrophotometric detection of the released hemoglobin. Both of these assays are very reproducible and quantitative, facilitating studies of antibody-induced complement activation. PMID:24514151

  10. Methods for Quantitative Detection of Antibody-induced Complement Activation on Red Blood Cells

    PubMed Central

    Meulenbroek, Elisabeth M.; Wouters, Diana; Zeerleder, Sacha

    2014-01-01

    Antibodies against red blood cells (RBCs) can lead to complement activation resulting in an accelerated clearance via complement receptors in the liver (extravascular hemolysis) or leading to intravascular lysis of RBCs. Alloantibodies (e.g. ABO) or autoantibodies to RBC antigens (as seen in autoimmune hemolytic anemia, AIHA) leading to complement activation are potentially harmful and can be - especially when leading to intravascular lysis - fatal1. Currently, complement activation due to (auto)-antibodies on RBCs is assessed in vitro by using the Coombs test reflecting complement deposition on RBC or by a nonquantitative hemolytic assay reflecting RBC lysis1-4. However, to assess the efficacy of complement inhibitors, it is mandatory to have quantitative techniques. Here we describe two such techniques. First, an assay to detect C3 and C4 deposition on red blood cells that is induced by antibodies in patient serum is presented. For this, FACS analysis is used with fluorescently labeled anti-C3 or anti-C4 antibodies. Next, a quantitative hemolytic assay is described. In this assay, complement-mediated hemolysis induced by patient serum is measured making use of spectrophotometric detection of the released hemoglobin. Both of these assays are very reproducible and quantitative, facilitating studies of antibody-induced complement activation. PMID:24514151
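
    The quantitative hemolytic readout described above rests on the conventional spectrophotometric calculation of percent lysis from released hemoglobin, normalized between a buffer-only background and a fully lysed control. The sketch below implements that standard formula with invented absorbance readings; it is a generic illustration rather than the authors' exact protocol.

    def percent_hemolysis(a_sample, a_background, a_total_lysis):
        """Conventional spectrophotometric hemolysis calculation:
        % lysis = (A_sample - A_background) / (A_total_lysis - A_background) * 100.
        A_background: RBCs in buffer only; A_total_lysis: RBCs fully lysed in water.
        """
        return (a_sample - a_background) / (a_total_lysis - a_background) * 100

    # Hypothetical absorbance readings from a serum dilution series (values invented).
    a_background, a_total = 0.05, 1.20
    for serum_dilution, a in [("1:64", 0.09), ("1:16", 0.32), ("1:4", 0.88)]:
        print(f"serum {serum_dilution}: {percent_hemolysis(a, a_background, a_total):5.1f}% lysis")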

  11. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy.

    PubMed

    Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine

    2015-01-01

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.(1) This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique. PMID:26554744

  12. Introduction to Quantitative Science, a Ninth-Grade Laboratory-Centered Course Stressing Quantitative Observation and Mathematical Analysis of Experimental Results. Final Report.

    ERIC Educational Resources Information Center

    Badar, Lawrence J.

    This report, in the form of a teacher's guide, presents materials for a ninth grade introductory course on Introduction to Quantitative Science (IQS). It is intended to replace a traditional ninth grade general science with a process oriented course that will (1) unify the sciences, and (2) provide a quantitative preparation for the new science

  13. Quantitative comparison of CTIPe model results with ground and space-based observations

    NASA Astrophysics Data System (ADS)

    Fedrizzi, M.; Olsen, J. R.; Fuller-Rowell, T. J.; Codrescu, M.

    2012-12-01

    Physical models are valuable tools in the task of understanding and forecasting complex non-linear systems. Their value has been demonstrated in the past by comparing the output of numerical simulations with reliable observations and analyzing the driving terms in the mathematical equations, and so determining the relative importance of the various physical processes. Over the past 40 years, this methodology has enabled great advances in the scientific knowledge of the complex Sun-Earth system. However, the validation of the dynamics of the best available physics-based models has been far from comprehensive, even though the pathways for atmospheric response to forcing are apparently well understood. Energy input drives temperature and density changes, which in turn drive winds, which in turn drive changes in composition, electrodynamics, and the ionosphere. There has to be consistency in all the basic state parameters, otherwise the foundation of our perception is unfounded. This study aims to quantitatively assess the capability as well as the limitations of the global, three-dimensional, time-dependent, non-linear coupled model of the thermosphere, ionosphere, plasmasphere, and electrodynamics (CTIPe) in specifying and predicting the upper atmosphere neutral and plasma response to changes in external drivers using a comprehensive observational data set from ground and space and, at the same time, to advance the understanding of the T-I system dynamics on different spatial and temporal scales. Observations from CHAMP and TIMED-GUVI satellites, ionosondes and Fabry-Perot Interferometers are among the measurements used in these model/data comparisons.

  14. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Pavlov, Konstantin A.

    1997-01-10

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing.

  15. Development of a High-Sensitivity Quantitation Method for Arginine Vasopressin by High-Performance Liquid Chromatography Tandem Mass Spectrometry, and Comparison with Quantitative Values by Radioimmunoassay.

    PubMed

    Tsukazaki, Yasuko; Senda, Naoto; Kubo, Kinya; Yamada, Shigeru; Kugoh, Hiroyuki; Kazuki, Yasuhiro; Oshimura, Mitsuo

    2016-01-01

    Human plasma arginine vasopressin (AVP) levels serve as a clinically relevant marker of diabetes and related syndromes. We developed a highly sensitive method for measuring human plasma AVP using high-performance liquid chromatography tandem mass spectrometry. AVP was extracted from human plasma using a weak-cation solid-phase extraction plate, and separated on a wide-bore octadecyl reverse-phase column. AVP was quantified in ion-transition experiments utilizing a product ion (m/z 328.3) derived from its parent ion (m/z 542.8). The sensitivity was enhanced using 0.02% dichloromethane as a mobile-phase additive. The lower limit of quantitation was 0.200 pmol/L. The extraction recovery ranged from 70.2 ± 7.2 to 73.3 ± 6.2% (mean ± SD), and the matrix effect ranged from 1.1 - 1.9%. Quality-testing samples revealed interday/intraday accuracy and precision ranging over 0.9 - 3% and -0.3 - 2%, respectively, which included the endogenous baseline. Our results correlated well with radioimmunoassay results using 22 human volunteer plasma samples. PMID:26860558

  16. Characterization of thermal desorption instrumentation with a direct liquid deposition calibration method for trace 2,4,6-trinitrotoluene quantitation.

    PubMed

    Field, Christopher R; Giordano, Braden C; Rogers, Duane A; Lubrano, Adam L; Rose-Pehrsson, Susan L

    2012-03-01

    The use of thermal desorption systems for the analysis of trace vapors typically requires establishing a calibration curve from vapors generated with a permeation tube. The slow equilibration time of permeation tubes causes such an approach to become laborious when covering a wide dynamic range. Furthermore, many analytes of interest, such as explosives, are not available as permeation tubes. A method for easily and effectively establishing calibration curves for explosive vapor samples via direct deposition of standard solutions on thermal desorption tubes was investigated. The various components of the thermal desorption system were compared to a standard split/splitless inlet. Calibration curves using the direct liquid deposition method with a thermal desorption unit coupled to a cryo-focusing inlet were compared to a standard split/splitless inlet, and a statistical difference was observed, but it does not preclude the use of the direct liquid deposition method for obtaining quantitative results for explosive vapors. PMID:22265176

  17. A method for accurate detection of genomic microdeletions using real-time quantitative PCR

    PubMed Central

    Weksberg, Rosanna; Hughes, Simon; Moldovan, Laura; Bassett, Anne S; Chow, Eva WC; Squire, Jeremy A

    2005-01-01

    Background Quantitative Polymerase Chain Reaction (qPCR) is a well-established method for quantifying levels of gene expression, but has not been routinely applied to the detection of constitutional copy number alterations of human genomic DNA. Microdeletions or microduplications of the human genome are associated with a variety of genetic disorders. Although clinical laboratories routinely use fluorescence in situ hybridization (FISH) to identify such cryptic genomic alterations, there remains a significant number of individuals in whom constitutional genomic imbalance is suspected, based on clinical parameters, but cannot be readily detected using current cytogenetic techniques. Results In this study, a novel application for real-time qPCR is presented that can be used to reproducibly detect chromosomal microdeletions and microduplications. This approach was applied to DNA from a series of patient samples and controls to validate genomic copy number alteration at cytoband 22q11. The study group comprised 12 patients with clinical symptoms of chromosome 22q11 deletion syndrome (22q11DS), 1 patient trisomic for 22q11 and 4 normal controls. 6 of the patients (group 1) had known hemizygous deletions, as detected by standard diagnostic FISH, whilst the remaining 6 patients (group 2) were classified as 22q11DS negative using the clinical FISH assay. Screening of the patients and controls with a set of 10 real time qPCR primers, spanning the 22q11.2-deleted region and flanking sequence, confirmed the FISH assay results for all patients with 100% concordance. Moreover, this qPCR enabled a refinement of the region of deletion at 22q11. Analysis of DNA from the chromosome 22 trisomic sample demonstrated genomic duplication within 22q11. Conclusion In this paper we present a qPCR approach for the detection of chromosomal microdeletions and microduplications. In silico modelling was used strategically in qPCR primer design to avoid regions of repetitive DNA, whilst providing a level of genomic resolution greater than that of standard cytogenetic assays. The implementation of qPCR detection in clinical laboratories will address the need to replace complex, expensive and time-consuming FISH screening to detect genomic microdeletions or duplications of clinical importance. PMID:16351727
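
    Although the paper's exact normalization is not reproduced here, genomic copy-number estimation by real-time qPCR is commonly expressed in a ddCt-style form: the target-to-reference Ct difference in the patient sample is compared with that in a normal diploid control. The sketch below shows that generic calculation with hypothetical Ct values; a hemizygous deletion should come out near one copy.

    def relative_copy_number(ct_target_patient, ct_ref_patient,
                             ct_target_control, ct_ref_control,
                             control_copies=2, efficiency=2.0):
        """Generic ddCt-style estimate of genomic copy number.

        dCt = Ct(target) - Ct(reference) within each DNA sample; the difference of
        dCt between patient and control gives the fold change, scaled to the
        control's known diploid copy number. Assumes comparable PCR efficiencies.
        """
        d_ct_patient = ct_target_patient - ct_ref_patient
        d_ct_control = ct_target_control - ct_ref_control
        fold = efficiency ** -(d_ct_patient - d_ct_control)
        return control_copies * fold

    # Hypothetical Ct values: a hemizygous 22q11 deletion should give roughly 1 copy.
    print(f"estimated copies: {relative_copy_number(27.1, 25.0, 26.0, 25.0):.2f}")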

  18. A Rapid and Quantitative Flow Cytometry Method for the Analysis of Membrane Disruptive Antimicrobial Activity

    PubMed Central

    O’Brien-Simpson, Neil M.; Pantarat, Namfon; Attard, Troy J.; Walsh, Katrina A.; Reynolds, Eric C.

    2016-01-01

    We describe a microbial flow cytometry method that quantifies antimicrobial peptide (AMP) activity within 3 hours, expressed as a Minimum Membrane Disruptive Concentration (MDC). Increasing peptide concentration positively correlates with the extent of bacterial membrane disruption, and the calculated MDC is equivalent to the peptide's minimum bactericidal concentration (MBC). The activity of AMPs representing three different membranolytic modes of action could be determined for a range of Gram-positive and Gram-negative bacteria, including the ESKAPE pathogens, E. coli and MRSA. By using the MDC50 concentration of the parent AMP, the method provides high-throughput, quantitative screening of AMP analogues. A unique feature of the MDC assay is that it directly measures peptide/bacteria interactions and lysed cell numbers rather than bacterial survival, as MIC and MBC assays do. With the threat of multi-drug resistant bacteria, this high-throughput MDC assay has the potential to aid in the development of novel antimicrobials that target bacteria with improved efficacy. PMID:26986223
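
    As a rough illustration of how an MDC-style endpoint can be read out of flow cytometry data, the sketch below takes the fraction of membrane-compromised (e.g. dye-positive) cells at each peptide concentration and reports the lowest concentration that reaches a chosen disruption threshold. The threshold value, units, and data are assumptions, not the assay's exact definition.

        # Hedged sketch: a minimum membrane disruptive concentration read from
        # flow cytometry dose-response data. Threshold and values are illustrative.
        import numpy as np

        def mdc(concentrations_uM, fraction_disrupted, threshold=0.999):
            """Lowest tested concentration whose disrupted-cell fraction reaches the
            threshold; returns None if the threshold is never reached."""
            conc = np.asarray(concentrations_uM, dtype=float)
            frac = np.asarray(fraction_disrupted, dtype=float)
            order = np.argsort(conc)
            for c, f in zip(conc[order], frac[order]):
                if f >= threshold:
                    return c
            return None

        # Illustrative dose-response for a hypothetical AMP
        print(mdc([0.5, 1, 2, 4, 8, 16], [0.02, 0.10, 0.45, 0.90, 0.999, 1.0]))  # -> 8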

  19. Methods for a quantitative evaluation of odd-even staggering effects

    NASA Astrophysics Data System (ADS)

    Olmi, Alessandro; Piantelli, Silvia

    2015-12-01

    Odd-even effects, also known as staggering effects, are a common feature observed in the yield distributions of fragments produced in different types of nuclear reactions. We review old methods, and we propose new ones, for a quantitative estimation of these effects as a function of the proton or neutron number of the reaction products. All methods are compared on the basis of Monte Carlo simulations. We find that some are not well suited for the task, the most reliable ones being those based either on a non-linear fit with a properly oscillating function or on a third (or fourth) finite-difference approach. In any case, high statistics are of paramount importance to prevent spurious structures from appearing merely as a result of statistical fluctuations in the data and of strong correlations among the yields of neighboring fragments.
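
    For readers unfamiliar with the finite-difference approach mentioned above, the sketch below implements one common convention for a third-finite-difference staggering estimator applied to the logarithm of the yields. Sign and normalization conventions differ between authors, so this should be read as a generic illustration rather than the exact estimator studied in the paper; the yield data are synthetic.

        # Hedged sketch: third-finite-difference estimator of odd-even staggering in
        # fragment yields (one common convention; not necessarily the paper's exact one).
        import numpy as np

        def staggering_third_difference(Z, yields):
            """Local staggering amplitude at Z + 3/2 from ln of the yields Y(Z)..Y(Z+3)."""
            L = np.log(np.asarray(yields, dtype=float))
            Z = np.asarray(Z)
            d3 = L[3:] - 3.0 * L[2:-1] + 3.0 * L[1:-2] - L[:-3]   # third difference of ln Y
            sign = (-1.0) ** (Z[:-3] + 1)
            return Z[:-3] + 1.5, 0.125 * sign * d3

        # Synthetic yields: a smooth exponential trend with a ~10% even-odd enhancement
        Z = np.arange(10, 22)
        smooth = 1e4 * np.exp(-0.2 * (Z - 10))
        yields = smooth * 1.1 ** ((-1.0) ** Z)             # even Z enhanced, odd Z suppressed
        print(staggering_third_difference(Z, yields)[1])   # roughly ln(1.1) at each point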

  20. Method of applying internal standard to dried matrix spot samples for use in quantitative bioanalysis.

    PubMed

    Abu-Rabie, Paul; Denniff, Philip; Spooner, Neil; Brynjolffssen, Jan; Galluzzo, Paul; Sanders, Giles

    2011-11-15

    A novel technique is presented that addresses the issue of how to apply internal standard (IS) to dried matrix spot (DMS) samples in a way that allows the IS to integrate with the sample prior to extraction. The TouchSpray, a piezoelectric spray system from The Technology Partnership (TTP), was used to apply methanol containing IS to dried blood spot (DBS) samples. It is demonstrated that this method of IS application has the potential to work in practice for the quantitative determination of circulating exposures of pharmaceuticals in toxicokinetic and pharmacokinetic studies. Three different methods of IS application were compared: addition of IS to control blood prior to DBS sample preparation (control 1), incorporation into the extraction solvent (control 2), and the novel use of TouchSpray technology (test). It is demonstrated that there was no significant difference in accuracy and precision among these three techniques, whether samples were processed by manual extraction or by direct elution. PMID:21972889

  1. Establishment of a new method to quantitatively evaluate hyphal fusion ability in Aspergillus oryzae.

    PubMed

    Tsukasaki, Wakako; Maruyama, Jun-Ichi; Kitamoto, Katsuhiko

    2014-01-01

    Hyphal fusion is involved in the formation of an interconnected colony in filamentous fungi, and it is the first process in sexual/parasexual reproduction. However, hyphal fusion efficiency has been difficult to evaluate in Aspergillus oryzae, despite the industrial significance of this species, because fusion occurs at low frequency. Here, we established a method to quantitatively evaluate the hyphal fusion ability of A. oryzae using a mixed culture of two different auxotrophic strains, in which the ratio of heterokaryotic conidia able to grow without the auxotrophic requirements reflects the hyphal fusion efficiency. By employing this method, it was demonstrated that AoSO and AoFus3 are required for hyphal fusion, and that the hyphal fusion efficiency of A. oryzae was increased by depleting the nitrogen source, supplying a large amount of carbon source, and adjusting the pH to 7.0. PMID:25229867

  2. A QUANTITATIVE, THREE-DIMENSIONAL METHOD FOR ANALYZING ROTATIONAL MOVEMENT FROM SINGLE-VIEW MOVIES

    PubMed

    Berg

    1994-06-01

    The study of animal movement is an important aspect of functional morphological research. The three-dimensional movements of (parts of) animals are usually recorded on two-dimensional film frames. For a quantitative analysis, the real movements should be reconstructed from their projections. If movements occur in one plane, their projection is distorted only if this plane is not parallel to the film plane. Provided that the parallel orientation of the movement with respect to the film plane is checked accurately, a two-dimensional method of analysis (ignoring projection errors) can be justified for quantitative analysis of planar movements. Films of movements of skeletal elements of the fish head have generally been analyzed with the two-dimensional method (e.g. Sibbing, 1982; Hoogenboezem et al. 1990; Westneat, 1990; Claes and de Vree, 1991), which is justifiable for planar movements. Unfortunately, the movements of the head bones of fish are often strongly non-planar, e.g. the movement of the pharyngeal jaws and the gill arches. The two-dimensional method is inappropriate for studying such complex movements (Sibbing, 1982; Hoogenboezem et al. 1990). For a qualitative description of movement patterns, the conditions for the use of the two-dimensional method may be somewhat relaxed. When two (or more) views of a movement are recorded simultaneously, the three-dimensional movements can readily be reconstructed using two two-dimensional images (e.g. Zarnack, 1972; Nachtigall, 1983; van Leeuwen, 1984; Drost and van den Boogaart, 1986). However, because of technical (and budget) limitations, simultaneous views of a movement cannot always be shot. In this paper, a method is presented for reconstructing the three-dimensional orientation and rotational movement of structures using single-view films and for calculating rotation in an object-bound frame. Ellington (1984) presented a similar method for determining three-dimensional wing movements from single-view films of flying insects. Ellington's method is based upon the bilateral symmetry of the wing movements. The present method does not depend on symmetry and can be applied to a variety of kinematic investigations. It eliminates a systematic error: the projection error. The measuring error is not discussed; it is the same in the two-dimensional and three-dimensional method of analysis. PMID:9317811

  3. Development of a rapid method for the quantitative determination of deoxynivalenol using Quenchbody.

    PubMed

    Yoshinari, Tomoya; Ohashi, Hiroyuki; Abe, Ryoji; Kaigome, Rena; Ohkawa, Hideo; Sugita-Konishi, Yoshiko

    2015-08-12

    Quenchbody (Q-body) is a novel fluorescent biosensor based on the antigen-dependent removal of a quenching effect on a fluorophore attached to antibody domains. In order to develop a method using Q-body for the quantitative determination of deoxynivalenol (DON), a trichothecene mycotoxin produced by some Fusarium species, anti-DON Q-body was synthesized from the sequence information of a monoclonal antibody specific to DON. When the purified anti-DON Q-body was mixed with DON, a dose-dependent increase in the fluorescence intensity was observed and the detection range was between 0.0003 and 3 mg L(-1). The coefficients of variation were 7.9% at 0.003 mg L(-1), 5.0% at 0.03 mg L(-1) and 13.7% at 0.3 mg L(-1), respectively. The limit of detection was 0.006 mg L(-1) for DON in wheat. The Q-body showed an antigen-dependent fluorescence enhancement even in the presence of wheat extracts. To validate the analytical method using Q-body, a spike-and-recovery experiment was performed using four spiked wheat samples. The recoveries were in the range of 94.9-100.2%. The concentrations of DON in twenty-one naturally contaminated wheat samples were quantitated by the Q-body method, LC-MS/MS and an immunochromatographic assay kit. The LC-MS/MS analysis showed that the levels of DON contamination in the samples were between 0.001 and 2.68 mg kg(-1). The concentrations of DON quantitated by LC-MS/MS were more strongly correlated with those using the Q-body method (R(2) = 0.9760) than the immunochromatographic assay kit (R(2) = 0.8824). These data indicate that the Q-body system for the determination of DON in wheat samples was successfully developed and Q-body is expected to have a range of applications in the field of food safety. PMID:26320967

  4. Voxel Spread Function (VSF) Method for Correction of Magnetic Field Inhomogeneity Effects in Quantitative Gradient-Echo-Based MRI

    PubMed Central

    Yablonskiy, Dmitriy A; Sukstanskii, Alexander L; Luo, Jie; Wang, Xiaoqi

    2012-01-01

    Purpose Macroscopic magnetic field inhomogeneities adversely affect different aspects of MRI images. In quantitative MRI, where the goal is to quantify biological tissue parameters, they bias and often corrupt such measurements. The goal of this paper is to develop a method for correction of macroscopic field inhomogeneities that can be applied to a variety of quantitative gradient-echo-based MRI techniques. Methods We have re-analyzed a basic theory of gradient echo (GE) MRI signal formation in the presence of background field inhomogeneities and derived equations that allow for correction of magnetic field inhomogeneity effects based on the phase and magnitude of GE data. We verified our theory by mapping the R2* relaxation rate in computer-simulated, phantom, and in vivo human data collected with multi-GE sequences. Results The proposed technique takes into account voxel spread function (VSF) effects and allowed R2* maps that are virtually free of artifacts to be obtained for all simulated, phantom and in vivo data, except in edge areas with very steep field gradients. Conclusion The VSF method, allowing quantification of tissue-specific R2*-related properties, has the potential to yield new MRI biomarkers serving as surrogates for tissue biological properties, similar to the R1 and R2 relaxation rate constants widely used in clinical and research MRI. PMID:23233445
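
    For orientation, the quantity being corrected here is the R2* relaxation rate obtained from multi-gradient-echo data. The sketch below shows only the uncorrected baseline step, a plain mono-exponential fit of magnitude signal versus echo time; the voxel spread function correction itself is not reproduced, and the echo times and signal values are illustrative.

        # Hedged sketch: naive mono-exponential R2* fit of multi-gradient-echo magnitude
        # data (the baseline that the paper's VSF correction refines; illustrative data).
        import numpy as np

        def fit_r2star(te_ms, signal):
            """Least-squares fit of ln S = ln S0 - R2* * TE; returns (S0, R2* in 1/ms)."""
            te = np.asarray(te_ms, dtype=float)
            y = np.log(np.asarray(signal, dtype=float))
            slope, intercept = np.polyfit(te, y, 1)
            return np.exp(intercept), -slope

        te = np.array([4.0, 8.0, 12.0, 16.0, 20.0, 24.0])    # echo times, ms
        s = 1000.0 * np.exp(-0.03 * te)                      # simulated R2* = 30 1/s
        s0, r2star = fit_r2star(te, s)
        print(round(s0), round(r2star * 1000))               # -> 1000, 30 (per second)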

  5. Development of a Quantitative Decision Metric for Selecting the Most Suitable Discretization Method for SN Transport Problems

    NASA Astrophysics Data System (ADS)

    Schunert, Sebastian

    In this work we develop a quantitative decision metric for spatial discretization methods of the SN equations. The quantitative decision metric utilizes performance data from selected test problems to compute a fitness score that is used for the selection of the most suitable discretization method for a particular SN transport application. The fitness score is aggregated as a weighted geometric mean of single performance indicators representing various performance aspects relevant to the user. Thus, the fitness function can be adjusted to the particular needs of the code practitioner by adding or removing single performance indicators or changing their importance via the supplied weights. Within this work a special, broad class of methods is considered, referred to as nodal methods. This class naturally comprises the DGFEM methods of all function space families. Within this work it is also shown that the Higher Order Diamond Difference (HODD) method is a nodal method. Building on earlier findings that the Arbitrarily High Order Method of the Nodal type (AHOTN) is also a nodal method, a generalized finite-element framework is created that yields, as special cases, various methods that were developed independently using profoundly different formalisms. A selection of test problems, each related to a certain performance aspect, is considered: a Method of Manufactured Solutions (MMS) test suite for assessing accuracy and execution time, Lathrop's test problem for assessing resilience against the occurrence of negative fluxes, and a simple, homogeneous cube test problem to verify whether a method possesses the thick diffusive limit. The contending methods are implemented as efficiently as possible under a common SN transport code framework to level the playing field for a fair comparison of their computational load. Numerical results are presented for all three test problems, and a qualitative rating of each method's performance is provided separately for each aspect: accuracy/efficiency, resilience against negative fluxes, and possession of the thick diffusion limit. The choice of the most efficient method depends on the utilized error norm: in Lp error norms higher-order methods such as the AHOTN method of order three perform best, while for computing integral quantities the linear nodal (LN) method is most efficient. The most resilient method against the occurrence of negative fluxes is the simple corner balance (SCB) method. A validation of the quantitative decision metric is performed based on the NEA box-in-box suite of test problems. The validation exercise comprises two stages: first, predicting the contending methods' performance via the decision metric, and second, computing the actual scores based on data obtained from the NEA benchmark problem. The comparison of predicted and actual scores via a penalty function (the ratio of the predicted best performer's score to the actual best score) completes the validation exercise. It is found that the decision metric is capable of very accurate predictions (penalty < 10%) in more than 83% of the considered cases and features penalties of up to 20% for the remaining cases. An exception to this rule is the third test case, NEA-III, intentionally set up to incorporate a poor match between the benchmark and the "data" problems. However, even under these worst-case conditions the decision metric's suggestions are never detrimental. Suggestions for improving the decision metric's accuracy are to increase the pool of employed data, to refine the mapping of a given configuration to a case in the database, and to better characterize the desired target quantities.
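
    The aggregation step described above, a weighted geometric mean of single performance indicators, is simple to reproduce; the sketch below shows the arithmetic with made-up indicator values and weights (the actual indicators, normalizations, and weights are chosen by the practitioner).

        # Hedged sketch: fitness score as a weighted geometric mean of performance
        # indicators. Indicator values and weights are illustrative only.
        import numpy as np

        def fitness_score(indicators, weights):
            """Weighted geometric mean: prod(x_i ** w_i) ** (1 / sum(w_i))."""
            x = np.asarray(indicators, dtype=float)
            w = np.asarray(weights, dtype=float)
            return float(np.exp(np.sum(w * np.log(x)) / np.sum(w)))

        # e.g. normalized scores for accuracy, run time, and robustness to negative fluxes,
        # with accuracy weighted twice as heavily as the other two aspects
        print(fitness_score([0.9, 0.6, 0.8], [2.0, 1.0, 1.0]))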

  6. Membrane chromatographic immunoassay method for rapid quantitative analysis of specific serum antibodies.

    PubMed

    Ghosh, Raja

    2006-02-01

    This paper discusses a membrane chromatographic immunoassay method for rapid detection and quantitative analysis of specific serum antibodies. A type of polyvinylidene fluoride (PVDF) microfiltration membrane was used in the method for its ability to reversibly and specifically bind IgG antibodies from antiserum samples by hydrophobic interaction. Using this form of selective antibody binding and enrichment, an affinity membrane with antigen-binding ability was obtained in situ. This was done by passing a pulse of diluted antiserum sample through a stack of microporous PVDF membranes. The affinity membrane thus formed was challenged with a pulse of antigen solution, and the amount of antigen bound was accurately determined using chromatographic methods. The antigen binding correlated well with the antibody loading on the membrane. This method is direct, rapid and accurate, does not involve any chemical reaction, and uses very few reagents. Moreover, the same membrane could be used repeatedly for sequential immunoassays on account of the reversible nature of the antibody binding. Proof of concept of this method is provided using human hemoglobin as the model antigen and rabbit antiserum against human hemoglobin as the antibody source. PMID:16196053

  7. Validation procedures for quantitative gluten ELISA methods: AOAC allergen community guidance and best practices.

    PubMed

    Koerner, Terry B; Abbott, Michael; Godefroy, Samuel Benrejeb; Popping, Bert; Yeung, Jupiter M; Diaz-Amigo, Carmen; Roberts, James; Taylor, Steve L; Baumert, Joseph L; Ulberth, Franz; Wehling, Paul; Koehler, Peter

    2013-01-01

    The food allergen analytical community is endeavoring to create harmonized guidelines for the validation of food allergen ELISA methodologies to help protect food-sensitive individuals and promote consumer confidence. This document provides additional guidance to existing method validation publications for quantitative food allergen ELISA methods. The gluten-specific criterion provided in this document is divided into sections for the information required from the method developer about the assay and the information needed for implementation of the multilaboratory validation study. Many of these recommendations build upon the widely accepted Codex Alimentarius definitions and recommendations for gluten-free foods. The information in this document can be used as the basis of a harmonized validation protocol for any ELISA method for gluten, whether proprietary or nonproprietary, that will be submitted to AOAC and/or regulatory authorities or other bodies for status recognition. Future work is planned for the implementation of this guidance document for the validation of gluten methods and the creation of gluten reference materials. PMID:24282943

  8. The Trojan horse method in nuclear astrophysics: recent results

    NASA Astrophysics Data System (ADS)

    Romano, S.; Spitaleri, C.; Cherubini, S.; Crucillà, V.; Gulino, M.; La Cognata, M.; Lamia, L.; Pizzone, R. G.; Puglia, S. M. R.; Rapisarda, G. G.; Sergi, M. L.; Tudisco, S.; Tumino, A.; Tribble, R. E.; Goldberg, V. Z.; Mukhamedzhanov, A. M.; Tabacaru, G.; Trache, L.; Kroha, V.; Burjan, V.; Hons, Z.; Mrázek, J.; Somorjai, E.; Elekes, Z.; Fülöp, Z.; Gyürky, G.; Kiss, G.; Szanto de Toledo, A.; Carlin, N.; DeMoura, M. M.; Del Santo, M. G.; Munhoz, M. G.; Liguori Neto, R.; Souza, F. A.; Suaide, A. A. P.; Szanto, E.

    2008-01-01

    Difficulties in cross-section measurements at very low energies, when charged particles are involved, led to the development of some indirect methods. The Trojan horse method (THM) allows us to bypass the Coulomb effects and has been successfully applied to several reactions of astrophysical interest. A brief review of the THM applications is reported together with some of the most recent results.

  9. A quantitative method for zoning of protected areas and its spatial ecological implications.

    PubMed

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

    Zoning is a key prescriptive tool for the administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and we evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. The main innovations involve a quadratic function of distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e. exact solutions cannot in general be obtained in polynomial time [Nemhauser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. For that purpose, we applied a simulated annealing heuristic implemented as a FORTRAN language routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; and the non-reciprocal land use weightings contributed to better representing management decisions, influencing mainly the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many zonation scheme alternatives with minimum cost, time and effort. This ability provides a new tool to improve zoning within protected areas in developing countries. PMID:16690203
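
    To make the optimization target concrete, the sketch below evaluates a zoning cost of the kind described above: a sum over pairs of land units of a non-symmetric land-use weight multiplied by a quadratic function of the distance between the units. The weight matrix, distances, and assignment are invented for illustration, and whether a given weight rewards proximity or separation depends on the sign convention chosen; the paper's model additionally handles a connectivity constraint and searches assignments with simulated annealing.

        # Hedged sketch: a quadratic-assignment-style zoning cost with non-reciprocal
        # (non-symmetric) land-use weights. All numbers and sign conventions are illustrative.
        import numpy as np

        def zoning_cost(assignment, dist, weights):
            """assignment[i] = land-use index of unit i; dist[i, j] = distance between
            units i and j; weights[u, v] = cost of placing use u relative to use v
            (u -> v and v -> u may differ)."""
            cost = 0.0
            n = len(assignment)
            for i in range(n):
                for j in range(n):
                    if i != j:
                        cost += weights[assignment[i], assignment[j]] * dist[i, j] ** 2
            return cost

        # three land uses (0 = core, 1 = buffer, 2 = public use) on four land units
        weights = np.array([[0.0, 1.0, 4.0],
                            [0.5, 0.0, 1.0],
                            [3.0, 1.0, 0.0]])
        dist = np.array([[0, 1, 2, 3],
                         [1, 0, 1, 2],
                         [2, 1, 0, 1],
                         [3, 2, 1, 0]], dtype=float)
        print(zoning_cost([0, 0, 1, 2], dist, weights))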

  10. Evaluation of residual antibacterial potency in antibiotic production wastewater using a real-time quantitative method.

    PubMed

    Zhang, Hong; Zhang, Yu; Yang, Min; Liu, Miaomiao

    2015-11-01

    While antibiotic pollution has attracted considerable attention due to its potential to promote the dissemination of antibiotic resistance genes in the environment, the antibiotic activity of related substances has been neglected, which may lead to underestimation of the environmental impacts of antibiotic wastewater discharge. In this study, a real-time quantitative approach was established to evaluate the residual antibacterial potency of antibiotics and related substances in antibiotic production wastewater (APW) by comparing the growth of a standard bacterial strain (Staphylococcus aureus) in tested water samples with its growth in the presence of a standard reference substance (e.g. oxytetracycline). Antibiotic equivalent quantity (EQ) was used to express antibacterial potency, which made it possible to assess the contribution of each compound to the antibiotic activity in APW. The real-time quantitative method showed better repeatability (relative standard deviation, RSD 1.08%) than the conventional fixed-growth-time method (RSD 5.62-11.29%), and its quantification limits ranged from 0.20 to 24.00 μg L(-1), depending on the antibiotic. We applied the developed method to analyze the residual potency of water samples from four APW treatment systems, and confirmed a significant contribution from antibiotic transformation products to the residual antibacterial activity. Specifically, neospiramycin, a major transformation product of spiramycin, was found to contribute 13.15-22.89% of the residual potency in spiramycin production wastewater. In addition, some unknown related substances with antimicrobial activity were indicated in the effluent. This approach will be effective for the management of antibacterial potency discharge from antibiotic wastewater and other waste streams. PMID:26395288

  11. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  12. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method

    PubMed Central

    Yang, Ganglong; Xu, Zhipeng; Lu, Wei; Li, Xiang; Sun, Chengwen; Guo, Jia; Xue, Peng; Guan, Feng

    2015-01-01

    The best way to increase the patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low-grade nonmuscle-invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography LTQ Orbitrap mass spectrometry. Among the 3721 unique proteins identified and annotated in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer. PMID:26230496

  13. Characterization of a method for quantitating food consumption for mutation assays in Drosophila

    SciTech Connect

    Thompson, E.D.; Reeder, B.A.; Bruce, R.D. )

    1991-01-01

    Quantitation of food consumption is necessary when determining mutation responses to multiple chemical exposures in the sex-linked recessive lethal assay in Drosophila. One method proposed for quantitating food consumption by Drosophila is to measure the incorporation of ¹⁴C-leucine into the flies during the feeding period. Three sources of variation in the technique of Thompson and Reeder have been identified and characterized. First, the amount of food consumed by individual flies differed by almost 30% in a 24 hr feeding period. Second, the variability from vial to vial (each containing multiple flies) was around 15%. Finally, the amount of food consumed in identical feeding experiments performed over the course of 1 year varied nearly 2-fold. The use of chemical consumption values in place of exposure levels provided a better means of expressing the combined mutagenic response. In addition, the kinetics of food consumption over a 3-day feeding period were compared between exposures to cyclophosphamide that produce lethality and non-lethal exposures. Extensive characterization of the lethality induced by exposures to cyclophosphamide demonstrates that the lethality is most likely due to starvation, not chemical toxicity.

  14. Automated and Manual Methods of DNA Extraction for Aspergillus fumigatus and Rhizopus oryzae Analyzed by Quantitative Real-Time PCR

    PubMed Central

    Francesconi, Andrea; Kasai, Miki; Harrington, Susan M.; Beveridge, Mara G.; Petraitiene, Ruta; Petraitis, Vidmantas; Schaufele, Robert L.; Walsh, Thomas J.

    2008-01-01

    Quantitative real-time PCR (qPCR) may improve the detection of fungal pathogens. Extraction of DNA from fungal pathogens is fundamental to optimization of qPCR; however, the loss of fungal DNA during the extraction process is a major limitation to molecular diagnostic tools for pathogenic fungi. We therefore studied representative automated and manual extraction methods for Aspergillus fumigatus and Rhizopus oryzae. Both were analyzed by qPCR for their ability to extract DNA from propagules and germinated hyphal elements (GHE). The limit of detection of A. fumigatus and R. oryzae GHE in bronchoalveolar lavage (BAL) fluid with either extraction method was 1 GHE/ml. Both methods efficiently extracted DNA from A. fumigatus, with a limit of detection of 1 × 10² conidia. Extraction of R. oryzae by the manual method resulted in a limit of detection of 1 × 10³ sporangiospores. However, extraction with the automated method resulted in a limit of detection of 1 × 10¹ sporangiospores. The amount of time to process 24 samples by the automated method was 2.5 h prior to transferring for automation, 1.3 h of automation, and 10 min postautomation, resulting in a total time of 4 h. The total time required for the manual method was 5.25 h. The automated and manual methods were similar in sensitivity for DNA extraction from A. fumigatus conidia and GHE. For R. oryzae, the automated method was more sensitive for DNA extraction of sporangiospores, while the manual method was more sensitive for GHE in BAL fluid. PMID:18353931

  15. CREST (Climate REconstruction SofTware): a probability density function (PDF)-based quantitative climate reconstruction method

    NASA Astrophysics Data System (ADS)

    Chevalier, M.; Cheddadi, R.; Chase, B. M.

    2014-11-01

    Several methods currently exist to quantitatively reconstruct palaeoclimatic variables from fossil botanical data. Of these, probability density function (PDF)-based methods have proven valuable as they can be applied to a wide range of plant assemblages. Most commonly applied to fossil pollen data, their performance, however, can be limited by the taxonomic resolution of the pollen data, as many species may belong to a given pollen type. Consequently, the climate information associated with different species cannot always be precisely identified, resulting in less accurate reconstructions. This can become particularly problematic in regions of high biodiversity. In this paper, we propose a novel PDF-based method that takes into account the different climatic requirements of each species constituting the broader pollen type. PDFs are fitted in two successive steps: parametric PDFs are first fitted for each species and are then combined into a single broader PDF that represents the pollen type as a unit. A climate value for the pollen assemblage is estimated from the likelihood function obtained after multiplication of the pollen-type PDFs, with each being weighted according to its pollen percentage. To test its performance, we have applied the method to southern Africa as a regional case study and reconstructed a suite of climatic variables (e.g. winter and summer temperature and precipitation, mean annual aridity, rainfall seasonality). The reconstructions are shown to be accurate for both temperature and precipitation. Predictable exceptions were areas that experience conditions at the extremes of the regional climatic spectra. Importantly, the accuracy of the reconstructed values is independent of the vegetation type where the method is applied and of the number of species used. The method used in this study is publicly available in a software package entitled CREST (Climate REconstruction SofTware) and will provide the opportunity to reconstruct quantitative estimates of climatic variables even in areas with high geographical and botanical diversity.
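
    The two-step PDF combination described above can be sketched in a few lines: species-level PDFs are merged into a pollen-type PDF, and the type PDFs are then multiplied, each weighted by its pollen percentage, to give a likelihood over the climate variable. The distribution choices, parameters, weighting scheme, and data below are illustrative assumptions, not CREST's exact implementation.

        # Hedged sketch of a PDF-based climate reconstruction from a pollen assemblage.
        # All species parameters, weights, and grids are illustrative.
        import numpy as np
        from scipy.stats import norm

        climate_grid = np.linspace(0, 30, 301)        # e.g. mean annual temperature, deg C

        def pollen_type_pdf(species_params):
            """Combine species-level normal PDFs (mean, sd) into one pollen-type PDF."""
            pdfs = [norm.pdf(climate_grid, m, s) for m, s in species_params]
            combined = np.mean(pdfs, axis=0)
            return combined / np.trapz(combined, climate_grid)

        def assemblage_likelihood(type_pdfs, pollen_percentages):
            """Multiply type PDFs, each weighted (as an exponent) by its pollen percentage."""
            w = np.asarray(pollen_percentages, dtype=float)
            w = w / w.sum()
            log_like = sum(wi * np.log(p + 1e-300) for wi, p in zip(w, type_pdfs))
            like = np.exp(log_like - log_like.max())
            return like / np.trapz(like, climate_grid)

        type_a = pollen_type_pdf([(12, 2.0), (15, 3.0)])   # a pollen type with two species
        type_b = pollen_type_pdf([(18, 2.5)])
        like = assemblage_likelihood([type_a, type_b], [60, 40])
        print(climate_grid[np.argmax(like)])               # most likely climate value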

  16. A Pyrosequencing Assay for the Quantitative Methylation Analysis of GALR1 in Endometrial Samples: Preliminary Results.

    PubMed

    Kottaridi, Christine; Koureas, Nikolaos; Margari, Niki; Terzakis, Emmanouil; Bilirakis, Evripidis; Pappas, Asimakis; Chrelias, Charalampos; Spathis, Aris; Aga, Evangelia; Pouliakis, Abraham; Panayiotides, Ioannis; Karakitsos, Petros

    2015-01-01

    Endometrial cancer is the most common malignancy of the female genital tract, while aberrant DNA methylation seems to play a critical role in endometrial carcinogenesis. Altered galanin expression has been implicated in many cancers. We developed a new pyrosequencing assay that quantifies DNA methylation of the galanin receptor-1 (GALR1) gene. In this study, the preliminary results indicate that pyrosequencing methylation analysis of the GALR1 promoter can be a useful ancillary marker to cytology, as it can successfully predict histological status. This marker has the potential to lead towards better management of women with endometrial lesions and eventually reduce unnecessary interventions. In addition, it can provide an early warning for women with a negative cytological result. PMID:26504828

  17. A Pyrosequencing Assay for the Quantitative Methylation Analysis of GALR1 in Endometrial Samples: Preliminary Results

    PubMed Central

    Kottaridi, Christine; Koureas, Nikolaos; Margari, Niki; Terzakis, Emmanouil; Bilirakis, Evripidis; Pappas, Asimakis; Chrelias, Charalampos; Spathis, Aris; Aga, Evangelia; Pouliakis, Abraham; Panayiotides, Ioannis; Karakitsos, Petros

    2015-01-01

    Endometrial cancer is the most common malignancy of the female genital tract, while aberrant DNA methylation seems to play a critical role in endometrial carcinogenesis. Altered galanin expression has been implicated in many cancers. We developed a new pyrosequencing assay that quantifies DNA methylation of the galanin receptor-1 (GALR1) gene. In this study, the preliminary results indicate that pyrosequencing methylation analysis of the GALR1 promoter can be a useful ancillary marker to cytology, as it can successfully predict histological status. This marker has the potential to lead towards better management of women with endometrial lesions and eventually reduce unnecessary interventions. In addition, it can provide an early warning for women with a negative cytological result. PMID:26504828

  18. Quantitative assessment of port-wine stains using chromametry: preliminary results

    NASA Astrophysics Data System (ADS)

    Beacco, Claire; Brunetaud, Jean Marc; Rotteleur, Guy; Steen, D. A.; Brunet, F.

    1996-12-01

    Objective assessment of the efficacy of different lasers for the treatment of port wine stains (PWS) remains difficult. Chromametry gives reproducible information on the color of PWS, but its raw data are difficult for a clinician to interpret. Thus, specific software was developed to allow graphic representation of PWS characteristics. Before the first laser treatment and after every treatment, measurements were made with a chromameter on a marked zone of the PWS and on the contralateral normal zone, which serves as the reference. The software calculates and represents graphically the difference of color between the PWS and normal skin using data provided by the chromameter. Three parameters are calculated: ΔH is the difference of hue, ΔL is the difference of lightness and ΔE is the total difference of color. Each measured zone is represented by its coordinates. Calculated initial values were compared with the subjective initial color assessed by the dermatologist. The variation of the color difference was calculated using the successive values of ΔE after n treatments and was compared with the subjective classification of fading. Since January 1995, forty-three locations have been measured before laser treatment. Purple PWS tended to differentiate from others, but red and dark pink PWS could not be differentiated. The evolution of the color after treatment was calculated in 29 PWS treated 3 or 4 times. Poor results corresponded to an increase of ΔE; fair and good results were associated with a decrease of ΔE. We did not observe excellent results during this study. These promising preliminary results need to be confirmed in a larger group of patients.
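
    Since the record above reports color differences without stating the formula, the sketch below shows how a total color difference ΔE can be computed from chromameter CIELAB readings with the simple CIE76 expression; which ΔE formula the authors' software actually uses is not specified, and the L*a*b* values are invented.

        # Hedged sketch: CIE76 color difference between a port-wine stain and the
        # contralateral normal skin from CIELAB chromameter readings (illustrative values).
        import math

        def delta_e_cie76(lab_pws, lab_normal):
            """lab_* = (L*, a*, b*); returns (Delta L, Delta E)."""
            dL = lab_pws[0] - lab_normal[0]
            da = lab_pws[1] - lab_normal[1]
            db = lab_pws[2] - lab_normal[2]
            return dL, math.sqrt(dL * dL + da * da + db * db)

        # Illustrative readings before and after several laser treatments
        before = delta_e_cie76((42.0, 28.0, 10.0), (58.0, 12.0, 14.0))
        after = delta_e_cie76((50.0, 20.0, 12.0), (58.0, 12.0, 14.0))
        print(before[1], after[1])   # a decrease in Delta E indicates fading toward normal skin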

  19. Quantitative methods for the chemical analysis of sealing glasses. [For Li-SO₂ cells]

    SciTech Connect

    Antepenko, R.J.; Carter, J.M.; Foster, J.C.; Palmer, D.C.

    1985-01-01

    Sandia developed a highly corrosion-resistant glass called TA-23 for application as the header sealing glass in lithium-sulfur dioxide (LAMB) cells. This glass has a lower silica content and a higher alumina content and is far more resistant to lithium electrolyte corrosion at the glass-to-metal seal. The glass functions as the electrical insulator and hermetic seal in the header. Chemical analysis methods were developed for quantitatively measuring the elemental composition of TA-23 glass, which has a nominal composition (wt %) of 45% SiO₂, 20% Al₂O₃, 8% B₂O₃, 12% CaO, 7% MgO, 6% SrO, 2% La₂O₃ and 0.05% CoO. Classical wet-chemical methods based on either gravimetric or volumetric procedures were used to measure silica and boric oxide. Instrumental techniques using atomic absorption spectrometry (AAS) and inductively coupled plasma (ICP) atomic emission spectrometry were used to measure the remaining elements. The sample preparation and analysis procedures are described for these methods. These methods have also been used to analyze other sealing glasses.

  20. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    NASA Astrophysics Data System (ADS)

    Ryan, C. G.; Laird, J. S.; Fisher, L. A.; Kirkham, R.; Moorhead, G. F.

    2015-11-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data, both for off-line analysis and in real time embedded in a data acquisition system. Initially, it assumes uniform sample composition and background shape, and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to account for chemical concentration gradients and differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte Carlo simulations how it better tracks spatially complex composition and background shape while still benefiting from the speed of DA.
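
    The second-pass idea, unfolding each pixel's spectrum with a DA matrix blended from end-member DA matrices, can be sketched as a small linear-algebra step. The matrix shapes, admixture fractions, and random data below are placeholders; GeoPIXE's real DA matrices encode detector response, background and X-ray yields for each end-member composition.

        # Hedged sketch: per-pixel projection of element concentrations using an admixture
        # of end-member DA matrices. Shapes and values are illustrative only.
        import numpy as np

        def project_pixel(spectrum, fractions, da_matrices):
            """spectrum: counts per energy channel for one pixel.
            fractions: end-member weights for that pixel (summing to 1).
            da_matrices: one (n_elements x n_channels) DA matrix per end-member."""
            gamma = sum(f * g for f, g in zip(fractions, da_matrices))
            return gamma @ spectrum          # element quantities for this pixel

        n_channels, n_elements = 1024, 3
        rng = np.random.default_rng(0)
        da_matrices = [rng.normal(0.0, 1e-3, (n_elements, n_channels)) for _ in range(2)]
        spectrum = rng.poisson(5.0, n_channels).astype(float)
        print(project_pixel(spectrum, [0.7, 0.3], da_matrices))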

  1. Comparison of quantitative spectral similarity analysis methods for protein higher-order structure confirmation.

    PubMed

    Teska, Brandon M; Li, Cynthia; Winn, Bradley C; Arthur, Kelly K; Jiang, Yijia; Gabrielson, John P

    2013-03-01

    Optical and vibrational spectroscopic techniques are important tools for evaluating secondary and tertiary structures of proteins. These spectroscopic techniques are routinely applied in biopharmaceutical development to elucidate structural characteristics of protein products, to evaluate the impact of processing and storage conditions on product quality, and to assess comparability of a protein product before and after manufacturing changes. Conventionally, the degree of similarity between two spectra has been determined visually. In addition to requiring a significant amount of analyst training and experience, visual inspection of spectra is inherently subjective, and any determination of comparability based on visual analysis of spectra is therefore arbitrary. Here, we discuss a general methodology for evaluating the suitability of numerical methods to calculate spectral similarity, and then we apply the methodology to compare four quantitative spectral similarity methods: the correlation coefficient, area of spectral overlap, derivative correlation algorithm, and spectral difference methods. While the most effective spectral similarity method may depend on the particular application, all four approaches are superior to visual evaluation, and each is suitable for assessing the degree of similarity between spectra. PMID:23219560
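
    Two of the similarity measures compared in this record are easy to state concretely: the Pearson correlation coefficient between two spectra sampled on a common axis, and an area-of-spectral-overlap score. The sketch below shows simple versions of both; the exact normalizations and preprocessing used in the paper may differ, and the spectra are synthetic.

        # Hedged sketch: two quantitative spectral similarity measures on a shared grid.
        # Normalization conventions are illustrative, not necessarily the paper's.
        import numpy as np

        def correlation_coefficient(a, b):
            return float(np.corrcoef(a, b)[0, 1])

        def area_of_overlap(a, b):
            """Shared area of two non-negative spectra, each normalized to unit area."""
            a = np.asarray(a, float)
            b = np.asarray(b, float)
            a = a / np.trapz(a)
            b = b / np.trapz(b)
            return float(np.trapz(np.minimum(a, b)))

        x = np.linspace(190, 260, 200)                       # nm, a far-UV CD-like range
        ref = np.exp(-0.5 * ((x - 222) / 8) ** 2)
        test = 0.95 * np.exp(-0.5 * ((x - 223) / 8) ** 2)    # slightly shifted and scaled
        print(correlation_coefficient(ref, test), area_of_overlap(ref, test))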

  2. Quantitative calcium resistivity based method for accurate and scalable water vapor transmission rate measurement

    NASA Astrophysics Data System (ADS)

    Reese, Matthew O.; Dameron, Arrelaine A.; Kempe, Michael D.

    2011-08-01

    The development of flexible organic light emitting diode displays and flexible thin film photovoltaic devices is dependent on the use of flexible, low-cost, optically transparent and durable barriers to moisture and/or oxygen. It is estimated that this will require high moisture barriers with water vapor transmission rates (WVTR) between 10⁻⁴ and 10⁻⁶ g/m²/day. Thus there is a need to develop a relatively fast, low-cost, and quantitative method to evaluate such low permeation rates. Here, we demonstrate a method where the resistance changes of patterned Ca films, upon reaction with moisture, enable one to calculate a WVTR between 10 and 10⁻⁶ g/m²/day or better. Samples are configured with variable aperture size such that the sensitivity and/or measurement time of the experiment can be controlled. The samples are connected to a data acquisition system by means of individual signal cables permitting samples to be tested under a variety of conditions in multiple environmental chambers. An edge card connector is used to connect samples to the measurement wires enabling easy switching of samples in and out of test. This measurement method can be conducted with as little as 1 h of labor time per sample. Furthermore, multiple samples can be measured in parallel, making this an inexpensive and high volume method for measuring high moisture barriers.
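
    For context, Ca-test WVTR values are commonly obtained from the slope of the trace conductance (1/R) versus time, using the Ca density and resistivity and the trace geometry, and assuming two water molecules are consumed per calcium atom. The sketch below shows that common form of the calculation; the geometry factors, aperture scaling, and data are assumptions and may not match this paper's exact analysis.

        # Hedged sketch: WVTR from the conductance decay of a rectangular Ca trace,
        # assuming Ca + 2H2O -> Ca(OH)2 + H2. Constants and data are illustrative.
        import numpy as np

        M_H2O, M_CA = 18.02, 40.08      # molar masses, g/mol
        RHO_CA = 1.55e6                 # Ca density, g/m^3
        RESISTIVITY_CA = 3.4e-8         # Ca resistivity, ohm m

        def wvtr_from_resistance(t_s, r_ohm, trace_length_m, trace_width_m, area_ratio=1.0):
            """WVTR in g/m^2/day from the slope of 1/R versus time; area_ratio rescales
            from the Ca trace area to the permeation aperture area."""
            slope = np.polyfit(np.asarray(t_s, float), 1.0 / np.asarray(r_ohm, float), 1)[0]
            wvtr_per_s = (2.0 * (M_H2O / M_CA) * RHO_CA * RESISTIVITY_CA
                          * (trace_length_m / trace_width_m) * abs(slope) * area_ratio)
            return wvtr_per_s * 86400.0

        # Illustrative data: conductance decreasing slowly over ten days
        t = np.linspace(0.0, 10 * 86400.0, 11)
        r = 1.0 / (2.0e-2 - 1.0e-9 * t)      # ohm
        print(wvtr_from_resistance(t, r, trace_length_m=0.02, trace_width_m=0.01))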

  3. Isotonic Regression Based-Method in Quantitative High-Throughput Screenings for Genotoxicity

    PubMed Central

    Fujii, Yosuke; Narita, Takeo; Tice, Raymond Richard; Takeda, Shunichi

    2015-01-01

    Quantitative high-throughput screenings (qHTSs) for genotoxicity are conducted as part of comprehensive toxicology screening projects. The most widely used method is to compare the dose-response data of a wild-type strain and DNA repair gene knockout mutants, using model-fitting to the Hill equation (HE). However, this method performs poorly when the observed viability does not fit the equation well, as frequently happens in qHTS. More capable methods must be developed for qHTS, where large data variations are unavoidable. In this study, we applied an isotonic regression (IR) method and compared its performance with HE under multiple data conditions. When dose-response data were suitable for drawing HE curves with upper and lower asymptotes and experimental random errors were small, HE was better than IR, but when random errors were large, there was no difference between HE and IR. However, when the drawn curves did not have two asymptotes, IR showed better performance (p < 0.05, exact paired Wilcoxon test) with higher specificity (65% in HE vs. 96% in IR). In summary, IR performed similarly to HE when dose-response data were optimal, whereas IR clearly performed better in suboptimal conditions. These findings indicate that IR would be useful in qHTS for comparing dose-response data.
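
    Isotonic regression fits the best monotone (order-preserving) curve to the data without assuming any functional form, which is why it tolerates dose-response curves lacking clear asymptotes. A minimal sketch using scikit-learn is shown below; the dose and viability values are invented, and the paper's wild-type versus knockout comparison is not reproduced.

        # Hedged sketch: monotone dose-response fit with isotonic regression,
        # as an alternative to Hill-equation fitting. Data are illustrative.
        import numpy as np
        from sklearn.isotonic import IsotonicRegression

        dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])            # uM
        viability = np.array([1.02, 0.97, 1.01, 0.88, 0.74, 0.58, 0.61, 0.35])   # noisy

        iso = IsotonicRegression(increasing=False)     # viability assumed non-increasing
        fit = iso.fit_transform(np.log10(dose), viability)
        print(np.round(fit, 3))                        # monotone fit, no functional form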

  4. Spectrophotometric Method for Quantitative Determination of Cefixime in Bulk and Pharmaceutical Preparation Using Ferroin Complex

    NASA Astrophysics Data System (ADS)

    Naeem Khan, M.; Qayum, A.; Ur Rehman, U.; Gulab, H.; Idrees, M.

    2015-09-01

    A method was developed for the quantitative determination of cefixime in bulk and pharmaceutical preparations using the ferroin complex. The method is based on the oxidation of cefixime with Fe(III) in acidic medium. The Fe(II) formed reacts with 1,10-phenanthroline, and the ferroin complex is measured spectrophotometrically at 510 nm against a reagent blank. Beer's law was obeyed in the concentration range 0.2-10 μg/ml with a good correlation coefficient of 0.993. The molar absorptivity was calculated and found to be 1.375×10⁵ L·mol⁻¹·cm⁻¹. The limit of detection (LOD) and limit of quantification (LOQ) were found to be 0.030 and 0.101 μg/ml, respectively. The proposed method has good reproducibility, with a relative standard deviation of 5.28% (n = 6). The developed method was validated statistically by performing a recovery study and was successfully applied for the determination of cefixime in bulk powder and pharmaceutical formulations without interference from common excipients. Percent recoveries were found to range from 98.00 to 102.05% for the pure form and from 97.83 to 102.50% for pharmaceutical preparations.
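
    The calibration arithmetic behind figures like these is a straightforward linear Beer's-law fit; LOD and LOQ are then often taken as 3.3 and 10 times a standard deviation divided by the calibration slope. The sketch below uses that common convention with invented absorbance data; the paper's own estimate of the standard deviation term may differ.

        # Hedged sketch: linear calibration with LOD/LOQ from residual standard deviation.
        # Concentrations and absorbances are illustrative.
        import numpy as np

        conc = np.array([0.2, 0.5, 1.0, 2.0, 5.0, 10.0])            # ug/ml
        absorbance = np.array([0.055, 0.14, 0.27, 0.55, 1.36, 2.71])

        slope, intercept = np.polyfit(conc, absorbance, 1)
        residual_sd = np.std(absorbance - (slope * conc + intercept), ddof=2)
        lod = 3.3 * residual_sd / slope
        loq = 10.0 * residual_sd / slope
        print(round(slope, 3), round(lod, 3), round(loq, 3))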

  5. A simple, quantitative method using alginate gel to determine rat colonic tumor volume in vivo.

    PubMed

    Irving, Amy A; Young, Lindsay B; Pleiman, Jennifer K; Konrath, Michael J; Marzella, Blake; Nonte, Michael; Cacciatore, Justin; Ford, Madeline R; Clipson, Linda; Amos-Landgraf, James M; Dove, William F

    2014-04-01

    Many studies of the response of colonic tumors to therapeutics use tumor multiplicity as the endpoint to determine the effectiveness of the agent. These studies can be greatly enhanced by accurate measurements of tumor volume. Here we present a quantitative method to easily and accurately determine colonic tumor volume. This approach uses a biocompatible alginate to create a negative mold of a tumor-bearing colon; this mold is then used to make positive casts of dental stone that replicate the shape of each original tumor. The weight of the dental stone cast correlates strongly with the weight of the dissected tumors. After refinement of the technique, the overall error in tumor volume was 16.9% ± 7.9% and includes error from both the alginate and dental stone procedures. Because this technique is limited to molding of tumors in the colon, we utilized the Apc(Pirc/+) rat, which has a propensity for developing colonic tumors that reflect the location of the majority of human intestinal tumors. We have successfully used the described method to determine tumor volumes ranging from 4 to 196 mm³. Alginate molding combined with dental stone casting is a facile method for determining tumor volume in vivo without costly equipment or knowledge of analytic software. This broadly accessible method creates the opportunity to objectively study colonic tumors over time in living animals in conjunction with other experiments and without transferring animals from the facility where they are maintained. PMID:24674588

  6. Quantitative calcium resistivity based method for accurate and scalable water vapor transmission rate measurement.

    PubMed

    Reese, Matthew O; Dameron, Arrelaine A; Kempe, Michael D

    2011-08-01

    The development of flexible organic light emitting diode displays and flexible thin film photovoltaic devices is dependent on the use of flexible, low-cost, optically transparent and durable barriers to moisture and/or oxygen. It is estimated that this will require high moisture barriers with water vapor transmission rates (WVTR) between 10(-4) and 10(-6) g/m(2)/day. Thus there is a need to develop a relatively fast, low-cost, and quantitative method to evaluate such low permeation rates. Here, we demonstrate a method where the resistance changes of patterned Ca films, upon reaction with moisture, enable one to calculate a WVTR between 10 and 10(-6) g/m(2)/day or better. Samples are configured with variable aperture size such that the sensitivity and/or measurement time of the experiment can be controlled. The samples are connected to a data acquisition system by means of individual signal cables permitting samples to be tested under a variety of conditions in multiple environmental chambers. An edge card connector is used to connect samples to the measurement wires enabling easy switching of samples in and out of test. This measurement method can be conducted with as little as 1 h of labor time per sample. Furthermore, multiple samples can be measured in parallel, making this an inexpensive and high volume method for measuring high moisture barriers. PMID:21895269

  7. A method for quantitative determination of furanocoumarins in capsules and tablets of phytochemical preparations.

    PubMed

    Cardoso, Claudia Andréa Lima; Pires, Adriana Elias; Honda, Neli Kika

    2006-04-01

    A method for sample preparation and analysis by high-performance liquid chromatography with UV detection (HPLC-UV) was developed for the analysis of psoralen, bergapten and 5-[3-(4,5-dihydro-5,5-dimethyl-4-oxo-2-furanyl)-butoxy]-7H-furo[3-2-g][1]benzopyran-7-one in capsules and tablets employed in Brazil for certain illnesses. The linearity, accuracy, and inter- and intra-day precision of the procedure were evaluated. Analytical curves for the furanocoumarins were linear in the range of 1.0-50.0 microg/ml. The recoveries of the furanocoumarins in the products analyzed were 97.3-99.5%, and the coefficient of variation for the quantitative analysis of the furanocoumarins was under 5%. For the inter-equipment study, gas chromatography (GC) was employed. PMID:16595942

  8. Molecular methods: chip assay and quantitative real-time PCR: in detecting hepatotoxic cyanobacteria.

    PubMed

    Rantala-Ylinen, Anne; Sipari, Hanna; Sivonen, Kaarina

    2011-01-01

    Cyanobacterial mass occurrences are widespread and often contain hepatotoxic, i.e. microcystin- and nodularin-producing, species. Nowadays, detection of microcystin (mcy) and nodularin synthetase (nda) genes is widely used for the recognition of toxic cyanobacterial strains in environmental water samples. The chip assay presented here combines ligation detection reaction and hybridization on a universal microarray to detect and identify the mcyE/ndaF genes of five cyanobacterial genera specifically and sensitively. Thus, one chip assay can reveal the co-occurrence of several hepatotoxin producers. The quantitative real-time PCR method presented is used for the detection of either microcystin-producing Anabaena or Microcystis. Determination of the mcyE gene copy numbers allows the identification of the dominant producer genus in the sample. PMID:21567319

  9. Quantitative test method for evaluation of anti-fingerprint property of coated surfaces

    NASA Astrophysics Data System (ADS)

    Wu, Linda Y. L.; Ngian, S. K.; Chen, Z.; Xuan, D. T. T.

    2011-01-01

    An artificial fingerprint liquid is formulated from artificial sweat, hydroxyl-terminated polydimethylsiloxane and a solvent for direct determination of the anti-fingerprint property of a coated surface. A range of smooth and rough surfaces with different anti-fingerprint (AF) properties were fabricated by sol-gel technology, on which the AF liquid contact angles, artificial fingerprints and real human fingerprints (HF) were verified and correlated. It is shown that a surface with an AF contact angle above 87° is fingerprint-free. This provides an objective and quantitative test method to determine the anti-fingerprint property of coated surfaces. It is also concluded that the AF property can be achieved on smooth and optically clear surfaces. Deep porous structures are more favorable than bumpy structures for oleophobic and AF properties.

  10. Quantitative radiochemical method for determination of major sources of natural radioactivity in ores and minerals

    USGS Publications Warehouse

    Rosholt, J.N., Jr.

    1954-01-01

    When an ore sample contains radioactivity other than that attributable to the uranium series in equilibrium, a quantitative analysis of the other emitters must be made in order to determine the source of this activity. Thorium-232, radon-222, and lead-210 have been determined by isolation and subsequent activity analysis of some of their short-lived daughter products. The sulfides of bismuth and polonium are precipitated out of solutions of thorium or uranium ores, and the α-particle activity of polonium-214, polonium-212, and polonium-210 is determined by scintillation-counting techniques. Polonium-214 activity is used to determine radon-222, polonium-212 activity for thorium-232, and polonium-210 for lead-210. The development of these methods of radiochemical analysis will facilitate the rapid determination of some of the major sources of natural radioactivity.

  11. Rapid and Inexpensive Screening of Genomic Copy Number Variations Using a Novel Quantitative Fluorescent PCR Method

    PubMed Central

    Han, Joan C.; Elsea, Sarah H.; Pena, Heloísa B.; Pena, Sérgio Danilo Junho

    2013-01-01

    Detection of human microdeletion and microduplication syndromes poses a significant burden on public healthcare systems in developing countries. With genome-wide diagnostic assays frequently inaccessible, targeted low-cost PCR-based approaches are preferred. However, their reproducibility depends on equally efficient amplification using a number of target and control primers. To address this, the recently described technique called Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR) was shown to reliably detect four human syndromes by quantifying DNA amplification in an internally controlled PCR reaction. Here, we confirm its utility in the detection of eight human microdeletion syndromes, including the more common WAGR, Smith-Magenis, and Potocki-Lupski syndromes, with 100% sensitivity and 100% specificity. We present the selection, design, and performance evaluation of detection primers using a variety of approaches. We conclude that MQF-PCR is an easily adaptable method for the detection of human pathological chromosomal aberrations. PMID:24288428

  12. Perception of mobbing during the study: results of a national quantitative research among Slovenian midwifery students.

    PubMed

    Došler, Anita Jug; Skubic, Metka; Mivšek, Ana Polona

    2014-09-01

    Mobbing, defined as sustained harassment among workers, in particular towards subordinates, merits investigation. This study aims to investigate the perception of mobbing among Slovenian midwifery students (2nd and 3rd year students of midwifery at the Faculty for Health Studies Ljubljana, the single educational institution for midwives in Slovenia), since acceptable behavioural interrelationships in the midwifery profession are already formed during the study, through professional socialization. A descriptive and causal non-experimental method with a questionnaire was used. Basic descriptive statistics and measures of statistical significance were calculated with SPSS software, version 20.0. All necessary ethical measures were taken into consideration during the study to protect participants. The results revealed that several participants experienced mobbing during the study (82.3%); 58.8% of them during their practical training and 23.5% from midwifery teachers. Students are often anxious and nervous when facing clinical settings (60.8%) or before faculty commitments (exams, presentations, etc.) (41.2%). Many of them (40.4%) estimated that mobbing affected their health. They did not show effective strategies for solving relationship problems. According to the findings, everyone involved in midwifery education, but above all students, should be provided with more knowledge and skills on the successful management of conflict situations. PMID:25420387

  13. The Case Study of an F/OSS Virtualization Platform Deployment and Quantitative Results

    NASA Astrophysics Data System (ADS)

    Stathopoulos, Panagiotis; Soumplis, Alexandros; Houssos, Nikos

    In this paper we present practical experiences and results from the deployment of an F/OSS virtualization platform. EKT's (NDC) core IT infrastructure was transformed into a virtualized one using exclusively F/OSS, while severe budget and timing constraints were in place. This migration was initiated in order to better cope with EKT's service requirements, while accommodating at the same time the need for the in-house development of a large-scale open access infrastructure. The benefits derived from this migration were not only generic virtualization benefits, such as quantifiably reduced power consumption and cost reduction through consolidation, but also benefits specific to F/OSS virtualization.

  14. A convenient method for the quantitative determination of elemental sulfur in coal by HPLC analysis of perchloroethylene extracts

    USGS Publications Warehouse

    Buchanan, D.H.; Coombs, K.J.; Murphy, P.M.; Chaven, C.

    1993-01-01

    A convenient method for the quantitative determination of elemental sulfur in coal is described. Elemental sulfur is extracted from the coal with hot perchloroethylene (PCE) (tetrachloroethene, C2Cl4) and quantitatively determined by HPLC analysis on a C18 reverse-phase column using UV detection. Calibration solutions were prepared from sublimed sulfur. Results of quantitative HPLC analyses agreed with those of a chemical/spectroscopic analysis. The HPLC method was found to be linear over the concentration range of 6 × 10⁻⁴ to 2 × 10⁻² g/L. The lower detection limit was 4 × 10⁻⁴ g/L, which for a coal sample of 20 g is equivalent to 0.0006% by weight of coal. Since elemental sulfur is known to react slowly with hydrocarbons at the temperature of boiling PCE, standard solutions of sulfur in PCE were heated with coals from the Argonne Premium Coal Sample program. Pseudo-first-order uptake of sulfur by the coals was observed over several weeks of heating. For the Illinois No. 6 premium coal, the rate constant for sulfur uptake was 9.7 × 10⁻⁷ s⁻¹, too small for retrograde reactions between solubilized sulfur and coal to cause a significant loss in elemental sulfur isolated during the analytical extraction. No elemental sulfur was produced when the following pure compounds were heated to reflux in PCE for up to 1 week: benzyl sulfide, octyl sulfide, thiane, thiophene, benzothiophene, dibenzothiophene, sulfuric acid, or ferrous sulfate. A slurry of mineral pyrite in PCE contained elemental sulfur which increased in concentration with heating time. © 1993 American Chemical Society.
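
    The claim that retrograde sulfur uptake is negligible during extraction can be checked with a short first-order calculation using the reported rate constant for the Illinois No. 6 coal. The extraction times below are assumptions chosen only to show the scale of the effect.

```python
import math

# Pseudo-first-order loss of solubilized elemental sulfur to the coal, using the
# rate constant reported for the Illinois No. 6 premium coal. The extraction times
# are assumptions used only to illustrate why the loss is negligible on the
# timescale of the analytical extraction.
k = 9.7e-7  # s^-1, reported pseudo-first-order rate constant

for hours in (1, 4, 24):
    t = hours * 3600.0                       # s
    fraction_lost = 1.0 - math.exp(-k * t)   # C/C0 = exp(-kt) for first-order uptake
    print(f"{hours:>3} h extraction: {100 * fraction_lost:.2f}% of dissolved sulfur lost")
```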

  15. Transconvolution and the virtual positron emission tomograph-A new method for cross calibration in quantitative PET/CT imaging

    SciTech Connect

    Prenosil, George A.; Weitzel, Thilo; Hentschel, Michael; Klaeser, Bernd; Krause, Thomas

    2013-06-15

    Purpose: Positron emission tomography (PET)/computed tomography (CT) measurements on small lesions are impaired by the partial volume effect, which is intrinsically tied to the point spread function of the actual imaging system, including the reconstruction algorithms. The variability resulting from different point spread functions hinders the assessment of quantitative measurements in clinical routine and especially degrades comparability within multicenter trials. To improve quantitative comparability there is a need for methods to match different PET/CT systems through elimination of this systemic variability. Consequently, a new method was developed and tested that transforms the image of an object as produced by one tomograph to another image of the same object as it would have been seen by a different tomograph. The proposed new method, termed Transconvolution, compensates for differing imaging properties of different tomographs and particularly aims at quantitative comparability of PET/CT in the context of multicenter trials. Methods: To solve the problem of image normalization, the theory of Transconvolution was mathematically established together with new methods to handle point spread functions of different PET/CT systems. Knowing the point spread functions of two different imaging systems allows determining a Transconvolution function to convert one image into the other. This function is calculated by convolving one point spread function with the inverse of the other point spread function, which, when adhering to certain boundary conditions such as the use of linear acquisition and image reconstruction methods, is a numerically accessible operation. For reliable measurement of such point spread functions characterizing different PET/CT systems, a dedicated solid-state phantom incorporating ⁶⁸Ge/⁶⁸Ga-filled spheres was developed. To iteratively determine and represent such point spread functions, exponential density functions in combination with a Gaussian distribution were introduced. Furthermore, simulation of a virtual PET system provided a standard imaging system with clearly defined properties to which the real PET systems were to be matched. A Hann window served as the modulation transfer function for the virtual PET. The Hann window's apodization properties suppressed high spatial frequencies above a certain critical frequency, thereby fulfilling the above-mentioned boundary conditions. The determined point spread functions were subsequently used by the novel Transconvolution algorithm to match different PET/CT systems onto the virtual PET system. Finally, the theoretically elaborated Transconvolution method was validated by transforming phantom images acquired on two different PET systems into nearly identical data sets, as they would be imaged by the virtual PET system. Results: The proposed Transconvolution method matched different PET/CT systems for an improved and reproducible determination of a normalized activity concentration. The highest difference in measured activity concentration between the two different PET systems, 18.2%, was found in spheres of 2 ml volume. Transconvolution reduced this difference to 1.6%. In addition to reestablishing comparability, the new method with its parameterization of point spread functions allowed a full characterization of the imaging properties of the examined tomographs. Conclusions: By matching different tomographs to a virtual standardized imaging system, Transconvolution opens a new comprehensive method for cross calibration in quantitative PET imaging. The use of a virtual PET system restores comparability between data sets from different PET systems by exerting a common, reproducible, and defined partial volume effect.
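
    The core operation described in the Methods, convolving one point spread function with the regularized inverse of the other, is equivalent to dividing transfer functions in the Fourier domain and multiplying by the virtual system's Hann-window MTF. The 1-D sketch below illustrates this under assumed parameters (Gaussian PSF width, cutoff frequency, object); it is not the authors' implementation.

```python
import numpy as np

# 1-D sketch of transconvolution under assumed parameters: map an image blurred by
# the PSF of "system A" onto a virtual system whose modulation transfer function is
# a Hann window. PSF width, cutoff and the object are illustrative assumptions.
n, dx = 512, 1.0
x = (np.arange(n) - n // 2) * dx
psf_a = np.exp(-x**2 / (2 * 4.0**2))
psf_a /= psf_a.sum()                                   # system A PSF (sigma = 4, assumed)

img = np.zeros(n)
img[240:272] = 1.0                                     # simple rectangular "object"
img_a = np.real(np.fft.ifft(np.fft.fft(img) * np.fft.fft(np.fft.ifftshift(psf_a))))

f = np.fft.fftfreq(n, dx)
f_c = 0.08                                             # assumed cutoff of the virtual system
hann = np.where(np.abs(f) < f_c, 0.5 * (1 + np.cos(np.pi * f / f_c)), 0.0)  # virtual MTF

mtf_a = np.fft.fft(np.fft.ifftshift(psf_a))
transfer = np.zeros(n, dtype=complex)
nz = np.abs(mtf_a) > 1e-8
transfer[nz] = hann[nz] / mtf_a[nz]                    # Hann MTF times inverse of system-A MTF

img_virtual = np.real(np.fft.ifft(np.fft.fft(img_a) * transfer))
print(round(img_virtual.max(), 3), round(img_virtual.sum(), 3))
```

The Hann window is zero beyond the cutoff, so the division never amplifies frequencies where the measured MTF is close to zero, which is the boundary condition the abstract alludes to.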

  16. Application of quantitative 1H-NMR method to determination of gentiopicroside in Gentianae radix and Gentianae scabrae radix.

    PubMed

    Tanaka, Rie; Hasebe, Yuko; Nagatsu, Akito

    2014-07-01

    A quantitative (1)H-NMR method (qHNMR) was used to measure gentiopicroside content in Gentianae radix and Gentianae scabrae radix. Gentiopicroside is a major component of Gentianae radix and Gentianae scabrae radix. The purity of gentiopicroside was calculated from the ratio of the intensity of the H-3 signal at δ 7.44 ppm or the H-8 signal at δ 5.78 ppm of gentiopicroside in methanol-d4 to that of a hexamethyldisilane (HMD) signal at 0 ppm. The concentration of HMD was corrected with SI traceability by using potassium hydrogen phthalate of certified reference material (CRM) grade. As a result, the gentiopicroside content in two lots of Gentianae radix as determined by qHNMR was found to be 1.76 and 2.17%, respectively. The gentiopicroside content in two lots of Gentianae scabrae radix was 2.73 and 3.99%, respectively. We demonstrated that this method is useful for the quantitative analysis of crude drugs. PMID:24687868
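
    For context, qHNMR converts an integral ratio into a content value through proton counts, molar masses, and the weighed amounts of standard and sample. The sketch below applies that generic relation; the integrals, weights, and standard purity are assumptions rather than the paper's data (only the molar masses of gentiopicroside and hexamethyldisilane are real values).

```python
# Generic qHNMR relative-quantitation relation (a sketch; the sample masses,
# purities and integrals below are assumptions, not data from the paper).
def qhnmr_content(I_a, I_std, N_a, N_std, M_a, M_std, m_std, m_sample, P_std):
    """Analyte mass fraction in the weighed sample, in the same units as P_std (%)."""
    return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_sample) * P_std

# Example: gentiopicroside (M = 356.3 g/mol, one-proton H-3 signal) against
# hexamethyldisilane (M = 146.4 g/mol, 18 equivalent protons).
print(round(qhnmr_content(I_a=0.52, I_std=10.0, N_a=1, N_std=18,
                          M_a=356.3, M_std=146.4,
                          m_std=5.0, m_sample=1000.0, P_std=99.9), 2))
```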

  17. A systematic study on the influencing parameters and improvement of quantitative analysis of multi-component with single marker method using notoginseng as research subject.

    PubMed

    Wang, Chao-Qun; Jia, Xiu-Hong; Zhu, Shu; Komatsu, Katsuko; Wang, Xuan; Cai, Shao-Qing

    2015-03-01

    A new quantitative analysis of multi-component with single marker (QAMS) method for 11 saponins (ginsenosides Rg1, Rb1, Rg2, Rh1, Rf, Re and Rd; notoginsenosides R1, R4, Fa and K) in notoginseng was established, in which 6 of these saponins were individually used as internal reference substances to investigate the influences of chemical structure, concentrations of quantitative components, and purities of the standard substances on the accuracy of the QAMS method. The results showed that the concentration of the analyte in the sample solution was the major influencing parameter, whereas the other parameters had minimal influence on the accuracy of the QAMS method. A new method for calculating the relative correction factors by linear regression was established (the linear regression method), which was shown to decrease the differences between the QAMS method and the standard method from 1.20% ± 0.02% - 23.29% ± 3.23% to 0.10% ± 0.09% - 8.84% ± 2.85% in comparison with the previous method. The differences between the external standard method and the QAMS method using relative correction factors calculated by the linear regression method were below 5% in the quantitative determination of Rg1, Re, R1, Rd and Fa in 24 notoginseng samples and of Rb1 in 21 notoginseng samples, and were mostly below 10% in the quantitative determination of Rf, Rg2, R4 and notoginsenoside K (the differences for these 4 constituents were larger because their contents were lower) in all 24 notoginseng samples. The results indicated that the contents assayed by the new QAMS method could be considered as accurate as those assayed by the external standard method. In addition, a method for determining the applicable concentration ranges of the quantitative components assayed by the QAMS method was established for the first time, which could ensure its high accuracy and could be applied to QAMS methods for other TCMs. The present study demonstrated the practicability of the QAMS method for the quantitative analysis of multiple components and the quality control of TCMs and TCM prescriptions. PMID:25618711
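
    A minimal sketch of the linear-regression variant of QAMS is given below: the relative correction factor is taken as the ratio of calibration slopes, and the analyte is then quantified from the internal reference compound alone. All calibration levels, peak areas, and sample values are invented for illustration.

```python
import numpy as np

# Sketch of a QAMS relative correction factor obtained by linear regression.
# Calibration data and sample values are assumptions; slope_ref / slope_analyte
# plays the role of the relative correction factor.
def response_slope(conc, area):
    """Least-squares slope of peak area vs. concentration (response factor)."""
    slope, _intercept = np.polyfit(conc, area, 1)
    return slope

conc = np.array([10., 25., 50., 100., 200.])           # µg/mL, assumed calibration levels
area_rg1 = np.array([151., 377., 748., 1502., 2995.])  # internal reference compound (e.g. Rg1)
area_re = np.array([118., 300., 603., 1190., 2410.])   # analyte quantified via the reference

f_re = response_slope(conc, area_rg1) / response_slope(conc, area_re)  # correction factor

# Quantify the analyte in a sample using only the reference compound's calibration:
area_re_sample, area_rg1_sample, conc_rg1_sample = 820.0, 1020.0, 68.0  # assumed sample data
conc_re_sample = f_re * area_re_sample * conc_rg1_sample / area_rg1_sample
print(round(f_re, 3), round(conc_re_sample, 1), "µg/mL")
```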

  18. Design and Performance Considerations for the Quantitative Measurement of HEU Residues Resulting from 99Mo Production

    SciTech Connect

    McElroy, Robert Dennis; Chapman, Jeffrey Allen; Bogard, James S; Belian, Anthony P

    2011-01-01

    Molybdenum-99 is produced by the irradiation of high-enriched uranium (HEU) resulting in the accumulation of large quantities of HEU residues. In general, these residues are not recycled but are either disposed of or stored in containers with surface exposure rates as high as 100 R/h. The 235U content of these waste containers must be quantified for both accountability and waste disposal purposes. The challenges of quantifying such difficult-to-assay materials are discussed, along with performance estimates for each of several potential assay options. In particular, the design and performance of a High Activity Active Well Coincidence Counting (HA-AWCC) system designed and built specifically for these irradiated HEU waste materials are presented.

  19. A quantitative method for evaluating numerical simulation accuracy of time-transient Lamb wave propagation with its applications to selecting appropriate element size and time step.

    PubMed

    Wan, Xiang; Xu, Guanghua; Zhang, Qing; Tse, Peter W; Tan, Haihui

    2016-01-01

    The Lamb wave technique has been widely used in non-destructive evaluation (NDE) and structural health monitoring (SHM). However, due to its multi-mode characteristics and dispersive nature, Lamb wave propagation behavior is much more complex than that of bulk waves. Numerous numerical simulations of Lamb wave propagation have been conducted to study its physical principles. However, few quantitative studies on evaluating the accuracy of these numerical simulations have been reported. In this paper, a method based on cross correlation analysis for quantitatively evaluating the simulation accuracy of time-transient Lamb wave propagation is proposed. Two kinds of error, affecting the position and shape accuracy, are first identified. Consequently, two quantitative indices, i.e., the GVE (group velocity error) and MACCC (maximum absolute value of cross correlation coefficient), derived from cross correlation analysis between a simulated signal and a reference waveform, are proposed to assess the position and shape errors of the simulated signal. In this way, the simulation accuracy in terms of position and shape is quantitatively evaluated. In order to apply the proposed method to select an appropriate element size and time step, a specialized 2D-FEM program combined with the proposed method was developed. Then, the proper element size considering different element types and the proper time step considering different time integration schemes were selected. These results proved that the proposed method is feasible and effective, and can be used as an efficient tool for quantitatively evaluating and verifying the simulation accuracy of time-transient Lamb wave propagation. PMID:26315506
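
    The two indices can be reproduced on synthetic signals: the lag of the cross-correlation peak gives the arrival-time (and hence group-velocity) error, and the normalized peak value measures shape similarity. In the sketch below the sampling rate, tone-burst parameters, propagation distance, and the imposed delay are all assumptions, and the arrival time is taken as the known burst onset for simplicity.

```python
import numpy as np

# Sketch of the two indices described above, computed on synthetic tone bursts.
fs, fc, dist = 10e6, 200e3, 0.5            # sampling rate (Hz), centre frequency (Hz), distance (m)
t = np.arange(0, 200e-6, 1 / fs)

def burst(t0, cycles=5):
    """Hann-windowed tone burst starting at time t0."""
    tau = t - t0
    win = (0 <= tau) & (tau <= cycles / fc)
    return np.where(win, 0.5 * (1 - np.cos(2 * np.pi * fc * tau / cycles)) *
                    np.sin(2 * np.pi * fc * tau), 0.0)

ref = burst(100e-6)                        # reference waveform
sim = 0.9 * burst(101.5e-6)                # "simulated" signal: small delay + amplitude change

xcorr = np.correlate(sim, ref, mode="full")
lags = np.arange(-len(ref) + 1, len(ref)) / fs
lag = lags[np.argmax(np.abs(xcorr))]       # arrival-time error from the correlation peak

v_ref = dist / 100e-6                      # reference group velocity
v_sim = dist / (100e-6 + lag)              # simulated group velocity
gve = abs(v_sim - v_ref) / v_ref * 100     # group velocity error, %

maccc = np.max(np.abs(xcorr)) / np.sqrt(np.sum(ref**2) * np.sum(sim**2))  # shape index
print(f"GVE = {gve:.2f} %, MACCC = {maccc:.3f}")
```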

  20. Quantitative fault analysis of roller bearings based on a novel matching pursuit method with a new step-impulse dictionary

    NASA Astrophysics Data System (ADS)

    Cui, Lingli; Wu, Na; Ma, Chunqing; Wang, Huaqing

    2016-02-01

    A novel matching pursuit method based on a new step-impulse dictionary to measure the size of a bearing's spall-like fault is presented in this study. Based on the seemingly double-impact theory and the rolling bearing fault mechanism, a theoretical model for the bearing fault with different spall-like fault sizes is developed and analyzed, and the seemingly double-impact characteristic of the bearing faults is explained. The first event is the entry of the rolling element into the spall-like fault, which can be described as a step-like response; the second is the exit of the rolling element from the spall-like fault, which can be described as an impulse-like response. Based on the quantitative relationship between the time interval of the seemingly double-impact events and the fault size, a novel matching pursuit method based on a new step-impulse dictionary is proposed. In addition, a quantitative matching pursuit algorithm is proposed for bearing fault diagnosis based on the new dictionary model. Finally, an atom selection mechanism is proposed to improve the measurement accuracy of bearing fault size. The simulation results of this study indicate that the new matching pursuit method based on the new step-impulse dictionary can be reliably used to measure the sizes of bearing spall-like faults. Application of this method to fault signals of bearing outer races measured at different speeds has shown that the proposed method can effectively measure a bearing's spall-like fault size.
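
    Once the entry and exit instants have been separated, the spall width follows from the time interval and the speed at which the rolling element traverses the defect. The sketch below uses a simplified outer-race geometry; the bearing dimensions, shaft speed, and measured interval are assumptions, and published models refine this relation further, so treat it as an order-of-magnitude estimate only.

```python
import math

# Simplified estimate of an outer-race spall width from the time interval between
# the entry (step-like) and exit (impulse-like) events. Bearing geometry, shaft
# speed and the measured interval are assumptions.
shaft_hz = 25.0                    # shaft rotation frequency, Hz (assumed)
d, D, phi = 7.94e-3, 39.0e-3, 0.0  # ball diameter, pitch diameter (m), contact angle (rad)
dt = 1.1e-4                        # s, interval between the two matched atoms (assumed)

cage_hz = 0.5 * shaft_hz * (1 - (d / D) * math.cos(phi))  # fundamental train frequency
v_pass = math.pi * D * cage_hz     # speed of the rolling-element centre along the outer race
size = v_pass * dt                 # spall width traversed between entry and exit
print(f"cage frequency = {cage_hz:.2f} Hz, estimated spall width = {size * 1e3:.2f} mm")
```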

  1. Development and field application of a quantitative method for examining natural assemblages of protists with oligonucleotide probes.

    PubMed Central

    Lim, E L; Caron, D A; Delong, E F

    1996-01-01

    A fluorescent in situ hybridization method that uses rRNA-targeted oligonucleotide probes for counting protists in cultures and environmental water samples is described. Filtration, hybridization, and enumeration of fixed cells with biotinylated eukaryote-specific probes and fluorescein isothiocyanate-conjugated avidin were performed directly on 0.4-microns-pore-size polycarbonate filters of Transwell cell culture inserts (Costar Corp., Cambridge, Mass.). Counts of various species of cultured protists by this probe hybridization method were not significantly different from counts obtained by the 4',6-diamidino-2-phenylindole (DAPI) and acridine orange (AO) staining methods. However, counts of total nanoplankton (TNAN) based on probe hybridizations in several field samples and in samples collected from a mesocosm experiment were frequently higher than TNAN counts obtained by staining with DAPI or AO. On the basis of these results, 25 to 70% of the TNAN determined with probes were not detectable by DAPI or AO staining. The underestimation of TNAN abundances in samples stained with DAPI or AO was attributed to the existence of small nanoplanktonic cells which could be detected with probes but not DAPI or AO and the difficulty associated with distinguishing DAPI- or AO-stained protists attached to or embedded in aggregates. We conclude from samples examined in this study that enumeration of TNAN with oligonucleotide probes provides estimates of natural TNAN abundances that are at least as high as (and in some cases higher than) counts obtained with commonly employed fluorochrome stains. The quantitative in situ hybridization method we have described here enables the direct enumeration of free-living protists in water samples with oligonucleotide probes. When combined with species-specific probes, this method will enable quantitative studies of the abundance and distribution of specific protistan taxa. PMID:8919803

  2. Development of an Effective Method for Recovery of Viral Genomic RNA from Environmental Silty Sediments for Quantitative Molecular Detection

    PubMed Central

    Miura, Takayuki; Masago, Yoshifumi; Sano, Daisuke; Omura, Tatsuo

    2011-01-01

    Nine approaches to recover viral RNA from environmental silty sediments were newly developed and compared to quantify RNA viruses in sediments using molecular methods. Four of the nine approaches employed direct procedures for extracting RNA from sediments (direct methods), and the remaining five approaches used indirect methods wherein viral particles were recovered before RNA extraction. A direct method using an SDS buffer with EDTA to lyse viral capsids in sediments, phenol-chloroform-isoamyl alcohol to extract RNA, isopropanol to concentrate RNA, and magnetic beads to purify RNA resulted in the highest rate of recovery (geometric mean of 11%, with a geometric standard deviation of 0.02; n = 7) of poliovirus 1 (PV1) inoculated in an environmental sediment sample. The direct method exhibiting the highest rate of PV1 recovery was applied to environmental sediment samples. One hundred eight sediment samples were collected from the Takagi River, Miyagi, Japan, and its estuary from November 2007 to April 2009, and the genomic RNAs of enterovirus and human norovirus in these samples were quantified by reverse transcription (RT)-quantitative PCR (qPCR). The human norovirus genome was detected in one sample collected at the bay, although its concentration was below the quantification limit. Meanwhile, the enterovirus genome was detected in two samples at the river mouth and river at concentrations of 8.6 × 10² and 2.4 × 10² copies/g (wet weight), respectively. This is the first report to obtain quantitative data for a human pathogenic virus in a river and in estuarine sediments using RT-qPCR. PMID:21515729

  3. Parents' decision-making about the human papillomavirus vaccine for their daughters: I. Quantitative results.

    PubMed

    Krawczyk, Andrea; Knäuper, Bärbel; Gilca, Vladimir; Dubé, Eve; Perez, Samara; Joyal-Desmarais, Keven; Rosberger, Zeev

    2015-01-01

    Vaccination against the human papillomavirus (HPV) is an effective primary prevention measure for HPV-related diseases. For children and young adolescents, the uptake of the vaccine is contingent on parental consent. This study sought to identify key differences between parents who obtain (acceptors) and parents who refuse (non-acceptors) the HPV vaccine for their daughters. In the context of a free, universal, school-based HPV vaccination program in Québec, 774 parents of 9-10 year-old girls completed and returned a questionnaire by mail. The questionnaire was based on the theoretical constructs of the Health Belief Model (HBM), along with constructs from other theoretical frameworks. Of the 774 parents, 88.2% reported their daughter having received the HPV vaccine. Perceived susceptibility of daughters to HPV infection, perceived benefits of the vaccine, perceived barriers (including safety of the vaccine), and cues to action significantly distinguished between parents whose daughters had received the HPV vaccine and those whose daughters had not. Other significant factors associated with daughter vaccine uptake were parents' general vaccination attitudes, anticipated regret, adherence to other routinely recommended vaccines, social norms, and positive media influence. The results of this study identify a number of important correlates related to parents' decisions to accept or refuse the HPV vaccine uptake for their daughters. Future work may benefit from targeting such factors and incorporating other health behavior theories in the design of effective HPV vaccine uptake interventions. PMID:25692455

  4. Quantitative Results from Shockless Compression Experiments on Solids to Multi-Megabar Pressure

    NASA Astrophysics Data System (ADS)

    Davis, Jean-Paul; Brown, Justin; Knudson, Marcus; Lemke, Raymond

    2015-03-01

    Quasi-isentropic, shockless ramp-wave experiments promise accurate equation-of-state (EOS) data in the solid phase at relatively low temperatures and multi-megabar pressures. In this range of pressure, isothermal diamond-anvil techniques have limited pressure accuracy due to reliance on theoretical EOS of calibration standards, thus accurate quasi-isentropic compression data would help immensely in constraining EOS models. Multi-megabar shockless compression experiments using the Z Machine at Sandia as a magnetic drive with stripline targets continue to be performed on a number of solids. New developments will be presented in the design and analysis of these experiments, including topics such as 2-D and magneto-hydrodynamic (MHD) effects and the use of LiF windows. Results will be presented for tantalum and/or gold metals, with comparisons to independently developed EOS. * Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  5. Parents’ decision-making about the human papillomavirus vaccine for their daughters: I. Quantitative results

    PubMed Central

    Krawczyk, Andrea; Knäuper, Bärbel; Gilca, Vladimir; Dubé, Eve; Perez, Samara; Joyal-Desmarais, Keven; Rosberger, Zeev

    2015-01-01

    Vaccination against the human papillomavirus (HPV) is an effective primary prevention measure for HPV-related diseases. For children and young adolescents, the uptake of the vaccine is contingent on parental consent. This study sought to identify key differences between parents who obtain (acceptors) and parents who refuse (non-acceptors) the HPV vaccine for their daughters. In the context of a free, universal, school-based HPV vaccination program in Québec, 774 parents of 9–10 year-old girls completed and returned a questionnaire by mail. The questionnaire was based on the theoretical constructs of the Health Belief Model (HBM), along with constructs from other theoretical frameworks. Of the 774 parents, 88.2% reported their daughter having received the HPV vaccine. Perceived susceptibility of daughters to HPV infection, perceived benefits of the vaccine, perceived barriers (including safety of the vaccine), and cues to action significantly distinguished between parents whose daughters had received the HPV vaccine and those whose daughters had not. Other significant factors associated with daughter vaccine uptake were parents’ general vaccination attitudes, anticipated regret, adherence to other routinely recommended vaccines, social norms, and positive media influence. The results of this study identify a number of important correlates related to parents' decisions to accept or refuse the HPV vaccine uptake for their daughters. Future work may benefit from targeting such factors and incorporating other health behavior theories in the design of effective HPV vaccine uptake interventions. PMID:25692455

  6. Quantitative ultrasound method for assessing stress-strain properties and the cross-sectional area of Achilles tendon

    NASA Astrophysics Data System (ADS)

    Du, Yi-Chun; Chen, Yung-Fu; Li, Chien-Ming; Lin, Chia-Hung; Yang, Chia-En; Wu, Jian-Xing; Chen, Tainsong

    2013-12-01

    The Achilles tendon is one of the most commonly injured tendons in the human body, with a variety of causes such as trauma, overuse and degeneration. Rupture and tendinosis are relatively common for this strong tendon. Stress-strain properties and shape change are important biomechanical properties of the tendon for assessing surgical repair or healing progress. Currently, there are rather limited non-invasive methods available for precisely quantifying the in vivo biomechanical properties of tendons. The aim of this study was to apply quantitative ultrasound (QUS) methods, including ultrasonic attenuation and speed of sound (SOS), to investigate porcine tendons under different stress-strain conditions. In order to find a reliable method for evaluating the change of tendon shape, ultrasound measurement was also utilized for measuring tendon thickness and compared with the change in tendon cross-sectional area under different stress. A total of 15 porcine tendons of hind trotters were examined. The test results show that the attenuation and broadband ultrasound attenuation decreased, and the SOS increased by a smaller magnitude, as the uniaxial stress-strain loading on the tendons increased. Furthermore, the tendon thickness measured with the ultrasound method was significantly correlated with tendon cross-sectional area (Pearson coefficient = 0.86). These results also indicate that QUS attenuation and ultrasonic thickness measurement are reliable and promising parameters for assessing the biomechanical properties of tendons. Further investigations are needed to warrant the application of the proposed method in a clinical setting.

  7. Quantitatively estimating defects in graphene devices using discharge current analysis method

    PubMed Central

    Jung, Ukjin; Lee, Young Gon; Kang, Chang Goo; Lee, Sangchul; Kim, Jin Ju; Hwang, Hyeon June; Lim, Sung Kwan; Ham, Moon-Ho; Lee, Byoung Hun

    2014-01-01

    Defects of graphene are the most important concern for the successful applications of graphene since they affect device performance significantly. However, once the graphene is integrated in the device structures, the quality of graphene and surrounding environment could only be assessed using indirect information such as hysteresis, mobility and drive current. Here we develop a discharge current analysis method to measure the quality of graphene integrated in a field effect transistor structure by analyzing the discharge current and examine its validity using various device structures. The density of charging sites affecting the performance of graphene field effect transistor obtained using the discharge current analysis method was on the order of 10¹⁴/cm², which closely correlates with the intensity ratio of the D to G bands in Raman spectroscopy. The graphene FETs fabricated on poly(ethylene naphthalate) (PEN) are found to have a lower density of charging sites than those on SiO2/Si substrate, mainly due to reduced interfacial interaction between the graphene and the PEN. This method can be an indispensable means to improve the stability of devices using a graphene as it provides an accurate and quantitative way to define the quality of graphene after the device fabrication. PMID:24811431

  8. Quantitatively estimating defects in graphene devices using discharge current analysis method.

    PubMed

    Jung, Ukjin; Lee, Young Gon; Kang, Chang Goo; Lee, Sangchul; Kim, Jin Ju; Hwang, Hyeon June; Lim, Sung Kwan; Ham, Moon-Ho; Lee, Byoung Hun

    2014-01-01

    Defects of graphene are the most important concern for the successful applications of graphene since they affect device performance significantly. However, once the graphene is integrated in the device structures, the quality of graphene and surrounding environment could only be assessed using indirect information such as hysteresis, mobility and drive current. Here we develop a discharge current analysis method to measure the quality of graphene integrated in a field effect transistor structure by analyzing the discharge current and examine its validity using various device structures. The density of charging sites affecting the performance of graphene field effect transistor obtained using the discharge current analysis method was on the order of 10(14)/cm(2), which closely correlates with the intensity ratio of the D to G bands in Raman spectroscopy. The graphene FETs fabricated on poly(ethylene naphthalate) (PEN) are found to have a lower density of charging sites than those on SiO2/Si substrate, mainly due to reduced interfacial interaction between the graphene and the PEN. This method can be an indispensable means to improve the stability of devices using a graphene as it provides an accurate and quantitative way to define the quality of graphene after the device fabrication. PMID:24811431
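
    One generic way to turn a measured discharge transient into an areal density of charging sites is to integrate the current to obtain the released charge and divide by the elementary charge and the gated area. The sketch below is only an illustration of that kind of calculation, with a synthetic exponential transient and an assumed device area; it is not the authors' exact analysis procedure.

```python
import numpy as np

# Generic illustration (not the authors' exact procedure): integrate a discharge
# transient to estimate the areal density of charging sites. The synthetic
# exponential transient, device area and time base are assumptions.
q = 1.602e-19                                # elementary charge, C
area_cm2 = 4e-6                              # gated graphene area, cm^2 (assumed 20 µm x 20 µm)

t = np.linspace(0, 10.0, 2001)               # s
i_discharge = 4e-11 * np.exp(-t / 1.5)       # A, assumed measured discharge current

# Trapezoidal integration of the current gives the total released charge.
charge = float(np.sum(0.5 * (i_discharge[1:] + i_discharge[:-1]) * np.diff(t)))
density = charge / (q * area_cm2)            # charging sites per cm^2
print(f"released charge = {charge:.2e} C, site density = {density:.2e} cm^-2")
```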

  9. A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime

    PubMed Central

    Fitterer, Jessica L.; Nelson, Trisalyn A.

    2015-01-01

    Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data of which 48 used cross-sectional datasets. Regression was the prominent modelling choice (n = 78) though dependent on data many variations existed. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and shift toward prospective (forecast) modelling over small areas (e.g., blocks). PMID:26418016

  10. Spatial stochastic simulation offers potential as a quantitative method for pest risk analysis.

    PubMed

    Rafoss, Trond

    2003-08-01

    Pest risk analysis represents an emerging field of risk analysis that evaluates the potential risks of the introduction and establishment of plant pests into a new geographic location and then assesses the management options to reduce those potential risks. Development of new and adapted methodology is required to answer questions concerning pest risk analysis of exotic plant pests. This research describes a new method for predicting the potential establishment and spread of a plant pest into new areas using a case study, Ralstonia solanacearum, a bacterial disease of potato. This method combines current quantitative methodologies, stochastic simulation, and geographic information systems with knowledge of pest biology and environmental data to derive new information about pest establishment potential in a geographical region where a pest had not been introduced. This proposed method extends an existing methodology for matching pest characteristics with environmental conditions by modeling and simulating dissemination behavior of a pest organism. Issues related to integrating spatial variables into risk analysis models are further discussed in this article. PMID:12926559

  11. Semi-quantitative characterisation of ambient ultrafine aerosols resulting from emissions of coal fired power stations.

    PubMed

    Hinkley, J T; Bridgman, H A; Buhre, B J P; Gupta, R P; Nelson, P F; Wall, T F

    2008-02-25

    Emissions from coal fired power stations are known to be a significant anthropogenic source of fine atmospheric particles, both through direct primary emissions and secondary formation of sulfate and nitrate from emissions of gaseous precursors. However, there is relatively little information available in the literature regarding the contribution emissions make to the ambient aerosol, particularly in the ultrafine size range. In this study, the contribution of emissions to particles smaller than 0.3 μm in the ambient aerosol was examined at a sampling site 7 km from two large Australian coal fired power stations equipped with fabric filters. A novel approach was employed using conditional sampling based on sulfur dioxide (SO(2)) as an indicator species, and a relatively new sampler, the TSI Nanometer Aerosol Sampler. Samples were collected on transmission electron microscope (TEM) grids and examined using a combination of TEM imaging and energy dispersive X-ray (EDX) analysis for qualitative chemical analysis. The ultrafine aerosol in low SO(2) conditions was dominated by diesel soot from vehicle emissions, while significant quantities of particles, which were unstable under the electron beam, were observed in the high SO(2) samples. The behaviour of these particles was consistent with literature accounts of sulfate and nitrate species, believed to have been derived from precursor emissions from the power stations. A significant carbon peak was noted in the residues from the evaporated particles, suggesting that some secondary organic aerosol formation may also have been catalysed by these acid seed particles. No primary particulate material was observed in the minus 0.3 μm fraction. The results of this study indicate the contribution of species more commonly associated with gas to particle conversion may be more significant than expected, even close to source. PMID:18054995

  12. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Pavlov, K.A.

    1997-01-01

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing. © 1997 American Institute of Physics.

  13. A novel method for quantitative determination of tea polysaccharide by resonance light scattering.

    PubMed

    Wei, Xinlin; Xi, Xionggang; Wu, Muxia; Wang, Yuanfeng

    2011-09-01

    A new method for the determination of tea polysaccharide (TPS) in green tea (Camellia sinensis) leaves has been developed. The method was based on the enhancement of resonance light scattering (RLS) of TPS in the presence of a cetylpyridinium chloride (CPC)-NaOH system. Under the optimum conditions, the RLS intensity of CPC was greatly enhanced by adding TPS. The maximum peak of the enhanced RLS spectra was located at 484.02 nm. The enhanced RLS intensity was proportional to the concentration of TPS in the range of 2.0-20 μg/ml. The new method and the phenol-sulfuric acid method gave equivalent results when measuring the standard compounds. The recoveries of the two methods were 96.39-103.7% (novel method) and 100.15-103.65% (phenol-sulfuric acid method), respectively. However, the two methods differed to some extent: the new method offered a limit of detection (LOD) of 0.047 μg/ml, whereas the phenol-sulfuric acid method gave a LOD of 1.54 μg/ml. Interference experiments demonstrated that the new method had high selectivity and was more suitable for the determination of TPS than the phenol-sulfuric acid method. Stability tests showed that the new method had good stability. Moreover, the proposed method offers the advantages of easy operation, rapidity and practicability, which suggests that it could be satisfactorily applied to the determination of TPS in green tea. PMID:21571584

  14. A novel method for quantitative determination of tea polysaccharide by resonance light scattering

    NASA Astrophysics Data System (ADS)

    Wei, Xinlin; Xi, Xionggang; Wu, Muxia; Wang, Yuanfeng

    2011-09-01

    A new method for the determination of tea polysaccharide (TPS) in green tea (Camellia sinensis) leaves has been developed. The method was based on the enhancement of resonance light scattering (RLS) of TPS in the presence of a cetylpyridinium chloride (CPC)-NaOH system. Under the optimum conditions, the RLS intensity of CPC was greatly enhanced by adding TPS. The maximum peak of the enhanced RLS spectra was located at 484.02 nm. The enhanced RLS intensity was proportional to the concentration of TPS in the range of 2.0-20 μg/ml. The new method and the phenol-sulfuric acid method gave equivalent results when measuring the standard compounds. The recoveries of the two methods were 96.39-103.7% (novel method) and 100.15-103.65% (phenol-sulfuric acid method), respectively. However, the two methods differed to some extent: the new method offered a limit of detection (LOD) of 0.047 μg/ml, whereas the phenol-sulfuric acid method gave a LOD of 1.54 μg/ml. Interference experiments demonstrated that the new method had high selectivity and was more suitable for the determination of TPS than the phenol-sulfuric acid method. Stability tests showed that the new method had good stability. Moreover, the proposed method offers the advantages of easy operation, rapidity and practicability, which suggests that it could be satisfactorily applied to the determination of TPS in green tea.

  15. Quantifying social norms: by coupling the ecosystem management concept and semi-quantitative sociological methods

    NASA Astrophysics Data System (ADS)

    Zhang, D.; Xu, H.

    2012-12-01

    Over recent decades, human-induced environmental changes have grown steadily and rapidly in intensity and impact, to the point where they now often exceed natural impacts. As one of the important components of human activities, social norms play key roles in environmental and natural resource management. However, the lack of relevant quantitative data about social norms greatly limits our scientific understanding of the complex linkages between humans and nature, and hampers the solving of pressing environmental and social problems. In this study, we built a quantification method by coupling the ecosystem management concept, semi-quantitative sociological methods and mathematical statistics. We derived the quantified value of a social norm from two parts: whether its content coincides with the concept of ecosystem management (content value) and how well it performed once put into implementation (implementation value). First, we separately identified 12 core elements of ecosystem management and 16 indexes of social norms, and matched them one by one; according to their degree of matching, we obtained the content value of each social norm. Second, we selected 8 key factors that represent the performance of social norms after implementation and obtained the implementation value by the Delphi method. Adding these two values gave the final value of each social norm. Third, we conducted a case study in the Heihe River Basin, the second largest inland river basin in China, using 12 official edicts related to its ecosystem management. By doing so, we first obtained quantified data on social norms that can be applied directly to research involving observational or experimental data collection of natural processes. Second, because each value is supported by specific content, the method can help create a clear road map for building or revising management and policy guidelines. For example, in this case study the final quantified values of the social norms were highly positively correlated with their content values rather than their implementation values, implying that the final value of a social norm is mainly determined by its content. The implementation of the social norms had reached a relatively high degree compared to the theoretical maximum (71.29% to 80.25%) because of their compelling force, while the content values were so low (16.69% to 30.62%) that they urgently need to be improved. Third, the method can be extended to quantify the social norms of other ecosystems and thereby further contribute to our understanding of coupled human and natural systems and to sustainability research.

  16. Evaluation of a rapid, quantitative real-time PCR method for enumeration of pathogenic Candida cells in water

    USGS Publications Warehouse

    Brinkman, Nichole E.; Haugland, Richard A.; Wymer, Larry J.; Byappanahalli, Muruleedhara N.; Whitman, Richard L.; Vesper, Stephen J.

    2003-01-01

    Quantitative PCR (QPCR) technology, incorporating fluorigenic 5' nuclease (TaqMan) chemistry, was utilized for the specific detection and quantification of six pathogenic species of Candida (C. albicans, C. tropicalis, C. krusei, C. parapsilosis, C. glabrata and C. lusitaniae) in water. Known numbers of target cells were added to distilled and tap water samples, filtered, and disrupted directly on the membranes for recovery of DNA for QPCR analysis. The assay's sensitivities were between one and three cells per filter. The accuracy of the cell estimates was between 50 and 200% of their true value (95% confidence level). In similar tests with surface water samples, the presence of PCR inhibitory compounds necessitated further purification and/or dilution of the DNA extracts, with resultant reductions in sensitivity but generally not in quantitative accuracy. Analyses of a series of freshwater samples collected from a recreational beach showed positive correlations between the QPCR results and colony counts of the corresponding target species. Positive correlations were also seen between the cell quantities of the target Candida species detected in these analyses and colony counts of Enterococcus organisms. With a combined sample processing and analysis time of less than 4 h, this method shows great promise as a tool for rapidly assessing potential exposures to waterborne pathogenic Candida species from drinking and recreational waters and may have applications in the detection of fecal pollution.
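
    qPCR quantitation of this kind rests on a log-linear standard curve relating threshold cycle to target amount (slope near -3.32 at 100% amplification efficiency). The sketch below shows the standard-curve arithmetic with assumed calibration and sample Ct values; none of the numbers come from the study.

```python
import numpy as np

# Generic qPCR standard-curve quantitation sketch. Calibration Ct values and the
# unknown's Ct are assumptions; the log-linear fit Ct = m*log10(N) + b is the
# standard relation (m is about -3.32 at 100% amplification efficiency).
std_cells = np.array([1e1, 1e2, 1e3, 1e4, 1e5])       # target cells per filter
std_ct = np.array([33.1, 29.8, 26.4, 23.1, 19.8])     # measured threshold cycles (assumed)

m, b = np.polyfit(np.log10(std_cells), std_ct, 1)
efficiency = 10 ** (-1.0 / m) - 1.0                   # amplification efficiency

ct_unknown = 27.6                                     # Ct of an unknown sample (assumed)
cells = 10 ** ((ct_unknown - b) / m)
print(f"slope = {m:.2f}, efficiency = {100 * efficiency:.0f}%, estimate = {cells:.0f} cells/filter")
```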

  17. 4D Seismic Monitoring at the Ketzin Pilot Site during five years of storage - Results and Quantitative Assessment

    NASA Astrophysics Data System (ADS)

    Lüth, Stefan; Ivanova, Alexandra; Ivandic, Monika; Götz, Julia

    2015-04-01

    The Ketzin pilot site for geological CO2-storage has been operative between June 2008 and August 2013. In this period, 67 kt of CO2 have been injected (Martens et al., this conference). Repeated 3D seismic monitoring surveys were performed before and during CO2 injection. A third repeat survey, providing data from the post-injection phase, is currently being prepared for the autumn of 2015. The large scale 3D surface seismic measurements have been complemented by other geophysical and geochemical monitoring methods, among which are high-resolution seismic surface-downhole observations. These observations have been concentrating on the reservoir area in the vicinity of the injection well and provide high-resolution images as well as data for petrophysical quantification of the CO2 distribution in the reservoir. The Ketzin pilot site is a saline aquifer site in an onshore environment which poses specific challenges for a reliable monitoring of the injection CO2. Although much effort was done to ensure as much as possible identical acquisition conditions, a high degree of repeatability noise was observed, mainly due to varying weather conditions, and also variations in the acquisition geometries due to logistical reasons. Nevertheless, time-lapse processing succeeded in generating 3D time-lapse data sets which could be interpreted in terms of CO2 storage related amplitude variations in the depth range of the storage reservoir. The time-lapse seismic data, pulsed-neutron-gamma logging results (saturation), and petrophysical core measurements were interpreted together in order to estimate the amount of injected carbon dioxide imaged by the seismic repeat data. For the first repeat survey, the mass estimation was summed up to 20.5 ktons, which is approximately 7% less than what had been injected then. For the second repeat survey, the mass estimation was summed up to approximately 10-15% less than what had been injected. The deviations may be explained by several factors of uncertainty, and by partial dissolution of the injected CO2, thus reducing the amount of free gas, which can be detected by seismic time-lapse observations. These quantitative assessment studies have shown that conformity between injected and estimated CO2 quantities can only be achieved with some degree of uncertainty which needs to be quantified for a realistic assessment of conformity studies.

  18. mcrA-Targeted Real-Time Quantitative PCR Method To Examine Methanogen Communities

    PubMed Central

    Steinberg, Lisa M.; Regan, John M.

    2009-01-01

    Methanogens are of great importance in carbon cycling and alternative energy production, but quantitation with culture-based methods is time-consuming and biased against methanogen groups that are difficult to cultivate in a laboratory. For these reasons, methanogens are typically studied through culture-independent molecular techniques. We developed a SYBR green I quantitative PCR (qPCR) assay to quantify total numbers of methyl coenzyme M reductase α-subunit (mcrA) genes. TaqMan probes were also designed to target nine different phylogenetic groups of methanogens in qPCR assays. Total mcrA and mcrA levels of different methanogen phylogenetic groups were determined from six samples: four samples from anaerobic digesters used to treat either primarily cow or pig manure and two aliquots from an acidic peat sample stored at 4°C or 20°C. Only members of the Methanosaetaceae, Methanosarcina, Methanobacteriaceae, and Methanocorpusculaceae and Fen cluster were detected in the environmental samples. The three samples obtained from cow manure digesters were dominated by members of the genus Methanosarcina, whereas the sample from the pig manure digester contained detectable levels of only members of the Methanobacteriaceae. The acidic peat samples were dominated by both Methanosarcina spp. and members of the Fen cluster. In two of the manure digester samples only one methanogen group was detected, but in both of the acidic peat samples and two of the manure digester samples, multiple methanogen groups were detected. The TaqMan qPCR assays were successfully able to determine the environmental abundance of different phylogenetic groups of methanogens, including several groups with few or no cultivated members. PMID:19447957

  19. Cloned plasmid DNA fragments as calibrators for controlling GMOs: different real-time duplex quantitative PCR methods.

    PubMed

    Taverniers, Isabel; Van Bockstaele, Erik; De Loose, Marc

    2004-03-01

    Analytical real-time PCR technology is a powerful tool for implementation of the GMO labeling regulations enforced in the EU. The quality of analytical measurement data obtained by quantitative real-time PCR depends on the correct use of calibrator and reference materials (RMs). For GMO methods of analysis, the choice of appropriate RMs is currently under debate. So far, genomic DNA solutions from certified reference materials (CRMs) are most often used as calibrators for GMO quantification by means of real-time PCR. However, due to some intrinsic features of these CRMs, errors may be expected in the estimations of DNA sequence quantities. In this paper, two new real-time PCR methods are presented for Roundup Ready soybean, in which two types of plasmid DNA fragments are used as calibrators. Single-target plasmids (STPs) diluted in a background of genomic DNA were used in the first method. Multiple-target plasmids (MTPs) containing both sequences in one molecule were used as calibrators for the second method. Both methods simultaneously detect a promoter 35S sequence as GMO-specific target and a lectin gene sequence as endogenous reference target in a duplex PCR. For the estimation of relative GMO percentages both "delta C(T)" and "standard curve" approaches are tested. Delta C(T) methods are based on direct comparison of measured C(T) values of both the GMO-specific target and the endogenous target. Standard curve methods measure absolute amounts of target copies or haploid genome equivalents. A duplex delta C(T) method with STP calibrators performed at least as well as a similar method with genomic DNA calibrators from commercial CRMs. Besides this, high quality results were obtained with a standard curve method using MTP calibrators. This paper demonstrates that plasmid DNA molecules containing either one or multiple target sequences form perfect alternative calibrators for GMO quantification and are especially suitable for duplex PCR reactions. PMID:14689155
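
    The "delta C(T)" idea can be summarized in a few lines: the sample's GMO-specific minus endogenous Ct difference is compared with the same difference measured on a calibrator of known GMO content, assuming roughly equal amplification efficiencies for both targets. All Ct values and the calibrator percentage below are assumptions for illustration.

```python
# Sketch of a delta-Ct estimate of relative GMO content, assuming both duplex
# targets amplify with roughly 100% efficiency. All Ct values and the calibrator's
# certified GMO percentage are assumptions for illustration.
def gmo_percent(ct_gmo, ct_ref, ct_gmo_cal, ct_ref_cal, cal_percent):
    d_ct_sample = ct_gmo - ct_ref        # 35S promoter vs. lectin, unknown sample
    d_ct_cal = ct_gmo_cal - ct_ref_cal   # same pair measured on the calibrator
    return cal_percent * 2.0 ** (-(d_ct_sample - d_ct_cal))

# Calibrator certified at 1% Roundup Ready soybean (assumed), unknown sample:
print(round(gmo_percent(ct_gmo=29.9, ct_ref=24.6,
                        ct_gmo_cal=31.2, ct_ref_cal=24.5, cal_percent=1.0), 2))
```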

  20. Initial Results of an MDO Method Evaluation Study

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley MDO method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. In the first phase of the study, three MDO methods were implemented in the SIGHT framework and used to solve a set of ten relatively simple problems. In this paper, we comment on the general considerations for conducting method evaluation studies and report some initial results obtained to date. In particular, although the results are not conclusive because of the small initial test set, other formulations, optimality conditions, and sensitivity of solutions to various perturbations. Optimization algorithms are used to solve a particular MDO formulation. It is then appropriate to speak of local convergence rates and of global convergence properties of an optimization algorithm applied to a specific formulation. An analogous distinction exists in the field of partial differential equations. On the one hand, equations are analyzed in terms of regularity, well-posedness, and the existence and uniqueness of solutions. On the other, one considers numerous algorithms for solving differential equations. The area of MDO methods studies MDO formulations combined with optimization algorithms, although at times the distinction is blurred. It is important to

  1. Comparison of concentration methods for rapid detection of hookworm ova in wastewater matrices using quantitative PCR.

    PubMed

    Gyawali, P; Ahmed, W; Jagals, P; Sidhu, J P S; Toze, S

    2015-12-01

    Hookworm infection accounts for around 700 million infections worldwide, especially in developing nations, due in part to the increased use of wastewater for crop production. The effective recovery of hookworm ova from wastewater matrices is difficult due to their low concentrations and heterogeneous distribution. In this study, we compared the recovery rates of (i) four rapid hookworm ova concentration methods from municipal wastewater, and (ii) two concentration methods from sludge samples. Ancylostoma caninum ova were used as a surrogate for human hookworm (Ancylostoma duodenale and Necator americanus). Known concentrations of A. caninum ova were seeded into wastewater (treated and raw) and sludge samples collected from two wastewater treatment plants (WWTPs) in Brisbane and Perth, Australia. The A. caninum ova were concentrated from treated and raw wastewater samples using centrifugation (Method A), hollow fiber ultrafiltration (HFUF) (Method B), filtration (Method C) and flotation (Method D). For sludge samples, flotation (Method E) and direct DNA extraction (Method F) methods were used. Among the four methods tested, the filtration method (Method C) was able to recover higher concentrations of A. caninum ova consistently from treated wastewater (39-50%) and raw wastewater (7.1-12%) samples collected from both WWTPs. The remaining methods (Methods A, B and D) yielded variable recovery rates ranging from 0.2 to 40% for treated and raw wastewater samples. The recovery rates for sludge samples were poor (0.02-4.7%), although Method F (direct DNA extraction) provided a 1-2 orders of magnitude higher recovery rate than Method E (flotation). Based on our results it can be concluded that the recovery rates of hookworm ova from wastewater matrices, especially sludge samples, can be poor and highly variable. Therefore, the choice of concentration method is vital for the sensitive detection of hookworm ova in wastewater matrices. PMID:26358269

  2. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.

  3. Quantitative (1)H NMR method for hydrolytic kinetic investigation of salvianolic acid B.

    PubMed

    Pan, Jianyang; Gong, Xingchu; Qu, Haibin

    2013-11-01

    This work presents an exploratory study monitoring the hydrolytic process of salvianolic acid B (Sal B) under low-oxygen conditions using a simple quantitative (1)H NMR (Q-NMR) method. The quantity of each compound was calculated from the ratio of the integral value of its target peak to that of the known amount of the internal standard trimethylsilyl propionic acid (TSP). Kinetic runs were carried out at different initial concentrations of Sal B (5.00, 10.0 and 20.0 mg/mL) and temperatures of 70, 80 and 90 °C, and the effect of these two factors on the transformation of Sal B was investigated. The hydrolysis followed pseudo-first-order kinetics, and the apparent degradation rate constant at 80 °C decreased as the concentration of Sal B increased. Under the given conditions, the rate constant of overall hydrolysis as a function of temperature obeyed the Arrhenius equation. Six degradation products were identified by NMR and mass spectrometric analysis. Four of these degradation products, i.e. danshensu (DSS), protocatechuic aldehyde (PRO), salvianolic acid D (Sal D) and lithospermic acid (LA), were further identified by comparing their retention times with those of standard compounds. The advantage of this Q-NMR method was that no reference compounds were required for calibration curves; quantification could be performed directly on the hydrolyzed samples. The method proved to be simple, convenient and accurate for the hydrolytic kinetic study of Sal B. PMID:23867115
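
    The kinetic treatment described here, pseudo-first-order fits at each temperature followed by an Arrhenius fit of the rate constants, can be sketched as below. The concentration-time data are synthetic, generated from assumed rate constants, so the resulting activation energy is illustrative only.

```python
import numpy as np

# Sketch of the kinetic treatment described above: pseudo-first-order fits at each
# temperature followed by an Arrhenius fit. The concentration-time data are
# synthetic (generated with assumed rate constants), not the paper's measurements.
R = 8.314                                    # J mol^-1 K^-1
t_h = np.array([0.0, 1.0, 2.0, 4.0, 8.0])    # sampling times, h

def k_pseudo_first_order(conc):
    """Slope of -ln(C/C0) vs t gives the apparent first-order rate constant (h^-1)."""
    slope, _ = np.polyfit(t_h, -np.log(conc / conc[0]), 1)
    return slope

synthetic = {                                # assumed C(t) data at three temperatures
    343.15: 10.0 * np.exp(-0.05 * t_h),      # 70 °C
    353.15: 10.0 * np.exp(-0.12 * t_h),      # 80 °C
    363.15: 10.0 * np.exp(-0.27 * t_h),      # 90 °C
}
temps = np.array(sorted(synthetic))
ks = np.array([k_pseudo_first_order(synthetic[T]) for T in temps])

slope, intercept = np.polyfit(1.0 / temps, np.log(ks), 1)   # ln k = ln A - Ea/(R T)
print(f"Ea = {-slope * R / 1000:.1f} kJ/mol, A = {np.exp(intercept):.2e} h^-1")
```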

  4. 3D reconstruction and quantitative assessment method of mitral eccentric regurgitation from color Doppler echocardiography

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Ge, Yi Nan; Wang, Tian Fu; Zheng, Chang Qiong; Zheng, Yi

    2005-10-01

    Based on two-dimensional color Doppler images, a multiplane transesophageal rotational scanning method is used in this article to acquire original Doppler echocardiographic data while the echocardiogram is recorded synchronously. After filtering and interpolation, surface rendering and volume rendering are performed. By analyzing the color-bar information and the superposition principle of the color Doppler flow image, the grayscale mitral anatomical structure and the color-coded regurgitation velocity parameter were separated from the color Doppler flow images; three-dimensional reconstruction of the mitral structure and the regurgitation velocity distribution was implemented separately, and fusion visualization of the reconstructed regurgitation velocity distribution with its corresponding 3D mitral anatomical structures was realized. This can be used to observe the position, phase and direction of the mitral regurgitation and to measure the jet length, area, volume, spatial distribution and severity level of the regurgitation. In addition, in patients with eccentric mitral regurgitation, this new modality overcomes the inherent limitations of the two-dimensional color Doppler flow image by depicting the full extent of the jet trajectory; the area of eccentric regurgitation on the three-dimensional image was much larger than that on the two-dimensional image, and the variation tendencies of the regurgitation area and volume are shown graphically at different angles and different systolic phases. The study shows that three-dimensional color Doppler provides quantitative measurements of eccentric mitral regurgitation that are more accurate and reproducible than conventional color Doppler.

  5. Multivariate least-squares methods applied to the quantitative spectral analysis of multicomponent samples

    SciTech Connect

    Haaland, D.M.; Easterling, R.G.; Vopicka, D.A.

    1985-01-01

    In an extension of earlier work, weighted multivariate least-squares methods of quantitative FT-IR analysis have been developed. A linear least-squares approximation to nonlinearities in the Beer-Lambert law is made by allowing the reference spectra to be a set of known mixtures. The incorporation of nonzero intercepts in the relation between absorbance and concentration further improves the approximation of nonlinearities while simultaneously accounting for nonzero spectral baselines. Pathlength variations are also accommodated in the analysis, and under certain conditions, unknown sample pathlengths can be determined. All spectral data are used to improve the precision and accuracy of the estimated concentrations. During the calibration phase of the analysis, pure component spectra are estimated from the standard mixture spectra. These can be compared with the measured pure component spectra to determine which vibrations experience nonlinear behavior. In the predictive phase of the analysis, the calculated spectra are used in our previous least-squares analysis to estimate sample component concentrations. These methods were applied to the analysis of the IR spectra of binary mixtures of esters. Even with severely overlapping spectral bands and nonlinearities in the Beer-Lambert law, the average relative error in the estimated concentrations was <1%.
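
    As a rough illustration of the calibration idea summarised above, a classical least-squares model with per-frequency intercepts can be written in a few lines of linear algebra; the data here are synthetic and the function names are assumptions for illustration, not the authors' implementation:

    ```python
    import numpy as np

    def cls_calibrate(A_cal, C_cal):
        """Estimate pure-component spectra K and baseline intercepts b from
        mixture spectra A_cal (n_samples x n_freq) and known concentrations
        C_cal (n_samples x n_comp), solving A = C@K + 1@b by least squares."""
        X = np.hstack([C_cal, np.ones((C_cal.shape[0], 1))])
        coeffs, *_ = np.linalg.lstsq(X, A_cal, rcond=None)
        return coeffs[:-1], coeffs[-1]          # K (n_comp x n_freq), b (n_freq,)

    def cls_predict(A_unknown, K, b):
        """Estimate component concentrations of an unknown spectrum."""
        conc, *_ = np.linalg.lstsq(K.T, A_unknown - b, rcond=None)
        return conc

    # Synthetic two-component example
    rng = np.random.default_rng(0)
    K_true = np.abs(rng.normal(size=(2, 200)))              # "pure" spectra
    C_cal = rng.uniform(0.1, 1.0, size=(6, 2))               # calibration mixtures
    A_cal = C_cal @ K_true + 0.05 + rng.normal(scale=1e-3, size=(6, 200))
    K, b = cls_calibrate(A_cal, C_cal)
    print(cls_predict(C_cal[0] @ K_true + 0.05, K, b))       # approx. C_cal[0]
    ```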

  6. A Novel Method for Relative Quantitation of N-Glycans by Isotopic Labeling Using 18O-Water

    PubMed Central

    Tao, Shujuan; Orlando, Ron

    2014-01-01

    Quantitation is an essential aspect of comprehensive glycomics study. Here, a novel isotopic-labeling method is described for N-glycan quantitation using 18O-water. The incorporation of the 18O label into the reducing end of N-glycans is simply and efficiently achieved during peptide-N4-(N-acetyl-β-glucosaminyl)asparagine amidase F release. This process provides a 2-Da mass difference compared with the N-glycans released in 16O-water. A mathematical calculation method was also developed to determine the 18O/16O ratios from isotopic peaks. Application of this method to several standard glycoprotein mixtures and human serum demonstrated that it can facilitate the relative quantitation of N-glycans over a linear dynamic range of two orders of magnitude, with high accuracy and reproducibility. PMID:25365792
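
    One way to picture the ratio calculation referred to above: the 18O-labelled glycan appears 2 Da above the 16O form, so its peak overlaps the natural M+2 isotope of the light form, which has to be subtracted before taking the ratio. The sketch below is a simplified version of that bookkeeping with an assumed natural M+2 fraction; it is not the authors' published algorithm:

    ```python
    def ratio_18O_16O(i_m, i_m2, light_m2_fraction):
        """Relative 18O/16O abundance from the monoisotopic intensity of the
        16O glycan (i_m) and the intensity observed 2 Da higher (i_m2).

        light_m2_fraction: expected natural M+2 intensity of the unlabelled
        glycan relative to its monoisotopic peak (e.g. from an isotope model).
        """
        heavy = i_m2 - light_m2_fraction * i_m   # remove natural-isotope overlap
        if heavy < 0:
            heavy = 0.0
        return heavy / i_m

    # Illustrative numbers only
    print(ratio_18O_16O(i_m=1.0e6, i_m2=1.3e6, light_m2_fraction=0.18))
    ```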

  7. Comparison of the scanning linear estimator (SLE) and ROI methods for quantitative SPECT imaging

    NASA Astrophysics Data System (ADS)

    Könik, Arda; Kupinski, Meredith; Hendrik Pretorius, P.; King, Michael A.; Barrett, Harrison H.

    2015-08-01

    In quantitative emission tomography, tumor activity is typically estimated from calculations on a region of interest (ROI) identified in the reconstructed slices. In these calculations, unpredictable bias arising from the null functions of the imaging system affects ROI estimates. The magnitude of this bias depends upon the tumor size and location. In prior work it has been shown that the scanning linear estimator (SLE), which operates on the raw projection data, is an unbiased estimator of activity when the size and location of the tumor are known. In this work, we performed analytic simulation of SPECT imaging with a parallel-hole medium-energy collimator. Distance-dependent system spatial resolution and non-uniform attenuation were included in the imaging simulation. We compared the task of activity estimation by the ROI and SLE methods for a range of tumor sizes (diameter: 1-3 cm) and activities (contrast ratio: 1-10) added to uniform and non-uniform liver backgrounds. Using the correct value for the tumor shape and location is an idealized approximation to how task estimation would occur clinically. Thus we determined how perturbing this idealized prior knowledge impacted the performance of both techniques. To implement the SLE for the non-uniform background, we used a novel iterative algorithm for pre-whitening stationary noise within a compact region. Estimation task performance was compared using the ensemble mean-squared error (EMSE) as the criterion. The SLE method performed substantially better than the ROI method (i.e. EMSE(SLE) was 23-174 times lower) when the background is uniform and tumor location and size are known accurately. The variance of the SLE increased when a non-uniform liver texture was introduced but the EMSE(SLE) continued to be 5-20 times lower than the ROI method. In summary, SLE outperformed ROI under almost all conditions that we tested.
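
    The ensemble mean-squared error used as the figure of merit above is simply the average squared difference between estimated and true activity over repeated noisy trials. A toy comparison of two hypothetical estimators on that criterion (synthetic numbers, not the paper's SPECT simulation):

    ```python
    import numpy as np

    def emse(estimates, truth):
        """Ensemble mean-squared error of an estimator over repeated noisy trials."""
        return float(np.mean((np.asarray(estimates) - truth) ** 2))

    rng = np.random.default_rng(1)
    true_activity = 10.0
    n_trials = 5000
    # Toy estimators: one unbiased with larger variance, one biased with smaller variance
    est_a = true_activity + rng.normal(0.0, 1.0, n_trials)
    est_b = 0.8 * true_activity + rng.normal(0.0, 0.5, n_trials)
    print("EMSE A:", emse(est_a, true_activity), "EMSE B:", emse(est_b, true_activity))
    ```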

  8. Results of a comparison of methods for calculating thermal radiation

    NASA Astrophysics Data System (ADS)

    Ponomareva, T. Ia.; Skrotskaia, O. P.

    1991-03-01

    This paper compares several methods for estimating thermal radiation fluxes using data for standard atmosphere models of the sub-Arctic, temperate-zone, and tropical latitudes. It is shown that the results depend on the degree of accuracy in approximating the atmospheric gas transmission function at long waves. Also considered are the effects of pressure and temperature on the parameters of gas absorption.

  9. Development of a quantitative method for the characterization of hole quality during laser trepan drilling of high-temperature alloy

    NASA Astrophysics Data System (ADS)

    Zhang, Hongyu; Zhou, Ming; Wang, Yunlong; Zhang, Xiangchao; Yan, Yu; Wang, Rong

    2016-02-01

    Short-pulsed lasers are of significant industrial relevance in laser drilling, with an acceptable compromise between accuracy and efficiency. However, intensive research on the qualitative and quantitative characterization of hole quality has rarely been reported. In the present study, a series of through holes were fabricated on a high-temperature alloy workpiece with a thickness of 3 mm using a LASERTEC 80 PowerDrill manufacturing system, which incorporated a Nd:YAG millisecond laser into a five-axis positioning platform. The quality of the holes manufactured under different laser powers (80-140 W) and beam expanding ratios (1-6) was characterized by a scanning electron microscope associated with an energy-dispersive X-ray analysis, focusing mainly on the formation of micro-cracks and the recast layer. Additionally, the conicity and circularity of the holes were quantitatively evaluated by the apparent radius, root-mean-square deviation, and maximum deviation, which were calculated based on the extraction of the hole edge through programming with MATLAB. The results showed that melted and spattered material was present at the entrance and exit ends of the holes, and micro-cracks and a recast layer (average thickness 15-30 µm) were detected along the side wall of the holes. The elemental composition of the melted and spattered material and the recast layer was similar, with an obvious increase in the contents of O, Nb, and Cr and a great reduction in the contents of Fe and Ni in comparison with the bulk material. Furthermore, the conicity and circularity evaluation of the holes indicated that a laser power of 100 W and a beam expanding ratio of 4 or 5 represented the typical optimal drilling parameters in this specific experimental situation. It is anticipated that the quantitative method developed in the present study can be applied for the evaluation of hole quality in laser drilling and other drilling conditions.
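
    The circularity figures of merit named above (apparent radius, root-mean-square deviation, maximum deviation) can be computed from extracted edge coordinates roughly as follows. This is a generic sketch of those definitions, with a centroid-based centre as an assumption, not the authors' MATLAB program:

    ```python
    import numpy as np

    def circularity_metrics(edge_xy):
        """Apparent radius, RMS deviation and maximum deviation of a hole edge.

        edge_xy: (n, 2) array of edge-pixel coordinates extracted from the image.
        """
        edge_xy = np.asarray(edge_xy, dtype=float)
        center = edge_xy.mean(axis=0)                    # centroid taken as circle centre
        r = np.linalg.norm(edge_xy - center, axis=1)     # radial distance of each edge point
        r_apparent = r.mean()
        rms_dev = np.sqrt(np.mean((r - r_apparent) ** 2))
        max_dev = np.max(np.abs(r - r_apparent))
        return r_apparent, rms_dev, max_dev

    def conicity_deg(r_entrance, r_exit, thickness):
        """Taper half-angle of the hole from entrance/exit radii and sheet thickness."""
        return np.degrees(np.arctan((r_entrance - r_exit) / thickness))

    theta = np.linspace(0, 2 * np.pi, 360)
    edge = np.c_[100 + 50 * np.cos(theta), 80 + 52 * np.sin(theta)]  # slightly elliptical test edge
    print(circularity_metrics(edge), conicity_deg(0.25, 0.20, 3.0))
    ```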

  10. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...

  11. Design and validation of a novel quantitative method for rapid bacterial enumeration using programmed stage movement scanning electron microscopy.

    PubMed

    Sanders, David L; Bond, Peter; Moate, Roy; Steer, Jane A

    2012-12-01

    The adhesion of bacteria to surgical implants is the first stage of implant infection. The method for detecting bound bacteria is an important consideration in the study of bacterial adherence and colonisation. Enumeration of bacteria by direct visualisation techniques is labour intensive and time consuming. We have developed and validated a method for enumerating bacteria on porous material surfaces using programmed stage movement scanning electron microscopy and compared cumulative counts after 1-10 stage movements with absolute bacterial counts. We describe this method with three commercially sourced meshes used for abdominal wall hernia repair and with three different inocula of Staphylococcus epidermidis. The results demonstrate significant correlation to the absolute count after five cumulative counts for all meshes analysed. The mean time saved by the cumulative counting method was 1 h and 9 min per mesh. We conclude that advances in scanning electron microscopy and the advent of precise automated stage control have facilitated rapid data acquisition for bacterial counting purposes and that five cumulative counts at 1000× or 2500× magnification are a valid quantitative method for enumerating S. epidermidis bacteria on porous surfaces (with a pore size of up to 1.3 mm). PMID:23041496

  12. Quantitative schlieren method for studying the wavefront reconstructed from a hologram

    SciTech Connect

    Lyalikov, A.M.

    1995-03-01

    A schlieren method is proposed for visualizing the deflection angles of the light beams reconstructed from a phase object hologram. The method is based on employing a stationary visualizing slit and selecting the image of a slit light source by a movable slit. This light source comprises several equidistant slit sources. Compensation for the aberrations of the hologram-recording system is considered. Experimental results of the evaluation tests showing the performance of the method developed are presented. 15 refs., 4 figs.

  13. Determination of Calcium in Cereal with Flame Atomic Absorption Spectroscopy: An Experiment for a Quantitative Methods of Analysis Course

    ERIC Educational Resources Information Center

    Bazzi, Ali; Kreuz, Bette; Fischer, Jeffrey

    2004-01-01

    An experiment for determination of calcium in cereal using a two-increment standard addition method in conjunction with flame atomic absorption spectroscopy (FAAS) is demonstrated. The experiment is intended to introduce students to the principles of atomic absorption spectroscopy, giving them hands-on experience using quantitative methods of…
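
    Conceptually, the standard-addition calculation extrapolates the absorbance-versus-added-concentration line back to zero absorbance. The sketch below shows that arithmetic for a two-increment experiment with made-up absorbances (not values from this record), assuming all solutions are made up to the same final volume:

    ```python
    import numpy as np

    def standard_addition(added_conc, absorbance):
        """Unknown concentration from a standard-addition series.

        Fits A = m*c_added + b and extrapolates to A = 0: c_unknown = b / m
        (assumes all solutions are diluted to the same final volume).
        """
        m, b = np.polyfit(added_conc, absorbance, 1)
        return b / m

    # Illustrative data: 0, 2 and 4 ug/mL Ca added to equal aliquots of sample
    added = np.array([0.0, 2.0, 4.0])
    abs_read = np.array([0.120, 0.205, 0.292])
    print("Ca in diluted sample ~", round(standard_addition(added, abs_read), 2), "ug/mL")
    ```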

  14. Monochloramine disinfection kinetics of Nitrosomonas europaea by propidium monoazide quantitative PCR and Live/Dead BacLight Methods

    EPA Science Inventory

    Monochloramine disinfection kinetics were determined for the pure culture ammonia-oxidizing bacterium Nitrosomonas europaea (ATCC 19718) by two culture independent methods: (1) LIVE/DEAD® BacLight™ (LD) and (2) propidium monoazide quantitative PCR (PMA-qPCR). Both methods were f...

  15. Monochloramine disinfection kinetics of Nitrosomonas europaea by propidium monoazide quantitative PCR and Live/Dead BacLight Methods

    EPA Science Inventory

    Monochloramine disinfection kinetics were determined for the pure culture ammonia-oxidizing bacterium Nitrosomonas europaea (ATCC 19718) by two culture independent methods: (1) LIVE/DEAD BacLight (LD) and (2) propidium monoazide quantitative PCR (PMA-qPCR). Both methods were f...

  16. Understanding Variation in Treatment Effects in Education Impact Evaluations: An Overview of Quantitative Methods. NCEE 2014-4017

    ERIC Educational Resources Information Center

    Schochet, Peter Z.; Puma, Mike; Deke, John

    2014-01-01

    This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed

  17. Determination of Calcium in Cereal with Flame Atomic Absorption Spectroscopy: An Experiment for a Quantitative Methods of Analysis Course

    ERIC Educational Resources Information Center

    Bazzi, Ali; Kreuz, Bette; Fischer, Jeffrey

    2004-01-01

    An experiment for determination of calcium in cereal using a two-increment standard addition method in conjunction with flame atomic absorption spectroscopy (FAAS) is demonstrated. The experiment is intended to introduce students to the principles of atomic absorption spectroscopy, giving them hands-on experience using quantitative methods of

  18. A Quantitative Study of a Software Tool that Supports a Part-Complete Solution Method on Learning Outcomes

    ERIC Educational Resources Information Center

    Garner, Stuart

    2009-01-01

    This paper reports on the findings from a quantitative research study into the use of a software tool that was built to support a part-complete solution method (PCSM) for the learning of computer programming. The use of part-complete solutions to programming problems is one of the methods that can be used to reduce the cognitive load that students

  19. Research and Evaluation in Education and Psychology: Integrating Diversity with Quantitative, Qualitative, and Mixed Methods. Second Edition

    ERIC Educational Resources Information Center

    Mertens, Donna M.

    2004-01-01

    In this new edition, the author explains quantitative, qualitative, and mixed methods, and incorporates the viewpoints of various research paradigms (postpositivist, constructivist, transformative, and pragmatic) into descriptions of these methods. Special emphasis is provided for conducting research in culturally complex communities. Each chapter

  20. Electron Paramagnetic Resonance Oximetry as a Quantitative Method to Measure Cellular Respiration: A Consideration of Oxygen Diffusion Interference

    PubMed Central

    Presley, Tennille; Kuppusamy, Periannan; Zweier, Jay L.; Ilangovan, Govindasamy

    2006-01-01

    Electron paramagnetic resonance (EPR) oximetry is being widely used to measure the oxygen consumption of cells, mitochondria, and submitochondrial particles. However, further improvement of this technique, in terms of data analysis, is required to use it as a quantitative tool. Here, we present a new approach for quantitative analysis of cellular respiration using EPR oximetry. The course of oxygen consumption by cells in suspension has been observed to have three distinct zones: pO2-independent respiration at higher pO2 ranges, pO2-dependent respiration at low pO2 ranges, and a static equilibrium with no change in pO2 at very low pO2 values. The approach here enables one to comprehensively analyze all three zones together, where the progression of O2 diffusion zones around each cell, their overlap over time, and their potential impact on the measured pO2 data are considered. The obtained results agree with previously established methods such as high-resolution respirometry measurements. Additionally, it is demonstrated how the diffusion limitations can depend on cell density and consumption rate. In conclusion, the new approach establishes a more accurate and meaningful model for evaluating EPR oximetry data on cellular respiration and quantifying the related parameters. PMID:17012319

  1. Novel Method for Automated Analysis of Retinal Images: Results in Subjects with Hypertensive Retinopathy and CADASIL

    PubMed Central

    Cavallari, Michele; Stamile, Claudio; Umeton, Renato; Calimeri, Francesco; Orzi, Francesco

    2015-01-01

    Morphological analysis of the retinal vessels by fundoscopy provides a noninvasive means for detecting and staging systemic microvascular damage. However, full exploitation of fundoscopy in clinical settings is limited by the paucity of quantitative, objective information obtainable through the observer-driven evaluations currently employed in routine practice. Here, we report on the development of a semiautomated, computer-based method to assess retinal vessel morphology. The method allows simultaneous and operator-independent quantitative assessment of arteriole-to-venule ratio, tortuosity index, and mean fractal dimension. The method was implemented in two conditions known for being associated with retinal vessel changes: hypertensive retinopathy and Cerebral Autosomal Dominant Arteriopathy with Subcortical Infarcts and Leukoencephalopathy (CADASIL). The results showed that our approach is effective in detecting and quantifying the retinal vessel abnormalities. Arteriole-to-venule ratio, tortuosity index, and mean fractal dimension were altered in the subjects with hypertensive retinopathy or CADASIL with respect to age- and gender-matched controls. The interrater reliability was excellent for all the three indices (intraclass correlation coefficient ≥ 85%). The method represents a simple and highly reproducible means for discriminating pathological conditions characterized by morphological changes of retinal vessels. The advantages of our method include simultaneous and operator-independent assessment of different parameters and improved reliability of the measurements. PMID:26167496
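
    Two of the indices mentioned above have simple operational definitions that a short sketch can make concrete: the arteriole-to-venule ratio as the quotient of mean vessel calibres, and a tortuosity index as arc length over chord length of a vessel centreline. These generic definitions are assumptions for illustration and may differ from the authors' exact formulas:

    ```python
    import numpy as np

    def arteriole_to_venule_ratio(arteriole_widths, venule_widths):
        """AVR as the ratio of mean arteriolar to mean venular calibre."""
        return np.mean(arteriole_widths) / np.mean(venule_widths)

    def tortuosity_index(centerline_xy):
        """Arc length of a vessel centreline divided by its straight-line chord."""
        pts = np.asarray(centerline_xy, dtype=float)
        arc = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
        chord = np.linalg.norm(pts[-1] - pts[0])
        return arc / chord

    x = np.linspace(0, 100, 200)
    wiggly = np.c_[x, 3 * np.sin(x / 5)]                 # synthetic vessel path
    print(arteriole_to_venule_ratio([12, 13, 11], [18, 19, 17]), tortuosity_index(wiggly))
    ```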

  2. Novel Method for Automated Analysis of Retinal Images: Results in Subjects with Hypertensive Retinopathy and CADASIL.

    PubMed

    Cavallari, Michele; Stamile, Claudio; Umeton, Renato; Calimeri, Francesco; Orzi, Francesco

    2015-01-01

    Morphological analysis of the retinal vessels by fundoscopy provides a noninvasive means for detecting and staging systemic microvascular damage. However, full exploitation of fundoscopy in clinical settings is limited by the paucity of quantitative, objective information obtainable through the observer-driven evaluations currently employed in routine practice. Here, we report on the development of a semiautomated, computer-based method to assess retinal vessel morphology. The method allows simultaneous and operator-independent quantitative assessment of arteriole-to-venule ratio, tortuosity index, and mean fractal dimension. The method was implemented in two conditions known for being associated with retinal vessel changes: hypertensive retinopathy and Cerebral Autosomal Dominant Arteriopathy with Subcortical Infarcts and Leukoencephalopathy (CADASIL). The results showed that our approach is effective in detecting and quantifying the retinal vessel abnormalities. Arteriole-to-venule ratio, tortuosity index, and mean fractal dimension were altered in the subjects with hypertensive retinopathy or CADASIL with respect to age- and gender-matched controls. The interrater reliability was excellent for all the three indices (intraclass correlation coefficient ≥ 85%). The method represents a simple and highly reproducible means for discriminating pathological conditions characterized by morphological changes of retinal vessels. The advantages of our method include simultaneous and operator-independent assessment of different parameters and improved reliability of the measurements. PMID:26167496

  3. A new quantitative screening method for removable prosthesis using pressure-indicating paste.

    PubMed

    Sanagawa, T; Hara, T; Minagi, S

    2014-10-01

    The aim of this study was to quantitatively evaluate the adaptation of the denture base to the mucosa using a non-setting pressure-indicating paste and to examine the relationship between quality of fit and the need for denture relining. A total of 123 dentures from 70 partially edentulous patients were studied. Examination paste extruded from the tip of an 18-G needle was applied to those denture surfaces contacting the alveolar crest. The denture was manually positioned with all clasps engaged on abutment teeth, and adaptation was assessed through the paste distribution. Multiple logistic regression was used to analyse variables associated with diagnosing the need for a denture reline, producing odds ratios and 95% confidence intervals. The spread width was inversely proportional to the gap between the denture and mucosa. Regression analysis revealed statistically significant associations between the need for a denture reline and both the paste spread width and the duration of denture use. According to ROC curve analysis of the 'reline' and 'non-reline' groups, the need for a denture reline was indicated at a paste spread width of 20 mm or less. At this 20-mm threshold, the sensitivity was 85.1% and the specificity was 75.0%. The fit of removable denture bases was quantitatively evaluated by measuring the spread width of non-setting pressure-indicating paste extruded onto denture fit surfaces. The results suggest that the paste spread width is a useful parameter for discriminating the need for a denture reline. PMID:24894573

  4. Multiple Frequency Contrast Source Inversion Method for Vertical Electromagnetic Profiling: 2D Simulation Results and Analyses

    NASA Astrophysics Data System (ADS)

    Li, Jinghe; Song, Linping; Liu, Qing Huo

    2015-06-01

    A simultaneous multiple frequency contrast source inversion (CSI) method is applied to reconstructing hydrocarbon reservoir targets in a complex multilayered medium in two dimensions. It simulates the effects of a salt dome sedimentary formation in the context of reservoir monitoring. In this method, the stabilized biconjugate-gradient fast Fourier transform (BCGS-FFT) algorithm is applied as a fast solver for the 2D volume integral equation for the forward computation. The inversion technique with CSI combines the efficient FFT algorithm to speed up the matrix-vector multiplication and the stable convergence of the simultaneous multiple frequency CSI in the iteration process. As a result, this method is capable of making quantitative conductivity image reconstruction effectively for large-scale electromagnetic oil exploration problems, including the vertical electromagnetic profiling (VEP) survey investigated here. A number of numerical examples have been demonstrated to validate the effectiveness and capacity of the simultaneous multiple frequency CSI method for a limited array view in VEP.

  5. Multiple Frequency Contrast Source Inversion Method for Vertical Electromagnetic Profiling: 2D Simulation Results and Analyses

    NASA Astrophysics Data System (ADS)

    Li, Jinghe; Song, Linping; Liu, Qing Huo

    2016-02-01

    A simultaneous multiple frequency contrast source inversion (CSI) method is applied to reconstructing hydrocarbon reservoir targets in a complex multilayered medium in two dimensions. It simulates the effects of a salt dome sedimentary formation in the context of reservoir monitoring. In this method, the stabilized biconjugate-gradient fast Fourier transform (BCGS-FFT) algorithm is applied as a fast solver for the 2D volume integral equation for the forward computation. The inversion technique with CSI combines the efficient FFT algorithm to speed up the matrix-vector multiplication and the stable convergence of the simultaneous multiple frequency CSI in the iteration process. As a result, this method is capable of making quantitative conductivity image reconstruction effectively for large-scale electromagnetic oil exploration problems, including the vertical electromagnetic profiling (VEP) survey investigated here. A number of numerical examples have been demonstrated to validate the effectiveness and capacity of the simultaneous multiple frequency CSI method for a limited array view in VEP.

  6. Peer Effects and Peer Group Processes: Joining the Conversation on Quantitative and Qualitative Methods.

    ERIC Educational Resources Information Center

    Nash, Roy

    2002-01-01

    Discusses quantitative and qualitative approaches in research on peer effects on student attainment, using two texts to argue that the definition of "effect" cannot be restricted to "statistical effect," and that institutional properties are not the sum of individual properties. Asserts that quantitative investigators have statistical effects that…

  7. Effect of platform, reference material, and quantification model on enumeration of Enterococcus by quantitative PCR methods

    EPA Science Inventory

    Quantitative polymerase chain reaction (qPCR) is increasingly being used for the quantitative detection of fecal indicator bacteria in beach water. QPCR allows for same-day health warnings, and its application is being considered as an option for recreational water quality testi...

  8. A method for quantitative analysis of clump thickness in cervical cytology slides.

    PubMed

    Fan, Yilun; Bradley, Andrew P

    2016-01-01

    Knowledge of the spatial distribution and thickness of cytology specimens is critical to the development of digital slide acquisition techniques that minimise both scan times and image file size. In this paper, we evaluate a novel method to achieve this goal utilising an exhaustive high-resolution scan, an over-complete wavelet transform across multi-focal planes and a clump segmentation of all cellular materials on the slide. The method is demonstrated with a quantitative analysis of ten normal, but difficult to scan, Pap-stained Thin-prep cervical cytology slides. We show that with this method the top and bottom of the specimen can be estimated to an accuracy of 1 µm in 88% and 97% of the fields of view respectively. Overall, cellular material can be over 30 µm thick and the distribution of cells is skewed towards the cover-slip (top of the slide). However, the median clump thickness is 10 µm and only 31% of clumps contain more than three nuclei. Therefore, by finding a focal map of the specimen the number of 1 µm spaced focal planes that are required to be scanned to acquire 95% of the in-focus material can be reduced from 25.4 to 21.4 on average. In addition, we show that by considering the thickness of the specimen, an improved focal map can be produced which further reduces the required number of 1 µm spaced focal planes to 18.6. This has the potential to reduce scan times and raw image data by over 25%. PMID:26477005

  9. A new quantitative method to evaluate the in vitro bioactivity of melt and sol-gel-derived silicate glasses.

    PubMed

    Arcos, D; Greenspan, D C; Vallet-Regí, M

    2003-06-01

    Two melt-derived glasses (45S5 and 60S) and four sol-gel glasses (58S, 68S, 77S, and 91S) have been synthesized. The activation energy for the silicon release was determined, and a very close correlation was observed between this value and published results of the bioactive behavior of the glasses. This relationship can be explained in terms of the influence of chemical composition, textural properties, and structural density on the silanol group formation and silicon dissolution. These measurements provide a quantitative method to evaluate the in vitro bioactivity of SiO(2)-based glasses. Preliminary studies suggest an activation energy gap (Ea) of 0.35-0.5 eV as a boundary between bioactive and nonbioactive glasses. PMID:12746881

  10. New method for the detection of micro-organisms in blood: application of quantitative interpretation model to aerobic blood cultures

    NASA Astrophysics Data System (ADS)

    Huffman, Debra E.; Serebrennikova, Yulia M.; Smith, Jennifer M.; Leparc, German F.; García-Rubio, Luis H.

    2009-05-01

    The physical and chemical changes occurring in blood that has been inoculated into a blood culture bottle can be used as a means to detect the presence of microorganisms in blood cultures. These changes include primarily the conversion of oxy- to deoxyhemoglobin within the red blood cells (RBCs) and changes in the cell number densities. These changes in the physical and chemical properties of blood can be readily detected using spectrophotometric methods, thus enabling the continuous monitoring of blood culture vials to provide quantitative information on the growth behavior of the microorganisms present. This paper reports on the application of spectrophotometric information obtained from diffuse reflectance measurements of aerobic blood cultures to detect microbial growth and compares the results to those obtained using the standard blood culture system.

  11. Semi-quantitative determination of the modes of occurrence of elements in coal: Results from an International Round Robin Project

    SciTech Connect

    Willett, J.C.; Finkelman, R.B.; Mroczkowski, S.J.; Palmer, C.A.; Kolker, A.

    1999-07-01

    Quantifying the modes of occurrence of elements in coal is necessary for the development of models to predict an element's behavior during in-ground leaching, weathering, coal cleaning, and combustion. Anticipating the behavior of the trace elements is necessary for evaluating the environmental and human health impacts, technological impacts, and economic byproduct potential of coal use. To achieve the goal of quantifying element modes of occurrence, an international round robin project was initiated. Four bituminous coal samples (from the United States, England, Australia and Canada) were distributed to participating laboratories (9 labs from 5 countries) for analysis. Preliminary results indicate that there is good agreement among six laboratories for the chemical analyses. Using selective leaching, quantitative electron microprobe analyses, and semi-quantitative X-ray diffraction, the authors found that many elements have similar modes of occurrence in all four samples. For example, at least 75% of the Al, K, and Li and about 50% of Be, Sc, V, and Cr are leached by HF. Because HF dissolves silicates, the authors infer that these elements are in the clays. As, Hg, Cu, Zn, Cd, and Pb are leached primarily by HCl and HNO3, indicating that they are associated with mono- (such as sphalerite and galena) and di-sulfides (pyrite). Leaching results indicate that small amounts of these metals may be associated with clays and organics. Iron behaves differently in each of the samples, likely due to different proportions of iron in sulfide, carbonate, and silicate phases. Results from the other laboratories (using selective leaching and density separations) appear to be consistent with these results.

  12. Spectral simulation methods for enhancing qualitative and quantitative analyses based on infrared spectroscopy and quantitative calibration methods for passive infrared remote sensing of volatile organic compounds

    NASA Astrophysics Data System (ADS)

    Sulub, Yusuf Ismail

    Infrared spectroscopy (IR) has over the years found a myriad of applications including passive environmental remote sensing of toxic pollutants and the development of a blood glucose sensor. In this dissertation, capabilities of both these applications are further enhanced with data analysis strategies employing digital signal processing and novel simulation approaches. Both quantitative and qualitative determinations of volatile organic compounds are investigated in the passive IR remote sensing research described in this dissertation. In the quantitative work, partial least-squares (PLS) regression analysis is used to generate multivariate calibration models for passive Fourier transform IR remote sensing measurements of open-air generated vapors of ethanol in the presence of methanol as an interfering species. A step-wise co-addition scheme coupled with a digital filtering approach is used to attenuate the effects of variation in optical path length or plume width. For the qualitative study, an IR imaging line scanner is used to acquire remote sensing data in both spatial and spectral domains. This technology is capable of not only identifying but also specifying the location of the sample under investigation. Successful implementation of this methodology is hampered by the huge costs incurred to conduct these experiments and the impracticality of acquiring large amounts of representative training data. To address this problem, a novel simulation approach is developed that generates training data based on synthetic analyte-active and measured analyte-inactive data. Subsequently, automated pattern classifiers are generated using piecewise linear discriminant analysis to predict the presence of the analyte signature in measured imaging data acquired in remote sensing applications. Near-infrared glucose determination based on the 5000-4000 cm(-1) region is the focus of the research in the latter part of this dissertation. A six-component aqueous matrix of glucose in the presence of five other interferent species, all spanning physiological levels, is analyzed quantitatively. Multivariate PLS regression analysis in conjunction with samples designated into a calibration set is used to formulate models for predicting glucose concentrations. Variations in the instrumental response caused by drift and environmental factors are observed to degrade the performance of these models. As a remedy, a model updating approach based on spectral simulation is developed that is highly successful in eliminating the adverse effects of non-chemical variations.

  13. Standardisation of data from real-time quantitative PCR methods: evaluation of outliers and comparison of calibration curves

    PubMed Central

    Burns, Malcolm J; Nixon, Gavin J; Foy, Carole A; Harris, Neil

    2005-01-01

    Background: As real-time quantitative PCR (RT-QPCR) is increasingly being relied upon for the enforcement of legislation and regulations dependent upon the trace detection of DNA, focus has increased on the quality issues related to the technique. Recent work has focused on the identification of factors that contribute towards significant measurement uncertainty in the real-time quantitative PCR technique, through investigation of the experimental design and operating procedure. However, measurement uncertainty contributions made during the data analysis procedure have not been studied in detail. This paper presents two additional approaches for standardising data analysis through the novel application of statistical methods to RT-QPCR, in order to minimise potential uncertainty in results. Results: Experimental data was generated in order to develop the two aspects of data handling and analysis that can contribute towards measurement uncertainty in results. This paper describes preliminary aspects in standardising data through the application of statistical techniques to the area of RT-QPCR. The first aspect concerns the statistical identification and subsequent handling of outlying values arising from RT-QPCR, and discusses the implementation of ISO guidelines in relation to acceptance or rejection of outlying values. The second aspect relates to the development of an objective statistical test for the comparison of calibration curves. Conclusion: The preliminary statistical tests for outlying values and comparisons between calibration curves can be applied using basic functions found in standard spreadsheet software. These two aspects emphasise that the comparability of results arising from RT-QPCR needs further refinement and development at the data-handling phase. The implementation of standardised approaches to data analysis should further help minimise variation due to subjective judgements. The aspects described in this paper will help contribute towards the development of a set of best practice guidelines regarding standardising handling and interpretation of data arising from RT-QPCR experiments. PMID:16336641
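
    As an illustration of the kind of outlier screening discussed above (the record does not name the specific test), a single-outlier Grubbs test on replicate Cq values takes only a few lines; the critical value uses the usual t-based formula and the data are invented:

    ```python
    import numpy as np
    from scipy import stats

    def grubbs_outlier(values, alpha=0.05):
        """Return (index, is_outlier) for the most extreme value by Grubbs' test."""
        x = np.asarray(values, dtype=float)
        n = len(x)
        idx = int(np.argmax(np.abs(x - x.mean())))
        g = abs(x[idx] - x.mean()) / x.std(ddof=1)
        t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
        g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
        return idx, g > g_crit

    cq_replicates = [24.1, 24.3, 24.2, 26.9, 24.0]   # invented Cq values
    print(grubbs_outlier(cq_replicates))
    ```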

  14. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, Robert V. (Tijeras, NM)

    1993-01-01

    The present invention is a quantitative method for measuring the total heat flux, and of deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infra-red sensing devices.

  15. Toward a quantitative account of pitch distribution in spontaneous narrative: Method and validation

    PubMed Central

    Matteson, Samuel E.; Streit Olness, Gloria; Caplow, Nancy J.

    2013-01-01

    Pitch is well known both to animate human discourse and to convey meaning in communication. The study of the statistical population distributions of pitch in discourse will undoubtedly benefit from methodological improvements. The current investigation examines a method that parameterizes pitch in discourse as musical pitch interval H measured in units of cents and that disaggregates the sequence of peak word-pitches using tools employed in time-series analysis and digital signal processing. The investigators test the proposed methodology by its application to distributions in pitch interval of the peak word-pitch (collectively called the discourse gamut) that occur in simulated and actual spontaneous emotive narratives obtained from 17 middle-aged African-American adults. The analysis, in rigorous tests, not only faithfully reproduced simulated distributions imbedded in realistic time series that drift and include pitch breaks, but the protocol also reveals that the empirical distributions exhibit a common hidden structure when normalized to a slowly varying mode (called the gamut root) of their respective probability density functions. Quantitative differences between narratives reveal the speakers' relative propensity for the use of pitch levels corresponding to elevated degrees of a discourse gamut (the “e-la”) superimposed upon a continuum that conforms systematically to an asymmetric Laplace distribution. PMID:23654400
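
    The pitch-interval parameterization referred to above follows the usual musical convention, H = 1200·log2(f/f_ref) cents relative to a reference frequency; the choice of reference in this sketch is an arbitrary assumption:

    ```python
    import math

    def pitch_interval_cents(f_hz, f_ref_hz):
        """Musical pitch interval in cents between a frequency and a reference."""
        return 1200.0 * math.log2(f_hz / f_ref_hz)

    # Example: a 220 Hz peak relative to a 196 Hz reference is about 200 cents (two semitones)
    print(round(pitch_interval_cents(220.0, 196.0), 1))
    ```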

  16. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, R.V.

    1993-03-16

    The present invention is a quantitative method for measuring the total heat flux, and of deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infrared sensing devices.

  17. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Marín, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-01

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. PMID:26275862

  18. Challenges of Interdisciplinary Research: Reconciling Qualitative and Quantitative Methods for Understanding Human-Landscape Systems

    NASA Astrophysics Data System (ADS)

    Lach, Denise

    2014-01-01

    While interdisciplinary research is increasingly practiced as a way to transcend the limitations of individual disciplines, our concepts and methods are primarily rooted in the disciplines that shape the way we think about the world and how we conduct research. While natural and social scientists may share a general understanding of how science is conducted, disciplinary differences in methodologies quickly emerge during interdisciplinary research efforts. This paper briefly introduces and reviews different philosophical underpinnings of quantitative and qualitative methodological approaches and introduces the idea that a pragmatic, realistic approach may allow natural and social scientists to work together productively. While realism assumes that there is a reality that exists independently of our perceptions, the work of scientists is to explore the mechanisms by which actions cause meaningful outcomes and the conditions under which the mechanisms can act. Our task as interdisciplinary researchers is to use the insights of our disciplines in the context of the problem to co-produce an explanation for the variables of interest. Research on qualities necessary for successful interdisciplinary researchers is also discussed along with recent efforts by funding agencies and academia to increase capacities for interdisciplinary research.

  19. Quantitative assessment of MS plaques and brain atrophy in multiple sclerosis using semiautomatic segmentation method

    NASA Astrophysics Data System (ADS)

    Heinonen, Tomi; Dastidar, Prasun; Ryymin, Pertti; Lahtinen, Antti J.; Eskola, Hannu; Malmivuo, Jaakko

    1997-05-01

    Quantitative magnetic resonance (MR) imaging of the brain is useful in multiple sclerosis (MS) in order to obtain reliable indices of disease progression. The goal of this project was to estimate the total volume of gliotic and non-gliotic plaques in chronic progressive multiple sclerosis with the help of a semiautomatic segmentation method developed at the Ragnar Granit Institute. The developed program, running on a PC-based computer, provides displays of the segmented data in addition to the volumetric analyses. The volumetric accuracy of the program was demonstrated by segmenting MR images of fluid-filled syringes. An anatomical atlas is to be incorporated in the segmentation system to estimate the distribution of MS plaques in various neural pathways of the brain. A total package including MS plaque volume estimation, estimation of brain atrophy and ventricular enlargement, and the distribution of MS plaques in different neural segments of the brain has been planned for the near future. Our study confirmed that total lesion volumes in chronic MS disease show a poor correlation to EDSS scores but show a positive correlation to neuropsychological scores. Therefore, accurate total volume measurements of MS plaques using the developed semiautomatic segmentation technique helped us to evaluate the degree of neuropsychological impairment.

  20. Development and validation of an event-specific quantitative PCR method for genetically modified maize MIR162.

    PubMed

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2014-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize event, MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500), for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr). The determined biases were less than 25% and the RSDr values were less than 20% at all evaluated concentrations. These results suggested that the limit of quantitation of the method was 0.5%, and that the developed method would thus be suitable for practical analyses for the detection and quantification of MIR162. PMID:25743383
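
    The role of the conversion factor can be illustrated with the relative-quantification arithmetic commonly used for event-specific GM methods: the event copy number is divided by the taxon-specific (endogenous) copy number and then by Cf to give a GM percentage. The sketch below uses that generic scheme with invented copy numbers and is not the validated protocol itself:

    ```python
    def gmo_percent(event_copies, endogenous_copies, cf):
        """GM amount (%) from plasmid-calibrated copy numbers and a conversion factor."""
        return (event_copies / endogenous_copies) / cf * 100.0

    # Illustrative values only (copy numbers would come from the plasmid standard curves)
    print(round(gmo_percent(event_copies=350.0, endogenous_copies=100000.0, cf=0.697), 2))
    ```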

  1. Quantitative determination of zopiclone and its impurity by four different spectrophotometric methods

    NASA Astrophysics Data System (ADS)

    Abdelrahman, Maha M.; Naguib, Ibrahim A.; El Ghobashy, Mohamed R.; Ali, Nesma A.

    2015-02-01

    Four simple, sensitive and selective spectrophotometric methods are presented for determination of Zopiclone (ZPC) and its impurity, one of its degradation products, namely 2-amino-5-chloropyridine (ACP). Method A is dual-wavelength spectrophotometry, where two wavelengths (252 and 301 nm for ZPC, and 238 and 261 nm for ACP) were selected for each component in such a way that the difference in absorbance is zero for the second one. Method B is an isoabsorptive ratio method combining the isoabsorptive point (259.8 nm) in the ratio spectrum using ACP as a divisor and the ratio difference for a single-step determination of both components. Method C is a third-derivative (D3) spectrophotometric method, which allows determination of both ZPC at 283.6 nm and ACP at 251.6 nm without interference from each other. Method D is based on measuring the peak amplitude of the first derivative of the ratio spectra (DD1) at 263.2 nm for ZPC and 252 nm for ACP. The suggested methods were validated according to ICH guidelines and can be applied for routine analysis in quality control laboratories. Statistical analysis of the results obtained from the proposed methods and those obtained from the reported method has been carried out, revealing high accuracy and good precision.
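
    Method A above rests on a simple relation: at the chosen wavelength pair the interferent's absorbance difference is zero, so the measured difference is proportional to the analyte alone and can be read off a difference-versus-concentration calibration line. A generic sketch of that arithmetic with synthetic absorbances (not the paper's data):

    ```python
    import numpy as np

    def dual_wavelength_conc(a_l1, a_l2, cal_conc, cal_a_l1, cal_a_l2):
        """Analyte concentration from the absorbance difference at two wavelengths
        chosen so that the interferent's difference is zero.

        cal_*: calibration standards of the pure analyte at the same two wavelengths.
        """
        cal_diff = np.asarray(cal_a_l1) - np.asarray(cal_a_l2)
        slope, intercept = np.polyfit(cal_conc, cal_diff, 1)
        return ((a_l1 - a_l2) - intercept) / slope

    # Invented calibration: difference grows linearly with concentration (ug/mL)
    conc = [2, 4, 6, 8]
    a1 = [0.21, 0.41, 0.62, 0.80]
    a2 = [0.05, 0.10, 0.16, 0.20]
    print(round(dual_wavelength_conc(0.50, 0.12, conc, a1, a2), 2))
    ```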

  2. Quantitative determination of zopiclone and its impurity by four different spectrophotometric methods.

    PubMed

    Abdelrahman, Maha M; Naguib, Ibrahim A; El Ghobashy, Mohamed R; Ali, Nesma A

    2015-02-25

    Four simple, sensitive and selective spectrophotometric methods are presented for determination of Zopiclone (ZPC) and its impurity, one of its degradation products, namely 2-amino-5-chloropyridine (ACP). Method A is dual-wavelength spectrophotometry, where two wavelengths (252 and 301 nm for ZPC, and 238 and 261 nm for ACP) were selected for each component in such a way that the difference in absorbance is zero for the second one. Method B is an isoabsorptive ratio method combining the isoabsorptive point (259.8 nm) in the ratio spectrum using ACP as a divisor and the ratio difference for a single-step determination of both components. Method C is a third-derivative (D(3)) spectrophotometric method, which allows determination of both ZPC at 283.6 nm and ACP at 251.6 nm without interference from each other. Method D is based on measuring the peak amplitude of the first derivative of the ratio spectra (DD(1)) at 263.2 nm for ZPC and 252 nm for ACP. The suggested methods were validated according to ICH guidelines and can be applied for routine analysis in quality control laboratories. Statistical analysis of the results obtained from the proposed methods and those obtained from the reported method has been carried out, revealing high accuracy and good precision. PMID:25244295

  3. Test Results for Entry Guidance Methods for Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Jones, Robert E.

    2003-01-01

    There are a number of approaches to advanced guidance and control (AG&C) that have the potential for achieving the goals of significantly increasing reusable launch vehicle (RLV) safety and reliability, and reducing the cost. This paper examines some approaches to entry guidance. An effort called Integration and Testing of Advanced Guidance and Control Technologies (ITAGCT) has recently completed a rigorous testing phase where these algorithms faced high-fidelity vehicle models and were required to perform a variety of representative tests. The algorithm developers spent substantial effort improving the algorithm performance in the testing. This paper lists the test cases used to demonstrate that the desired results are achieved, shows an automated test scoring method that greatly reduces the evaluation effort required, and displays results of the tests. Results show a significant improvement over previous guidance approaches. The two best-scoring algorithm approaches show roughly equivalent results and are ready to be applied to future reusable vehicle concepts.

  4. Test Results for Entry Guidance Methods for Space Vehicles

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Jones, Robert E.

    2004-01-01

    There are a number of approaches to advanced guidance and control that have the potential for achieving the goals of significantly increasing reusable launch vehicle (or any space vehicle that enters an atmosphere) safety and reliability, and reducing the cost. This paper examines some approaches to entry guidance. An effort called Integration and Testing of Advanced Guidance and Control Technologies has recently completed a rigorous testing phase where these algorithms faced high-fidelity vehicle models and were required to perform a variety of representative tests. The algorithm developers spent substantial effort improving the algorithm performance in the testing. This paper lists the test cases used to demonstrate that the desired results are achieved, shows an automated test scoring method that greatly reduces the evaluation effort required, and displays results of the tests. Results show a significant improvement over previous guidance approaches. The two best-scoring algorithm approaches show roughly equivalent results and are ready to be applied to future vehicle concepts.

  5. [Comparison of direct colony count methods and the MPN-method for quantitative detection of Listeria in model and field conditions].

    TOXLINE Toxicology Bibliographic Information

    Hildebrandt G; Schott W

    2001-11-01

    In order to compare the plate count method for quantitating Listeria, as published in the "Official Collection of Testing Methods" in section 35 LMBG (L. 00.00-22), to an MPN-method for Listeria based on the same media, these two detection methods for Listeria were tested in three sets of experiments and a routine sample status evaluation. A pure broth culture of L. monocytogenes, ground meat artificially contaminated with L. monocytogenes, artificially contaminated and cold-stored ground meat, as well as 77 ground beef samples from Berlin retail food stores were used in the four trials. The detection limit of the MPN-method is about 66% lower than that of the plate count method, allowing detection of a clearly greater number of Listeria-positive samples from naturally contaminated ground meat. The MPN-method yielded more Listeria spp.-positive samples (rel. 43%) and more L. monocytogenes-positive samples (rel. 21%) versus the colony count method based on the results from the field trial using ground beef samples from retail food stores in Berlin. Nevertheless, the standardized colony count method is preferred over the MPN-method for routine use because of its slightly higher productivity and much smaller variation in the results. However, the MPN-method is preferable for epidemiological studies because of the significance of the lower detection level. The random sampling evaluation of ground beef from retail stores indicated that 39% of the samples were Listeria spp.-positive and 31% were L. monocytogenes-positive when using the colony count method. A total of 56% of the meat samples were found to be Listeria spp.-positive and 38% L. monocytogenes-positive when the MPN-method was used. Population levels ranged from 10 to 580 cfu/g (Listeria spp.-positive samples) and from 10 to 270 cfu/g (L. monocytogenes-positive samples) for the colony count method. The MPN-method yielded population levels of 3.6 to 930 MPN/g for Listeria spp.-positive samples and 3.6 to 150 MPN/g for L. monocytogenes-positive samples. L. monocytogenes strains isolated using the colony count method belonged to the following serovars: 1/2a (46%), 1/2b (13%), 1/2c (33%), 3b (4%) and 4c (4%). A similar serovar isolation pattern was found for L. monocytogenes-positive MPN-tubes. The most common serotype was 1/2a (43%), followed by 1/2c (32%) and 1/2b (14%). The serotypes 3c, 4b and 4c were all isolated 4% of the time.
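
    For readers unfamiliar with how an MPN value is obtained, the estimate is the concentration that maximizes the likelihood of the observed pattern of positive tubes across dilutions. A compact grid-search sketch of that standard calculation (illustrative only, not the specific tables used in this study):

    ```python
    import numpy as np

    def mpn_estimate(volumes_g, n_tubes, n_positive):
        """Most probable number per gram from a dilution series.

        volumes_g: sample mass (g) inoculated per tube at each dilution
        n_tubes, n_positive: tubes inoculated / tubes positive at each dilution
        """
        v = np.asarray(volumes_g, float)
        n = np.asarray(n_tubes, float)
        p = np.asarray(n_positive, float)
        lam = np.logspace(-2, 4, 20000)                      # candidate MPN/g values
        # log-likelihood of the positive/negative pattern for each candidate
        prob_pos = 1.0 - np.exp(-np.outer(lam, v))
        loglik = (p * np.log(prob_pos + 1e-300) - (n - p) * np.outer(lam, v)).sum(axis=1)
        return lam[np.argmax(loglik)]

    # Example 3-tube series: 0.1, 0.01, 0.001 g per tube; 3/3, 2/3, 0/3 positive
    print(round(mpn_estimate([0.1, 0.01, 0.001], [3, 3, 3], [3, 2, 0]), 1))
    ```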

  6. Quantitative ultrasound method to detect and monitor laser-induced cavitation bubbles

    PubMed Central

    Karpiouk, Andrei B.; Aglyamov, Salavat R.; Bourgeois, Frederic; Ben-Yakar, Adela; Emelianov, Stanislav Y.

    2008-01-01

    An ultrasound technique to measure the spatial and temporal behavior of the laser-induced cavitation bubble is introduced. The cavitation bubbles were formed in water and in gels using a nanosecond pulsed Nd:YAG laser operating at 532 nm. A focused, single-element, 25-MHz ultrasound transducer was employed both to detect the acoustic emission generated by plasma expansion and to acoustically probe the bubble at different stages of its evolution. The arrival time of the passive acoustic emission was used to estimate the location of the cavitation bubble's origin, and the time of flight of the ultrasound pulse-echo signal was used to define its spatial extent. The results of ultrasound estimations of the bubble size were compared and found to be in agreement with both the direct optical measurements of the stationary bubble and the theoretical estimates of bubble dynamics derived from the well-known Rayleigh model of a cavity collapse. The results of this study indicate that the proposed quantitative ultrasound technique, capable of detecting and accurately measuring laser-induced cavitation bubbles in water and in a tissue-like medium, could be used in various biomedical and clinical applications. PMID:18601556
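
    Two pieces of arithmetic implicit in the comparison above can be shown compactly: the bubble radius inferred from the pulse-echo times of its near and far walls, and the Rayleigh collapse time for a cavity of that radius. This is a generic textbook sketch with assumed water properties, not the authors' processing code:

    ```python
    import math

    def bubble_radius_from_echo(t_front_s, t_back_s, c_m_s=1482.0):
        """Bubble radius from the round-trip echo times of its near and far walls."""
        return c_m_s * (t_back_s - t_front_s) / 4.0

    def rayleigh_collapse_time(r_max_m, p_inf_pa=101325.0, p_vapor_pa=2300.0, rho=998.0):
        """Rayleigh collapse time of a spherical cavity of maximum radius r_max."""
        return 0.915 * r_max_m * math.sqrt(rho / (p_inf_pa - p_vapor_pa))

    r = bubble_radius_from_echo(20.00e-6, 20.27e-6)   # echo times in seconds (illustrative)
    print(r * 1e6, "um radius;", rayleigh_collapse_time(r) * 1e6, "us collapse time")
    ```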

  7. Errors in quantitative T1rho imaging and the correction methods

    PubMed Central

    2015-01-01

    The spin-lattice relaxation time constant in rotating frame (T1rho) is useful for assessment of the properties of macromolecular environment inside tissue. Quantification of T1rho is found promising in various clinical applications. However, T1rho imaging is prone to image artifacts and quantification errors, which remains one of the greatest challenges to adopt this technique in routine clinical practice. The conventional continuous wave spin-lock is susceptible to B1 radiofrequency (RF) and B0 field inhomogeneity, which appears as banding artifacts in acquired images. A number of methods have been reported to modify T1rho prep RF pulse cluster to mitigate this effect. Adiabatic RF pulse can also be used for spin-lock with insensitivity to both B1 RF and B0 field inhomogeneity. Another source of quantification error in T1rho imaging is signal evolution during imaging data acquisition. Care is needed to affirm such error does not take place when specific pulse sequence is used for imaging data acquisition. Another source of T1rho quantification error is insufficient signal-to-noise ratio (SNR), which is common among various quantitative imaging approaches. Measurement of T1rho within an ROI can mitigate this issue, but at the cost of reduced resolution. Noise-corrected methods are reported to address this issue in pixel-wise quantification. For certain tissue type, T1rho quantification can be confounded by magic angle effect and the presence of multiple tissue components. Review of these confounding factors from inherent tissue properties is not included in this article. PMID:26435922
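
    For context on what is being quantified above: T1rho is usually obtained by fitting the spin-lock-prepared signal to a mono-exponential decay in the spin-lock time (TSL). A minimal pixel-wise fitting sketch under that standard model (it implements none of the artifact corrections discussed in the review):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def t1rho_model(tsl_ms, s0, t1rho_ms):
        """Mono-exponential spin-lock decay: S(TSL) = S0 * exp(-TSL / T1rho)."""
        return s0 * np.exp(-tsl_ms / t1rho_ms)

    def fit_t1rho(tsl_ms, signal):
        """Fit S0 and T1rho (ms) from signals acquired at several spin-lock times."""
        p0 = (float(signal[0]), 40.0)                     # rough initial guess
        popt, _ = curve_fit(t1rho_model, tsl_ms, signal, p0=p0, maxfev=5000)
        return popt

    tsl = np.array([1.0, 10.0, 20.0, 40.0, 60.0])          # spin-lock times in ms
    sig = 1000.0 * np.exp(-tsl / 45.0) + np.random.default_rng(2).normal(0, 5, tsl.size)
    print(fit_t1rho(tsl, sig))
    ```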

  8. Effects of drying methods on qualitative and quantitative properties of essential oil of two basil landraces.

    PubMed

    Ghasemi Pirbalouti, Abdollah; Mahdad, Elahe; Craker, Lyle

    2013-12-01

    Sweet basil, a plant that is extensively cultivated in some countries, is used to enhance the flavour of salads, sauces, pasta and confectioneries as both a fresh and a dried herb. To determine the effect of drying methods on the qualitative and quantitative characteristics of the plant and essential oil of basil, two landraces, Purple and Green, were dried in sunlight, in shade, in mechanical ovens at 40 °C and 60 °C, in a microwave oven at 500 W, and by freeze-drying. For comparison, the essential oils of all samples were extracted by hydrodistillation and analyzed using GC and GC-MS. The highest essential oil yields (v/w on a dry weight basis) were obtained from shade-dried tissue in both landraces, followed by the freeze-dried sample of the purple landrace and the fresh sample of the green landrace. Increasing the drying temperature significantly decreased the essential oil content of all samples. Significant changes in the chemical profile of the essential oils from each landrace were associated with the drying method, including the loss of most monoterpene hydrocarbons, as compared with fresh samples. No significant differences occurred among several constituents of the extracted essential oils, including methyl chavicol (estragole), the major compound in the oil of both landraces, whether the plants were dried in the shade, in the sun, in the oven at 40 °C or freeze-dried, as compared with a fresh sample. The percentage of methyl chavicol in the oil, however, decreased significantly when the plant material was dried in the oven at 60 °C or microwaved. In addition, linalool, the second major compound in the purple landrace, and geranial and neral, major compounds in the green landrace, decreased significantly when the plant tissue was dried in the oven at 60 °C or microwaved. PMID:23870979

  9. Errors in quantitative T1rho imaging and the correction methods.

    PubMed

    Chen, Weitian

    2015-08-01

    The spin-lattice relaxation time constant in the rotating frame (T1rho) is useful for assessing the properties of the macromolecular environment inside tissue. Quantification of T1rho has shown promise in various clinical applications. However, T1rho imaging is prone to image artifacts and quantification errors, which remains one of the greatest challenges to adopting this technique in routine clinical practice. The conventional continuous-wave spin-lock is susceptible to B1 radiofrequency (RF) and B0 field inhomogeneity, which appears as banding artifacts in the acquired images. A number of methods have been reported that modify the T1rho prep RF pulse cluster to mitigate this effect. Adiabatic RF pulses can also be used for spin-lock, with insensitivity to both B1 RF and B0 field inhomogeneity. Another source of quantification error in T1rho imaging is signal evolution during imaging data acquisition. Care is needed to ensure that such errors do not occur when a specific pulse sequence is used for imaging data acquisition. A further source of T1rho quantification error is insufficient signal-to-noise ratio (SNR), which is common among quantitative imaging approaches. Measuring T1rho within an ROI can mitigate this issue, but at the cost of reduced resolution. Noise-corrected methods have been reported to address this issue in pixel-wise quantification. For certain tissue types, T1rho quantification can be confounded by the magic angle effect and the presence of multiple tissue components. A review of these confounding factors arising from inherent tissue properties is not included in this article. PMID:26435922

  10. Comparison of interlaboratory results for blood lead with results from a definitive method.

    PubMed

    Boone, J; Hearn, T; Lewis, S

    1979-03-01

    Results reported by 113 participants in the Blood Lead Proficiency Testing Program conducted by the Center for Disease Control were compared with those obtained by the National Bureau of Standards (NBS) with a definitive method (isotope dilution mass spectrometry) for blood lead analysis. Data were compiled from the results obtained for 12 whole-blood samples containing 1.5 g of disodium EDTA per liter. The twelve separate blood samples were obtained from cattle that had been given lead nitrate orally. Lead concentrations in the samples ranged from 0.628 to 4.93 mumol/L (130-1020 micrograms/L) as determined by NBS. The methods used by the laboratories were classified into six basic groups: anodic stripping voltammetry, and atomic absorption spectroscopy using extraction, carbon rod, graphite furnace, tantalum strip, or Delves cup. For the results obtained in each group, a linear regression analysis of laboratory values against NBS values was performed. In comparison with the definitive method, most field methods for blood lead tended to overestimate the lead concentration when the actual concentration was less than 1.96 mumol/L (400 micrograms/L) and to underestimate it when the actual concentration was greater than 2.45 mumol/L (500 micrograms/L). PMID:262177
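
    The comparison in the study rests on regressing each method group's results on the NBS definitive values. The sketch below shows that kind of regression on purely hypothetical numbers (not the study's data), illustrating how a slope below 1 with a positive intercept produces the reported over/underestimation pattern.

```python
import numpy as np

# Hypothetical illustration only: mean field-method result per sample
# regressed on the NBS definitive value (umol/L).
nbs   = np.array([0.63, 0.97, 1.45, 1.96, 2.45, 2.90, 3.60, 4.93])
field = np.array([0.75, 1.05, 1.52, 1.98, 2.38, 2.78, 3.40, 4.60])

slope, intercept = np.polyfit(nbs, field, 1)
print(f"field ≈ {slope:.2f} * NBS + {intercept:.2f}")
# A slope < 1 with a positive intercept reproduces the reported pattern:
# overestimation below ~1.96 umol/L and underestimation above ~2.45 umol/L.
```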

  11. A new sample preparation method for the absolute quantitation of a target proteome using (18)O labeling combined with multiple reaction monitoring mass spectrometry.

    PubMed

    Li, Jiabin; Zhou, Lianqi; Wang, Huanhuan; Yan, Hui; Li, Nannan; Zhai, Rui; Jiao, Fenglong; Hao, Feiran; Jin, Zuyao; Tian, Fang; Peng, Bo; Zhang, Yangjun; Qian, Xiaohong

    2015-02-21

    A key step in the bottom-up proteomics workflow is the proteolysis of proteins into peptides with trypsin. Enzyme-catalyzed (18)O-labeled peptides used as internal standards, coupled with multiple reaction monitoring mass spectrometry (MRM MS), are commonly employed for the absolute quantitation of a target proteome because of their convenient operation and low cost. However, long digestion and labeling times, incomplete digestion, and (18)O-to-(16)O back-exchange limit the application of this approach; we therefore developed a rapid and efficient digestion method based on a high ratio of trypsin to protein. In addition, after separation of the digested samples using pipette tips packed in-house with reversed-phase material, the trypsin can be separated, collected and reused at least four times. Based on this approach, a novel protein quantification method using (18)O-labeled QconCAT peptides as internal standards combined with MRM MS for the absolute quantitation of a target proteome was established. Experimental results showed that the novel method had high digestion and (18)O labeling efficiencies, and no (18)O-to-(16)O back-exchange occurred. A linear range covering two orders of magnitude and a limit of quantification (LOQ) as low as 5 fmol were achieved, with an RSD below 10%. The quantitative method was then used for the absolute quantitation of drug-metabolizing enzymes in human liver microsomes. The results are in good agreement with previously reported data, which demonstrates that the novel method can be used for absolute quantitative analyses of target proteomes in complex biological samples. PMID:25568899
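
    The quantitation step itself reduces to a peak-area ratio between the endogenous (16)O peptide and its spiked (18)O-labeled internal standard. A minimal sketch, assuming complete labeling and no back-exchange (the conditions the authors report achieving), is shown below; the peak areas and spike amount are illustrative.

```python
def absolute_amount(light_area, heavy_area, spiked_standard_fmol):
    """Absolute amount of the endogenous (16O) peptide from the MRM
    peak-area ratio to its 18O-labelled internal standard.
    Assumes complete labelling and no 18O -> 16O back-exchange."""
    return (light_area / heavy_area) * spiked_standard_fmol

# e.g. light/heavy transition areas of 2.4e5 / 1.2e5 with 50 fmol standard spiked
print(absolute_amount(2.4e5, 1.2e5, 50.0))   # -> 100.0 fmol
```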

  12. Quantitative structure-activity relationship of the curcumin-related compounds using various regression methods

    NASA Astrophysics Data System (ADS)

    Khazaei, Ardeshir; Sarmasti, Negin; Seyf, Jaber Yousefi

    2016-03-01

    Quantitative structure-activity relationships were used to study a series of curcumin-related compounds with inhibitory effects on prostate cancer PC-3 cells, pancreas cancer Panc-1 cells, and colon cancer HT-29 cells. The sphere exclusion method was used to split the data set into training and test sets. Multiple linear regression, principal component regression and partial least squares were used as the regression methods. To investigate the effect of feature selection, stepwise selection, genetic algorithm, and simulated annealing methods were used. In two cases (PC-3 cells and Panc-1 cells), the best models were generated by a combination of multiple linear regression and stepwise selection (PC-3 cells: r2 = 0.86, q2 = 0.82, pred_r2 = 0.93, and r2m (test) = 0.43; Panc-1 cells: r2 = 0.85, q2 = 0.80, pred_r2 = 0.71, and r2m (test) = 0.68). For the HT-29 cells, principal component regression with stepwise selection (r2 = 0.69, q2 = 0.62, pred_r2 = 0.54, and r2m (test) = 0.41) was the best method. The QSAR study reveals the descriptors that play a crucial role in the inhibitory properties of curcumin-like compounds. 6ChainCount, T_C_C_1, and T_O_O_7 are the most important descriptors, with the greatest effects. To design and optimize novel, efficient curcumin-related compounds, it is useful to introduce heteroatoms such as nitrogen, oxygen, and sulfur into the chemical structure (reducing the contribution of the T_C_C_1 descriptor) and to increase the contributions of the 6ChainCount and T_O_O_7 descriptors. The models can be useful in the better design of novel curcumin-related compounds for use in the treatment of prostate, pancreas, and colon cancers.
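
    As a rough analogue of the workflow described (data splitting, stepwise feature selection, multiple linear regression), the sketch below uses a random train/test split in place of sphere exclusion and a greedy forward selection scored by cross-validated r2. The descriptors and activities are synthetic; nothing here reproduces the paper's models.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score, train_test_split

def forward_stepwise(X, y, max_features=3, cv=5):
    """Greedy forward selection scored by cross-validated R^2,
    a simple stand-in for the stepwise procedure named in the abstract."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < max_features:
        scores = [(np.mean(cross_val_score(LinearRegression(),
                                           X[:, selected + [j]], y, cv=cv)), j)
                  for j in remaining]
        _, best_j = max(scores)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Toy data: 40 "compounds" x 10 hypothetical descriptors (not the paper's data)
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 10))
y = 1.5 * X[:, 2] - 0.8 * X[:, 7] + rng.normal(scale=0.2, size=40)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
feats = forward_stepwise(X_tr, y_tr)
model = LinearRegression().fit(X_tr[:, feats], y_tr)
print("selected descriptors:", feats,
      "test R^2:", round(model.score(X_te[:, feats], y_te), 2))
```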

  13. Supersonic cruise research aircraft structural studies: Methods and results

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Gross, D.; Kurtze, W.; Newsom, J.; Wrenn, G.; Greene, W.

    1981-01-01

    NASA Langley Research Center SCAR in-house structural studies are reviewed. In methods development, advances include a new system of integrated computer programs called ISSYS, progress in determining aerodynamic loads and aerodynamically induced structural loads (including those due to gusts), flutter optimization for composite and metal airframe configurations using refined and simplified mathematical models, and synthesis of active controls. Results given address several aspects of various SCR configurations. These results include flutter penalties on composite wing, flutter suppression using active controls, roll control effectiveness, wing tip ground clearance, tail size effect on flutter, engine weight and mass distribution influence on flutter, and strength and flutter optimization of new configurations. The ISSYS system of integrated programs performed well in all the applications illustrated by the results, the diversity of which attests to ISSYS' versatility.

  14. Comparison of the Multiple-sample means with composite sample results for fecal indicator bacteria by quantitative PCR and culture

    EPA Science Inventory

    ABSTRACT: Few studies have addressed the efficacy of composite sampling for measurement of indicator bacteria by QPCR. In this study, composite results were compared to single sample results for culture- and QPCR-based water quality monitoring. Composite results for both methods ...

  15. Using Active Learning to Teach Concepts and Methods in Quantitative Biology.

    PubMed

    Waldrop, Lindsay D; Adolph, Stephen C; Diniz Behn, Cecilia G; Braley, Emily; Drew, Joshua A; Full, Robert J; Gross, Louis J; Jungck, John A; Kohler, Brynja; Prairie, Jennifer C; Shtylla, Blerta; Miller, Laura A

    2015-11-01

    This article provides a summary of the ideas discussed at the 2015 Annual Meeting of the Society for Integrative and Comparative Biology society-wide symposium on Leading Students and Faculty to Quantitative Biology through Active Learning. It also includes a brief review of the recent advancements in incorporating active learning approaches into quantitative biology classrooms. We begin with an overview of recent literature that shows that active learning can improve students' outcomes in Science, Technology, Engineering and Math Education disciplines. We then discuss how this approach can be particularly useful when teaching topics in quantitative biology. Next, we describe some of the recent initiatives to develop hands-on activities in quantitative biology at both the graduate and the undergraduate levels. Throughout the article we provide resources for educators who wish to integrate active learning and technology into their classrooms. PMID:26269460

  16. Structured decision making as a method for linking quantitative decision support to community fundamental objectives

    EPA Science Inventory

    Decision support intended to improve ecosystem sustainability requires that we link stakeholder priorities directly to quantitative tools and measures of desired outcomes. Actions taken at the community level can have large impacts on production and delivery of ecosystem service...

  17. A processing method and results of meteor shower radar observations

    NASA Technical Reports Server (NTRS)

    Belkovich, O. I.; Suleimanov, N. I.; Tokhtasjev, V. S.

    1987-01-01

    Studies of meteor showers permit solving some principal problems of meteor astronomy: obtaining the structure of a stream in cross section and along its orbit; retracing the evolution of particle orbits in the stream, taking into account gravitational and nongravitational forces, and determining the orbital elements of its parent body; estimating the total mass of solid particles ejected from the parent body, taking into account the physical and chemical evolution of meteor bodies; and using meteor streams as natural probes to investigate the average characteristics of the meteor complex in the solar system. A simple and effective method of determining the flux density and the mass exponent parameter was worked out. This method and its results are discussed.

  18. Methods and preliminary measurement results of liquid Li wettability

    SciTech Connect

    Zuo, G. Z. Hu, J. S.; Ren, J.; Sun, Z.; Yang, Q. X.; Li, J. G.; Zakharov, L. E.; Mansfield, D. K.

    2014-02-15

    A test of lithium wettability was performed in high vacuum (< 3 × 10⁻⁴ Pa). High magnification images of Li droplets on stainless steel substrates were produced and processed using the MATLAB® program to obtain clear image edge points. In contrast to the more standard “θ/2” or polynomial fitting methods, ellipse fitting of the complete Li droplet shape resulted in reliable contact angle measurements over a wide range of contact angles. Using the ellipse fitting method, it was observed that the contact angle of a liquid Li droplet on a stainless steel substrate gradually decreased with increasing substrate temperature. The critical wetting temperature of liquid Li on stainless steel was observed to be about 290 °C.
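
    A minimal version of the ellipse-fitting approach is sketched below: a general conic is fitted to the droplet edge points by least squares, the conic is intersected with the substrate line, and the contact angle is obtained from the implicit tangent slope at the contact point. The synthetic circular-cap profile and the flat baseline at y = 0 are assumptions for illustration; the authors' MATLAB implementation may differ in detail.

```python
import numpy as np

def fit_conic(x, y):
    """Algebraic least-squares fit of a conic A x^2 + B xy + C y^2 + D x + E y + F = 0
    to droplet edge points (smallest-singular-vector solution)."""
    M = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(M)
    return vt[-1]                       # A, B, C, D, E, F

def contact_angle_deg(coeffs, baseline_y=0.0):
    """Contact angle at the left contact point of the fitted conic with the substrate."""
    A, B, C, D, E, F = coeffs
    # intersection of the conic with the substrate line y = baseline_y
    a, b, c = A, B * baseline_y + D, C * baseline_y**2 + E * baseline_y + F
    x_left = min(np.roots([a, b, c]).real)
    # implicit differentiation: dy/dx = -(2Ax + By + D) / (Bx + 2Cy + E)
    m = -(2 * A * x_left + B * baseline_y + D) / (B * x_left + 2 * C * baseline_y + E)
    # acute angle for a rising tangent, obtuse for a falling one
    return np.degrees(np.arctan2(abs(m), np.copysign(1.0, m)))

# Synthetic droplet profile: circular cap of radius 1 with its centre 0.5 above the substrate
t = np.linspace(np.radians(-30), np.radians(210), 200)
x, y = np.cos(t), 0.5 + np.sin(t)
print(round(contact_angle_deg(fit_conic(x, y)), 1))   # ≈ 120.0 degrees
```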

  19. Real-time quantitative polymerase chain reaction methods for four genetically modified maize varieties and maize DNA content in food.

    PubMed

    Brodmann, Peter D; Ilg, Evelyn C; Berthoud, Hélène; Herrmann, Andre

    2002-01-01

    Quantitative detection methods are needed for enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients. This labeling threshold, which is set at 1% in the European Union and Switzerland, must be applied to all approved GMOs. Four different varieties of maize are approved in the European Union: the insect-resistant Bt176 (Maximizer), Bt11, and Mon810 (YieldGard) maize, and the herbicide-tolerant T25 (Liberty Link) maize. Because the labeling must be considered individually for each ingredient, a quantitation system for the endogenous maize content is needed in addition to the GMO-specific detection systems. Quantitative real-time polymerase chain reaction detection methods were developed for the four approved genetically modified maize varieties and for an endogenous maize (invertase) gene system. PMID:12083257
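
    In standard-curve-based real-time PCR quantitation, copy numbers for the event-specific and endogenous (invertase) targets are read off their respective calibration curves and the GMO content is reported as their ratio. The sketch below follows that logic with hypothetical Ct values and curve parameters; it is not the validated procedure from the paper.

```python
def copies_from_ct(ct, slope, intercept):
    """Copy number from a Ct value using a standard curve Ct = slope * log10(copies) + intercept.
    The slope and intercept would come from a dilution series of a calibrant."""
    return 10 ** ((ct - intercept) / slope)

def gmo_percent(ct_event, ct_invertase, curve_event, curve_invertase):
    """GMO content as the ratio of event-specific to endogenous (invertase) copies."""
    event = copies_from_ct(ct_event, *curve_event)
    maize = copies_from_ct(ct_invertase, *curve_invertase)
    return 100.0 * event / maize

# Hypothetical Ct values and standard curves (slope, intercept), for illustration only
print(round(gmo_percent(32.1, 25.4, (-3.32, 40.0), (-3.35, 39.0)), 2))
```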

  20. Methods to measure peripheral and central sensitization using quantitative sensory testing: A focus on individuals with low back pain.

    PubMed

    Starkweather, Angela R; Heineman, Amy; Storey, Shannon; Rubia, Gil; Lyon, Debra E; Greenspan, Joel; Dorsey, Susan G

    2016-02-01

    Quantitative sensory testing can be used to assess peripheral and central sensitization, important factors that contribute to an individual's experience of pain and disability. Many studies use quantitative sensory testing in patients with low back pain to detect alterations in pain sensitivity; however, because investigators employ different protocols, interpreting findings across studies can be problematic. The purpose of this article is to propose a standardized method for testing peripheral and central pain sensitization in patients with low back pain. Video clips are provided to demonstrate correct procedures for measuring the response to experimental pain using mechanical, thermal and pressure modalities. As nurse researchers and clinicians increase their use of quantitative sensory testing to examine pain phenotypes, it is anticipated that more personalized methods for monitoring the trajectory of low back pain and the response to treatment will improve outcomes for this patient population. PMID:26856520