Science.gov

Sample records for advanced quantitative methods

  1. Advancing the study of violence against women using mixed methods: integrating qualitative methods into a quantitative research program.

    PubMed

    Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol

    2011-02-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.

  2. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    PubMed Central

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  3. Response monitoring using quantitative ultrasound methods and supervised dictionary learning in locally advanced breast cancer

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Fung, Brandon; Tadayyon, Hadi; Tran, William T.; Czarnota, Gregory J.

    2016-03-01

    A non-invasive computer-aided-theragnosis (CAT) system was developed for the early assessment of responses to neoadjuvant chemotherapy in patients with locally advanced breast cancer. The CAT system was based on quantitative ultrasound spectroscopy methods comprising several modules including feature extraction, a metric to measure the dissimilarity between "pre-" and "mid-treatment" scans, and a supervised learning algorithm for the classification of patients to responders/non-responders. One major requirement for the successful design of a high-performance CAT system is to accurately measure the changes in parametric maps before treatment onset and during the course of treatment. To this end, a unified framework based on Hilbert-Schmidt independence criterion (HSIC) was used for the design of feature extraction from parametric maps and the dissimilarity measure between the "pre-" and "mid-treatment" scans. For the feature extraction, HSIC was used to design a supervised dictionary learning (SDL) method by maximizing the dependency between the scans taken from "pre-" and "mid-treatment" with "dummy labels" given to the scans. For the dissimilarity measure, an HSIC-based metric was employed to effectively measure the changes in parametric maps as an indication of treatment effectiveness. The HSIC-based feature extraction and dissimilarity measure used a kernel function to nonlinearly transform input vectors into a higher dimensional feature space and computed the population means in the new space, where enhanced group separability was ideally obtained. The results of the classification using the developed CAT system indicated an improvement of performance compared to a CAT system with basic features using histogram of intensity.
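The HSIC machinery described in this record can be sketched in a few lines. Below is a generic biased empirical HSIC estimator with Gaussian kernels; the kernel choice and bandwidth are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix for the row vectors in X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

A dependent pair of samples should yield a larger HSIC value than an independent pair, which is the property the dissimilarity measure exploits.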

  4. Recent advances in quantitative neuroproteomics.

    PubMed

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2013-06-15

    The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function, to various neuropsychiatric disorders including schizophrenia and drug addiction, and to neurodegenerative diseases including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including labeled (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, it appears that the quality and depth of the more recent quantitative proteomics studies is beginning to shed

  5. Structural Analysis and Quantitative Determination of Clevidipine Butyrate Impurities Using an Advanced RP-HPLC Method.

    PubMed

    Zhou, Yuxia; Zhou, Fan; Yan, Fei; Yang, Feng; Yao, Yuxian; Zou, Qiaogen

    2016-03-01

    Eleven potential impurities, including process-related compounds and degradation products, have been analyzed by comprehensive studies on the manufacturing process of clevidipine butyrate. Possible formation mechanisms could also be devised. MS and NMR techniques have been used for the structural characterization of three previously unreported impurities (Imp-3, Imp-5 and Imp-11). To separate and quantify the potential impurities in a simultaneous fashion, an efficient and advanced RP-HPLC method has been developed. In doing so, four major degradation products (Imp-2, Imp-4, Imp-8 and Imp-10) can be observed under varying stress conditions. This analytical method has been validated according to ICH guidelines with respect to specificity, accuracy, linearity, robustness and stability. The method described has been demonstrated to be applicable in routine quality control processes and stability evaluation studies of clevidipine butyrate.

  6. Advances in statistical methods to map quantitative trait loci in outbred populations.

    PubMed

    Hoeschele, I; Uimari, P; Grignola, F E; Zhang, Q; Gage, K M

    1997-11-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes, and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences, including confidence intervals of QTL location, require the use of Monte Carlo or bootstrap sampling techniques. A method intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown.
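The simplest approach mentioned here, linear regression of phenotype on marker genotype with a permutation-derived test-statistic distribution, can be sketched on simulated data. All numbers below are illustrative assumptions, not values from the review:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
genotype = rng.integers(0, 3, size=n)            # marker genotypes coded 0/1/2
phenotype = 0.5 * genotype + rng.normal(size=n)  # simulated QTL effect plus noise

def f_statistic(g, y):
    """F statistic for linear regression of phenotype on genotype dosage."""
    X = np.column_stack([np.ones(len(g)), g.astype(float)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    yhat = X @ beta
    ss_model = np.sum((yhat - y.mean()) ** 2)
    ss_resid = np.sum((y - yhat) ** 2)
    return (ss_model / 1.0) / (ss_resid / (len(y) - 2))

obs = f_statistic(genotype, phenotype)
# permutation test: shuffling genotypes breaks the marker-phenotype link,
# giving an empirical null distribution for the F statistic
perm = [f_statistic(rng.permutation(genotype), phenotype) for _ in range(999)]
p_value = (1 + sum(f >= obs for f in perm)) / (1 + len(perm))
```

With a simulated QTL effect present, the observed F statistic should land far in the tail of the permutation distribution.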

  7. Advances in Statistical Methods to Map Quantitative Trait Loci in Outbred Populations

    PubMed Central

    Hoeschele, I.; Uimari, P.; Grignola, F. E.; Zhang, Q.; Gage, K. M.

    1997-01-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes, and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences, including confidence intervals of QTL location, require the use of Monte Carlo or bootstrap sampling techniques. A method intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown. PMID:9383084

  8. RECENT ADVANCES IN QUANTITATIVE NEUROPROTEOMICS

    PubMed Central

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2014-01-01

    The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function, to various neuropsychiatric disorders including schizophrenia and drug addiction, and to neurodegenerative diseases including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including labeled (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, it appears that the quality and depth of the more recent quantitative proteomics studies is beginning to

  9. Advances in quantitative Kerr microscopy

    NASA Astrophysics Data System (ADS)

    Soldatov, I. V.; Schäfer, R.

    2017-01-01

    An advanced wide-field Kerr microscopy approach to the vector imaging of magnetic domains is demonstrated. Utilizing the light from eight monochrome light-emitting diodes, guided to the microscope by glass fibers and switched in synchronization with the camera exposure, domain images with orthogonal in-plane sensitivity are obtained simultaneously in real time. After calibrating the Kerr contrast under the same orthogonal sensitivity conditions, the magnetization vector field of complete magnetization cycles along the hysteresis loop can be calculated and plotted as a color-coded or vector image. In the pulsed mode, parasitic magnetic-field-dependent Faraday rotations in the microscope optics are also eliminated, increasing the accuracy of the measured magnetization angles to better than 5°. The method is applied to the investigation of the magnetization process in a patterned Permalloy film element. Furthermore, it is shown that the effective magnetic anisotropy axes in a GaMnAs semiconducting film can be quantitatively measured by vectorial analysis of the domain structure.
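The vector-imaging step, recovering the in-plane magnetization angle from two calibrated orthogonal Kerr signals, can be reduced to a minimal sketch. The calibration procedure itself is not reproduced; cal_x and cal_y stand for assumed saturation-calibration constants:

```python
import numpy as np

def magnetization_angle(sig_x, sig_y, cal_x, cal_y):
    """In-plane magnetization angle in degrees from two orthogonal Kerr
    signals, each normalized by its saturation-calibration constant."""
    return np.degrees(np.arctan2(np.asarray(sig_y) / cal_y,
                                 np.asarray(sig_x) / cal_x))
```

Applied pixel-wise to the two sensitivity images, this yields the angle map that can then be rendered as a color-coded or vector image.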

  10. Biological Matrix Effects in Quantitative Tandem Mass Spectrometry-Based Analytical Methods: Advancing Biomonitoring

    PubMed Central

    Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd

    2015-01-01

    The ability to quantify levels of target analytes in biological samples accurately and precisely, in biomonitoring, involves the use of highly sensitive and selective instrumentation such as tandem mass spectrometers and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes as well as the chromatographic response of target analytes, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585
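A common way to express the matrix effects discussed in this record is to compare calibration slopes measured in matrix against slopes in neat solvent. The sketch below follows the widely used post-extraction-spike slope-comparison convention; this particular formula is an assumption, since the review does not prescribe a single definition:

```python
import numpy as np

def calibration_slope(conc, response):
    """Slope of a linear calibration curve (ordinary least squares)."""
    return np.polyfit(conc, response, 1)[0]

def matrix_effect_pct(slope_matrix, slope_solvent):
    """Percent matrix effect from matrix vs. solvent calibration slopes.

    Negative values indicate ion suppression, positive values signal
    enhancement; values near zero mean a negligible matrix effect.
    """
    return (slope_matrix / slope_solvent - 1.0) * 100.0
```

For example, a matrix calibration slope 20% lower than the solvent slope corresponds to 20% ion suppression.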

  11. Advanced Mass Spectrometric Methods for the Rapid and Quantitative Characterization of Proteomes

    DOE PAGES

    Smith, Richard D.

    2002-01-01

    Progress is reviewed towards the development of a global strategy that aims to extend the sensitivity, dynamic range, comprehensiveness and throughput of proteomic measurements based upon the use of high-performance separations and mass spectrometry. The approach uses high-accuracy mass measurements from Fourier transform ion cyclotron resonance mass spectrometry (FTICR) to validate peptide 'accurate mass tags' (AMTs) produced by global protein enzymatic digestions for a specific organism, tissue or cell type from 'potential mass tags' tentatively identified using conventional tandem mass spectrometry (MS/MS). This provides the basis for subsequent measurements without the need for MS/MS. High-resolution capillary liquid chromatography separations combined with high-sensitivity, high-resolution, accurate FTICR measurements are shown to be capable of characterizing peptide mixtures of more than 10^5 components. The strategy has been initially demonstrated using the microorganisms Saccharomyces cerevisiae and Deinococcus radiodurans. Advantages of the approach include the high confidence of protein identification, its broad proteome coverage, high sensitivity, and the capability for stable-isotope labeling methods for precise relative protein abundance measurements. Abbreviations: LC, liquid chromatography; FTICR, Fourier transform ion cyclotron resonance; AMT, accurate mass tag; PMT, potential mass tag; MMA, mass measurement accuracy; MS, mass spectrometry; MS/MS, tandem mass spectrometry; ppm, parts per million.

  12. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. universities' systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization
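A Monte Carlo simulation of the kind mentioned here can be sketched as follows. The three-phase schedule and the triangular (optimistic, most likely, pessimistic) duration estimates are purely hypothetical, chosen only to show the mechanics of simulating process-performance percentiles:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000  # number of simulated project runs

# hypothetical three-phase development schedule, durations in working days
design = rng.triangular(10, 15, 25, n)
build = rng.triangular(20, 30, 50, n)
test = rng.triangular(5, 10, 20, n)

total = design + build + test
p50, p90 = np.percentile(total, [50, 90])  # median and 90th-percentile duration
```

Comparing such percentile forecasts against actuals is one simple way to quantify and continually improve process performance.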

  13. Advanced stability indicating chemometric methods for quantitation of amlodipine and atorvastatin in their quinary mixture with acidic degradation products

    NASA Astrophysics Data System (ADS)

    Darwish, Hany W.; Hassan, Said A.; Salem, Maissa Y.; El-Zeany, Badr A.

    2016-02-01

    Two advanced, accurate and precise chemometric methods are developed for the simultaneous determination of amlodipine besylate (AML) and atorvastatin calcium (ATV) in the presence of their acidic degradation products in tablet dosage forms. The first method was Partial Least Squares (PLS-1) and the second was Artificial Neural Networks (ANN). PLS was compared to ANN models with and without a variable selection procedure (genetic algorithm, GA). For proper analysis, a 5-factor, 5-level experimental design was established, resulting in 25 mixtures containing different ratios of the interfering species. Fifteen mixtures were used as the calibration set and the other ten mixtures were used as the validation set to assess the prediction ability of the suggested models. The proposed methods were successfully applied to the analysis of pharmaceutical tablets containing AML and ATV. The results demonstrated the ability of the models to resolve the highly overlapped spectra of the quinary mixture while using inexpensive and easy-to-handle instruments such as a UV-VIS spectrophotometer.
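PLS and ANN models are beyond a short sketch, but the underlying idea, resolving a mixture's overlapped spectrum into component concentrations, can be illustrated with a classical-least-squares baseline. The Gaussian "spectra" below are synthetic stand-ins, not real AML or ATV spectra:

```python
import numpy as np

wavelengths = np.linspace(230, 320, 91)  # hypothetical UV range, nm

def band(center, width):
    """Synthetic Gaussian absorption band (illustrative only)."""
    return np.exp(-((wavelengths - center) / width) ** 2)

# columns: synthetic pure-component spectra standing in for two analytes
S = np.column_stack([band(260.0, 12.0), band(280.0, 15.0)])

true_conc = np.array([0.3, 0.7])
rng = np.random.default_rng(2)
mixture = S @ true_conc + 1e-3 * rng.normal(size=wavelengths.size)

# least-squares estimate of the component concentrations from the mixture
est, *_ = np.linalg.lstsq(S, mixture, rcond=None)
```

PLS improves on this baseline when the pure spectra are unknown or interferents are present, which is why the paper pairs it with an experimental calibration design.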

  14. Recent Advances In Quantitative Volcano Seismology

    NASA Astrophysics Data System (ADS)

    Chouet, B.

    A fundamental goal of volcano seismology is to understand active magmatic systems, to characterize the configuration of such systems, and to determine the extent and evolution of source regions of magmatic energy. Such understanding is critical to our assessment of eruptive behavior and its hazardous impacts. With the emergence of portable broadband seismic instrumentation, availability of digital networks with wide dynamic range, and development of new powerful analysis techniques, rapid progress is being made toward a synthesis of high-quality seismic data to develop a coherent model of eruption mechanics. Examples of recent advances are: (1) high-resolution tomography to image subsurface volcanic structures at scales of a few hundred meters; (2) use of small-aperture seismic antennas to map the spatio-temporal properties of long-period (LP) seismicity; (3) moment tensor inversions of very-long-period (VLP) data to derive the source geometry and mass-transport budget of magmatic fluids; (4) spectral analyses of LP events to determine the acoustic properties of magmatic and associated hydrothermal fluids; and (5) experimental modeling of the source dynamics of volcanic tremor. These promising advances provide new insights into the mechanical properties of volcanic fluids and subvolcanic mass-transport dynamics. As new seismic methods refine our understanding of seismic sources, and geochemical methods better constrain mass balance and magma behavior, we face new challenges in elucidating the physico-chemical processes that cause volcanic unrest and its seismic and gas-discharge manifestations. Much work remains to be done toward a synthesis of seismological, geochemical, and petrological observations into an integrated model of volcanic behavior.
Future important goals must include: (1) interpreting the key types of magma movement, degassing and boiling events that produce characteristic seismic phenomena; (2) characterizing multiphase fluids in subvolcanic

  15. Advanced time average holographic method for measurement in extensive vibration amplitude range with quantitative single-pixel analysis

    NASA Astrophysics Data System (ADS)

    Psota, Pavel; Lédl, Vít.; Vojtíšek, Petr; Václavík, Jan; Doleček, Roman; Mokrý, Pavel

    2015-05-01

    In this paper we propose a time-average digital holographic arrangement employing a frequency shift of the reference wave together with its phase modulation, resulting in the phase-modulated frequency-shifted time-average digital holography (PMFSTADH) method. This method has the potential to extend the currently used frequency-shifted time-average digital holography with the possibility of numerical analysis. It is primarily useful for the measurement of very large or very small vibration amplitudes. Moreover, we use acousto-optical modulators to realize the frequency shift as well as the phase modulation, so no additional hardware is needed in our experimental setup.

  16. Advances in Quantitative Analyses and Reference Materials Related to Laser Ablation ICP-MS: A Look at Methods and New Directions

    NASA Astrophysics Data System (ADS)

    Koenig, A. E.; Ridley, W. I.

    2009-12-01

    The role of laser ablation ICP-MS (LA-ICP-MS) continues to expand both in the geological sciences and in other fields. As the technique continues to gain popularity, so too does the need for good reference materials and for methods development and validation. Matrix-matched reference materials (RMs) are required for calibration and quality control of LA-ICP-MS analyses. New advances in technology such as <200 nm lasers and femtosecond lasers have reduced the dependence on matrix matching to some degree, but general matrix matching is still preferred. Much work has revolved around the available RMs such as the NIST 61x silicate glasses and several series of basaltic-composition glasses, such as the USGS natural basaltic glasses (e.g. BCR-2g) and the synthetic basaltic glasses of the GS series (e.g. GSD-1g). While many quantitative hurdles have been recognized by analogous techniques such as EPMA and SIMS, some of these hurdles have not been fully addressed or validated for some cases of LA-ICP-MS. Trace element mapping by LA-ICP-MS is rapidly becoming more widespread; relative differences in raw signal can be easily and rapidly obtained. However, as is too often the case, the magnitude of the relative differences in raw intensity is a function of different ablation yields, sample density or other factors. Methods of quantification for trace element mapping will be presented. The USGS has been developing microanalytical RMs intended for LA-ICP-MS for several years. The widely popular basaltic rock powders BCR-2, BIR-1 and BHVO-2 have all been successfully converted to homogeneous glasses suitable for LA-ICP-MS and have been in use by many workers. The newer synthetic basaltic glass GS series consists of four glasses of basaltic composition artificially doped with almost all trace elements at nominal concentrations of 400, 40, 4 and <1 ppm.
    Additional developments in non-silicate or basaltic materials include the previously released MASS-1 Cu, Fe, Zn sulfide calibration RM (Wilson et

  17. Advanced Usability Evaluation Methods

    DTIC Science & Technology

    2007-04-01

    Advanced Usability Evaluation Methods. Terence S. Andre, Lt Col, USAF; Margaret Schurig, Human Factors Design Specialist, The Boeing Co. [Report documentation page; the record cites, among others: Eye tracking in usability evaluation: A practitioner's guide. In J. Hyönä, R. Radach, & H. Deubel (Eds.), The Mind's Eye: Cognitive and Applied Aspects of Eye Movement Research.]

  18. Quantitative imaging methods in osteoporosis

    PubMed Central

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M. Carola

    2016-01-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research. PMID:28090446

  19. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  20. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods, and the interpretation of the results obtained by these methods, should be connected to the educational background. In this connecting process, issues of educational models are often raised. Many widely used statistical methods make no assumptions about the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are also other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumptions and parameter estimation, and are mathematically complicated. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods in physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments.
The dissertation includes three
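One widely used example of an assessment method that makes a psychological model assumption beyond the minimum mathematical model is the Rasch model from item response theory. Whether the dissertation uses Rasch specifically is an assumption; this is only a minimal sketch of the idea:

```python
import math

def rasch_probability(theta, b):
    """Rasch model: probability that a student with ability theta answers
    an item of difficulty b correctly, P = 1 / (1 + exp(-(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

Fitting theta and b to response data, rather than just counting scores, is what makes such methods "model-based" in the sense contrasted above with purely mathematical statistics.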

  1. Quantitative characterization of the protein contents of the exocrine pancreatic acinar cell by soft x-ray microscopy and advanced digital imaging methods

    SciTech Connect

    Loo, Jr., Billy W.

    2000-06-01

    The study of the exocrine pancreatic acinar cell has been central to the development of models of many cellular processes, especially of protein transport and secretion. Traditional methods used to examine this system have provided a wealth of qualitative information from which mechanistic models have been inferred. However they have lacked the ability to make quantitative measurements, particularly of the distribution of protein in the cell, information critical for grounding of models in terms of magnitude and relative significance. This dissertation describes the development and application of new tools that were used to measure the protein content of the major intracellular compartments in the acinar cell, particularly the zymogen granule. Soft x-ray microscopy permits image formation with high resolution and contrast determined by the underlying protein content of tissue rather than staining avidity. A sample preparation method compatible with x-ray microscopy was developed and its properties evaluated. Automatic computerized methods were developed to acquire, calibrate, and analyze large volumes of x-ray microscopic images of exocrine pancreatic tissue sections. Statistics were compiled on the protein density of several organelles, and on the protein density, size, and spatial distribution of tens of thousands of zymogen granules. The results of these measurements, and how they compare to predictions of different models of protein transport, are discussed.

  2. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have steadily increased in sensitivity, but also in complexity and cost. Fortunately, throughout its history the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, particularly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadow maps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position to the observed irradiance pattern. The present work addresses the largely unexplored problem of estimating the wavefront gradient from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by a Fourier-based algorithm to retrieve the mirror's actual surface profile. To the best of our knowledge, this is the first complete, mathematically grounded treatment of this optical phenomenon, complemented by an image-processing algorithm that allows quantitative calculation of the slope at any given point of the mirror's surface, making it possible to accurately estimate the aberrations present in the analyzed concave device directly from its foucaultgrams.

  3. Automated Quantitative Nuclear Cardiology Methods

    PubMed Central

    Motwani, Manish; Berman, Daniel S.; Germano, Guido; Slomka, Piotr J.

    2016-01-01

    Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps and estimate global and local measures of stress/rest perfusion – all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This chapter briefly reviews these techniques, highlights several challenges and discusses the latest developments. PMID:26590779

  4. Advanced NDE techniques for quantitative characterization of aircraft

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.

    1990-01-01

    Recent advances in nondestructive evaluation (NDE) at NASA Langley Research Center and their applications that have resulted in quantitative assessment of material properties based on thermal and ultrasonic measurements are reviewed. Specific applications include ultrasonic determination of bolt tension, ultrasonic and thermal characterization of bonded layered structures, characterization of composite materials, and disbonds in aircraft skins.

  5. Qualitative versus quantitative methods in psychiatric research.

    PubMed

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are regaining credibility after a period of being dismissed as merely "not being quantitative." Qualitative methods form a broad umbrella of research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of events using quantitative methods, whereas qualitative methods provide a broader understanding of, and more thorough reasoning about, those events; hence they are considered of special importance in psychiatry. Besides hypothesis generation in the earlier phases of research, qualitative methods can be employed in questionnaire design, the establishment of diagnostic criteria, feasibility studies, and studies of attitudes and beliefs. Animal research is another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be confirmed statistically with quantitative methods. The tendency to combine qualitative and quantitative methods as complementary has emerged over recent years; by applying both, scientists can take advantage of the interpretative character of qualitative methods as well as the experimental dimension of quantitative methods.

  6. Linear-After-The-Exponential (LATE)–PCR: An advanced method of asymmetric PCR and its uses in quantitative real-time analysis

    PubMed Central

    Sanchez, J. Aquiles; Pierce, Kenneth E.; Rice, John E.; Wangh, Lawrence J.

    2004-01-01

    Conventional asymmetric PCR is inefficient and difficult to optimize because limiting the concentration of one primer lowers its melting temperature below the reaction annealing temperature. Linear-After-The-Exponential (LATE)–PCR describes a new paradigm for primer design that renders assays as efficient as symmetric PCR assays, regardless of primer ratio. LATE-PCR generates single-stranded products with predictable kinetics for many cycles beyond the exponential phase. LATE-PCR also introduces new probe design criteria that uncouple hybridization probe detection from primer annealing and extension, increase probe reliability, improve allele discrimination, and increase signal strength by 80–250% relative to symmetric PCR. These improvements in PCR are particularly useful for real-time quantitative analysis of target numbers in small samples. LATE-PCR is adaptable to high throughput applications in fields such as clinical diagnostics, biodefense, forensics, and DNA sequencing. We showcase LATE-PCR via amplification of the cystic fibrosis CFΔ508 allele and the Tay-Sachs disease TSD 1278 allele from single heterozygous cells. PMID:14769930

  7. Quantitative Methods in Psychology: Inevitable and Useless

    PubMed Central

    Toomela, Aaro

    2010-01-01

    Science begins with the question: what do I want to know? Science becomes science, however, only when this question is justified and an appropriate methodology is chosen for answering it. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has instead accepted method as primary, and research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, based on Aristotelian thinking, and the associative-quantitative, based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for cause–effect relationships between events, with no possible access to the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199

  8. Inside Single Cells: Quantitative Analysis with Advanced Optics and Nanomaterials

    PubMed Central

    Cui, Yi; Irudayaraj, Joseph

    2014-01-01

    Single cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single cell activity. In order to obtain quantitative information (e.g. molecular quantity, kinetics and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single cell studies both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live cell analysis. Although a considerable proportion of single cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single cell analysis. PMID:25430077

  9. Quantitative Methods for Software Selection and Evaluation

    DTIC Science & Technology

    2006-09-01

    When performing a “buy” analysis and selecting a product as part of a software acquisition strategy, most organizations will consider primarily ...

  10. A new HPLC method for azithromycin quantitation.

    PubMed

    Zubata, Patricia; Ceresole, Rita; Rosasco, Maria Ana; Pizzorno, Maria Teresa

    2002-02-01

    A simple liquid chromatographic method was developed for the estimation of azithromycin as raw material and in pharmaceutical forms. The sample was chromatographed on a reversed-phase C18 column and the eluate was monitored at a wavelength of 215 nm. The method was accurate, precise, and sufficiently selective, and is applicable to quantitation, stability, and dissolution tests.

  11. Advances in Adaptive Control Methods

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan

    2009-01-01

    This poster presentation describes recent advances in adaptive control technology developed by NASA. Optimal Control Modification is a novel adaptive law that can improve performance and robustness of adaptive control systems. A new technique has been developed to provide an analytical method for computing time delay stability margin for adaptive control systems.

  12. Editorial: biotech methods and advances.

    PubMed

    Jungbauer, Alois

    2013-01-01

    This annual Methods and Advances Special Issue of Biotechnology Journal contains a selection of cutting-edge research and review articles with a particular emphasis on vertical process understanding – read more in this editorial by Prof. Alois Jungbauer, BTJ co-Editor-in-Chief.

  13. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  14. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
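The quantity described in this record, each voltage difference divided by the known antenna separation, is a finite-difference estimate of the field along the array. A minimal sketch under that reading (the function name and sample data are illustrative, not from the patent):

```python
def field_from_antenna_pairs(voltages, positions):
    """Estimate electric-field values along a 1-D antenna array.

    voltages  : measured antenna voltages (V), one per antenna
    positions : known antenna positions along the array axis (m)

    Returns one estimate (V/m) per adjacent antenna pair, computed as
    the voltage difference divided by the pair's known separation.
    """
    pairs = zip(voltages, voltages[1:], positions, positions[1:])
    return [(v2 - v1) / (x2 - x1) for v1, v2, x1, x2 in pairs]
```

For a uniform field the per-pair estimates agree; spatial variation of the field shows up as pair-to-pair differences across the region spanned by the array.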

  15. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates, and certain receptors can be used in the method for detection of potentially invasive tumors.

  16. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  17. Quantitative levels of immunoglobulin E in advanced tuberculosis.

    PubMed

    Casterline, C L; Evans, R; Ward, G W

    1976-07-01

    Quantitative levels of immunoglobulin E (IgE) were determined in samples of sera obtained from 29 patients with proven moderate to far-advanced tuberculosis. The sensitive radioimmunoassay test for IgE was used. Statistical analysis of the results revealed no difference in IgE values as compared to a control group of normal sera. In contrast to other chronic pulmonary infections, such as bronchopulmonary aspergillosis, the IgE level in pulmonary tuberculous infection is of no diagnostic significance. Simultaneous determination of levels of immunoglobulins G, A, M, and D (IgG, IgA, IgM, IgD) in these same sera by radial immunodiffusion showed elevated IgG and lowered IgM levels in the tuberculous patients, confirming previous studies. The significance of these alterations in immunoglobulin levels is unclear and may represent a secondary phenomenon rather than a primary host response.

  18. Quantitative methods in classical perturbation theory.

    NASA Astrophysics Data System (ADS)

    Giorgilli, A.

    Poincaré proved that the series commonly used in celestial mechanics are typically non-convergent, although their usefulness is generally evident. Recent work in perturbation theory has clarified this observation of Poincaré, bringing into evidence that the series of perturbation theory, although non-convergent in general, nevertheless furnish valuable approximations to the true orbits over very long times, which in some practical cases can be comparable with the age of the universe. The aim of the paper is to introduce the quantitative methods of perturbation theory that allow one to obtain such powerful results.
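The "valuable approximations over very long times" referred to here are usually stated as exponential stability estimates of Nekhoroshev type; a representative schematic form (the specific constants and exponents depend on the system, and this display is added for illustration, not taken from the paper) is:

```latex
\[
  \bigl| I(t) - I(0) \bigr| \le C \, \varepsilon^{b}
  \qquad \text{for} \qquad
  |t| \le T_0 \exp\!\left[ \left( \frac{\varepsilon^{*}}{\varepsilon} \right)^{a} \right],
\]
```

where \(I\) denotes the action variables of the unperturbed system, \(\varepsilon\) the size of the perturbation, and \(a\), \(b\), \(C\), \(T_0\), \(\varepsilon^{*}\) positive constants: the actions remain nearly constant for a time that grows exponentially as the perturbation shrinks.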

  19. [The Study of Advanced Fundamental Parameter Method in EDXRFA].

    PubMed

    Cheng, Feng; Zhang, Qing-xian; Ge, Liang-quan; Gu, Yi; Zeng, Guo-qiang; Luo, Yao-yao; Chen, Shuang; Wang, Lei; Zhao, Jian-kun

    2015-07-01

    X-ray fluorescence analysis (XRFA) is an important and efficient method for elemental analysis, used in geology, industry, and environmental protection. A drawback of XRFA is that the detection limit and accuracy are affected by the sample matrix. The fundamental parameter method is commonly used to calculate element contents in XRFA, and it is effective when the matrix and the net areas of the characteristic X-ray peaks are known, but this condition is not met in in-situ XRFA. Determining net peak areas and handling the "black material" of the sample are the key difficulties for the fundamental parameter method when energy-dispersive X-ray fluorescence analysis (EDXRFA) is applied to low-content samples. This paper discusses an advanced fundamental parameter method that combines spectrum analysis with the fundamental parameter calculation, inserting an overlapping-peak separation step into the iteration loop of the fundamental parameter method, so that the net areas and the quantitative analysis are resolved together. The advanced method was used to analyze standard samples; compared with the contents obtained from the coefficient method, the precision for Cu, Ni, and Zn is better. The results show that the advanced method improves the precision of EDXRFA and outperforms the coefficient method.
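At its core, the fundamental parameter method is a fixed-point iteration: assumed concentrations generate theoretical intensities, and each concentration is corrected by the ratio of measured to computed intensity until the estimates stabilize. A generic sketch of that loop (the intensity model is a caller-supplied stand-in for the full physics of matrix absorption and enhancement; this is not the authors' code):

```python
def fp_iterate(I_meas, intensity_model, max_iter=200, tol=1e-8):
    """Generic fundamental-parameter iteration for element concentrations.

    I_meas          : measured characteristic-line intensities, one per element
    intensity_model : function mapping a concentration list to theoretical
                      intensities (encapsulates all matrix effects)

    Starts from equal concentrations and applies the classic ratio update
    c_i <- c_i * I_meas_i / I_calc_i, renormalizing so the concentrations
    sum to 1, until successive estimates agree within tol.
    """
    n = len(I_meas)
    c = [1.0 / n] * n
    for _ in range(max_iter):
        I_calc = intensity_model(c)
        c_new = [ci * im / ic for ci, im, ic in zip(c, I_meas, I_calc)]
        s = sum(c_new)
        c_new = [ci / s for ci in c_new]
        if max(abs(a - b) for a, b in zip(c, c_new)) < tol:
            return c_new
        c = c_new
    return c
```

With a trivial model in which intensity is proportional to concentration (no matrix effect), the iteration recovers the measured intensity fractions directly; the value of the scheme is that the same loop converges for realistic, nonlinear intensity models.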

  20. Advance in orientation microscopy: quantitative analysis of nanocrystalline structures.

    PubMed

    Seyring, Martin; Song, Xiaoyan; Rettenmayr, Markus

    2011-04-26

    The special properties of nanocrystalline materials are generally accepted to be a consequence of the high density of planar defects (grain and twin boundaries) and their characteristics. However, until now, nanograin structures have not been characterized with similar detail and statistical relevance as coarse-grained materials, due to the lack of an appropriate method. In the present paper, a novel method based on quantitative nanobeam diffraction in transmission electron microscopy (TEM) is presented to determine the misorientation of adjacent nanograins and subgrains. Spatial resolution of <5 nm can be achieved. This method is applicable to characterize orientation relationships in wire, film, and bulk materials with nanocrystalline structures. As a model material, nanocrystalline Cu is used. Several important features of the nanograin structure are discovered utilizing quantitative analysis: the fraction of twin boundaries is substantially higher than that observed in bright-field images in the TEM; small angle grain boundaries are prominent; there is an obvious dependence of the grain boundary characteristics on grain size distribution and mean grain size.

  1. [Progress in stable isotope labeled quantitative proteomics methods].

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods; the latter have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, supporting rapid progress in biological research. In this work, we discuss progress in stable isotope labeling methods for both relative and absolute quantitative proteomics, and then give our opinions on the outlook for proteome quantification methods.

  2. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.
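The crux of the procedure is locating the dilution at which contaminants no longer attenuate the dye signal, that is, where fluorescence per unit dsDNA stops changing on further dilution. A hypothetical helper illustrating that stopping criterion (not the patent's apparatus or exact protocol):

```python
def first_unattenuated_dilution(fluorescence, dsdna_conc, rtol=0.05):
    """Return the index of the first dilution in a series whose
    fluorescence per unit dsDNA matches that of the next dilution
    within relative tolerance rtol, i.e. the point where further
    dilution no longer relieves contaminant attenuation.

    fluorescence : measured fluorescence intensity at each dilution
    dsdna_conc   : known dsDNA concentration at each dilution (decreasing)
    """
    ratios = [f / c for f, c in zip(fluorescence, dsdna_conc)]
    for i in range(len(ratios) - 1):
        if abs(ratios[i + 1] - ratios[i]) <= rtol * ratios[i + 1]:
            return i
    return len(ratios) - 1  # never plateaued: most dilute point
```

In the patented method, the absorbance of the solution at this plateau point is recorded as the threshold absorbance and reused for later samples from the same environment.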

  3. Computational and design methods for advanced imaging

    NASA Astrophysics Data System (ADS)

    Birch, Gabriel C.

    This dissertation merges the optical design and computational aspects of imaging systems to create novel devices that solve engineering problems in optical science, and attempts to expand the solution space available to the optical designer. It is divided into two parts: the first discusses a new active-illumination depth sensing modality, while the second discusses a passive-illumination approach called plenoptic, or lightfield, imaging. The new depth sensing modality introduced in part one is called depth through controlled aberration. This technique illuminates a target with a known, aberrated projected pattern and takes an image using a traditional, unmodified imaging system. Knowing how the added aberration in the projected pattern changes as a function of depth, we are able to quantitatively determine the depth of a series of points from the camera. A major advantage of this method is that the illumination and imaging axes can be coincident. Plenoptic cameras capture both spatial and angular data simultaneously. This dissertation presents a new set of parameters that permit the design and comparison of plenoptic devices outside the traditionally published plenoptic 1.0 and plenoptic 2.0 configurations. Additionally, a series of engineering advancements are presented, including full system raytraces of raw plenoptic images, Zernike compression techniques for raw image files, and non-uniform lenslet arrays to compensate for plenoptic system aberrations. Finally, a new snapshot imaging spectrometer based on the plenoptic configuration is proposed.

  4. Advances in Quantitative UV-Visible Spectroscopy for Clinical and Pre-clinical Application in Cancer

    PubMed Central

    Brown, J. Quincy; Vishwanath, Karthik; Palmer, Gregory M.; Ramanujam, Nirmala

    2009-01-01

    Summary Methods of optical spectroscopy which provide quantitative, physically or physiologically meaningful measures of tissue properties are an attractive tool for the study, diagnosis, prognosis, and treatment of various cancers. Recent development of methodologies to convert measured reflectance and fluorescence spectra from tissue to cancer-relevant parameters such as vascular volume, oxygenation, extracellular matrix extent, metabolic redox states, and cellular proliferation have significantly advanced the field of tissue optical spectroscopy. The number of publications reporting quantitative tissue spectroscopy results in the UV-visible wavelength range has increased sharply in the last 3 years, and includes new and emerging studies which correlate optically-measured parameters with independent measures such as immunohistochemistry, which should aid in increased clinical acceptance of these technologies. PMID:19268567

  5. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology.

    PubMed

    Counsell, Alyssa; Cribbie, Robert A; Harlow, Lisa L

    2016-08-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately, most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM; improving quantitative literacy in psychology is therefore an important step toward addressing them. The current paper includes two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians, with a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada.

  6. Advanced method for oligonucleotide deprotection

    PubMed Central

    Surzhikov, Sergey A.; Timofeev, Edward N.; Chernov, Boris K.; Golova, Julia B.; Mirzabekov, Andrei D.

    2000-01-01

    A new procedure for rapid deprotection of synthetic oligodeoxynucleotides has been developed. While all known deprotection methods require purification to remove the residual protective groups (e.g. benzamide) and insoluble silicates, the new procedure based on the use of an ammonia-free reagent mixture allows one to avoid the additional purification steps. The method can be applied to deprotect the oligodeoxynucleotides synthesized by using the standard protected nucleoside phosphoramidites dGiBu, dCBz and dABz. PMID:10734206

  7. Advanced Fine Particulate Characterization Methods

    SciTech Connect

    Steven Benson; Lingbu Kong; Alexander Azenkeng; Jason Laumb; Robert Jensen; Edwin Olson; Jill MacKenzie; A.M. Rokanuzzaman

    2007-01-31

    The characterization and control of emissions from combustion sources are of significant importance in improving local and regional air quality. Such emissions include fine particulate matter, organic carbon compounds, and NO{sub x} and SO{sub 2} gases, along with mercury and other toxic metals. This project involved four activities: Further Development of Analytical Techniques for PM{sub 10} and PM{sub 2.5} Characterization and Source Apportionment and Management; Organic Carbonaceous Particulate and Metal Speciation for Source Apportionment Studies; Quantum Modeling; and High-Potassium Carbon Production with Biomass-Coal Blending. The key accomplishments included the development of improved automated methods to characterize the inorganic and organic components of particulate matter. The methods involved the use of scanning electron microscopy and x-ray microanalysis for the inorganic fraction and a combination of extractive methods with near-edge x-ray absorption fine structure to characterize the organic fraction. These methods have direct application to source apportionment studies of PM because they provide detailed inorganic analysis along with total organic and elemental carbon (OC/EC) quantification. Quantum modeling using density functional theory (DFT) calculations was used to further elucidate a recently developed mechanistic model for mercury speciation in coal combustion systems and interactions on activated carbon. Reaction energies, enthalpies, free energies, and binding energies of Hg species to the prototype molecules were derived from the data obtained in these calculations. Bimolecular rate constants for the various elementary steps in the mechanism have been estimated using the hard-sphere collision theory approximation, and the results indicate that extremely fast kinetics could be involved in these surface reactions. Activated carbon was produced from a blend of lignite coal from the Center Mine in North Dakota and

  8. Recent advances in lattice Boltzmann methods

    SciTech Connect

    Chen, S.; Doolen, G.D.; He, X.; Nie, X.; Zhang, R.

    1998-12-31

    In this paper, the authors briefly present the basic principles of the lattice Boltzmann method and summarize recent advances, including applications of the method to fluid flows in MEMS and to simulation of multiphase mixing and turbulence.
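As a flavor of those basic principles, the following is a toy one-dimensional (D1Q3) lattice Boltzmann solver for pure diffusion on a periodic lattice; it illustrates the collide-and-stream cycle only, and is a hedged sketch rather than the multiphase or turbulence solvers discussed in the paper:

```python
def lbm_diffusion(rho0, tau=1.0, steps=100):
    """Minimal D1Q3 lattice Boltzmann solver for pure diffusion on a
    periodic 1-D lattice (diffusivity D = (tau - 0.5) / 3 in lattice units).

    rho0 : initial density profile (list of floats)
    """
    w = (2/3, 1/6, 1/6)                       # weights: rest, right, left
    f = [[wi * r for r in rho0] for wi in w]  # start at local equilibrium
    for _ in range(steps):
        rho = [sum(col) for col in zip(*f)]   # density = sum over populations
        for i, wi in enumerate(w):            # BGK collision toward feq = w*rho
            f[i] = [fi + (wi * r - fi) / tau for fi, r in zip(f[i], rho)]
        f[1] = f[1][-1:] + f[1][:-1]          # stream right-movers (+1 site)
        f[2] = f[2][1:] + f[2][:1]            # stream left-movers (-1 site)
    return [sum(col) for col in zip(*f)]
```

An initial density spike spreads out over time while total mass is conserved exactly, since both the collision step and the periodic streaming step preserve the sum of the populations.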

  9. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-05-03

    Prostate cancer is the leading malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at an early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiencies of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear, in particular the role of autophagy in the death response of prostate cancer cells after ADI treatment. In order to address this question, we required an experimental method to measure the level and extent of the autophagic response in cells; since there are no known molecular markers that accurately track this process, we chose to develop an imaging-based approach using quantitative 3D fluorescence microscopy(11,12).
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  10. Advances in Quantitative Proteomics of Microbes and Microbial Communities

    NASA Astrophysics Data System (ADS)

    Waldbauer, J.; Zhang, L.; Rizzo, A. I.

    2015-12-01

    Quantitative measurements of gene expression are key to developing a mechanistic, predictive understanding of how microbial metabolism drives many biogeochemical fluxes and responds to environmental change. High-throughput RNA-sequencing can afford a wealth of information about transcript-level expression patterns, but it is becoming clear that expression dynamics are often very different at the protein level where biochemistry actually occurs. These divergent dynamics between levels of biological organization necessitate quantitative proteomic measurements to address many biogeochemical questions. The protein-level expression changes that underlie shifts in the magnitude, or even the direction, of metabolic and biogeochemical fluxes can be quite subtle and test the limits of current quantitative proteomics techniques. Here we describe methodologies for high-precision, whole-proteome quantification that are applicable to both model organisms of biogeochemical interest that may not be genetically tractable, and to complex community samples from natural environments. Employing chemical derivatization of peptides with multiple isotopically-coded tags, this strategy is rapid and inexpensive, can be implemented on a wide range of mass spectrometric instrumentation, and is relatively insensitive to chromatographic variability. We demonstrate the utility of this quantitative proteomics approach in application to both isolates and natural communities of sulfur-metabolizing and photosynthetic microbes.

  11. Breeding and quantitative genetics advances in sunflower Sclerotinia research

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Genetic research of the sunflower research unit, USDA-ARS, in Fargo, ND, was discussed in a presentation to a group of producers, industry representatives, and scientists. The need for sunflower quantitative genetics research to find and capture Sclerotinia resistance is increasing with every year t...

  12. From themes to hypotheses: following up with quantitative methods.

    PubMed

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field.

  13. Noninvasive Characterization of Locally Advanced Breast Cancer Using Textural Analysis of Quantitative Ultrasound Parametric Images

    PubMed Central

    Tadayyon, Hadi; Sadeghi-Naini, Ali; Czarnota, Gregory J.

    2014-01-01

    PURPOSE: The identification of tumor pathologic characteristics is an important part of breast cancer diagnosis, prognosis, and treatment planning but currently requires biopsy as its standard. Here, we investigated a noninvasive quantitative ultrasound method for the characterization of breast tumors in terms of their histologic grade, which can be used with clinical diagnostic ultrasound data. METHODS: Tumors of 57 locally advanced breast cancer patients were analyzed as part of this study. Seven quantitative ultrasound parameters were determined from each tumor region from the radiofrequency data, including mid-band fit, spectral slope, 0-MHz intercept, scatterer spacing, attenuation coefficient estimate, average scatterer diameter, and average acoustic concentration. Parametric maps were generated corresponding to the region of interest, from which four textural features, including contrast, energy, homogeneity, and correlation, were determined as further tumor characterization parameters. Data were examined on the basis of tumor subtypes based on histologic grade (grade I versus grade II to III). RESULTS: Linear discriminant analysis of the means of the parametric maps resulted in classification accuracy of 79%. On the other hand, the linear combination of the texture features of the parametric maps resulted in classification accuracy of 82%. Finally, when both the means and textures of the parametric maps were combined, the best classification accuracy was obtained (86%). CONCLUSIONS: Textural characteristics of quantitative ultrasound spectral parametric maps provided discriminant information about different types of breast tumors. The use of texture features significantly improved the results of ultrasonic tumor characterization compared to conventional mean values. Thus, this study suggests that texture-based quantitative ultrasound analysis of in vivo breast tumors can provide complementary diagnostic information about tumor histologic characteristics
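    The four texture features named above (contrast, energy, homogeneity, correlation) are standard gray-level co-occurrence matrix (GLCM) statistics. A minimal sketch of their computation on a synthetic quantized parametric map - not the authors' implementation - might look like:

```python
# GLCM texture features on a synthetic quantized parametric map.
# The map, number of gray levels, and single horizontal offset are
# illustrative assumptions, not the study's configuration.
import numpy as np

rng = np.random.default_rng(3)
levels = 8
img = rng.integers(0, levels, size=(64, 64))    # quantized parametric map

# Gray-level co-occurrence matrix for horizontal neighbor pairs
glcm = np.zeros((levels, levels))
for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
    glcm[a, b] += 1
glcm /= glcm.sum()                               # joint probabilities

i, j = np.indices(glcm.shape)
contrast    = ((i - j) ** 2 * glcm).sum()
energy      = (glcm ** 2).sum()
homogeneity = (glcm / (1.0 + np.abs(i - j))).sum()

mu_i, mu_j = (i * glcm).sum(), (j * glcm).sum()
sd_i = np.sqrt(((i - mu_i) ** 2 * glcm).sum())
sd_j = np.sqrt(((j - mu_j) ** 2 * glcm).sum())
correlation = (((i - mu_i) * (j - mu_j) * glcm).sum()) / (sd_i * sd_j)
```

    In practice the co-occurrences would typically be accumulated over several offsets and angles and the resulting features averaged.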

  14. Machine Learning methods for Quantitative Radiomic Biomarkers

    PubMed Central

    Parmar, Chintan; Grossmann, Patrick; Bussink, Johan; Lambin, Philippe; Aerts, Hugo J. W. L.

    2015-01-01

    Radiomics extracts and mines large numbers of medical imaging features quantifying tumor phenotypic characteristics. Highly accurate and reliable machine-learning approaches can drive the success of radiomic applications in clinical care. In this radiomic study, fourteen feature selection methods and twelve classification methods were examined in terms of their performance and stability for predicting overall survival. A total of 440 radiomic features were extracted from pre-treatment computed tomography (CT) images of 464 lung cancer patients. To ensure the unbiased evaluation of different machine-learning methods, publicly available implementations along with reported parameter configurations were used. Furthermore, we used two independent radiomic cohorts for training (n = 310 patients) and validation (n = 154 patients). We identified that the Wilcoxon test based feature selection method WLCX (stability = 0.84 ± 0.05, AUC = 0.65 ± 0.02) and the random forest classification method RF (RSD = 3.52%, AUC = 0.66 ± 0.03) had the highest prognostic performance with high stability against data perturbation. Our variability analysis indicated that the choice of classification method is the most dominant source of performance variation (34.21% of total variance). Identification of optimal machine-learning methods for radiomic applications is a crucial step towards stable and clinically relevant radiomic biomarkers, providing a non-invasive way of quantifying and monitoring tumor-phenotypic characteristics in clinical practice. PMID:26278466

  15. Machine Learning methods for Quantitative Radiomic Biomarkers.

    PubMed

    Parmar, Chintan; Grossmann, Patrick; Bussink, Johan; Lambin, Philippe; Aerts, Hugo J W L

    2015-08-17

    Radiomics extracts and mines large numbers of medical imaging features quantifying tumor phenotypic characteristics. Highly accurate and reliable machine-learning approaches can drive the success of radiomic applications in clinical care. In this radiomic study, fourteen feature selection methods and twelve classification methods were examined in terms of their performance and stability for predicting overall survival. A total of 440 radiomic features were extracted from pre-treatment computed tomography (CT) images of 464 lung cancer patients. To ensure the unbiased evaluation of different machine-learning methods, publicly available implementations along with reported parameter configurations were used. Furthermore, we used two independent radiomic cohorts for training (n = 310 patients) and validation (n = 154 patients). We identified that the Wilcoxon test based feature selection method WLCX (stability = 0.84 ± 0.05, AUC = 0.65 ± 0.02) and the random forest classification method RF (RSD = 3.52%, AUC = 0.66 ± 0.03) had the highest prognostic performance with high stability against data perturbation. Our variability analysis indicated that the choice of classification method is the most dominant source of performance variation (34.21% of total variance). Identification of optimal machine-learning methods for radiomic applications is a crucial step towards stable and clinically relevant radiomic biomarkers, providing a non-invasive way of quantifying and monitoring tumor-phenotypic characteristics in clinical practice.
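    As a rough illustration of the study's top-performing combination - Wilcoxon-test feature selection followed by a random-forest classifier - the sketch below uses purely synthetic data (so the resulting AUC is meaningless, near 0.5); the choice of 30 retained features and all variable names are illustrative assumptions:

```python
# Wilcoxon-based feature ranking + random forest, sketched on synthetic
# data with the cohort sizes and feature count from the abstract.
import numpy as np
from scipy.stats import ranksums
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_train, n_valid, n_features = 310, 154, 440
X_train = rng.normal(size=(n_train, n_features))
y_train = rng.integers(0, 2, size=n_train)      # survival label (synthetic)
X_valid = rng.normal(size=(n_valid, n_features))
y_valid = rng.integers(0, 2, size=n_valid)

# Rank features by Wilcoxon rank-sum p-value between outcome groups
pvals = np.array([ranksums(X_train[y_train == 0, j],
                           X_train[y_train == 1, j]).pvalue
                  for j in range(n_features)])
top = np.argsort(pvals)[:30]                    # keep the 30 most discriminative

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train[:, top], y_train)
auc = roc_auc_score(y_valid, clf.predict_proba(X_valid[:, top])[:, 1])
```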

  16. Quantitative Hydrocarbon Energies from the PMO Method.

    ERIC Educational Resources Information Center

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  17. Methods and challenges in quantitative imaging biomarker development.

    PubMed

    Abramson, Richard G; Burton, Kirsteen R; Yu, John-Paul J; Scalzetti, Ernest M; Yankeelov, Thomas E; Rosenkrantz, Andrew B; Mendiratta-Lala, Mishal; Bartholmai, Brian J; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M

    2015-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This article, drafted by the Association of University Radiologists Radiology Research Alliance Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field.

  18. Review of Quantitative Software Reliability Methods

    SciTech Connect

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of digital systems

  19. Advanced radioactive waste assay methods: Final report

    SciTech Connect

    Cline, J.E.; Robertson, D.E.; DeGroot, S.E.

    1987-11-01

    This report describes an evaluation of advanced methodologies for the radioassay of power-plant low-level radioactive waste for compliance with the 10CFR61 classification rules. The project evaluated current assay practices in ten operating plants and identified areas where advanced methods would apply, studied two direct-assay methodologies, demonstrated these two techniques on radwaste in four operating plants and on irradiated components in two plants, and developed techniques for obtaining small representative aliquots from larger samples and for enhancing the (144)Ce activity analysis in samples of waste. The study demonstrated the accuracy, practicality, and ALARA aspects of advanced methods and indicates that cost savings resulting from the accuracy improvement and the reduction in sampling requirements can be significant. 24 refs., 60 figs., 67 tabs.

  20. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  1. Chemoenzymatic method for glycomics: isolation, identification, and quantitation

    PubMed Central

    Yang, Shuang; Rubin, Abigail; Eshghi, Shadi Toghi; Zhang, Hui

    2015-01-01

    Over the past decade, considerable progress has been made with respect to analytical methods for the analysis of glycans from biological sources. Regardless of the specific methods that are used, glycan analysis includes isolation, identification, and quantitation. Derivatization is indispensable for improving glycan identification. Derivatization of glycans can be performed by permethylation or by carbodiimide coupling/esterification. By introducing a fluorophore or chromophore at their reducing end, glycans can be separated by electrophoresis or chromatography. The fluorogenically labeled glycans can be quantitated using fluorescent detection. Recently developed solid-phase approaches, such as glycoprotein immobilization for glycan extraction and on-tissue glycan mass spectrometry imaging, demonstrate advantages over methods performed in solution. Derivatization of sialic acids is favorably implemented on the solid support using carbodiimide coupling, and the released glycans can be further modified at the reducing end or permethylated for quantitative analysis. In this review, methods for glycan isolation, identification, and quantitation are discussed. PMID:26390280

  2. Method for quantitating sensitivity to a staphylococcal bacteriocin.

    PubMed Central

    Van Norman, G; Groman, N

    1979-01-01

    A convenient method for quantitating the sensitivity of large numbers of bacterial strains (presently Corynebacterium diphtheriae) to a Staphylococcus aureus phage type 71 bacteriocin is described. PMID:121117

  3. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, Frank A.

    1982-01-01

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  4. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, F.A.

    1980-12-12

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  5. Quantitative methods to study epithelial morphogenesis and polarity.

    PubMed

    Aigouy, B; Collinet, C; Merkel, M; Sagner, A

    2017-01-01

    Morphogenesis of an epithelial tissue emerges from the behavior of its constituent cells, including changes in shape, rearrangements, and divisions. In many instances the directionality of these cellular events is controlled by the polarized distribution of specific molecular components. In recent years, our understanding of morphogenesis and polarity highly benefited from advances in genetics, microscopy, and image analysis. They now make it possible to measure cellular dynamics and polarity with unprecedented precision for entire tissues throughout their development. Here we review recent approaches to visualize and measure cell polarity and tissue morphogenesis. The chapter is organized like an experiment. We first discuss the choice of cell and polarity reporters and describe the use of mosaics to reveal hidden cell polarities or local morphogenetic events. Then, we outline application-specific advantages and disadvantages of different microscopy techniques and image projection algorithms. Next, we present methods to extract cell outlines to measure cell polarity and detect cellular events underlying morphogenesis. Finally, we bridge scales by presenting approaches to quantify the specific contribution of each cellular event to global tissue deformation. Taken together, we provide an in-depth description of available tools and theoretical concepts to quantitatively study cell polarity and tissue morphogenesis over multiple scales.

  6. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    The development of parametric cost estimating methods for advanced space systems in the conceptual design phase is discussed. The process of identifying variables which drive cost and the relationship between weight and cost are discussed. A theoretical model of cost is developed and tested using a historical data base of research and development projects.
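    A common parametric form for the weight-cost relationship described here is a power-law cost-estimating relationship (CER), cost = a·weight^b, fit by least squares in log space. The data points below are hypothetical, purely for illustration:

```python
# Power-law CER fit: log(cost) = log(a) + b*log(weight), by ordinary
# least squares. All data points are hypothetical.
import math

weights = [500, 1200, 2500, 4000, 8000]   # dry mass, kg (hypothetical)
costs   = [40, 85, 150, 220, 380]         # R&D cost, $M (hypothetical)

lx = [math.log(w) for w in weights]
ly = [math.log(c) for c in costs]
n = len(lx)
mx, my = sum(lx) / n, sum(ly) / n
b = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / \
    sum((x - mx) ** 2 for x in lx)
a = math.exp(my - b * mx)

predicted = a * 3000 ** b   # cost estimate for a hypothetical 3000 kg system
```

    An exponent b below 1 reflects the economies of scale often assumed in such models: cost grows more slowly than mass.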

  7. ABRF-PRG07: Advanced Quantitative Proteomics Study

    PubMed Central

    Falick, Arnold M.; Lane, William S.; Lilley, Kathryn S.; MacCoss, Michael J.; Phinney, Brett S.; Sherman, Nicholas E.; Weintraub, Susan T.; Witkowska, H. Ewa; Yates, Nathan A.

    2011-01-01

    A major challenge for core facilities is determining quantitative protein differences across complex biological samples. Although there are numerous techniques in the literature for relative and absolute protein quantification, the majority is nonroutine and can be challenging to carry out effectively. There are few studies comparing these technologies in terms of their reproducibility, accuracy, and precision, and no studies to date deal with performance across multiple laboratories with varied levels of expertise. Here, we describe an Association of Biomolecular Resource Facilities (ABRF) Proteomics Research Group (PRG) study based on samples composed of a complex protein mixture into which 12 known proteins were added at varying but defined ratios. All of the proteins were present at the same concentration in each of three tubes that were provided. The primary goal of this study was to allow each laboratory to evaluate its capabilities and approaches with regard to: detection and identification of proteins spiked into samples that also contain complex mixtures of background proteins and determination of relative quantities of the spiked proteins. The results returned by 43 participants were compiled by the PRG, which also collected information about the strategies used to assess overall performance and as an aid to development of optimized protocols for the methodologies used. The most accurate results were generally reported by the most experienced laboratories. Among laboratories that used the same technique, values that were closer to the expected ratio were obtained by more experienced groups. PMID:21455478

  8. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the use of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess, and characterize muscle degeneration. PMID:27478562
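    The tissue-association step described above can be sketched as voxel classification by HU thresholds. The HU ranges below are assumptions for this illustration, not the values used by the authors:

```python
# Illustrative voxel classification of a muscle ROI by Hounsfield Unit
# (HU) ranges. ROI values are synthetic; the thresholds are assumptions.
import numpy as np

rng = np.random.default_rng(1)
muscle_roi = rng.uniform(-200, 100, size=10_000)   # synthetic HU values

ranges = {
    "fat":              (-200, -10),   # adipose tissue
    "loose_connective": (-10, 40),     # loose connective / atrophic muscle
    "normal_muscle":    (40, 100),     # normal muscle, fascia, tendon
}

composition = {}
for tissue, (lo, hi) in ranges.items():
    mask = (muscle_roi >= lo) & (muscle_roi < hi)
    composition[tissue] = 100.0 * mask.mean()      # % of ROI volume

mean_hu = float(muscle_roi.mean())                 # average HU of the ROI
```

    Reported per-muscle results would then combine the average HU with these composition percentages relative to total muscle volume.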

  9. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation criteria (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  10. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of the quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods.
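    The mathematical combination of independent uncertainty components referred to above is conventionally a root-sum-of-squares, as in standard uncertainty propagation. The component values in this sketch are hypothetical, chosen only to illustrate the arithmetic:

```python
# Root-sum-of-squares combination of independent relative-uncertainty
# components (values are hypothetical, in percent).
import math

components = {
    "microorganism_type": 18.0,
    "product_matrix":     20.0,
    "reading_error":      15.0,
}

combined = math.sqrt(sum(u ** 2 for u in components.values()))
```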

  11. Advances in Surface Plasmon Resonance Imaging enable quantitative measurement of laterally heterogeneous coatings of nanoscale thickness

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2013-03-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to the optical properties of nanoscale coatings on thin metallic surfaces, for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases - uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. We first demonstrate the directionally heterogeneous nature of the SPR phenomenon using a directionally ordered sample, then show how this allows for the calculation of the average coverage of a heterogeneous sample. Finally, the degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an example of the capabilities of the SPR imaging system.
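    One simple way to frame the average-coverage calculation is as linear mixing of reflectivity curves, R_mix(θ) = f·R_film(θ) + (1−f)·R_bare(θ), solved for the coverage fraction f by least squares. The curves below are synthetic stand-ins, not measured SPR data, and this is a generic illustration rather than the authors' refined analysis:

```python
# Least-squares coverage estimate from a linear mix of two reference
# SPR reflectivity curves. All curves are synthetic (Lorentzian dips).
import numpy as np

theta = np.linspace(40, 50, 200)          # incidence angle, degrees

def dip(theta, center, width):
    """Synthetic resonance dip standing in for a measured SPR curve."""
    return 1.0 - 0.8 / (1.0 + ((theta - center) / width) ** 2)

r_bare = dip(theta, 44.0, 0.5)            # bare-gold reference
r_film = dip(theta, 45.5, 0.5)            # fully covered reference
f_true = 0.3
r_mix = f_true * r_film + (1 - f_true) * r_bare   # heterogeneous sample

# Closed-form least-squares solution for f
num = np.dot(r_mix - r_bare, r_film - r_bare)
den = np.dot(r_film - r_bare, r_film - r_bare)
f_est = num / den
```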

  12. Quantitative (177)Lu SPECT imaging using advanced correction algorithms in non-reference geometry.

    PubMed

    D'Arienzo, M; Cozzella, M L; Fazio, A; De Felice, P; Iaccarino, G; D'Andrea, M; Ungania, S; Cazzato, M; Schmidt, K; Kimiaei, S; Strigari, L

    2016-12-01

    Peptide receptor therapy with (177)Lu-labelled somatostatin analogues is a promising tool in the management of patients with inoperable or metastasized neuroendocrine tumours. The aim of this work was to perform accurate activity quantification of (177)Lu in a complex anthropomorphic geometry using advanced correction algorithms. Acquisitions were performed on the higher (177)Lu photopeak (208 keV) using a Philips IRIX gamma camera provided with medium-energy collimators. System calibration was performed using a 16 mL Jaszczak sphere surrounded by non-radioactive water. Attenuation correction was performed using μ-maps derived from CT data, while scatter and septal penetration corrections were performed using the transmission-dependent convolution-subtraction method. SPECT acquisitions were finally corrected for dead time and partial volume effects. Image analysis was performed using the commercial QSPECT software. The quantitative SPECT approach was validated on an anthropomorphic phantom provided with a home-made insert simulating a hepatic lesion. Quantitative accuracy was studied using three tumour-to-background activity concentration ratios (6:1, 9:1, 14:1). For all acquisitions, the recovered total activity was within 12% of the calibrated activity both in the background region and in the tumour. Using a 6:1 tumour-to-background ratio, the recovered total activity was within 2% in the tumour and within 5% in the background. Partial volume effects, if not properly accounted for, can lead to significant activity underestimations in clinical conditions. In conclusion, accurate activity quantification of (177)Lu can be obtained if activity measurements are performed with equipment traceable to primary standards, advanced correction algorithms are used, and acquisitions are performed at the 208 keV photopeak using medium-energy collimators.
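    Partial-volume correction is commonly implemented with volume-dependent recovery coefficients (RC) measured on a phantom calibration; the abstract does not state the exact scheme used, so the sketch below is a generic illustration with hypothetical calibration values:

```python
# Generic partial-volume correction: divide the measured activity in a
# small lesion by an interpolated recovery coefficient. The calibration
# table and lesion values are hypothetical.
import numpy as np

cal_volumes = np.array([1, 2, 4, 8, 16, 32])          # sphere volume, mL
cal_rc      = np.array([0.35, 0.50, 0.65, 0.80, 0.90, 0.95])

def partial_volume_correct(measured_mbq, volume_ml):
    """Correct a measured activity using an interpolated RC."""
    rc = np.interp(volume_ml, cal_volumes, cal_rc)
    return measured_mbq / rc

corrected = partial_volume_correct(42.0, 10.0)   # ~10 mL lesion (hypothetical)
```

    Small volumes have low RCs, so the uncorrected activity can be badly underestimated, consistent with the caution in the abstract.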

  13. Current methods and advances in bone densitometry

    NASA Technical Reports Server (NTRS)

    Guglielmi, G.; Gluer, C. C.; Majumdar, S.; Blunt, B. A.; Genant, H. K.

    1995-01-01

    Bone mass is the primary, although not the only, determinant of fracture. Over the past few years a number of noninvasive techniques have been developed to more sensitively quantitate bone mass. These include single and dual photon absorptiometry (SPA and DPA), single and dual X-ray absorptiometry (SXA and DXA) and quantitative computed tomography (QCT). While differing in anatomic sites measured and in their estimates of precision, accuracy, and fracture discrimination, all of these methods provide clinically useful measurements of skeletal status. It is the intent of this review to discuss the pros and cons of these techniques and to present the new applications of ultrasound (US) and magnetic resonance (MRI) in the detection and management of osteoporosis.

  14. Editorial: Latest methods and advances in biotechnology.

    PubMed

    Lee, Sang Yup; Jungbauer, Alois

    2014-01-01

    The latest "Biotech Methods and Advances" special issue of Biotechnology Journal continues the BTJ tradition of featuring the latest breakthroughs in biotechnology. The special issue is edited by our Editors-in-Chief, Prof. Sang Yup Lee and Prof. Alois Jungbauer and covers a wide array of topics in biotechnology, including the perennial favorite workhorses of the biotech industry, Chinese hamster ovary (CHO) cell and Escherichia coli.

  15. Applying Quantitative Genetic Methods to Primate Social Behavior

    PubMed Central

    Brent, Lauren J. N.

    2013-01-01

    Increasingly, behavioral ecologists have applied quantitative genetic methods to investigate the evolution of behaviors in wild animal populations. The promise of quantitative genetics in unmanaged populations opens the door for simultaneous analysis of inheritance, phenotypic plasticity, and patterns of selection on behavioral phenotypes all within the same study. In this article, we describe how quantitative genetic techniques provide studies of the evolution of behavior with information that is unique and valuable. We outline technical obstacles to applying quantitative genetic techniques that are of particular relevance to studies of behavior in primates, especially those living in noncaptive populations (e.g., the need for pedigree information and the prevalence of non-Gaussian phenotypes), and demonstrate that many of these barriers are now surmountable. We illustrate this by applying recent quantitative genetic methods to spatial proximity data, a simple and widely collected primate social behavior, from adult rhesus macaques on Cayo Santiago. Our analysis shows that proximity measures are consistent across repeated measurements on individuals (repeatable) and that kin have similar mean measurements (heritable). Quantitative genetics may hold lessons of considerable importance for studies of primate behavior, even those without a specific genetic focus. PMID:24659839
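    Repeatability of a behavioral measure, as reported above, is commonly estimated from one-way ANOVA variance components (the Lessells-and-Boag approach). This sketch uses synthetic proximity-like scores, not the Cayo Santiago data:

```python
# Repeatability via one-way ANOVA variance components on synthetic data:
# 50 individuals, 4 repeated measurements each. True repeatability is
# 1 / (1 + 0.5**2) = 0.8 by construction.
import numpy as np

rng = np.random.default_rng(2)
n_ind, n_rep = 50, 4
ind_effects = rng.normal(0, 1.0, size=n_ind)            # between-individual sd = 1
obs = ind_effects[:, None] + rng.normal(0, 0.5, size=(n_ind, n_rep))

group_means = obs.mean(axis=1)
grand_mean = obs.mean()
ms_among = n_rep * ((group_means - grand_mean) ** 2).sum() / (n_ind - 1)
ms_within = ((obs - group_means[:, None]) ** 2).sum() / (n_ind * (n_rep - 1))

s2_among = (ms_among - ms_within) / n_rep               # between-individual variance
repeatability = s2_among / (s2_among + ms_within)
```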

  16. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.
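    A minimal sketch of the Swendsen-Wang cluster update that the report's framework generalizes, for the ferromagnetic 2D Ising model with periodic boundaries; the lattice size, coupling (J = 1), and temperature are illustrative.

```python
import math
import random

def swendsen_wang_step(spins, L, beta, rng):
    """One Swendsen-Wang cluster update for the 2D Ising model (J = 1)."""
    p_bond = 1.0 - math.exp(-2.0 * beta)      # bond probability between aligned spins
    parent = list(range(L * L))               # union-find forest over lattice sites

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]     # path halving
            i = parent[i]
        return i

    for x in range(L):
        for y in range(L):
            i = x * L + y
            # right and down neighbours, periodic boundaries
            for j in (((x + 1) % L) * L + y, x * L + (y + 1) % L):
                if spins[i] == spins[j] and rng.random() < p_bond:
                    ri, rj = find(i), find(j)
                    if ri != rj:
                        parent[ri] = rj

    # flip each cluster independently with probability 1/2
    flip = {}
    for i in range(L * L):
        r = find(i)
        if r not in flip:
            flip[r] = rng.random() < 0.5
        if flip[r]:
            spins[i] = -spins[i]

rng = random.Random(0)
L = 8
spins = [rng.choice((-1, 1)) for _ in range(L * L)]
for _ in range(10):
    swendsen_wang_step(spins, L, beta=0.6, rng=rng)
```

    Flipping whole clusters rather than single spins is what lets the algorithm avoid critical slowing down near the phase transition.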

  17. Robust quantitative parameter estimation by advanced CMP measurements for vadose zone hydrological studies

    NASA Astrophysics Data System (ADS)

    Koyama, C.; Wang, H.; Khuut, T.; Kawai, T.; Sato, M.

    2015-12-01

    Soil moisture plays a crucial role in understanding processes in vadose zone hydrology. In the last two decades, ground penetrating radar (GPR) has been widely discussed as a nondestructive measurement technique for soil moisture data. In particular, the common mid-point (CMP) technique, which has been used in both seismic and GPR surveys to investigate vertical velocity profiles, has very high potential for quantitative observations from the root zone down to the groundwater aquifer. However, its use is still rather limited today, and algorithms for robust quantitative parameter estimation are lacking. In this study we develop an advanced processing scheme for operational soil moisture retrieval at various depths. Using improved signal processing together with a combined semblance / non-normalized cross-correlation sum stacking approach and the Dix formula, the interval velocities for multiple soil layers are obtained from the RMS velocities, allowing for more accurate estimation of the permittivity at the reflecting point. The presence of a water-saturated layer, such as a groundwater aquifer, can be easily identified by its RMS velocity owing to the high contrast with the unsaturated zone. By using a new semi-automated measurement technique, the acquisition time for a full CMP gather with 1 cm intervals along a 10 m profile can be reduced significantly, to under 2 minutes. The method is tested and validated under laboratory conditions in a sand pit as well as on agricultural fields and beach sand in the Sendai city area. Comparison between CMP estimates and TDR measurements yields very good agreement, with an RMSE of 1.5 Vol.-%. The accuracy of depth estimation is validated, with errors smaller than 2%. Finally, we demonstrate application of the method at a test site in semi-arid Mongolia, namely the Orkhon River catchment in Bulgan, using commercial 100 MHz and 500 MHz RAMAC GPR antennas.
The results demonstrate the suitability of the proposed method for
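    The Dix conversion from RMS to interval velocities mentioned in the abstract, followed by the permittivity estimate at the reflector, can be sketched as follows; the travel times and velocities are invented, not the paper's data.

```python
C = 0.2998  # speed of light in vacuum, m/ns

def dix_interval_velocity(t1, v1, t2, v2):
    """Interval velocity between two reflectors from RMS velocities (Dix formula)."""
    return (((t2 * v2 ** 2) - (t1 * v1 ** 2)) / (t2 - t1)) ** 0.5

def permittivity(v):
    """Relative permittivity of a low-loss soil layer from its EM wave velocity."""
    return (C / v) ** 2

# two reflectors: two-way travel times (ns) and RMS velocities (m/ns)
v_int = dix_interval_velocity(20.0, 0.12, 45.0, 0.10)
eps = permittivity(v_int)
```

    The drop in interval velocity between the layers maps to a higher permittivity, which in turn indicates higher water content via a petrophysical relation such as Topp's equation.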

  18. Advanced Bayesian Method for Planetary Surface Navigation

    NASA Technical Reports Server (NTRS)

    Center, Julian

    2015-01-01

    Autonomous Exploration, Inc., has developed an advanced Bayesian statistical inference method that leverages current computing technology to produce a highly accurate surface navigation system. The method combines dense stereo vision and high-speed optical flow to implement visual odometry (VO) to track faster rover movements. The Bayesian VO technique improves performance by using all image information rather than corner features only. The method determines what can be learned from each image pixel and weighs the information accordingly. This capability improves performance in shadowed areas that yield only low-contrast images. The error characteristics of the visual processing are complementary to those of a low-cost inertial measurement unit (IMU), so the combination of the two capabilities provides highly accurate navigation. The method increases NASA mission productivity by enabling faster rover speeds with high accuracy. On Earth, the technology will permit operation of robots and autonomous vehicles in areas where the Global Positioning System (GPS) is degraded or unavailable.
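    A toy illustration of why complementary error characteristics pay off: an inverse-variance weighted fusion of a VO position estimate with an IMU-derived one. This is a stand-in under invented noise figures, not the company's actual filter.

```python
def fuse(est_vo, var_vo, est_imu, var_imu):
    """Inverse-variance weighted fusion of two independent position estimates."""
    w = var_imu / (var_vo + var_imu)                 # weight on the VO estimate
    est = w * est_vo + (1.0 - w) * est_imu
    var = (var_vo * var_imu) / (var_vo + var_imu)    # fused variance shrinks
    return est, var

# invented numbers: VO is sharper here; the IMU keeps working when contrast drops
est, var = fuse(10.2, 0.04, 10.6, 0.16)
```

    The fused variance is always smaller than either input variance, and whichever sensor is more reliable at the moment dominates the estimate.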

  19. Wave propagation models for quantitative defect detection by ultrasonic methods

    NASA Astrophysics Data System (ADS)

    Srivastava, Ankit; Bartoli, Ivan; Coccia, Stefano; Lanza di Scalea, Francesco

    2008-03-01

    Ultrasonic guided wave testing necessitates quantitative, rather than qualitative, information on flaw size, shape and position. This quantitative diagnosis ability can be used to provide meaningful data to a prognosis algorithm for remaining-life prediction, or simply to generate data sets for a statistical defect classification algorithm. Quantitative diagnostics needs models able to represent the interaction of guided waves with various defect scenarios. One such model is the Global-Local (GL) method, which uses a full finite element discretization of the region around a flaw to properly represent wave diffraction, and a suitable set of wave functions to simulate regions away from the flaw. Displacement and stress continuity conditions are imposed at the boundary between the global and the local regions. In this paper the GL method is expanded to take advantage of the Semi-Analytical Finite Element (SAFE) method in the global portion of the waveguide. The SAFE method is efficient because it only requires discretization of the cross-section of the waveguide to obtain the wave dispersion solutions, and it can handle complex structures such as multilayered sandwich panels. The GL method is applied to predicting quantitatively the interaction of guided waves with defects in aluminum and composite structural components.

  20. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry

    PubMed Central

    Chavez, Juan D.; Eng, Jimmy K.; Schweppe, Devin K.; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E.

    2016-01-01

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR and cryo-electron microscopy[1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise that ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by, and validate, previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions. PMID:27997545

  1. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  2. A general radiochemical-color method for quantitation of immunoblots.

    PubMed

    Esmaeli-Azad, B; Feinstein, S C

    1991-12-01

    Quantitative interpretation of protein immunoblotting procedures is hampered by a variety of technical liabilities inherent in the use of photographic and densitometric methods. In this paper, we present a novel, simple, and generally applicable alternative procedure to acquire quantitative data from immunoblots. Our strategy employs both the standard alkaline phosphatase color reaction and radiolabelled Protein A. The color reaction is used to localize the polypeptide of interest after transfer to a solid support. The colored bands are then excised and the radioactivity in the colocalized Protein A is quantitated in a gamma counter. In addition to avoiding the problems associated with photographic and densitometric procedures, our assay also overcomes common problems associated with variable gel lane width and individual band distortion. The resulting data is linear over a range of at least 50-fold (10-500 ng of specific protein, for the example used in this study) and is highly reproducible.

  3. Use advanced methods to treat wastewater

    SciTech Connect

    Davis, M.

    1994-08-01

    Common-sense guidelines offer plausible, progressive techniques to treat wastewater. Because current and pending local, state and federal regulations are ratcheting effluent discharge limits lower, familiar treatment methods, such as biological treatment, don't meet the new restrictions. Operating facilities must now combine traditional methods with advanced remedial options such as thermal, physical, electro and chemical treatments. These new techniques remove organics, metals, nonhazardous dissolved salts, etc., but carry higher operating and installation costs. Due to tighter effluent restrictions and pending zero-discharge initiatives, managers of operating facilities must know and understand the complexity, composition and contaminant concentration of their wastewaters. No one-size-fits-all solution exists. However, guidelines can simplify decision making and help operators nominate the most effective and economical strategy to handle their waste situation. The paper describes the common treatments and the importance of alternatives, then describes biological, electro, physical, thermal, and chemical treatments.

  4. A Novel Targeted Learning Method for Quantitative Trait Loci Mapping

    PubMed Central

    Wang, Hui; Zhang, Zhongyang; Rose, Sherri; van der Laan, Mark

    2014-01-01

    We present a novel semiparametric method for quantitative trait loci (QTL) mapping in experimental crosses. Conventional genetic mapping methods typically assume parametric models with Gaussian errors and obtain parameter estimates through maximum-likelihood estimation. In contrast with univariate regression and interval-mapping methods, our model requires fewer assumptions and also accommodates various machine-learning algorithms. Estimation is performed with targeted maximum-likelihood learning methods. We demonstrate our semiparametric targeted learning approach in a simulation study and a well-studied barley data set. PMID:25258376

  5. A novel targeted learning method for quantitative trait loci mapping.

    PubMed

    Wang, Hui; Zhang, Zhongyang; Rose, Sherri; van der Laan, Mark

    2014-12-01

    We present a novel semiparametric method for quantitative trait loci (QTL) mapping in experimental crosses. Conventional genetic mapping methods typically assume parametric models with Gaussian errors and obtain parameter estimates through maximum-likelihood estimation. In contrast with univariate regression and interval-mapping methods, our model requires fewer assumptions and also accommodates various machine-learning algorithms. Estimation is performed with targeted maximum-likelihood learning methods. We demonstrate our semiparametric targeted learning approach in a simulation study and a well-studied barley data set.

  6. Novel method for ANA quantitation using IIF imaging system.

    PubMed

    Peng, Xiaodong; Tang, Jiangtao; Wu, Yongkang; Yang, Bin; Hu, Jing

    2014-02-01

    A variety of antinuclear antibodies (ANAs) are found in the serum of patients with autoimmune diseases. The detection of abnormal ANA titers is a critical criterion for the diagnosis of systemic lupus erythematosus (SLE) and other connective tissue diseases. Indirect immunofluorescence assay (IIF) on HEp-2 cells is the gold-standard method to determine the presence of ANA and therefore provides information about the localization of autoantigens that is useful for diagnosis. However, its utility in prognosis and in monitoring disease activity has been limited by the lack of standardization in performing the technique, subjectivity in interpreting the results, and the fact that it is only semi-quantitative. On the other hand, ELISA for the detection of ANA can quantitate ANA but cannot provide further information about the localization of the autoantigens. It would be ideal to integrate the quantitative and qualitative methods. To address this issue, this study was conducted to quantitatively detect ANAs by using an IIF imaging analysis system. ANA-positive serum samples (including speckled, homogeneous, nuclear mixture and cytoplasmic mixture patterns) and ANA-negative samples were assayed for ANA titers by classical IIF; the image of each sample was acquired by the digital imaging system and the green fluorescence intensity was quantified with the Image-Pro Plus software. A good correlation was found between the two methods, and the correlation coefficients (R(2)) of the various ANA patterns were 0.942 (speckled), 0.942 (homogeneous), 0.923 (nuclear mixture) and 0.760 (cytoplasmic mixture), respectively. The fluorescence density was linearly correlated with the log of the ANA titer in the various ANA patterns (R(2)>0.95). Moreover, the novel ANA quantitation method showed good reproducibility (F=0.091, p>0.05) with mean±SD and CV% of positive, and negative quality controls were equal to 126.4±9.6 and 7.6%, 10.4±1.25 and 12
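    The reported log-linear relation between fluorescence density and ANA titer amounts to a calibration fit that can then be inverted to read a titer off a new image. A minimal sketch follows; the titers and densities are invented for illustration.

```python
import math

titers = [100, 320, 1000, 3200, 10000]          # reciprocal dilution titers
densities = [40.0, 60.5, 80.0, 101.0, 120.5]    # mean green fluorescence densities

# least-squares fit: density = slope * log10(titer) + intercept
x = [math.log10(t) for t in titers]
n = len(x)
xbar, ybar = sum(x) / n, sum(densities) / n
num = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, densities))
den = sum((xi - xbar) ** 2 for xi in x)
slope = num / den
intercept = ybar - slope * xbar

def titer_from_density(d):
    """Invert the calibration: read an ANA titer off a measured density."""
    return 10 ** ((d - intercept) / slope)
```

    With such a fit, the quantitative readout no longer depends on an operator judging endpoint dilutions by eye.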

  7. Advances in quantitative nanoscale subsurface imaging by mode-synthesizing atomic force microscopy

    SciTech Connect

    Vitry, P.; Bourillot, E.; Plassard, C.; Lacroute, Y.; Lesniewska, E.; Tetard, L.

    2014-08-04

    This paper reports on advances toward quantitative non-destructive nanoscale subsurface investigation of a nanofabricated sample based on mode synthesizing atomic force microscopy with heterodyne detection, addressing the need to correlate the role of actuation frequencies of the probe f{sub p} and the sample f{sub s} with depth resolution for 3D tomography reconstruction. Here, by developing a simple model and validating the approach experimentally through the study of the nanofabricated calibration depth samples consisting of buried metallic patterns, we demonstrate avenues for quantitative nanoscale subsurface imaging. Our findings enable the reconstruction of the sample depth profile and allow high fidelity resolution of the buried nanostructures. Non-destructive quantitative nanoscale subsurface imaging offers great promise in the study of the structures and properties of complex systems at the nanoscale.

  8. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab, enabling fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, on the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. The main window of the program during dynamic analysis of the foot thermal image.

  9. Informatics Methods to Enable Sharing of Quantitative Imaging Research Data

    PubMed Central

    Levy, Mia A.; Freymann, John B.; Kirby, Justin S.; Fedorov, Andriy; Fennessy, Fiona M.; Eschrich, Steven A.; Berglund, Anders E.; Fenstermacher, David A.; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L.; Brown, Bartley J.; Braun, Terry A.; Dekker, Andre; Roelofs, Erik; Mountz, James M.; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-01-01

    Introduction The National Cancer Institute (NCI) Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable sharing data and to promote reuse of quantitative imaging data in the community. Methods We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data, including images, image metadata and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. Results There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image metadata across the network. Conclusions As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. PMID:22770688

  10. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    Because stress vulnerability assessment in groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived through analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the representative contaminant hazards depended greatly on the research emphasis. There were also differences between the ranking of the three representative contaminants' hazards and the ranking of their corresponding properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculation results. In addition, ranking was used to normalize the three properties and to unify the quantified property results, scaling the relative property characteristics of the different representative contaminants.
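    A hedged sketch of the analytic-hierarchy-process step mentioned above: priority weights for three contaminant properties computed from a pairwise comparison matrix via the common geometric-mean approximation. The judgement matrix is invented, not taken from the Beijing study.

```python
import math

# reciprocal pairwise-comparison matrix: A[i][j] = importance of property i vs j
A = [[1.0,     3.0, 5.0],
     [1 / 3.0, 1.0, 2.0],
     [1 / 5.0, 1 / 2.0, 1.0]]

n = len(A)
geo = [math.prod(row) ** (1.0 / n) for row in A]  # row geometric means
weights = [g / sum(geo) for g in geo]             # normalised priority weights
```

    In a full AHP analysis one would also compute the consistency ratio of the judgement matrix before accepting the weights; that check is omitted from this sketch.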

  11. Quantitative method of measuring cancer cell urokinase and metastatic potential

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  12. Testing of flat optical surfaces by the quantitative Foucault method.

    PubMed

    Simon, M C; Simon, J M

    1978-01-01

    The complete theory of measurement of optical flat mirrors of circular or elliptical shape using the quantitative Foucault method is described here. It has been used in Córdoba since 1939 in a partially intuitive but correct form. The surface, not yet flat and, at times, astigmatic, is assimilated to the sum of a spherical plus a cylindrical dome. The errors of the three possible ways of reckoning are calculated.

  13. Linguistic Alternatives to Quantitative Research Strategies. Part One: How Linguistic Mechanisms Advance Research Outcomes

    ERIC Educational Resources Information Center

    Yeager, Joseph; Sommer, Linda

    2007-01-01

    Combining psycholinguistic technologies and systems analysis created advances in motivational profiling and numerous new behavioral engineering applications. These advances leapfrog many mainstream statistical research methods, producing superior research results via cause-effect language mechanisms. Entire industries explore motives ranging from…

  14. A quantitative SMRT cell sequencing method for ribosomal amplicons.

    PubMed

    Jones, Bethan M; Kustka, Adam B

    2017-04-01

    Advances in sequencing technologies continue to provide unprecedented opportunities to characterize microbial communities. For example, the Pacific Biosciences Single Molecule Real-Time (SMRT) platform has emerged as a unique approach harnessing DNA polymerase activity to sequence template molecules, enabling long reads at low cost. With the aim of simultaneously classifying and enumerating in situ microbial populations, we developed a quantitative SMRT (qSMRT) approach that involves the addition of exogenous standards to quantify ribosomal amplicons derived from environmental samples. The V7-9 regions of 18S SSU rDNA were targeted and quantified from protistan community samples collected in the Ross Sea during the austral summer of 2011. We used three standards of different lengths and optimized conditions to obtain accurate quantitative retrieval across the range of expected amplicon sizes, a necessary criterion for analyzing taxonomically diverse 18S rDNA molecules from natural environments. The ability to concurrently identify and quantify microorganisms in their natural environment makes qSMRT a powerful, rapid and cost-effective approach for defining ecosystem diversity and function.
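    The exogenous-standard quantitation idea reduces to a one-step calculation: reads from a spike-in of known copy number calibrate target reads to absolute abundances. The read counts and copy numbers below are hypothetical, not the paper's data.

```python
def absolute_abundance(target_reads, standard_reads, standard_copies_added):
    """Copies of a target amplicon, assuming equal recovery of standard and target."""
    copies_per_read = standard_copies_added / standard_reads
    return target_reads * copies_per_read

# hypothetical run: 5000 standard copies spiked in yielded 250 reads;
# a protistan taxon's amplicon collected 1200 reads
copies = absolute_abundance(1200, 250, 5000)
```

    Using several standards of different lengths, as the abstract describes, guards against the length-dependent recovery bias that would break the equal-recovery assumption above.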

  15. A quantitative method for measuring the quality of history matches

    SciTech Connect

    Shaw, T.S.; Knapp, R.M.

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for such a purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
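    An illustrative trio of indices in the spirit of the abstract: correlation for shape conformity, mean error for bias, and RMSE for magnitude of deviation. These stand in for the paper's actual indices, which are not reproduced here; the pressure series are invented.

```python
import math

def match_indices(simulated, observed):
    """Three simple match-quality indices for a simulated vs. historical curve."""
    n = len(observed)
    ms, mo = sum(simulated) / n, sum(observed) / n
    cov = sum((s - ms) * (o - mo) for s, o in zip(simulated, observed)) / n
    vs = sum((s - ms) ** 2 for s in simulated) / n
    vo = sum((o - mo) ** 2 for o in observed) / n
    shape = cov / math.sqrt(vs * vo)                  # 1.0 = identical shape
    bias = ms - mo                                    # 0.0 = no systematic offset
    rmse = math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed)) / n)
    return shape, bias, rmse

obs = [100.0, 95.0, 88.0, 80.0, 71.0]   # e.g. observed well pressures
sim = [101.0, 96.5, 89.0, 80.5, 72.0]   # simulated response from one run
shape, bias, rmse = match_indices(sim, obs)
```

    Reporting the three numbers separately matters: a run can track the curve's shape perfectly while carrying a constant bias that a single lumped error measure would hide.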

  16. Advanced fault diagnosis methods in molecular networks.

    PubMed

    Habibi, Iman; Emamian, Effat S; Abdi, Ali

    2014-01-01

    Analysis of the failure of cell signaling networks is an important topic in systems biology and has applications in target discovery and drug development. In this paper, some advanced methods for fault diagnosis in signaling networks are developed and then applied to a caspase network and an SHP2 network. The goal is to understand how, and to what extent, the dysfunction of molecules in a network contributes to the failure of the entire network. Network dysfunction (failure) is defined as failure to produce the expected outputs in response to the input signals. Vulnerability level of a molecule is defined as the probability of the network failure, when the molecule is dysfunctional. In this study, a method to calculate the vulnerability level of single molecules for different combinations of input signals is developed. Furthermore, a more complex yet biologically meaningful method for calculating the multi-fault vulnerability levels is suggested, in which two or more molecules are simultaneously dysfunctional. Finally, a method is developed for fault diagnosis of networks based on a ternary logic model, which considers three activity levels for a molecule instead of the previously published binary logic model, and provides equations for the vulnerabilities of molecules in a ternary framework. Multi-fault analysis shows that the pairs of molecules with high vulnerability typically include a highly vulnerable molecule identified by the single fault analysis. The ternary fault analysis for the caspase network shows that predictions obtained using the more complex ternary model are about the same as the predictions of the simpler binary approach. This study suggests that by increasing the number of activity levels the complexity of the model grows; however, the predictive power of the ternary model does not appear to be increased proportionally.
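    The single-fault vulnerability definition above can be made concrete on a toy Boolean network: the vulnerability of a molecule is the fraction of equally likely input patterns for which the output is wrong when that molecule is stuck dysfunctional (here, stuck at 0). The three-molecule network is invented, not the caspase or SHP2 network.

```python
from itertools import product

def network(a, b, c, fault=None):
    """Toy three-molecule Boolean network; a fault holds one molecule at 0."""
    x = 0 if fault == "x" else (a and b)
    y = 0 if fault == "y" else (b and c)
    out = 0 if fault == "out" else (x or y)
    return int(out)

def vulnerability(molecule):
    """Probability of network failure over equally likely input patterns."""
    inputs = list(product((0, 1), repeat=3))
    wrong = sum(network(*i) != network(*i, fault=molecule) for i in inputs)
    return wrong / len(inputs)

vuln = {m: vulnerability(m) for m in ("x", "y", "out")}
```

    As expected, the molecule closest to the output carries the highest vulnerability, since every pathway to the output passes through it.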

  17. An improved quantitative analysis method for plant cortical microtubules.

    PubMed

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has so far been paid to quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. In order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were distinct. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.
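    A minimal sketch of the GLCM step: build the co-occurrence matrix for one-pixel horizontal offsets and extract two of the commonly used texture parameters (contrast and energy). The 4x4 image and the choice of parameters are illustrative, not the paper's.

```python
def glcm(image, levels):
    """Normalised grey-level co-occurrence matrix for a one-pixel rightward offset."""
    m = [[0] * levels for _ in range(levels)]
    total = 0
    for row in image:
        for a, b in zip(row, row[1:]):   # horizontal neighbour pairs
            m[a][b] += 1
            total += 1
    return [[v / total for v in row] for row in m]

def contrast(p):
    """Weights co-occurrences by squared grey-level difference."""
    k = len(p)
    return sum(p[i][j] * (i - j) ** 2 for i in range(k) for j in range(k))

def energy(p):
    """Sum of squared probabilities; high for uniform textures."""
    return sum(v * v for row in p for v in row)

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
p = glcm(img, 4)
c = contrast(p)
e = energy(p)
```

    A full analysis would average such matrices over several offsets and directions so that the texture measures do not depend on one arbitrary orientation.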

  18. A quantitative method for optimized placement of continuous air monitors.

    PubMed

    Whicker, Jeffrey J; Rodgers, John C; Moxley, John S

    2003-11-01

    Alarming continuous air monitors (CAMs) are a critical component for worker protection in facilities that handle large amounts of hazardous materials. In nuclear facilities, continuous air monitors alarm when levels of airborne radioactive materials exceed alarm thresholds, thus prompting workers to exit the room to reduce inhalation exposures. To maintain a high level of worker protection, continuous air monitors are required to detect radioactive aerosol clouds quickly and with good sensitivity. This requires that there are sufficient numbers of continuous air monitors in a room and that they are well positioned. Yet there are no published methodologies to quantitatively determine the optimal number and placement of continuous air monitors in a room. The goal of this study was to develop and test an approach to quantitatively determine optimal number and placement of continuous air monitors in a room. The method we have developed uses tracer aerosol releases (to simulate accidental releases) and the measurement of the temporal and spatial aspects of the dispersion of the tracer aerosol through the room. The aerosol dispersion data is then analyzed to optimize continuous air monitor utilization based on simulated worker exposure. This method was tested in a room within a Department of Energy operated plutonium facility at the Savannah River Site in South Carolina, U.S. Results from this study show that the value of quantitative airflow and aerosol dispersion studies is significant and that worker protection can be significantly improved while balancing the costs associated with CAM programs.

  19. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of disease, the development of efficient analysis tools for aberrant glycoproteins is very important for deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein-glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies.

  20. A rapid chemiluminescent method for quantitation of human DNA.

    PubMed Central

    Walsh, P S; Varlaro, J; Reynolds, R

    1992-01-01

    A sensitive and simple method for the quantitation of human DNA is described. This method is based on probe hybridization to a human alpha satellite locus, D17Z1. The biotinylated probe is hybridized to sample DNA immobilized on nylon membrane. The subsequent binding of streptavidin-horseradish peroxidase to the bound probe allows for chemiluminescent detection using a luminol-based reagent and X-ray film. Less than 150 pg of human DNA can easily be detected with a 15 minute exposure. The entire procedure can be performed in 1.5 hours. Microgram quantities of nonhuman DNA have been tested and the results indicate very high specificity for human DNA. The data on film can be scanned into a computer and a commercially available program can be used to create a standard curve where DNA quantity is plotted against the mean density of each slot blot signal. The methods described can also be applied to the very sensitive determination of quantity and quality (size) of DNA on Southern blots. The high sensitivity of this quantitation method requires the consumption of only a fraction of sample for analysis. Determination of DNA quantity is necessary for RFLP and many PCR-based tests where optimal results are obtained only with a relatively narrow range of DNA quantities. The specificity of this quantitation method for human DNA will be useful for the analysis of samples that may also contain bacterial or other non-human DNA, for example forensic evidence samples, ancient DNA samples, or clinical samples. PMID:1408822

  1. A rapid chemiluminescent method for quantitation of human DNA.

    PubMed

    Walsh, P S; Varlaro, J; Reynolds, R

    1992-10-11

    A sensitive and simple method for the quantitation of human DNA is described. This method is based on probe hybridization to a human alpha satellite locus, D17Z1. The biotinylated probe is hybridized to sample DNA immobilized on nylon membrane. The subsequent binding of streptavidin-horseradish peroxidase to the bound probe allows for chemiluminescent detection using a luminol-based reagent and X-ray film. Less than 150 pg of human DNA can easily be detected with a 15 minute exposure. The entire procedure can be performed in 1.5 hours. Microgram quantities of nonhuman DNA have been tested and the results indicate very high specificity for human DNA. The data on film can be scanned into a computer and a commercially available program can be used to create a standard curve where DNA quantity is plotted against the mean density of each slot blot signal. The methods described can also be applied to the very sensitive determination of quantity and quality (size) of DNA on Southern blots. The high sensitivity of this quantitation method requires the consumption of only a fraction of sample for analysis. Determination of DNA quantity is necessary for RFLP and many PCR-based tests where optimal results are obtained only with a relatively narrow range of DNA quantities. The specificity of this quantitation method for human DNA will be useful for the analysis of samples that may also contain bacterial or other non-human DNA, for example forensic evidence samples, ancient DNA samples, or clinical samples.
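
    The standard-curve step described above can be sketched in a few lines. All densitometry values here are invented for illustration: known DNA standards are regressed against their scanned slot-blot signal densities, and an unknown sample is then read off the fitted line.

```python
known_ng = [0.15, 0.3, 0.6, 1.25, 2.5, 5.0]      # DNA standards (ng), assumed
density  = [3.0, 6.1, 11.8, 24.9, 50.2, 99.5]    # assumed mean signal densities

# Ordinary least-squares fit of density vs. DNA quantity.
n = len(known_ng)
mx = sum(known_ng) / n
my = sum(density) / n
slope = sum((x - mx) * (y - my) for x, y in zip(known_ng, density)) / \
        sum((x - mx) ** 2 for x in known_ng)
intercept = my - slope * mx

def quantify(sample_density):
    """Invert the standard curve: signal density -> estimated ng of human DNA."""
    return (sample_density - intercept) / slope

est = quantify(40.0)   # estimated DNA quantity for an unknown sample
```

    In practice the curve would be built from the slot-blot standards run alongside the samples, and readings outside the calibrated range would be diluted and re-run.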

  2. Analytical methods for quantitation of prenylated flavonoids from hops.

    PubMed

    Nikolić, Dejan; van Breemen, Richard B

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach.

  3. Analytical methods for quantitation of prenylated flavonoids from hops

    PubMed Central

    Nikolić, Dejan; van Breemen, Richard B.

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106

  4. A Quantitative Method for Microtubule Analysis in Fluorescence Images.

    PubMed

    Lan, Xiaodong; Li, Lingfei; Hu, Jiongyu; Zhang, Qiong; Dang, Yongming; Huang, Yuesheng

    2015-12-01

    Microtubule analysis is of significant value for a better understanding of normal and pathological cellular processes. Although immunofluorescence microscopic techniques have proven useful in the study of microtubules, comparative results commonly rely on a descriptive and subjective visual analysis. We developed an objective and quantitative method based on image processing and analysis of fluorescently labeled microtubular patterns in cultured cells. We used a multi-parameter approach by analyzing four quantifiable characteristics to compose our quantitative feature set. Then we interpreted specific changes in the parameters and revealed the contribution of each feature set using principal component analysis. In addition, we verified that different treatment groups could be clearly discriminated using principal components of the multi-parameter model. High predictive accuracy of four commonly used multi-classification methods confirmed our method. These results demonstrated the effectiveness and efficiency of our method in the analysis of microtubules in fluorescence images. Application of the analytical methods presented here provides information concerning the organization and modification of microtubules, and could aid in the further understanding of structural and functional aspects of microtubules under normal and pathological conditions.
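
    The principal-component step described above can be illustrated with synthetic data. This is a sketch, not the authors' pipeline: each "cell" gets four invented feature values, two treatment groups have shifted feature means, and PCA (via SVD of the centered data matrix) is checked for group separation along the first component.

```python
import numpy as np

rng = np.random.default_rng(0)
# Four quantifiable features per cell; two groups with shifted means (assumed values).
group_a = rng.normal(loc=[1.0, 2.0, 0.5, 3.0], scale=0.1, size=(20, 4))
group_b = rng.normal(loc=[2.0, 1.0, 1.5, 2.0], scale=0.1, size=(20, 4))
X = np.vstack([group_a, group_b])

# PCA via SVD of the mean-centered data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1_scores = Xc @ Vt[0]          # projection onto the first principal component

# Distance between group means along PC1: large values mean the
# treatment groups are discriminated by the first component alone.
sep = abs(pc1_scores[:20].mean() - pc1_scores[20:].mean())
```

    With well-separated group means and small within-group scatter, PC1 aligns with the between-group direction, which is the behavior the paper exploits before handing the components to the classifiers.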

  5. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful to determine the type attribute of the object because it could present the content of the constituents. QPA by Rietveld method requires neither measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by Rietveld QPA method. The Rietveld refinement was accomplished with a material analysis using diffraction program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case which was broken successfully with the help of Rietveld QPA method was also introduced. This method will allow forensic investigators to acquire detailed information of the material evidence, which could point out the direction for case detection and court proceedings.

  6. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  7. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.

  8. Quantitative analytical method to evaluate the metabolism of vitamin D.

    PubMed

    Mena-Bravo, A; Ferreiro-Vera, C; Priego-Capote, F; Maestro, M A; Mouriño, A; Quesada-Gómez, J M; Luque de Castro, M D

    2015-03-10

    A method for quantitative analysis of vitamin D (both D2 and D3) and its main metabolites - monohydroxylated vitamin D (25-hydroxyvitamin D2 and 25-hydroxyvitamin D3) and dihydroxylated metabolites (1,25-dihydroxyvitamin D2, 1,25-dihydroxyvitamin D3 and 24,25-dihydroxyvitamin D3) - in human serum is here reported. The method is based on direct analysis of serum by an automated platform involving on-line coupling of a solid-phase extraction workstation to a liquid chromatograph-tandem mass spectrometer. Detection of the seven analytes was carried out in selected reaction monitoring (SRM) mode, and quantitative analysis was supported by the use of stable isotope-labeled internal standards (SIL-ISs). The detection limits were between 0.3 and 75 pg/mL for the target compounds, while precision (expressed as relative standard deviation) was below 13.0% for between-day variability. The method was externally validated against the vitamin D External Quality Assurance Scheme (DEQAS) through the analysis of ten serum samples provided by that scheme. The analytical features of the method support its applicability in nutritional and clinical studies targeted at elucidating the role of vitamin D metabolism.
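
    The isotope-dilution quantitation underlying the SIL-IS approach can be sketched in miniature. All peak areas and concentrations below are invented: the analyte/internal-standard area ratio is converted to concentration through a response factor fixed by a calibrator of known concentration.

```python
IS_CONC = 10.0  # ng/mL of labeled internal standard spiked into every sample (assumed)

def concentration(area_analyte, area_is, response_factor=1.0):
    """Analyte concentration from SRM peak areas via the SIL-IS area ratio."""
    return (area_analyte / area_is) * IS_CONC / response_factor

# A calibrator of known concentration (25.0 ng/mL, assumed) fixes the
# analyte/IS response factor ...
rf = concentration(50_000, 20_000) / 25.0

# ... which is then applied to an unknown serum sample.
unknown = concentration(30_000, 20_000, response_factor=rf)
```

    Because the labeled standard co-elutes with the analyte and suffers the same matrix effects, the ratio-based calculation cancels much of the run-to-run variability, which is why SIL-IS quantitation is the norm for serum vitamin D assays.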

  9. A New Kinetic Spectrophotometric Method for the Quantitation of Amorolfine.

    PubMed

    Soto, César; Poza, Cristian; Contreras, David; Yáñez, Jorge; Nacaratte, Fallon; Toral, M Inés

    2017-01-01

    Amorolfine (AOF) is a compound with fungicide activity based on the dual inhibition of growth of the fungal cell membrane, the biosynthesis and accumulation of sterols, and the reduction of ergosterol. In this work a sensitive kinetic spectrophotometric method for AOF quantitation, based on AOF oxidation by KMnO4 at a fixed time of 30 min under alkaline pH and controlled ionic strength, was developed. Changes in absorbance at 610 nm were used as the criterion of oxidation progress. In order to maximize the sensitivity, the experimental reaction parameters were carefully studied via factorial screening and optimized by a multivariate method. The linearity, intraday and interday assay precision, and accuracy were determined. The absorbance-concentration plot for spiked tap water samples was rectilinear over the range 7.56 × 10⁻⁶ to 3.22 × 10⁻⁵ mol L⁻¹, with detection and quantitation limits of 2.49 × 10⁻⁶ mol L⁻¹ and 7.56 × 10⁻⁶ mol L⁻¹, respectively. The proposed method was successfully validated for determination of the drug in spiked tap water samples, with percentage recoveries of 94.0-105.0%. The method is simple and does not require expensive instruments or complicated extraction steps of the reaction product.

  10. A New Kinetic Spectrophotometric Method for the Quantitation of Amorolfine

    PubMed Central

    Poza, Cristian; Contreras, David; Yáñez, Jorge; Nacaratte, Fallon; Toral, M. Inés

    2017-01-01

    Amorolfine (AOF) is a compound with fungicide activity based on the dual inhibition of growth of the fungal cell membrane, the biosynthesis and accumulation of sterols, and the reduction of ergosterol. In this work a sensitive kinetic spectrophotometric method for AOF quantitation, based on AOF oxidation by KMnO4 at a fixed time of 30 min under alkaline pH and controlled ionic strength, was developed. Changes in absorbance at 610 nm were used as the criterion of oxidation progress. In order to maximize the sensitivity, the experimental reaction parameters were carefully studied via factorial screening and optimized by a multivariate method. The linearity, intraday and interday assay precision, and accuracy were determined. The absorbance-concentration plot for spiked tap water samples was rectilinear over the range 7.56 × 10−6–3.22 × 10−5 mol L−1, with detection and quantitation limits of 2.49 × 10−6 mol L−1 and 7.56 × 10−6 mol L−1, respectively. The proposed method was successfully validated for determination of the drug in spiked tap water samples, with percentage recoveries of 94.0–105.0%. The method is simple and does not require expensive instruments or complicated extraction steps of the reaction product. PMID:28348920
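
    Detection and quantitation limits like those quoted above are conventionally derived from the blank noise and the calibration slope (LOD = 3.3σ/slope, LOQ = 10σ/slope), which is why the LOQ sits at roughly three times the LOD. A sketch with assumed values (the blank standard deviation and slope below are not from the paper):

```python
def lod_loq(sigma_blank, slope):
    """IUPAC-style limits from blank standard deviation and calibration slope."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

sigma = 0.0012   # assumed SD of blank absorbance readings at 610 nm
slope = 1.6e3    # assumed calibration slope, absorbance units per (mol/L)

lod, loq = lod_loq(sigma, slope)
```

    With these assumed inputs the limits come out on the same order as the paper's (a few times 10⁻⁶ mol L⁻¹), and the fixed LOQ/LOD ratio of 10/3.3 ≈ 3 matches the ratio of the reported values.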

  11. Advanced continuous cultivation methods for systems microbiology.

    PubMed

    Adamberg, Kaarel; Valgepea, Kaspar; Vilu, Raivo

    2015-09-01

    Increasing the throughput of systems biology-based experimental characterization of in silico-designed strains has great potential for accelerating the development of cell factories. For this, analysis of metabolism in the steady state is essential as only this enables the unequivocal definition of the physiological state of cells, which is needed for the complete description and in silico reconstruction of their phenotypes. In this review, we show that for a systems microbiology approach, high-resolution characterization of metabolism in the steady state--growth space analysis (GSA)--can be achieved by using advanced continuous cultivation methods termed changestats. In changestats, an environmental parameter is continuously changed at a constant rate within one experiment whilst maintaining cells in the physiological steady state similar to chemostats. This increases the resolution and throughput of GSA compared with chemostats, and, moreover, enables following of the dynamics of metabolism and detection of metabolic switch-points and optimal growth conditions. We also describe the concept, challenge and necessary criteria of the systematic analysis of steady-state metabolism. Finally, we propose that such systematic characterization of the steady-state growth space of cells using changestats has value not only for fundamental studies of metabolism, but also for systems biology-based metabolic engineering of cell factories.

  12. Advanced electromagnetic methods for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; Sun, Weimin; El-Sharawy, El-Budawy; Aberle, James T.; Birtcher, Craig R.; Peng, Jian; Tirkas, Panayiotis A.; Andrew, William V.; Kokotoff, David; Zavosh, Frank

    1993-01-01

    The Advanced Helicopter Electromagnetics (AHE) Industrial Associates Program has fruitfully completed its fourth year. With the support of the AHE members and the joint effort of the research team, new and significant progress has been achieved during the year. Following the recommendations of the Advisory Task Force, the research effort has been placed on more practical helicopter electromagnetic problems, such as HF antennas, composite materials, and antenna efficiencies. In this annual report, the main topics addressed are composite materials and antenna technology. The research work on each topic has been driven by the AHE consortium members' interests and needs, and the achievements and progress in each subject are reported in individual sections of the report. The work in the area of composite materials includes: modeling of low-conductivity composite materials using a Green's function approach; guidelines for composite material modeling using the Green's function approach in the NEC code; development of a 3-D volume mesh generator for modeling thick and volumetric dielectrics with the FD-TD method; modeling antenna elements mounted on a composite Comanche tail stabilizer; and antenna pattern control and efficiency estimation for a horn antenna loaded with composite dielectric materials.

  13. Advanced electromagnetic methods for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; Sun, Weimin; El-Sharawy, El-Budawy; Aberle, James T.; Birtcher, Craig R.; Peng, Jian; Tirkas, Panayiotis A.; Kokotoff, David; Zavosh, Frank

    1993-01-01

    The Advanced Helicopter Electromagnetics (AHE) Industrial Associates Program has continuously progressed with its research effort focused on subjects identified and recommended by the Advisory Task Force of the program. The research activities in this reporting period have been steered toward practical helicopter electromagnetic problems, such as HF antenna problems and antenna efficiencies, recommended by the AHE members at the annual conference held at Arizona State University on 28-29 Oct. 1992 and the last biannual meeting held at the Boeing Helicopter on 19-20 May 1993. The main topics addressed include the following: Composite Materials and Antenna Technology. The research work on each topic is closely tied with the AHE Consortium members' interests. Significant progress in each subject is reported. Special attention in the area of Composite Materials has been given to the following: modeling of material discontinuity and their effects on towel-bar antenna patterns; guidelines for composite material modeling by using the Green's function approach in the NEC code; measurements of towel-bar antennas grounded with a partially material-coated plate; development of 3-D volume mesh generator for modeling thick and volumetric dielectrics by using FD-TD method; FDTD modeling of horn antennas with composite E-plane walls; and antenna efficiency analysis for a horn antenna loaded with composite dielectric materials.

  14. Quantitative cell imaging using single beam phase retrieval method

    NASA Astrophysics Data System (ADS)

    Anand, Arun; Chhaniwal, Vani; Javidi, Bahram

    2011-06-01

    Quantitative three-dimensional imaging of cells can provide important information about their morphology as well as their dynamics, which is useful for studying their behavior under various conditions. There are several microscopic techniques for imaging unstained, semi-transparent specimens by converting phase information into intensity information. However, most quantitative phase-contrast imaging techniques are realized either by interfering the object wavefront with a known reference beam or by phase-shifting interferometry. A two-beam interferometric method is challenging to implement, especially with low-coherence sources, and it also requires fine adjustment of the beams to achieve high-contrast fringes. In this letter, the development of a single-beam phase retrieval microscopy technique for quantitative phase-contrast imaging of cells, using multiple intensity samplings of a volume speckle field in the axial direction, is described. Single-beam illumination with multiple intensity samplings provides fast convergence and a unique solution for the object wavefront. Three-dimensional thickness profiles of different cells, such as red blood cells and onion skin cells, were reconstructed with this technique with an axial resolution on the order of several nanometers.
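
    The thickness profiles mentioned above follow from the standard phase-to-thickness relation in quantitative phase imaging: the retrieved phase encodes an optical path difference (n_cell − n_medium)·t. A minimal sketch, with the wavelength and refractive indices assumed for illustration (they are not taken from the paper):

```python
import math

WAVELENGTH = 633e-9              # assumed He-Ne source wavelength (m)
N_CELL, N_MEDIUM = 1.40, 1.34    # assumed refractive indices of cell and medium

def thickness(phase_rad):
    """Cell thickness (m) at one pixel from the retrieved phase."""
    opd = phase_rad * WAVELENGTH / (2 * math.pi)   # optical path difference
    return opd / (N_CELL - N_MEDIUM)

t = thickness(1.2)   # thickness corresponding to a 1.2 rad phase shift
```

    Applying this conversion pixel by pixel over the retrieved phase map yields the three-dimensional thickness profile; the nanometer-scale axial resolution comes from how finely the phase itself can be resolved.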

  15. Are three generations of quantitative molecular methods sufficient in medical virology? Brief review.

    PubMed

    Clementi, Massimo; Bagnarelli, Patrizia

    2015-10-01

    In the last two decades, the development of quantitative molecular methods has characterized the evolution of clinical virology more than any other methodological advancement. Using these methods, a great number of studies have efficiently addressed, in vivo, the role of viral load, viral replication activity, and viral transcriptional profiles as correlates of disease outcome and progression, and have highlighted the physiopathology of important human virus diseases. Furthermore, these studies have contributed to a better understanding of virus-host interactions and have sharply revolutionized research strategies in basic and medical virology. In addition, and importantly from a medical point of view, quantitative methods have provided a rationale for therapeutic intervention and therapy monitoring in medically important viral diseases. Despite the advances in technology and the development of three generations of molecular methods within the last two decades (competitive PCR, real-time PCR, and digital PCR), great challenges still remain for viral testing, related not only to standardization, accuracy, and precision, but also to selection of the best molecular targets for clinical use and to the identification of thresholds for risk stratification and therapeutic decisions. Future research directions, novel methods, and technical improvements could be important to address these challenges.
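
    Of the three generations named above, digital PCR is distinctive in quantifying without a standard curve: the sample is split into thousands of partitions, and the fraction of positive partitions p gives the mean copies per partition through Poisson statistics, λ = −ln(1 − p). A sketch of that calculation (the partition volume is an assumption, typical of droplet dPCR, not a value from this review):

```python
import math

def copies_per_ul(n_positive, n_partitions, partition_volume_ul=0.00085):
    """Absolute target concentration (copies/µL) from dPCR partition counts.

    The 0.85 nL partition volume is an assumed, droplet-dPCR-typical value.
    """
    p = n_positive / n_partitions
    lam = -math.log(1.0 - p)      # Poisson-corrected mean copies per partition
    return lam / partition_volume_ul

conc = copies_per_ul(5000, 20000)   # 25% positive partitions
```

    The Poisson correction matters because a positive partition may hold more than one copy; simply counting positives would underestimate the concentration, increasingly so as p approaches 1.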

  16. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
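
    The failure-probability idea above has a compact illustration. In the simplest limit-state formulation, resistance R and load S are random variables and failure occurs when g = R − S < 0; a brute-force Monte Carlo estimate of that probability is sketched below with invented distribution parameters (the advanced methods surveyed in the report, such as first-order reliability methods, aim to get the same number far more cheaply).

```python
import random

random.seed(1)

def pf_monte_carlo(n=200_000):
    """Monte Carlo estimate of P(g < 0) for the limit state g = R - S."""
    fails = 0
    for _ in range(n):
        r = random.gauss(10.0, 1.0)   # resistance: assumed mean 10, SD 1
        s = random.gauss(6.0, 1.5)    # load: assumed mean 6, SD 1.5
        if r - s < 0.0:               # limit-state violation
            fails += 1
    return fails / n

pf = pf_monte_carlo()
```

    For this Gaussian case the answer is known in closed form (g is normal with mean 4 and SD ≈ 1.80, so pf ≈ 0.013), which makes it a convenient check; for realistic multi-mode structural systems no such closed form exists, which is what motivates the specialized methods the report surveys.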

  17. Advances in nondestructive evaluation methods for inspection of refractory concretes

    SciTech Connect

    Ellingson, W. A.

    1980-01-01

    Refractory concrete linings are essential to protect steel pressure boundaries from high-temperature aggressive erosive/corrosive environments. Castable refractory concretes have been gaining more acceptance as information about their performance increases. Economic factors, however, have begun to impose high demands on the reliability of refractory materials. Advanced nondestructive evaluation methods are being developed to assist the refractory user. Radiographic techniques, thermography, acoustic-emission detection, and interferometry have been shown to yield information on the structural status of refractory concrete. Methods using 60Co radiation sources are capable of yielding measurements of refractory wear rate as well as images of cracks and/or voids in pre- and post-fired refractory linings up to 60 cm thick. Thermographic (infrared) images serve as a qualitative indicator of refractory spalling, but quantitative measurements are difficult to obtain from surface-temperature mapping. Acoustic emission has been shown to be a qualitative indicator of thermomechanical degradation of thick panels of 50 and 95% Al2O3 during initial heating and cooling at rates of 100 to 220°C/h. Laser interferometry methods have been shown to be capable of complete mapping of refractory lining thicknesses. This paper will present results obtained from laboratory and field applications of these methods in petrochemical, steel, and coal-conversion plants.

  18. Biological characteristics of crucian by quantitative inspection method

    NASA Astrophysics Data System (ADS)

    Chu, Mengqi

    2015-04-01

    Through quantitative inspection methods, the biological characteristics of crucian carp were preliminarily researched. Crucian carp (Carassius auratus), of the order Cypriniformes and family Cyprinidae, is a mainly plant-eating omnivorous fish that is gregarious and exhibits selection and ranking behavior. Crucian carp are widely distributed, and perennial waters all over the country support production. Indicators were measured in the experiment to understand the growth and reproduction of crucian carp in this area. The measured data (such as scale length, scale size, and annulus diameter) and related functions were used to calculate the growth of crucian carp in any given year. Egg shape, color, weight, etc., were used to determine maturity; the mean egg diameter per 20 eggs and the number of eggs per 0.5 g were used to calculate the relative and absolute fecundity of the fish. The measured crucian carp were females at puberty. Based on the relation between scale diameter and body length, a linear relationship was obtained: y = 1.530 + 3.0649x. The data show that fecundity is closely related to age: the older the fish, the more mature the gonad development and the greater the number of eggs; in addition, absolute fecundity increases with pituitary development. Quantitative examination of the bait organisms ingested by crucian carp reveals its principal, secondary, and incidental foods, and indicates the degree to which crucian carp prefer the various bait organisms. Fish fecundity increases with weight gain; it is characteristic of species and populations, and is also influenced by individual age, body length, body weight, environmental conditions (especially nutritional conditions), breeding habits, spawning frequency, and egg size. This series of studies of the biological characteristics of crucian carp provides an ecological basis for local crucian carp feeding and breeding

  19. [Quantitative and qualitative research methods, can they coexist yet?].

    PubMed

    Hunt, Elena; Lavoie, Anne-Marise

    2011-06-01

    Qualitative design is gaining ground in nursing research. In spite of this relative progress, however, the evidence-based practice movement continues to dominate and to underline the exclusive value of quantitative design (particularly that of randomized clinical trials) for clinical decision making. In the current context, convenient to those in power making utilitarian decisions on one hand, and facing nursing criticism of the establishment in favor of qualitative research on the other, it is difficult to choose a practical and ethical path that values the nursing role within the health care system, keeps us committed to quality care, and maintains the researcher's integrity. Both qualitative and quantitative methods have advantages and disadvantages, and clearly, neither of them can, by itself, capture, describe, and explain reality adequately. Therefore, a balance between the two methods is needed. Researchers bear responsibility to society and science, and they should opt for the design best suited to answering the research question, rather than promote the design favored by research funding distributors.

  20. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
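    The FOSM propagation described above can be sketched in a few lines: output covariance is the input covariance pushed through the model's sensitivity matrix, C_y = J C_x J^T. The following is a minimal numpy illustration with invented sensitivities and variances, not the MODFLOW-2000 workflow itself.

```python
import numpy as np

def fosm_output_covariance(jacobian, input_cov):
    """First-Order Second Moment (FOSM): propagate the covariance of the
    input parameters through the model sensitivities (Jacobian) to
    obtain the covariance of the model outputs: C_y = J @ C_x @ J.T."""
    J = np.asarray(jacobian, dtype=float)
    Cx = np.asarray(input_cov, dtype=float)
    return J @ Cx @ J.T

# Hypothetical example: piezometric heads at two locations depending on
# two hydraulic-conductivity zones (sensitivities and variances invented)
J = np.array([[0.8, 0.1],
              [0.2, 0.9]])       # dh_i / dK_j
Cx = np.diag([0.04, 0.09])       # input (conductivity) variances
Cy = fosm_output_covariance(J, Cx)
head_variances = np.diag(Cy)     # per-location head variance
```

    In a QDE loop, the location with the largest entry of `head_variances` would be the next candidate for sampling.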

  1. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, amongst others, resulting in a ranking of tasks and subsequently jobs. DREAM consists of an inventory and evaluation part. Two examples of dermal exposure of workers of a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outside clothing layer as well as on skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative, dermal exposure assessment.

  2. Advanced forensic validation for human spermatozoa identification using SPERM HY-LITER™ Express with quantitative image analysis.

    PubMed

    Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko

    2017-01-19

    Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful to detect human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, because of a lack of evaluation methods, these studies provided neither objective, quantitative descriptions of the results nor clear criteria for the decisions reached. In addition, the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. This method enabled objective and specific identification of fluorescent sperm spots and quantitative comparisons of sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated under high-humidity and high-temperature conditions. Semen with very low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases for application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. The validations herein provide practical information about the SPERM HY-LITER™ Express kit that was previously unobtainable. Moreover, the versatility of our image analysis method in various complex cases was demonstrated.

  3. Novel method for the quantitative measurement of color vision deficiencies

    NASA Astrophysics Data System (ADS)

    Xiong, Kai; Hou, Minxian; Ye, Guanrong

    2005-01-01

    The method is based on chromatic visual evoked potential (VEP) measurement. In normal subjects, the equiluminance of a color stimulus is characterized by L-cone and M-cone activation in the retina; for deuteranopes and protanopes, only the activation of the single relevant remaining cone type need be considered. An equiluminance tuning curve was established from the VEPs recorded as the luminances of the red and green color stimuli were varied, and the position of the equiluminance point was used to define the kind and degree of color vision deficiency. In a test of 47 volunteers, the VEP traces and equiluminance tuning curves were in accordance with the judgments of the pseudoisochromatic plates used in the clinic. The method meets the requirements of an objective and quantitative test of color vision deficiencies.

  4. In vivo osteogenesis assay: a rapid method for quantitative analysis.

    PubMed

    Dennis, J E; Konstantakos, E K; Arm, D; Caplan, A I

    1998-08-01

    A quantitative in vivo osteogenesis assay is a useful tool for the analysis of cells and bioactive factors that affect the amount or rate of bone formation. There are currently two assays in general use for the in vivo assessment of osteogenesis by isolated cells: diffusion chambers and porous calcium phosphate ceramics. Due to the relative ease of specimen preparation and reproducibility of results, the porous ceramic assay was chosen for the development of a rapid method for quantitating in vivo bone formation. The ceramic cube implantation technique consists of combining osteogenic cells with 27-mm3 porous calcium phosphate ceramics, implanting the cell-ceramic composites subcutaneously into an immuno-tolerant host, and, after 2-6 weeks, harvesting and preparing the ceramic implants for histologic analysis. A drawback to the analysis of bone formation within these porous ceramics is that the entire cube must be examined to find small foci of bone present in some samples; a single cross-sectional area is not representative. For this reason, image analysis of serial sections from ceramics is often prohibitively time-consuming. Two alternative scoring methodologies were tested and compared to bone volume measurements obtained by image analysis. The two subjective scoring methods were: (1) Bone Scale: the amount of bone within pores of the ceramic implant is estimated on a scale of 0-4 based on the degree of bone fill (0=no bone, 1=up to 25%, 2=25 to 75%, 4=75 to 100% fill); and (2) Percentage Bone: the amount of bone is estimated by determining the percentage of ceramic pores which contain bone. Every tenth section of serially sectioned cubes was scored by each of these methods under double-blind conditions, and the Bone Scale and Percentage Bone results were directly compared to image analysis measurements from identical samples. Correlation coefficients indicate that the Percentage Bone method was more accurate than the Bone Scale scoring method. 
The Bone Scale

  5. Quantitative Methods in the Study of Local History

    ERIC Educational Resources Information Center

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment roles, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  6. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analytical tools for aberrant glycoproteins is very important both for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes glycosylation-targeted enrichment technologies, mainly solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable-isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with reference to recently published studies. © 2014 The Authors. Mass Spectrometry Reviews published by Wiley Periodicals, Inc. Mass Spectrom Rev 34:148–165, 2015. PMID:24889823

  7. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula: N = Bt x 100 / At where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples.
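    The percentage formula above, N = Bt × 100 / At, translates directly into code. The areas below are hypothetical values in arbitrary AutoCAD drawing units, used only to show the calculation.

```python
def bleeding_percentage(bleeding_areas, total_areas):
    """N = Bt * 100 / At, as defined in the abstract:
    Bt = sum of all surfaces affected by bleeding,
    At = sum of the surfaces of all tissue samples."""
    Bt = sum(bleeding_areas)
    At = sum(total_areas)
    if At <= 0:
        raise ValueError("total sample area must be positive")
    return Bt * 100.0 / At

# Three hypothetical samples: per-sample bleeding areas (B) and total areas (A)
N = bleeding_percentage([1.2, 0.5, 0.8], [10.0, 8.0, 7.0])  # -> 10.0
```
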

  8. Ongoing advances in quantitative PpIX fluorescence guided intracranial tumor resection (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Olson, Jonathan D.; Kanick, Stephen C.; Bravo, Jaime J.; Roberts, David W.; Paulsen, Keith D.

    2016-03-01

    Aminolevulinic-acid-induced protoporphyrin IX (ALA-PpIX) is being investigated as a biomarker to guide neurosurgical resection of brain tumors. ALA-PpIX fluorescence can be observed visually in the surgical field; however, raw fluorescence emissions can be distorted by factors other than the fluorophore concentration. Specifically, fluorescence emissions are mixed with autofluorescence and attenuated by the background absorption and scattering properties of the tissue. Recent work at Dartmouth has developed advanced fluorescence detection approaches that return quantitative assessments of PpIX concentration which are independent of background optical properties. The quantitative fluorescence imaging (qFI) approach has increased sensitivity to residual disease within the resection cavity at the end of surgery that was not visible to the naked eye through the operating microscope. This presentation outlines clinical observations made during an ongoing investigation of ALA-PpIX-based guidance of tumor resection. PpIX fluorescence measurements made with a wide-field hyperspectral imaging approach are co-registered with point assessments using a fiber-optic probe. Data show variations in measured PpIX accumulation among different clinical tumor grades (i.e., high-grade glioma, low-grade glioma), types (i.e., primary tumors, metastases) and normal structures of interest (e.g., normal cortex, hippocampus). These results highlight the contrast enhancement and underscore the potential clinical benefit offered by quantitative measurements of PpIX concentration during resection of intracranial tumors.

  9. Advances in liquid chromatography-high-resolution mass spectrometry for quantitative and qualitative environmental analysis.

    PubMed

    Aceña, Jaume; Stampachiacchiere, Serena; Pérez, Sandra; Barceló, Damià

    2015-08-01

    This review summarizes the advances in environmental analysis by liquid chromatography-high-resolution mass spectrometry (LC-HRMS) during the last decade and discusses different aspects of its application. LC-HRMS has become a powerful tool for simultaneous quantitative and qualitative analysis of organic pollutants, enabling their quantitation and the search for metabolites and transformation products, or the detection of unknown compounds. LC-HRMS provides more information per sample than low-resolution (LR) MS because it can accurately determine the mass of the molecular ion and, when tandem MS (MS-MS) is available, of its fragment ions. Another advantage is that the data can be processed using target analysis, suspect screening, retrospective analysis, or non-target screening. With the growing popularity and acceptance of HRMS analysis, current guidelines for compound confirmation need to be revised for quantitative and qualitative purposes. Furthermore, new commercial software and user-built libraries are required to mine data in an efficient and comprehensive way. The scope of this critical review is not to provide a comprehensive overview of the many studies performed with LC-HRMS in the field of environmental analysis, but to reveal its advantages and limitations using different workflows.
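    Accurate mass determination, central to the HRMS confirmation criteria discussed above, is usually expressed as a mass error in parts per million. A small helper makes the convention explicit; the m/z values in the example are illustrative, not from the review.

```python
def mass_error_ppm(measured_mz, theoretical_mz):
    """Mass error in parts-per-million, the usual HRMS acceptance metric:
    ppm = (m_measured - m_theoretical) / m_theoretical * 1e6."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Illustrative values: a measured ion ~0.0009 Da above its theoretical m/z
err = mass_error_ppm(294.0103, 294.0094)   # roughly +3 ppm
```

    Typical confirmation guidelines accept identifications only below a small ppm threshold (commonly on the order of 5 ppm).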

  10. Quantitative methods in electroencephalography to access therapeutic response.

    PubMed

    Diniz, Roseane Costa; Fontenele, Andrea Martins Melo; Carmo, Luiza Helena Araújo do; Ribeiro, Aurea Celeste da Costa; Sales, Fábio Henrique Silva; Monteiro, Sally Cristina Moutinho; Sousa, Ana Karoline Ferreira de Castro

    2016-07-01

    Pharmacometrics, or quantitative pharmacology, aims to analyze quantitatively the interaction between drugs and patients, resting on the tripod of pharmacokinetics, pharmacodynamics and disease monitoring to identify variability in drug response. As a subject of central interest in the training of pharmacists, this work was carried out with a view to promoting methods to assess the therapeutic response to centrally acting drugs. This paper discusses quantitative methods (Fast Fourier Transform, magnitude squared coherence, conditional entropy, generalised linear semi-canonical correlation analysis, statistical parametric network and mutual information function) used to evaluate EEG signals obtained after drug administration regimens, together with the main findings and their clinical relevance, as a contribution to the construction of a different pharmaceutical practice. Peter Anderer et al. in 2000 showed the effect of 20 mg of buspirone in 20 healthy subjects at 1, 2, 4, 6 and 8 h after oral ingestion of the drug; the areas of increased power in the theta frequency band occurred mainly in the temporo-occipito-parietal region. Sampaio et al. (2007) showed that the use of bromazepam, which enhances the effect of GABA (gamma-aminobutyric acid), an inhibitory neurotransmitter of the central nervous system, could theoretically promote dissociation of cortical functional areas, a decrease of functional connectivity, and a decrease of cognitive function, reflected in smaller values of coherence (an electrophysiological magnitude computed from the EEG by software). Ahmad Khodayari-Rostamabad et al. in 2015 suggested that such a measure could potentially be a useful clinical tool to assess adverse effects of opioids and hence give rise to treatment guidelines; a relation was found between changes in pain intensity and brain sources (at maximum-activity locations) during remifentanil infusion despite its potent analgesic effect. The statement of mathematical and computational
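    The first measure in the list, the Fast Fourier Transform, is typically used to quantify band-power changes such as the theta increase reported after buspirone. A minimal numpy sketch on synthetic data (not the cited studies' pipelines) shows the idea:

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    """Power of a signal in the [f_lo, f_hi] Hz band via the FFT,
    the most basic of the quantitative EEG measures named above."""
    n = len(x)
    spectrum = np.fft.rfft(x * np.hanning(n))   # Hann window reduces leakage
    power = np.abs(spectrum) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return power[band].sum()

# Synthetic "EEG": a 6 Hz theta oscillation plus noise, 4 s at 250 Hz
rng = np.random.default_rng(42)
fs = 250
t = np.arange(4 * fs) / fs
eeg = np.sin(2 * np.pi * 6 * t) + 0.1 * rng.standard_normal(t.size)
theta = band_power(eeg, fs, 4.0, 8.0)    # band containing the oscillation
alpha = band_power(eeg, fs, 8.0, 13.0)   # noise only, much smaller
```
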

  11. A simple regression-based method to map quantitative trait loci underlying function-valued phenotypes.

    PubMed

    Kwak, Il-Youp; Moore, Candace R; Spalding, Edgar P; Broman, Karl W

    2014-08-01

    Most statistical methods for quantitative trait loci (QTL) mapping focus on a single phenotype. However, multiple phenotypes are commonly measured, and recent technological advances have greatly simplified the automated acquisition of numerous phenotypes, including function-valued phenotypes, such as growth measured over time. While methods exist for QTL mapping with function-valued phenotypes, they are generally computationally intensive and focus on single-QTL models. We propose two simple, fast methods that maintain high power and precision and are amenable to extensions with multiple-QTL models using a penalized likelihood approach. After identifying multiple QTL by these approaches, we can view the function-valued QTL effects to provide a deeper understanding of the underlying processes. Our methods have been implemented as a package for R, funqtl.
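    A minimal illustration of regression-based single-QTL scanning for a scalar phenotype (not the authors' funqtl package, which extends this to function-valued phenotypes and penalized multiple-QTL models): at each marker, regress the phenotype on genotype and convert the fit to a LOD score.

```python
import numpy as np

def lod_scores(genotypes, phenotype):
    """Single-QTL regression scan. For each marker, regress the phenotype
    on genotype and compute LOD = (n/2) * log10(RSS0 / RSS1), where RSS0
    is the residual sum of squares of the null (mean-only) model."""
    y = np.asarray(phenotype, dtype=float)
    n = len(y)
    rss0 = np.sum((y - y.mean()) ** 2)
    scores = []
    for g in np.asarray(genotypes, dtype=float).T:   # markers in columns
        X = np.column_stack([np.ones(n), g])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss1 = np.sum((y - X @ beta) ** 2)
        scores.append((n / 2.0) * np.log10(rss0 / rss1))
    return np.array(scores)

# Simulated backcross: 200 individuals, 3 markers, marker 1 is the QTL
rng = np.random.default_rng(0)
G = rng.integers(0, 2, size=(200, 3)).astype(float)
y = 1.0 * G[:, 1] + rng.normal(0.0, 0.5, 200)
scores = lod_scores(G, y)   # peak expected at marker 1
```
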

  12. Advanced Source Deconvolution Methods for Compton Telescopes

    NASA Astrophysics Data System (ADS)

    Zoglauer, Andreas

    The next generation of space telescopes utilizing Compton scattering for astrophysical observations is destined to one day unravel the mysteries behind Galactic nucleosynthesis, to determine the origin of the positron annihilation excess near the Galactic center, and to uncover the hidden emission mechanisms behind gamma-ray bursts. Besides astrophysics, Compton telescopes are establishing themselves in heliophysics, planetary sciences, medical imaging, accelerator physics, and environmental monitoring. Since the COMPTEL days, great advances in the achievable energy and position resolution have been made, creating an extremely vast, but also extremely sparsely sampled, data space. Unfortunately, the optimum way to analyze the data from the next generation of Compton telescopes has not yet been found: one which retrieves all source parameters (location, spectrum, polarization, flux) and achieves the best possible resolution and sensitivity at the same time. This is especially important for all science objectives looking at the inner Galaxy: the large number of expected sources, the high background (internal and Galactic diffuse emission), and the limited angular resolution make it the most taxing case for data analysis. In general, two key challenges exist. First, what are the best data-space representations to answer the specific science questions? Second, what is the best way to deconvolve the data to fully retrieve the source parameters? For modern Compton telescopes, the existing data-space representations can either correctly reconstruct the absolute flux (binned mode) or achieve the best possible resolution (list mode); both together were not possible up to now. Here we propose to develop a two-stage hybrid reconstruction method which combines the best aspects of both.
    Using a proof-of-concept implementation we can for the first time show that it is possible to alternate during each deconvolution step between a binned-mode approach to get the flux right and a

  13. A Quantitative Vainberg Method for Black Box Scattering

    NASA Astrophysics Data System (ADS)

    Galkowski, Jeffrey

    2017-01-01

    We give a quantitative version of Vainberg's method relating pole-free regions to propagation of singularities for black box scatterers. In particular, we show that there is a logarithmic resonance-free region near the real axis of size τ with polynomial bounds on the resolvent if and only if the wave propagator gains derivatives at rate τ. Next we show that if there exist singularities in the wave trace at times tending to infinity which smooth at rate τ, then there are resonances in logarithmic strips whose width is given by τ. As our main application of these results, we give sharp bounds on the size of resonance-free regions in scattering on geometrically nontrapping manifolds with conic points. Moreover, these bounds are generically optimal on exteriors of nontrapping polygonal domains.

  14. Methods for Quantitative Interpretation of Retarding Field Analyzer Data

    SciTech Connect

    Calvey, J.R.; Crittenden, J.A.; Dugan, G.F.; Palmer, M.A.; Furman, M.; Harkay, K.

    2011-03-28

    Over the course of the CesrTA program at Cornell, over 30 Retarding Field Analyzers (RFAs) have been installed in the CESR storage ring, and a great deal of data has been taken with them. These devices measure the local electron cloud density and energy distribution, and can be used to evaluate the efficacy of different cloud mitigation techniques. Obtaining a quantitative understanding of RFA data requires use of cloud simulation programs, as well as a detailed model of the detector itself. In a drift region, the RFA can be modeled by postprocessing the output of a simulation code, and one can obtain best fit values for important simulation parameters with a chi-square minimization method.
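    The chi-square minimization step described above can be sketched generically: scan candidate values of a simulation parameter and keep the one minimizing chi-square against the measured signal. The model function and parameter grid below are invented stand-ins for the cloud-simulation postprocessing, shown only to illustrate the fitting procedure.

```python
import numpy as np

def chi_square(model, data, sigma):
    """Chi-square misfit between model prediction and measured data."""
    return np.sum(((data - model) / sigma) ** 2)

def best_fit(param_grid, model_fn, data, sigma):
    """Scan a 1-D grid of candidate simulation-parameter values and
    return the value that minimizes chi-square against the data."""
    chis = [chi_square(model_fn(p), data, sigma) for p in param_grid]
    return param_grid[int(np.argmin(chis))]

# Toy example: the "detector response" scales a fixed template by p;
# synthetic data are generated with true p = 2.0 plus noise
template = np.linspace(1.0, 10.0, 50)
rng = np.random.default_rng(1)
data = 2.0 * template + rng.normal(0.0, 0.1, 50)
grid = np.linspace(0.5, 4.0, 71)    # candidate parameter values, step 0.05
p_hat = best_fit(grid, lambda p: p * template, data, 0.1)
```
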

  15. A quantitative dimming method for LED based on PWM

    NASA Astrophysics Data System (ADS)

    Wang, Jiyong; Mou, Tongsheng; Wang, Jianping; Tian, Xiaoqing

    2012-10-01

    Traditional light sources were required to provide stable and uniform illumination for living and working environments, a requirement grounded in the performance of human visual function. That requirement seemed sufficient until the non-visual functions of the photosensitive ganglion cells in the retina were discovered. A new generation of lighting technology is now emerging, based on novel sources such as LEDs and on the photobiological effects of light on human physiology and behavior. To realize dynamic LED lighting whose intensity and color are adjustable to photobiological needs, a quantitative dimming method based on pulse width modulation (PWM) and light-mixing technology is presented. Starting from two-channel PWM, the paper demonstrates the determinacy and the limitation of PWM dimming for realizing expected photometric and colorimetric quantities (EPCQ), based on an analysis of geometrical, photometric, colorimetric and electrodynamic constraints. A quantitative model mapping the EPCQ onto duty cycles is then established. The deduced model suggests that the determinacy is unique to two-channel and three-channel PWM, whereas the limitation is an inevitable commonness of any number of channels. To examine the model, a light-mixing experiment with two kinds of white LED simulated the variation of illuminance and correlated color temperature (CCT) from dawn to midday. Mean deviations between theoretical and measured values were 15 lx and 23 K, respectively. The results show that this method can effectively realize a light spectrum with specific EPCQ requirements, and provides a theoretical basis and a practical way for dynamic LED lighting.
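    The core of the two-channel model can be sketched as follows: photometric quantities mix linearly in duty cycle, so for two channels the duty cycles solving M·d = target are unique (the determinacy), and the requirement 0 ≤ d ≤ 1 exposes the gamut limitation. All channel numbers below are invented; a real EPCQ target would use measured photometric and colorimetric quantities.

```python
import numpy as np

def duty_cycles(channel_outputs, target):
    """Two-channel PWM dimming: solve M @ d = target for duty cycles d.
    The solution is unique for a nonsingular 2x2 M (determinacy), but it
    is realizable only if both duty cycles land in [0, 1] (limitation)."""
    M = np.asarray(channel_outputs, dtype=float)  # columns = full-on outputs
    d = np.linalg.solve(M, np.asarray(target, dtype=float))
    if np.any(d < 0.0) or np.any(d > 1.0):
        raise ValueError("target outside the gamut of the two channels")
    return d

# Hypothetical warm/cool white channels; rows are two measured quantities
# (e.g. illuminance in lx and a chromaticity-weighted quantity)
M = np.array([[500.0, 700.0],
              [230.0, 210.0]])
d = duty_cycles(M, target=[600.0, 220.0])   # -> duty cycles [0.5, 0.5]
```

    For more than three channels the system becomes underdetermined, matching the abstract's observation that determinacy is lost for multiple channels.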

  16. Advanced electromagnetic methods for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; Sun, Weimin; El-Sharawy, El-Budawy; Aberle, James T.; Birtcher, Craig R.; Peng, Jian; Tirkas, Panayiotis A.

    1992-01-01

    The Advanced Helicopter Electromagnetics (AHE) Industrial Associates Program continues its research on a variety of main topics identified and recommended by the program's Advisory Task Force. The research activities center on issues that advance technology related to helicopter electromagnetics. While most of the topics are a continuation of previous work, special effort has been focused on some areas in response to recommendations from the last annual conference. The main topics addressed in this report are composite materials and antenna technology. The area of composite materials again received special attention in this period. The research has focused on: (1) measurements of the electrical properties of low-conductivity materials; (2) modeling of material discontinuities and their effects on scattering patterns; (3) preliminary analysis of the interaction of electromagnetic fields with multi-layered graphite fiberglass plates; and (4) finite-difference time-domain (FDTD) modeling of field penetration through composite panels of a helicopter.

  17. Controlling template erosion with advanced cleaning methods

    NASA Astrophysics Data System (ADS)

    Singh, SherJang; Yu, Zhaoning; Wähler, Tobias; Kurataka, Nobuo; Gauzner, Gene; Wang, Hongying; Yang, Henry; Hsu, Yautzong; Lee, Kim; Kuo, David; Dress, Peter

    2012-03-01

    We studied the erosion and feature stability of fused-silica patterns under different template-cleaning conditions. The conventional SPM clean was compared with an advanced non-acid process. Spectroscopic-ellipsometry optical critical dimension (SE-OCD) measurements were used to characterize changes in the pattern profile with good sensitivity. This study confirmed erosion of the silica patterns in the traditional acid-based SPM cleaning mixture (H2SO4 + H2O2) at a rate of ~0.1 nm per cleaning cycle, whereas the advanced non-acid process showed a CD shift of only ~0.01 nm per clean. Contamination removal and pattern integrity of sensitive 20 nm features under megasonic-assisted cleaning are also demonstrated.

  18. A quantitative assessment method for Ascaris eggs on hands.

    PubMed

    Jeandron, Aurelie; Ensink, Jeroen H J; Thamsborg, Stig M; Dalsgaard, Anders; Sengupta, Mita E

    2014-01-01

    The importance of hands in the transmission of soil-transmitted helminths, especially Ascaris and Trichuris infections, is under-researched, partly because of the absence of a reliable method to quantify the number of eggs on hands. The aim of this study was therefore to develop a method to assess the number of Ascaris eggs on hands and to determine the method's egg recovery rate. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried, and washed in a plastic bag retaining the washing water, in order to determine egg recovery rates for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride (CPC) 0.1%], anionic [7X 1%: quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt], and non-ionic [Tween 80 0.1%: polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from the bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed higher egg recovery rates and were then compared with de-ionized water for the recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%; washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method, performed with a low concentration of detergent, offers potential for quantitative investigation of the contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed to validate the hand-washing method under field conditions, e.g. including people of different ages, lower levels of contamination, and various levels of hand cleanliness.
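    The recovery rate reported above is a simple ratio of recovered to seeded eggs. A one-line helper makes the calculation explicit; the egg counts in the example are hypothetical, chosen only to reproduce the 95.6% figure.

```python
def recovery_rate(eggs_recovered, eggs_seeded):
    """Percentage of seeded eggs recovered by a washing protocol."""
    if eggs_seeded <= 0:
        raise ValueError("eggs_seeded must be positive")
    return 100.0 * eggs_recovered / eggs_seeded

# Hypothetical counts: 478 of 500 seeded eggs recovered -> 95.6%
rate_7x = recovery_rate(478, 500)
```
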

  19. Validation of quantitative and qualitative methods for detecting allergenic ingredients in processed foods in Japan.

    PubMed

    Sakai, Shinobu; Adachi, Reiko; Akiyama, Hiroshi; Teshima, Reiko

    2013-06-19

    A labeling system for food allergenic ingredients was established in Japan in April 2002. To monitor the labeling, the Japanese government announced official methods for detecting allergens in processed foods in November 2002. The official methods consist of quantitative screening tests using enzyme-linked immunosorbent assays (ELISAs) and qualitative confirmation tests using Western blotting or polymerase chain reactions (PCR). In addition, the Japanese government designated 10 μg protein/g food (the corresponding allergenic ingredient soluble protein weight/food weight), determined by ELISA, as the labeling threshold. To standardize the official methods, the criteria for the validation protocol were described in the official guidelines. This paper, which was presented at the Advances in Food Allergen Detection Symposium, ACS National Meeting and Expo, San Diego, CA, Spring 2012, describes the validation protocol outlined in the official Japanese guidelines, the results of interlaboratory studies for the quantitative detection method (ELISA for crustacean proteins) and the qualitative detection method (PCR for shrimp and crab DNAs), and the reliability of the detection methods.

  20. Advanced methods of structural and trajectory analysis for transport aircraft

    NASA Technical Reports Server (NTRS)

    Ardema, Mark D.

    1995-01-01

    This report summarizes the efforts in two areas: (1) development of advanced methods of structural weight estimation, and (2) development of advanced methods of trajectory optimization. The majority of the effort was spent in the structural weight area. A draft of 'Analytical Fuselage and Wing Weight Estimation of Transport Aircraft', resulting from this research, is included as an appendix.

  1. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  2. Advanced electromagnetic methods for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; El-Sharawy, El-Budawy; Hashemi-Yeganeh, Shahrokh; Aberle, James T.; Birtcher, Craig R.

    1991-01-01

    The Advanced Helicopter Electromagnetics program is centered on issues that advance technology related to helicopter electromagnetics. Progress was made on three major topics: composite materials; precipitation static corona discharge; and antenna technology. In composite materials, the research has focused on the measurement of their electrical properties and the modeling of material discontinuities and their effect on the radiation pattern of antennas mounted on or near material surfaces. The electrical properties were used to model antenna performance when mounted on composite materials. Since helicopter platforms include several antenna systems at VHF and UHF bands, measuring techniques are being explored that can be used to measure the properties at these bands. The effort on corona discharge and precipitation static was directed toward the development of a new two-dimensional Voltage Finite Difference Time Domain computer program. Results indicate the feasibility of using potentials for simulating electromagnetic problems in cases where potentials become primary sources. In antenna technology, the focus was on Polarization Diverse Conformal Microstrip Antennas, Cavity Backed Slot Antennas, and Varactor Tuned Circular Patch Antennas. Numerical codes were developed for the analysis of two probe-fed rectangular and circular microstrip patch antennas fed by resistive and reactive power divider networks.

  3. Breast tumour visualization using 3D quantitative ultrasound methods

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Raheem, Abdul; Tadayyon, Hadi; Liu, Simon; Hadizad, Farnoosh; Czarnota, Gregory J.

    2016-04-01

    Breast cancer is one of the most common cancer types, accounting for 29% of all cancer cases. Early detection and treatment have a crucial impact on improving the survival of affected patients. Ultrasound (US) is a non-ionizing, portable, inexpensive, and real-time imaging modality for screening and quantifying breast cancer. Due to these attractive attributes, the last decade has witnessed many studies on using quantitative ultrasound (QUS) methods in tissue characterization. However, these studies have mainly been limited to 2-D QUS methods using hand-held US (HHUS) scanners. With the availability of automated breast ultrasound (ABUS) technology, this study is the first to develop 3-D QUS methods for the ABUS visualization of breast tumours. Using an ABUS system, unlike the manual 2-D HHUS device, the patient's whole breast was scanned in an automated manner. The acquired frames were subsequently examined and a region of interest (ROI) was selected in each frame where tumour was identified. Standard 2-D QUS methods were used to compute spectral and backscatter coefficient (BSC) parametric maps on the selected ROIs. Next, the computed 2-D parameters were mapped to a Cartesian 3-D space, interpolated, and rendered to provide a transparent color-coded visualization of the entire breast tumour. Such 3-D visualization can potentially be used for further analysis of breast tumours in terms of their size and extension. Moreover, the 3-D volumetric scans can be used for tissue characterization and the categorization of breast tumours as benign or malignant by quantifying the computed parametric maps over the whole tumour volume.
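
The map-interpolate-render pipeline described above can be sketched with SciPy; the scattered ROI positions and parameter values below are synthetic stand-ins for the computed 2-D QUS parametric maps, not data from the study:

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical per-frame QUS results: each ABUS frame contributes (x, y, z)
# ROI locations with a backscatter-type parameter value; frames stack in 3-D.
rng = np.random.default_rng(3)
pts = rng.uniform(0, 1, size=(200, 3))            # scattered ROI positions
vals = np.sin(2 * np.pi * pts[:, 0]) + pts[:, 1]  # stand-in parameter values

# Interpolate the scattered 2-D results onto a regular 3-D grid, suitable
# for transparent, colour-coded volumetric rendering of the tumour.
gx, gy, gz = np.mgrid[0:1:20j, 0:1:20j, 0:1:20j]
volume = griddata(pts, vals, (gx, gy, gz), method='linear')
print(volume.shape)   # (20, 20, 20); NaN outside the convex hull of the ROIs
```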

  4. Striking against bioterrorism with advanced proteomics and reference methods.

    PubMed

    Armengaud, Jean

    2017-01-01

    The intentional use by terrorists of biological toxins as weapons has been of great concern for many years. Among the numerous toxins produced by plants, animals, algae, fungi, and bacteria, ricin is one of the most scrutinized by the media because it has already been used in biocrimes and acts of bioterrorism. Improving the analytical toolbox of national authorities to monitor these potential bioweapons all at once is of the utmost interest. MS/MS allows their absolute quantitation and exhibits advantageous sensitivity, discriminative power, multiplexing possibilities, and speed. In this issue of Proteomics, Gilquin et al. (Proteomics 2017, 17, 1600357) present a robust multiplex assay to quantify a set of eight toxins in the presence of a complex food matrix. This MS/MS reference method is based on scheduled SRM and high-quality standards consisting of isotopically labeled versions of these toxins. Their results demonstrate robust reliability based on rather loose scheduling of SRM transitions and good sensitivity for the eight toxins, below their oral median lethal doses. In the face of an increased threat from terrorism, relevant reference assays based on advanced proteomics and high-quality companion toxin standards provide reliable and firm answers.

  5. Quantitative Methods for Comparing Different Polyline Stream Network Models

    SciTech Connect

    Danny L. Anderson; Daniel P. Ames; Ping Yang

    2014-04-01

    Two techniques for exploring the relative horizontal accuracy of complex linear spatial features are described, and sample source code (pseudo code) is presented for this purpose. The first technique, relative sinuosity, is presented as a measure of the complexity or detail of a polyline network in comparison to a reference network. We term the second technique longitudinal root mean squared error (LRMSE) and present it as a means for quantitatively assessing the horizontal variance between two polyline data sets representing digitized (reference) and derived stream and river networks. Both relative sinuosity and LRMSE are shown to be suitable measures of horizontal stream network accuracy for assessing quality and variation in linear features. Both techniques have been used in two recent investigations involving extraction of hydrographic features from LiDAR elevation data. One confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes yielded better stream network delineations, based on sinuosity and LRMSE, when using LiDAR-derived DEMs. The other demonstrated a new method of delineating stream channels directly from LiDAR point clouds, without the intermediate step of deriving a DEM, showing that direct delineation from LiDAR point clouds yielded a markedly better match, as indicated by the LRMSE.
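
The sinuosity measure above can be sketched in a few lines; the sample polylines and the ratio-to-reference formulation are illustrative assumptions, not the authors' published code:

```python
import math

def polyline_length(points):
    """Total along-path length of a polyline given as (x, y) vertex pairs."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def sinuosity(points):
    """Ratio of along-path length to straight-line endpoint distance."""
    return polyline_length(points) / math.dist(points[0], points[-1])

def relative_sinuosity(derived, reference):
    """Sinuosity of a derived stream trace relative to a reference trace."""
    return sinuosity(derived) / sinuosity(reference)

# A zigzag derived stream trace versus a straight reference (hypothetical data).
reference = [(0, 0), (4, 0)]
derived = [(0, 0), (1, 1), (2, 0), (3, 1), (4, 0)]
print(round(sinuosity(derived), 3))                  # 1.414
print(round(relative_sinuosity(derived, reference), 3))  # 1.414
```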

  6. Advanced particulate matter control apparatus and methods

    DOEpatents

    Miller, Stanley J [Grand Forks, ND]; Zhuang, Ye [Grand Forks, ND]; Almlie, Jay C [East Grand Forks, MN]

    2012-01-10

    Apparatus and methods for collection and removal of particulate matter, including fine particulate matter, from a gas stream, comprising a unique combination of high collection efficiency and ultralow pressure drop across the filter. The apparatus and method utilize simultaneous electrostatic precipitation and membrane filtration of a particular pore size, wherein electrostatic collection and filtration occur on the same surface.

  7. Advanced reliability methods for structural evaluation

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.; Wu, Y.-T.

    1985-01-01

    Fast probability integration (FPI) methods, which can yield approximate solutions to such general structural reliability problems as the computation of the probabilities of complicated functions of random variables, are known to require one-tenth the computer time of Monte Carlo methods for a probability level of 0.001; lower probabilities yield even more dramatic differences. A strategy is presented in which a computer routine is run k times with selected perturbed values of the variables to obtain k solutions for a response variable Y. An approximating polynomial is fit to the k 'data' sets, and FPI methods are employed for this explicit form.
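
The k-run response-surface strategy described above can be sketched as follows; the stand-in model, the cubic surrogate, and the plain Monte Carlo step (used here in place of a true FPI algorithm) are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(x):
    # Stand-in for a costly structural response code (hypothetical).
    return np.exp(0.5 * x) + 0.1 * x**2

# Step 1: run the code k times at selected perturbed values of the variable.
k = 7
x_pts = np.linspace(-3, 3, k)
y_pts = expensive_model(x_pts)

# Step 2: fit an approximating polynomial to the k "data" sets.
coeffs = np.polyfit(x_pts, y_pts, deg=3)
surrogate = np.poly1d(coeffs)

# Step 3: with an explicit polynomial in hand, probability integration is
# cheap; plain Monte Carlo on the surrogate stands in for FPI here.
x_samples = rng.standard_normal(100_000)
p_fail = np.mean(surrogate(x_samples) > 3.0)
print(p_fail)
```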

  8. Indentation Methods in Advanced Materials Research Introduction

    SciTech Connect

    Pharr, George Mathews; Cheng, Yang-Tse; Hutchings, Ian; Sakai, Mototsugu; Moody, Neville; Sundararajan, G.; Swain, Michael V.

    2009-01-01

    Since its commercialization early in the 20th century, indentation testing has played a key role in the development of new materials and the understanding of their mechanical behavior. Progress in the field has relied on a close marriage between research in the mechanical behavior of materials and contact mechanics. The seminal work of Hertz laid the foundations for bringing these two together, and his contributions are still widely utilized today in examining elastic behavior and the physics of fracture. Later, the pioneering work of Tabor, as published in his classic text 'The Hardness of Metals', expanded this understanding to address the complexities of plasticity. Enormous progress in the field has been achieved in the last decade, made possible both by advances in instrumentation, for example, load- and depth-sensing indentation and scanning electron microscopy (SEM) and transmission electron microscopy (TEM) based in situ testing, and by improved modeling capabilities that use computationally intensive techniques such as finite element analysis and molecular dynamics simulation. The purpose of this special focus issue is to present recent state-of-the-art developments in the field.

  9. Advanced spectral methods for climatic time series

    USGS Publications Warehouse

    Ghil, M.; Allen, M.R.; Dettinger, M.D.; Ide, K.; Kondrashov, D.; Mann, M.E.; Robertson, A.W.; Saunders, A.; Tian, Y.; Varadi, F.; Yiou, P.

    2002-01-01

    The analysis of univariate or multivariate time series provides crucial information to describe, understand, and predict climatic variability. The discovery and implementation of a number of novel methods for extracting useful information from time series has recently revitalized this classical field of study. Considerable progress has also been made in interpreting the information so obtained in terms of dynamical systems theory. In this review we describe the connections between time series analysis and nonlinear dynamics, discuss signal-to-noise enhancement, and present some of the novel methods for spectral analysis. The various steps, as well as the advantages and disadvantages of these methods, are illustrated by their application to an important climatic time series, the Southern Oscillation Index. This index captures major features of interannual climate variability and is used extensively in its prediction. Regional and global sea surface temperature data sets are used to illustrate multivariate spectral methods. Open questions and further prospects conclude the review.
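
The classical periodogram, the simplest of the spectral methods surveyed, can be sketched on a synthetic index with a known interannual oscillation; the 48-month period and noise level are illustrative assumptions, not SOI data:

```python
import numpy as np

# Synthetic monthly index with a 48-month oscillation buried in noise,
# loosely mimicking interannual variability in an index such as the SOI.
rng = np.random.default_rng(2)
n = 576                              # 48 years of monthly values
t = np.arange(n)
series = np.sin(2 * np.pi * t / 48) + 0.8 * rng.standard_normal(n)

# Classical periodogram: squared FFT amplitudes at the positive frequencies.
series = series - series.mean()
spec = np.abs(np.fft.rfft(series))**2 / n
freqs = np.fft.rfftfreq(n, d=1.0)    # cycles per month

peak = freqs[np.argmax(spec[1:]) + 1]
print(f"dominant period ~ {1 / peak:.0f} months")   # 48 months
```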

  10. Advances in Geometric Acoustic Propagation Modeling Methods

    NASA Astrophysics Data System (ADS)

    Blom, P. S.; Arrowsmith, S.

    2013-12-01

    Geometric acoustics provides an efficient numerical method to model propagation effects. At leading order, one can identify ensonified regions and calculate celerities of the predicted arrivals. Beyond leading order, the solution of the transport equation provides a means to estimate the amplitude of individual acoustic phases. The auxiliary parameters introduced in solving the transport equation have been found to provide a means of identifying ray paths connecting source and receiver, or eigenrays, for non-planar propagation. A detailed explanation of the eigenray method will be presented as well as an application to predicting azimuth deviations for infrasonic data recorded during the Humming Roadrunner experiment of 2012.

  11. Quantitative NDA measurements of advanced reprocessing product materials containing uranium, neptunium, plutonium, and americium

    NASA Astrophysics Data System (ADS)

    Goddard, Braden

    The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first principle methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron induced fission neutron multiplicity. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code and by one measurement campaign using the Active Well Coincidence Counter (AWCC) and two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (alpha,n) sources and actinide materials. Four potential applications of this first principle technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still has several challenges which need to be overcome, the largest of these being the challenge of having high-precision active and passive measurements to produce results with acceptably small uncertainties.

  12. Advanced method for making vitreous waste forms

    SciTech Connect

    Pope, J.M.; Harrison, D.E.

    1980-01-01

    A process is described for making waste glass that circumvents the problems of dissolving nuclear waste in molten glass at high temperatures. Because the reactive mixing process is independent of the inherent viscosity of the melt, any glass composition can be prepared with equal facility. Separation of the mixing and melting operations permits novel glass fabrication methods to be employed.

  13. Evolution of Quantitative Measures in NMR: Quantum Mechanical qHNMR Advances Chemical Standardization of a Red Clover (Trifolium pratense) Extract.

    PubMed

    Phansalkar, Rasika S; Simmler, Charlotte; Bisson, Jonathan; Chen, Shao-Nong; Lankin, David C; McAlpine, James B; Niemitz, Matthias; Pauli, Guido F

    2017-01-09

    Chemical standardization, along with morphological and DNA analysis, ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative (1)H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5-36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs (1)H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of (1)H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials.

  14. Evolution of Quantitative Measures in NMR: Quantum Mechanical qHNMR Advances Chemical Standardization of a Red Clover (Trifolium pratense) Extract

    PubMed Central

    2017-01-01

    Chemical standardization, along with morphological and DNA analysis, ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513

  15. Advanced methods in synthetic aperture radar imaging

    NASA Astrophysics Data System (ADS)

    Kragh, Thomas

    2012-02-01

    For over 50 years our world has been mapped and measured with synthetic aperture radar (SAR). A SAR system operates by transmitting a series of wideband radio-frequency pulses towards the ground and recording the resulting backscattered electromagnetic waves as the system travels along some one-dimensional trajectory. By coherently processing the recorded backscatter over this extended aperture, one can form a high-resolution 2D intensity map of the ground reflectivity, which we call a SAR image. The trajectory, or synthetic aperture, is achieved by mounting the radar on an aircraft, spacecraft, or even on the roof of a car traveling down the road, and allows for a diverse set of applications and measurement techniques for remote sensing applications. It is quite remarkable that the sub-centimeter positioning precision and sub-nanosecond timing precision required to make this work properly can in fact be achieved under such real-world, often turbulent, vibrationally intensive conditions. Although the basic principles behind SAR imaging and interferometry have been known for decades, in recent years an explosion of data exploitation techniques enabled by ever-faster computational horsepower have enabled some remarkable advances. Although SAR images are often viewed as simple intensity maps of ground reflectivity, SAR is also an exquisitely sensitive coherent imaging modality with a wealth of information buried within the phase information in the image. Some of the examples featured in this presentation will include: (1) Interferometric SAR, where by comparing the difference in phase between two SAR images one can measure subtle changes in ground topography at the wavelength scale. (2) Change detection, in which carefully geolocated images formed from two different passes are compared. (3) Multi-pass 3D SAR tomography, where multiple trajectories can be used to form 3D images. (4) Moving Target Indication (MTI), in which Doppler effects allow one to detect and

  16. Thermography as a quantitative imaging method for assessing postoperative inflammation

    PubMed Central

    Christensen, J; Matzen, LH; Vaeth, M; Schou, S; Wenzel, A

    2012-01-01

    Objective To assess differences in skin temperature between the operated and control side of the face after mandibular third molar surgery using thermography. Methods 127 patients had 1 mandibular third molar removed. Before the surgery, standardized thermograms were taken of both sides of the patient's face using a Flir ThermaCam™ E320 (Precisions Teknik AB, Halmstad, Sweden). The imaging procedure was repeated 2 days and 7 days after surgery. A region of interest including the third molar region was marked on each image. The mean temperature within each region of interest was calculated. Differences between sides and over time were assessed using paired t-tests. Results No significant difference was found between the operated side and the control side either before or 7 days after surgery (p > 0.3). The temperature of the operated side (mean: 32.39 °C, range: 28.9–35.3 °C) was higher than that of the control side (mean: 32.06 °C, range: 28.5–35.0 °C) 2 days after surgery [0.33 °C, 95% confidence interval (CI): 0.22–0.44 °C, p < 0.001]. No significant difference was found between the pre-operative and the 7-day post-operative temperature (p > 0.1). After 2 days, the operated side was not significantly different from the temperature pre-operatively (p = 0.12), whereas the control side had a lower temperature (0.57 °C, 95% CI: 0.29–0.86 °C, p < 0.001). Conclusions Thermography seems useful for quantitative assessment of inflammation between the intervention side and the control side after surgical removal of mandibular third molars. However, thermography cannot be used to assess absolute temperature changes due to normal variations in skin temperature over time. PMID:22752326
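
A paired side-to-side comparison of this kind can be sketched with SciPy; the simulated temperatures below are hypothetical stand-ins for the study's measurements, chosen only to mirror its reported means:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical mean ROI temperatures (in deg C) for 127 patients 2 days after
# surgery: the operated side runs ~0.33 deg C warmer, as in the study.
n = 127
control = rng.normal(32.06, 1.2, n)
operated = control + rng.normal(0.33, 0.6, n)

# Paired t-test on the within-patient side difference, with a 95% CI.
diff = operated - control
t_stat, p_value = stats.ttest_rel(operated, control)
ci = stats.t.interval(0.95, n - 1, loc=diff.mean(), scale=stats.sem(diff))
print(f"mean difference {diff.mean():.2f} deg C, p = {p_value:.2g}, "
      f"95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```

Pairing each patient's operated side with their own control side removes between-patient temperature variation, which is why the study design supports a paired rather than a two-sample test.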

  17. A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime

    PubMed Central

    Fitterer, Jessica L.; Nelson, Trisalyn A.

    2015-01-01

    Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the prominent modelling choice (n = 78), though many variations existed depending on the data. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks). PMID:26418016

  18. A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime.

    PubMed

    Fitterer, Jessica L; Nelson, Trisalyn A

    2015-01-01

    Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the prominent modelling choice (n = 78), though many variations existed depending on the data. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks).

  19. Advancing-layers method for generation of unstructured viscous grids

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar

    1993-01-01

    A novel approach for generating highly stretched grids which is based on a modified advancing-front technique and benefits from the generality, flexibility, and grid quality of the conventional advancing-front-based Euler grid generators is presented. The method is self-sufficient for the insertion of grid points in the boundary layer and beyond. Since it is based on a totally unstructured grid strategy, the method alleviates the difficulties stemming from the structural limitations of the prismatic techniques.

  20. Advanced Electromagnetic Methods for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; Polycarpou, Anastasis; Birtcher, Craig R.; Georgakopoulos, Stavros; Han, Dong-Ho; Ballas, Gerasimos

    1999-01-01

    The destructive threat of lightning to helicopters and other airborne systems has long been a topic of great interest to this research grant. Previously, the lightning-induced currents on the surface of the fuselage and its interior were predicted using the finite-difference time-domain (FDTD) method as well as the NEC code. The limitations of both methods, as applied to lightning, were identified and extensively discussed at the last meeting. After a thorough investigation of the capabilities of the FDTD, it was decided to incorporate into the numerical method a subcell model to accurately represent current diffusion through conducting materials of high conductivity and finite thickness. Because of the complexity of the model, its validity will first be tested on a one-dimensional FDTD problem. Although results are not available yet, the theory and formulation of the subcell model are presented and discussed here to a certain degree. Besides lightning-induced currents in the interior of an aircraft, penetration of electromagnetic fields through apertures (e.g., windows and cracks) could also be devastating for navigation equipment, electronics, and communications systems in general. The main focus of this study is understanding and quantifying field penetration through apertures. The simulation is done using the FDTD method and the predictions are compared with measurements and moment method solutions obtained from the NASA Langley Research Center. Cavity-backed slot (CBS) antennas, or slot antennas in general, have many applications in aircraft-satellite communications. These can be flush-mounted on the surface of the fuselage and, therefore, retain the aerodynamic shape of the aircraft. In the past, input impedance and radiation patterns of CBS antennas were computed using a hybrid FEM/MoM code. The analysis is now extended to coupling between two identical slot antennas mounted on the same structure. The predictions are performed

  1. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    Parametric cost estimating methods for space systems in the conceptual design phase are developed. The approach is to identify variables that drive cost such as weight, quantity, development culture, design inheritance, and time. The relationship between weight and cost is examined in detail. A theoretical model of cost is developed and tested statistically against a historical data base of major research and development programs. It is concluded that the technique presented is sound, but that it must be refined in order to produce acceptable cost estimates.
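
The weight-cost relationship at the core of this approach can be sketched as a power-law cost estimating relationship; the power-law form and the historical data points below are entirely hypothetical assumptions for illustration:

```python
import numpy as np

# Hypothetical historical data: dry weight (kg) and development cost ($M)
# for past programs; cost = a * weight**b is a common parametric CER form.
weight = np.array([150, 400, 900, 2200, 5000], dtype=float)
cost = np.array([80, 160, 310, 620, 1150], dtype=float)

# Fit log(cost) = log(a) + b*log(weight) by least squares.
b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)
a = np.exp(log_a)

def estimate_cost(w):
    """Parametric cost estimate for a concept of dry weight w (kg)."""
    return a * w**b

print(f"cost ~ {a:.1f} * weight^{b:.2f}")
print(round(estimate_cost(1000), 1))
```

An exponent below 1, as fitted here, reflects the economy of scale often seen in such data: cost grows more slowly than weight.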

  2. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. Therefore, a quantitative and simple distortion evaluation method is imperative for both the endoscopic industry and medical device regulatory agencies. However, no such method is available yet. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data with complex mathematical models that are difficult to understand. Some commonly used distortion evaluation methods, such as the picture height distortion (DPH) or radial distortion (DRAD), are either too simple to accurately describe the distortion or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion. Based on the method, we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning in the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, the method can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
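
The local-magnification idea can be illustrated with a simple distortion model; the barrel-distortion formula, its coefficients, and the derivative-based definition of ML below are assumptions for illustration, not the paper's exact definitions:

```python
# Barrel-distortion model (hypothetical): a point at object-side radial
# position r maps to image radius r_d = c*r*(1 + k*r**2), with k < 0 for
# barrel distortion.
c, k = 1.0, -0.08

def image_radius(r):
    return c * r * (1.0 + k * r**2)

# Local magnification ML(r): ratio of a small image-side displacement to the
# corresponding object-side displacement, here d(r_d)/dr taken numerically.
def local_magnification(r, dr=1e-5):
    return (image_radius(r + dr) - image_radius(r - dr)) / (2 * dr)

# Radial distortion expressed as ML relative to the center magnification;
# a constant ML across the field would mean zero distortion.
m_center = local_magnification(0.0)
for r in (0.5, 1.0):
    ml = local_magnification(r)
    d_rad = 100.0 * (ml - m_center) / m_center
    print(f"r={r}: ML={ml:.3f}, distortion={d_rad:.1f}%")
```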

  3. Advancements in Research Synthesis Methods: From a Methodologically Inclusive Perspective

    ERIC Educational Resources Information Center

    Suri, Harsh; Clarke, David

    2009-01-01

    The dominant literature on research synthesis methods has positivist and neo-positivist origins. In recent years, the landscape of research synthesis methods has changed rapidly to become inclusive. This article highlights methodologically inclusive advancements in research synthesis methods. Attention is drawn to insights from interpretive,…

  4. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  5. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    PubMed

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  6. Advanced Fuzzy Potential Field Method for Mobile Robot Obstacle Avoidance

    PubMed Central

    Park, Jong-Wook; Kwak, Hwan-Joo; Kang, Young-Chang; Kim, Dong W.

    2016-01-01

    An advanced fuzzy potential field method for mobile robot obstacle avoidance is proposed. The potential field method primarily deals with the repulsive forces surrounding obstacles, while fuzzy control logic focuses on fuzzy rules that handle linguistic variables and describe the knowledge of experts. The design of a fuzzy controller—advanced fuzzy potential field method (AFPFM)—that models and enhances the conventional potential field method is proposed and discussed. This study also examines the rule-explosion problem of conventional fuzzy logic and assesses the performance of our proposed AFPFM through simulations carried out using a mobile robot. PMID:27123001

  7. Advanced Fuzzy Potential Field Method for Mobile Robot Obstacle Avoidance.

    PubMed

    Park, Jong-Wook; Kwak, Hwan-Joo; Kang, Young-Chang; Kim, Dong W

    2016-01-01

    An advanced fuzzy potential field method for mobile robot obstacle avoidance is proposed. The potential field method primarily deals with the repulsive forces surrounding obstacles, while fuzzy control logic focuses on fuzzy rules that handle linguistic variables and describe the knowledge of experts. The design of a fuzzy controller--advanced fuzzy potential field method (AFPFM)--that models and enhances the conventional potential field method is proposed and discussed. This study also examines the rule-explosion problem of conventional fuzzy logic and assesses the performance of our proposed AFPFM through simulations carried out using a mobile robot.
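
The fuzzy rule base of the AFPFM is not detailed in this abstract, but the conventional potential field baseline it enhances can be sketched in a few lines: an attractive force pulls the robot toward the goal, and repulsive forces push it away from obstacles within an influence radius. This is a minimal illustration of that baseline only; the function name, gains, and step size are hypothetical, not from the paper.

```python
import math

def potential_field_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=2.0):
    """One step of the conventional potential field method: attractive
    force toward the goal, plus the classic repulsive gradient from any
    obstacle closer than the influence distance d0."""
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < d0:
            # repulsive magnitude grows sharply as the robot nears the obstacle
            mag = k_rep * (1.0 / d - 1.0 / d0) / d ** 2
            fx += mag * dx / d
            fy += mag * dy / d
    norm = math.hypot(fx, fy) or 1.0
    step = 0.1
    return (pos[0] + step * fx / norm, pos[1] + step * fy / norm)
```

The well-known weakness this exposes (local minima when forces cancel, and rule explosion once fuzzy logic is layered on top) is exactly what the AFPFM design addresses.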

  8. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1994-01-01

    NASA is responsible for developing much of the nation's future space technology. Cost estimates for new programs are required early in the planning process so that decisions can be made accurately. Because of the long lead times required to develop space hardware, the cost estimates are frequently required 10 to 15 years before the program delivers hardware. The system design in the conceptual phases of a program is usually only vaguely defined, and the technology used is often state-of-the-art or beyond. These factors combine to make cost estimating for conceptual programs very challenging. This paper describes an effort to develop parametric cost estimating methods for space systems in the conceptual design phase. The approach is to identify variables that drive cost, such as weight, quantity, development culture, design inheritance and time. The nature of the relationships between the driver variables and cost will be discussed. In particular, the relationship between weight and cost will be examined in detail. A theoretical model of cost will be developed and tested statistically against a historical database of major research and development projects.
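
Weight-driven cost estimating relationships of the kind described above are classically modeled as a power law, cost = a * weight^b, fit by ordinary least squares in log-log space. The sketch below shows that standard form only; it is not the paper's actual model, and the data in the test are synthetic.

```python
import math

def fit_power_law(weights, costs):
    """Fit cost = a * weight**b by least squares on log-transformed
    data, the classic form of a weight-based cost estimating
    relationship (CER)."""
    xs = [math.log(w) for w in weights]
    ys = [math.log(c) for c in costs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope of the log-log regression is the exponent b
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b
```

An exponent b < 1 in such a fit reflects the economy of scale often claimed for hardware weight, which is why the weight-cost relationship merits the detailed examination the abstract promises.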

  9. Quantitative methods to characterize morphological properties of cell lines.

    PubMed

    Mancia, Annalaura; Elliott, John T; Halter, Michael; Bhadriraju, Kiran; Tona, Alessandro; Spurlin, Tighe A; Middlebrooks, Bobby L; Baatz, John E; Warr, Gregory W; Plant, Anne L

    2012-07-01

    Descriptive terms are often used to characterize cells in culture, but the use of nonquantitative and poorly defined terms can lead to ambiguities when comparing data from different laboratories. Although recently there has been a good deal of interest in unambiguous identification of cell lines via their genetic markers, it is also critical to have definitive, quantitative metrics to describe cell phenotypic characteristics. Quantitative metrics of cell phenotype will aid the comparison of data from experiments performed at different times and in different laboratories where influences such as the age of the population and differences in culture conditions or protocols can potentially affect cellular metabolic state and gene expression in the absence of changes in the genetic profile. Here, we present examples of robust methodologies for quantitatively assessing characteristics of cell morphology and cell-cell interactions, and of growth rates of cells within the population. We performed these analyses with endothelial cell lines derived from dolphin, bovine and human, and with a mouse fibroblast cell line. These metrics quantify some characteristics of these cell lines that clearly distinguish them from one another, and provide quantitative information on phenotypic changes in one of the cell lines over a large number of passages.
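
Two of the standard quantitative shape metrics this kind of morphology work relies on can be stated exactly: circularity and the aspect ratio of a fitted ellipse. These are textbook image-analysis definitions, shown here for illustration; the paper's exact metric set is not given in the abstract.

```python
import math

def circularity(area, perimeter):
    """4*pi*A / P**2: equals 1.0 for a perfect circle and decreases
    for elongated or irregular cell outlines."""
    return 4.0 * math.pi * area / perimeter ** 2

def aspect_ratio(major_axis, minor_axis):
    """Ratio of the fitted-ellipse axes: 1.0 for a round cell,
    larger for spread or spindle-shaped cells."""
    return major_axis / minor_axis
```

Because both quantities are dimensionless, they can be compared across laboratories, magnifications, and time points, which is precisely the ambiguity problem the abstract raises.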

  10. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  11. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  12. [The method of quantitative assessment of dentition aesthetic parameters].

    PubMed

    Ryakhovsky, A N; Kalacheva, Ya A

    2016-01-01

    This article describes a formula for calculating an aesthetic index of treatment outcome. The formula was derived on the basis of the obtained regression equations showing the dependence of visual assessment on the magnitude of aesthetic violations. The formula can be used for objective quantitative evaluation of the aesthetics of the teeth when smiling, before and after dental treatment.

  13. Unstructured viscous grid generation by advancing-front method

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar

    1993-01-01

    A new method of generating unstructured triangular/tetrahedral grids with high-aspect-ratio cells is proposed. The method is based on a new grid-marching strategy referred to as 'advancing layers' for construction of highly stretched cells in the boundary layer and the conventional advancing-front technique for generation of regular, equilateral cells in the inviscid-flow region. Unlike the existing semi-structured viscous grid generation techniques, the new procedure relies on a totally unstructured advancing-front grid strategy resulting in a substantially enhanced grid flexibility and efficiency. The method is conceptually simple but powerful, capable of producing high quality viscous grids for complex configurations with ease. A number of two-dimensional, triangular grids are presented to demonstrate the methodology. The basic elements of the method, however, have been primarily designed with three-dimensional problems in mind, making it extendible for tetrahedral, viscous grid generation.

  14. Advanced Ablative Insulators and Methods of Making Them

    NASA Technical Reports Server (NTRS)

    Congdon, William M.

    2005-01-01

    Advanced ablative (more specifically, charring) materials that provide temporary protection against high temperatures, and advanced methods of designing and manufacturing insulators based on these materials, are undergoing development. These materials and methods were conceived in an effort to replace the traditional thermal-protection systems (TPSs) of re-entry spacecraft with robust, lightweight, better-performing TPSs that can be designed and manufactured more rapidly and at lower cost. These materials and methods could also be used to make improved TPSs for general aerospace, military, and industrial applications.

  15. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from the more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by the improvement in computer hardware and software capability and novel computational approaches being slowly recognized by regulatory agencies. These events have helped reduce the reliance on experimental animals as well as better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches as described in the guidance documents from several of the regulatory agencies as it pertains to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.
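
One concrete example of the quantitative toxicity indices derived in the regulatory guidance documents mentioned here is the reference dose (RfD), obtained by dividing a no-observed-adverse-effect level (NOAEL) from animal toxicology data by uncertainty factors. The sketch below uses the standard default factors of 10 for interspecies extrapolation and human variability; the parameter names are illustrative.

```python
def reference_dose(noael_mg_per_kg_day, uf_interspecies=10.0,
                   uf_intraspecies=10.0, uf_other=1.0):
    """RfD = NOAEL / (product of uncertainty factors): divides the
    animal NOAEL by factors for animal-to-human extrapolation, human
    variability, and any additional database uncertainty."""
    return noael_mg_per_kg_day / (uf_interspecies * uf_intraspecies * uf_other)
```

The chapter's point about uncertainty is visible even in this simple form: each order-of-magnitude factor is a coarse stand-in for variability that more novel computational approaches try to characterize explicitly.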

  16. Strategy to Promote Active Learning of an Advanced Research Method

    ERIC Educational Resources Information Center

    McDermott, Hilary J.; Dovey, Terence M.

    2013-01-01

    Research methods courses aim to equip students with the knowledge and skills required for research yet seldom include practical aspects of assessment. This reflective practitioner report describes and evaluates an innovative approach to teaching and assessing advanced qualitative research methods to final-year psychology undergraduate students. An…

  17. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method

    PubMed Central

    Yang, Ganglong; Xu, Zhipeng; Lu, Wei; Li, Xiang; Sun, Chengwen; Guo, Jia; Xue, Peng; Guan, Feng

    2015-01-01

    The best way to increase patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low grade nonmuscle invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography LTQ Orbitrap mass spectrometry. Among 3721 unique identified and annotated proteins in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer. PMID:26230496

  18. A Comparative Study on Tobacco Cessation Methods: A Quantitative Systematic Review

    PubMed Central

    Heydari, Gholamreza; Masjedi, Mohammadreza; Ahmady, Arezoo Ebn; Leischow, Scott J.; Lando, Harry A.; Shadmehr, Mohammad Behgam; Fadaizadeh, Lida

    2014-01-01

    Background: During recent years, there have been many advances in different types of pharmacological and non-pharmacological tobacco control treatments. In this study, we aimed to identify the most effective smoking cessation methods, based upon a review of the literature. Methods: We searched PubMed, limited to English publications from 2000 to 2012. Two trained reviewers independently assessed titles, abstracts and full texts of articles after a pilot inter-rater reliability assessment conducted by the author (GH). For each method, the total number of papers was computed, along with each paper's conclusion: recommending the method (positive) or not supporting it (negative). The number of negative papers was subtracted from the number of positive ones for each method. Inconsistencies between the two reviewers were adjudicated by the author. Results: Of the 932 articles that were critically assessed, 780 studies supported quit-smoking methods. In 90 studies, the methods were neither supported nor rejected, and in 62 cases the methods were not supported. Nicotine replacement therapy (NRT), Champix and Zyban, with 352, 117 and 71 studies respectively, were the most supported methods, and e-cigarettes and non-nicotine medications, with one case each, were the least supported. Finally, NRT with a score of 39, and Champix and education with scores of 36, were the most supported methods. Conclusions: Results of this review indicate that the scientific papers of the most recent decade recommend the use of NRT and Champix in combination with educational interventions. Additional research is needed to compare qualitative and quantitative studies of smoking cessation. PMID:25013685
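
The review's scoring rule (supporting papers minus non-supporting papers, per cessation method) is simple enough to make explicit. The counts below are made up for illustration; the real per-method counts are those reported in the abstract.

```python
def net_support_ranking(counts):
    """Rank methods by (supporting - non-supporting) paper counts,
    the net-score rule used in the review. `counts` maps a method
    name to a (positive, negative) tuple."""
    return sorted(counts, key=lambda m: counts[m][0] - counts[m][1],
                  reverse=True)
```

Note that such a vote-counting score weighs every paper equally regardless of study size or quality, which is one reason the authors call for further comparative research.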

  19. A Primer In Advanced Fatigue Life Prediction Methods

    NASA Technical Reports Server (NTRS)

    Halford, Gary R.

    2000-01-01

    Metal fatigue has plagued structural components for centuries, and it remains a critical durability issue in today's aerospace hardware. This is true despite vastly improved and advanced materials, increased mechanistic understanding, and development of accurate structural analysis and advanced fatigue life prediction tools. Each advance is quickly taken advantage of to produce safer, more reliable, more cost-effective, and better performing products. In other words, as the envelope is expanded, components are then designed to operate just as close to the newly expanded envelope as they were to the initial one. The problem is perennial. The economic importance of addressing structural durability issues early in the design process is emphasized. Tradeoffs with performance, cost, and legislated restrictions are pointed out. Several aspects of structural durability of advanced systems, advanced materials, and advanced fatigue life prediction methods are presented. Specific items include the basic elements of durability analysis, conventional designs, barriers to be overcome for advanced systems, high-temperature life prediction for both creep-fatigue and thermomechanical fatigue, mean stress effects, multiaxial stress-strain states, and cumulative fatigue damage accumulation assessment.

  20. Recent advances in quantitative PCR (qPCR) applications in food microbiology.

    PubMed

    Postollec, Florence; Falentin, Hélène; Pavan, Sonia; Combrisson, Jérôme; Sohier, Danièle

    2011-08-01

    Molecular methods are being increasingly applied to detect, quantify and study microbial populations in food or during food processes. Among these methods, PCR-based techniques have been the subject of considerable focus and ISO guidelines have been established for the detection of food-borne pathogens. More particularly, real-time quantitative PCR (qPCR) is considered as a method of choice for the detection and quantification of microorganisms. One of its major advantages is to be faster than conventional culture-based methods. It is also highly sensitive, specific and enables simultaneous detection of different microorganisms. Application of reverse-transcription-qPCR (RT-qPCR) to study population dynamics and activities through quantification of gene expression in food, by contrast with the use of qPCR, is just beginning. Provided that appropriate controls are included in the analyses, qPCR and RT-qPCR appear to be highly accurate and reliable for quantification of genes and gene expression. This review addresses some important technical aspects to be considered when using these techniques. Recent applications of qPCR and RT-qPCR in food microbiology are given. Some interesting applications such as risk analysis or studying the influence of industrial processes on gene expression and microbial activity are reported.
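
A core calculation behind the relative quantification that qPCR and RT-qPCR applications build on is the Livak delta-delta-Ct method: expression of a target gene is normalized to a reference gene and to a control condition. The sketch below assumes approximately 100% amplification efficiency (the amplicon doubles each cycle); it illustrates the standard calculation, not a procedure specific to this review.

```python
def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_control, ct_ref_control):
    """Livak 2**(-ddCt) fold change: normalize the target gene's Ct
    to a reference gene within each condition, then compare the
    treated sample against the control condition."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control
    return 2.0 ** (-dd_ct)
```

The efficiency assumption is exactly the kind of "appropriate control" the review stresses: if amplification efficiency departs from 100%, the base 2 must be replaced by the measured efficiency.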

  1. Method for depth-resolved quantitation of optical properties in layered media using spatially modulated quantitative spectroscopy.

    PubMed

    Saager, Rolf B; Truong, Alex; Cuccia, David J; Durkin, Anthony J

    2011-07-01

    We have demonstrated that spatially modulated quantitative spectroscopy (SMoQS) is capable of extracting absolute optical properties from homogeneous tissue simulating phantoms that span both the visible and near-infrared wavelength regimes. However, biological tissue, such as skin, is highly structured, presenting challenges to quantitative spectroscopic techniques based on homogeneous models. In order to more accurately address the challenges associated with skin, we present a method for depth-resolved optical property quantitation based on a two layer model. Layered Monte Carlo simulations and layered tissue simulating phantoms are used to determine the efficacy and accuracy of SMoQS to quantify layer specific optical properties of layered media. Initial results from both the simulation and experiment show that this empirical method is capable of determining top layer thickness within tens of microns across a physiological range for skin. Layer specific chromophore concentration can be determined to within ±10% of the actual values, on average, whereas bulk quantitation in either the visible or near infrared spectroscopic regime significantly underestimates the layer specific chromophore concentration and can be confounded by top layer thickness.

  2. Advancing the sensitivity of selected reaction monitoring-based targeted quantitative proteomics

    SciTech Connect

    Shi, Tujin; Su, Dian; Liu, Tao; Tang, Keqi; Camp, David G.; Qian, Weijun; Smith, Richard D.

    2012-04-01

    Selected reaction monitoring (SRM)—also known as multiple reaction monitoring (MRM)—has emerged as a promising high-throughput targeted protein quantification technology for candidate biomarker verification and systems biology applications. A major bottleneck for current SRM technology, however, is insufficient sensitivity for detecting, e.g., low-abundance biomarkers likely present at the pg/mL to low ng/mL range in human blood plasma or serum, or extremely low-abundance signaling proteins in cells or tissues. Herein we review recent advances in methods and technologies, including front-end immunoaffinity depletion, fractionation, selective enrichment of target proteins/peptides or their posttranslational modifications (PTMs), as well as advances in MS instrumentation, which have significantly enhanced the overall sensitivity of SRM assays and enabled the detection of low-abundance proteins at low- to sub-ng/mL levels in human blood plasma or serum. General perspectives on the potential of achieving sufficient sensitivity for detection of pg/mL level proteins in plasma are also discussed.

  3. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, T. H. H.

    1985-01-01

    Advanced stress analysis methods applicable to turbine engine structures are investigated. The construction of special elements containing traction-free circular boundaries is investigated. New versions of the mixed variational principle and of hybrid stress elements are formulated. A method is established for suppressing kinematic deformation modes. SemiLoof plate and shell elements are constructed by the assumed-stress hybrid method. An elastic-plastic analysis is conducted by viscoplasticity theory using the mechanical subelement model.

  4. Advanced surface paneling method for subsonic and supersonic flow

    NASA Technical Reports Server (NTRS)

    Erickson, L. L.; Johnson, F. T.; Ehlers, F. E.

    1976-01-01

    Numerical results illustrating the capabilities of an advanced aerodynamic surface paneling method are presented. The method is applicable to both subsonic and supersonic flow, as represented by linearized potential flow theory. The method is based on linearly varying sources and quadratically varying doublets which are distributed over flat or curved panels. These panels are applied to the true surface geometry of arbitrarily shaped three dimensional aerodynamic configurations.

  5. Human-System Safety Methods for Development of Advanced Air Traffic Management Systems

    SciTech Connect

    Nelson, W.R.

    1999-05-24

    The Idaho National Engineering and Environmental Laboratory (INEEL) is supporting the National Aeronautics and Space Administration in the development of advanced air traffic management (ATM) systems as part of the Advanced Air Transportation Technologies program. As part of this program INEEL conducted a survey of human-system safety methods that have been applied to complex technical systems, to identify lessons learned from these applications and provide recommendations for the development of advanced ATM systems. The domains that were surveyed included offshore oil and gas, commercial nuclear power, commercial aviation, and military. The survey showed that widely different approaches are used in these industries, and that the methods used range from very high-level, qualitative approaches to very detailed quantitative methods such as human reliability analysis (HRA) and probabilistic safety assessment (PSA). In addition, the industries varied widely in how effectively they incorporate human-system safety assessment in the design, development, and testing of complex technical systems. In spite of the lack of uniformity in the approaches and methods used, it was found that methods are available that can be combined and adapted to support the development of advanced air traffic management systems.

  6. Advanced digital methods for solid propellant burning rate determination

    NASA Astrophysics Data System (ADS)

    Jones, Daniel A.

    The work presented here is a study of a digital method for determining the combustion-bomb burning rate of a fuel-rich gas generator propellant sample using the ultrasonic pulse-echo technique. The advanced digital method, which places user-defined limits on the search for the ultrasonic echo from the burning surface, is computationally faster than the previous cross-correlation method, and is able to analyze data for this class of propellant that the previous cross-correlation data reduction method could not. For the conditions investigated, the best-fit burning rate law at 800 psi from the ultrasonic technique and advanced cross-correlation method is within 3 percent of an independent analysis of the same data, and is within 5 percent of the best-fit burning rate law found from parallel research of the same propellant in a motor configuration.
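
The cross-correlation idea the method builds on is finding the lag at which a received trace best matches a reference echo; the echo's arrival time then tracks the regressing burning surface. The sketch below is the generic lag estimator only, not the paper's windowed search implementation, and the signal values are illustrative.

```python
def best_lag(reference, signal, max_lag):
    """Return the integer lag (in samples) at which the
    cross-correlation of `signal` against `reference` is largest,
    i.e. where the reference echo shape best lines up."""
    def xcorr(lag):
        return sum(reference[i] * signal[i + lag]
                   for i in range(len(reference))
                   if 0 <= i + lag < len(signal))
    return max(range(-max_lag, max_lag + 1), key=xcorr)
```

Restricting `max_lag` to a user-defined window around the expected echo position is the same idea as the search limits described in the abstract: it avoids locking onto spurious reflections and cuts computation.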

  7. Quantitative analysis with advanced compensated polarized light microscopy on wavelength dependence of linear birefringence of single crystals causing arthritis

    NASA Astrophysics Data System (ADS)

    Takanabe, Akifumi; Tanaka, Masahito; Taniguchi, Atsuo; Yamanaka, Hisashi; Asahi, Toru

    2014-07-01

    To improve our ability to identify single crystals causing arthritis, we have developed a practical measurement system of polarized light microscopy called advanced compensated polarized light microscopy (A-CPLM). The A-CPLM system is constructed by employing a conventional phase retardation plate, an optical fibre and a charge-coupled device spectrometer in a polarized light microscope. We applied the A-CPLM system to measure linear birefringence (LB) in the visible region, which is an optical anisotropic property, for tiny single crystals causing arthritis, i.e. monosodium urate monohydrate (MSUM) and calcium pyrophosphate dihydrate (CPPD). The A-CPLM system performance was evaluated by comparing the obtained experimental data using the A-CPLM system with (i) literature data for a standard sample, MgF2, and (ii) experimental data obtained using an established optical method, high-accuracy universal polarimeter, for the MSUM. The A-CPLM system was found to be applicable for measuring the LB spectra of the single crystals of MSUM and CPPD, which cause arthritis, in the visible regions. We quantitatively reveal the large difference in LB between MSUM and CPPD crystals. These results demonstrate the usefulness of the A-CPLM system for distinguishing the crystals causing arthritis.

  8. Quantitative estimation of poikilocytosis by the coherent optical method

    NASA Astrophysics Data System (ADS)

    Safonova, Larisa P.; Samorodov, Andrey V.; Spiridonov, Igor N.

    2000-05-01

    An investigation of the necessity and required reliability of poikilocytosis determination in hematology has shown that existing techniques suffer from serious shortcomings. To determine the deviation of erythrocyte form from the normal (rounded) one in blood smears, it is expedient to use an integrative estimate. An algorithm is suggested that is based on the correlation between erythrocyte morphological parameters and properties of the spatial-frequency spectrum of the blood smear. During analytical and experimental research, an integrative form parameter (IFP) was proposed that characterizes the increase of the relative concentration of cells with changed form above 5% and the predominating type of poikilocytes. An algorithm for statistically reliable estimation of the IFP on standard stained blood smears has been developed. To provide quantitative characterization of the morphological features of cells, a form vector has been proposed, and its validity for poikilocyte differentiation was shown.

  9. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    SciTech Connect

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be for inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict both in the past, and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempt at modeling conflict as a result of system level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  10. Advanced propulsion for LEO-Moon transport. 1: A method for evaluating advanced propulsion performance

    NASA Technical Reports Server (NTRS)

    Stern, Martin O.

    1992-01-01

    This report describes a study to evaluate the benefits of advanced propulsion technologies for transporting materials between low Earth orbit and the Moon. A relatively conventional reference transportation system, and several other systems, each of which includes one advanced technology component, are compared in terms of how well they perform a chosen mission objective. The evaluation method is based on a pairwise life-cycle cost comparison of each of the advanced systems with the reference system. Somewhat novel and economically important features of the procedure are the inclusion not only of mass payback ratios based on Earth launch costs, but also of repair and capital acquisition costs, and of adjustments in the latter to reflect the technological maturity of the advanced technologies. The required input information is developed by panels of experts. The overall scope and approach of the study are presented in the introduction. The bulk of the paper describes the evaluation method; the reference system and an advanced transportation system, including a spinning tether in an eccentric Earth orbit, are used to illustrate it.

  11. METHODS ADVANCEMENT FOR MILK ANALYSIS: THE MAMA STUDY

    EPA Science Inventory

    The Methods Advancement for Milk Analysis (MAMA) study was designed by US EPA and CDC investigators to provide data to support the technological and study design needs of the proposed National Children's Study (NCS). The NCS is a multi-agency-sponsored study, authorized under the...

  12. Advanced boundary layer transition measurement methods for flight applications

    NASA Technical Reports Server (NTRS)

    Holmes, B. J.; Croom, C. C.; Gail, P. D.; Manuel, G. S.; Carraway, D. L.

    1986-01-01

    In modern laminar flow flight research, it is important to understand the specific cause(s) of laminar to turbulent boundary-layer transition. Such information is crucial to the exploration of the limits of practical application of laminar flow for drag reduction on aircraft. The transition modes of interest in current flight investigations include the viscous Tollmien-Schlichting instability, the inflectional instability at laminar separation, and the crossflow inflectional instability, as well as others. This paper presents the results to date of research on advanced devices and methods used for the study of laminar boundary-layer transition phenomena in the flight environment. Recent advancements in the development of arrayed hot-film devices and of a new flow visualization method are discussed. Arrayed hot-film devices have been designed to detect the presence of laminar separation, and of crossflow vorticity. The advanced flow visualization method utilizes color changes in liquid-crystal coatings to detect boundary-layer transition at high altitude flight conditions. Flight and wind tunnel data are presented to illustrate the design and operation of these advanced methods. These new research tools provide information on disturbance growth and transition mode which is essential to furthering our understanding of practical design limits for applications of laminar flow technology.

  13. Techniques for quantitative LC-MS/MS analysis of protein therapeutics: advances in enzyme digestion and immunocapture.

    PubMed

    Fung, Eliza N; Bryan, Peter; Kozhich, Alexander

    2016-04-01

    LC-MS/MS has been investigated as a means to quantify protein therapeutics in biological matrices. The protein therapeutic is digested by an enzyme to generate surrogate peptide(s) before LC-MS/MS analysis. One challenge is isolating the protein therapeutic in the presence of the large number of endogenous proteins in biological matrices. Immunocapture, in which a capture agent is used to preferentially bind the protein therapeutic over other proteins, is gaining traction. The protein therapeutic is eluted for digestion and LC-MS/MS analysis. One area of tremendous potential for immunocapture-LC-MS/MS is obtaining quantitative data where ligand-binding assay alone is not sufficient, for example, quantitation of antidrug antibody complexes. Herein, we present an overview of recent advances in enzyme digestion and immunocapture applicable to protein quantitation.

  14. Domain Decomposition By the Advancing-Partition Method

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    2008-01-01

    A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.

  15. A quantitative comparative analysis of Advancement via Individual Determination (AVID) in Texas middle schools

    NASA Astrophysics Data System (ADS)

    Reed, Krystal Astra

    The "Advancement via Individual Determination (AVID) program was designed to provide resources and strategies that enable underrepresented minority students to attend 4-year colleges" (AVID Center, 2013, p. 2). These students are characterized as the forgotten middle in that they have high test scores, average-to-low grades, minority or low socioeconomic status, and will be first-generation college students (AVID, 2011). Research indicates (Huerta, Watt, & Butcher, 2013) that strict adherence to 11 program components supports the success of students enrolled in AVID, and AVID certification depends on districts following those components. Several studies (AVID Center, 2013) have investigated claims about the AVID program through qualitative analyses; however, very few have addressed the program quantitatively. This researcher sought to determine whether differences in student achievement and attendance rates existed between AVID and non-AVID middle schools. To achieve this goal, the researcher compared eighth-grade science and seventh- and eighth-grade mathematics scores from the 2007 to 2011 Texas Assessment of Knowledge and Skills (TAKS) and overall attendance rates in demographically equivalent AVID and non-AVID middle schools. Academic Excellence Indicator System (AEIS) reports from the Texas Education Agency (TEA) were used to obtain 2007 to 2011 TAKS results and attendance information for the selected schools. The results indicated a statistically significant difference between AVID demonstration students and non-AVID students in schools with similar CI. No statistically significant differences were found on any component of the TAKS for AVID economically disadvantaged students. The mean scores indicated an achievement gap between non-AVID and AVID demonstration middle schools. The findings from the other three research questions indicated no statistically significant differences between AVID and non-AVID student passing rates on the seventh- and eighth

  16. Advances and future directions of research on spectral methods

    NASA Technical Reports Server (NTRS)

    Patera, A. T.

    1986-01-01

    Recent advances in spectral methods are briefly reviewed and characterized with respect to their convergence and computational complexity. Classical finite element and spectral approaches are then compared, and spectral element (or p-type finite element) approximations are introduced. The method is applied to the full Navier-Stokes equations, and examples are given of the application of the technique to several transitional flows. Future directions of research in the field are outlined.

  17. An advanced Gibbs-Duhem integration method: theory and applications.

    PubMed

    van 't Hof, A; Peters, C J; de Leeuw, S W

    2006-02-07

    The conventional Gibbs-Duhem integration method is very convenient for the prediction of phase equilibria of both pure components and mixtures. However, it turns out to be inefficient. The method requires a number of lengthy simulations to predict the state conditions at which phase coexistence occurs. This number is not known from the outset of the numerical integration process. Furthermore, the molecular configurations generated during the simulations are merely used to predict the coexistence condition and not the liquid- and vapor-phase densities and mole fractions at coexistence. In this publication, an advanced Gibbs-Duhem integration method is presented that overcomes the above-mentioned disadvantages and inefficiency. The advanced method is a combination of Gibbs-Duhem integration and multiple-histogram reweighting. Application of multiple-histogram reweighting enables the substitution of the unknown number of simulations by a fixed and predetermined number. The advanced method has a retroactive nature; a current simulation improves the predictions of previously computed coexistence points as well. The advanced Gibbs-Duhem integration method has been applied for the prediction of vapor-liquid equilibria of a number of binary mixtures. The method turned out to be very convenient, much faster than the conventional method, and provided smooth simulation results. As the employed force fields perfectly predict pure-component vapor-liquid equilibria, the binary simulations were well suited to testing the performance of different sets of combining rules. Employing Lorentz-Hudson-McCoubrey combining rules for interactions between unlike molecules, as opposed to Lorentz-Berthelot combining rules for all interactions, considerably improved the agreement between experimental and simulated data.
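    The conventional integration idea can be sketched in miniature with the Clausius-Clapeyron form of the Gibbs-Duhem relation. This is not the authors' histogram-reweighting scheme: in a real Gibbs-Duhem simulation the slope is re-measured by molecular simulation at every state point, whereas here the enthalpy of vaporization is held constant and water-like numbers are assumed purely for illustration.

```python
import math

R_GAS = 8.314  # gas constant, J/(mol K)

def trace_coexistence(T0, p0, T_end, n_steps, dh_vap):
    """Trace a vapor-liquid coexistence line by integrating the
    Clausius-Clapeyron form of the Gibbs-Duhem relation,
        d(ln p) / d(1/T) = -dH_vap / R.
    In an actual Gibbs-Duhem simulation the right-hand side would be
    re-evaluated by simulation at each state point; here dh_vap is
    constant, so simple Euler steps integrate the relation exactly."""
    inv_T = 1.0 / T0
    lnp = math.log(p0)
    d_invT = (1.0 / T_end - inv_T) / n_steps
    slope = -dh_vap / R_GAS
    points = [(T0, p0)]
    for _ in range(n_steps):
        inv_T += d_invT
        lnp += slope * d_invT          # step along the Clapeyron slope
        points.append((1.0 / inv_T, math.exp(lnp)))
    return points

# Example: water-like numbers starting from the normal boiling point.
line = trace_coexistence(T0=373.15, p0=101325.0, T_end=393.15,
                         n_steps=50, dh_vap=40660.0)
T_last, p_last = line[-1]
```

With a constant slope the traced pressure at 393.15 K reproduces the analytic Clausius-Clapeyron result (about 1.9 atm), which is the kind of smooth coexistence curve the abstract refers to.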

  18. Uncertainty in environmental health impact assessment: quantitative methods and perspectives.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Vanni, Tazio; Foss, Anna M

    2013-01-01

    Environmental health impact assessment models are subject to considerable uncertainty due to the complex associations between environmental exposures and health. Quantifying the impact of uncertainty is important if the models are used to support health policy decisions. We conducted a systematic review to identify and appraise current methods used to quantify the uncertainty in environmental health impact assessment. In the 19 studies meeting the inclusion criteria, several methods were identified. These were grouped into random sampling methods, second-order probability methods, Bayesian methods, fuzzy sets, and deterministic sensitivity analysis methods. All 19 studies addressed the uncertainty in the parameter values, but only 5 of the studies also addressed the uncertainty in the structure of the models. None of the articles reviewed considered conceptual sources of uncertainty associated with the framing assumptions or the conceptualisation of the model. Future research should attempt to broaden the way uncertainty is taken into account in environmental health impact assessments.
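    A minimal sketch of the random-sampling family of methods the review identifies: propagating uncertainty in one parameter of a toy exposure-response impact model by Monte Carlo. The model, the parameter values, and the distribution assumed for the slope are all hypothetical, chosen only to show the mechanics.

```python
import math
import random

def attributable_cases(pop, baseline_rate, exposure, beta):
    """Toy impact model: excess annual cases attributable to an exposure,
    using a log-linear exposure-response function (illustrative only)."""
    rr = math.exp(beta * exposure)        # relative risk at this exposure
    paf = (rr - 1.0) / rr                 # population attributable fraction
    return pop * baseline_rate * paf

# Random-sampling (Monte Carlo) uncertainty analysis: draw the uncertain
# exposure-response slope from an assumed normal distribution and collect
# the resulting distribution of model outputs.
random.seed(42)
draws = sorted(
    attributable_cases(pop=1_000_000, baseline_rate=0.008, exposure=12.0,
                       beta=random.gauss(0.006, 0.0015))
    for _ in range(10_000)
)
median = draws[len(draws) // 2]
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
```

The spread between `lo` and `hi` is the 95% uncertainty interval for the health impact implied by the assumed parameter uncertainty; structural and conceptual uncertainty, as the review notes, would not be captured by this step.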

  19. Using Active Learning to Teach Concepts and Methods in Quantitative Biology.

    PubMed

    Waldrop, Lindsay D; Adolph, Stephen C; Diniz Behn, Cecilia G; Braley, Emily; Drew, Joshua A; Full, Robert J; Gross, Louis J; Jungck, John A; Kohler, Brynja; Prairie, Jennifer C; Shtylla, Blerta; Miller, Laura A

    2015-11-01

    This article provides a summary of the ideas discussed at the 2015 Annual Meeting of the Society for Integrative and Comparative Biology society-wide symposium on Leading Students and Faculty to Quantitative Biology through Active Learning. It also includes a brief review of the recent advancements in incorporating active learning approaches into quantitative biology classrooms. We begin with an overview of recent literature that shows that active learning can improve students' outcomes in Science, Technology, Engineering and Math Education disciplines. We then discuss how this approach can be particularly useful when teaching topics in quantitative biology. Next, we describe some of the recent initiatives to develop hands-on activities in quantitative biology at both the graduate and the undergraduate levels. Throughout the article we provide resources for educators who wish to integrate active learning and technology into their classrooms.

  20. Advances in nucleic acid-based detection methods.

    PubMed Central

    Wolcott, M J

    1992-01-01

    Laboratory techniques based on nucleic acid methods have increased in popularity over the last decade with clinical microbiologists and other laboratory scientists who are concerned with the diagnosis of infectious agents. This increase in popularity is a result primarily of advances made in nucleic acid amplification and detection techniques. Polymerase chain reaction, the original nucleic acid amplification technique, changed the way many people viewed and used nucleic acid techniques in clinical settings. After the potential of polymerase chain reaction became apparent, other methods of nucleic acid amplification and detection were developed. These alternative nucleic acid amplification methods may become serious contenders for application to routine laboratory analyses. This review presents some background information on nucleic acid analyses that might be used in clinical and anatomical laboratories and describes some recent advances in the amplification and detection of nucleic acids. PMID:1423216

  1. Semi-quantitative method to estimate levels of Campylobacter

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Introduction: Research projects utilizing live animals and/or systems often require reliable, accurate quantification of Campylobacter following treatments. Even with marker strains, conventional methods designed to quantify are labor and material intensive requiring either serial dilutions or MPN ...

  2. Methods for equine preantral follicles isolation: quantitative aspects.

    PubMed

    Leonel, E C R; Bento-Silva, V; Ambrozio, K S; Luna, H S; Costa e Silva, E V; Zúccari, C E S N

    2013-12-01

    The aim of this study was to test the use of mechanical and mechanical-enzymatic methods, saline solution (SS), and PBS solution for the manipulation and isolation of mare ovarian preantral follicles (PAFs). The ovaries were subjected to mechanical isolation (mixer) alone or in association with enzymatic digestion (collagenase). Incubation times of 10 and 20 min were employed. In the first group, 4.1 ± 4.9 PAFs were harvested with the mechanical-enzymatic method vs 71.1 ± 19.2 with the mechanical procedure, showing a significant difference between methods; using SS and PBS, these numbers were 35.7 ± 34.3 and 39.6 ± 39.6, respectively, with no significant difference between solutions. In the second group, there was a significant difference between methods, with 7.1 ± 10.6 follicles harvested with the mechanical-enzymatic method vs 63.2 ± 22.9 with the mechanical procedure; using SS and PBS, means were 35.5 ± 36.4 and 34.9 ± 31.1, respectively. The mechanical method proved more effective than the mechanical-enzymatic approach. Both SS and PBS can be used as media for equine PAF preparation.

  3. Quantitative Analysis of Intra-chromosomal Contacts: The 3C-qPCR Method.

    PubMed

    Ea, Vuthy; Court, Franck; Forné, Thierry

    2017-01-01

    The chromosome conformation capture (3C) technique is fundamental to many population-based methods investigating chromatin dynamics and organization in eukaryotes. Here, we provide a modified quantitative 3C (3C-qPCR) protocol for improved quantitative analyses of intra-chromosomal contacts. We also describe an algorithm for data normalization which allows more accurate comparisons between contact profiles.

  4. A general method for the quantitative assessment of mineral pigments.

    PubMed

    Ares, M C Zurita; Fernández, J M

    2016-01-01

    A general method for the estimation of mineral pigment contents in different bases has been proposed using a single set of calibration curves (one for each pigment) calculated for a white standard base, so that elaborating calibration patterns for each base is unnecessary. The method can be used in different bases, and its validity has been proven even in strongly tinted bases. It consists of a novel procedure that combines diffuse reflectance spectroscopy, second derivatives and the Kubelka-Munk function. This technique proved to be at least one order of magnitude more sensitive than X-ray diffraction for colored compounds, since it allowed the determination of the pigment content in colored samples containing 0.5 wt% of pigment, which was not detected by X-ray diffraction. The method can be used to estimate the concentration of mineral pigments in a wide variety of natural or artificial materials, since it does not require the calculation of a pigment pattern in every base. This could have important industrial consequences, as the proposed method would be more convenient, faster and cheaper.
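    The core of the approach, converting diffuse reflectance to the Kubelka-Munk function and reading concentration off a single calibration line, can be sketched as follows. The reflectance and concentration values below are invented for illustration, and the second-derivative preprocessing step described in the abstract is omitted.

```python
def kubelka_munk(R):
    """Kubelka-Munk remission function F(R) = (1 - R)^2 / (2R),
    proportional to K/S and, for dilute pigments, roughly linear
    in pigment concentration."""
    return (1.0 - R) ** 2 / (2.0 * R)

# Hypothetical calibration: diffuse reflectance of a white standard base
# loaded with known pigment fractions (synthetic numbers).
conc = [0.5, 1.0, 2.0, 4.0]            # wt% pigment
refl = [0.82, 0.70, 0.55, 0.39]        # reflectance at the pigment's band
fr = [kubelka_munk(r) for r in refl]

# Least-squares line F(R) = a * c + b gives the calibration curve.
n = len(conc)
mean_c = sum(conc) / n
mean_f = sum(fr) / n
a = sum((c - mean_c) * (f - mean_f) for c, f in zip(conc, fr)) / \
    sum((c - mean_c) ** 2 for c in conc)
b = mean_f - a * mean_c

def estimate_concentration(R):
    """Invert the calibration line to estimate wt% pigment in an unknown."""
    return (kubelka_munk(R) - b) / a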

  5. Sequencing human ribs into anatomical order by quantitative multivariate methods.

    PubMed

    Cirillo, John; Henneberg, Maciej

    2012-06-01

    Little research has focussed on methods to anatomically sequence ribs. Correct anatomical sequencing of ribs assists in determining the location and distribution of regional trauma, age estimation, number of puncture wounds, number of individuals, and personal identification. The aim of the current study is to develop a method for placing fragmented and incomplete rib sets into correct anatomical position. Ribs 2-10 were used from eleven cadavers of an Australian population. Seven variables were measured from anatomical locations on the rib. General descriptive statistics were calculated for each variable along with an analysis of variance (ANOVA) and ANOVA with Bonferroni statistics. Considerable overlap was observed between ribs for univariate methods. Bivariate and multivariate methods were then applied. Results of the ANOVA with post hoc Bonferroni statistics show that ratios of various dimensions of a single rib could be used to sequence it within adjacent ribs. Using multiple regression formulae, the most accurate estimation of the anatomical rib number occurs when the entire rib is found in isolation. This however, is not always possible. Even when only the head and neck of the rib are preserved, a modified multivariate regression formula assigned 91.95% of ribs into correct anatomical position or as an adjacent rib. Using multivariate methods it is possible to sequence a single human rib with a high level of accuracy and they are superior to univariate methods. Left and right ribs were found to be highly symmetrical. Some rib dimensions were greater in males than in females, but overall the level of sexual dimorphism was low.
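    The regression step can be illustrated with a small sketch: fit rib number by multiple linear regression on rib measurements and score a prediction as successful when it lands on the correct or an adjacent rib, the criterion quoted above. The measurements below are synthetic stand-ins with an assumed drift along the rib series, not the study's data or variables.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins for rib measurements: each rib 2-10 from 11
# "cadavers" gets a head-neck length and a chord length that drift
# with rib number, plus individual variation (illustrative numbers).
rib_no = np.repeat(np.arange(2, 11), 11)
head_neck = 18 + 1.6 * rib_no + rng.normal(0, 1.0, rib_no.size)
chord = 60 + 7.0 * rib_no - 0.3 * rib_no ** 2 + rng.normal(0, 2.0, rib_no.size)

# Multiple linear regression rib_no ~ head_neck + chord via least squares.
X = np.column_stack([np.ones_like(head_neck), head_neck, chord])
coef, *_ = np.linalg.lstsq(X, rib_no, rcond=None)
pred = np.rint(X @ coef)                 # round to the nearest rib number

# "Correct or adjacent rib" accuracy, as reported in the abstract.
within_one = float(np.mean(np.abs(pred - rib_no) <= 1))
```

On these synthetic data the fitted formula places nearly all ribs in the correct or an adjacent position, mirroring the kind of accuracy the study reports for multivariate formulae.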

  6. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  7. Spinal Cord Segmentation by One Dimensional Normalized Template Matching: A Novel, Quantitative Technique to Analyze Advanced Magnetic Resonance Imaging Data.

    PubMed

    Cadotte, Adam; Cadotte, David W; Livne, Micha; Cohen-Adad, Julien; Fleet, David; Mikulis, David; Fehlings, Michael G

    2015-01-01

    Spinal cord segmentation is a developing area of research intended to aid the processing and interpretation of advanced magnetic resonance imaging (MRI). For example, high-resolution three-dimensional volumes can be segmented to provide a measurement of spinal cord atrophy. Spinal cord segmentation is difficult due to the variety of MRI contrasts and the variation in human anatomy. In this study we propose a new method of spinal cord segmentation based on one-dimensional template matching and provide several metrics that can be used to compare with other segmentation methods. A set of ground-truth data from 10 subjects was manually segmented by two different raters. These ground-truth data formed the basis of the segmentation algorithm. A user was required to manually initialize the spinal cord center line on new images, taking less than one minute. Template matching was used to segment the new cord, and a refined center line was calculated from multiple centroids within the segmentation. Arc distances down the spinal cord and cross-sectional areas were calculated. Inter-rater validation was performed by comparing the two manual raters (n = 10). Semi-automatic validation was performed by comparing the two manual raters to the semi-automatic method (n = 10). Comparing the semi-automatic method to one of the raters yielded a Dice coefficient of 0.91 ± 0.02 for ten subjects, a mean distance between spinal cord center lines of 0.32 ± 0.08 mm, and a Hausdorff distance of 1.82 ± 0.33 mm. The absolute variation in cross-sectional area for the semi-automatic method versus manual segmentation was comparable to that between the manual raters. The results demonstrate that this novel segmentation method performs as well as a manual rater on most segmentation metrics. It offers a new approach to studying spinal cord disease and to quantitatively tracking changes within the spinal cord in an individual case and across cohorts of subjects.
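    One-dimensional normalized template matching reduces to sliding a template along an intensity profile and scoring each offset by normalized cross-correlation. The sketch below uses a synthetic cord-like intensity bump, not the authors' code or real MRI data.

```python
import numpy as np

def normalized_template_match(signal, template):
    """Slide a 1-D template along a signal and return the normalized
    cross-correlation score at each offset (values in [-1, 1])."""
    n = template.size
    t = (template - template.mean()) / template.std()
    scores = np.empty(signal.size - n + 1)
    for i in range(scores.size):
        w = signal[i:i + n]
        sd = w.std()
        # A perfectly flat window carries no shape information.
        scores[i] = 0.0 if sd == 0.0 else np.dot((w - w.mean()) / sd, t) / n
    return scores

# Hypothetical intensity profile across one image row: a bright,
# cord-like Gaussian bump centered at x = 40 on a darker background.
x = np.arange(64, dtype=float)
profile = 100.0 + 80.0 * np.exp(-0.5 * ((x - 40.0) / 3.0) ** 2)
template = 100.0 + 80.0 * np.exp(-0.5 * ((np.arange(15) - 7.0) / 3.0) ** 2)

scores = normalized_template_match(profile, template)
center = int(np.argmax(scores)) + template.size // 2   # best-match center
```

Repeating this match row by row yields candidate cord centers from which a center line and cross-sectional boundaries can be refined, which is the role template matching plays in the method described above.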

  8. Advanced Doubling Adding Method for Radiative Transfer in Planetary Atmospheres

    NASA Astrophysics Data System (ADS)

    Liu, Quanhua; Weng, Fuzhong

    2006-12-01

    The doubling adding method (DA) is one of the most accurate tools for detailed multiple-scattering calculations. The principle of the method goes back to the nineteenth century, in a problem dealing with reflection and transmission by glass plates. Since then the doubling adding method has been widely used as a reference tool for other radiative transfer models. The method has never been used in operational applications owing to the tremendous demand the model places on computational resources. This study derives an analytical expression replacing the most complicated thermal source terms in the doubling adding method. The new development is called the advanced doubling adding (ADA) method. Thanks also to the efficiency of matrix and vector manipulations in FORTRAN 90/95, the advanced doubling adding method is about 60 times faster than the doubling adding method. The radiance (i.e., forward) computation code of ADA is easily translated into tangent linear and adjoint codes for radiance gradient calculations. The simplicity of the forward and Jacobian computation codes is very useful for operational applications and for consistency between the forward and adjoint calculations in satellite data assimilation.
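    The doubling principle itself can be shown in its simplest scalar, non-absorbing form: given the reflection and transmission of a thin layer, summing the multiple reflections between two copies of it yields the properties of a layer twice as thick. This toy version omits the matrix formulation and the thermal source terms that ADA addresses; the starting values are illustrative.

```python
def double_layer(R, T):
    """One scalar doubling step: combine two identical layers.
    The geometric series of inter-layer reflections,
    1 + R*R + (R*R)**2 + ..., sums to 1 / (1 - R*R)."""
    denom = 1.0 - R * R
    return R + T * R * T / denom, T * T / denom

# Start from an optically thin, non-absorbing layer and double 12 times,
# reaching a layer 2**12 = 4096 times thicker (illustrative numbers).
R, T = 1.0e-4, 1.0 - 1.0e-4
for _ in range(12):
    R, T = double_layer(R, T)
```

For a conservative (non-absorbing) layer, R + T stays exactly 1 through every doubling step, a useful sanity check on any doubling-adding implementation.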

  9. General advancing front packing algorithm for the discrete element method

    NASA Astrophysics Data System (ADS)

    Morfa, Carlos A. Recarey; Pérez Morales, Irvin Pablo; de Farias, Márcio Muniz; de Navarra, Eugenio Oñate Ibañez; Valera, Roberto Roselló; Casañas, Harold Díaz-Guzmán

    2016-11-01

    A generic formulation of a new method for packing particles is presented. It is based on a constructive advancing front method, and uses Monte Carlo techniques for the generation of particle dimensions. The method can be used to obtain virtual dense packings of particles with several geometrical shapes. It employs continuous, discrete, and empirical statistical distributions in order to generate the dimensions of particles. The packing algorithm is very flexible and allows alternatives for: 1—the direction of the advancing front (inwards or outwards), 2—the selection of the local advancing front, 3—the method for placing a mobile particle in contact with others, and 4—the overlap checks. The algorithm also allows obtaining highly porous media when it is slightly modified. The use of the algorithm to generate real particle packings from grain size distribution curves, in order to carry out engineering applications, is illustrated. Finally, basic applications of the algorithm, which prove its effectiveness in the generation of a large number of particles, are carried out.
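    One ingredient named above, Monte Carlo generation of particle dimensions from an empirical grain-size distribution curve, can be sketched by inverse-CDF sampling with linear interpolation between sieve points. The sieve data below are invented for illustration, and the advancing-front placement step itself is not shown.

```python
import bisect
import random

# Empirical grain-size distribution: sieve size (mm) vs. cumulative
# fraction passing (illustrative numbers for a sandy soil).
sizes = [0.075, 0.15, 0.30, 0.60, 1.18, 2.36, 4.75]
passing = [0.02, 0.10, 0.30, 0.55, 0.75, 0.92, 1.00]

def sample_diameter(rng):
    """Inverse-CDF (Monte Carlo) sampling of a particle diameter from the
    empirical curve, interpolating linearly between sieve points."""
    u = rng.random()
    i = bisect.bisect_left(passing, u)
    if i == 0:
        return sizes[0]                      # clamp below the finest sieve
    f = (u - passing[i - 1]) / (passing[i] - passing[i - 1])
    return sizes[i - 1] + f * (sizes[i] - sizes[i - 1])

rng = random.Random(7)
diameters = [sample_diameter(rng) for _ in range(20_000)]
median = sorted(diameters)[len(diameters) // 2]
```

The sampled diameters reproduce the input curve (for these numbers the median falls near 0.54 mm, where the cumulative curve crosses 50%), so a packing built from them matches the target gradation, as in the engineering applications the abstract describes.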

  10. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, Theodore H. H.

    1991-01-01

    The following tasks on the study of advanced stress analysis methods applicable to turbine engine structures are described: (1) constructions of special elements which contain traction-free circular boundaries; (2) formulation of new version of mixed variational principles and new version of hybrid stress elements; (3) establishment of methods for suppression of kinematic deformation modes; (4) construction of semiLoof plate and shell elements by assumed stress hybrid method; and (5) elastic-plastic analysis by viscoplasticity theory using the mechanical subelement model.

  11. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, Edward F.; Keller, Richard A.; Apel, Charles T.

    1983-01-01

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions.

  12. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1983-09-06

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions. 6 figs.

  13. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1981-02-25

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules or ions.

  14. Selection methods in forage breeding: a quantitative appraisal

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Forage breeding can be extraordinarily complex because of the number of species, perenniality, mode of reproduction, mating system, and the genetic correlation for some traits evaluated in spaced plants vs. performance under cultivation. Aiming to compare eight forage breeding methods for direct sel...

  15. Magnetic Ligation Method for Quantitative Detection of MicroRNAs

    PubMed Central

    Liong, Monty; Im, Hyungsoon; Majmudar, Maulik D.; Aguirre, Aaron D.; Sebas, Matthew; Lee, Hakho; Weissleder, Ralph

    2014-01-01

    A magnetic ligation method is utilized for the detection of microRNAs amongst a complex biological background without polymerase chain reaction or nucleotide modification. The sandwich probes assay can be adapted to analyze a panel of microRNAs associated with cardiovascular diseases in heart tissue samples. PMID:24532323

  16. Computer Image Analysis Method for Rapid Quantitation of Macrophage Phagocytosis

    DTIC Science & Technology

    1990-01-01

    [Abstract not legible in source; recoverable citation fragments: Methods in Enzymology (Di Sabato, G., and Everse, J., Eds.), New York: Academic Press, Vol. 132, p. 3, 1986; Blood 68, 200, 1986.]

  17. Quantitative method for enumeration of enterotoxigenic Escherichia coli.

    PubMed Central

    Calderon, R L; Levin, M A

    1981-01-01

    A rapid method was developed to quantify toxigenic Escherichia coli, using a membrane filter procedure. After filtration of samples, the membrane filter was first incubated on a medium selective for E. coli (24 h, 44°C) and then transferred to tryptic soy agar (3%; 6 h, 37°C). To assay for labile toxin-producing colonies, the filter was then transferred to a monolayer of Y-1 cells, the E. coli colonies were marked on the bottom of the petri dish, and the filter was removed after 15 min. The monolayer was observed for a positive rounding effect after a 15- to 24-h incubation. The method has an upper limit of detecting 30 toxigenic colonies per plate and can detect as few as one toxigenic colony per plate. A preliminary screening for these enterotoxigenic strains in polluted waters and known positive fecal samples was performed, and positive results were obtained with fecal samples only. PMID:7007415

  18. A quantitative sampling method for Oncomelania quadrasi by filter paper.

    PubMed

    Tanaka, H; Santos, M J; Matsuda, H; Yasuraoka, K; Santos, A T

    1975-08-01

    Filter paper was found to attract Oncomelania quadrasi in water in the same way as fallen dried banana leaves, although fewer snails of other species were collected on the former than on the latter. Snails were collected in limited areas using a tube sampler (85 cm² cross-sectional area) and a filter-paper sampler (20 × 20 cm). The sheet of filter paper was placed close to the spot where a tube sample was taken and recovered after 24 hours. At each sampling, 30 samples were taken by each method in an area, and sampling was made four times. The correlation between the number of snails collected by the tube and that by filter paper was studied. The ratio of the snail counts by the tube sampler to those by the filter paper was 1.18. A loose correlation was observed between the snail counts of the two methods, as shown by the correlation coefficient r = 0.6502. The formulas for the regression lines were Y = 0.77X + 1.6 and X = 0.55Y + 1.35 for 3 experiments, where Y is the number of snails collected by tube sampling and X is the number collected on the sheet of filter paper. The type of snail distribution was studied in the 30 samples taken by each method and was observed to be nearly the same for both sampling methods. All sampling data were found to fit the negative binomial distribution, with the value of the constant k in (q - p)^(-k) varying widely from 0.5775 to 5.9186. In each experiment, the constant k was always larger in tube sampling than in filter-paper sampling. This indicates that the uneven distribution of snails on the soil surface becomes more conspicuous with filter-paper sampling.
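    The quoted statistics, the two regression lines and the correlation coefficient r, come from standard paired-count formulas, which can be sketched as follows; the counts below are hypothetical stand-ins, not the study's data.

```python
import math

def paired_count_stats(x, y):
    """Correlation coefficient and the two least-squares regression lines
    (y on x, and x on y) for paired counts from two sampling methods."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    r = sxy / math.sqrt(sxx * syy)
    b_yx = sxy / sxx                  # slope of  y = b_yx * x + a_yx
    b_xy = sxy / syy                  # slope of  x = b_xy * y + a_xy
    return r, (b_yx, my - b_yx * mx), (b_xy, mx - b_xy * my)

# Hypothetical paired counts: X = filter-paper sampler, Y = tube sampler.
X = [0, 2, 3, 1, 6, 4, 0, 8, 2, 5]
Y = [1, 3, 2, 2, 5, 6, 0, 7, 1, 4]
r, (b1, a1), (b2, a2) = paired_count_stats(X, Y)
```

Note the two regression slopes satisfy b_yx * b_xy = r², which is why the abstract reports two different lines (Y on X and X on Y) alongside a single r.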

  19. Facile colorimetric methods for the quantitative determination of tetramisole hydrochloride

    NASA Astrophysics Data System (ADS)

    Amin, A. S.; Dessouki, H. A.

    2002-10-01

    Facile, rapid and sensitive methods for the determination of tetramisole hydrochloride in pure form and in dosage forms are described. The procedures are based on the formation of coloured products with the chromogenic reagents alizarin blue BB (I), alizarin red S (II), alizarin violet 3R (III) and alizarin yellow G (IV). The coloured products showed absorption maxima at 605, 468, 631 and 388 nm for I-IV, respectively. The colours obtained were stable for 24 h. The colour systems obeyed Beer's law in the concentration ranges 1.0-36, 0.8-32, 1.2-42 and 0.8-30 μg ml⁻¹, respectively. The results obtained showed good recoveries, with relative standard deviations of 1.27, 0.96, 1.13 and 1.35%, respectively. The detection and determination limits were found to be 1.0 and 3.8, 1.2 and 4.2, 1.0 and 3.9, and finally 1.4 and 4.8 ng ml⁻¹ for complexes I-IV, respectively. Applications of the methods to representative pharmaceutical formulations are presented, and their validity was assessed by applying the standard addition technique, with results comparable to those obtained using the official method.
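    The Beer's-law step amounts to fitting a linear calibration of absorbance against concentration and inverting it for unknowns. A minimal sketch with invented absorbance values; the wavelength and concentration range echo those reported for complex I, but the data are not from the study.

```python
def fit_calibration(conc, absorbance):
    """Least-squares Beer's-law line A = m*c + b over the linear range."""
    n = len(conc)
    mc = sum(conc) / n
    ma = sum(absorbance) / n
    m = sum((c - mc) * (a - ma) for c, a in zip(conc, absorbance)) / \
        sum((c - mc) ** 2 for c in conc)
    return m, ma - m * mc

# Hypothetical calibration at 605 nm for the alizarin blue BB complex,
# spanning part of the reported Beer's-law range (absorbances invented).
conc = [2.0, 8.0, 16.0, 24.0, 32.0]          # ug/ml tetramisole HCl
absv = [0.061, 0.242, 0.488, 0.731, 0.973]   # measured absorbance

m, b = fit_calibration(conc, absv)

def concentration(A):
    """Estimate drug concentration (ug/ml) from a sample absorbance."""
    return (A - b) / m
```

Standard addition, the validation technique mentioned above, reuses the same line: spiking known increments into a sample and extrapolating the fitted line back to zero absorbance recovers the unspiked concentration.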

  20. Advanced three-dimensional dynamic analysis by boundary element methods

    NASA Technical Reports Server (NTRS)

    Banerjee, P. K.; Ahma, S.

    1985-01-01

    Advanced formulations of boundary element method for periodic, transient transform domain and transient time domain solution of three-dimensional solids have been implemented using a family of isoparametric boundary elements. The necessary numerical integration techniques as well as the various solution algorithms are described. The developed analysis has been incorporated in a fully general purpose computer program BEST3D which can handle up to 10 subregions. A number of numerical examples are presented to demonstrate the accuracy of the dynamic analyses.

  1. Advanced boundary element methods in aeroacoustics and elastodynamics

    NASA Astrophysics Data System (ADS)

    Lee, Li

    In the first part of this dissertation, advanced boundary element methods (BEM) are developed for acoustic radiation in the presence of subsonic flows. A direct boundary integral formulation is first introduced for acoustic radiation in a uniform flow. This new formulation uses the Green's function derived from the adjoint operator of the governing differential equation. Therefore, it requires no coordinate transformation. This direct BEM formulation is then extended to acoustic radiation in a nonuniform-flow field. All the terms due to the nonuniform-flow effect are taken to the right-hand side and treated as source terms. The source terms result in a domain integral in the standard boundary integral formulation. The dual reciprocity method is then used to convert the domain integral into a number of boundary integrals. The second part of this dissertation is devoted to the development of advanced BEM algorithms to overcome the multi-frequency and nonuniqueness difficulties in steady-state elastodynamics. For the multi-frequency difficulty, two different interpolation schemes, borrowed from recent developments in acoustics, are first extended to elastodynamics to accelerate the process of matrix re-formation. Then, a hybrid scheme that retains only the merits of the two different interpolation schemes is suggested. To overcome the nonuniqueness difficulty, an enhanced CHIEF (Combined Helmholtz Integral Equation Formulation) method using a linear combination of the displacement and the traction boundary integral equations on the surface of a small interior volume is proposed. Numerical examples are given to demonstrate all the advanced BEM formulations.

  2. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR.

  3. Pleistocene Lake Bonneville and Eberswalde Crater of Mars: Quantitative Methods for Recognizing Poorly Developed Lacustrine Shorelines

    NASA Astrophysics Data System (ADS)

    Jewell, P. W.

    2014-12-01

    The ability to quantify shoreline features on Earth has been aided by advances in the acquisition of high-resolution topography through laser imaging and photogrammetry. Well-defined and well-documented features such as the Bonneville, Provo, and Stansbury shorelines of Late Pleistocene Lake Bonneville are recognizable to the untrained eye and easily mappable on aerial photos. The continuity and correlation of lesser shorelines must rely on quantitative algorithms for processing high-resolution data in order to gain widespread scientific acceptance. Using Savitzky-Golay filters and the geomorphic methods and criteria described by Hare et al. [2001], minor, transgressive, erosional shorelines of Lake Bonneville have been identified and correlated across the basin with varying degrees of statistical confidence. Results solve one of the key paradoxes of Lake Bonneville first described by G. K. Gilbert in the late 19th century and point the way toward understanding climatically driven oscillations of the Last Glacial Maximum in the Great Basin of the United States. Similar techniques have been applied to the Eberswalde Crater area of Mars using HiRISE DEMs (1 m horizontal resolution), where a paleolake is hypothesized to have existed. Results illustrate the challenges of identifying shorelines where long-term aeolian processes have degraded the shorelines and field validation is not possible. The work illustrates the promise and challenges of identifying remnants of a global ocean elsewhere on the red planet.
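    The Savitzky-Golay smoothing step mentioned above can be sketched in a few lines. This is a generic 5-point quadratic filter using the standard published coefficients, not the authors' processing pipeline:

```python
def savitzky_golay_5(profile):
    """Smooth a 1-D elevation profile with a 5-point quadratic
    Savitzky-Golay filter (coefficients -3, 12, 17, 12, -3, sum 35).
    The first and last two samples are left unchanged."""
    c = (-3.0, 12.0, 17.0, 12.0, -3.0)
    out = list(profile)
    for i in range(2, len(profile) - 2):
        out[i] = sum(ci * profile[i + k - 2] for k, ci in enumerate(c)) / 35.0
    return out
```

    A useful property for shoreline work: because the filter fits a quadratic in each window, locally quadratic topography passes through unchanged while high-frequency noise is attenuated.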

  4. A novel generalized ridge regression method for quantitative genetics.

    PubMed

    Shen, Xia; Alam, Moudud; Fikse, Freddy; Rönnegård, Lars

    2013-04-01

    As the molecular marker density grows, there is a strong need in both genome-wide association studies and genomic selection to fit models with a large number of parameters. Here we present a computationally efficient generalized ridge regression (RR) algorithm for situations in which the number of parameters largely exceeds the number of observations. The computationally demanding parts of the method depend mainly on the number of observations and not the number of parameters. The algorithm was implemented in the R package bigRR based on the previously developed package hglm. Using such an approach, a heteroscedastic effects model (HEM) was also developed, implemented, and tested. The efficiency for different data sizes was evaluated via simulation. The method was tested for a bacteria-hypersensitive trait in a publicly available Arabidopsis data set including 84 inbred lines and 216,130 SNPs. The computation of all the SNP effects required <10 sec using a single 2.7-GHz core. The advantage in run time makes permutation tests feasible for such a whole-genome model, so that a genome-wide significance threshold can be obtained. HEM was found to be more robust than ordinary RR (a.k.a. SNP-best linear unbiased prediction) in terms of QTL mapping, because SNP-specific shrinkage was applied instead of a common shrinkage. The proposed algorithm was also assessed for genomic evaluation and was shown to give better predictions than ordinary RR.
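    The computational trick behind such p >> n ridge solvers is that the estimate can be obtained from an n x n system (n = observations) rather than a p x p one (p = markers), via the identity beta = X'(XX' + lam*I)^(-1) y. The sketch below is a generic dual-form ridge solver for illustration, not the bigRR implementation:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ridge_dual(X, y, lam):
    """Ridge estimate beta = X' (X X' + lam I)^(-1) y.
    Only an n x n system is solved, so cost scales with the
    number of observations, not the number of markers."""
    n, p = len(X), len(X[0])
    G = [[sum(X[i][k] * X[j][k] for k in range(p)) + (lam if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(G, y)
    return [sum(X[i][j] * alpha[i] for i in range(n)) for j in range(p)]
```

    The result is algebraically identical to the usual primal solution (X'X + lam*I)^(-1) X'y, which is why the run-time advantage comes for free.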

  5. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-10-16

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An emerging challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics with innovative computational methods that go beyond function enrichment and biological validation. To chart progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). A gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) for TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks.

  6. Multiparametric monitoring of chemotherapy treatment response in locally advanced breast cancer using quantitative ultrasound and diffuse optical spectroscopy

    PubMed Central

    Tran, William T.; Childs, Charmaine; Chin, Lee; Slodkowska, Elzbieta; Sannachi, Lakshmanan; Tadayyon, Hadi; Watkins, Elyse; Wong, Sharon Lemon; Curpen, Belinda; Kaffas, Ahmed El; Al-Mahrouki, Azza; Sadeghi-Naini, Ali; Czarnota, Gregory J.

    2016-01-01

    Purpose This study evaluated pathological response to neoadjuvant chemotherapy using quantitative ultrasound (QUS) and diffuse optical spectroscopy imaging (DOSI) biomarkers in locally advanced breast cancer (LABC). Materials and Methods The institution's ethics review board approved this study. Subjects (n = 22) gave written informed consent prior to participating. US and DOSI data were acquired, relative to the start of neoadjuvant chemotherapy, at weeks 0, 1, 4, 8 and preoperatively. QUS parameters including the mid-band fit (MBF), 0-MHz intercept (SI), and the spectral slope (SS) were determined from tumor ultrasound data using spectral analysis. In the same patients, DOSI was used to measure parameters relating to tumor hemoglobin and composition. Discriminant analysis and receiver-operating characteristic (ROC) analysis were used to classify clinical and pathological response during treatment and to estimate the area under the curve (AUC). Additionally, multivariate analysis was carried out for pairwise QUS/DOSI parameter combinations using a logistic regression model. Results Individual QUS and DOSI parameters, including the SI, oxy-hemoglobin (HbO2), and total hemoglobin (HbT), were significant markers for response after one week of treatment (p < 0.01). Multivariate (pairwise) combinations increased the sensitivity, specificity and AUC at this time; the SI + HbO2 combination showed a sensitivity/specificity of 100% and an AUC of 1.0. Conclusions QUS and DOSI demonstrated potential as coincident markers for treatment response and may potentially facilitate response-guided therapies. Multivariate QUS and DOSI parameters increased the sensitivity and specificity of classifying LABC patients as early as one week after treatment. PMID:26942698

  7. Quantitative evaluation of material degradation by Barkhausen noise method

    SciTech Connect

    Yamaguchi, Atsunori; Maeda, Noriyoshi; Sugibayashi, Takuya

    1995-12-01

    Evaluating the remaining life of nuclear power plants becomes inevitable as plant operating periods are extended. This paper applied a magnetic method using Barkhausen noise (BHN) to detect degradation caused by fatigue and thermal aging. Low alloy steel (SA 508 cl.2) was fatigued at strain amplitudes of ±1% and ±0.4%, and duplex stainless steel (SCS14A) was held at 400 °C for a long period (thermal aging). For the material degraded by thermal aging, BHN was measured and a good correlation between magnetic properties and the absorption energy of the material was obtained. For the fatigued material, BHN was measured at each predetermined cycle, the effect of the stress or strain present in the material during measurement was evaluated, and a good correlation between BHN and the fatigue damage ratio was obtained.

  8. Quantitative evaluation of solar wind time-shifting methods

    NASA Astrophysics Data System (ADS)

    Cameron, Taylor; Jackel, Brian

    2016-11-01

    Nine years of solar wind dynamic pressure and geosynchronous magnetic field data are used for a large-scale statistical comparison of uncertainties associated with several different algorithms for propagating solar wind measurements. The MVAB-0 scheme is best overall, performing on average a minute more accurately than a flat time-shift. We also evaluate the accuracy of these time-shifting methods as a function of solar wind magnetic field orientation. We find that all time-shifting algorithms perform significantly worse (>5 min) due to geometric effects when the solar wind magnetic field is radial (parallel or antiparallel to the Earth-Sun line). Finally, we present an empirical scheme that performs almost as well as MVAB-0 on average and slightly better than MVAB-0 for intervals with nonradial B.
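    The flat time-shift used as the baseline above is simply a convection delay at the observed radial speed. A minimal sketch (function and argument names are illustrative):

```python
def flat_time_shift_s(x_monitor_km, x_target_km, vx_km_s):
    """Propagation delay (s) for a flat (planar, Sun-Earth-aligned) shift:
    the time for plasma observed at the monitor to convect to the target,
    assuming a constant antisunward speed vx and a phase front
    perpendicular to the Sun-Earth line."""
    return (x_monitor_km - x_target_km) / abs(vx_km_s)

# e.g. a monitor ~1.5 million km upstream with a 400 km/s solar wind
delay = flat_time_shift_s(1.5e6, 0.0, 400.0)  # 3750 s, roughly an hour
```

    Schemes such as MVAB-0 refine this by tilting the assumed phase front along a minimum-variance direction of the magnetic field, which is why they degrade less for non-perpendicular front orientations.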

  9. New methods for quantitative and qualitative facial studies: an overview.

    PubMed

    Thomas, I T; Hintz, R J; Frias, J L

    1989-01-01

    The clinical study of birth defects has traditionally followed the Gestalt approach, with a trend, in recent years, toward more objective delineation. Data collection, however, has been largely restricted to measurements from X-rays and anthropometry. In other fields, new techniques are being applied that capitalize on the use of modern computer technology. One such technique is that of remote sensing, of which photogrammetry is a branch. Cartographers, surveyors and engineers, using specially designed cameras, have applied geometrical techniques to locate points on an object precisely. These techniques, in their long-range application, have become part of our industrial technology and have assumed great importance with the development of satellite-borne surveillance systems. The close-range application of similar techniques has the potential for extremely accurate clinical measurement. We are currently evaluating the application of remote sensing to facial measurement using three conventional 35 mm still cameras. The subject is photographed in front of a carefully measured grid, and digitization is then carried out on the 35-mm slides: specific landmarks on the craniofacial surface are identified, along with points on the background grid and the four corners of the slide frame, and are registered as x-y coordinates by a digitizer. These coordinates are then converted into precise locations in object space. The technique is capable of producing measurements to within 1/100th of an inch. We suggest that remote sensing methods such as this may well be of great value in the study of congenital malformations.

  10. Optimization of Quantitative PCR Methods for Enteropathogen Detection.

    PubMed

    Liu, Jie; Gratz, Jean; Amour, Caroline; Nshama, Rosemary; Walongo, Thomas; Maro, Athanasia; Mduma, Esto; Platts-Mills, James; Boisen, Nadia; Nataro, James; Haverstick, Doris M; Kabir, Furqan; Lertsethtakarn, Paphavee; Silapong, Sasikorn; Jeamwattanalert, Pimmada; Bodhidatta, Ladaporn; Mason, Carl; Begum, Sharmin; Haque, Rashidul; Praharaj, Ira; Kang, Gagandeep; Houpt, Eric R

    2016-01-01

    Detection and quantification of enteropathogens in stool specimens is useful for diagnosing the cause of diarrhea but is technically challenging. Here we evaluate several important determinants of quantification: specimen collection, nucleic acid extraction, and extraction and amplification efficiency. First, we evaluate the molecular detection and quantification of pathogens in rectal swabs versus stool, using paired flocked rectal swabs and whole stool collected from 129 children hospitalized with diarrhea in Tanzania. Swabs generally yielded a higher quantification cycle (Cq) (29.7 ± 3.5 vs. 25.3 ± 2.9 from stool, P < 0.001) but were still able to detect 80% of pathogens with a Cq < 30 in stool. Second, a simplified total nucleic acid (TNA) extraction procedure was compared to separate DNA and RNA extractions and showed 92% (318/344) sensitivity and 98% (951/968) specificity, with no difference in Cq value for the positive results (ΔCq(DNA+RNA-TNA) = -0.01 ± 1.17, P = 0.972, N = 318). Third, we devised a quantification scheme that adjusts pathogen quantity to the specimen's extraction and amplification efficiency, and show that this better estimates the quantity of spiked specimens than the raw target Cq. In sum, these methods for enteropathogen quantification, stool sample collection, and nucleic acid extraction will be useful for laboratories studying enteric disease.
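    The third step, adjusting a pathogen's quantity for the specimen's extraction and amplification efficiency, can be illustrated with a spike-in control. The exact published scheme is not given in the abstract; the function below is a hypothetical sketch assuming perfect doubling per cycle (efficiency = 2.0), with all names illustrative:

```python
def adjusted_quantity(cq_target, cq_spike_observed, cq_spike_expected,
                      efficiency=2.0):
    """Correct a target Cq by the shift seen in a spiked extraction/
    amplification control, then convert Cq to a relative quantity.
    A spike that comes up late (poor recovery) shifts the target Cq
    down, raising the estimated quantity accordingly."""
    if cq_target is None:
        return 0.0  # target not detected
    cq_adjusted = cq_target - (cq_spike_observed - cq_spike_expected)
    return efficiency ** (-cq_adjusted)
```

    For example, a specimen whose spike amplified two cycles late would have its pathogen quantity revised upward fourfold relative to the raw Cq.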

  11. Optimization of Quantitative PCR Methods for Enteropathogen Detection

    PubMed Central

    Liu, Jie; Gratz, Jean; Amour, Caroline; Nshama, Rosemary; Walongo, Thomas; Maro, Athanasia; Mduma, Esto; Platts-Mills, James; Boisen, Nadia; Nataro, James; Haverstick, Doris M.; Kabir, Furqan; Lertsethtakarn, Paphavee; Silapong, Sasikorn; Jeamwattanalert, Pimmada; Bodhidatta, Ladaporn; Mason, Carl; Begum, Sharmin; Haque, Rashidul; Praharaj, Ira; Kang, Gagandeep; Houpt, Eric R.

    2016-01-01

    Detection and quantification of enteropathogens in stool specimens is useful for diagnosing the cause of diarrhea but is technically challenging. Here we evaluate several important determinants of quantification: specimen collection, nucleic acid extraction, and extraction and amplification efficiency. First, we evaluate the molecular detection and quantification of pathogens in rectal swabs versus stool, using paired flocked rectal swabs and whole stool collected from 129 children hospitalized with diarrhea in Tanzania. Swabs generally yielded a higher quantification cycle (Cq) (average 29.7, standard deviation 3.5 vs. 25.3 ± 2.9 from stool, P<0.001) but were still able to detect 80% of pathogens with a Cq < 30 in stool. Second, a simplified total nucleic acid (TNA) extraction procedure was compared to separate DNA and RNA extractions and showed 92% (318/344) sensitivity and 98% (951/968) specificity, with no difference in Cq value for the positive results (ΔCq(DNA+RNA-TNA) = -0.01 ± 1.17, P = 0.972, N = 318). Third, we devised a quantification scheme that adjusts pathogen quantity to the specimen’s extraction and amplification efficiency, and show that this better estimates the quantity of spiked specimens than the raw target Cq. In sum, these methods for enteropathogen quantification, stool sample collection, and nucleic acid extraction will be useful for laboratories studying enteric disease. PMID:27336160

  12. Analyses of Regional Cultivated Land Change Based on Quantitative Methods

    NASA Astrophysics Data System (ADS)

    Cao, Yingui; Yuan, Chun; Zhou, Wei; Wang, Jing

    The Three Gorges Project, one of the largest engineering projects in the world, has accelerated economic development in its reservoir area. In the course of this development, cultivated land has become an important resource: a great deal of cultivated land has been occupied and converted to construction land, and at the same time much cultivated land has been flooded by the rising water level. This paper uses the cultivated land areas and social-economic indicators of the Three Gorges reservoir area for 1990-2004 and applies statistical analyses and case studies in order to characterize the process of cultivated land change, identify its driving forces, find new methods to simulate and forecast cultivated land areas, and thereby serve cultivated land protection and sustainable development in the reservoir area. The results are as follows. First, over the past 15 years the cultivated land area decreased by 200,142 hm2, an average decrease of 13,343 hm2 per year. The whole reservoir area is divided into three sub-areas (the upper reaches, the belly area, and the lower reaches), and the trends of cultivated land change in each are similar to that of the whole reservoir area. Second, the curve of cultivated land area against per capita GDP takes on an inverted-U shape, and the change rate of cultivated land diverges from the change rate of GDP in some years, indicating that cultivated land change and GDP change are decoupled; in addition, cultivated land change is strongly connected with urbanization and with the policy of returning cultivated land to forest. Finally, multiple regression is less precise than a BP neural network in simulating cultivated land area, so the BP neural network is used to forecast cultivated land areas in 2005, 2010 and 2015, and the forecast results are reasonable.
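    A BP (backpropagation) neural network of the kind used for such forecasting can be sketched minimally as a one-hidden-layer sigmoid network trained by per-sample gradient descent. This toy stand-in is not the authors' model; all hyperparameters are illustrative:

```python
import math
import random

def train_bp(xs, ys, hidden=4, epochs=5000, lr=0.1, seed=1):
    """Fit a one-hidden-layer BP network (sigmoid hidden units, linear
    output) to scalar pairs (x, y) and return a predict(x) function."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]  # input -> hidden
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]  # hidden -> output
    b2 = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            h = [1.0 / (1.0 + math.exp(-(w1[j] * x + b1[j])))
                 for j in range(hidden)]
            err = sum(w2[j] * h[j] for j in range(hidden)) + b2 - y
            for j in range(hidden):
                # backpropagate the error before updating the output weight
                grad_hidden = err * w2[j] * h[j] * (1.0 - h[j])
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * grad_hidden * x
                b1[j] -= lr * grad_hidden
            b2 -= lr * err
    def predict(x):
        h = [1.0 / (1.0 + math.exp(-(w1[j] * x + b1[j])))
             for j in range(hidden)]
        return sum(w2[j] * h[j] for j in range(hidden)) + b2
    return predict
```

    In practice the inputs (years, driving-force indicators) would be normalized to a small range, as the sigmoid units saturate outside it.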

  13. Quantitative research on the primary process: method and findings.

    PubMed

    Holt, Robert R

    2002-01-01

    Freud always defined the primary process metapsychologically, but he described the ways it shows up in dreams, parapraxes, jokes, and symptoms with enough observational detail to make it possible to create an objective, reliable scoring system to measure its manifestations in Rorschach responses, dreams, TAT stories, free associations, and other verbal texts. That system can identify signs of the thinker's efforts, adaptive or maladaptive, to control or defend against the emergence of primary process. A prerequisite and a consequence of the research that used this system was clarification and elaboration of the psychoanalytic theory of thinking. Results of empirical tests of several propositions derived from psychoanalytic theory are summarized. Predictions concerning the method's most useful index, of adaptive vs. maladaptive regression, have been repeatedly verified: People who score high on this index (who are able to produce well-controlled "primary products" in their Rorschach responses), as compared to those who score at the maladaptive pole (producing primary-process-filled responses with poor reality testing, anxiety, and pathological defensive efforts), are better able to tolerate sensory deprivation, are more able to enter special states of consciousness comfortably (drug-induced, hypnotic, etc.), and have higher achievements in artistic creativity, while schizophrenics tend to score at the extreme of maladaptive regression. Capacity for adaptive regression also predicts success in psychotherapy, and rises with the degree of improvement after both psychotherapy and drug treatment. Some predictive failures have been theoretically interesting: Kris's hypothesis about creativity and the controlled use of primary process holds for males but usually not for females. This body of work is presented as a refutation of charges, brought by such critics as Crews, that psychoanalysis cannot become a science.

  14. Advances in Laboratory Methods for Detection and Typing of Norovirus

    PubMed Central

    2014-01-01

    Human noroviruses are the leading cause of epidemic and sporadic gastroenteritis across all age groups. Although the disease is usually self-limiting, in the United States norovirus gastroenteritis causes an estimated 56,000 to 71,000 hospitalizations and 570 to 800 deaths each year. This minireview describes the latest data on laboratory methods (molecular, immunological) for norovirus detection, including real-time reverse transcription-quantitative PCR (RT-qPCR) and commercially available immunological assays as well as the latest FDA-cleared multi-gastrointestinal-pathogen platforms. In addition, an overview is provided on the latest nomenclature and molecular epidemiology of human noroviruses. PMID:24989606

  15. Advances in Statistical Methods for Substance Abuse Prevention Research

    PubMed Central

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
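    One of the mediation-analysis tools referred to, the product-of-coefficients estimate of a mediated effect with its first-order (Sobel) standard error, can be computed as follows. This is a generic illustration of the standard formula, not code from the paper:

```python
import math

def sobel(a, se_a, b, se_b):
    """Mediated (indirect) effect a*b and its first-order (Sobel)
    standard error, where a is the program-to-mediator path and b the
    mediator-to-outcome path; returns (effect, se, z)."""
    effect = a * b
    se = math.sqrt(b * b * se_a * se_a + a * a * se_b * se_b)
    return effect, se, effect / se
```

    A z value beyond about ±1.96 is conventionally read as a significant mediated effect, though the paper's broader point is that confidence intervals and effect sizes should accompany such tests.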

  16. Bio-Photonic Detection and Quantitative Evaluation Method for the Progression of Dental Caries Using Optical Frequency-Domain Imaging Method

    PubMed Central

    Wijesinghe, Ruchire Eranga; Cho, Nam Hyun; Park, Kibeom; Jeon, Mansik; Kim, Jeehyun

    2016-01-01

    The initial detection of dental caries is an essential biomedical requirement for halting the progression of caries and tooth demineralization. The objective of this study is to introduce a quantitative evaluation method, based on an optical frequency-domain imaging technique, to calculate the volume and thickness of residual enamel; a quantification method was also developed to evaluate the total intensity fluctuation in the depth direction owing to carious lesions, which can help identify the progression of dental caries in advance. Cross-sectional images of ex vivo tooth samples were acquired using a 1.3 μm spectral-domain optical coherence tomography (SD-OCT) system. Moreover, the advantages of the proposed method over conventional dental inspection methods were compared to highlight the potential capability of OCT. As a consequence, the threshold parameters obtained through the developed method can serve as an efficient investigative technique for the initial detection of demineralization. PMID:27929440

  17. The Quantitative Methods Boot Camp: Teaching Quantitative Thinking and Computing Skills to Graduate Students in the Life Sciences

    PubMed Central

    Stefan, Melanie I.; Gutlerner, Johanna L.; Born, Richard T.; Springer, Michael

    2015-01-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a “boot camp” in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students’ engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  18. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    PubMed

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  19. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  20. The Ten Beads Method: A Novel Way to Collect Quantitative Data in Rural Uganda

    PubMed Central

    Bwambale, Francis Mulekya; Moyer, Cheryl A.; Komakech, Innocent; -Mangen, Fred-Wabwire; Lori, Jody R

    2013-01-01

    This paper illustrates how locally appropriate methods can be used to collect quantitative data from illiterate respondents. This method uses local beads to represent quantities, which is a novel yet potentially valuable methodological improvement over standard Western survey methods. PMID:25170477

  1. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the
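    The comparisons reported above rest on Pearson chi-square tests of component proportions across article types. For a 2 x 2 table (article type vs. component present/absent) the statistic can be computed as below; this is a generic implementation without continuity correction, not the authors' code:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table,
    computed as sum over cells of (observed - expected)^2 / expected,
    with expected counts from the row and column margins."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = float(sum(row))
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat
```

    The statistic is compared against a chi-square distribution with 1 degree of freedom; values above 3.84 correspond to p < .05.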

  2. Quantitative evaluation of proteins with bicinchoninic acid (BCA): resonance Raman and surface-enhanced resonance Raman scattering-based methods.

    PubMed

    Chen, Lei; Yu, Zhi; Lee, Youngju; Wang, Xu; Zhao, Bing; Jung, Young Mee

    2012-12-21

    A rapid and highly sensitive bicinchoninic acid (BCA) reagent-based protein quantitation tool was developed using competitive resonance Raman (RR) and surface-enhanced resonance Raman scattering (SERRS) methods. A chelation reaction between BCA and Cu(+), which is reduced by protein in an alkaline environment, is exploited to create a BCA-Cu(+) complex that has strong RR and SERRS activities. Using these methods, protein concentrations in solution can be quantitatively measured at levels as low as 50 μg mL(-1) and 10 pg mL(-1), respectively. There are many advantages to using RR- and SERRS-based assays: they exhibit a much wider linear concentration range and provide an additional one (RR method) to four (SERRS method) orders of magnitude improvement in detection limits relative to UV-based methods. Protein-to-protein variation is determined by reference to a BSA standard curve and exhibits excellent recoveries. These novel methods are extremely accurate in detecting total protein concentrations in solution. This improvement in protein detection sensitivity could yield advances in the biological sciences and the medical diagnostics field and extend the applications of reagent-based protein assay techniques.
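    Quantitation against a BSA standard curve, as described, amounts to a linear calibration followed by inverse prediction. A minimal sketch, assuming a linear signal-concentration response over the working range:

```python
def fit_line(x, y):
    """Least-squares slope and intercept of a standard curve
    (signal y measured at known standard concentrations x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def concentration_from_signal(signal, slope, intercept):
    """Invert the calibration: the concentration at which the fitted
    line predicts the observed signal."""
    return (signal - intercept) / slope
```

    Recovery is then assessed by running samples of known concentration through the curve and checking how close the predicted values come to the true ones.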

  3. Prediction and preliminary standardization of fire debris constituents with the advanced distillation curve method.

    PubMed

    Bruno, Thomas J; Lovestead, Tara M; Huber, Marcia L

    2011-01-01

    The recent National Academy of Sciences report on forensic sciences states that the study of fire patterns and debris in arson fires is in need of additional work and eventual standardization. We discuss a recently introduced method that can provide predicted evaporation patterns for ignitable liquids as a function of temperature. The method is a complex fluid analysis protocol, the advanced distillation curve approach, featuring a composition-explicit data channel for each distillate fraction (for qualitative, quantitative, and trace analysis), low-uncertainty temperature measurements that are thermodynamic state points that can be modeled with an equation of state, consistency with a century of historical data, and an assessment of the energy content of each distillate fraction. We discuss the application of the method to kerosenes and gasolines and outline how expansion of the scope of fluids to other ignitable liquids can benefit the criminalist in the analysis of fire debris for arson.

  4. Immunoassay Methods and their Applications in Pharmaceutical Analysis: Basic Methodology and Recent Advances.

    PubMed

    Darwish, Ibrahim A

    2006-09-01

    Immunoassays are bioanalytical methods in which quantitation of the analyte depends on the reaction of an antigen (analyte) with an antibody. Immunoassays are widely used in many important areas of pharmaceutical analysis, such as the diagnosis of diseases, therapeutic drug monitoring, and clinical pharmacokinetic and bioequivalence studies in drug discovery and the pharmaceutical industry. The importance and widespread use of immunoassay methods in pharmaceutical analysis are attributed to their inherent specificity, high throughput, and high sensitivity for the analysis of a wide range of analytes in biological samples. Recently, marked improvements have been achieved in the development of immunoassays for pharmaceutical analysis, involving the preparation of unique immunoanalytical reagents, the analysis of new categories of compounds, methodology, and instrumentation. The basic methodologies and recent advances in immunoassay methods applied in different fields of pharmaceutical analysis are reviewed.

  5. Nanoparticle-mediated photothermal effect enables a new method for quantitative biochemical analysis using a thermometer.

    PubMed

    Fu, Guanglei; Sanjay, Sharma T; Dou, Maowei; Li, XiuJun

    2016-03-14

    A new biomolecular quantitation method, nanoparticle-mediated photothermal bioassay, using a common thermometer as the signal reader was developed. Using an immunoassay as a proof of concept, iron oxide nanoparticles (NPs) captured in the sandwich-type assay system were transformed into a near-infrared (NIR) laser-driven photothermal agent, Prussian blue (PB) NPs, which acted as a photothermal probe to convert the assay signal into heat through the photothermal effect, thus allowing sensitive biomolecular quantitation using a thermometer. This is the first report of biomolecular quantitation using a thermometer and also serves as the first attempt to introduce the nanoparticle-mediated photothermal effect for bioassays.

  6. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific real-time PCR was experimentally determined to be 30-50 target molecules, which is close to the theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 ± 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sampling plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples relative to reference material (calibrants), high priority must be given to international agreement and standardization on certified reference materials.
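
    As a rough illustration of how the LOQ scales with genome size, the sketch below assumes roughly 40 target copies at the LOQ and approximate 2C genome masses of about 1 pg for rice and 34 pg for wheat; these values are illustrative assumptions, not figures taken from the study.

```python
# Approximate 2C genome masses in picograms (illustrative values, assumed
# for this sketch rather than taken from the abstract above).
GENOME_MASS_PG = {"rice": 1.0, "wheat": 34.0}

def loq_percent(species, target_copies=40, input_dna_ng=200):
    """LOQ as % GMO: mass of `target_copies` genomes over the total input DNA."""
    input_pg = input_dna_ng * 1000.0  # 1 ng = 1000 pg
    return 100.0 * target_copies * GENOME_MASS_PG[species] / input_pg

print(round(loq_percent("rice"), 3), round(loq_percent("wheat"), 2))  # 0.02 0.68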

  7. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  8. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, qualitative and quantitative risk assessment methods for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system that includes a causation index, an inherent risk index, a consequence index, and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis, and a risk evaluation. The outcome of the qualitative method is a qualitative risk value; for the quantitative method, the outcomes are individual risk and societal risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion, and unconfined vapor cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data available for the gas pipelines and the precision required of the risk assessment.
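
    The index-plus-weights structure of the qualitative method can be sketched as a weighted sum. The index names below follow the abstract, but every weight and score is invented for illustration; the paper's actual index system is more detailed.

```python
# Index names follow the abstract; weights and scores are made up.
weights = {"causation": 0.40, "inherent": 0.35, "consequence": 0.25}
scores = {"causation": 6.0, "inherent": 4.5, "consequence": 7.0}  # 0-10 scale

# Qualitative risk value: weighted sum of the three sub-indices.
risk_value = sum(weights[k] * scores[k] for k in weights)
print(round(risk_value, 3))  # 5.725
```

    A value like this is only useful for ranking pipeline segments against each other, which is consistent with the abstract's point that the qualitative method suits cases with limited basic data.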

  9. Advanced reactor physics methods for heterogeneous reactor cores

    NASA Astrophysics Data System (ADS)

    Thompson, Steven A.

    To maintain the economic viability of nuclear power, the industry has begun to emphasize maximizing the efficiency and output of existing nuclear power plants by using longer fuel cycles, stretch power uprates, shorter outage lengths, mixed-oxide (MOX) fuel, and more aggressive operating strategies. To accommodate these changes while still satisfying the peaking factor and power envelope requirements necessary to maintain safe operation, more complexity has been introduced into commercial core designs, such as an increase in the number of sub-batches and greater use of both discrete and integral burnable poisons. A consequence of the increased complexity of core designs, as well as the use of MOX fuel, is an increase in the neutronic heterogeneity of the core. Such heterogeneous cores introduce challenges for the current methods used for reactor analysis. New methods must be developed to address these deficiencies while maintaining the computational efficiency of existing reactor analysis methods. In this thesis, advanced core design methodologies are developed to adequately analyze the highly heterogeneous core designs currently in use in commercial power reactors. These methodological improvements are pursued with the goal of not sacrificing the computational efficiency that core designers require. More specifically, the PSU nodal code NEM is being updated to include an SP3 solution option, an advanced transverse leakage option, and a semi-analytical NEM solution option.

  10. Advanced Methods in Black-Hole Perturbation Theory

    NASA Astrophysics Data System (ADS)

    Pani, Paolo

    2013-09-01

    Black-hole perturbation theory is a useful tool to investigate issues in astrophysics, high-energy physics, and fundamental problems in gravity. It is often complementary to fully-fledged nonlinear evolutions and instrumental to interpret some results of numerical simulations. Several modern applications require advanced tools to investigate the linear dynamics of generic small perturbations around stationary black holes. Here, we present an overview of these applications and introduce extensions of the standard semianalytical methods to construct and solve the linearized field equations in curved space-time. Current state-of-the-art techniques are pedagogically explained and exciting open problems are presented.

  11. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis methods, and the metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult, if not impossible, to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
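
    As one example of the repeatability metrics such a framework standardizes, a common choice is the repeatability coefficient RC = 1.96 · sqrt(2) · wSD, computed from test-retest pairs in the Bland-Altman convention. The sketch below uses made-up measurements and is offered as an illustration, not as the document's own method.

```python
import math

def repeatability_coefficient(test, retest):
    """RC = 1.96 * sqrt(2) * wSD, with wSD**2 = mean(d**2) / 2 for the
    paired test-retest differences d (Bland-Altman convention)."""
    diffs = [a - b for a, b in zip(test, retest)]
    wsd = math.sqrt(sum(d * d for d in diffs) / (2 * len(diffs)))
    return 1.96 * math.sqrt(2) * wsd

# Hypothetical test-retest biomarker measurements for three subjects.
rc = repeatability_coefficient([10.2, 11.5, 9.8], [10.6, 11.1, 10.1])
```

    Two repeat measurements on the same subject are then expected to differ by less than RC for about 95% of subjects, which is what makes the metric comparable across studies.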

  12. Quantitative interferometric microscopic flow cytometer with expanded principal component analysis method

    NASA Astrophysics Data System (ADS)

    Wang, Shouyu; Jin, Ying; Yan, Keding; Xue, Liang; Liu, Fei; Li, Zhenhua

    2014-11-01

    Quantitative interferometric microscopy is used in the biological and medical fields, and a wealth of applications have been proposed for detecting different kinds of biological samples. Here, we develop a phase-detecting cytometer based on quantitative interferometric microscopy with an expanded principal component analysis phase retrieval method to obtain phase distributions of red blood cells with a spatial resolution of ~1.5 μm. Because expanded principal component analysis is a time-domain phase retrieval algorithm, it avoids the disadvantages of traditional frequency-domain algorithms. Additionally, the phase retrieval method realizes high-speed phase imaging from multiple microscopic interferograms captured by a CCD camera as the biological cells are scanned through the field of view. We believe this method can be a powerful tool for quantitatively measuring the phase distributions of different biological samples in the biological and medical fields.

  13. An improved simple colorimetric method for quantitation of non-transferrin-bound iron in serum.

    PubMed

    Zhang, D; Okada, S; Kawabata, T; Yasuda, T

    1995-03-01

    A simple method for the direct quantitation of non-transferrin-bound iron (NTBI) in serum is introduced. NTBI was separated from serum by adding excess nitrilotriacetic acid disodium salt (NTA) to form an Fe-NTA complex, followed by ultrafiltration through a micro-filter. The NTBI in the ultrafiltrate was quantitated using a bathophenanthroline-based method. The optimal detection conditions and several potential confounding factors were investigated. Measurements of in vivo and in vitro samples showed that the method is very practical.

  14. A simple method for the subnanomolar quantitation of seven ophthalmic drugs in the rabbit eye.

    PubMed

    Latreille, Pierre-Luc; Banquy, Xavier

    2015-05-01

    This study describes the development and validation of a new liquid chromatography-tandem mass spectrometry (MS/MS) method capable of simultaneous quantitation of seven ophthalmic drugs (pilocarpine, lidocaine, atropine, proparacaine, timolol, prednisolone, and triamcinolone acetonide) within regions of the rabbit eye. The complete validation of the method was performed using an Agilent 1100 series high-performance liquid chromatography system coupled to a 4000 QTRAP MS/MS detector in positive TurboIonSpray mode with pooled drug solutions. The method sensitivity, evaluated in two simulated matrices, yielded lower limits of quantitation of 0.25 nmol L(-1) for most of the drugs. The precision in the low, medium, and high ranges of the calibration curves, the freeze-thaw stability over 1 month, the intraday precision, and the interday precision were all within a 15% limit. The method was used to quantitate the different drugs in the cornea, aqueous humor, vitreous humor, and remaining eye tissues of the rabbit eye, and was validated to concentrations of 1.36 ng/g in humors and 5.43 ng/g in tissues. The unprecedentedly low detection limit of the present method and its ease of implementation allow easy, robust, and reliable quantitation of multiple drugs for rapid in vitro and in vivo evaluation of the local pharmacokinetics of these compounds.
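
    The 15% precision limit cited above is conventionally checked as the relative standard deviation (%RSD) of replicate measurements. A minimal sketch, with hypothetical replicate values not taken from the study:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation of replicate measurements, in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

replicates = [0.26, 0.24, 0.25, 0.27, 0.23]  # nmol/L, hypothetical QC level
print(rsd_percent(replicates) <= 15.0)  # True -> passes the 15% limit
```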

  15. Reproducibility of CSF quantitative culture methods for estimating rate of clearance in cryptococcal meningitis.

    PubMed

    Dyal, Jonathan; Akampurira, Andrew; Rhein, Joshua; Morawski, Bozena M; Kiggundu, Reuben; Nabeta, Henry W; Musubire, Abdu K; Bahr, Nathan C; Williams, Darlisha A; Bicanic, Tihana; Larsen, Robert A; Meya, David B; Boulware, David R

    2016-05-01

    Quantitative cerebrospinal fluid (CSF) cultures provide a measure of disease severity in cryptococcal meningitis. The fungal clearance rate by quantitative cultures has become a primary endpoint for phase II clinical trials. This study determined the inter-assay accuracy of three quantitative culture methodologies. Among 91 participants with meningitis symptoms in Kampala, Uganda, during August-November 2013, 305 CSF samples were prospectively collected at multiple time points during treatment. Samples were simultaneously cultured by three methods: (1) the St. George's method, using a 100 mcl input volume of CSF with five 1:10 serial dilutions; (2) the AIDS Clinical Trials Group (ACTG) method, using 1000, 100, and 10 mcl input volumes and two 1:100 dilutions with 100 and 10 mcl input volumes per dilution on seven agar plates; and (3) a 10 mcl calibrated loop of undiluted and 1:100 diluted CSF (loop). Quantitative culture values did not differ statistically between the St. George and ACTG methods (P = .09) but did between St. George and the 10 mcl loop (P < .001). Repeated-measures pairwise correlation between any of the methods was high (r ≥ 0.88). For detecting sterility, the ACTG method had the highest negative predictive value, 97% (91% St. George, 60% loop), but the ACTG method had occasional (~10%) difficulties in quantification due to colony clumping. For CSF clearance rate, the St. George and ACTG methods did not differ at the group level (mean -0.05 ± 0.07 log10 CFU/ml/day; P = .14); however, individual-level clearance varied. The St. George and ACTG quantitative CSF culture methods produced comparable but not identical results. Quantitative cultures can inform treatment management strategies.
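
    The clearance rate reported above (in log10 CFU/ml/day) is conventionally the slope of a least-squares fit of log10 quantitative-culture counts against treatment day. A sketch with hypothetical patient data, not values from the study:

```python
import math

def clearance_rate(days, cfu_per_ml):
    """Least-squares slope of log10(CFU/ml) against treatment day."""
    logs = [math.log10(c) for c in cfu_per_ml]
    n = len(days)
    mean_x, mean_y = sum(days) / n, sum(logs) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, logs))
    den = sum((x - mean_x) ** 2 for x in days)
    return num / den  # log10 CFU/ml/day; negative means clearance

# Hypothetical patient: cultures drawn on days 0, 3, 7, and 14 of therapy.
rate = clearance_rate([0, 3, 7, 14], [1e5, 3e4, 5e3, 1e2])
```

    A more negative slope corresponds to faster fungal clearance, which is why the rate works as a trial endpoint.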

  16. Improved Methods for Capture, Extraction, and Quantitative Assay of Environmental DNA from Asian Bigheaded Carp (Hypophthalmichthys spp.)

    PubMed Central

    Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207

  17. Improved methods for capture, extraction, and quantitative assay of environmental DNA from Asian bigheaded carp (Hypophthalmichthys spp.).

    PubMed

    Turner, Cameron R; Miller, Derryl J; Coyne, Kathryn J; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species.

  18. Comparison of Overlap Methods for Quantitatively Synthesizing Single-Subject Data

    ERIC Educational Resources Information Center

    Wolery, Mark; Busick, Matthew; Reichow, Brian; Barton, Erin E.

    2010-01-01

    Four overlap methods for quantitatively synthesizing single-subject data were compared to visual analysts' judgments. The overlap methods were percentage of nonoverlapping data, pairwise data overlap squared, percentage of data exceeding the median, and percentage of data exceeding a median trend. Visual analysts made judgments about 160 A-B data…
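
    Of the four overlap methods compared, percentage of nonoverlapping data (PND) is the simplest to state: the share of treatment-phase (B) points that fall beyond the most extreme baseline (A) point. A minimal sketch; the direction-of-improvement flag and the example data are assumptions of this illustration:

```python
def pnd(baseline, treatment, increase_is_improvement=True):
    """Percentage of nonoverlapping data: share of treatment-phase points
    beyond the most extreme baseline point."""
    if increase_is_improvement:
        extreme = max(baseline)
        beyond = sum(1 for b in treatment if b > extreme)
    else:
        extreme = min(baseline)
        beyond = sum(1 for b in treatment if b < extreme)
    return 100.0 * beyond / len(treatment)

print(pnd([2, 4, 3], [5, 6, 4, 7]))  # 75.0
```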

  19. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    ERIC Educational Resources Information Center

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a…

  20. A method for the quantitative determination of crystalline phases by X-ray

    NASA Technical Reports Server (NTRS)

    Petzenhauser, I.; Jaeger, P.

    1988-01-01

    A mineral analysis method is described for rapid quantitative determination of crystalline substances in those cases in which the sample is present in pure form or in a mixture of known composition. With this method there is no need for prior chemical analysis.

  1. Student Performance in a Quantitative Methods Course under Online and Face-to-Face Delivery

    ERIC Educational Resources Information Center

    Verhoeven, Penny; Wakeling, Victor

    2011-01-01

    In a study conducted at a large public university, the authors assessed, for an upper-division quantitative methods business core course, the impact of delivery method (online versus face-to-face) on the success rate (percentage of enrolled students earning a grade of A, B, or C in the course). The success rate of the 161 online students was 55.3%,…

  2. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  3. Integrating Qualitative Methods in a Predominantly Quantitative Evaluation: A Case Study and Some Reflections.

    ERIC Educational Resources Information Center

    Mark, Melvin M.; Feller, Irwin; Button, Scott B.

    1997-01-01

    A review of qualitative methods used in a predominantly quantitative evaluation indicates a variety of roles for such a mixing of methods, including framing and revising research questions, assessing the validity of measures and adaptations to program implementation, and gauging the degree of uncertainty and generalizability of conclusions.…

  4. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    ERIC Educational Resources Information Center

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…

  5. Advanced fluorescence microscopy methods for the real-time study of transcription and chromatin dynamics

    PubMed Central

    Annibale, Paolo; Gratton, Enrico

    2014-01-01

    In this contribution we provide an overview of the recent advances made possible by the use of fluorescence microscopy methods in the study of transcriptional processes and their interplay with chromatin architecture in living cells. Although the use of fluorophores to label nucleic acids dates back at least half a century,1 two recent breakthroughs have effectively opened the way to using fluorescence routinely for specific and quantitative probing of chromatin organization and transcriptional activity in living cells: namely, the ability to label first chromatin loci and then the mRNA synthesized from a gene using fluorescent proteins. In this contribution we focus on methods that can probe rapid dynamic processes by analyzing fast fluorescence fluctuations. PMID:25764219

  6. Advanced magnetic resonance imaging techniques in the preterm brain: methods and applications.

    PubMed

    Tao, Joshua D; Neil, Jeffrey J

    2014-01-01

    Brain development and brain injury in preterm infants are areas of active research. Magnetic resonance imaging (MRI), a non-invasive tool applicable to both animal models and human infants, provides a wealth of information on this process by bridging the gap between histology (available from animal studies) and developmental outcome (available from clinical studies). Moreover, MRI also offers information regarding diagnosis and prognosis in the clinical setting. Recent advances in MR methods (diffusion tensor imaging, volumetric segmentation, surface-based analysis, functional MRI, and quantitative metrics) further increase the sophistication of the information available regarding both brain structure and function. In this review, we discuss the basics of these newer methods as well as their application to the study of premature infants.

  7. Using the Taguchi method for rapid quantitative PCR optimization with SYBR Green I.

    PubMed

    Thanakiatkrai, Phuvadol; Welch, Lindsey

    2012-01-01

    Here, we applied the Taguchi method, an engineering optimization process, to successfully determine the optimal conditions for three SYBR Green I-based quantitative PCR assays. This method balanced the effects of all factors and their associated levels by using an orthogonal array rather than a factorial array. Instead of running 27 experiments with the conventional factorial method, the Taguchi method achieved the same optimal conditions using only nine experiments, saving valuable resources.
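
    The saving the abstract describes (nine runs instead of 27) comes from replacing the full 3-factor, 3-level factorial with an L9 orthogonal array. A sketch of one standard construction, using modular arithmetic to guarantee the balance property:

```python
from itertools import product

# Full factorial for 3 factors at 3 levels: 27 runs.
full = list(product(range(3), repeat=3))

# L9-style orthogonal array: rows (a, b, (a + b) % 3). In every pair of
# columns, each of the 9 level combinations appears exactly once.
l9 = [(a, b, (a + b) % 3) for a, b in product(range(3), repeat=2)]

print(len(full), len(l9))  # 27 9

# Orthogonality check: any two columns jointly cover all 9 level pairs.
for i in range(3):
    for j in range(i + 1, 3):
        assert {(row[i], row[j]) for row in l9} == set(product(range(3), repeat=2))
```

    Because each factor level appears equally often against every level of every other factor, main effects can be estimated from the nine runs, which is the balancing the abstract refers to.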

  8. Methods and Systems for Advanced Spaceport Information Management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  9. Methods and systems for advanced spaceport information management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  10. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    PubMed

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. The method may be used not only for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  11. The promise of mixed-methods for advancing latino health research.

    PubMed

    Apesoa-Varano, Ester Carolina; Hinton, Ladson

    2013-09-01

    Mixed-methods research in the social sciences has been conducted for quite some time. More recently, mixed methods have become popular in health research, with the National Institutes of Health leading the impetus to fund studies that implement such an approach. The public health issues facing us today are great, ranging from policy and other macro-level issues to systems-level problems to individuals' health behaviors. For Latinos, who are projected to become the largest minority group and who bear a great deal of the burden of social inequality in the U.S., it is important to understand the deeply rooted nature of these health disparities in order to close the gap in health outcomes. Mixed methodology thus holds promise for advancing research on Latino health by tackling health disparities from a variety of standpoints and approaches. The aim of this manuscript is to provide two examples of mixed-methods research, each of which addresses a health topic of considerable importance to older Latinos and their families. These two examples illustrate a) the complementary use of qualitative and quantitative methods to advance the health of older Latinos in an area that is important from a public health perspective, and b) the "translation" of findings from observational studies (informed by social science and medicine) to the development and testing of interventions.

  12. The Promise of Mixed-Methods for Advancing Latino Health Research

    PubMed Central

    Apesoa-Varano, Ester Carolina; Hinton, Ladson

    2015-01-01

    Mixed-methods research in the social sciences has been conducted for quite some time. More recently, mixed methods have become popular in health research, with the National Institutes of Health leading the impetus to fund studies that implement such an approach. The public health issues facing us today are great, ranging from policy and other macro-level issues to systems-level problems to individuals' health behaviors. For Latinos, who are projected to become the largest minority group and who bear a great deal of the burden of social inequality in the U.S., it is important to understand the deeply rooted nature of these health disparities in order to close the gap in health outcomes. Mixed methodology thus holds promise for advancing research on Latino health by tackling health disparities from a variety of standpoints and approaches. The aim of this manuscript is to provide two examples of mixed-methods research, each of which addresses a health topic of considerable importance to older Latinos and their families. These two examples illustrate a) the complementary use of qualitative and quantitative methods to advance the health of older Latinos in an area that is important from a public health perspective, and b) the “translation” of findings from observational studies (informed by social science and medicine) to the development and testing of interventions. PMID:23996325

  13. [Study on the multivariate quantitative analysis method for steel alloy elements using LIBS].

    PubMed

    Gu, Yan-hong; Li, Ying; Tian, Ye; Lu, Yuan

    2014-08-01

    Quantitative analysis of steel alloys was carried out using laser-induced breakdown spectroscopy (LIBS), taking into account the complex matrix effects in steel alloy samples. The laser-induced plasma was generated by a Q-switched Nd:YAG laser operating at 1064 nm with a pulse width of 10 ns and a repetition rate of 10 Hz. The LIBS signal was coupled into an echelle spectrometer and recorded by a highly sensitive ICCD detector. To establish the best experimental conditions, several parameters, such as the detection delay, the ICCD integration gate width and the detection position relative to the sample surface, were optimized. The experimental results showed that the optimum detection delay was 1.5 μs, the optimal integration gate width was 2 μs and the best detection position was 1.5 mm below the alloy sample's surface. The samples used in the experiments were ten standard steel alloy samples and two unknown steel alloy samples. The quantitative analysis was investigated with the optimized experimental parameters, with the elements Cr and Ni in the steel alloy samples taken as detection targets. The analysis was carried out with methods based on conventional univariate quantitative analysis, multiple linear regression and partial least squares (PLS), respectively. It turned out that the correlation coefficients of the calibration curves were not very high with the conventional univariate calibration method, and the analysis results for the two predicted samples had unsatisfactory relative errors. The conventional univariate quantitative analysis method therefore cannot effectively serve quantitative analysis of multi-component, complex-matrix steel alloy samples. With the multiple linear regression method, the analysis accuracy was improved effectively. The method based on partial least squares (PLS) turned out to be the best of the three quantitative analysis methods applied. Based on PLS, the correlation coefficient of the calibration curve for Cr is 0

  14. Advances in direct and diffraction methods for surface structural determination

    NASA Astrophysics Data System (ADS)

    Tong, S. Y.

    1999-08-01

    I describe recent advances in low-energy electron diffraction (LEED) holography and photoelectron diffraction (PD) holography. These are direct methods for determining surface structure. I show that for LEED and PD spectra taken on an energy and angular mesh, the relative phase between the reference wave and the scattered wave has a known geometric form if the spectra are always taken from within a small angular cone in the near-backscattering direction. By using data in the backscattering small cone at each direction of interest, a simple algorithm is developed to invert the spectra and extract object atomic positions with no input of calculated dynamical factors. I also describe the use of a convergent iterative method for PD and LEED. The computation time of this method scales as N², where N is the dimension of the propagator matrix, rather than N³ as in conventional Gaussian substitutional methods. Both the Rehr-Albers separable-propagator cluster approach and the slab-type non-separable approach can be cast in the new iterative form. With substantial savings in computational time and no loss in numerical accuracy, this method is very useful in applications of multiple scattering theory, particularly for systems involving either very large unit cells (>300 atoms) or where no long-range order is present.

  15. Quantitation of TIMP-1 in plasma of healthy blood donors and patients with advanced cancer

    PubMed Central

    Holten-Andersen, M N; Murphy, G; Nielsen, H J; Pedersen, A N; Christensen, I J; Høyer-Hansen, G; Brünner, N; Stephens, R W

    1999-01-01

    A kinetic enzyme-linked immunosorbent assay (ELISA) for plasma tissue inhibitor of metalloproteinase (TIMP)-1 was developed in order to examine the potential diagnostic and prognostic value of TIMP-1 measurements in cancer patients. The ELISA enabled specific detection of total TIMP-1 in EDTA, citrate and heparin plasma. The assay was rigorously tested and requirements of sensitivity, specificity, stability and good recovery were fulfilled. TIMP-1 levels measured in citrate plasma (mean 69.2 ± 13.1 μg l⁻¹) correlated with TIMP-1 measured in EDTA plasma (mean 73.5 ± 14.2 μg l⁻¹) from the same individuals in a set of 100 healthy blood donors (Spearman's rho = 0.62, P < 0.0001). The mean level of TIMP-1 in EDTA plasma from 143 patients with Dukes' stage D colorectal cancer was 240 ± 145 μg l⁻¹ and a Mann–Whitney test demonstrated a highly significant difference between TIMP-1 levels in healthy blood donors and colorectal cancer patients (P < 0.0001). Similar findings were obtained for 19 patients with advanced breast cancer (mean 292 ± 331 μg l⁻¹). The results show that TIMP-1 is readily measured in plasma samples by ELISA and that increased levels of TIMP-1 are found in patients with advanced cancer. It is proposed that plasma measurements of TIMP-1 may have value in the management of cancer patients. © 1999 Cancer Research Campaign PMID:10408859
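
    The two statistical comparisons reported above (a Spearman correlation between plasma types and a Mann–Whitney test between donor groups) can be sketched with SciPy. The samples below are drawn from distributions matching the reported means and SDs; they are not the actual patient data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic stand-ins matching the reported means/SDs, not the patient data.
timp1_citrate = rng.normal(69.2, 13.1, 100)
timp1_edta = timp1_citrate + rng.normal(0.0, 10.0, 100)   # paired re-measurement

rho, p_corr = stats.spearmanr(timp1_citrate, timp1_edta)

# Healthy donors vs. advanced colorectal cancer patients (elevated, dispersed).
patients = rng.normal(240.0, 145.0, 143).clip(min=1.0)
u_stat, p_group = stats.mannwhitneyu(timp1_edta, patients,
                                     alternative="two-sided")

print(f"Spearman rho = {rho:.2f} (p = {p_corr:.1e})")
print(f"Mann-Whitney U p = {p_group:.1e}")
```

The rank-based Mann–Whitney test is the appropriate choice here because the patient values are strongly right-skewed, which a t-test would handle poorly.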

  16. Rapid method for protein quantitation by Bradford assay after elimination of the interference of polysorbate 80.

    PubMed

    Cheng, Yongfeng; Wei, Haiming; Sun, Rui; Tian, Zhigang; Zheng, Xiaodong

    2016-02-01

    The Bradford assay is one of the most common methods for measuring protein concentrations. However, some pharmaceutical excipients, such as detergents, interfere with the Bradford assay even at low concentrations. Protein precipitation can be used to overcome sample incompatibility with protein quantitation, but the rate of protein recovery after acetone precipitation is only about 70%. In this study, we found that sucrose not only increased the rate of protein recovery after a 1 h acetone precipitation, but also did not interfere with the Bradford assay. We therefore developed a method for rapid protein quantitation in protein drugs even when they contain interfering substances.

  17. Advances in the Surface Renewal Flux Measurement Method

    NASA Astrophysics Data System (ADS)

    Shapland, T. M.; McElrone, A.; Paw U, K. T.; Snyder, R. L.

    2011-12-01

    The measurement of ecosystem-scale energy and mass fluxes between the planetary surface and the atmosphere is crucial for understanding geophysical processes. Surface renewal is a flux measurement technique based on analyzing the turbulent coherent structures that interact with the surface. It is a less expensive technique because it does not require fast-response velocity measurements, but only a fast-response scalar measurement. It is therefore also a useful tool for the study of the global cycling of trace gases. Currently, surface renewal requires calibration against another flux measurement technique, such as eddy covariance, to account for the linear bias of its measurements. We present two advances in the surface renewal theory and methodology that bring the technique closer to becoming a fully independent flux measurement method. The first advance develops the theory of turbulent coherent structure transport associated with the different scales of coherent structures. A novel method was developed for identifying the scalar change rate within structures at different scales. Our results suggest that for canopies less than one meter in height, the second smallest coherent structure scale dominates the energy and mass flux process. Using the method for resolving the scalar exchange rate of the second smallest coherent structure scale, calibration is unnecessary for surface renewal measurements over short canopies. This study forms the foundation for analysis over more complex surfaces. The second advance is a sensor frequency response correction for measuring the sensible heat flux via surface renewal. Inexpensive fine-wire thermocouples are frequently used to record high frequency temperature data in the surface renewal technique. The sensible heat flux is used in conjunction with net radiation and ground heat flux measurements to determine the latent heat flux as the energy balance residual. The robust thermocouples commonly used in field experiments

  18. Advanced superposition methods for high speed turbopump vibration analysis

    NASA Technical Reports Server (NTRS)

    Nielson, C. E.; Campany, A. D.

    1981-01-01

    The small, high-pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high-speed vibration at an operating speed of 92,400 rpm, which approaches the design-point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage, and subsequent further analysis of the rotor-only dynamics, failed to predict the vibration characteristics found during testing. An advanced procedure for dynamics analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques to develop a complete turbopump model from which dynamic characteristics are determined. The results of the dynamic testing and analysis are presented, and methods for moving the high-speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.

  19. Integration of isothermal amplification methods in microfluidic devices: Recent advances.

    PubMed

    Giuffrida, Maria Chiara; Spoto, Giuseppe

    2017-04-15

    The integration of nucleic acid detection assays in microfluidic devices represents a highly promising approach for the development of convenient, cheap and efficient diagnostic tools for clinical, food safety and environmental monitoring applications. Such tools are expected to operate at the point of care and in resource-limited settings. The amplification of the target nucleic acid sequence represents a key step in the development of sensitive detection protocols. The integration in microfluidic devices of the most popular technology for nucleic acid amplification, the polymerase chain reaction (PCR), is significantly limited by the thermal cycling needed to amplify the target sequence. This review provides an overview of recent advances in the integration of isothermal amplification methods in microfluidic devices. Isothermal methods, which operate at constant temperature, have emerged as a promising alternative to PCR and greatly simplify the implementation of amplification methods in point-of-care diagnostic devices and devices to be used in resource-limited settings. The possibilities offered by isothermal methods for digital droplet amplification are also discussed.

  20. Advanced numerical methods in mesh generation and mesh adaptation

    SciTech Connect

    Lipnikov, Konstantine; Danilov, A; Vassilevski, Y; Agonzal, A

    2010-01-01

    Numerical solution of partial differential equations requires appropriate meshes, efficient solvers and robust and reliable error estimates. Generation of high-quality meshes for complex engineering models is a non-trivial task, made more difficult when the mesh has to be adapted to a problem solution. This article is focused on a synergistic approach to mesh generation and mesh adaptation, where the best properties of various mesh generation methods are combined to efficiently build simplicial meshes. First, the advancing front technique (AFT) is combined with incremental Delaunay triangulation (DT) to build an initial mesh. Second, the metric-based mesh adaptation (MBA) method is employed to improve the quality of the generated mesh and/or to adapt it to a problem solution. We demonstrate with numerical experiments that the combination of all three methods is required for robust meshing of complex engineering models. The key to successful mesh generation is the high quality of the triangles in the initial front. We use a black-box technique to improve surface meshes exported from an inaccessible CAD system. The initial surface mesh is refined into a shape-regular triangulation which approximates the boundary with the same accuracy as the CAD mesh. The DT method adds robustness to the AFT. The resulting mesh is topologically correct but may contain a few slivers. The MBA uses seven local operations to modify the mesh topology and improves the mesh quality significantly. The MBA method is also used to adapt the mesh to a problem solution to minimize the computational resources required for solving the problem. The MBA has a solid theoretical background. In the first two experiments, we consider the convection-diffusion and elasticity problems. We demonstrate the optimal reduction rate of the discretization error on a sequence of adaptive strongly anisotropic meshes. The key element of the MBA method is construction of a tensor metric from hierarchical edge
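
    The Delaunay step of the pipeline above can be illustrated in miniature with SciPy: this toy triangulates a random point cloud and screens for sliver-like triangles via their minimum angles. It is only a sketch of the DT idea, not the AFT/DT/MBA implementation the article describes.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(4)

# Triangulate a random 2D point cloud (stand-in for an initial front).
pts = rng.uniform(0.0, 1.0, size=(50, 2))
tri = Delaunay(pts)

def min_angle_deg(simplex):
    """Minimum interior angle of one triangle; slivers have tiny minimum angles."""
    a, b, c = pts[simplex]
    def ang(p, q, r):
        u, v = q - p, r - p
        cosx = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cosx, -1.0, 1.0)))
    return min(ang(a, b, c), ang(b, c, a), ang(c, a, b))

angles = [min_angle_deg(s) for s in tri.simplices]
print(f"{len(tri.simplices)} triangles, worst minimum angle {min(angles):.1f} deg")
```

A quality-improvement pass such as MBA would then flip edges or move nodes until the worst minimum angle rises above a chosen threshold.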

  1. Exploration of Advanced Probabilistic and Stochastic Design Methods

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    2003-01-01

    The primary objective of the three-year research effort was to explore advanced, non-deterministic aerospace system design methods that may have relevance to designers and analysts. The research pursued emerging areas in design methodology and leveraged current fundamental research in the areas of design decision-making, probabilistic modeling, and optimization. The specific focus of the three-year investigation was oriented toward methods to identify and analyze emerging aircraft technologies in a consistent and complete manner, and to explore means to make optimal decisions based on this knowledge in a probabilistic environment. The research efforts were classified into two main areas. First, Task A of the grant had the objective of conducting research into the relative merits of possible approaches that account for both multiple criteria and uncertainty in design decision-making. In the final year of research, the focus was on comparing and contrasting the three methods researched: the Joint Probabilistic Decision-Making (JPDM) technique, Physical Programming, and Dempster-Shafer (D-S) theory. The next element of the research, as contained in Task B, was focused upon exploration of the Technology Identification, Evaluation, and Selection (TIES) methodology developed at ASDL, especially with regard to identification of research needs in the baseline method through implementation exercises. The end result of Task B was documentation of the evolution of the method over time and a technology transfer to the sponsor, such that an initial capability for execution could be obtained by the sponsor. Specifically, the result of year 3 efforts was the creation of a detailed tutorial for implementing the TIES method. Within the tutorial package, templates and detailed examples were created for learning and understanding the details of each step. For both research tasks, sample files and

  2. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.
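
    A distance-based readout reduces, at its simplest, to a linear calibration between signal length and target concentration, which a user inverts by reading a ruler. The calibration points below are hypothetical, purely to illustrate the inversion.

```python
import numpy as np

# Hypothetical calibration: signal length (mm) read at known concentrations (uM).
conc_uM = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
length_mm = np.array([1.0, 6.1, 10.9, 21.2, 40.8])

# Fit length = slope * conc + intercept, then invert it for unknown samples.
slope, intercept = np.polyfit(conc_uM, length_mm, 1)

def conc_from_length(l_mm):
    """Convert a ruler readout (mm) into concentration via the calibration line."""
    return (l_mm - intercept) / slope

print(f"slope = {slope:.3f} mm/uM, intercept = {intercept:.3f} mm")
print(f"sample reading of 15.0 mm -> {conc_from_length(15.0):.1f} uM")
```

Real devices often have a nonlinear length-concentration response at the extremes, so a device-specific calibration curve would replace the straight-line fit.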

  3. Advanced Motion Compensation Methods for Intravital Optical Microscopy

    PubMed Central

    Vinegoni, Claudio; Lee, Sungon; Feruglio, Paolo Fumene; Weissleder, Ralph

    2013-01-01

    Intravital microscopy has emerged in the recent decade as an indispensable imaging modality for the study of the micro-dynamics of biological processes in live animals. Technical advancements in imaging techniques and hardware components, combined with the development of novel targeted probes and new mouse models, have enabled us to address long-standing questions in several areas of biology, such as oncology, cell biology, immunology and neuroscience. As instrument resolution has increased, physiological motion has become a major obstacle that prevents imaging live animals at resolutions comparable to those obtained in vitro. Motion compensation techniques aim at reducing this gap and can effectively increase the in vivo resolution. This paper provides a technical review of some of the latest developments in motion compensation methods, providing organ-specific solutions. PMID:24273405

  4. Computational methods of the Advanced Fluid Dynamics Model

    SciTech Connect

    Bohl, W.R.; Wilhelm, D.; Parker, F.R.; Berthier, J.; Maudlin, P.J.; Schmuck, P.; Goutagny, L.; Ichikawa, S.; Ninokata, H.; Luck, L.B.

    1987-01-01

    To more accurately treat severe accidents in fast reactors, a program has been set up to investigate new computational models and approaches. The product of this effort is a computer code, the Advanced Fluid Dynamics Model (AFDM). This paper describes some of the basic features of the numerical algorithm used in AFDM. Aspects receiving particular emphasis are the fractional-step method of time integration, the semi-implicit pressure iteration, the virtual mass inertial terms, the use of three velocity fields, higher order differencing, convection of interfacial area with source and sink terms, multicomponent diffusion processes in heat and mass transfer, the SESAME equation of state, and vectorized programming. A calculated comparison with an isothermal tetralin/ammonia experiment is performed. We conclude that significant improvements are possible in reliably calculating the progression of severe accidents with further development.

  5. Quantitative analysis of methylglyoxal, glyoxal and free advanced glycation end-products in the plasma of Wistar rats during the oral glucose tolerance test.

    PubMed

    Chen, Si Jing; Aikawa, Chiwa; Matsui, Toshiro

    2015-01-01

    The purpose of this study was to gain insight into the production behavior of free adducts of advanced glycation end-products (AGEs) in Wistar rats under acute hyperglycemic conditions. Five AGE-free adducts as well as their precursors (i.e., highly reactive carbonyl intermediates of methylglyoxal and glyoxal) in rat plasma were quantitatively determined at greater than nanomolar levels using the liquid chromatography/tandem mass spectrometry method coupled with 2,4,6-trinitrobenzene sulfonate and 2,3-diaminonaphthalene derivatization techniques. An oral glucose (2 g/kg dose) tolerance test to 10-week-old Wistar rats provided evidence that the plasma levels of diabetes-related metabolites did not change acutely within 120 min, irrespective of increasing blood glucose levels.

  6. Introduction to special section of the Journal of Family Psychology, advances in mixed methods in family psychology: integrative and applied solutions for family science.

    PubMed

    Weisner, Thomas S; Fiese, Barbara H

    2011-12-01

    Mixed methods in family psychology refer to the systematic integration of qualitative and quantitative techniques to represent family processes and settings. Over the past decade, significant advances have been made in study design, analytic strategies, and technological support (such as software) that allow for the integration of quantitative and qualitative methods and for making appropriate inferences from mixed methods. This special section of the Journal of Family Psychology illustrates how mixed methods may be used to advance knowledge in family science through identifying important cultural differences in family structure, beliefs, and practices, and revealing patterns of family relationships to generate new measurement paradigms and inform clinical practice. Guidance is offered to advance mixed methods research in family psychology through sound principles of peer review.

  7. Advancing MODFLOW Applying the Derived Vector Space Method

    NASA Astrophysics Data System (ADS)

    Herrera, G. S.; Herrera, I.; Lemus-García, M.; Hernandez-Garcia, G. D.

    2015-12-01

    The most effective domain decomposition methods (DDM) are non-overlapping DDMs. Recently a new approach, the DVS framework, based on an innovative discretization method that uses a non-overlapping system of nodes (the derived nodes), was introduced and developed by I. Herrera et al. [1, 2]. Using the DVS approach, a group of four algorithms, referred to as the 'DVS algorithms', which fulfill the DDM paradigm (i.e., the solution of global problems is obtained by resolution of local problems exclusively), has been derived. Such procedures are applicable to any boundary-value problem, or system of such equations, for which a standard discretization method is available, and software with a high degree of parallelization can then be constructed. In a parallel talk at this AGU Fall Meeting, Ismael Herrera will introduce the general DVS methodology. The application of the DVS algorithms has been demonstrated in the solution of several boundary-value problems of interest in geophysics. Numerical examples for a single equation, for the symmetric, non-symmetric and indefinite cases, were demonstrated previously [1, 2]. For these problems, DVS algorithms exhibited significantly improved numerical performance with respect to standard versions of DDM algorithms. In view of these results, our research group is in the process of applying the DVS method to a widely used simulator for the first time; here we present the advances in the application of this method to the parallelization of MODFLOW. Efficiency results for a group of tests will be presented. References: [1] I. Herrera, L. M. de la Cruz and A. Rosas-Medina, "Non-overlapping discretization methods for partial differential equations", Numer Meth Part D E (2013). [2] I. Herrera and I. Contreras, "An Innovative Tool for Effectively Applying Highly Parallelized Software to Problems of Elasticity", Geofísica Internacional, 2015 (in press).

  8. A method for comprehensive glycosite-mapping and direct quantitation of plasma glycoproteins

    PubMed Central

    Hong, Qiuting; Ruhaak, L. Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B.

    2015-01-01

    A comprehensive glycan map was constructed for the top eight abundant plasma glycoproteins using both specific and non-specific enzyme digestions followed by nano LC–Chip/QTOF mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20-min UPLC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared to quantitation methods that involve protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries. PMID:26510530
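
    The normalization step described above (glycopeptide ion abundance divided by the absolute parent-protein amount, to separate glycosylation from expression) can be sketched as follows. The protein names, glycan codes, and values are hypothetical.

```python
# Hypothetical inputs: absolute protein amounts from peptide calibration curves,
# and raw MRM ion abundances for site-specific glycopeptides.
protein_fmol = {"IgG1": 120.0, "IgA1": 45.0}
glycopeptide_counts = {
    ("IgG1", "H3N4F1"): 2.4e6,
    ("IgG1", "H4N4F1"): 1.2e6,
    ("IgA1", "H5N4"):   9.0e5,
}

# Normalize each glycopeptide to its parent protein so that a change in
# glycosylation is not confounded with a change in protein expression.
normalized = {
    (prot, glycan): counts / protein_fmol[prot]
    for (prot, glycan), counts in glycopeptide_counts.items()
}

for (prot, glycan), val in sorted(normalized.items()):
    print(f"{prot} {glycan}: {val:.3g} counts/fmol")
```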

  9. Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT

    PubMed Central

    Yoon, Hyun Jin

    2016-01-01

    We integrated visual and quantitative methods for analyzing the stability of respiration using four constructs: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline positions. Visual grading was done by inspecting the shape of each diagram and classified into two states: regular and irregular. Quantitation was done by measuring the standard deviations of the x and v coordinates of the Poincaré map (SDx, SDv), measuring the height of the fundamental peak (A1) in the Fourier spectrum, or calculating the difference between maximal upward drift and maximal downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. There were differences in the quantitative parameters (SDx, SDv, A1, and MUD-MDD) among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff values were 0.11 for SDx (AUC: 0.982, p < 0.0001), 0.062 for SDv (AUC: 0.847, p < 0.0001), 0.117 for A1 (AUC: 0.876, p < 0.0001), and 0.349 for MUD-MDD (AUC: 0.948, p < 0.0001). This is the first study to analyze multiple aspects of respiration using various mathematical constructs; it provides quantitative indices of respiratory stability and determines quantitative cutoff values for differentiating regular from irregular respiration. PMID:27872857
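
    A minimal sketch of two of the constructs above: Poincaré-map standard deviations computed from successive samples of a breathing-like trace, and an ROC AUC on an SD-derived index. The traces are synthetic, and the SD1/SD2 definitions follow the common successive-difference convention; they are illustrative stand-ins, not the paper's exact SDx/SDv quantities.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

def poincare_sd(x):
    """SDs of the successive-sample Poincare map (x_n, x_{n+1})."""
    a, b = x[:-1], x[1:]
    sd1 = np.std((b - a) / np.sqrt(2))  # short-term (beat-to-beat) variability
    sd2 = np.std((b + a) / np.sqrt(2))  # long-term variability
    return sd1, sd2

# Breathing-like traces: a 4 s period sinusoid with small vs. large noise.
t = np.linspace(0.0, 60.0, 1500)
regular = np.sin(2 * np.pi * t / 4) + 0.02 * rng.normal(size=t.size)
irregular = np.sin(2 * np.pi * t / 4) + 0.5 * rng.normal(size=t.size)

sd1_reg, _ = poincare_sd(regular)
sd1_irr, _ = poincare_sd(irregular)

# Toy ROC: an SD-based index separating regular (0) from irregular (1) traces.
scores = np.array([sd1_reg, sd1_irr, sd1_reg * 1.1, sd1_irr * 0.9])
labels = np.array([0, 1, 0, 1])
auc = roc_auc_score(labels, scores)
print(f"SD1 regular={sd1_reg:.3f}, irregular={sd1_irr:.3f}, AUC={auc:.2f}")
```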

  10. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells

    PubMed Central

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrone in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R² > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrone in cells of the breast cancer cell line MCF-7 (34 fmol/10⁶ cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrone, we successfully monitored changes in the metabolic expression level of estrone (17.7 fmol/10⁶ letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422
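
    Internal-standard quantitation, as used above with deuterated estrone, reduces to a peak-area ratio against a known spike. The peak areas, spike amount, and response factor below are hypothetical, purely to show the arithmetic.

```python
def quantify_with_internal_standard(area_analyte, area_is, amount_is_fmol,
                                    response_factor=1.0):
    """Amount of analyte given peak areas and a known internal-standard spike.

    response_factor corrects for any analyte/IS ionization difference;
    1.0 assumes a co-ionizing isotopologue such as a deuterated analogue.
    """
    return (area_analyte / area_is) * amount_is_fmol / response_factor

# Hypothetical run: analyte peak is 68% of the 50-fmol deuterated-IS peak.
amount = quantify_with_internal_standard(3.4e5, 5.0e5, 50.0)
print(f"estimated endogenous analyte: {amount:.1f} fmol")
```

Because the isotopologue experiences the same matrix effects as the analyte, the ratio cancels shot-to-shot ionization variability, which is what makes MALDI quantitation feasible.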

  11. A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.

    PubMed

    Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B

    2015-12-04

    A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichment. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra-high-performance LC (UPLC) gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with quantitation methods that involve protein enrichment. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries.

  12. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
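
    The noise-to-slope figure of merit can be sketched for the simulation setting, where truth is available; the paper's NGS technique estimates the same linear-model parameters without any gold standard. The data and method parameters below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

def noise_to_slope_ratio(true_vals, measured):
    """NSR = sigma / slope for the linear measurement model m = a*t + b + noise.

    Here the fit uses known truth (as in a simulation study); the NGS
    technique estimates a, b, and sigma with no access to true_vals.
    """
    slope, intercept = np.polyfit(true_vals, measured, 1)
    resid = measured - (slope * true_vals + intercept)
    return np.std(resid) / slope

truth = rng.uniform(1.0, 10.0, 200)                       # e.g. activity conc.
method_a = 0.9 * truth + 0.5 + rng.normal(0, 0.3, 200)    # precise method
method_b = 1.1 * truth - 0.2 + rng.normal(0, 1.2, 200)    # noisy method

nsr_a = noise_to_slope_ratio(truth, method_a)
nsr_b = noise_to_slope_ratio(truth, method_b)
print(f"NSR A = {nsr_a:.2f}, NSR B = {nsr_b:.2f} (lower = more precise)")
```

Note that NSR ranks methods on precision only; a method with a large constant bias b can still score well, which is consistent with the metric's stated purpose above.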

  13. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Caffo, Brian; Frey, Eric C.

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
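
The noise-to-slope-ratio figure of merit described above can be sketched as a simple ranking computation. The method names and (slope, noise) values below are hypothetical stand-ins for the parameters the NGS technique would estimate from patient data, not results from the paper:

```python
# Hypothetical (slope, noise-sd) estimates for three reconstruction
# methods; the names and numbers are illustrative only.
methods = {
    "method_A": {"slope": 0.95, "noise_sd": 0.08},
    "method_B": {"slope": 0.90, "noise_sd": 0.15},
    "method_C": {"slope": 0.98, "noise_sd": 0.05},
}

def noise_to_slope_ratio(slope, noise_sd):
    """NSR figure of merit: lower means more precise."""
    return noise_sd / slope

# Rank methods from most to least precise
ranking = sorted(methods, key=lambda m: noise_to_slope_ratio(**methods[m]))
print(ranking)  # → ['method_C', 'method_A', 'method_B']
```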

  14. Integrating Quantitative and Qualitative Data in Mixed Methods Research--Challenges and Benefits

    ERIC Educational Resources Information Center

    Almalki, Sami

    2016-01-01

    This paper is concerned with investigating the integration of quantitative and qualitative data in mixed methods research and whether, in spite of its challenges, it can be of positive benefit to many investigative studies. The paper introduces the topic, defines the terms with which this subject deals and undertakes a literature review to outline…

  15. Improved GC/MS method for quantitation of n-Alkanes in plant and fecal material

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A gas chromatography-mass spectrometry (GC/MS) method for the quantitation of n-alkanes (carbon backbones ranging from 21 to 36 carbon atoms) in forage and fecal samples has been developed. Automated solid-liquid extraction using elevated temperature and pressure minimized extraction time to 30 min...

  16. Paradigms Lost and Pragmatism Regained: Methodological Implications of Combining Qualitative and Quantitative Methods

    ERIC Educational Resources Information Center

    Morgan, David L.

    2007-01-01

    This article examines several methodological issues associated with combining qualitative and quantitative methods by comparing the increasing interest in this topic with the earlier renewal of interest in qualitative research during the 1980s. The first section argues for the value of Kuhn's concept of paradigm shifts as a tool for examining…

  17. Method of fault-tree quantitative analysis for solid rocket motor

    NASA Astrophysics Data System (ADS)

    Hu, Baochao; Yang, Yicai; Xie, Weimin

    1993-08-01

    Based on the existing problem in determining the failure probabilities of base events in solid rocket motor fault-tree quantitative analysis, an engineering method of 'Solicitation Opinions to Give Marks' was put forward to determine the failure probability. A satisfactory result was obtained by analyzing the practical example of structure reliability for some solid rocket motors at the test sample stage.

  18. Method of fault-tree quantitative analysis for solid rocket motor

    NASA Astrophysics Data System (ADS)

    Hu, Baochao; Yang, Yicai; Xie, Weimin

    1993-08-01

    Based on the existing problem of determining the failure probabilities of base events in solid rocket motor fault tree quantitative analysis, an engineering method of Solicitation Opinions to Give Marks is put forward to determine failure probability. A more satisfactory result is obtained by analyzing the actual example of the structural reliability of solid rocket motors at the test sample stage.

  19. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  20. A GC-FID method for quantitative analysis of N,N-carbonyldiimidazole.

    PubMed

    Lee, Claire; Mangion, Ian

    2016-03-20

    N,N-Carbonyldiimidazole (CDI), a common synthetic reagent used in commercial scale pharmaceutical synthesis, is known to be sensitive to hydrolysis from ambient moisture. This liability demands a simple, robust analytical method to quantitatively determine reagent quality to ensure reproducible performance in chemical reactions. This work describes a protocol for a rapid GC-FID based analysis of CDI.

  1. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  2. A method for the rapid qualitative and quantitative analysis of 4,4-dimethyl sterols.

    PubMed

    Gibbons, G F; Mitropoulos, K A; Ramananda, K

    1973-09-01

    A simple and relatively rapid technique has been developed for the separation of several 4,4-dimethyl steryl acetates, some of which contain sterically hindered nuclear double bonds. The method involves thin-layer chromatography on silver nitrate-impregnated silica gel and silver nitrate-impregnated alumina. The separated steryl acetates may then be analyzed quantitatively by gas-liquid chromatography.

  3. Potential Guidelines for Conducting and Reporting Environmental Education Research: Quantitative Methods of Inquiry.

    ERIC Educational Resources Information Center

    Smith-Sebasto, N. J.

    2001-01-01

    Presents potential guidelines for conducting and reporting environmental education research using quantitative methods of inquiry that were developed during a 10-hour (1-1/2 day) workshop sponsored by the North American Commission on Environmental Education Research during the 1998 annual meeting of the North American Association for Environmental…

  4. Implementation of a quantitative Foucault knife-edge method by means of isophotometry

    NASA Astrophysics Data System (ADS)

    Zhevlakov, A. P.; Zatsepina, M. E.; Kirillovskii, V. K.

    2014-06-01

    Detailed description of stages of computer processing of the shadowgrams during implementation of a modern quantitative Foucault knife-edge method is presented. The map of wave-front aberrations introduced by errors of an optical surface or a system, along with the results of calculation of the set of required characteristics of image quality, are shown.

  5. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  6. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  7. Complementarity as a Program Evaluation Strategy: A Focus on Qualitative and Quantitative Methods.

    ERIC Educational Resources Information Center

    Lafleur, Clay

    Use of complementarity as a deliberate and necessary program evaluation strategy is discussed. Quantitative and qualitative approaches are viewed as complementary and can be integrated into a single study. The synergy that results from using complementary methods in a single study seems to enhance understanding and interpretation. A review of the…

  8. Qualitative and Quantitative Research Methods: Old Wine in New Bottles? On Understanding and Interpreting Educational Phenomena

    ERIC Educational Resources Information Center

    Smeyers, Paul

    2008-01-01

    Generally educational research is grounded in the empirical traditions of the social sciences (commonly called quantitative and qualitative methods) and is as such distinguished from other forms of scholarship such as theoretical, conceptual or methodological essays, critiques of research traditions and practices and those studies grounded in the…

  9. The Use of Quantitative Methods as an Aid to Decision Making in Educational Administration.

    ERIC Educational Resources Information Center

    Alkin, Marvin C.

    Three quantitative methods are outlined, with suggestions for application to particular problem areas of educational administration: (1) The Leontief input-output analysis, incorporating a "transaction table" for displaying relationships between economic outputs and inputs, mainly applicable to budget analysis and planning; (2) linear programming,…

  10. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  11. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  12. Advances in Time Estimation Methods for Molecular Data.

    PubMed

    Kumar, Sudhir; Hedges, S Blair

    2016-04-01

    Molecular dating has become central to placing a temporal dimension on the tree of life. Methods for estimating divergence times have been developed for over 50 years, beginning with the proposal of the molecular clock in 1962. We categorize the chronological development of these methods into four generations based on the timing of their origin. In the first generation approaches (1960s-1980s), a strict molecular clock was assumed to date divergences. In the second generation approaches (1990s), the equality of evolutionary rates between species was first tested and then a strict molecular clock applied to estimate divergence times. The third generation approaches (since ∼2000) account for differences in evolutionary rates across the tree by using a statistical model, obviating the need to assume a clock or to test the equality of evolutionary rates among species. Bayesian methods in the third generation require a specific or uniform prior on the speciation process and enable the inclusion of uncertainty in clock calibrations. The fourth generation approaches (since 2012) allow rates to vary from branch to branch, but do not need prior selection of a statistical model to describe the rate variation or the specification of a speciation model. With high accuracy, comparable to Bayesian approaches, and speeds that are orders of magnitude faster, fourth generation methods are able to produce reliable timetrees of thousands of species using genome scale data. We found that early time estimates from second generation studies are similar to those of third and fourth generation studies, indicating that methodological advances have not fundamentally altered the timetree of life, but rather have facilitated time estimation by enabling the inclusion of more species. Nonetheless, we feel an urgent need for testing the accuracy and precision of third and fourth generation methods, including their robustness to misspecification of priors in the analysis of large phylogenies and data

  13. Pressure ulcer prevention algorithm content validation: a mixed-methods, quantitative study.

    PubMed

    van Rijswijk, Lia; Beitz, Janice M

    2015-04-01

    Translating pressure ulcer prevention (PUP) evidence-based recommendations into practice remains challenging for a variety of reasons, including the perceived quality, validity, and usability of the research or the guideline itself. Following the development and face validation testing of an evidence-based PUP algorithm, additional stakeholder input and testing were needed. Using convenience sampling methods, wound care experts attending a national wound care conference and a regional wound ostomy continence nursing (WOCN) conference and/or graduates of a WOCN program were invited to participate in an Internal Review Board-approved, mixed-methods quantitative survey with qualitative components to examine algorithm content validity. After participants provided written informed consent, demographic variables were collected and participants were asked to comment on and rate the relevance and appropriateness of each of the 26 algorithm decision points/steps using standard content validation study procedures. All responses were anonymous. Descriptive summary statistics, mean relevance/appropriateness scores, and the content validity index (CVI) were calculated. Qualitative comments were transcribed and thematically analyzed. Of the 553 wound care experts invited, 79 (average age 52.9 years, SD 10.1; range 23-73) consented to participate and completed the study (a response rate of 14%). Most (67, 85%) were female, registered (49, 62%) or advanced practice (12, 15%) nurses, and had > 10 years of health care experience (88, 92%). Other health disciplines included medical doctors, physical therapists, nurse practitioners, and certified nurse specialists. Almost all had received formal wound care education (75, 95%). On a Likert-type scale of 1 (not relevant/appropriate) to 4 (very relevant and appropriate), the average score for the entire algorithm/all decision points (N = 1,912) was 3.72 with an overall CVI of 0.94 (out of 1). The only decision point/step recommendation
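
A content validity index of the kind reported above is, in essence, the proportion of expert ratings at or above 3 on the 4-point relevance scale. A minimal sketch with hypothetical ratings (not the study's data):

```python
def content_validity_index(ratings, threshold=3):
    """Proportion of ratings at or above threshold on the 4-point
    scale (1 = not relevant/appropriate, 4 = very relevant and
    appropriate)."""
    favorable = sum(1 for r in ratings if r >= threshold)
    return favorable / len(ratings)

# Hypothetical expert ratings for one algorithm decision point/step
ratings = [4, 4, 3, 4, 2, 4, 3, 4, 4, 4]
print(content_validity_index(ratings))  # → 0.9
```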

  14. Research using qualitative, quantitative or mixed methods and choice based on the research.

    PubMed

    McCusker, K; Gunaydin, S

    2015-10-01

    Research is fundamental to the advancement of medicine and critical to identifying the optimal therapies unique to particular societies. This is easily observed through the dynamics associated with pharmacology, surgical technique and the medical equipment used today versus only a few years ago. Advancements in knowledge synthesis and reporting guidelines enhance the quality, scope and applicability of results; thus, improving health science and clinical practice and advancing health policy. While advancements are critical to the progression of optimal health care, the high cost associated with these endeavors cannot be ignored. Research fundamentally needs to be evaluated to identify the most efficient methods of evaluation. The primary objective of this paper is to look at a specific research methodology when applied to the area of clinical research, especially extracorporeal circulation and its prognosis for the future.

  15. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE PAGES

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...

    2016-09-14

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  16. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    SciTech Connect

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.

    2016-09-14

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  17. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    PubMed

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival.

  18. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers.

    PubMed

    Harman-Ware, Anne E; Foster, Cliff; Happs, Renee M; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F

    2016-10-01

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  19. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A summary of a new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is described. Results of analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship of concentration and fluorescence units is provided; it may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and around 560 nm (dominated by cyanophytes characteristic of the open ocean).
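
A concentration-fluorescence relationship of the kind described above is a linear calibration. A minimal sketch with hypothetical calibration pairs (the paper's actual equation and data are not reproduced here):

```python
import numpy as np

# Hypothetical calibration pairs: fluorescence units vs. known PHE
# concentration (µg/L); values are illustrative only.
fluorescence = np.array([10.0, 25.0, 50.0, 100.0, 200.0])
concentration = np.array([0.5, 1.3, 2.6, 5.1, 10.2])

# Least-squares fit of concentration = slope * fluorescence + intercept
slope, intercept = np.polyfit(fluorescence, concentration, 1)

def quantify(sample_fluorescence):
    """Estimate PHE concentration from a sample's fluorescence reading."""
    return slope * sample_fluorescence + intercept

print(round(float(quantify(75.0)), 2))
```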

  20. Quantitative fuel motion determination with the CABRI fast neutron hodoscope; Evaluation methods and results

    SciTech Connect

    Baumung, K.; Augier, G.

    1991-12-01

    The fast neutron hodoscope installed at the CABRI reactor in Cadarache, France, is employed to provide quantitative fuel motion data during experiments in which single liquid-metal fast breeder reactor test pins are subjected to simulated accident conditions. Instrument design and performance are reviewed, the methods for the quantitative evaluation are presented, and error sources are discussed. The most important findings are the axial expansion as a function of time, phenomena related to pin failure (such as time, location, pin failure mode, and fuel mass ejected after failure), and linear fuel mass distributions with a 2-cm axial resolution. In this paper the hodoscope results of the CABRI-1 program are summarized.

  1. Characterization of Three Rice Multiparent Advanced Generation Intercross (MAGIC) Populations for Quantitative Trait Loci Identification.

    PubMed

    Meng, Lijun; Guo, Longbiao; Ponce, Kimberly; Zhao, Xiangqian; Ye, Guoyou

    2016-07-01

    Three new rice (Oryza sativa L.) multiparent advanced generation intercross (MAGIC) populations were developed using eight elite rice varieties from different breeding programs. These three populations were two recombinant inbred line (RIL) populations derived from two 4-way crosses, DC1 and DC2, and one RIL population derived from an 8-way cross. These populations were genotyped using an Illumina Infinium rice 6K SNP chip. The potential of the three MAGIC populations in identifying marker-trait associations was demonstrated using the plant height (PH) and heading date (HD) measured in 2014. A population of 248 IRRI breeding lines and a population of 323 Chinese breeding lines were also included to compare genetic diversity and linkage disequilibrium (LD) pattern. Our study discovered that (i) the 8-way population had a higher gene diversity than the DC1, DC2, and IRRI populations; (ii) all three MAGIC populations showed no clear population structure; (iii) LD decayed to r2 < 0.2 at about 2.5, 2.5, 1.25, 1.75, and 4.0 Mb for the DC1, DC2, 8-way, IRRI, and Chinese populations, respectively; and (iv) the 8-way population was more powerful than the DC1, DC2, and IRRI populations on QTL identification. The association analysis identified two and three QTL for PH and HD, respectively. Four of the five QTL had peak markers close to known genes. A novel QTL for PH was identified on chromosome 12 using the 8-way population. Therefore, our study suggests that the three new MAGIC populations are valuable resources for QTL identification.
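
LD-decay figures like those above are typically based on pairwise r2 between markers. A minimal sketch with illustrative 0/1 haplotypes (the study's SNP-chip data are not reproduced here):

```python
import numpy as np

def ld_r_squared(hap_a, hap_b):
    """Squared correlation (r2) between two biallelic loci, given
    phased 0/1 haplotype vectors across the same chromosomes."""
    a = np.asarray(hap_a, dtype=float)
    b = np.asarray(hap_b, dtype=float)
    p_a, p_b = a.mean(), b.mean()
    d = (a * b).mean() - p_a * p_b  # coefficient of linkage disequilibrium
    return d ** 2 / (p_a * (1 - p_a) * p_b * (1 - p_b))

# Illustrative haplotypes: the two loci are partially linked
a = [1, 1, 1, 0, 0, 0, 1, 0]
b = [1, 1, 0, 0, 0, 1, 1, 0]
print(ld_r_squared(a, b))  # → 0.25
```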

  2. [Contemporary methods of treatment in local advanced prostate cancer].

    PubMed

    Brzozowska, Anna; Mazurkiewicz, Maria; Starosławska, Elzbieta; Stasiewicz, Dominika; Mocarska, Agnieszka; Burdan, Franciszek

    2012-10-01

    Prostate cancer is one of the most common cancers among males, and its frequency increases with age. Thanks to the widespread use of screening based on prostate-specific antigen (PSA) testing, ultrasonography including transrectal ultrasound (TRUS), computed tomography, magnetic resonance imaging, and especially the awareness of society, the number of patients with low local stage of disease is increasing. The basic method of treatment in such cases is still the surgical removal of the prostate with the seminal vesicles, or radiotherapy. For this purpose teleradiotherapy (IMRT, VMAT) or brachytherapy (I125, Ir192, Pd103) is used. In patients with a higher risk of progression, radiotherapy may be combined with hormone therapy (total androgen blockade: an LH-RH analog and an antiandrogen). Despite numerous clinical studies, no optimal sequence of the particular methods has been selected, and no clear difference in effectiveness has been determined. The general rule of treatment in patients suffering from prostate cancer remains the individual selection of therapy depending on the patient's age, general condition, and especially the patient's preferences. In elderly patients and patients with a low risk of progression, active observation, including systematic PSA measurement, clinical rectal examination, TRUS, MRI of the pelvis, or whole-skeleton scintigraphy, may be considered.

  3. Linking multidimensional functional diversity to quantitative methods: a graphical hypothesis-evaluation framework.

    PubMed

    Boersma, Kate S; Dee, Laura E; Miller, Steve J; Bogan, Michael T; Lytle, David A; Gitelman, Alix I

    2016-03-01

    Functional trait analysis is an appealing approach to study differences among biological communities because traits determine species' responses to the environment and their impacts on ecosystem functioning. Despite a rapidly expanding quantitative literature, it remains challenging to conceptualize concurrent changes in multiple trait dimensions ("trait space") and select quantitative functional diversity methods to test hypotheses prior to analysis. To address this need, we present a widely applicable framework for visualizing ecological phenomena in trait space to guide the selection, application, and interpretation of quantitative functional diversity methods. We describe five hypotheses that represent general patterns of responses to disturbance in functional community ecology and then apply a formal decision process to determine appropriate quantitative methods to test ecological hypotheses. As a part of this process, we devise a new statistical approach to test for functional turnover among communities. Our combination of hypotheses and metrics can be applied broadly to address ecological questions across a range of systems and study designs. We illustrate the framework with a case study of disturbance in freshwater communities. This hypothesis-driven approach will increase the rigor and transparency of applied functional trait studies.
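
One common way to summarize a community's position in multidimensional trait space, as discussed above, is a dispersion metric. A minimal, unweighted sketch with hypothetical trait matrices (the paper's own metrics and decision process are not reproduced here):

```python
import numpy as np

def functional_dispersion(traits):
    """Mean Euclidean distance of species to the community centroid in
    trait space (an unweighted, FDis-style dispersion metric)."""
    t = np.asarray(traits, dtype=float)
    centroid = t.mean(axis=0)
    return float(np.linalg.norm(t - centroid, axis=1).mean())

# Hypothetical 4-species x 2-trait matrices before and after disturbance
pre_disturbance = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
post_disturbance = [[0.4, 0.4], [0.6, 0.4], [0.4, 0.6], [0.6, 0.6]]

# Trait space contracts after disturbance in this toy example
print(functional_dispersion(pre_disturbance) >
      functional_dispersion(post_disturbance))  # → True
```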

  4. Method for quantitative proteomics research by using metal element chelated tags coupled with mass spectrometry.

    PubMed

    Liu, Huiling; Zhang, Yangjun; Wang, Jinglan; Wang, Dong; Zhou, Chunxi; Cai, Yun; Qian, Xiaohong

    2006-09-15

    Mass spectrometry-based methods using stable isotopes as internal standards for quantitative proteomics have developed quickly in recent years, but the use of some stable-isotope reagents is limited by their relatively high price and synthetic difficulties. We have developed a new method for quantitative proteomics research using metal element chelated tags (MECT) coupled with mass spectrometry. The bicyclic anhydride of diethylenetriamine-N,N,N',N'',N''-pentaacetic acid (DTPA) is covalently coupled to primary amines of peptides, and the ligand is then chelated to the rare earth metals Y and Tb. The tagged peptides are mixed and analyzed by LC-ESI-MS/MS. Peptides are quantified by measuring the relative signal intensities for the Y and Tb tag pairs in MS, which permits quantitation of the original proteins generating the corresponding peptides. Each protein is then identified from the corresponding peptide sequence in its MS/MS spectrum. The MECT method was evaluated using standard proteins as a model sample. The experimental results showed that metal chelate-tagged peptide pairs coeluted successfully during reversed-phase LC analysis, and the relative quantitation results for proteins using MECT were accurate. DTPA modification of the N-terminus of peptides promoted cleaner fragmentation (only y-series ions) in mass spectrometry and improved the confidence of protein identification. The MECT strategy provides a simple, rapid, and economical alternative to current mass-tagging technologies.
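
    The quantitation step described above — comparing relative MS signal intensities for Y- and Tb-tagged peptide pairs — can be sketched in a few lines. This is a minimal illustration, not the authors' software, and the intensity values in the example are hypothetical:

```python
import statistics

def protein_ratio(peptide_pairs):
    """Relative protein abundance from MS signal intensities of
    metal-chelate-tagged peptide pairs (intensity_Y, intensity_Tb):
    take the Y/Tb ratio per peptide, then the median across peptides
    belonging to the same protein."""
    return statistics.median(y / tb for y, tb in peptide_pairs)
```

    The median is a common robust choice for rolling peptide-level ratios up to the protein level, since it tolerates an outlier peptide pair.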

  5. Advanced methods for preparation and characterization of infrared detector materials

    NASA Technical Reports Server (NTRS)

    Broerman, J. G.; Morris, B. J.; Meschter, P. J.

    1983-01-01

    Crystals were prepared by the Bridgman-Stockbarger method with a wide range of crystal growth rates and temperature gradients adequate to prevent constitutional supercooling under diffusion-limited, steady-state, growth conditions. The longitudinal compositional gradients for different growth conditions and alloy compositions were calculated and compared with experimental data to develop a quantitative model of solute redistribution during the crystal growth of the alloys. Measurements were performed to ascertain the effect of growth conditions on radial compositional gradients. The pseudobinary HgTe-CdTe constitutional phase diagram was determined by precision differential-thermal-analysis measurements and used to calculate the segregation coefficient of Cd as a function of x and interface temperature. Experiments were conducted to determine the ternary phase equilibria in selected regions of the Hg-Cd-Te constitutional phase diagram. Electron and hole mobilities as functions of temperature were analyzed to establish charge-carrier scattering probabilities. Computer algorithms specific to Hg(1-x)CdxTe were developed for calculations of the charge-carrier concentration, charge-carrier mobilities, Hall coefficient, and Fermi energy as functions of x, temperature, ionized donor and acceptor concentrations, and neutral defect concentrations.

  6. Qualitative and quantitative PCR methods for detection of three lines of genetically modified potatoes.

    PubMed

    Rho, Jae Kyun; Lee, Theresa; Jung, Soon-Il; Kim, Tae-San; Park, Yong-Hwan; Kim, Young-Mi

    2004-06-02

    Qualitative and quantitative polymerase chain reaction (PCR) methods have been developed for the detection of genetically modified (GM) potatoes. The combination of specific primers for amplification of the promoter region of the Cry3A gene, the potato leafroll virus replicase gene, and the potato virus Y coat protein gene allows identification of each of the NewLeaf, NewLeaf Y, and NewLeaf Plus GM potato lines. A multiplex PCR method was also established for simple and rapid detection of the three GM potato lines in a mixed sample. For quantitative detection, a real-time PCR method was developed. This method features the use of a standard plasmid as a reference molecule; the standard plasmid contains both a specific region of the Cry3A transgene and an endogenous UDP-glucose pyrophosphorylase gene of the potato. Test samples containing 0.5, 1, 3, and 5% GM potatoes were quantified by this method. At the 3.0% level of each GM potato line, the relative standard deviations ranged from 6.0 to 19.6%. These results show that the above PCR methods are applicable for detecting GM potatoes quantitatively as well as qualitatively.
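
    Real-time PCR quantitation of this kind rests on standard curves built from the reference plasmid: Ct values are converted to copy numbers, and GM content is reported as the transgene/endogenous copy ratio. A minimal sketch of that arithmetic — the curve parameters and Ct values below are illustrative, not from the paper:

```python
def copies_from_ct(ct, slope, intercept):
    """Invert a standard-curve fit Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def gm_percent(ct_transgene, ct_reference, curve_t, curve_r):
    """GM content (%) as the transgene/endogenous copy-number ratio x 100
    (here: Cry3A region vs. UDP-glucose pyrophosphorylase gene)."""
    return 100.0 * (copies_from_ct(ct_transgene, *curve_t)
                    / copies_from_ct(ct_reference, *curve_r))
```

    With a typical slope of about -3.32 (one Ct per doubling squared to a 10-fold dilution step), e.g. `gm_percent(33.36, 30.04, (-3.32, 40.0), (-3.32, 40.0))` evaluates to about 10%.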

  7. Development and validation of event-specific quantitative PCR method for genetically modified maize MIR604.

    PubMed

    Mano, Junichi; Furui, Satoshi; Takashima, Kaori; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2012-01-01

    A GM maize event, MIR604, has been widely distributed and an analytical method to quantify its content is required to monitor the validity of food labeling. Here we report a novel real-time PCR-based quantitation method for MIR604 maize. We developed real-time PCR assays specific for MIR604 using event-specific primers designed by the trait developer, and for maize endogenous starch synthase IIb gene (SSIIb). Then, we determined the conversion factor, which is required to calculate the weight-based GM maize content from the copy number ratio of MIR604-specific DNA to the endogenous reference DNA. Finally, to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind samples containing MIR604 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The reproducibility (RSDr) of the developed method was evaluated to be less than 25%. The limit of quantitation of the method was estimated to be 0.5% based on the ISO 24276 guideline. These results suggested that the developed method would be suitable for practical quantitative analyses of MIR604 maize.
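
    The conversion factor mentioned above is the MIR604/SSIIb copy-number ratio measured in 100% MIR604 material; dividing an observed copy ratio by it yields a weight-based GM percentage. A sketch of that conversion (the cf value in the example is hypothetical, not the validated one):

```python
def gm_weight_percent(event_copies, ssiib_copies, cf):
    """Weight-based GM content (%) from a measured copy-number ratio of
    event-specific DNA to the endogenous reference (SSIIb), using the
    conversion factor cf = copy ratio observed in pure GM seed."""
    return (event_copies / ssiib_copies) / cf * 100.0
```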

  8. Nanoparticle-mediated photothermal effect enables a new method for quantitative biochemical analysis using a thermometer

    NASA Astrophysics Data System (ADS)

    Fu, Guanglei; Sanjay, Sharma T.; Dou, Maowei; Li, Xiujun

    2016-03-01

    A new biomolecular quantitation method, nanoparticle-mediated photothermal bioassay, using a common thermometer as the signal reader was developed. Using an immunoassay as a proof of concept, iron oxide nanoparticles (NPs) captured in the sandwich-type assay system were transformed into a near-infrared (NIR) laser-driven photothermal agent, Prussian blue (PB) NPs, which acted as a photothermal probe to convert the assay signal into heat through the photothermal effect, thus allowing sensitive biomolecular quantitation using a thermometer. This is the first report of biomolecular quantitation using a thermometer and also serves as the first attempt to introduce the nanoparticle-mediated photothermal effect for bioassays.
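
    Reading out an immunoassay with a thermometer implies a calibration step: the temperature rise under NIR irradiation is fitted against standards of known concentration, and an unknown is quantified by inverting the fit. A least-squares sketch under that assumption — all numbers are illustrative, not from the study:

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def concentration(delta_t, slope, intercept):
    """Invert the calibration line to quantify an unknown sample."""
    return (delta_t - intercept) / slope

# hypothetical standards: concentration (ng/mL) vs. temperature rise (deg C)
slope, intercept = fit_line([0.0, 1.0, 2.0, 4.0], [0.5, 1.5, 2.5, 4.5])
```

    An unknown showing a 3.0 deg C rise would then read back as `concentration(3.0, slope, intercept)` ng/mL.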

  9. A Vision of Quantitative Imaging Technology for Validation of Advanced Flight Technologies

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Kerns, Robert V.; Jones, Kenneth M.; Grinstead, Jay H.; Schwartz, Richard J.; Gibson, David M.; Taylor, Jeff C.; Tack, Steve; Dantowitz, Ronald F.

    2011-01-01

    Flight-testing is traditionally an expensive but critical element in the development and ultimate validation and certification of technologies destined for future operational capabilities. Measurements obtained in relevant flight environments also provide unique opportunities to observe flow phenomena that are often beyond the capabilities of ground testing facilities and computational tools to simulate or duplicate. However, the challenges of minimizing vehicle weight and internal complexity as well as instrumentation bandwidth limitations often restrict the ability to make high-density, in-situ measurements with discrete sensors. Remote imaging offers a potential opportunity to noninvasively obtain such flight data in a complementary fashion. The NASA Hypersonic Thermodynamic Infrared Measurements Project has demonstrated such a capability to obtain calibrated thermal imagery on a hypersonic vehicle in flight. Through the application of existing and accessible technologies, the acreage surface temperature of the Shuttle lower surface was measured during reentry. Future hypersonic cruise vehicles, launcher configurations and reentry vehicles will, however, challenge current remote imaging capability. As NASA embarks on the design and deployment of a new Space Launch System architecture for access beyond earth orbit (and the commercial sector focused on low earth orbit), an opportunity exists to implement an imagery system and its supporting infrastructure that provides sufficient flexibility to incorporate changing technology to address the future needs of the flight test community. A long term vision is offered that supports the application of advanced multi-waveband sensing technology to aid in the development of future aerospace systems and critical technologies to enable highly responsive vehicle operations across the aerospace continuum, spanning launch, reusable space access and global reach. Motivations for development of an Agency level imagery

  10. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    NASA Astrophysics Data System (ADS)

    Hayashi, K.

    2014-12-01

    The near-surface is the region of day-to-day human activity on the earth, and it is exposed to natural phenomena which sometimes cause disasters. This presentation covers a broad spectrum of geotechnical and geohazard approaches to mitigating disasters and conserving the natural environment using geophysical methods, and emphasizes the contribution of geophysics to such issues. It focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than on the theoretical details of particular techniques. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, prevention of floods triggered by torrential rain, environmental conservation, and study of the effects of global warming. Among the geophysical techniques, the active and passive surface-wave, refraction and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access and databases, are also discussed. The presentation then examines the application of geophysical methods to engineering investigations from a non-uniqueness point of view and introduces the concepts of "integrated" and "quantitative" analysis. Most geophysical analyses are essentially non-unique, and it is very difficult to obtain unique and reliable engineering solutions from any single geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods: an integrated approach to geophysics. The result of a geophysical method in isolation is generally vague: "here is a high-velocity layer, it may be bedrock"; "this low-resistivity section may contain clayey soils". Such vague, qualitative and subjective interpretation is of limited worth in general engineering design work.

  11. A hybrid approach to advancing quantitative prediction of tissue distribution of basic drugs in human

    SciTech Connect

    Poulin, Patrick; Ekins, Sean; Theil, Frank-Peter

    2011-01-15

    A general toxicity of basic drugs is related to phospholipidosis in tissues. Therefore, it is essential to predict the tissue distribution of basic drugs to facilitate an initial estimate of that toxicity. The objective of the present study was to further assess the original prediction method that consisted of using the binding to red blood cells measured in vitro for the unbound drug (RBCu) as a surrogate for tissue distribution, by correlating it to unbound tissue:plasma partition coefficients (Kpu) of several tissues, and finally to predict volume of distribution at steady-state (Vss) in humans under in vivo conditions. This correlation method demonstrated inaccurate predictions of Vss for particular basic drugs that did not follow the original correlation principle. Therefore, the novelty of this study is to provide clarity on the actual hypotheses to identify i) the impact of pharmacological mode of action on the generic correlation of RBCu-Kpu, ii) additional mechanisms of tissue distribution for the outlier drugs, iii) molecular features and properties that differentiate compounds as outliers in the original correlation analysis in order to facilitate its applicability domain alongside the properties already used so far, and finally iv) to present a novel and refined correlation method that is superior to what has been previously published for the prediction of human Vss of basic drugs. Applying a refined correlation method after identifying outliers would facilitate the prediction of more accurate distribution parameters as key inputs used in physiologically based pharmacokinetic (PBPK) and phospholipidosis models.
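
    Once Kpu values are in hand, the conventional route to Vss is the simplified tissue-composition summation Vss = Vp + sum_t (fup * Kpu_t * Vt), since the total partition coefficient is Kp_t = fup * Kpu_t. A sketch under that standard equation — the tissue names, volumes, and parameter values below are hypothetical, not data from this study:

```python
def predict_vss(fup, kpu, volumes, plasma_volume_l=3.0):
    """Simplified steady-state volume of distribution (L):
    Vss = Vp + sum over tissues of fup * Kpu_t * Vt,
    where fup is the unbound fraction in plasma, kpu maps tissue name
    to the unbound tissue:plasma partition coefficient, and volumes
    maps tissue name to its volume in litres."""
    return plasma_volume_l + sum(fup * kpu[t] * volumes[t] for t in volumes)
```

    For example, a drug with fup = 0.1 and a single lumped "tissue" of 30 L with Kpu = 100 would give Vss = 3 + 0.1 * 100 * 30 = 303 L.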

  12. Quantitative evaluation of registration methods for atlas-based diffuse optical tomography

    NASA Astrophysics Data System (ADS)

    Wu, Xue; Eggebrecht, Adam T.; Culver, Joseph P.; Zhan, Yuxuan; Basevi, Hector; Dehghani, Hamid

    2013-06-01

    In Diffuse Optical Tomography (DOT), an atlas-based model can be used as an alternative to a subject-specific anatomical model for recovery of brain activity. The main step in generating an atlas-based subject model is registration of the atlas model to the subject's head; the accuracy of the DOT reconstruction then relies on the accuracy of the registration method. In this work, 11 registration methods are quantitatively evaluated. The method based on the EEG 10/20 system with 19 landmarks and a non-iterative point-to-point algorithm yields approximately 1.4 mm surface error and is considered the most efficient registration method.
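
    A surface error like the ~1.4 mm figure above is typically computed as the mean nearest-neighbour distance from the registered atlas points to the subject's head-surface point cloud. A brute-force NumPy sketch (the point sets in the example are illustrative):

```python
import numpy as np

def mean_surface_error(registered, target):
    """Mean nearest-neighbour distance (mm) from each registered atlas
    point to the target surface point cloud; both arrays are (N, 3)."""
    d = np.linalg.norm(registered[:, None, :] - target[None, :, :], axis=2)
    return float(d.min(axis=1).mean())
```

    For large surface meshes a KD-tree lookup would replace the O(N*M) distance matrix, but the metric is the same.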

  13. Quantitative interferometric microscopy with two dimensional Hilbert transform based phase retrieval method

    NASA Astrophysics Data System (ADS)

    Wang, Shouyu; Yan, Keding; Xue, Liang

    2017-01-01

    In order to obtain high-contrast images and detailed descriptions of label-free samples, quantitative interferometric microscopy combined with phase retrieval is designed to obtain sample phase distributions from fringes. Since the accuracy and efficiency of the recovered phases depend on the phase retrieval method, approaches with higher precision and faster processing speed are still in demand. Here, a two-dimensional Hilbert transform based phase retrieval method is adopted for cellular phase imaging. It not only preserves more sample detail than the classical fast Fourier transform based method, but also overcomes the phase ambiguities of the traditional algorithm, in which the Hilbert transform is a one-dimensional operation. Both simulations and experiments are provided, proving that the proposed phase retrieval approach can acquire quantitative sample phases with high accuracy and fast speed.
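
    The classical Fourier-transform baseline that the abstract compares against can be sketched directly: isolate the +1 fringe order in the 2D spectrum, shift it to DC, inverse-transform, and take the angle. This is the standard FFT method, not the authors' 2D Hilbert variant, and the carrier frequency, mask width, and simulated phase are illustrative assumptions:

```python
import numpy as np

def fft_phase_retrieval(fringe, carrier):
    """Recover phase from an off-axis interferogram: band-pass the +1
    order around the carrier (in pixels of the shifted spectrum),
    move it to DC, inverse-transform, and take the angle."""
    F = np.fft.fftshift(np.fft.fft2(fringe))
    n = fringe.shape[0]
    c, w = n // 2, carrier // 2
    mask = np.zeros_like(F)
    mask[c - w:c + w, c + carrier - w:c + carrier + w] = 1.0  # +1 order
    side = np.roll(F * mask, -carrier, axis=1)                # carrier -> DC
    return np.angle(np.fft.ifft2(np.fft.ifftshift(side)))

# simulated off-axis interferogram with a known smooth phase bump
n, fc = 128, 16
y, x = np.mgrid[0:n, 0:n]
phi = np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / (2 * 20.0 ** 2))  # <= 1 rad
fringe = 1.0 + np.cos(2 * np.pi * fc * x / n + phi)
recovered = fft_phase_retrieval(fringe, fc)
```

    Because the simulated phase stays well below pi, no unwrapping is needed; the recovered map matches the ground truth to within the mask-truncation error.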

  14. Qualification of HSQC methods for quantitative composition of heparin and low molecular weight heparins.

    PubMed

    Mauri, Lucio; Boccardi, Giovanni; Torri, Giangiacomo; Karfunkle, Michael; Macchi, Eleonora; Muzi, Laura; Keire, David; Guerrini, Marco

    2017-03-20

    An NMR HSQC method has recently been proposed for the quantitative determination of the mono- and disaccharide subunits of heparin and low molecular weight heparins (LMWH). The focus of the current study was the validation of this procedure to make the 2D-NMR method suitable for pharmaceutical quality control applications. Pre-validation work investigated the effects of several experimental parameters to assess robustness and to optimize critical factors. Important experimental parameters were pulse sequence selection, equilibration interval between pulse trains and temperature. These observations were needed so that the NMR method was sufficiently understood to enable continuous improvement. A standard validation study on heparin then examined linearity, repeatability, intermediate precision and limits of detection and quantitation; selected validation parameters were also determined for LMWH.

  15. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    NASA Astrophysics Data System (ADS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-01

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability over large areas; however, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and will be presented.

  16. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.

  17. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

    By combining with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither method alone can provide elastography over the full skin depth in current systems. This paper presents a feasibility study of an optical coherence elastography method which combines both vibration and SAW in order to give the quantitative mechanical properties of skin tissue over the full depth range, including epidermis, dermis and subcutaneous fat. Experiments are carried out on layered tissue-mimicking phantoms and in vivo on human forearm and palm skin. A ring actuator generates vibration while a line actuator is used to excite SAWs. A PhS-OCT system is employed to provide ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that, by combining the vibration and SAW methods, the bulk mechanical properties of full skin can be quantitatively measured, and elastography can be obtained with a sensing depth from ~0 mm to ~4 mm. This method is promising for clinical applications where quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.
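
    To turn a measured surface wave speed into a quantitative stiffness, SAW elastography commonly uses the Rayleigh-wave approximation for a soft, nearly incompressible solid, E = 2*rho*(1+nu)^3 * v_R^2 / (0.87 + 1.12*nu)^2. A sketch under that assumption — the density and Poisson's ratio defaults are typical soft-tissue values, not parameters taken from this paper:

```python
def young_modulus_from_saw(v_r, density=1000.0, poisson=0.49):
    """Young's modulus (Pa) from Rayleigh/surface wave speed v_r (m/s),
    using the standard Rayleigh-wave approximation for a soft solid."""
    return (2.0 * density * (1.0 + poisson) ** 3 * v_r ** 2
            / (0.87 + 1.12 * poisson) ** 2)
```

    A SAW speed of a few m/s thus maps to a modulus of tens of kPa, the expected range for skin and subcutaneous fat.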

  18. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    SciTech Connect

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability over large areas; however, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and will be presented.

  19. Advanced Signal Processing Methods Applied to Digital Mammography

    NASA Technical Reports Server (NTRS)

    Stauduhar, Richard P.

    1997-01-01

    The work reported here is an extension of the earlier proposal of the same title, August 1994-June 1996; the report for that work is also being submitted. The work reported there forms the foundation for this work from January 1997 to September 1997. After the earlier work was completed, there were a few items that needed to be finished prior to submission of a new and more comprehensive proposal for further research. Those tasks have been completed and two new proposals have been submitted, one to NASA and one to Health & Human Services (HHS). The main purpose of this extension was to refine some of the techniques that lead to automatic large-scale evaluation of full mammograms. Progress on each of the proposed tasks follows. Task 1: A multiresolution segmentation of background from breast has been developed and tested. The method is based on the different noise characteristics of the two fields: the breast field has more power in the lower octaves, while the off-breast field behaves like a wideband process, with more power in the high-frequency octaves. After the two fields are separated by lowpass filtering, a region-labeling routine is used to find the largest contiguous region, the breast. Task 2: A wavelet expansion that can decompose the image without zero padding has been developed. The method preserves all properties of the power-of-two wavelet transform and does not add appreciably to computation time or storage. This work is essential for analysis of the full mammogram, as opposed to selecting sections from the full mammogram. Task 3: A clustering method has been developed based on a simple counting mechanism. No ROC analysis has been performed (and was not proposed), so we cannot finally evaluate this work without further support. Task 4: Further testing of the filter reveals that different wavelet bases do yield slightly different qualitative results. We cannot provide quantitative conclusions about this for all possible bases.
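
    The Task 1 pipeline — low-pass filter, threshold, keep the largest contiguous region — can be approximated without the wavelet machinery. In the sketch below a frequency-domain Gaussian stands in for the multiresolution decomposition, and thresholding at the smoothed image's mean is an assumption made for illustration:

```python
import numpy as np
from collections import deque

def lowpass(img, sigma):
    """Gaussian low-pass filter applied in the frequency domain."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    g = np.exp(-2.0 * (np.pi * sigma) ** 2 * (fx ** 2 + fy ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * g))

def largest_region(mask):
    """Largest 4-connected True region of a boolean mask (BFS labeling)."""
    h, w = mask.shape
    seen, best = np.zeros_like(mask), []
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                comp, q = [], deque([(i, j)])
                seen[i, j] = True
                while q:
                    r, s = q.popleft()
                    comp.append((r, s))
                    for dr, ds in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, ns = r + dr, s + ds
                        if 0 <= nr < h and 0 <= ns < w \
                                and mask[nr, ns] and not seen[nr, ns]:
                            seen[nr, ns] = True
                            q.append((nr, ns))
                if len(comp) > len(best):
                    best = comp
    out = np.zeros_like(mask)
    for r, s in best:
        out[r, s] = True
    return out

def segment_breast(image, sigma=8.0):
    """Smooth, threshold at the smoothed mean, keep the largest region."""
    smooth = lowpass(image.astype(float), sigma)
    return largest_region(smooth > smooth.mean())
```

    On a synthetic image with one large bright field and a small bright speck, the smoothing suppresses the speck and the labeling keeps only the large region, mirroring the breast/background separation described above.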

  20. Quantitative proteomics: assessing the spectrum of in-gel protein detection methods

    PubMed Central

    Gauci, Victoria J.; Wright, Elise P.

    2010-01-01

    Proteomics research relies heavily on visualization methods for detection of proteins separated by polyacrylamide gel electrophoresis. Commonly used staining approaches involve colorimetric dyes such as Coomassie Brilliant Blue, fluorescent dyes including Sypro Ruby, newly developed reactive fluorophores, as well as a plethora of others. The most desired characteristic in selecting one stain over another is sensitivity, but this is far from the only important parameter. This review evaluates protein detection methods in terms of their quantitative attributes, including limit of detection (i.e., sensitivity), linear dynamic range, inter-protein variability, capacity for spot detection after 2D gel electrophoresis, and compatibility with subsequent mass spectrometric analyses. Unfortunately, many of these quantitative criteria are not routinely or consistently addressed by most of the studies published to date. We would urge more rigorous routine characterization of stains and detection methodologies as a critical approach to systematically improving these critically important tools for quantitative proteomics. In addition, substantial improvements in detection technology, particularly over the last decade or so, emphasize the need to consider renewed characterization of existing stains; the quantitative stains we need, or at least the chemistries required for their future development, may well already exist. PMID:21686332
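
    Of the quantitative attributes listed, the limit of detection has a standard working formula: LOD = 3.3 * SD(blank) / calibration slope (the ICH-style estimate from blank replicates and the calibration curve). A sketch with made-up numbers:

```python
import statistics

def limit_of_detection(blank_signals, calibration_slope):
    """ICH-style LOD estimate: 3.3 * sample SD of blank replicates
    divided by the slope of the signal-vs-amount calibration line."""
    return 3.3 * statistics.stdev(blank_signals) / calibration_slope
```

    The analogous limit of quantitation replaces 3.3 with 10, and the linear dynamic range is read off as the span over which the calibration stays linear.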

  1. Underwater Photosynthesis of Submerged Plants – Recent Advances and Methods

    PubMed Central

    Pedersen, Ole; Colmer, Timothy D.; Sand-Jensen, Kaj

    2013-01-01

    We describe the general background and the recent advances in research on underwater photosynthesis of leaf segments, whole communities, and plant-dominated aquatic ecosystems and present contemporary methods tailor-made to quantify photosynthesis and carbon fixation under water. The majority of studies of aquatic photosynthesis have been carried out with detached leaves or thalli and this selectiveness influences the perception of the regulation of aquatic photosynthesis. We thus recommend assessing the influence of inorganic carbon and temperature on natural aquatic communities of variable density in addition to studying detached leaves in the scenarios of rising CO2 and temperature. Moreover, a growing number of researchers are interested in the tolerance of terrestrial plants during flooding, as torrential rains sometimes result in overland floods that inundate terrestrial plants. We propose to undertake studies to elucidate the importance of leaf acclimation of terrestrial plants to facilitate gas exchange and light utilization under water, as these acclimations influence underwater photosynthesis as well as internal aeration of plant tissues during submergence. PMID:23734154

  2. Advances in the analysis of iminocyclitols: Methods, sources and bioavailability.

    PubMed

    Amézqueta, Susana; Torres, Josep Lluís

    2016-05-01

    Iminocyclitols are chemically and metabolically stable, naturally occurring sugar mimetics. Their biological activities make them interesting and extremely promising as both drug leads and functional food ingredients. The first iminocyclitols were discovered using preparative isolation and purification methods followed by chemical characterization using nuclear magnetic resonance spectroscopy. In addition to this classical approach, gas and liquid chromatography coupled to mass spectrometry are increasingly used; they are highly sensitive techniques capable of detecting minute amounts of analytes in a broad spectrum of sources after only minimal sample preparation. These techniques have been applied to identify new iminocyclitols in plants, microorganisms and synthetic mixtures. The separation of iminocyclitol mixtures by chromatography is particularly difficult however, as the most commonly used matrices have very low selectivity for these highly hydrophilic structurally similar molecules. This review critically summarizes recent advances in the analysis of iminocyclitols from plant sources and findings regarding their quantification in dietary supplements and foodstuffs, as well as in biological fluids and organs, from bioavailability studies.

  3. Regenerative medicine: advances in new methods and technologies.

    PubMed

    Park, Dong-Hyuk; Eve, David J

    2009-11-01

    The articles published in the journal Cell Transplantation - The Regenerative Medicine Journal over the last two years reveal recent and future cutting-edge research in the fields of regenerative and transplantation medicine. 437 articles were published from 2007 to 2008, a 17% increase compared to the 373 articles in 2006-2007. Neuroscience was still the most common section in both the number of articles and the percentage of all manuscripts published. The increasing interest and rapid advances in bioengineering technology are highlighted by tissue engineering and bioartificial organs being ranked second again. For a similar reason, the methods and new technologies section increased significantly compared to the previous period. Articles focusing on the transplantation of stem cell lineages encompassed almost 20% of all articles published. By contrast, the non-stem-cell transplantation group, made up primarily of islet cells, followed by biomaterials and fetal neural tissue, comprised less than 15%. Transplantation of cells pre-treated with drugs or gene transfection to prolong graft survival or promote differentiation into the needed phenotype was prevalent among the transplantation articles regardless of the kind of cells used. Meanwhile, the majority of non-transplantation-based articles were related to new devices for various purposes, characterization of unknown cells, medicines, cell preparation and/or optimization for transplantation (e.g., isolation and culture), and disease pathology.

  4. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era

    SciTech Connect

    Chiu, Weihsueh A.; Euling, Susan Y.; Scott, Cheryl Siegel; Subramaniam, Ravi P.

    2013-09-15

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA) — i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on “augmentation” of weight of evidence — using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards “integration” of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for “expansion” of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual “reorientation” of QRA towards approaches that more directly link environmental exposures to human outcomes.

  5. Quantitative assessments of traumatic axonal injury in human brain: concordance of microdialysis and advanced MRI

    PubMed Central

    Magnoni, Sandra; Mac Donald, Christine L.; Esparza, Thomas J.; Conte, Valeria; Sorrell, James; Macrì, Mario; Bertani, Giulio; Biffi, Riccardo; Costa, Antonella; Sammons, Brian; Snyder, Abraham Z.; Shimony, Joshua S.; Triulzi, Fabio; Stocchetti, Nino

    2015-01-01

    Axonal injury is a major contributor to adverse outcomes following brain trauma. However, the extent of axonal injury cannot currently be assessed reliably in living humans. Here, we used two experimental methods with distinct noise sources and limitations in the same cohort of 15 patients with severe traumatic brain injury to assess axonal injury. One hundred kilodalton cut-off microdialysis catheters were implanted at a median time of 17 h (13–29 h) after injury in normal appearing (on computed tomography scan) frontal white matter in all patients, and samples were collected for at least 72 h. Multiple analytes, such as the metabolic markers glucose, lactate, pyruvate, glutamate and tau and amyloid-β proteins, were measured every 1–2 h in the microdialysis samples. Diffusion tensor magnetic resonance imaging scans at 3 T were performed 2–9 weeks after injury in 11 patients. Stability of diffusion tensor imaging findings was verified by repeat scans 1–3 years later in seven patients. An additional four patients were scanned only at 1–3 years after injury. Imaging abnormalities were assessed based on comparisons with five healthy control subjects for each patient, matched by age and sex (32 controls in total). No safety concerns arose during either microdialysis or scanning. We found that acute microdialysis measurements of the axonal cytoskeletal protein tau in the brain extracellular space correlated well with diffusion tensor magnetic resonance imaging-based measurements of reduced brain white matter integrity in the 1-cm radius white matter-masked region near the microdialysis catheter insertion sites. Specifically, we found a significant inverse correlation between microdialysis measured levels of tau 13–36 h after injury and anisotropy reductions in comparison with healthy controls (Spearman’s r = −0.64, P = 0.006). Anisotropy reductions near microdialysis catheter insertion sites were highly correlated with reductions in multiple additional
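
    The inverse relationship reported here (Spearman's r = −0.64) is a rank correlation, which can be computed directly from paired measurements. A minimal sketch in Python, assuming no tied values; the numbers below are illustrative, not the study's data:

```python
def spearman_rho(x, y):
    """Spearman rank correlation for paired samples without ties.

    Uses rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), where d_i is the
    difference between the ranks of x_i and y_i.
    """
    n = len(x)
    rank = lambda v: {val: i + 1 for i, val in enumerate(sorted(v))}
    rx, ry = rank(x), rank(y)
    d2 = sum((rx[a] - ry[b]) ** 2 for a, b in zip(x, y))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical values: higher extracellular tau paired with lower anisotropy
tau = [210, 450, 320, 610, 150]              # pg/mL, illustrative only
anisotropy = [0.42, 0.31, 0.38, 0.25, 0.47]
print(spearman_rho(tau, anisotropy))         # perfectly inverse ranks -> -1.0
```

    With real, noisy data the ranks are only partially inverted and rho lands between 0 and −1, as in the reported r = −0.64.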

  6. Quantitative assessments of traumatic axonal injury in human brain: concordance of microdialysis and advanced MRI.

    PubMed

    Magnoni, Sandra; Mac Donald, Christine L; Esparza, Thomas J; Conte, Valeria; Sorrell, James; Macrì, Mario; Bertani, Giulio; Biffi, Riccardo; Costa, Antonella; Sammons, Brian; Snyder, Abraham Z; Shimony, Joshua S; Triulzi, Fabio; Stocchetti, Nino; Brody, David L

    2015-08-01

    Axonal injury is a major contributor to adverse outcomes following brain trauma. However, the extent of axonal injury cannot currently be assessed reliably in living humans. Here, we used two experimental methods with distinct noise sources and limitations in the same cohort of 15 patients with severe traumatic brain injury to assess axonal injury. One hundred kilodalton cut-off microdialysis catheters were implanted at a median time of 17 h (13-29 h) after injury in normal appearing (on computed tomography scan) frontal white matter in all patients, and samples were collected for at least 72 h. Multiple analytes, such as the metabolic markers glucose, lactate, pyruvate, glutamate and tau and amyloid-β proteins, were measured every 1-2 h in the microdialysis samples. Diffusion tensor magnetic resonance imaging scans at 3 T were performed 2-9 weeks after injury in 11 patients. Stability of diffusion tensor imaging findings was verified by repeat scans 1-3 years later in seven patients. An additional four patients were scanned only at 1-3 years after injury. Imaging abnormalities were assessed based on comparisons with five healthy control subjects for each patient, matched by age and sex (32 controls in total). No safety concerns arose during either microdialysis or scanning. We found that acute microdialysis measurements of the axonal cytoskeletal protein tau in the brain extracellular space correlated well with diffusion tensor magnetic resonance imaging-based measurements of reduced brain white matter integrity in the 1-cm radius white matter-masked region near the microdialysis catheter insertion sites. Specifically, we found a significant inverse correlation between microdialysis measured levels of tau 13-36 h after injury and anisotropy reductions in comparison with healthy controls (Spearman's r = -0.64, P = 0.006). Anisotropy reductions near microdialysis catheter insertion sites were highly correlated with reductions in multiple additional white matter

  7. A method for operative quantitative interpretation of multispectral images of biological tissues

    NASA Astrophysics Data System (ADS)

    Lisenko, S. A.; Kugeiko, M. M.

    2013-10-01

    A method has been developed for the operative (real-time) retrieval of the spatial distributions of biophysical parameters of a biological tissue from a multispectral image of it. The method is based on multiple regressions between linearly independent components of the diffuse reflection spectrum of the tissue and the unknown parameters. The possibilities of the method are illustrated by the example of determining biophysical parameters of the skin (concentrations of melanin, hemoglobin and bilirubin, blood oxygenation, and the scattering coefficient of the tissue). Examples of quantitative interpretation of experimental data are presented.
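
    The regression step described above can be sketched as an ordinary least-squares fit of a tissue parameter against spectral components. A minimal illustration with two hypothetical components `f1`, `f2` and a synthetic parameter; the real method uses regressions calibrated against simulated reflectance spectra:

```python
def fit_ols(features, y):
    """Ordinary least squares via the normal equations (A^T A) c = A^T y.

    features: list of predictor rows; an intercept column is prepended.
    Returns the coefficient vector [c0, c1, ...].
    """
    A = [[1.0] + list(row) for row in features]
    n, m = len(A), len(A[0])
    # Build the normal equations
    ata = [[sum(A[i][r] * A[i][c] for i in range(n)) for c in range(m)]
           for r in range(m)]
    aty = [sum(A[i][r] * y[i] for i in range(n)) for r in range(m)]
    # Gaussian elimination with partial pivoting
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, m):
            factor = ata[r][col] / ata[col][col]
            for c in range(col, m):
                ata[r][c] -= factor * ata[col][c]
            aty[r] -= factor * aty[col]
    # Back substitution
    coef = [0.0] * m
    for r in range(m - 1, -1, -1):
        coef[r] = (aty[r] - sum(ata[r][c] * coef[c]
                                for c in range(r + 1, m))) / ata[r][r]
    return coef

# Synthetic example: parameter = 2 + 3*f1 - 1*f2 (hypothetical components)
f = [[0, 1], [1, 0], [2, 2], [3, 1], [4, 3]]
y = [2 + 3 * a - b for a, b in f]
print([round(c, 6) for c in fit_ols(f, y)])  # -> [2.0, 3.0, -1.0]
```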

  8. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but scientists have not truly focused on the quantitative performance of this technique. In order to demonstrate the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method, from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of robust methods. In this context, when the Design Space optimization provides a guarantee of quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range wider than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC).
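
    The total error approach combines systematic error (bias) and random error (precision) into a single acceptance decision per concentration level. A simplified sketch, assuming a fixed coverage factor of 2 in place of the β-expectation tolerance interval used in real accuracy profiles; all numbers are illustrative:

```python
from statistics import mean, stdev

def accuracy_check(measured, nominal, acceptance_pct=5.0, k=2.0):
    """Simplified total-error decision at one concentration level.

    The interval mean +/- k*s, expressed relative to the nominal value,
    must fall entirely within +/- acceptance_pct.
    """
    m, s = mean(measured), stdev(measured)
    lo = 100 * (m - k * s - nominal) / nominal   # relative lower bound, %
    hi = 100 * (m + k * s - nominal) / nominal   # relative upper bound, %
    return -acceptance_pct <= lo and hi <= acceptance_pct

# Five replicate recoveries at a nominal level of 100 (arbitrary units)
print(accuracy_check([99, 101, 100, 98, 102], 100))  # -> True
```

    A method passes only where both the bias and the spread are small; a precise but strongly biased level (e.g. recoveries near 91) fails the same check.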

  9. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using a scanning electron microscope equipped with an energy dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but its discrimination power is limited by low sensitivity to trace elements and by difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and Particle Induced X-ray Emission (μ-PIXE), together with Elastic Backscattering Spectrometry (EBS), is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, the GSR particles analyzed all contained Pb, Ba and Sb. The precision of the method is assessed, and the grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here, with a confidence >99%, irrespective of the firearm or the population of particles selected.

  10. Quantitative determination of sibutramine in adulterated herbal slimming formulations by TLC-image analysis method.

    PubMed

    Phattanawasin, Panadda; Sotanaphun, Uthai; Sukwattanasinit, Tasamaporn; Akkarawaranthorn, Jariya; Kitchaiya, Sarunyaporn

    2012-06-10

    A simple thin layer chromatographic (TLC)-image analysis method was developed for the rapid determination and quantitation of sibutramine hydrochloride (SH) adulterated in herbal slimming products. Chromatographic separation of SH was achieved on a silica gel 60 F(254) TLC plate, using toluene-n-hexane-diethylamine (9:1:0.3, v/v/v) as the mobile phase and Dragendorff reagent for spot detection. Image analysis of the scanned TLC plate was performed to quantify the amount of SH. The polynomial regression data for the calibration plots showed a good linear relationship in the concentration range of 1-6 μg/spot. The limits of detection and quantitation were 190 and 634 ng/spot, respectively. The method gave satisfactory specificity, precision, accuracy and robustness, and was applied to the determination of SH in herbal formulations. The contents of SH in adulterated samples determined by TLC-image analysis and by TLC-densitometry were also compared.
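
    Detection and quantitation limits like those quoted above are commonly derived from the calibration fit. A minimal sketch using a straight-line calibration and the widely used 3.3σ/slope and 10σ/slope conventions; the data points are hypothetical, not the paper's:

```python
from math import sqrt

def calibration_limits(x, y):
    """Fit y = a + b*x, then estimate LOD = 3.3*s/b and LOQ = 10*s/b,
    where s is the standard deviation of the calibration residuals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    s = sqrt(sum(r * r for r in resid) / (n - 2))
    return 3.3 * s / b, 10 * s / b

# Hypothetical spot intensities for 1-6 ug/spot standards
amounts = [1, 2, 3, 4, 5, 6]
signals = [10.1, 19.9, 30.1, 39.9, 50.1, 59.9]
lod, loq = calibration_limits(amounts, signals)
print(round(loq / lod, 3))  # -> 3.03 by construction (10 / 3.3)
```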

  11. Quantitative methods for genome-scale analysis of in situ hybridization and correlation with microarray data

    PubMed Central

    Lee, Chang-Kyu; Sunkin, Susan M; Kuan, Chihchau; Thompson, Carol L; Pathak, Sayan; Ng, Lydia; Lau, Chris; Fischer, Shanna; Mortrud, Marty; Slaughterbeck, Cliff; Jones, Allan; Lein, Ed; Hawrylycz, Michael

    2008-01-01

    With the emergence of genome-wide colorimetric in situ hybridization (ISH) data sets such as the Allen Brain Atlas, it is important to understand the relationship between this gene expression modality and those derived from more quantitative based technologies. This study introduces a novel method for standardized relative quantification of colorimetric ISH signal that enables a large-scale cross-platform expression level comparison of ISH with two publicly available microarray brain data sources. PMID:18234097

  12. Models and methods for quantitative analysis of surface-enhanced Raman spectra.

    PubMed

    Li, Shuo; Nyagilo, James O; Dave, Digant P; Gao, Jean

    2014-03-01

    The quantitative analysis of surface-enhanced Raman spectra using scattering nanoparticles has shown potential and promising applications in in vivo molecular imaging. Diverse approaches have been used for the quantitative analysis of Raman spectral information; they can be categorized as direct classical least squares models, full-spectrum multivariate calibration models, selected multivariate calibration models, and latent variable regression (LVR) models. However, the working principles of these methods in Raman spectra applications remain poorly understood, and a clear picture of the overall performance of each model is missing. Based on the characteristics of Raman spectra, in this paper we first provide the theoretical foundation of the aforementioned commonly used models and show why the LVR models are more suitable for quantitative analysis of Raman spectra. Then, we demonstrate the fundamental connections and differences between different LVR methods, such as principal component regression, reduced-rank regression, partial least squares regression (PLSR), canonical correlation regression, and robust canonical analysis, by comparing their objective functions and constraints. We further show that PLSR is essentially a blend of multivariate calibration and feature extraction that relates concentrations of nanotags to spectrum intensity. These features (a.k.a. latent variables) serve two purposes: the best representation of the predictor matrix and correlation with the response matrix. These illustrations give a new understanding of traditional PLSR and explain why PLSR exceeds other methods in the quantitative analysis of Raman spectra. Finally, all the methods are tested on Raman spectra datasets with different evaluation criteria to assess their performance.
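
    The dual role of PLSR latent variables (representing the predictor matrix while correlating with the response) is easiest to see in the one-component PLS1 case, where the weight vector maximizes the covariance of the score Xw with y. A minimal pure-Python sketch on a synthetic rank-one dataset, not tied to any Raman data:

```python
def pls1_one_component(X, y):
    """One-component PLS1 regression on mean-centered inputs.

    The weight vector w (proportional to X^T y) maximizes the covariance
    of the score t = X w with y; y is then regressed on t.
    """
    n, p = len(X), len(X[0])
    mx = [sum(row[j] for row in X) / n for j in range(p)]
    my = sum(y) / n
    Xc = [[row[j] - mx[j] for j in range(p)] for row in X]
    yc = [v - my for v in y]
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    b = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    coef = [wj * b for wj in w]
    predict = lambda row: my + sum((row[j] - mx[j]) * coef[j]
                                   for j in range(p))
    return coef, predict

# Collinear predictors (x2 = 2*x1), y = x1 + x2: one latent variable suffices
X = [[1, 2], [2, 4], [3, 6], [4, 8]]
y = [3, 6, 9, 12]
coef, predict = pls1_one_component(X, y)
print(round(predict([5, 10]), 6))  # -> 15.0
```

    Ordinary least squares is ill-conditioned on such collinear columns; the single latent variable handles them naturally, which is one reason PLSR suits highly collinear spectral data.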

  13. Bioinformatics Methods and Tools to Advance Clinical Care

    PubMed Central

    Lecroq, T.

    2015-01-01

    Summary Objectives To summarize excellent current research in the field of Bioinformatics and Translational Informatics with applications in the health domain and clinical care. Method We provide a synopsis of the articles selected for the IMIA Yearbook 2015, from which we attempt to derive a synthetic overview of current and future activities in the field. As last year, a first selection step was performed by querying MEDLINE with a list of MeSH descriptors completed by a list of terms adapted to the section. Each section editor separately evaluated the set of 1,594 articles, and the evaluation results were merged to retain 15 articles for peer review. Results The selection and evaluation process of this Yearbook's section on Bioinformatics and Translational Informatics yielded four excellent articles regarding data management and genome medicine that are mainly tool-based papers. In the first article, the authors present PPISURV, a tool for uncovering the role of specific genes in cancer survival outcome. The second article describes the classifier PredictSNP, which combines six well-performing tools for predicting disease-related mutations. In the third article, by presenting a high-coverage map of the human proteome using high-resolution mass spectrometry, the authors highlight the need to use mass spectrometry to complement genome annotation. The fourth article is also related to patient survival and decision support: the authors present data-mining methods for large-scale datasets of past transplants, with the objective of identifying chances of survival. Conclusions The current research activities still attest to the continuous convergence of Bioinformatics and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care.
Indeed, there is a need for powerful tools for managing and interpreting complex, large-scale genomic and biological datasets, but also a need for user-friendly tools developed for the clinicians in their

  14. Advances in Experimental Neuropathology: New Methods and Insights.

    PubMed

    Roth, Kevin A

    2016-03-01

    This Editorial introduces this month's special Neuropathology Theme Issue, a series of Reviews on advances in our understanding of rare human hereditary neuropathies, peripheral nervous system tumors, and common degenerative diseases.

  15. Test characteristics of urinary biomarkers depend on quantitation method in acute kidney injury.

    PubMed

    Ralib, Azrina Md; Pickering, John W; Shaw, Geoffrey M; Devarajan, Prasad; Edelstein, Charles L; Bonventre, Joseph V; Endre, Zoltan H

    2012-02-01

    The concentration of urine influences the concentration of urinary biomarkers of AKI. Whether normalization to urinary creatinine concentration, as commonly performed to quantitate albuminuria, is the best method to account for variations in urinary biomarker concentration among patients in the intensive care unit is unknown. Here, we compared the diagnostic and prognostic performance of three methods of biomarker quantitation: absolute concentration, biomarker normalized to urinary creatinine concentration, and biomarker excretion rate. We measured urinary concentrations of alkaline phosphatase, γ-glutamyl transpeptidase, cystatin C, neutrophil gelatinase-associated lipocalin, kidney injury molecule-1, and IL-18 in 528 patients on admission and after 12 and 24 hours. Absolute concentration best diagnosed AKI on admission, but normalized concentrations best predicted death, dialysis, or subsequent development of AKI. Excretion rate on admission did not diagnose or predict outcomes better than either absolute or normalized concentration. Estimated 24-hour biomarker excretion associated with AKI severity, and for neutrophil gelatinase-associated lipocalin and cystatin C, with poorer survival. In summary, normalization to urinary creatinine concentration improves the prediction of incipient AKI and outcome but provides no advantage in diagnosing established AKI. The ideal method for quantitating biomarkers of urinary AKI depends on the outcome of interest.
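
    The three quantitation methods compared in this study differ only in their denominator: raw concentration, concentration per unit creatinine, or mass excreted per unit time. A small sketch of the arithmetic, with hypothetical units (biomarker in ng/mL, urinary creatinine in mg/mL, urine flow in mL/min):

```python
def biomarker_measures(conc_ng_ml, creatinine_mg_ml, flow_ml_min):
    """Return the three quantitation forms for one urine sample."""
    return {
        "absolute_ng_ml": conc_ng_ml,
        "normalized_ng_mg": conc_ng_ml / creatinine_mg_ml,  # per mg creatinine
        "excretion_ng_min": conc_ng_ml * flow_ml_min,       # mass per minute
    }

# Dilute urine: the absolute level looks low; normalization compensates
print(biomarker_measures(50.0, 0.5, 2.0))
```

    In dilute urine the creatinine concentration drops along with the biomarker, so the normalized value corrects for urine concentration, which is why normalization helped prediction of incipient AKI while the absolute concentration sufficed for diagnosing established AKI.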

  16. ADVANCED SEISMIC BASE ISOLATION METHODS FOR MODULAR REACTORS

    SciTech Connect

    E. Blanford; E. Keldrauk; M. Laufer; M. Mieler; J. Wei; B. Stojadinovic; P.F. Peterson

    2010-09-20

    Advanced technologies for structural design and construction have the potential for major impact not only on nuclear power plant construction time and cost, but also on the design process and on the safety, security and reliability of next generation of nuclear power plants. In future Generation IV (Gen IV) reactors, structural and seismic design should be much more closely integrated with the design of nuclear and industrial safety systems, physical security systems, and international safeguards systems. Overall reliability will be increased, through the use of replaceable and modular equipment, and through design to facilitate on-line monitoring, in-service inspection, maintenance, replacement, and decommissioning. Economics will also receive high design priority, through integrated engineering efforts to optimize building arrangements to minimize building heights and footprints. Finally, the licensing approach will be transformed by becoming increasingly performance based and technology neutral, using best-estimate simulation methods with uncertainty and margin quantification. In this context, two structural engineering technologies, seismic base isolation and modular steel-plate/concrete composite structural walls, are investigated. These technologies have major potential to (1) enable standardized reactor designs to be deployed across a wider range of sites, (2) reduce the impact of uncertainties related to site-specific seismic conditions, and (3) alleviate reactor equipment qualification requirements. For Gen IV reactors the potential for deliberate crashes of large aircraft must also be considered in design. This report concludes that base-isolated structures should be decoupled from the reactor external event exclusion system. As an example, a scoping analysis is performed for a rectangular, decoupled external event shell designed as a grillage. 
This report also reviews modular construction technology, particularly steel-plate/concrete construction using

  17. A pilot study of quantitative assessment of mandible advancement using pressure-flow relationship during midazolam sedation.

    PubMed

    Ayuse, T; Hoshino, Y; Inazawa, T; Oi, K; Schneider, H; Schwartz, A R

    2006-11-01

    It has been proposed that titration of a mandibular positioner would be a promising method for predicting the outcome of nasal continuous positive airway pressure (CPAP) therapy. This study was carried out to test the hypothesis that mandible advancement can be evaluated by analysis of inspiratory flow limitation during a titration procedure. To explore its effect, we examined upper airway pressure-flow relationships using a titrated mandible positioner during midazolam sedation. Non-flow-limited inspiration occurred when the mandible was advanced 7.1 +/- 1.2 mm from the centric occlusion position. In the centric occlusion position (0 mm advancement), the critical closing pressure (Pcrit) was -1.9 +/- 2.9 cmH2O and the upstream resistance (Rua) was 23.3 +/- 4.5 cmH2O L(-1) s(-1). In the eMAP position, Pcrit was -7.3 +/- 1.9 cmH2O and Rua was 27.8 +/- 3.3 cmH2O L(-1) s(-1). Essentially no CPAP was required to overcome flow limitation in the eMAP position, whereas 3.7 +/- 2.2 cmH2O CPAP was required in the centric occlusion position. We conclude that assessing inspiratory flow limitation with a titrated mandible positioner is effective for estimating individually matched mandible positions.
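
    In the Starling resistor model that underlies such pressure-flow measurements, maximal inspiratory flow during flow limitation rises linearly with nasal pressure: Vmax = (PN − Pcrit)/Rua. Pcrit and Rua can therefore be read off a straight-line fit of flow against pressure, as in this sketch with hypothetical flow-limited breaths:

```python
def fit_pcrit_rua(pressures, flows):
    """Fit Vmax = (P - Pcrit) / Rua by simple linear regression.

    The slope equals 1/Rua; Pcrit is the zero-flow intercept on the
    pressure axis.
    """
    n = len(pressures)
    mp, mf = sum(pressures) / n, sum(flows) / n
    slope = sum((p - mp) * (f - mf) for p, f in zip(pressures, flows)) / \
            sum((p - mp) ** 2 for p in pressures)
    intercept = mf - slope * mp
    pcrit = -intercept / slope       # cmH2O
    rua = 1.0 / slope                # cmH2O L^-1 s^-1
    return pcrit, rua

# Hypothetical breaths generated with Pcrit = -5 cmH2O, Rua = 20 cmH2O/(L/s)
p = [-3.0, -1.0, 1.0, 3.0]
v = [(pi + 5.0) / 20.0 for pi in p]
pcrit, rua = fit_pcrit_rua(p, v)
print(pcrit, rua)  # recovers approximately -5.0 and 20.0
```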

  18. Automatic segmentation method of striatum regions in quantitative susceptibility mapping images

    NASA Astrophysics Data System (ADS)

    Murakawa, Saki; Uchiyama, Yoshikazu; Hirai, Toshinori

    2015-03-01

    Abnormal accumulation of brain iron has been detected in various neurodegenerative diseases. Quantitative susceptibility mapping (QSM) is a novel contrast mechanism in magnetic resonance (MR) imaging that enables quantitative analysis of local tissue susceptibility. Automatic segmentation tools for brain regions on QSM images would therefore help radiologists perform quantitative analysis in various neurodegenerative diseases. The purpose of this study was to develop an automatic segmentation and classification method for striatum regions on QSM images. Our image database consisted of 22 QSM images obtained from healthy volunteers. These images were acquired on a 3.0 T MR scanner; the voxel size was 0.9×0.9×2 mm and the matrix size of each slice image was 256×256 pixels. In our computerized method, a template matching technique was first used to detect a slice image containing striatum regions. An image registration technique was subsequently employed to classify striatum regions in consideration of anatomical knowledge. After the image registration, the voxels in the target image that correspond to striatum regions in the reference image were classified into three striatum regions, i.e., the head of the caudate nucleus, the putamen, and the globus pallidus. The experimental results indicated that 100% (21/21) of the slice images containing striatum regions were detected accurately. Subjective evaluation of the classification results indicated that 20 (95.2%) of 21 showed good or adequate quality. Our computerized method would be useful for the quantitative analysis of Parkinson disease in QSM images.
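
    The slice-detection step can be sketched as picking the slice whose intensities correlate best with a striatum template. A toy example on tiny 2D arrays (real QSM slices are 256×256), using Pearson correlation as the matching score; the images are made up for illustration:

```python
from math import sqrt

def correlation(a, b):
    """Pearson correlation between two equally sized 2D images."""
    fa = [v for row in a for v in row]
    fb = [v for row in b for v in row]
    n = len(fa)
    ma, mb = sum(fa) / n, sum(fb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(fa, fb))
    va = sum((x - ma) ** 2 for x in fa)
    vb = sum((y - mb) ** 2 for y in fb)
    return cov / sqrt(va * vb)

def best_slice(slices, template):
    """Index of the slice most similar to the template."""
    scores = [correlation(s, template) for s in slices]
    return scores.index(max(scores))

template = [[0, 5], [5, 0]]
slices = [[[1, 1], [1, 2]],          # low similarity
          [[2, 12], [12, 2]],        # same pattern, different scale/offset
          [[3, 1], [2, 4]]]
print(best_slice(slices, template))  # -> 1
```

    Because correlation is invariant to intensity scale and offset, the matching slice is found even when its absolute intensities differ from the template's.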

  19. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    PubMed

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. Key advantages of FIA-MS over chromatography-MS are fast analysis (typical run time <60 s) and method simplicity, and FIA-MS offers high-throughput without compromising sensitivity, precision and accuracy as much as ambient MS techniques. Consequently, FIA-MS is increasingly becoming recognized as a suitable technique for applications where quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis such as salting out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4(+) QuEChERS, and (3) the need to improve efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical

  20. The Isotope-Coded Affinity Tag Method for Quantitative Protein Profile Comparison and Relative Quantitation of Cysteine Redox Modifications.

    PubMed

    Chan, James Chun Yip; Zhou, Lei; Chan, Eric Chun Yong

    2015-11-02

    The isotope-coded affinity tag (ICAT) technique has been applied to measure pairwise changes in protein expression through differential stable isotopic labeling of proteins or peptides followed by identification and quantification using a mass spectrometer. Changes in protein expression are observed when the identical peptide from each of two biological conditions is identified and a difference is detected in the measurements comparing the peptide labeled with the heavy isotope to the one with a normal isotopic distribution. This approach allows the simultaneous comparison of the expression of many proteins between two different biological states (e.g., yeast grown on galactose versus glucose, or normal versus cancer cells). Due to the cysteine-specificity of the ICAT reagents, the ICAT technique has also been applied to perform relative quantitation of cysteine redox modifications such as oxidation and nitrosylation. This unit describes both protein quantitation and profiling of cysteine redox modifications using the ICAT technique.
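
    The pairwise comparison at the heart of ICAT reduces to a heavy-to-light intensity ratio per identified peptide, usually reported on a log2 scale so that up- and down-regulation are symmetric around zero. A small sketch with hypothetical intensities:

```python
from math import log2

def icat_log_ratios(pairs):
    """pairs: list of (heavy_intensity, light_intensity) tuples for a
    peptide observed across spectra; returns log2(heavy/light) per pair."""
    return [log2(heavy / light) for heavy, light in pairs]

# Hypothetical peptide observed in three spectra
ratios = icat_log_ratios([(2000.0, 1000.0), (4000.0, 1000.0), (1000.0, 1000.0)])
print(ratios)  # -> [1.0, 2.0, 0.0]
```

    A log2 ratio of 1.0 means the heavy-labeled condition carried twice the peptide abundance; 0.0 means no change between the two biological states.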

  1. Processing of alnico permanent magnets by advanced directional solidification methods

    SciTech Connect

    Zou, Min; Johnson, Francis; Zhang, Wanming; Zhao, Qi; Rutkowski, Stephen F.; Zhou, Lin; Kramer, Matthew J.

    2016-07-05

    Advanced directional solidification methods have been used to produce large (>15 cm length) castings of Alnico permanent magnets with highly oriented columnar microstructures. In combination with subsequent thermomagnetic and draw thermal treatment, this method was used to enable the high coercivity, high-Titanium Alnico composition of 39% Co, 29.5% Fe, 14% Ni, 7.5% Ti, 7% Al, 3% Cu (wt%) to have an intrinsic coercivity (Hci) of 2.0 kOe, a remanence (Br) of 10.2 kG, and an energy product (BH)max of 10.9 MGOe. These properties compare favorably to typical properties for the commercial Alnico 9. Directional solidification of higher Ti compositions yielded anisotropic columnar grained microstructures if high heat extraction rates through the mold surface of at least 200 kW/m2 were attained. This was achieved through the use of a thin walled (5 mm thick) high thermal conductivity SiC shell mold extracted from a molten Sn bath at a withdrawal rate of at least 200 mm/h. However, higher Ti compositions did not result in further increases in magnet performance. Images of the microstructures collected by scanning electron microscopy (SEM) reveal a majority α phase with inclusions of secondary αγ phase. Transmission electron microscopy (TEM) reveals that the α phase has a spinodally decomposed microstructure of FeCo-rich needles in a NiAl-rich matrix. In the 7.5% Ti composition the diameter distribution of the FeCo needles was bimodal with the majority having diameters of approximately 50 nm with a small fraction having diameters of approximately 10 nm. The needles formed a mosaic pattern and were elongated along one <001> crystal direction (parallel to the field used during magnetic annealing). Cu precipitates were observed between the needles. Regions of abnormal spinodal morphology appeared to correlate with secondary phase precipitates. The presence of these abnormalities did not prevent the material from displaying

  2. Processing of alnico permanent magnets by advanced directional solidification methods

    DOE PAGES

    Zou, Min; Johnson, Francis; Zhang, Wanming; ...

    2016-07-05

    Advanced directional solidification methods have been used to produce large (>15 cm length) castings of Alnico permanent magnets with highly oriented columnar microstructures. In combination with subsequent thermomagnetic and draw thermal treatment, this method was used to enable the high coercivity, high-Titanium Alnico composition of 39% Co, 29.5% Fe, 14% Ni, 7.5% Ti, 7% Al, 3% Cu (wt%) to have an intrinsic coercivity (Hci) of 2.0 kOe, a remanence (Br) of 10.2 kG, and an energy product (BH)max of 10.9 MGOe. These properties compare favorably to typical properties for the commercial Alnico 9. Directional solidification of higher Ti compositions yielded anisotropic columnar grained microstructures if high heat extraction rates through the mold surface of at least 200 kW/m2 were attained. This was achieved through the use of a thin walled (5 mm thick) high thermal conductivity SiC shell mold extracted from a molten Sn bath at a withdrawal rate of at least 200 mm/h. However, higher Ti compositions did not result in further increases in magnet performance. Images of the microstructures collected by scanning electron microscopy (SEM) reveal a majority α phase with inclusions of secondary αγ phase. Transmission electron microscopy (TEM) reveals that the α phase has a spinodally decomposed microstructure of FeCo-rich needles in a NiAl-rich matrix. In the 7.5% Ti composition the diameter distribution of the FeCo needles was bimodal with the majority having diameters of approximately 50 nm with a small fraction having diameters of approximately 10 nm. The needles formed a mosaic pattern and were elongated along one <001> crystal direction (parallel to the field used during magnetic annealing). Cu precipitates were observed between the needles. Regions of abnormal spinodal morphology appeared to correlate with secondary phase precipitates. The presence of these abnormalities did not prevent the material from displaying superior magnetic properties in the 7.5% Ti

  3. Processing of alnico permanent magnets by advanced directional solidification methods

    NASA Astrophysics Data System (ADS)

    Zou, Min; Johnson, Francis; Zhang, Wanming; Zhao, Qi; Rutkowski, Stephen F.; Zhou, Lin; Kramer, Matthew J.

    2016-12-01

    Advanced directional solidification methods have been used to produce large (>15 cm length) castings of Alnico permanent magnets with highly oriented columnar microstructures. In combination with subsequent thermomagnetic and draw thermal treatments, this method enabled a high-coercivity, high-titanium Alnico composition of 39% Co, 29.5% Fe, 14% Ni, 7.5% Ti, 7% Al, 3% Cu (wt%) to reach an intrinsic coercivity (Hci) of 2.0 kOe, a remanence (Br) of 10.2 kG, and an energy product (BH)max of 10.9 MGOe. These properties compare favorably with typical properties of commercial Alnico 9. Directional solidification of higher-Ti compositions yielded anisotropic columnar-grained microstructures if high heat-extraction rates through the mold surface, of at least 200 kW/m², were attained. This was achieved through the use of a thin-walled (5 mm thick), high-thermal-conductivity SiC shell mold extracted from a molten Sn bath at a withdrawal rate of at least 200 mm/h. However, higher-Ti compositions did not result in further increases in magnet performance. Images of the microstructures collected by scanning electron microscopy (SEM) reveal a majority α phase with inclusions of a secondary αγ phase. Transmission electron microscopy (TEM) reveals that the α phase has a spinodally decomposed microstructure of FeCo-rich needles in a NiAl-rich matrix. In the 7.5% Ti composition, the diameter distribution of the FeCo needles was bimodal, with the majority having diameters of approximately 50 nm and a small fraction having diameters of approximately 10 nm. The needles formed a mosaic pattern and were elongated along one <001> crystal direction (parallel to the field used during magnetic annealing). Cu precipitates were observed between the needles. Regions of abnormal spinodal morphology appeared to correlate with secondary-phase precipitates. The presence of these abnormalities did not prevent the material from displaying superior magnetic properties in the 7.5% Ti

  4. Criteria for quantitative and qualitative data integration: mixed-methods research methodology.

    PubMed

    Lee, Seonah; Smith, Carrol A M

    2012-05-01

    Many studies have emphasized the need for and importance of a mixed-methods approach in the evaluation of clinical information systems. However, those studies offered no criteria to guide the integration of multiple data sets. Integrating different data sets puts into practice the paradigm that a mixed-methods approach espouses; thus, criteria are needed that give the right direction for integrating quantitative and qualitative data. The first author used a set of criteria organized from a literature search on the integration of multiple data sets in mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data from a previous mixed-methods study, integration of quantitative and qualitative data was achieved in a systematic manner, which helped us obtain a better-organized understanding of the results. The criteria offered in this article have the potential to produce insightful analyses of mixed-methods evaluations of health information systems.

  5. On the quantitative method for measurement and analysis of the fine structure of Fraunhofer line profiles

    NASA Astrophysics Data System (ADS)

    Kuli-Zade, D. M.

    Methods for measuring and analyzing the fine structure of weak and moderate Fraunhofer line profiles are considered. The digital spectral material was obtained using rapid-scanning, high-dispersion, high-resolution double monochromators. The asymmetry-coefficient method, the bisector method, and a new quantitative method proposed by the author are discussed. New physical quantities of differential, integral, residual, and relative asymmetry are introduced for the first time. These quantitative values permit investigation of the dependence of asymmetry on microscopic (atomic) and macroscopic (photospheric) quantities. It is shown that the integral profile asymmetries grow appreciably with increasing line equivalent width. The average effective depths of formation of the Fraunhofer lines used are determined for the solar photosphere. It is shown that integral and residual asymmetries of the line profiles decrease noticeably with increasing effective formation depth, in good agreement with the intensity dependence of asymmetry. The above-mentioned methods are critically compared and the advantages of the author's method are shown. A computer program for calculating the line-profile asymmetry parameters has been developed.
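
    The bisector-based asymmetry measures named above can be sketched numerically. The following is a minimal illustration, not the author's exact definitions (which the abstract does not give): the bisector is the locus of midpoints of chords at equal residual-flux levels, and an integral asymmetry is taken here as the bisector displacement from the line core, integrated over depth.

```python
import numpy as np

def bisector(wl, flux, levels):
    """Bisector of an absorption line: midpoint of the chord joining the two
    wing points that share the same residual-flux level."""
    i_min = int(np.argmin(flux))
    blue_wl, blue_f = wl[:i_min + 1], flux[:i_min + 1]
    red_wl, red_f = wl[i_min:], flux[i_min:]
    wb = np.interp(levels, blue_f[::-1], blue_wl[::-1])  # blue wing: flux rises toward continuum
    wr = np.interp(levels, red_f, red_wl)                # red wing: flux rises with wavelength
    return 0.5 * (wb + wr)

def integral_asymmetry(wl, flux, n_levels=50):
    """Integral asymmetry: mean bisector displacement from the line core,
    times the depth range covered (continuum normalized to 1)."""
    levels = np.linspace(flux.min() + 1e-3, 1.0 - 1e-3, n_levels)
    bis = bisector(wl, flux, levels)
    core = wl[int(np.argmin(flux))]
    return float(np.mean(bis - core) * (levels[-1] - levels[0]))
```

A symmetric profile yields an integral asymmetry near zero; a profile with a broader red wing yields a positive value.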

  6. Quantitative evaluation of peptide-extraction methods by HPLC-triple-quad MS-MS.

    PubMed

    Du, Yan; Wu, Dapeng; Wu, Qian; Guan, Yafeng

    2015-02-01

    In this study, the efficiency of five peptide-extraction methods—acetonitrile (ACN) precipitation, ultrafiltration, C18 solid-phase extraction (SPE), dispersed SPE with mesoporous carbon CMK-3, and mesoporous silica MCM-41—was quantitatively investigated. With 28 tryptic peptides as target analytes, these methods were evaluated on the basis of recovery and reproducibility by using high-performance liquid chromatography-triple-quad tandem mass spectrometry in selected-reaction-monitoring mode. Because of the distinct extraction mechanisms of the methods, their preferences for extracting peptides of different properties were revealed to be quite different, usually depending on the pI values or hydrophobicity of peptides. When target peptides were spiked in bovine serum albumin (BSA) solution, the extraction efficiency of all the methods except ACN precipitation changed significantly. The binding of BSA with target peptides and nonspecific adsorption on adsorbents were believed to be the ways through which BSA affected the extraction behavior. When spiked in plasma, the performance of all five methods deteriorated substantially, with the number of peptides having recoveries exceeding 70% being 15 for ACN precipitation, and none for the other methods. Finally, the methods were evaluated in terms of the number of identified peptides for extraction of endogenous plasma peptides. Only ultrafiltration and CMK-3 dispersed SPE performed differently from the quantitative results with target peptides, and the wider distribution of the properties of endogenous peptides was believed to be the main reason.
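
    Recovery, the evaluation metric above, is computed from SRM peak areas. A minimal sketch follows; the function and argument names are illustrative, not from the paper:

```python
def recovery_percent(area_extracted, area_endogenous, area_spiked_standard):
    """Spike recovery: fraction of the added (spiked) peptide found after
    extraction, after subtracting any endogenous signal, in percent."""
    return 100.0 * (area_extracted - area_endogenous) / area_spiked_standard
```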

  7. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    SciTech Connect

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-15

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated on the basis of image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with unnormalized images, but in most cases also showed improvement compared with previous methods for correcting batch effects. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image-processing pipeline.
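
    The two ICHE stages can be sketched as follows. This is a simplified reading of the abstract: plain global histogram equalization stands in for the paper's modified CLAHE, and the target centroid value is an assumed parameter.

```python
import numpy as np

def center_intensity(img, target=0.5):
    """Stage 1: scale intensities so the histogram centroid (mean) moves to a
    common target point shared by all images."""
    img = np.asarray(img, dtype=float)
    return np.clip(img * (target / img.mean()), 0.0, 1.0)

def hist_equalize(img, bins=256):
    """Stage 2 (simplified): global histogram equalization via the empirical CDF
    (the paper uses a modified CLAHE instead)."""
    hist, edges = np.histogram(img, bins=bins, range=(0.0, 1.0))
    cdf = hist.cumsum() / hist.sum()
    return np.interp(img.ravel(), edges[:-1], cdf).reshape(img.shape)

def iche_like(img, target=0.5):
    return hist_equalize(center_intensity(img, target))
```

Two images with very different brightness end up with the same intensity centroid after stage 1, which is the mechanism that suppresses batch effects.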

  8. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    ERIC Educational Resources Information Center

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  9. Method for collecting and immobilizing individual cumulus cells enabling quantitative immunofluorescence analysis of proteins.

    PubMed

    Appeltant, R; Maes, D; Van Soom, A

    2015-07-01

    Most immunofluorescence methods rely on techniques dealing with a very large number of cells. However, when the number of cells in a sample is low (e.g., when cumulus cells must be analyzed from individual cumulus-oocyte complexes), specific techniques are required to conserve, fix, and analyze cells individually. We established and validated a simple and effective method for collecting and immobilizing low numbers of cumulus cells that enables easy and quick quantitative immunofluorescence analysis of proteins from individual cells. To illustrate this technique, we stained proprotein of a disintegrin and metalloproteinase with thrombospondin-like repeats-1 (proADAMTS-1) and analyzed its levels in individual porcine cumulus cells.

  10. A method for quantitative analysis of aquatic humic substances in clear water based on carbon concentration.

    PubMed

    Tsuda, Kumiko; Takata, Akihiro; Shirai, Hidekado; Kozaki, Katsutoshi; Fujitake, Nobuhide

    2012-01-01

    Aquatic humic substances (AHSs) are major constituents of dissolved organic matter (DOM) in freshwater, where they perform a number of important ecological and geochemical functions, yet no method exists for quantifying all AHSs. We have developed a method for the quantitative analysis of AHSs based on their carbon concentration. Our approach includes: (1) the development of techniques for clear-water samples with low AHS concentrations, which normally complicate quantification; (2) avoiding carbon contamination in the laboratory; and (3) optimizing the AHS adsorption conditions.

  11. A novel method for quantitative geosteering using azimuthal gamma-ray logging.

    PubMed

    Yuan, Chao; Zhou, Cancan; Zhang, Feng; Hu, Song; Li, Chaoliu

    2015-02-01

    A novel method for quantitative geosteering using azimuthal gamma-ray logging is proposed. Real-time up and bottom gamma-ray logs recorded as a logging tool travels through a boundary surface at different relative dip angles are simulated with the Monte Carlo method. The results show that the response points of the up and bottom gamma-ray logs as the tool moves toward a highly radioactive formation can be used to predict the relative dip angle, from which the distance from the drill bit to the boundary surface is then calculated.
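
    A toy version of the underlying geometry, not the authors' Monte-Carlo-calibrated model; the sensor separation and all names are assumptions:

```python
import math

def relative_dip_deg(sensor_separation, axial_lag):
    """Relative dip between borehole and bed boundary: the up and bottom gamma
    sensors, offset transversely by sensor_separation, respond to an approaching
    bed at axial positions offset by axial_lag, so tan(dip) = separation / lag."""
    return math.degrees(math.atan2(sensor_separation, axial_lag))

def distance_to_boundary(axial_distance, dip_deg):
    """Perpendicular distance from the bit to the boundary, given the axial
    distance still to travel along the borehole."""
    return axial_distance * math.sin(math.radians(dip_deg))
```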

  12. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  13. A fluorescence-quenching method for quantitative analysis of Ponceau 4R in beverage.

    PubMed

    Zhang, Jianpo; Na, Lihua; Jiang, Yunxia; Han, Dandan; Lou, Dawei; Jin, Li

    2017-04-15

    CdTe quantum dots were synthesized and used for the quantitative analysis of Ponceau 4R in solution. At an excitation wavelength of 380 nm, the emission of the CdTe quantum dots was quenched markedly by Ponceau 4R. To detect Ponceau 4R under mild conditions, the influences of the fluorescence emission wavelength of the CdTe quantum dots, pH, temperature, and reaction time were examined to establish the experimental conditions. The linear response of the fluorescence intensity of the CdTe quantum dots to Ponceau 4R allowed its quantitative analysis over the range 2.5-25 μg/mL, with a limit of detection of 0.025 μg/mL. In addition, the response mechanism of the reaction system was investigated in detail using the modified Stern-Volmer equation and thermodynamic calculations. The method was also used to quantitatively analyze a real sample, indicating that it could be applied more widely to similar samples.
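
    Quenching-based calibration rests on the Stern-Volmer relation F0/F = 1 + Ksv[Q]. A minimal sketch of fitting it, plus the conventional LOD = 3σ/slope definition, follows; the paper's modified Stern-Volmer treatment and thermodynamic analysis are not reproduced here, and all names are illustrative:

```python
import numpy as np

def stern_volmer_ksv(conc, F0, F):
    """Fit F0/F = 1 + Ksv*[Q] by least squares through the origin of the
    transformed data y = F0/F - 1; returns the quenching constant Ksv."""
    y = F0 / np.asarray(F, dtype=float) - 1.0
    c = np.asarray(conc, dtype=float)
    return float(np.sum(c * y) / np.sum(c * c))

def limit_of_detection(slope, sd_blank, k=3.0):
    """LOD = k * sigma_blank / calibration slope (k = 3 is the usual convention)."""
    return k * sd_blank / slope
```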

  14. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    NASA Astrophysics Data System (ADS)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and overwhelming related healthcare costs. In recent work, we have developed a multi-spectral, frequency-domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over purely ultrasonic or optical methods, as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of ultrasound propagation. These characteristics include both the speed of sound (SOS) and broadband ultrasonic attenuation (BUA). To test the method's quantitative predictions, we constructed a combined ultrasound and photoacoustic setup. Here, we present this dual-modality system and compare the two methods on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into bone structure and functional status.
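
    The two ultrasonic characteristics named above are conventionally computed as follows. This is a sketch under the standard substitution-method definitions; the water sound speed and variable names are assumptions, not values from the paper:

```python
import numpy as np

def speed_of_sound(thickness, arrival_advance, c_water=1482.0):
    """Substitution method: the sample (path length thickness, in m) replaces
    water; arrival_advance = t_water - t_sample is the earlier arrival through
    the faster sample. Solving t = d/c gives 1/c = 1/c_water - dt/d."""
    return 1.0 / (1.0 / c_water - arrival_advance / thickness)

def broadband_ultrasonic_attenuation(freqs_hz, attenuation_db):
    """BUA: slope (dB/MHz) of attenuation versus frequency, conventionally fit
    over roughly the 0.2-0.6 MHz band."""
    slope, _ = np.polyfit(np.asarray(freqs_hz) / 1e6, attenuation_db, 1)
    return float(slope)
```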

  15. Quantitative Evaluation of the Total Magnetic Moments of Colloidal Magnetic Nanoparticles: A Kinetics-based Method.

    PubMed

    Liu, Haiyi; Sun, Jianfei; Wang, Haoyao; Wang, Peng; Song, Lina; Li, Yang; Chen, Bo; Zhang, Yu; Gu, Ning

    2015-06-08

    A kinetics-based method is proposed to quantitatively characterize the collective magnetization of colloidal magnetic nanoparticles. The method is based on the relationship between the magnetic force on a colloidal droplet and the movement of the droplet under a gradient magnetic field. Through computational analysis of kinetic parameters such as displacement, velocity, and acceleration, the magnetization of colloidal magnetic nanoparticles can be calculated. In our experiments, the values measured using our method exhibited a better linear correlation with magnetothermal heating than those obtained with a vibrating-sample magnetometer or a magnetic balance. This finding indicates that the method may be better suited than the commonly used techniques for evaluating the collective magnetism of colloidal magnetic nanoparticles under low magnetic fields. Accurate evaluation of the magnetic properties of colloidal nanoparticles is of great importance for the standardization of magnetic nanomaterials and for their practical application in biomedicine.
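
    The kinematics-to-magnetization step can be sketched as below, with drag and buoyancy neglected for simplicity; the authors' full force balance is not given in the abstract, and all names are illustrative. The tracked displacement is fit with a parabola, the force is taken as F = m·a, and magnetization follows from M = F / (V · dB/dz):

```python
import numpy as np

def kinetic_magnetization(t, x, droplet_mass, particle_volume, grad_B):
    """Magnetization M (A/m) from droplet kinematics under a gradient field:
    fit x(t) with a parabola, read acceleration a from twice the leading
    coefficient, then M = (m * a) / (V * dB/dz). Drag is neglected here."""
    accel = 2.0 * np.polyfit(t, x, 2)[0]
    return float(droplet_mass * accel / (particle_volume * grad_B))
```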

  16. [Study of infrared spectroscopy quantitative analysis method for methane gas based on data mining].

    PubMed

    Zhang, Ai-Ju

    2013-10-01

    Monitoring of methane gas is one of the important factors affecting coal mine safety, and online real-time monitoring of methane is used for mine safety protection. To improve the accuracy of model analysis, the author uses infrared spectroscopy to study a quantitative gas-analysis algorithm. By applying data-mining techniques to the multi-component infrared quantitative analysis algorithm, it was found that a partial least squares algorithm preceded by cluster analysis is clearly superior in accuracy to partial least squares alone. In addition, to reduce the influence of errors in individual calibration samples on model accuracy, cluster analysis was used for data preprocessing; this denoising step was found to improve the analysis accuracy.
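
    The cluster-then-calibrate idea can be sketched in a few lines. This is a generic numpy-only illustration, not the paper's algorithm: ordinary least squares stands in for PLS, and the deterministic farthest-point k-means initialization is a convenience.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Lloyd's algorithm with farthest-point initialization (deterministic)."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min(((X[:, None] - np.array(centers)) ** 2).sum(-1), axis=1)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

class ClusterwiseCalibration:
    """Cluster the calibration spectra, then fit one linear model per cluster
    (OLS stands in for partial least squares in this sketch)."""

    def fit(self, X, y, k=2):
        self.centers, labels = kmeans(X, k)
        self.coefs = {}
        for j in range(k):
            Xj = np.c_[X[labels == j], np.ones(np.sum(labels == j))]
            self.coefs[j] = np.linalg.lstsq(Xj, y[labels == j], rcond=None)[0]
        return self

    def predict(self, X):
        labels = np.argmin(((X[:, None] - self.centers) ** 2).sum(-1), axis=1)
        return np.array([np.r_[x, 1.0] @ self.coefs[j] for x, j in zip(X, labels)])
```

Routing each spectrum to its cluster's local model is what lets the combined scheme beat a single global calibration when the data fall into distinct regimes.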

  17. Combinative Method Using Multi-components Quantitation and HPLC Fingerprint for Comprehensive Evaluation of Gentiana crassicaulis

    PubMed Central

    Song, Jiuhua; Chen, Fengzheng; Liu, Jiang; Zou, Yuanfeng; Luo, Yun; Yi, Xiaoyan; Meng, Jie; Chen, Xingfu

    2017-01-01

    Background: Gentiana crassicaulis is an important traditional Chinese herb. Like those of other herbs, its chemical compounds vary greatly with environmental and genetic factors; as a result, quality differs even among samples from the same region, so quality evaluation is necessary for its safe and effective use. In this study, a comprehensive method combining HPLC quantitative analysis and fingerprinting was developed to evaluate the quality of Cujingqinjiao and to classify samples collected from Lijiang City of Yunnan Province. A total of 30 common peaks, including four identified peaks, were found and were used for further characterization and quality control of Cujingqinjiao. Twenty-one batches of samples from Lijiang City of Yunnan Province were evaluated by similarity analysis (SA), hierarchical cluster analysis (HCA), principal component analysis (PCA), and factor analysis (FA) according to the characteristics of the common peaks. Results: The data showed good stability and repeatability of the chromatographic fingerprint; similarity values were all greater than 0.90. This study demonstrated that combining chromatographic quantitative analysis with fingerprinting offers an efficient way to evaluate the quality consistency of Cujingqinjiao. Consistent results showed that samples from the same origin could be successfully classified into two groups. Conclusion: This study revealed that the combinative method is reliable, simple, and sensitive for fingerprint analysis and, moreover, for quality control and pattern recognition of Cujingqinjiao.
SUMMARY: HPLC quantitative analysis and fingerprints were developed to evaluate the quality of Gentiana crassicaulis. Similarity analysis, hierarchical cluster analysis, principal component analysis and factor analysis were employed to analyze the chromatographic dataset. The results of multi-components quantitation analysis, similarity analysis, hierarchical cluster analysis, principal
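
    The similarity values quoted above (all greater than 0.90) are typically cosine (congruence) coefficients between common-peak-area vectors. A minimal sketch; in practice the reference vector would be the mean or median fingerprint of the batches:

```python
import numpy as np

def fingerprint_similarity(sample, reference):
    """Cosine similarity between two common-peak-area vectors, as used in
    chromatographic fingerprint similarity analysis."""
    s = np.asarray(sample, dtype=float)
    r = np.asarray(reference, dtype=float)
    return float(s @ r / (np.linalg.norm(s) * np.linalg.norm(r)))
```

Because the measure is scale-invariant, two chromatograms differing only by injection amount score 1.0.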

  18. Advanced Extraction Methods for Actinide/Lanthanide Separations

    SciTech Connect

    Scott, M.J.

    2005-12-01

    The separation of An(III) ions from chemically similar Ln(III) ions is perhaps one of the most difficult problems encountered during the processing of nuclear waste. In the 3+ oxidation state, the metal ions have an identical charge and roughly the same ionic radius. They differ strictly in the relative energies of their f- and d-orbitals, and to separate these metal ions, ligands will need to be developed that take advantage of this small but important distinction. The extraction of uranium and plutonium from nitric acid solution can be performed quantitatively with TBP (tributyl phosphate). Commercially, this process has found wide use in the PUREX (plutonium uranium extraction) reprocessing method. The TRUEX (transuranium extraction) process is further used to coextract the trivalent lanthanide and actinide ions from the HLLW generated during PUREX extraction. This method uses CMPO [(N,N-diisobutylcarbamoylmethyl)octylphenylphosphine oxide] intermixed with TBP as a synergistic agent. However, the final separation of trivalent actinides from trivalent lanthanides still remains a challenging task. In TRUEX nitric acid solution, the Am(III) ion is coordinated by three CMPO molecules and three nitrate anions. Taking inspiration from these data and previous work with calix[4]arene systems, researchers on this project have developed a C3-symmetric tris-CMPO ligand system using a triphenoxymethane platform as a base. Triphenoxymethane ligand systems have many advantages for the preparation of complex ligand systems. The compounds are very easy to prepare, and their steric and solubility properties can be tuned over an extreme range by including different alkoxy and alkyl groups such as methoxy, ethoxy, t-butoxy, methyl, octyl, or even t-pentyl at the ortho- and para-positions of the aryl rings.
The triphenoxymethane ligand system shows promise as an improved extractant for both tetravalent and trivalent actinide recoveries from

  19. Prognostic Value of Quantitative Metabolic Metrics on Baseline Pre-Sunitinib FDG PET/CT in Advanced Renal Cell Carcinoma

    PubMed Central

    Minamimoto, Ryogo; Barkhodari, Amir; Harshman, Lauren; Srinivas, Sandy; Quon, Andrew

    2016-01-01

    Purpose The objective of this study was to prospectively evaluate various quantitative metrics on FDG PET/CT for monitoring sunitinib therapy and predicting prognosis in patients with metastatic renal cell cancer (mRCC). Methods Seventeen patients (mean age: 59.0 ± 11.6 years) prospectively underwent a baseline FDG PET/CT and an interim PET/CT after 2 cycles (12 weeks) of sunitinib therapy. We measured the highest maximum standardized uptake value (SUVmax) of all identified lesions (highest SUVmax), the sum of SUVmax for up to six lesions (sum of SUVmax), total lesion glycolysis (TLG), and metabolic tumor volume (MTV) on baseline and interim PET/CT, as well as the % decrease in highest SUVmax (%Δ highest SUVmax), in sum of SUVmax, in TLG (%ΔTLG), and in MTV (%ΔMTV) between baseline and interim PET/CT. The imaging results were validated against clinical follow-up at 12 months after completion of therapy for progression-free survival (PFS). Results At 12-month follow-up, 6/17 (35.3%) patients achieved PFS, while 11/17 (64.7%) patients were deemed to have progression of disease or recurrence within the previous 12 months. At baseline, PET/CT demonstrated metabolically active cancer in all cases. Using baseline PET/CT alone, all of the quantitative imaging metrics were predictive of PFS. Using interim PET/CT, the %Δ highest SUVmax, %Δ sum of SUVmax, and %ΔTLG were also predictive of PFS. Otherwise, interim PET/CT showed no significant difference between the two survival groups regardless of the quantitative metric utilized, including MTV and TLG. Conclusions Quantitative metabolic measurements on baseline PET/CT appear to be predictive of PFS at 12 months post-therapy in patients scheduled to undergo sunitinib therapy for mRCC. Change between baseline and interim PET/CT also appeared to have prognostic value, but otherwise interim PET/CT after 12 weeks of sunitinib did not appear to be predictive of PFS. PMID:27123976
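
    The interval metrics used above reduce to simple formulas; a sketch with illustrative names (TLG is conventionally the metabolic tumor volume multiplied by the lesion's mean SUV):

```python
def percent_decrease(baseline, interim):
    """%Δ between baseline and interim PET/CT; positive values mean the
    metric decreased under therapy."""
    return 100.0 * (baseline - interim) / baseline

def total_lesion_glycolysis(mtv_ml, suv_mean):
    """TLG = metabolic tumor volume (mL) x mean SUV of the lesion."""
    return mtv_ml * suv_mean
```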

  20. Quantitative Methods for Reservoir Characterization and Improved Recovery: Application to Heavy Oil Sands

    SciTech Connect

    Castle, James W.; Molz, Fred J.

    2003-02-07

    Improved prediction of interwell reservoir heterogeneity is needed to increase productivity and to reduce recovery cost for California's heavy oil sands, which contain approximately 2.3 billion barrels of remaining reserves in the Temblor Formation and in other formations of the San Joaquin Valley. This investigation involved application of advanced analytical property-distribution methods conditioned to continuous outcrop control for improved reservoir characterization and simulation.

  1. General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models

    PubMed Central

    de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael

    2016-01-01

    Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. PMID:27591750

  2. Novel method for quantitative ANA measurement using near-infrared imaging.

    PubMed

    Peterson, Lisa K; Wells, Daniel; Shaw, Laura; Velez, Maria-Gabriela; Harbeck, Ronald; Dragone, Leonard L

    2009-09-30

    Antinuclear antibodies (ANA) have been detected in patients with systemic rheumatic diseases and are used in the screening and/or diagnosis of autoimmunity in patients as well as mouse models of systemic autoimmunity. Indirect immunofluorescence (IIF) on HEp-2 cells is the gold standard for ANA screening. However, its usefulness is limited in diagnosis, prognosis and monitoring of disease activity due to the lack of standardization in performing the technique, subjectivity in interpreting the results and the fact that it is only semi-quantitative. Various immunological techniques have been developed in an attempt to improve upon the method to quantify ANA, including enzyme-linked immunosorbent assays (ELISAs), line immunoassays (LIAs), multiplexed bead immunoassays and IIF on substrates other than HEp-2 cells. Yet IIF on HEp-2 cells remains the most common screening method for ANA. In this study, we describe a simple quantitative method to detect ANA which combines IIF on HEp-2 coated slides with analysis using a near-infrared imaging (NII) system. Using NII to determine ANA titer, 86.5% (32 of 37) of the titers for human patient samples were within 2 dilutions of those determined by IIF, which is the acceptable range for proficiency testing. Combining an initial screening for nuclear staining using microscopy with titration by NII resulted in 97.3% (36 of 37) of the titers detected to be within two dilutions of those determined by IIF. The NII method for quantitative ANA measurements using serum from both patients and mice with autoimmunity provides a fast, relatively simple, objective, sensitive and reproducible assay, which could easily be standardized for comparison between laboratories.
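
    The within-two-dilutions agreement criterion used for validation above is easy to state in code. A sketch, where titers are expressed as reciprocal dilutions (e.g. 40, 80, 160) in a two-fold series:

```python
import math

def within_n_dilutions(titer_a, titer_b, n=2):
    """True if two endpoint titers differ by at most n two-fold dilution steps."""
    return abs(math.log2(titer_a) - math.log2(titer_b)) <= n + 1e-9
```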

  3. General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.

    PubMed

    de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael

    2016-11-01

    Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population.
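
    The central computation described above, integrating an inverse-link transform of latent values over the latent distribution, can be sketched numerically (QGglmm is the R implementation referenced in the abstract; this toy uses the exp inverse link of a Poisson-log GLMM, for which the closed form E = exp(mu + var/2) is known and serves as a check):

```python
import numpy as np

def observed_scale_mean(mu, var, inv_link=np.exp, half_width=8.0, n=4001):
    """E[g^{-1}(l)] for a latent value l ~ N(mu, var): integrate the inverse
    link against the normal density on a fine grid (simple Riemann sum)."""
    sd = np.sqrt(var)
    l = np.linspace(mu - half_width * sd, mu + half_width * sd, n)
    pdf = np.exp(-0.5 * ((l - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))
    return float(np.sum(inv_link(l) * pdf) * (l[1] - l[0]))
```

The same grid-integration pattern extends to observed-scale variances and heritabilities by integrating other link-determined quantities, which is what the paper's expressions formalize.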

  4. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical property of a liquid. In this study, we propose a method for solution identification and quantitative concentration analysis based on multivariate statistics. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze different solution libraries and establish discriminant functions. Test solutions can be discriminated by these functions. After determining the variety of test solutions, Spearman correlation test and principal components analysis are used to filter and reduce dimensions of eight characteristic values, producing a new representative parameter. A cubic spline interpolation function is built between the parameters and concentrations, based on which we can calculate the concentration of the test solution. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen to be the standard library, and the other two concentrations compose the test group. By using the methods mentioned above, all eight test solutions are correctly identified and the average relative error of quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of liquid recognition based on the COT and also improves the precision of quantitative concentration analysis.
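The reduce-then-interpolate step of this pipeline can be sketched as follows. The eight "characteristic values" below are synthetic stand-ins (linear in concentration plus noise, a convenient assumption), PCA stands in for the paper's filtering and dimension reduction, and scipy's `CubicSpline` plays the role of the interpolation function.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(0)
conc = np.linspace(5.0, 50.0, 10)        # training concentrations (e.g. vol%)
w = rng.uniform(0.5, 1.5, 8)             # hypothetical feature response weights
X = np.outer(conc, w) + rng.normal(0.0, 0.1, (10, 8))  # 8 COT characteristic values

# dimension reduction: first principal component as the representative parameter
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
score = (X - mean) @ Vt[0]

# cubic spline from representative parameter to concentration
order = np.argsort(score)                # spline needs increasing abscissae
spline = CubicSpline(score[order], conc[order])

# quantify a held-out sample of known concentration 27.5
x_test = 27.5 * w + rng.normal(0.0, 0.1, 8)
est = float(spline((x_test - mean) @ Vt[0]))
```

With features this clean the estimate lands very close to 27.5; the paper's 1.11% average relative error reflects real, noisier traces.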

  5. An evolutionary method for synthesizing technological planning and architectural advance

    NASA Astrophysics Data System (ADS)

    Cole, Bjorn Forstrom

    In the development of systems with ever-increasing performance and/or decreasing drawbacks, there inevitably comes a point where more progress is available by shifting to a new set of principles of use. This shift marks a change in architecture, such as between the piston-driven propeller and the jet engine. The shift also often involves an abandonment of previous competencies that have been developed with great effort, and so a foreknowledge of these shifts can be advantageous. A further motivation for this work is the consideration of the Micro Autonomous Systems and Technology (MAST) project, which aims to develop very small (<5 cm) robots for a variety of uses. This is primarily a technology research project, and there is no baseline morphology for a robot to be considered. This then motivates an interest in the ability to automatically compose physical architectures from a series of components and quantitatively analyze them for a basic, conceptual analysis. The ability to do this would enable researchers to turn attention to the most promising forms. This work presents a method for using technology forecasts of components that enable future architectural shifts in order to forecast those shifts. The method consists of the use of multidimensional S-curves, genetic algorithms, and a graph-based formulation of architecture that is more flexible than other morphological techniques. Potential genetic operators are explored in depth to draft a final graph-based genetic algorithm. This algorithm is then implemented in a design code called Sindri, which leverages a commercial design tool named Pacelab. The first chapters of this thesis provide context and a philosophical background to the studies and research that was conducted. In particular, the idea that technology progresses in a fundamentally gradual way is developed and supported with previous historical research. 
The import of this is that the future can to some degree be predicted by the past, provided that
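The evolutionary core of the method above can be sketched with a deliberately tiny genetic algorithm over a graph-of-components encoding: the genome is a list of on/off bits over candidate edges between components. The component set, fitness function, and operator choices here are invented placeholders for the thesis's much richer graph formulation and the Sindri/Pacelab implementation.

```python
import random

random.seed(1)
N = 5                                              # candidate components
EDGES = [(i, j) for i in range(N) for j in range(i + 1, N)]

def fitness(genome):
    # toy objective: reward connections to component 0 (the "payload"),
    # penalize total edge count as a stand-in for mass/complexity
    benefit = sum(g for g, (i, j) in zip(genome, EDGES) if 0 in (i, j))
    cost = 0.3 * sum(genome)
    return benefit - cost

def crossover(a, b):
    # uniform crossover: each edge bit inherited from either parent
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(genome, rate=0.05):
    # flip each edge bit with small probability
    return [bit ^ (random.random() < rate) for bit in genome]

pop = [[random.randint(0, 1) for _ in EDGES] for _ in range(30)]
for _ in range(50):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(20)]
best = max(pop, key=fitness)
```

Under this toy fitness the optimum keeps exactly the four payload edges (fitness 2.8); elitist selection finds it or comes very close within 50 generations.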

  6. Quantitative analysis of trace chromium in blood samples. Combination of the advanced oxidation process with catalytic adsorptive stripping voltammetry.

    PubMed

    Yong, Li; Armstrong, Kristie C; Dansby-Sparks, Royce N; Carrington, Nathan A; Chambers, James Q; Xue, Zi-Ling

    2006-11-01

    A new method for pretreating blood samples for trace Cr analysis is described. The advanced oxidation process (AOP with H2O2 and 5.5-W UV irradiation for 60 min) is used to remove biological/organic species for subsequent analysis. Prior to the AOP pretreatment, acid (HNO3) is used at pH 3.0 to inhibit the enzyme catalase in the blood samples. Catalytic adsorptive stripping voltammetry at a bismuth film electrode gives a Cr concentration of 6.0 +/- 0.3 ppb in the blood samples. This concentration was confirmed by dry-ashing the blood samples and subsequent analysis by atomic absorption spectroscopy. This current method may be used to monitor chromium, a trace metal in humans, and the efficacy and safety of chromium supplements as adjuvant therapy for diabetes.

  7. Development and application of quantitative detection method for viral hemorrhagic septicemia virus (VHSV) genogroup IVa.

    PubMed

    Kim, Jong-Oh; Kim, Wi-Sik; Kim, Si-Woo; Han, Hyun-Ja; Kim, Jin Woo; Park, Myoung Ae; Oh, Myung-Joo

    2014-05-23

Viral hemorrhagic septicemia virus (VHSV) is a problematic pathogen in olive flounder (Paralichthys olivaceus) aquaculture farms in Korea. Thus, it is necessary to develop a rapid and accurate diagnostic method to detect this virus. We developed a quantitative RT-PCR (qRT-PCR) method based on the nucleocapsid (N) gene sequence of Korean VHSV isolate (Genogroup IVa). The slope and R² values of the primer set developed in this study were -0.2928 (96% efficiency) and 0.9979, respectively. Its comparison with viral infectivity calculated by the traditional quantification method (TCID₅₀) showed a similar pattern of kinetic changes in vitro and in vivo. The qRT-PCR method reduced detection time compared to that of TCID₅₀, making it a very useful tool for VHSV diagnosis.
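A note on the reported numbers: a slope of -0.2928 yielding 96% efficiency is consistent with a standard curve fitted as log10(copies) against Ct; the more familiar Ct-versus-log10(copies) parameterisation has an ideal slope near -3.32. This reading is our inference, not stated in the record; the sketch below relates the two conventions.

```python
def efficiency_from_ct_slope(slope):
    """Standard curve fitted as Ct = a + slope * log10(copies);
    ideal slope is about -3.32 (100% efficiency)."""
    return 10.0 ** (-1.0 / slope) - 1.0

def efficiency_from_inverse_slope(slope):
    """Curve fitted the other way round: log10(copies) = a + slope * Ct."""
    return 10.0 ** (-slope) - 1.0

# the record's slope of -0.2928 matches its reported ~96% efficiency
# only under the inverse parameterisation:
eff = efficiency_from_inverse_slope(-0.2928)   # ~0.96, i.e. 96%
```

Either way, efficiency near 100% with R² of 0.9979 indicates a well-behaved assay.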

  8. Development and Evaluation of Event-Specific Quantitative PCR Method for Genetically Modified Soybean MON87701.

    PubMed

    Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument. The determined Cf for the real-time PCR instrument was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases and the RSDr values were less than 30 and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method would thus be applicable to practical analyses for the detection and quantification of MON87701.
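The role of the conversion factor can be shown in one line. In the usual real-time PCR scheme for GMO quantification, the GM amount is the event-to-endogenous copy-number ratio divided by Cf (the ratio observed in 100% GM material); the copy numbers below are hypothetical.

```python
def gmo_percent(event_copies, endogenous_copies, cf):
    """GM content (%) from a copy-number ratio and the conversion factor Cf,
    following the usual real-time PCR quantification scheme."""
    return (event_copies / endogenous_copies) / cf * 100.0

# with the paper's Cf of 1.24, a sample with 6.2 event copies per 1000
# endogenous copies sits right at the 0.5% limit of quantitation
pct = gmo_percent(6.2, 1000.0, 1.24)
```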

  9. Rapid quantitative method for total brominated vegetable oil in soft drinks using ion chromatography.

    PubMed

    Yousef, Ashraf A; Abbas, Alaa B; Badawi, Bassam Sh; Al-Jowhar, Wafaa Y; Zain, Esam A; El-Mufti, Seham A

    2012-08-01

A simple, quantitative and rapid method for total brominated vegetable oil (BVO) using ion chromatography (IC) with suppressed conductivity detection was developed and successfully applied to soft drinks with results expressed as inorganic bromide anion. The procedure involves extraction of BVO with diethyl ether and treatment with zinc dust in a solution of acetic acid, giving recoveries ranging between 92.5 and 98.5%. The calibration curves obtained were linear with correlation coefficients (r²) of 0.998, a coefficient of variation (CV) of less than 5% and limit of detection (LOD) and limit of quantification (LOQ) of 250 and 750 µg l⁻¹, respectively. The method was successfully applied to the determination of BVO in several commercial soft drinks which were found to contain BVO in the range 1.8-14.510 mg l⁻¹. The method has fewer sources of error than previously published methods.

  10. [Application and Integration of Qualitative and Quantitative Research Methods in Intervention Studies in Rehabilitation Research].

    PubMed

    Wirtz, M A; Strohmer, J

    2016-06-01

In order to develop and evaluate interventions in rehabilitation research, a wide range of empirical research methods may be adopted. Qualitative research methods emphasize the relevance of an open research focus and a natural proximity to research objects. Accordingly, qualitative methods offer particular benefits when researchers aim to identify and organize previously unknown aspects of information (inductive purpose). Quantitative research methods, in contrast, require a high degree of standardization and transparency of the research process, and rest on clear definitions of efficacy and effectiveness (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes and quality criteria. Hence, specific aspects must be considered if researchers aim to select or combine these approaches in order to ensure an optimal gain in knowledge.

  11. Aspects of bioanalytical method validation for the quantitative determination of trace elements.

    PubMed

    Levine, Keith E; Tudan, Christopher; Grohse, Peter M; Weber, Frank X; Levine, Michael A; Kim, Yu-Seon J

    2011-08-01

    Bioanalytical methods are used to quantitatively determine the concentration of drugs, biotransformation products or other specified substances in biological matrices and are often used to provide critical data to pharmacokinetic or bioequivalence studies in support of regulatory submissions. In order to ensure that bioanalytical methods are capable of generating reliable, reproducible data that meet or exceed current regulatory guidance, they are subjected to a rigorous method validation process. At present, regulatory guidance does not necessarily account for nuances specific to trace element determinations. This paper is intended to provide the reader with guidance related to trace element bioanalytical method validation from the authors' perspective for two prevalent and powerful instrumental techniques: inductively coupled plasma-optical emission spectrometry and inductively coupled plasma-MS.

  12. Development and Application of Quantitative Detection Method for Viral Hemorrhagic Septicemia Virus (VHSV) Genogroup IVa

    PubMed Central

    Kim, Jong-Oh; Kim, Wi-Sik; Kim, Si-Woo; Han, Hyun-Ja; Kim, Jin Woo; Park, Myoung Ae; Oh, Myung-Joo

    2014-01-01

    Viral hemorrhagic septicemia virus (VHSV) is a problematic pathogen in olive flounder (Paralichthys olivaceus) aquaculture farms in Korea. Thus, it is necessary to develop a rapid and accurate diagnostic method to detect this virus. We developed a quantitative RT-PCR (qRT-PCR) method based on the nucleocapsid (N) gene sequence of Korean VHSV isolate (Genogroup IVa). The slope and R2 values of the primer set developed in this study were −0.2928 (96% efficiency) and 0.9979, respectively. Its comparison with viral infectivity calculated by traditional quantifying method (TCID50) showed a similar pattern of kinetic changes in vitro and in vivo. The qRT-PCR method reduced detection time compared to that of TCID50, making it a very useful tool for VHSV diagnosis. PMID:24859343

  13. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  14. Chemical comparison of Tripterygium wilfordii and Tripterygium hypoglaucum based on quantitative analysis and chemometrics methods.

    PubMed

    Guo, Long; Duan, Li; Liu, Ke; Liu, E-Hu; Li, Ping

    2014-07-01

Tripterygium wilfordii (T. wilfordii) and Tripterygium hypoglaucum (T. hypoglaucum), two commonly used Chinese herbal medicines derived from Tripterygium genus, have been widely used for the treatment of rheumatoid arthritis and other related inflammatory diseases in clinical therapy. In the present study, a rapid resolution liquid chromatography/electrospray ionization tandem mass spectrometry (RRLC-ESI-MS(n)) method has been developed and validated for simultaneous determination of 19 bioactive compounds including four catechins, three sesquiterpene alkaloids, four diterpenoids, and eight triterpenoids in these two similar herbs. The method validation results indicated that the developed method had desirable specificity, linearity, precision and accuracy. Quantitative analysis results showed that there were significant differences in the content of different types of compounds in T. wilfordii and T. hypoglaucum. Moreover, chemometrics methods such as one-way ANOVA, principal component analysis (PCA) and hierarchical clustering analysis (HCA) were performed to compare and discriminate the two Tripterygium herbs based on the quantitative data of analytes, and this proved a straightforward and reliable way to differentiate T. wilfordii and T. hypoglaucum samples from different origins. In conclusion, simultaneous quantification of multiple active components by RRLC-ESI-MS(n) coupled with chemometrics analysis could be a well-acceptable strategy to compare and evaluate the quality of T. wilfordii and T. hypoglaucum.

  15. A quantitative method for the evaluation of three-dimensional structure of temporal bone pneumatization

    PubMed Central

    Hill, Cheryl A.; Richtsmeier, Joan T.

    2010-01-01

Temporal bone pneumatization has been included in lists of characters used in phylogenetic analyses of human evolution. While studies suggest that the extent of pneumatization has decreased over the course of human evolution, little is known about the processes underlying these changes or their significance. In short, reasons for the observed reduction and the potential reorganization within pneumatized spaces are unknown. Technological constraints have limited previous analyses of pneumatization in extant and fossil species to qualitative observations of the extent of temporal bone pneumatization. In this paper, we introduce a novel application of quantitative methods developed for the study of trabecular bone to the analysis of pneumatized spaces of the temporal bone. This method utilizes high-resolution X-ray computed tomography (HRXCT) images and quantitative software to estimate three-dimensional parameters (bone volume fractions, anisotropy, and trabecular thickness) of bone structure within defined units of pneumatized spaces. We apply this approach in an analysis of temporal bones of diverse but related primate species, Gorilla gorilla, Pan troglodytes, Homo sapiens, and Papio hamadryas anubis, to illustrate the potential of these methods. In demonstrating their utility, we show that there are interspecific differences in the bone structure of pneumatized spaces, perhaps reflecting changes in the localized growth dynamics, location of muscle attachments, encephalization, or basicranial flexion. PMID:18715622
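Of the three parameters named above, the bone volume fraction (BV/TV) is the simplest to compute from a segmented HRXCT volume: it is just the proportion of bone voxels. The toy volume below, a solid cube containing one air cell, is invented for illustration; anisotropy and trabecular thickness require considerably more machinery.

```python
import numpy as np

def bone_volume_fraction(vol):
    """BV/TV for a binary HRXCT volume (1 = bone voxel, 0 = air/space)."""
    vol = np.asarray(vol, dtype=bool)
    return vol.sum() / vol.size

# toy 'pneumatized' volume: a 10x10x10 bone cube with a 6x6x6 air cell inside
vol = np.ones((10, 10, 10), dtype=np.uint8)
vol[2:8, 2:8, 2:8] = 0
bvtv = bone_volume_fraction(vol)   # 1 - 216/1000 = 0.784
```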

  16. Method Specific Calibration Corrects for DNA Extraction Method Effects on Relative Telomere Length Measurements by Quantitative PCR.

    PubMed

    Seeker, Luise A; Holland, Rebecca; Underwood, Sarah; Fairlie, Jennifer; Psifidi, Androniki; Ilska, Joanna J; Bagnall, Ainsley; Whitelaw, Bruce; Coffey, Mike; Banos, Georgios; Nussey, Daniel H

    2016-01-01

    Telomere length (TL) is increasingly being used as a biomarker in epidemiological, biomedical and ecological studies. A wide range of DNA extraction techniques have been used in telomere experiments and recent quantitative PCR (qPCR) based studies suggest that the choice of DNA extraction method may influence average relative TL (RTL) measurements. Such extraction method effects may limit the use of historically collected DNA samples extracted with different methods. However, if extraction method effects are systematic an extraction method specific (MS) calibrator might be able to correct for them, because systematic effects would influence the calibrator sample in the same way as all other samples. In the present study we tested whether leukocyte RTL in blood samples from Holstein Friesian cattle and Soay sheep measured by qPCR was influenced by DNA extraction method and whether MS calibration could account for any observed differences. We compared two silica membrane-based DNA extraction kits and a salting out method. All extraction methods were optimized to yield enough high quality DNA for TL measurement. In both species we found that silica membrane-based DNA extraction methods produced shorter RTL measurements than the non-membrane-based method when calibrated against an identical calibrator. However, these differences were not statistically detectable when a MS calibrator was used to calculate RTL. This approach produced RTL measurements that were highly correlated across extraction methods (r > 0.76) and had coefficients of variation lower than 10% across plates of identical samples extracted by different methods. Our results are consistent with previous findings that popular membrane-based DNA extraction methods may lead to shorter RTL measurements than non-membrane-based methods. 
However, we also demonstrate that these differences can be accounted for by using an extraction method-specific calibrator, offering researchers a simple means of accounting for DNA extraction method effects on RTL measurements.
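The logic of a method-specific calibrator can be seen in the standard 2^-ΔΔCt relative quantification formula: if an extraction method shifts Ct values systematically, it shifts the calibrator identically, and the offset cancels in the ratio. The Ct values below are hypothetical, and the study's exact RTL computation may differ in detail.

```python
def rtl(ct_telo, ct_scg, cal_ct_telo, cal_ct_scg):
    """Relative telomere length via 2^-ddCt: telomere vs single-copy-gene
    Ct values for the sample, normalised against a calibrator sample."""
    d_sample = ct_telo - ct_scg
    d_cal = cal_ct_telo - cal_ct_scg
    return 2.0 ** -(d_sample - d_cal)

# a systematic extraction-method offset (+0.4 cycles on the telomere assay)
# cancels when the calibrator was extracted with the same method:
base = rtl(14.0, 18.0, 15.0, 18.5)
shifted = rtl(14.4, 18.0, 15.4, 18.5)   # same samples, offset method
```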

  17. Method Specific Calibration Corrects for DNA Extraction Method Effects on Relative Telomere Length Measurements by Quantitative PCR

    PubMed Central

    Holland, Rebecca; Underwood, Sarah; Fairlie, Jennifer; Psifidi, Androniki; Ilska, Joanna J.; Bagnall, Ainsley; Whitelaw, Bruce; Coffey, Mike; Banos, Georgios; Nussey, Daniel H.

    2016-01-01

    Telomere length (TL) is increasingly being used as a biomarker in epidemiological, biomedical and ecological studies. A wide range of DNA extraction techniques have been used in telomere experiments and recent quantitative PCR (qPCR) based studies suggest that the choice of DNA extraction method may influence average relative TL (RTL) measurements. Such extraction method effects may limit the use of historically collected DNA samples extracted with different methods. However, if extraction method effects are systematic an extraction method specific (MS) calibrator might be able to correct for them, because systematic effects would influence the calibrator sample in the same way as all other samples. In the present study we tested whether leukocyte RTL in blood samples from Holstein Friesian cattle and Soay sheep measured by qPCR was influenced by DNA extraction method and whether MS calibration could account for any observed differences. We compared two silica membrane-based DNA extraction kits and a salting out method. All extraction methods were optimized to yield enough high quality DNA for TL measurement. In both species we found that silica membrane-based DNA extraction methods produced shorter RTL measurements than the non-membrane-based method when calibrated against an identical calibrator. However, these differences were not statistically detectable when a MS calibrator was used to calculate RTL. This approach produced RTL measurements that were highly correlated across extraction methods (r > 0.76) and had coefficients of variation lower than 10% across plates of identical samples extracted by different methods. Our results are consistent with previous findings that popular membrane-based DNA extraction methods may lead to shorter RTL measurements than non-membrane-based methods. 
However, we also demonstrate that these differences can be accounted for by using an extraction method-specific calibrator, offering researchers a simple means of accounting for DNA extraction method effects on RTL measurements.

  18. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  19. Quantitative Analysis of Single Amino Acid Variant Peptides Associated with Pancreatic Cancer in Serum by an Isobaric Labeling Quantitative Method

    PubMed Central

    2015-01-01

Single amino acid variations are highly associated with many human diseases. The direct detection of peptides containing single amino acid variants (SAAVs) derived from nonsynonymous single nucleotide polymorphisms (SNPs) in serum can provide unique opportunities for SAAV associated biomarker discovery. In the present study, an isobaric labeling quantitative strategy was applied to identify and quantify variant peptides in serum samples of pancreatic cancer patients and other benign controls. The largest set of SAAV peptides in serum to date, comprising 96 unique variant peptides, was quantified in this analysis; of these, five variant peptides showed a statistically significant difference between pancreatic cancer and other controls (p-value < 0.05). Significant differences in the variant peptide SDNCEDTPEAGYFAVAVVK from serotransferrin were detected between pancreatic cancer and controls, which was further validated by selected reaction monitoring (SRM) analysis. The novel biomarker panel obtained by combining α-1-antichymotrypsin (AACT), Thrombospondin-1 (THBS1) and this variant peptide showed an excellent diagnostic performance in discriminating pancreatic cancer from healthy controls (AUC = 0.98) and chronic pancreatitis (AUC = 0.90). These results suggest that large-scale analysis of SAAV peptides in serum may provide a new direction for biomarker discovery research. PMID:25393578

  20. The quantitative and qualitative recovery of Campylobacter from raw poultry using USDA and Health Canada methods.

    PubMed

    Sproston, E L; Carrillo, C D; Boulter-Bitzer, J

    2014-12-01

Harmonisation of methods between Canadian government agencies is essential to accurately assess and compare the prevalence and concentrations present on retail poultry intended for human consumption. The standard qualitative procedure used by Health Canada differs to that used by the USDA for both quantitative and qualitative methods. A comparison of three methods was performed on raw poultry samples obtained from an abattoir to determine if one method is superior to the others in isolating Campylobacter from chicken carcass rinses. The average percent of positive samples was 34.72% (95% CI, 29.2-40.2), 39.24% (95% CI, 33.6-44.9), 39.93% (95% CI, 34.3-45.6) for the US direct plating method, the US enrichment method and the Health Canada enrichment method, respectively. Overall there were significant differences when comparing either of the enrichment methods to the direct plating method using McNemar's chi-squared test. On comparison of weekly data (Fisher's exact test) direct plating was only inferior to the enrichment methods on a single occasion. Direct plating is important for enumeration and establishing the concentration of Campylobacter present on raw poultry. However, enrichment methods are also vital to identify positive samples where concentrations are below the detection limit for direct plating.
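For paired method comparisons like this, McNemar's test uses only the discordant pairs (samples positive by one method but not the other). A minimal sketch with hypothetical discordant counts, not the study's actual data:

```python
from math import sqrt, erf

def mcnemar_chi2(b, c, correction=True):
    """McNemar's chi-squared statistic for paired binary outcomes.
    b, c: counts of discordant pairs (method A+/B- and A-/B+)."""
    if b + c == 0:
        return 0.0
    num = (abs(b - c) - (1 if correction else 0)) ** 2
    return num / (b + c)

def chi2_pvalue_1df(x):
    # survival function of chi-square with 1 df: 2 * (1 - Phi(sqrt(x)))
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(sqrt(x) / sqrt(2.0))))

# e.g. 3 carcasses positive only by direct plating vs 16 positive only by
# enrichment (hypothetical counts in the spirit of the comparison above)
chi2 = mcnemar_chi2(3, 16)       # (|3-16|-1)^2 / 19 = 144/19
p = chi2_pvalue_1df(chi2)        # well below 0.05
```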

  1. Quantitative Decomposition of Dynamics of Mathematical Cell Models: Method and Application to Ventricular Myocyte Models.

    PubMed

    Shimayoshi, Takao; Cha, Chae Young; Amano, Akira

    2015-01-01

Mathematical cell models are effective tools to understand cellular physiological functions precisely. For detailed analysis of model dynamics in order to investigate how much each component affects cellular behaviour, mathematical approaches are essential. This article presents a numerical analysis technique, which is applicable to any complicated cell model formulated as a system of ordinary differential equations, to quantitatively evaluate contributions of respective model components to the model dynamics in the intact situation. The present technique employs a novel mathematical index for decomposed dynamics with respect to each differential variable, along with a concept named instantaneous equilibrium point, which represents the trend of a model variable at some instant. This article also illustrates applications of the method to comprehensive myocardial cell models for gaining insight into the mechanisms of action potential generation and the calcium transient. The analysis results exhibit quantitative contributions of individual channel gating mechanisms and ion exchanger activities to membrane repolarization and of calcium fluxes and buffers to the rise and fall of the cytosolic calcium level. These analyses quantitatively explicate the principles of the model, which leads to a better understanding of cellular dynamics.
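The basic idea of attributing an instantaneous rate of change to individual model components can be shown on a toy two-current membrane equation dV/dt = -(I_Na + I_K)/Cm, where each term is one component's contribution. The conductances and reversal potentials below are illustrative only, and the paper's index (with its instantaneous equilibrium point) is considerably richer than this term-by-term split.

```python
# toy instantaneous decomposition of dV/dt for a two-current membrane
Cm = 1.0                         # membrane capacitance, uF/cm^2
g_Na, E_Na = 1.2, 50.0           # conductance (mS/cm^2) and reversal (mV)
g_K,  E_K  = 0.36, -77.0
V = -20.0                        # membrane potential at this instant, mV

I_Na = g_Na * (V - E_Na)         # sodium current, -84.0
I_K  = g_K  * (V - E_K)          # potassium current, 20.52
terms = {"I_Na": -I_Na / Cm, "I_K": -I_K / Cm}   # per-component contribution
dVdt = sum(terms.values())       # contributions sum to the full derivative
```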

  2. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Atencio, James D.

    1984-01-01

A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.

  3. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Atencio, J.D.

    1982-03-31

A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.

  4. Summary of the workshop on issues in risk assessment: quantitative methods for developmental toxicology.

    PubMed

    Mattison, D R; Sandler, J D

    1994-08-01

    This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Products Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.
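The benchmark dose methodology mentioned in the closing sentence can be illustrated with one of its simplest dose-response forms. The quantal-linear model below is just one of several BMD models, and the fitted slope parameter is hypothetical; the BMD is the dose at which extra risk over background reaches the benchmark response (here 10%).

```python
from math import log

def bmd(beta, bmr=0.10):
    """Benchmark dose under a fitted quantal-linear model
    P(d) = p0 + (1 - p0) * (1 - exp(-beta * d)),
    whose extra risk is ER(d) = 1 - exp(-beta * d);
    solve ER(BMD) = bmr for the dose."""
    return -log(1.0 - bmr) / beta

# with a hypothetical fitted beta of 0.02 per mg/kg-day:
bmd10 = bmd(0.02)    # dose giving 10% extra risk, ~5.27 mg/kg-day
```

Unlike a NOAEL, this estimate uses the whole fitted curve rather than a single tested dose, which is the advantage the workshop participants favored.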

  5. Exploring the use of storytelling in quantitative research fields using a multiple case study method

    NASA Astrophysics Data System (ADS)

    Matthews, Lori N. Hamlet

    The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work and my personal experience with the subject matter was also used as a source of data according to the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks: the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight about why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes that emerged from this study and were consistent with the existing literature include: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts and topics for further research are presented as well.

  6. Exercise and diet affect quantitative trait loci for body weight and composition traits in an advanced intercross population of mice.

    PubMed

    Leamy, Larry J; Kelly, Scott A; Hua, Kunjie; Pomp, Daniel

    2012-12-01

    Driven by the recent obesity epidemic, interest in understanding the complex genetic and environmental basis of body weight and composition is great. We investigated this by searching for quantitative trait loci (QTLs) affecting a number of weight and adiposity traits in a G(10) advanced intercross population produced from crosses of mice in inbred strain C57BL/6J with those in a strain selected for high voluntary wheel running. The mice in this population were fed either a high-fat or a control diet throughout the study and also measured for four exercise traits prior to death, allowing us to test for pre- and postexercise QTLs as well as QTL-by-diet and QTL-by-exercise interactions. Our genome scan uncovered a number of QTLs, of which 40% replicated QTLs previously found for similar traits in an earlier (G(4)) generation. For those replicated QTLs, the confidence intervals were reduced from an average of 19 Mb in the G(4) to 8 Mb in the G(10). Four QTLs on chromosomes 3, 8, 13, and 18 were especially prominent in affecting the percentage of fat in the mice. A sizable fraction of the QTLs showed interactions with diet, exercise, or both, their genotypic effects on the traits showing a variety of patterns depending on the diet or level of exercise. It was concluded that the indirect effects of these QTLs provide an underlying genetic basis for the considerable variability in weight or fat loss typically found among individuals on the same diet and/or exercise regimen.

  7. QUANTITATIVE TRAIT LOCI FOR BONE MINERAL DENSITY AND FEMORAL MORPHOLOGY IN AN ADVANCED INTERCROSS POPULATION OF MICE

    PubMed Central

    Leamy, Larry J.; Kelly, Scott A.; Hua, Kunjie; Farber, Charles R.; Pomp, Daniel

    2013-01-01

    Osteoporosis, characterized by low levels of bone mineral density (BMD), is a prevalent medical condition in humans. We investigated its genetic and environmental basis by searching for quantitative trait loci (QTLs) affecting six skeletal (including three BMD) traits in a G10 advanced intercross population produced from crosses of mice from the inbred strain C57BL/6J with mice from a strain selected for high voluntary wheel running. The mice in this population were fed either a high-fat or a matched control diet throughout the study, allowing us to test for QTL-by-diet interactions for the skeletal traits. Our genome scan uncovered a number of QTLs, the great majority of which were different from QTLs previously found for these same traits in an earlier (G4) generation of the same intercross. Further, the confidence intervals for the skeletal trait QTLs were reduced from an average of 18.5 Mb in the G4 population to an equivalent of about 9 Mb in the G10 population. We uncovered a total of 50 QTLs representing 32 separate genomic sites affecting these traits, with a distal region on chromosome 1 harboring several QTLs with large effects on the BMD traits. One QTL was located on chromosome 5 at 4.0 Mb with a confidence interval spanning from 4.0 to 4.6 Mb. Only three protein coding genes reside in this interval, and one of these, Cyp51, is an attractive candidate as others have shown that developing Cyp51 knockout embryos exhibit shortened and bowed limbs and synostosis of the femur and tibia. Several QTLs showed significant interactions with sex, although only two QTLs interacted with diet, both affecting only mice fed the high-fat diet. PMID:23486184

  8. Quantitative interpretation of mineral hyperspectral images based on principal component analysis and independent component analysis methods.

    PubMed

    Jiang, Xiping; Jiang, Yu; Wu, Fang; Wu, Fenghuang

    2014-01-01

    Interpretation of mineral hyperspectral images provides large amounts of high-dimensional data, which is often complicated by mixed pixels. The quantitative interpretation of hyperspectral images is known to be extremely difficult when three types of information are unknown, namely, the number of pure pixels, the spectrum of pure pixels, and the mixing matrix. The problem is made even more complex by the disturbance of noise. The key to interpreting abstract mineral component information, i.e., pixel unmixing and abundance inversion, is how to effectively reduce noise, dimension, and redundancy. A three-step procedure is developed in this study for quantitative interpretation of hyperspectral images. First, the principal component analysis (PCA) method can be used to process the pixel spectrum matrix and keep characteristic vectors with larger eigenvalues. This can effectively reduce the noise and redundancy, which facilitates the abstraction of major component information. Second, the independent component analysis (ICA) method can be used to identify and unmix the pixels based on the linear mixed model. Third, the pure-pixel spectrums can be normalized for abundance inversion, which gives the abundance of each pure pixel. In numerical experiments, both simulation data and actual data were used to demonstrate the performance of our three-step procedure. Under simulation data, the results of our procedure were compared with theoretical values. Under the actual data measured from core hyperspectral images, the results obtained through our algorithm are compared with those of similar software (Mineral Spectral Analysis 1.0, Nanjing Institute of Geology and Mineral Resources). The comparisons show that our method is effective and can provide reference for quantitative interpretation of hyperspectral images.
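    The three-step procedure above can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: step 1 (PCA denoising) is realized as a truncated SVD reconstruction, and step 3 as a least-squares abundance inversion with non-negativity clipping and sum-to-one normalization; the ICA unmixing of step 2 and all array shapes are stand-ins.

```python
import numpy as np

def pca_denoise(X, k):
    """Step 1: keep the k principal components with the largest eigenvalues.
    X is a (pixels x bands) matrix of pixel spectra; returns the rank-k
    reconstruction, which suppresses noise and redundancy."""
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k] + mu

def abundance_inversion(S, M):
    """Step 3: least-squares abundances for pixel spectra S (pixels x bands)
    given an endmember matrix M (endmembers x bands), clipped to be
    non-negative and normalized to sum to one per pixel."""
    A, *_ = np.linalg.lstsq(M.T, S.T, rcond=None)
    A = np.clip(A.T, 0.0, None)
    return A / A.sum(axis=1, keepdims=True)
```

    In the full procedure, M would come from the ICA-based pixel unmixing (e.g. a FastICA decomposition of the denoised matrix) rather than being supplied directly.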

  9. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    PubMed

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data.
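    The model-free measures described above (lactate time-to-peak and the lactate-to-pyruvate area-under-the-curve ratio) can be computed directly from the dynamic signal curves. A sketch with hypothetical curves; the curve shapes and the 2 s sampling interval are assumptions, not data from the study:

```python
import numpy as np

def _trapezoid(y, x):
    """Trapezoidal-rule area under y(x)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def time_to_peak(t, lactate):
    """Time at which the lactate signal is maximal."""
    return float(t[np.argmax(lactate)])

def auc_ratio(t, lactate, pyruvate):
    """Lactate-to-pyruvate area-under-the-curve ratio."""
    return _trapezoid(lactate, t) / _trapezoid(pyruvate, t)

# Hypothetical dynamic curves sampled every 2 s over one minute
t = np.arange(0.0, 60.0, 2.0)
pyr = np.exp(-t / 20.0)               # injected pyruvate washing out
lac = (t / 15.0) * np.exp(-t / 15.0)  # lactate builds up, then decays
ttp, ratio = time_to_peak(t, lac), auc_ratio(t, lac, pyr)
```

    Both quantities need no kinetic model or inflow profile, which is what makes them attractive as simple surrogates for the full exchange analysis.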

  10. A new method for quantitative real-time polymerase chain reaction data analysis.

    PubMed

    Rao, Xiayu; Lai, Dejian; Huang, Xuelin

    2013-09-01

    Quantitative real-time polymerase chain reaction (qPCR) is a sensitive gene quantification method that has been extensively used in biological and biomedical fields. The currently used methods for PCR data analysis, including the threshold cycle method and linear and nonlinear model-fitting methods, all require subtracting background fluorescence. However, the removal of background fluorescence can hardly be accurate and therefore can distort results. We propose a new method, the taking-difference linear regression method, to overcome this limitation. Briefly, for each two consecutive PCR cycles, we subtract the fluorescence in the former cycle from that in the latter cycle, transforming the n cycle raw data into n-1 cycle data. Then, linear regression is applied to the natural logarithm of the transformed data. Finally, PCR amplification efficiencies and the initial DNA molecular numbers are calculated for each reaction. This taking-difference method avoids the error in subtracting an unknown background, and thus it is more accurate and reliable. This method is easy to perform, and this strategy can be extended to all current methods for PCR data analysis.
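    The taking-difference strategy can be sketched numerically. Assuming the signal model F_n = F0·E^n + background with a constant background (consistent with the description above), the consecutive differences cancel the background and leave a pure exponential, d_n = F0·(E-1)·E^n, whose logarithm is linear in the cycle number. The simulated efficiency and background values below are illustrative:

```python
import numpy as np

def taking_difference_fit(fluorescence):
    """Estimate amplification efficiency E and initial signal F0 from raw
    qPCR fluorescence, assuming F_n = F0 * E**n + background.
    Differencing consecutive cycles removes the constant background:
        d_n = F_{n+1} - F_n = F0 * (E - 1) * E**n,
    so ln(d_n) is linear in n with slope ln(E)."""
    d = np.diff(fluorescence)
    n = np.arange(len(d))
    slope, intercept = np.polyfit(n, np.log(d), 1)
    E = float(np.exp(slope))
    F0 = float(np.exp(intercept)) / (E - 1.0)
    return E, F0

# Simulated exponential-phase data with an unknown constant background
cycles = np.arange(12)
raw = 2.0 * 1.9**cycles + 50.0  # true E = 1.9, F0 = 2.0, background = 50
E, F0 = taking_difference_fit(raw)
```

    Because the background never has to be estimated, no background-subtraction error propagates into E or F0.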

  11. New Method for the Quantitative Analysis of Smear Slides in Pelagic and Hemi-Pelagic Sediments of the Bering Sea

    NASA Astrophysics Data System (ADS)

    Drake, M. K.; Aiello, I. W.; Ravelo, A. C.

    2014-12-01

    Petrographic microscopy of smear slides is the standard method to initially investigate marine sediments in core sediment studies (e.g. IODP expeditions). The technique is not commonly used in more complex analysis due to concerns over the subjectivity of the method and variability in operator training and experience. Two initiatives sponsored by Ocean Leadership, a sedimentology training workshop and a digital reference of smear slide components (Marsaglia et al., 2013), have been implemented to address the need for advanced training. While the influence of subjectivity on the quality of data has yet to be rigorously tested, the lack of standardization in the current method of smear slide analysis (SSA) remains a concern. The relative abundance of the three main components (total diatoms, silt-to-sand sized siliciclastics, and clay minerals) of high and low density Bering Sea hemi-pelagic sediments from the ocean margin (Site U144; Site U1339) and pelagic sediments from the open-ocean (Site U1340) were analyzed. Our analyses show visual estimation is a reproducible method to quantify the relative abundance of the main sediment components. Furthermore, we present a modified method for SSA, with procedural changes objectively guided by statistical analyses, including constraints to increase randomness and precision in both the preparation and analysis of the smear slide. For example, repeated-measures ANOVAs showed that a smear slide could be accurately quantified by counting three fields of view. Similarly, the use of replicate smear slides to quantify a sample was analyzed. Finally, the data produced from this modified SSA shows a strong correlation to continuously logged physical parameters of sediment such as gamma ray attenuation (Site U1339 r² = 0.41; Site U1340 r² = 0.36). Therefore, the modified SSA combined with other independent methods (e.g. laser particle size analysis, scanning electron microscopy, and physical properties) can be a very effective tool for the

  12. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, Leon J.; Cremers, David A.

    1985-01-01

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is demonstrated. Significant shortening of analysis time is achieved from those of the usual chemical techniques of analysis.

  13. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, L.J.; Cremers, D.A.

    1982-09-07

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids are described. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is shown. Significant shortening of analysis time is achieved from the usual chemical techniques of analysis.

  14. Collaborating to improve the use of free-energy and other quantitative methods in drug discovery

    NASA Astrophysics Data System (ADS)

    Sherborne, Bradley; Shanmugasundaram, Veerabahu; Cheng, Alan C.; Christ, Clara D.; DesJarlais, Renee L.; Duca, Jose S.; Lewis, Richard A.; Loughney, Deborah A.; Manas, Eric S.; McGaughey, Georgia B.; Peishoff, Catherine E.; van Vlijmen, Herman

    2016-12-01

    In May and August, 2016, several pharmaceutical companies convened to discuss and compare experiences with Free Energy Perturbation (FEP). This unusual synchronization of interest was prompted by Schrödinger's FEP+ implementation and offered the opportunity to share fresh studies with FEP and enable broader discussions on the topic. This article summarizes key conclusions of the meetings, including a path forward of actions for this group to aid the accelerated evaluation, application and development of free energy and related quantitative, structure-based design methods.

  15. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part II.

    PubMed

    Tavakol, Mohsen; Sandars, John

    2014-10-01

    Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.

  16. Collaborating to improve the use of free-energy and other quantitative methods in drug discovery.

    PubMed

    Sherborne, Bradley; Shanmugasundaram, Veerabahu; Cheng, Alan C; Christ, Clara D; DesJarlais, Renee L; Duca, Jose S; Lewis, Richard A; Loughney, Deborah A; Manas, Eric S; McGaughey, Georgia B; Peishoff, Catherine E; van Vlijmen, Herman

    2016-12-01

    In May and August, 2016, several pharmaceutical companies convened to discuss and compare experiences with Free Energy Perturbation (FEP). This unusual synchronization of interest was prompted by Schrödinger's FEP+ implementation and offered the opportunity to share fresh studies with FEP and enable broader discussions on the topic. This article summarizes key conclusions of the meetings, including a path forward of actions for this group to aid the accelerated evaluation, application and development of free energy and related quantitative, structure-based design methods.

  17. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part I.

    PubMed

    Tavakol, Mohsen; Sandars, John

    2014-09-01

    Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.

  18. Conceptual frameworks and methods for advancing invasion ecology.

    PubMed

    Heger, Tina; Pahl, Anna T; Botta-Dukát, Zoltan; Gherardi, Francesca; Hoppe, Christina; Hoste, Ivan; Jax, Kurt; Lindström, Leena; Boets, Pieter; Haider, Sylvia; Kollmann, Johannes; Wittmann, Meike J; Jeschke, Jonathan M

    2013-09-01

    Invasion ecology has much advanced since its early beginnings. Nevertheless, explanation, prediction, and management of biological invasions remain difficult. We argue that progress in invasion research can be accelerated by, first, pointing out difficulties this field is currently facing and, second, looking for measures to overcome them. We see basic and applied research in invasion ecology confronted with difficulties arising from (A) societal issues, e.g., disparate perceptions of invasive species; (B) the peculiarity of the invasion process, e.g., its complexity and context dependency; and (C) the scientific methodology, e.g., imprecise hypotheses. To overcome these difficulties, we propose three key measures: (1) a checklist for definitions to encourage explicit definitions; (2) implementation of a hierarchy of hypotheses (HoH), where general hypotheses branch into specific and precisely testable hypotheses; and (3) platforms for improved communication. These measures may significantly increase conceptual clarity and enhance communication, thus advancing invasion ecology.

  19. An ECL-PCR method for quantitative detection of point mutation

    NASA Astrophysics Data System (ADS)

    Zhu, Debin; Xing, Da; Shen, Xingyan; Chen, Qun; Liu, Jinfeng

    2005-04-01

    A new method for identification of point mutations was proposed. Polymerase chain reaction (PCR) amplification of a sequence from genomic DNA was followed by digestion with a restriction enzyme that cuts only the wild-type amplicon containing its recognition site. Reaction products were detected by electrochemiluminescence (ECL) assay after adsorption of the resulting DNA duplexes to the solid phase. One strand of the PCR products carries biotin and is bound to a streptavidin-coated microbead for sample selection. The other strand carries Ru(bpy)₃²⁺ (TBR), which reacts with tripropylamine (TPA) to emit light for ECL detection. The method was applied to detect a specific point mutation in the H-ras oncogene in the T24 cell line. The results show that the detection limit for the H-ras amplicon is 100 fmol and the linear range spans more than 3 orders of magnitude, thus making quantitative analysis possible. The genotype can be clearly discriminated. These results suggest that ECL-PCR is a feasible quantitative method for safe, sensitive and rapid detection of point mutations in human genes.

  20. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    NASA Astrophysics Data System (ADS)

    Gray, Jeffrey F.; Puri, Ashok

    2007-06-01

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hanchen effect based upon the real world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hanchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green’s function is defined which can be used to determine the effective transfer function of the near critical angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hanchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10⁻⁶, comparable with the recent results reported in the literature.

  1. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences in the different caries lesions focused on the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of wavebands at 565-750 nm to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.
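    The reported cut-offs for the image component ratio R/(G + B) translate directly into a grading function. A minimal sketch; how values falling exactly on a boundary are assigned is an assumption:

```python
def caries_grade(ratio):
    """Grade a tooth region from its image component ratio R/(G + B),
    using the cut-offs reported above: <0.66 sound, 0.66-1.06 early decay,
    1.06-1.62 established decay, >1.62 severe decay."""
    if ratio < 0.66:
        return "sound"
    if ratio < 1.06:
        return "early decay"
    if ratio < 1.62:
        return "established decay"
    return "severe decay"
```

    Applied per pixel, this turns a fluorescence color image into a numerical caries map.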

  2. A Method for Quantitative Phase Analysis of Nanocrystalline Zirconium Dioxide Polymorphs.

    PubMed

    Zhou, Zhiqiang; Guo, Li

    2015-04-01

    A method based on X-ray diffractometry was developed for quantitative phase analysis of nanocrystalline zirconium dioxide polymorphs, and the corresponding formulas were derived. The key factors therein were evaluated by rigorous theoretical calculation and fully verified by experimentation. An iterative procedure was introduced so that experimental verification could proceed despite the lack of pure ZrO2 polymorph samples. By this method, the weight ratio of tetragonal ZrO2 (t-ZrO2) to monoclinic ZrO2 (m-ZrO2) in any mixture containing nanocrystalline t-ZrO2 and m-ZrO2, or their weight fractions in a mixture composed of the two, can be determined from a single XRD test. Both theoretical calculation and experiment show that substituting t-ZrO2 and cubic ZrO2 (c-ZrO2) for one another over a wide range has almost no impact on the XRD patterns of their mixtures. Given this, and the similar properties of t-ZrO2 and c-ZrO2, the two can be treated as a single phase. The high agreement of the theoretical and experimental results in this work also proves the validity and reliability of the theoretical calculation based on X-ray diffractometry theory for such quantitative phase analysis. This method can potentially be extended to other materials.
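    The paper's own formulas are not reproduced in the abstract, but a generic direct-comparison (reference-intensity-ratio style) relation illustrates how a single intensity ratio yields both weight fractions in a two-phase mixture. The calibration constant K below is hypothetical, not the value derived in the paper:

```python
def phase_fractions(I_t, I_m, K=1.0):
    """Weight fractions of tetragonal and monoclinic ZrO2 in a two-phase
    mixture from integrated XRD peak intensities, assuming a
    direct-comparison relation w_t / w_m = K * (I_t / I_m).
    K is a hypothetical calibration constant."""
    r = K * I_t / I_m          # weight ratio w_t / w_m
    w_t = r / (1.0 + r)        # fractions sum to one
    return w_t, 1.0 - w_t
```

    With w_t + w_m = 1, a single measured ratio pins down both fractions, which is why one XRD test suffices.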

  3. Advanced materials and methods for next generation spintronics

    NASA Astrophysics Data System (ADS)

    Siegel, Gene Phillip

    The modern age is filled with ever-advancing electronic devices, and this dissertation continues the pursuit of faster, smaller, better electronics. Specifically, it addresses a field known as "spintronics": electronic devices based on an electron's spin, not just its charge. The field of spintronics originated in 1990 when Datta and Das first proposed a "spin transistor" that would function by passing a spin polarized current from a magnetic electrode into a semiconductor channel. The spins in the channel could then be manipulated by applying an electrical voltage across the gate of the device. However, it has since been found that a great amount of scattering occurs at the ferromagnet/semiconductor interface due to the large impedance mismatch that exists between the two materials. Because of this, three updated versions of the spin transistor were proposed to improve spin injection: one that used a ferromagnetic semiconductor electrode, one that added a tunnel barrier between the ferromagnet and semiconductor, and one that utilized a ferromagnetic tunnel barrier which would act like a spin filter. It was next proposed that it might be possible to achieve a "pure spin current", or a spin current with no concurrent electric current (i.e., no net flow of electrons). One such method is the spin Seebeck effect, discovered in 2008 by Uchida et al., in which a thermal gradient in a magnetic material generates a spin current which can be injected into an adjacent material as a pure spin current. The first section of this dissertation addresses this spin Seebeck effect (SSE). The goal was to create such a device that both performs better than previously reported devices and is capable of operating without the aid of an external magnetic field. We were successful in this endeavor. The trick to achieving both of these goals was found to be in the roughness of the magnetic layer. A rougher magnetic

  4. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, together with physicochemical studies using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for the quantitative HPLC determination. The impurities were detected by a newly developed gradient, reverse-phase high performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions to demonstrate the power of the newly developed HPLC method.

  5. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method.

    PubMed

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by modern spectroscopic techniques such as (1)H NMR and IR, together with physicochemical studies using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for the quantitative HPLC determination. The impurities were detected by a newly developed gradient, reverse-phase high performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions to demonstrate the power of the newly developed HPLC method.

  6. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Davis, H. B.

    2015-12-01

    The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method--the Quantitative Collaborative Impact Analysis Method--how it works, and the successes we've had with it in the NASA Astrobiology education community.

  7. A quantitative immunopolymerase chain reaction method for detection of vegetative insecticidal protein in genetically modified crops.

    PubMed

    Kumar, Rajesh

    2011-10-12

    Vegetative insecticidal protein (Vip) is being employed for transgenic expression in selected crops such as cotton, brinjal, and corn. For regulatory compliance, there is a need for a sensitive and reliable detection method, which can distinguish between approved and nonapproved genetically modified (GM) events and quantify GM contents as well. A quantitative immunopolymerase chain reaction (IPCR) method has been developed for the detection and quantification of Vip protein in GM crops. The developed assay displayed a detection limit of 1 ng/mL (1 ppb) and linear quantification range between 10 and 1000 ng/mL of Vip-S protein. The sensitivity of the assay was found to be 10 times higher than an analogous enzyme-linked immunosorbent assay for Vip-S protein. The results suggest that IPCR has the potential to become a standard method to quantify GM proteins.

  8. Method for quantitative estimation of position perception using a joystick during linear movement.

    PubMed

    Wada, Y; Tanaka, M; Mori, S; Chen, Y; Sumigama, S; Naito, H; Maeda, M; Yamamoto, M; Watanabe, S; Kajitani, N

    1996-12-01

    We designed a method for quantitatively estimating self-motion perceptions during passive body movement on a sled. The subjects were instructed to tilt a joystick in proportion to perceived displacement from a given starting position during linear movement with varying displacements of 4 m, 10 m and 16 m induced by constant acceleration of 0.02 g, 0.05 g and 0.08 g along the antero-posterior axis. With this method, we could monitor not only subjective position perceptions but also response latencies for the beginning (RLbgn) and end (RLend) of the linear movement. Perceived body position fitted Stevens' power law, R = kS^n (where R is the output of the joystick, k is a constant, S is the displacement during the linear movement, and n is an exponent). RLbgn decreased as linear acceleration increased. We conclude that this method is useful in analyzing the features and sensitivities of self-motion perceptions during movement.
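
    The power-law fit described above can be sketched as a linear regression in log-log coordinates, since R = kS^n implies log R = log k + n log S. The data values below are illustrative, not the study's measurements.

```python
import numpy as np

def fit_stevens_power_law(displacement, response):
    """Fit R = k * S**n by linear regression in log-log coordinates.

    Assumes strictly positive displacements and joystick responses.
    Returns the estimated (k, n).
    """
    log_s = np.log(np.asarray(displacement, dtype=float))
    log_r = np.log(np.asarray(response, dtype=float))
    n, log_k = np.polyfit(log_s, log_r, 1)  # slope = exponent n, intercept = log(k)
    return np.exp(log_k), n

# Illustrative joystick readings generated from R = 0.5 * S**0.8
S = np.array([4.0, 10.0, 16.0])  # displacements in metres, as in the protocol
R = 0.5 * S**0.8
k, n = fit_stevens_power_law(S, R)
```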

  9. Methods and Applications for Advancing Distance Education Technologies: International Issues and Solutions

    ERIC Educational Resources Information Center

    Syed, Mahbubur Rahman, Ed.

    2009-01-01

    The emerging field of advanced distance education delivers academic courses across time and distance, allowing educators and students to participate in a convenient learning method. "Methods and Applications for Advancing Distance Education Technologies: International Issues and Solutions" demonstrates communication technologies, intelligent…

  10. A Powerful and Robust Method for Mapping Quantitative Trait Loci in General Pedigrees

    PubMed Central

    Diao, G.; Lin, D. Y.

    2005-01-01

    The variance-components model is the method of choice for mapping quantitative trait loci in general human pedigrees. This model assumes normally distributed trait values and includes a major gene effect, random polygenic and environmental effects, and covariate effects. Violation of the normality assumption has detrimental effects on the type I error and power. One possible way of achieving normality is to transform trait values. The true transformation is unknown in practice, and different transformations may yield conflicting results. In addition, the commonly used transformations are ineffective in dealing with outlying trait values. We propose a novel extension of the variance-components model that allows the true transformation function to be completely unspecified. We present efficient likelihood-based procedures to estimate variance components and to test for genetic linkage. Simulation studies demonstrated that the new method is as powerful as the existing variance-components methods when the normality assumption holds; when the normality assumption fails, the new method still provides accurate control of type I error and is substantially more powerful than the existing methods. We performed a genomewide scan of monoamine oxidase B for the Collaborative Study on the Genetics of Alcoholism. In that study, the results that are based on the existing variance-components method changed dramatically when three outlying trait values were excluded from the analysis, whereas our method yielded essentially the same answers with or without those three outliers. The computer program that implements the new method is freely available. PMID:15918154

  11. A method for estimating the effective number of loci affecting a quantitative character.

    PubMed

    Slatkin, Montgomery

    2013-11-01

    A likelihood method is introduced that jointly estimates the number of loci and the additive effect of alleles that account for the genetic variance of a normally distributed quantitative character in a randomly mating population. The method assumes that measurements of the character are available from one or both parents and an arbitrary number of full siblings. The method uses the fact, first recognized by Karl Pearson in 1904, that the variance of a character among offspring depends on both the parental phenotypes and on the number of loci. Simulations show that the method performs well provided that data from a sufficient number of families (on the order of thousands) are available. This method assumes that the loci are in Hardy-Weinberg and linkage equilibrium but does not assume anything about the linkage relationships. It performs equally well if all loci are on the same non-recombining chromosome provided they are in linkage equilibrium. The method can be adapted to take account of loci already identified as being associated with the character of interest. In that case, the method estimates the number of loci not already known to affect the character. The method applied to measurements of crown-rump length in 281 family trios in a captive colony of African green monkeys (Chlorocebus aethiops sabaeus) estimates the number of loci to be 112 and the additive effect to be 0.26 cm. A parametric bootstrap analysis shows that a rough confidence interval has a lower bound of 14 loci.
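
    Pearson's observation that offspring variance depends on the number of loci can be illustrated with a small Monte Carlo sketch: for parents heterozygous at every locus, each additive locus contributes effect²/2 to the trait variance among offspring, so that conditional variance carries information about the number of loci. The simulation below is a toy illustration under these assumptions, not the paper's likelihood method.

```python
import random

def offspring_trait_variance(n_loci, effect, n_offspring=50_000, seed=1):
    """Monte Carlo sketch of Pearson's 1904 observation: with both parents
    heterozygous at every locus, each additive locus contributes
    effect**2 / 2 to the trait variance among offspring, so the conditional
    variance reveals the number of loci."""
    rng = random.Random(seed)
    values = []
    for _ in range(n_offspring):
        # each locus: the child draws one allele (0 or 1) from each parent
        dosage = sum(rng.randint(0, 1) + rng.randint(0, 1) for _ in range(n_loci))
        values.append(effect * dosage)
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

v1 = offspring_trait_variance(1, 1.0)   # close to 1**2 / 2 = 0.5
v4 = offspring_trait_variance(4, 1.0)   # close to 4 * 1**2 / 2 = 2.0
```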

  12. Quantitative measurement of speech sound distortions with the aid of minimum variance spectral estimation method for dentistry use.

    PubMed

    Bereteu, L; Drăgănescu, G E; Stănescu, D; Sinescu, C

    2011-12-01

    In this paper, we search for an adequate quantitative method, based on minimum variance spectral analysis, to reflect the dependence of speech quality on the correct positioning of dental prostheses. We also search for quantitative parameters that reflect the correct position of dental prostheses in a sensitive manner.

  13. Selective Weighted Least Squares Method for Fourier Transform Infrared Quantitative Analysis.

    PubMed

    Wang, Xin; Li, Yan; Wei, Haoyun; Chen, Xia

    2016-10-26

    Classical least squares (CLS) regression is a popular multivariate statistical method used frequently for quantitative analysis using Fourier transform infrared (FT-IR) spectrometry. Classical least squares provides the best unbiased estimator for uncorrelated residual errors with zero mean and equal variance. However, the noise in FT-IR spectra, which accounts for a large portion of the residual errors, is heteroscedastic. Thus, if this noise with zero mean dominates in the residual errors, the weighted least squares (WLS) regression method described in this paper is a better estimator than CLS. However, if bias errors, such as the residual baseline error, are significant, WLS may perform worse than CLS. In this paper, we compare the effect of noise and bias error in using CLS and WLS in quantitative analysis. Results indicated that for wavenumbers with low absorbance, the bias error significantly affected the error, such that the performance of CLS is better than that of WLS. However, for wavenumbers with high absorbance, the noise significantly affected the error, and WLS proves to be better than CLS. Thus, we propose a selective weighted least squares (SWLS) regression that processes data at different wavenumbers using either CLS or WLS based on a selection criterion, i.e., lower or higher than an absorbance threshold. The effects of various factors on the optimal threshold value (OTV) for SWLS have been studied through numerical simulations. These studies reported that: (1) the concentration and the analyte type had minimal effect on OTV; and (2) the major factor that influences OTV is the ratio between the bias error and the standard deviation of the noise. The last part of this paper is dedicated to quantitative analysis of methane gas spectra and methane/toluene mixture gas spectra as measured using FT-IR spectrometry with CLS, WLS, and SWLS, reporting the standard error of prediction (SEP), bias of prediction (bias), and the residual sum of squares of the errors.
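
    For a single analyte, the weighted least-squares estimator discussed above down-weights noisy wavenumbers by their variance. The sketch below shows only this WLS step with hypothetical spectra; the paper's SWLS additionally routes low-absorbance wavenumbers to unweighted CLS, which is not reproduced here.

```python
import numpy as np

def wls_concentration(absorbance, pure_spectrum, noise_std):
    """Weighted least-squares estimate of a single analyte concentration.

    Model: absorbance ≈ c * pure_spectrum, with heteroscedastic noise whose
    per-wavenumber standard deviation is noise_std. Weights are 1/variance,
    so quiet wavenumbers count more (illustrative sketch, not the SWLS of
    the paper).
    """
    w = 1.0 / np.asarray(noise_std, dtype=float) ** 2
    s = np.asarray(pure_spectrum, dtype=float)
    a = np.asarray(absorbance, dtype=float)
    return np.sum(w * s * a) / np.sum(w * s * s)

# Hypothetical 5-point spectrum of a pure component, scaled by c = 2.0
s = np.array([0.1, 0.4, 0.9, 0.4, 0.1])
a = 2.0 * s
sigma = np.array([0.01, 0.02, 0.05, 0.02, 0.01])
c_hat = wls_concentration(a, s, sigma)  # noiseless data, so this recovers 2.0
```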

  14. Spatial Access Priority Mapping (SAPM) with Fishers: A Quantitative GIS Method for Participatory Planning

    PubMed Central

    Yates, Katherine L.; Schoeman, David S.

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers’ spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers’ willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process.

  15. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    SciTech Connect

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary; Geller, Jil; Fisher, Susan; Hall, Steven; Hazen, Terry C.; Brenner, Steven; Butland, Gareth; Jin, Jian; Witkowska, H. Ewa; Chandonia, John-Marc; Biggin, Mark D.

    2016-04-20

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.

  16. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions*

    PubMed Central

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary E.; Geller, Jil T.; Fisher, Susan J.; Hall, Steven C.; Hazen, Terry C.; Brenner, Steven E.; Butland, Gareth; Jin, Jian; Witkowska, H. Ewa; Chandonia, John-Marc; Biggin, Mark D.

    2016-01-01

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR. PMID:27099342

  17. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    DOE PAGES

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; ...

    2016-04-20

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.

  18. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    PubMed

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process.

  19. TU-G-303-00: Radiomics: Advances in the Use of Quantitative Imaging Used for Predictive Modeling

    SciTech Connect

    2015-06-15

    ‘Radiomics’ refers to studies that extract a large amount of quantitative information from medical imaging studies as a basis for characterizing a specific aspect of patient health. Radiomics models can be built to address a wide range of outcome predictions, clinical decisions, basic cancer biology, etc. For example, radiomics models can be built to predict the aggressiveness of an imaged cancer, cancer gene expression characteristics (radiogenomics), radiation therapy treatment response, etc. Technically, radiomics brings together quantitative imaging, computer vision/image processing, and machine learning. In this symposium, speakers will discuss approaches to radiomics investigations, including: longitudinal radiomics, radiomics combined with other biomarkers (‘pan-omics’), radiomics for various imaging modalities (CT, MRI, and PET), and the use of registered multi-modality imaging datasets as a basis for radiomics. There are many challenges to the eventual use of radiomics-derived methods in clinical practice, including: standardization and robustness of selected metrics, accruing the data required, building and validating the resulting models, registering longitudinal data that often involve significant patient changes, reliable automated cancer segmentation tools, etc. Despite the hurdles, results achieved so far indicate the tremendous potential of this general approach to quantifying and using data from medical images. Specific applications of radiomics to be presented in this symposium will include: the longitudinal analysis of patients with low-grade gliomas; automatic detection and assessment of patients with metastatic bone lesions; image-based monitoring of patients with growing lymph nodes; predicting radiotherapy outcomes using multi-modality radiomics; and studies relating radiomics with genomics in lung cancer and glioblastoma. Learning Objectives: Understanding the basic image features that are often used in radiomic models. Understanding

  20. An Improved Flow Cytometry Method For Precise Quantitation Of Natural-Killer Cell Activity

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Nehlsen-Cannarella, Sandra; Sams, Clarence

    2006-01-01

    The ability to assess NK cell cytotoxicity using flow cytometry has been previously described and can serve as a powerful tool to evaluate effector immune function in the clinical setting. Previous methods used membrane-permeable dyes to identify target cells. The use of these dyes requires great care to achieve optimal staining and results in a broad spectral emission that can make multicolor cytometry difficult. Previous methods have also used negative staining (the elimination of target cells) to identify effector cells. This makes a precise quantitation of effector NK cells impossible due to the interfering presence of T and B lymphocytes, and renders the data highly sensitive to the variable levels of NK cells normally found in human peripheral blood. In this study an improved version of the standard flow cytometry assay for NK activity is described that has several advantages over previous methods. Fluorescent antibody staining (CD45-FITC) is used to positively identify target cells in place of membrane-permeable dyes. Fluorescent antibody staining of target cells is less labor intensive and more easily reproducible than membrane dyes. NK cells (true effector lymphocytes) are also positively identified by fluorescent antibody staining (CD56-PE), allowing a simultaneous absolute count assessment of both NK cells and target cells. Dead cells are identified by membrane disruption using the DNA-intercalating dye PI. Using this method, an exact NK:target ratio may be determined for each assessment, including quantitation of NK-target complexes. Back-immunoscatter gating may be used to track live vs. dead target cells via scatter properties. If desired, NK activity may then be normalized to standardized ratios for clinical comparisons between patients, making the determination of PBMC counts or NK cell percentages prior to testing unnecessary. This method provides an exact cytometric determination of NK activity that is highly reproducible and may be suitable for routine use in the clinical setting.

  1. Comparison of Analytic Methods for Quantitative Real-Time Polymerase Chain Reaction Data

    PubMed Central

    Chen, Ping

    2015-01-01

    Polymerase chain reaction (PCR) is a laboratory procedure to amplify and simultaneously quantify targeted DNA molecules, and then detect the product of the reaction at the end of all the amplification cycles. A more modern technique, real-time PCR, also known as quantitative PCR (qPCR), detects the product after each cycle of the progressing reaction by applying a specific fluorescence technique. The quantitative methods currently used to analyze qPCR data result in varying levels of estimation quality. This study compares the accuracy and precision of the estimation achieved by eight different models when applied to the same qPCR dataset. Also, the study evaluates a newly introduced data preprocessing approach, the taking-the-difference approach, and compares it to the currently used approach of subtracting the background fluorescence. The taking-the-difference method subtracts the fluorescence in the former cycle from that in the latter cycle to avoid estimating the background fluorescence. The results obtained from the eight models show that taking-the-difference is a better way to preprocess qPCR data compared to the original approach because of a reduction in the background estimation error. The results also show that weighted models are better than non-weighted models, and that the precision of the estimation achieved by the mixed models is slightly better than that achieved by the linear regression models. PMID:26204477
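
    The taking-the-difference preprocessing described above can be sketched in a few lines: differencing consecutive cycles cancels a constant background term while preserving the per-cycle amplification factor. The toy fluorescence curve below is illustrative only.

```python
import numpy as np

def take_difference(fluorescence):
    """'Taking-the-difference' preprocessing: subtract each cycle's fluorescence
    from the next cycle's, cancelling any constant background term without
    having to estimate it."""
    return np.diff(np.asarray(fluorescence, dtype=float))

# Toy curve (hypothetical values): constant background 0.3 plus exponential growth
cycles = np.arange(10)
raw = 0.3 + 0.01 * 2.0**cycles
d = take_difference(raw)
# Differences of a pure exponential keep the same per-cycle growth factor,
# so the amplification efficiency survives while the background cancels.
ratios = d[1:] / d[:-1]  # each ratio equals the amplification factor, 2.0
```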

  2. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    PubMed

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of numerous research studies worldwide. In this study, biofilm formation by P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the Dr. Antoni Jurasz University Hospital No. 1 clinics in Bydgoszcz between 2011 and 2012 were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods with TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet) application. The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative method with TTC, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in ability to form biofilm observed between P. mirabilis strains derived from the urine of the catheterized and non-catheterized patients were not statistically significant.

  3. Proteus mirabilis biofilm - Qualitative and quantitative colorimetric methods-based evaluation

    PubMed Central

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of numerous research studies worldwide. In this study, biofilm formation by P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the Dr. Antoni Jurasz University Hospital No. 1 clinics in Bydgoszcz between 2011 and 2012 were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods with TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet) application. The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative method with TTC, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in ability to form biofilm observed between P. mirabilis strains derived from the urine of the catheterized and non-catheterized patients were not statistically significant. PMID:25763050

  4. Comparison of analytic methods for quantitative real-time polymerase chain reaction data.

    PubMed

    Chen, Ping; Huang, Xuelin

    2015-11-01

    Polymerase chain reaction (PCR) is a laboratory procedure to amplify and simultaneously quantify targeted DNA molecules, and then detect the product of the reaction at the end of all the amplification cycles. A more modern technique, real-time PCR, also known as quantitative PCR (qPCR), detects the product after each cycle of the progressing reaction by applying a specific fluorescence technique. The quantitative methods currently used to analyze qPCR data result in varying levels of estimation quality. This study compares the accuracy and precision of the estimation achieved by eight different models when applied to the same qPCR dataset. Also, the study evaluates a newly introduced data preprocessing approach, the taking-the-difference approach, and compares it to the currently used approach of subtracting the background fluorescence. The taking-the-difference method subtracts the fluorescence in the former cycle from that in the latter cycle to avoid estimating the background fluorescence. The results obtained from the eight models show that taking-the-difference is a better way to preprocess qPCR data compared to the original approach because of a reduction in the background estimation error. The results also show that weighted models are better than non-weighted models, and that the precision of the estimation achieved by the mixed models is slightly better than that achieved by the linear regression models.

  5. Dynamic and Quantitative Method of Analyzing Service Consistency Evolution Based on Extended Hierarchical Finite State Automata

    PubMed Central

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in a service's evolution behaviors such as production, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties for traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA); it formally depicts the overall changing processes of service consistency states. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, noncomposition of atomic services (13.12%) is the second most influential, and service version confusion (1.2%) is the least influential. Compared with previous qualitative analyses, the SCEAs show good effectiveness and feasibility. This research can guide engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA. PMID:24772033

  6. Quantitative methods in the tuberculosis epidemiology and in the evaluation of BCG vaccination programs.

    PubMed

    Lugosi, L

    1986-01-01

    Controversies concerning the protective efficacy of the BCG vaccination result mostly from the fact that quantitative methods have not been used in the evaluation of the BCG programs. Therefore, to eliminate the current controversy, an unconditional requirement is to apply valid biostatistical models to analyse the results of the BCG programs. In order to achieve objective statistical inferences and epidemiological interpretations, the following conditions should be fulfilled: data for evaluation have to be taken from epidemiological trials free of sampling error; since the morbidity rates are not normally distributed, an appropriate normalizing transformation is needed for point and confidence interval estimation; only unbiased point estimates (dependent variables) can be used in valid models for hypothesis tests; and in cases of a rejected null hypothesis, the ranked estimates of the compared groups must be evaluated with a multiple comparison model in order to diminish the Type I error in the decision. The following quantitative methods are presented to evaluate the effectiveness of BCG vaccination in Hungary: linear regression analysis, stepwise regression analysis and log-linear analysis.

  7. Dynamic and quantitative method of analyzing service consistency evolution based on extended hierarchical finite state automata.

    PubMed

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in a service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties for traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depicts the overall changing processes of service consistency states. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, non-composition of atomic services (13.12%) the second, and service version confusion (1.2%) the least. Compared with previous qualitative analysis, the SCEAs show good effectiveness and feasibility. This research can guide engineers of service consistency technologies toward a higher level of consistency in SODSA.
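
The core idea of modelling a service lifecycle as a finite state automaton can be sketched minimally in Python. The state names echo the behaviors listed in the abstract (producing, publishing, calling, maintenance), but the transition table below is an assumption for illustration, not the paper's EHS-FSA:

```python
# Illustrative transition table: (current state, event) -> next state.
TRANSITIONS = {
    ("produced", "publish"): "published",
    ("published", "call"): "running",
    ("running", "maintain"): "maintenance",
    ("maintenance", "publish"): "published",
}

def step(state, event):
    """Advance the service FSA by one event; reject illegal transitions."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"no transition from {state!r} on {event!r}")

def run(events, start="produced"):
    """Replay a sequence of lifecycle events from the initial state."""
    state = start
    for ev in events:
        state = step(state, ev)
    return state
```

The paper's hierarchical extension nests automata like this one inside composite service states.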

  8. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method.

    PubMed

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high-energy fast neutrons from a radioisotope and their slowing-down by light hydrogen atoms, is a useful technique for the non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume and is not affected by temperature, pressure, pH value or color. The most common choices for a radioisotope neutron source are (252)Cf and (241)Am-Be. In this study, (252)Cf with a neutron flux of 6.3 x 10(6) n/s was used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra were obtained using an in-house built radioisotopic neutron spectrometric system equipped with a (3)He detector and multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of approximately 0.947 g/cc and area of 40 cm x 25 cm) was used for the determination of hydrogen content using multivariate calibration models depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered better performance in quantitative data analysis. It also showed that the PLSR method in a neutron spectrometric system is promising for the real-time, online monitoring of powder processes to determine the content of any type of molecule containing hydrogen nuclei.

  9. Advanced 3D inverse method for designing turbomachine blades

    SciTech Connect

    Dang, T.

    1995-10-01

    To meet the goal of 60% plant-cycle efficiency or better set in the ATS Program for baseload utility scale power generation, several critical technologies need to be developed. One such need is the improvement of component efficiencies. This work addresses the issue of improving the performance of turbo-machine components in gas turbines through the development of an advanced three-dimensional and viscous blade design system. This technology is needed to replace some elements in current design systems that are based on outdated technology.

  10. Advancing digital methods in the fight against communicable diseases.

    PubMed

    Chabot-Couture, Guillaume; Seaman, Vincent Y; Wenger, Jay; Moonen, Bruno; Magill, Alan

    2015-03-01

    Important advances are being made in the fight against communicable diseases by using new digital tools. While they can be a challenge to deploy at scale, GPS-enabled smartphones, electronic dashboards and computer models have multiple benefits. They can facilitate program operations, lead to new insights about disease transmission and support strategic planning. Today, tools such as these are used to vaccinate more children against polio in Nigeria, reduce the malaria burden in Zambia and help predict the spread of the Ebola epidemic in West Africa.

  11. Quantitation of mRNA levels of steroid 5alpha-reductase isozymes: a novel method that combines quantitative RT-PCR and capillary electrophoresis.

    PubMed

    Torres, Jesús M; Ortega, Esperanza

    2004-01-01

    A novel, accurate, rapid and modestly labor-intensive method has been developed to quantitate specific mRNA species by reverse transcription-polymerase chain reaction (RT-PCR). This strategy combines the high degree of specificity of competitive PCR with the sensitivity of laser-induced fluorescence capillary electrophoresis (LIF-CE). The specific target mRNA and a mimic DNA fragment, used as an internal standard (IS), were co-amplified in a single reaction in which the same primers are used. The amount of mRNA was then quantitated by extrapolation from the standard curve generated with the internal standard. PCR primers were designed to amplify both a 185 bp fragment of the target cDNA for steroid 5alpha-reductase 1 (5alpha-R1) and a 192 bp fragment of the target cDNA for steroid 5alpha-reductase type 2 (5alpha-R2). The 5' forward primers were end-labeled with 6-carboxy-fluorescein (6-FAM). Two synthetic internal standard DNAs of 300 bp were synthesized from the sequence of plasmid pEGFP-C1. The ratio of fluorescence intensity between amplified products of the target cDNA (185 or 192 bp fragments) and the competitive DNA (300 bp fragment) was determined quantitatively after separation by capillary electrophoresis and fluorescence analysis. The accurate quantitation of low-abundance mRNAs by the present method allows low-level gene expression to be characterized.
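
The quantitation step, extrapolating an unknown mRNA amount from the measured target/internal-standard fluorescence ratio via a standard curve, can be sketched as follows. The curve values and the log-log linear form are hypothetical, chosen only to illustrate the extrapolation:

```python
import math

def fit_line(xs, ys):
    """Least-squares line; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical standard curve: log10(target/IS fluorescence ratio)
# versus log10(known target amount, arbitrary units).
log_amount = [0.0, 1.0, 2.0, 3.0]
log_ratio = [-2.0, -1.0, 0.0, 1.0]

a, b = fit_line(log_amount, log_ratio)

def quantify(ratio):
    """Invert the standard curve: measured target/IS ratio -> amount."""
    return 10 ** ((math.log10(ratio) - a) / b)
```

Because target and mimic share primers, the ratio compensates for well-to-well variation in amplification efficiency.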

  12. Design of primers and probes for quantitative real-time PCR methods.

    PubMed

    Rodríguez, Alicia; Rodríguez, Mar; Córdoba, Juan J; Andrade, María J

    2015-01-01

    Design of primers and probes is one of the most crucial factors affecting the success and quality of quantitative real-time PCR (qPCR) analyses, since accurate and reliable quantification depends on using efficient primers and probes. The design of primers and probes should meet several criteria to find potential primers and probes for specific qPCR assays. The formation of primer-dimers and other non-specific products should be avoided or reduced. This factor is especially important when designing primers for SYBR(®) Green protocols, but also when designing probes to ensure specificity of the developed qPCR protocol. To design primers and probes for qPCR, multiple software programs and websites are available, many of them free. These tools often consider the default requirements for primers and probes, although new research advances in primer and probe design should be progressively added to the different algorithm programs. After a proper design, a precise validation of the primers and probes is necessary. Specific considerations should be taken into account when designing primers and probes for multiplex qPCR and reverse transcription qPCR (RT-qPCR). This chapter provides guidelines for the design of suitable primers and probes and their subsequent validation through the development of singlex qPCR, multiplex qPCR, and RT-qPCR protocols.

  13. Classification methods for noise transients in advanced gravitational-wave detectors II: performance tests on Advanced LIGO data

    NASA Astrophysics Data System (ADS)

    Powell, Jade; Torres-Forné, Alejandro; Lynch, Ryan; Trifirò, Daniele; Cuoco, Elena; Cavaglià, Marco; Heng, Ik Siong; Font, José A.

    2017-02-01

    The data taken by the advanced LIGO and Virgo gravitational-wave detectors contains short duration noise transients that limit the significance of astrophysical detections and reduce the duty cycle of the instruments. As the advanced detectors are reaching sensitivity levels that allow for multiple detections of astrophysical gravitational-wave sources it is crucial to achieve a fast and accurate characterization of non-astrophysical transient noise shortly after it occurs in the detectors. Previously we presented three methods for the classification of transient noise sources. They are Principal Component Analysis for Transients (PCAT), Principal Component LALInference Burst (PC-LIB) and Wavelet Detection Filter with Machine Learning (WDF-ML). In this study we carry out the first performance tests of these algorithms on gravitational-wave data from the Advanced LIGO detectors. We use the data taken between the 3rd of June 2015 and the 14th of June 2015 during the 7th engineering run (ER7), and outline the improvements made to increase the performance and lower the latency of the algorithms on real data. This work provides an important test for understanding the performance of these methods on real, non-stationary data in preparation for the second advanced gravitational-wave detector observation run, planned for later this year. We show that all methods can classify transients in non-stationary data with a high level of accuracy and show the benefits of using multiple classifiers.

  14. A Rapid and Quantitative Flow Cytometry Method for the Analysis of Membrane Disruptive Antimicrobial Activity

    PubMed Central

    O’Brien-Simpson, Neil M.; Pantarat, Namfon; Attard, Troy J.; Walsh, Katrina A.; Reynolds, Eric C.

    2016-01-01

    We describe a microbial flow cytometry method that quantifies antimicrobial peptide (AMP) activity within 3 hours, termed the Minimum Membrane Disruptive Concentration (MDC). Increasing peptide concentration positively correlates with the extent of bacterial membrane disruption, and the calculated MDC is equivalent to the MBC. The activity of AMPs representing three different membranolytic modes of action could be determined for a range of Gram-positive and Gram-negative bacteria, including the ESKAPE pathogens, E. coli and MRSA. By using the MDC50 concentration of the parent AMP, the method provides high-throughput, quantitative screening of AMP analogues. A unique feature of the MDC assay is that it directly measures peptide/bacteria interactions and lysed cell numbers rather than bacterial survival as with MIC and MBC assays. With the threat of multi-drug resistant bacteria, this high-throughput MDC assay has the potential to aid in the development of novel antimicrobials that target bacteria with improved efficacy. PMID:26986223

  15. Quantitative analysis of uranium in aqueous solutions using a semiconductor laser-based spectroscopic method.

    PubMed

    Cho, Hye-Ryun; Jung, Euo Chang; Cha, Wansik; Song, Kyuseok

    2013-05-07

    A simple analytical method, based on the simultaneous measurement of the luminescence of hexavalent uranium ions (U(VI)) and the Raman scattering of water, was investigated for determining the concentration of U(VI) in aqueous solutions. Both spectra were measured using a cw semiconductor laser beam at a center wavelength of 405 nm. The empirical calibration curve for the quantitative analysis of U(VI) was obtained by measuring the ratio of the luminescence intensity of U(VI) at 519 nm to the Raman scattering intensity of water at 469 nm. The limit of detection (LOD) in the parts per billion range and a dynamic range from the LOD up to several hundred parts per million were achieved. The concentration of uranium in groundwater determined by this method is in good agreement with the results determined by kinetic phosphorescence analysis and inductively coupled plasma mass spectrometry.

  16. Enantiomer labelling, a method for the quantitative analysis of amino acids.

    PubMed

    Frank, H; Nicholson, G J; Bayer, E

    1978-12-21

    Enantiomer labelling, a method for the quantitative analysis of optically active natural compounds by gas chromatography, involves the use of the unnatural enantiomer as an internal standard. With Chirasil-Val, a chiral stationary phase that is thermally stable up to 240 degrees, the enantiomers of amino acids and a variety of other compounds can be separated and quantitated. Incomplete recovery from the sample, incomplete derivatization, hydrolysis and thermal decomposition of the derivative and shifting response factors can be compensated for by adding the unnatural enantiomer. The accuracy of amino acid analysis by enantiomer labelling is equal or superior to that of hitherto known methods. The procedure affords a complete analysis of peptides with respect to both amino acid composition and the optical purity of each amino acid.
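
The internal-standard arithmetic behind enantiomer labelling is simple; a sketch with hypothetical peak areas follows. The premise of the method is that both enantiomers behave identically through work-up, derivatization and detection, so the area ratio directly scales the known spike:

```python
def quantify_by_enantiomer_label(area_L, area_D, amount_D_added_nmol):
    """Amount of the natural L-amino acid, given the peak areas of both
    enantiomers from chiral GC and the known amount of the unnatural
    D-enantiomer spiked in as internal standard. Assumes equal detector
    response for the two enantiomers (the method's premise)."""
    return amount_D_added_nmol * (area_L / area_D)

# Hypothetical peak areas from a Chirasil-Val separation.
amount_L = quantify_by_enantiomer_label(area_L=1520.0, area_D=760.0,
                                        amount_D_added_nmol=5.0)
```

Any loss during hydrolysis or derivatization affects both peaks equally and cancels in the ratio.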

  17. Simple saponification method for the quantitative determination of carotenoids in green vegetables.

    PubMed

    Larsen, Erik; Christensen, Lars P

    2005-08-24

    A simple, reliable, and gentle saponification method for the quantitative determination of carotenoids in green vegetables was developed. The method involves an extraction procedure with acetone and the selective removal of the chlorophylls and esterified fatty acids from the organic phase using a strongly basic resin (Ambersep 900 OH). Extracts from common green vegetables (beans, broccoli, green bell pepper, chive, lettuce, parsley, peas, and spinach) were analyzed by high-performance liquid chromatography (HPLC) for their content of major carotenoids before and after action of Ambersep 900 OH. The mean recovery percentages for most carotenoids [(all-E)-violaxanthin, (all-E)-lutein epoxide, (all-E)-lutein, neolutein A, and (all-E)-beta-carotene] after saponification of the vegetable extracts with Ambersep 900 OH were close to 100% (99-104%), while the mean recovery percentages of (9'Z)-neoxanthin increased to 119% and that of (all-E)-neoxanthin and neolutein B decreased to 90% and 72%, respectively.

  18. Comparison of QIAGEN automated nucleic acid extraction methods for CMV quantitative PCR testing.

    PubMed

    Miller, Steve; Seet, Henrietta; Khan, Yasmeen; Wright, Carolyn; Nadarajah, Rohan

    2010-04-01

    We examined the effect of nucleic acid extraction methods on the analytic characteristics of a quantitative polymerase chain reaction (PCR) assay for cytomegalovirus (CMV). Human serum samples were extracted with 2 automated instruments (BioRobot EZ1 and QIAsymphony SP, QIAGEN, Valencia, CA) and CMV PCR results compared with those of pp65 antigenemia testing. Both extraction methods yielded results that were comparably linear and precise, whereas the QIAsymphony SP had a slightly lower limit of detection (1.92 log(10) copies/mL vs 2.26 log(10) copies/mL). In both cases, PCR was more sensitive than CMV antigen detection, detecting CMV viremia in 12% (EZ1) and 21% (QIAsymphony) of antigen-negative specimens. This study demonstrates the feasibility of using 2 different extraction techniques to yield results within 0.5 log(10) copies/mL of the mean value, a level that would allow for clinical comparison between different laboratory assays.
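
The 0.5 log10 copies/mL comparability criterion mentioned above translates directly into code; the example loads are hypothetical:

```python
import math

def within_half_log(copies_a, copies_b, tol_log10=0.5):
    """True if two viral-load results (copies/mL) agree within
    tol_log10 on the log10 scale."""
    return abs(math.log10(copies_a) - math.log10(copies_b)) <= tol_log10

# e.g. 10,000 vs 25,000 copies/mL differ by ~0.40 log10: comparable;
# 10,000 vs 50,000 copies/mL differ by ~0.70 log10: not comparable.
```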

  19. Quantitative methods of measuring the sensitivity of the mouse sperm morphology assay

    SciTech Connect

    Moore, D.H.; Bennett, D.E.; Kranzler, D.; Wyrobek, A.J.

    1982-09-01

    In this study murine sperm were subjected to graded doses of X irradiation (0 to 120 rad) to determine whether quantitative measurements made on enlarged photographs of the sperm heads are related to radiation dose. We found that the Mahalanobis distance statistic, when used to measure distance in a multivariate space from a control group of measurements, could be used to classify sperm as normal or abnormal. The percent classified as abnormal by this method was found to be linearly related to dose. The results suggest that sensitivity of the murine sperm assay can be improved by selecting an optimal set of measurements. This improvement can reduce the doubling dose from approximately 70 rad to 10 to 15 rad while keeping the percentage of abnormal sperm in control mice at 3%, equal to the current visual method.
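
The Mahalanobis-distance classification against a control group can be sketched for the two-measurement case; the control measurements and decision threshold below are hypothetical, not the study's data:

```python
import math

def mean_cov_2d(points):
    """Sample mean and covariance of 2-D points [(x, y), ...]."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    return (mx, my), ((sxx, sxy), (sxy, syy))

def mahalanobis_2d(p, mean, cov):
    """Mahalanobis distance of p from the control-group distribution,
    using the closed-form inverse of the 2x2 covariance matrix."""
    (sxx, sxy), (_, syy) = cov
    det = sxx * syy - sxy * sxy
    dx, dy = p[0] - mean[0], p[1] - mean[1]
    d2 = (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
    return math.sqrt(max(d2, 0.0))  # guard against tiny negative rounding

# Hypothetical control measurements of sperm-head length/width (a.u.).
controls = [(9.0, 5.0), (9.2, 5.1), (8.8, 4.9),
            (9.1, 5.0), (9.0, 5.2), (8.9, 4.8)]
mean, cov = mean_cov_2d(controls)

def classify(p, threshold=3.0):
    """Label a sperm head by its distance from the control cloud."""
    return "abnormal" if mahalanobis_2d(p, mean, cov) > threshold else "normal"
```

The fraction classified "abnormal" per dose group is then the dose-response variable the study regresses on.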

  20. Intentional Movement Performance Ability (IMPA): a method for robot-aided quantitative assessment of motor function.

    PubMed

    Shin, Sung Yul; Kim, Jung Yoon; Lee, Sanghyeop; Lee, Junwon; Kim, Seung-Jong; Kim, ChangHwan

    2013-06-01

    The purpose of this paper is to propose a new assessment method for evaluating the motor function of patients who are suffering from physical weakness after stroke, incomplete spinal cord injury (iSCI) or other diseases. In this work, we use a robotic device to obtain information on the interaction that occurs between the patient and the robot, and use it as a measure for assessing the patients. The Intentional Movement Performance Ability (IMPA) is defined as the root mean square of the interactive torque while the subject performs a given periodic movement with the robot. IMPA is proposed to quantitatively determine the level of the subject's impaired motor function. The method is indirectly tested by asking healthy subjects to lift a barbell to disturb their motor function. The experimental results show that the IMPA has the potential to provide proper information on the subject's motor function level.
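
Since the IMPA is defined as the root mean square of the interactive torque over a movement, it reduces to a one-line computation; the torque trace below is hypothetical:

```python
import math

def impa(torques_nm):
    """Root mean square of interactive torque samples (N·m) recorded over
    one periodic movement; a larger value means more robot interaction
    and hence poorer intentional movement performance."""
    return math.sqrt(sum(t * t for t in torques_nm) / len(torques_nm))

# Hypothetical torque trace sampled during one movement cycle.
trace = [0.4, -0.3, 0.5, -0.6, 0.2, -0.2]
score = impa(trace)
```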

  1. Statistical methods for mapping quantitative trait loci from a dense set of markers.

    PubMed Central

    Dupuis, J; Siegmund, D

    1999-01-01

    Lander and Botstein introduced statistical methods for searching an entire genome for quantitative trait loci (QTL) in experimental organisms, with emphasis on a backcross design and QTL having only additive effects. We extend their results to intercross and other designs, and we compare the power of the resulting test as a function of the magnitude of the additive and dominance effects, the sample size and intermarker distances. We also compare three methods for constructing confidence regions for a QTL: likelihood regions, Bayesian credible sets, and support regions. We show that with an appropriate evaluation of the coverage probability a support region is approximately a confidence region, and we provide a theoretical explanation of the empirical observation that the size of the support region is proportional to the sample size, not the square root of the sample size, as one might expect from standard statistical theory. PMID:9872974

  2. Genetic programming:  a novel method for the quantitative analysis of pyrolysis mass spectral data.

    PubMed

    Gilbert, R J; Goodacre, R; Woodward, A M; Kell, D B

    1997-11-01

    A technique for the analysis of multivariate data by genetic programming (GP) is described, with particular reference to the quantitative analysis of orange juice adulteration data collected by pyrolysis mass spectrometry (PyMS). The dimensionality of the input space was reduced by ranking variables according to product moment correlation or mutual information with the outputs. The GP technique as described gives predictive errors equivalent to, if not better than, more widespread methods such as partial least squares and artificial neural networks but additionally can provide a means for easing the interpretation of the correlation between input and output variables. The described application demonstrates that by using the GP method for analyzing PyMS data the adulteration of orange juice with 10% sucrose solution can be quantified reliably over a 0-20% range with an RMS error in the estimate of ∼1%.
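
The dimensionality-reduction step described above, ranking input variables by product-moment correlation with the output, can be sketched as follows; the toy data are illustrative, not PyMS spectra:

```python
import math

def pearson(xs, ys):
    """Product-moment (Pearson) correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def rank_variables(columns, y):
    """Rank input variables (e.g. m/z intensities) by |r| with the
    output, highest first; returns (column index, |r|) pairs."""
    scored = [(i, abs(pearson(col, y))) for i, col in enumerate(columns)]
    return sorted(scored, key=lambda t: t[1], reverse=True)

# Toy data: column 0 tracks the adulteration level, column 1 is noise-like.
y = [0.0, 5.0, 10.0, 15.0, 20.0]
cols = [[1.0, 2.1, 2.9, 4.2, 5.0],
        [3.0, 1.0, 4.0, 1.0, 3.0]]
ranking = rank_variables(cols, y)
```

Only the top-ranked variables would then be passed to the GP, shrinking its search space.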

  3. Maillard reaction products in bread: A novel semi-quantitative method for evaluating melanoidins in bread.

    PubMed

    Helou, Cynthia; Jacolot, Philippe; Niquet-Léridon, Céline; Gadonna-Widehem, Pascale; Tessier, Frédéric J

    2016-01-01

    The aim of this study was to test the methods currently in use and to develop a new protocol for the evaluation of melanoidins in bread. Markers of the early and advanced stages of the Maillard reaction were also followed in the crumb and the crust of bread throughout baking, and in a crust model system. The crumb of the bread contained N(ε)-fructoselysine and N(ε)-carboxymethyllysine but at levels 7 and 5 times lower than the crust, respectively. 5-Hydroxymethylfurfural was detected only in the crust and its model system. The available methods for the semi-quantification of melanoidins were found to be unsuitable for their analysis in bread. Our new method based on size exclusion chromatography and fluorescence measures soluble fluorescent melanoidins in bread. These melanoidin macromolecules (1.7-5.6 kDa) were detected intact in both crust and model system. They appear to contribute to the dietary fibre in bread.

  4. Quantitative Methods for Reservoir Characterization and Improved Recovery: Application to Heavy Oil Sands

    SciTech Connect

    Castle, James W.; Molz, Fred W.; Bridges, Robert A.; Dinwiddie, Cynthia L.; Lorinovich, Caitlin J.; Lu, Silong

    2003-02-07

    This project involved application of advanced analytical property-distribution methods conditioned to continuous outcrop control for improved reservoir characterization and simulation. The investigation was performed in collaboration with Chevron Production Company U.S.A. as an industrial partner, and incorporates data from the Temblor Formation in Chevron's West Coalinga Field, California. Improved prediction of interwell reservoir heterogeneity was needed to increase productivity and to reduce recovery cost for California's heavy oil sands, which contained approximately 2.3 billion barrels of remaining reserves in the Temblor Formation and in other formations of the San Joaquin Valley.

  5. Rapid quantitative analysis of lipids using a colorimetric method in a microplate format.

    PubMed

    Cheng, Yu-Shen; Zheng, Yi; VanderGheynst, Jean S

    2011-01-01

    A colorimetric sulfo-phospho-vanillin (SPV) method was developed for high throughput analysis of total lipids. The developed method uses a reaction mixture that is maintained in a 96-well microplate throughout the entire assay. The new assay provides the following advantages over other methods of lipid measurement: (1) background absorbance can be easily corrected for each well, (2) there is less risk of handling and transferring sulfuric acid contained in reaction mixtures, (3) color develops more consistently providing more accurate measurement of absorbance, and (4) the assay can be used for quantitative measurement of lipids extracted from a wide variety of sources. Unlike other spectrophotometric approaches that use fluorescent dyes, the optimal spectra and reaction conditions for the developed assay do not vary with the sample source. The developed method was used to measure lipids in extracts from four strains of microalgae. No significant difference was found in lipid determination when lipid content was measured using the new method and compared to results obtained using a macro-gravimetric method.

  6. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages

    PubMed Central

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction yet preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (by up to 2-fold) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and “naked” unicellular organisms indicates that our method could obtain the intact DNA of organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance. PMID:26218575

  7. How to use linear regression and correlation in quantitative method comparison studies.

    PubMed

    Twomey, P J; Kroll, M H

    2008-04-01

    Linear regression methods try to determine the best linear relationship between data points, while correlation coefficients assess the association (as opposed to agreement) between the two methods. Linear regression and correlation play an important part in the interpretation of quantitative method comparison studies. Their major strength is that they are widely known and, as a result, both are employed in the vast majority of method comparison studies. While previously performed by hand, regression analysis is now usually carried out with software packages, including MS Excel (with or without the Analyze-it add-in) and other statistical packages. Such techniques need to be employed in a way that compares the agreement between the two methods examined and, more importantly, because we are dealing with individual patients, assesses whether the degree of agreement is clinically acceptable. Despite their use for many years, there is much ignorance about the validity as well as the pros and cons of linear regression and correlation techniques. This review article describes the types of linear regression and correlation (parametric and non-parametric methods) and the necessary general and specific requirements. The selection of the type of regression depends on where one has been trained, the tradition of the laboratory and the availability of adequate software.
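
The association-versus-agreement distinction above is easy to demonstrate: a sketch with hypothetical paired results from two methods, where the correlation is essentially perfect even though one method reads systematically higher, so high r alone does not establish agreement:

```python
import math

def ols(xs, ys):
    """Ordinary least-squares regression of y on x; (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return b, my - b * mx

def pearson_r(xs, ys):
    """Pearson correlation coefficient (association, not agreement)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / math.sqrt(sum((x - mx) ** 2 for x in xs) *
                           sum((y - my) ** 2 for y in ys))

# Hypothetical paired glucose results (mmol/L) from methods A and B.
method_a = [4.0, 5.0, 6.0, 7.0, 8.0]
method_b = [4.8, 5.9, 7.0, 8.1, 9.2]
slope, intercept = ols(method_a, method_b)
r = pearson_r(method_a, method_b)
```

Here r is about 1 while the slope (1.1) and intercept (0.4) reveal proportional and constant bias; whether that bias is clinically acceptable is the real question.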

  8. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages.

    PubMed

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction yet preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (highest as 2 folds) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliphora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and "naked" unicellular organisms indicates that our method could obtain the intact DNA of organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance.

  9. Quantitative mineralogical composition of complex mineral wastes - Contribution of the Rietveld method

    SciTech Connect

    Mahieux, P.-Y.; Aubert, J.-E.; Cyr, M.; Coutand, M.; Husson, B.

    2010-03-15

    The objective of the work presented in this paper is the quantitative determination of the mineral composition of two complex mineral wastes: a sewage sludge ash (SSA) and a municipal solid waste incineration fly ash (MSWIFA). The mineral compositions were determined by two different methods: the first based on calculation using the qualitative mineralogical composition of the waste combined with physicochemical analyses; the second the Rietveld method, which uses only X-ray diffraction patterns. The results obtained are coherent, showing that it is possible to quantify the mineral compositions of complex mineral waste with such methods. The apparent simplicity of the Rietveld method (due principally to the availability of software packages implementing the method) facilitates its use. However, care should be taken since the crystal structure analysis based on powder diffraction data needs experience and a thorough understanding of crystallography. So the use of another, complementary, method such as the first one used in this study, may sometimes be needed to confirm the results.

  10. Agreement experiments: a method for quantitatively testing new medical image display approaches

    NASA Astrophysics Data System (ADS)

    Johnston, Richard E.; Yankaskas, Bonnie C.; Perry, John R.; Pizer, Stephen M.; Delany, David J.; Parker, L. A.

    1990-08-01

    New medical image display devices or processes are commonly evaluated by anecdotal reports or subjective evaluations, which are informative and relatively easy to acquire but do not provide quantitative measures. On the other hand, experiments employing ROC analysis yield quantitative measurements but are very laborious and demand pathological proof of outcome. We have designed and are employing a new approach, which we have termed "agreement experiments," to quantitatively test the equivalence of observer performance on two systems. This was specifically developed to test whether a radiologist using a new display technique, which has some clear advantages over the standard technique, will detect and interpret diagnostic signs as he would with the standard display technique. Agreement experiments use checklists and confidence ratings to measure how well two radiologists agree on the presence of diagnostic signs when both view images on the standard display. This yields a baseline measure of agreement. Agreement measurements are then obtained when the two radiologists view cases using the new display, or display method, compared to the standard technique. If the levels of agreement when one reads from the new and one reads from the standard display are not statistically different from the baseline measures of agreement, we conclude the two systems are equivalent in conveying diagnostic signs. We report on an experiment using this test, which compares the agreement of radiological findings for chest CT studies viewed on the conventional multiformat film/lightbox to the agreement of findings from chest CT images presented on a multiple-screen video system. The study consists of 80 chest CT studies. The results showed 81% to 86% agreement between the two viewing modalities, which fell within our criteria for agreement.
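
The raw checklist agreement underlying such an experiment is a simple proportion; a sketch with hypothetical presence/absence calls (the study additionally used confidence ratings, which this omits):

```python
def percent_agreement(findings_a, findings_b):
    """Percentage of checklist items on which two readers give the same
    call (simple raw agreement, not chance-corrected)."""
    assert len(findings_a) == len(findings_b)
    hits = sum(1 for x, y in zip(findings_a, findings_b) if x == y)
    return 100.0 * hits / len(findings_a)

# Hypothetical presence(1)/absence(0) calls on a 10-item checklist.
reader1 = [1, 0, 1, 1, 0, 0, 1, 1, 0, 1]
reader2 = [1, 0, 1, 0, 0, 0, 1, 1, 1, 1]
agreement = percent_agreement(reader1, reader2)
```

The study's logic is then to compare this cross-display agreement against the same-display baseline.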

  11. Development of a rapid method for the quantitative determination of deoxynivalenol using Quenchbody.

    PubMed

    Yoshinari, Tomoya; Ohashi, Hiroyuki; Abe, Ryoji; Kaigome, Rena; Ohkawa, Hideo; Sugita-Konishi, Yoshiko

    2015-08-12

    Quenchbody (Q-body) is a novel fluorescent biosensor based on the antigen-dependent removal of a quenching effect on a fluorophore attached to antibody domains. In order to develop a method using Q-body for the quantitative determination of deoxynivalenol (DON), a trichothecene mycotoxin produced by some Fusarium species, anti-DON Q-body was synthesized from the sequence information of a monoclonal antibody specific to DON. When the purified anti-DON Q-body was mixed with DON, a dose-dependent increase in the fluorescence intensity was observed and the detection range was between 0.0003 and 3 mg L(-1). The coefficients of variation were 7.9% at 0.003 mg L(-1), 5.0% at 0.03 mg L(-1) and 13.7% at 0.3 mg L(-1), respectively. The limit of detection was 0.006 mg L(-1) for DON in wheat. The Q-body showed an antigen-dependent fluorescence enhancement even in the presence of wheat extracts. To validate the analytical method using Q-body, a spike-and-recovery experiment was performed using four spiked wheat samples. The recoveries were in the range of 94.9-100.2%. The concentrations of DON in twenty-one naturally contaminated wheat samples were quantitated by the Q-body method, LC-MS/MS and an immunochromatographic assay kit. The LC-MS/MS analysis showed that the levels of DON contamination in the samples were between 0.001 and 2.68 mg kg(-1). The concentrations of DON quantitated by LC-MS/MS were more strongly correlated with those using the Q-body method (R(2) = 0.9760) than the immunochromatographic assay kit (R(2) = 0.8824). These data indicate that the Q-body system for the determination of DON in wheat samples was successfully developed and Q-body is expected to have a range of applications in the field of food safety.
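
    The spike-and-recovery validation mentioned in this record reduces to simple arithmetic: recovery is the measured concentration expressed as a percentage of the known spiked amount (after subtracting any blank). The spiked and measured values below are hypothetical, not the paper's data.

```python
def recovery_percent(measured, spiked, blank=0.0):
    """Recovery (%) of a spiked analyte from a matrix sample."""
    return 100.0 * (measured - blank) / spiked

# Hypothetical (measured, spiked) DON concentrations in mg/L.
recoveries = [recovery_percent(m, s)
              for m, s in [(0.95, 1.0), (0.49, 0.5), (2.02, 2.0), (0.10, 0.1)]]

# A common acceptance window for recovery experiments.
all_ok = all(90.0 <= r <= 110.0 for r in recoveries)
```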

  12. Development and validation of a LC-MS method for quantitation of ergot alkaloids in lateral saphenous vein tissue

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A liquid chromatography-mass spectrometry (LC/MS) method for simultaneous quantitation of seven ergot alkaloids (lysergic acid, ergonovine, ergovaline, ergocornine, ergotamine, ergocryptine and ergocrystine) in vascular tissue was developed and validated. Reverse-phase chromatography, coupled to an...

  13. Targeted LC-MS/MS Method for the Quantitation of Plant Lignans and Enterolignans in Biofluids from Humans and Pigs.

    PubMed

    Nørskov, Natalja P; Olsen, Anja; Tjønneland, Anne; Bolvig, Anne Katrine; Lærke, Helle Nygaard; Knudsen, Knud Erik Bach

    2015-07-15

    Lignans have gained nutritional interest due to their promising role in the prevention of lifestyle diseases. However, epidemiological studies are in need of more evidence to link the intake of lignans to this promising role. In this context, it is necessary to study large population groups to obtain sufficient statistical power. Therefore, there is a demand for fast, sensitive, and accurate methods for quantitation with high throughput of samples. This paper presents a validated LC-MS/MS method for the quantitation of eight plant lignans (matairesinol, hydroxymatairesinol, secoisolariciresinol, lariciresinol, isolariciresinol, syringaresinol, medioresinol, and pinoresinol) and two enterolignans (enterodiol and enterolactone) in both human and pig plasma and urine. The method showed high selectivity and sensitivity allowing quantitation of lignans in the range of 0.024-100 ng/mL and with a run time of only 4.8 min per sample. The method was successfully applied to quantitate lignans in biofluids from ongoing studies with humans and pigs.

  14. Advanced methods of microscope control using μManager software

    PubMed Central

    Edelstein, Arthur D.; Tsuchida, Mark A.; Amodaj, Nenad; Pinkard, Henry; Vale, Ronald D.; Stuurman, Nico

    2014-01-01

    μManager is an open-source, cross-platform desktop application that controls a wide variety of motorized microscopes, scientific cameras, stages, illuminators, and other microscope accessories. Since its inception in 2005, μManager has grown to support a wide range of microscopy hardware and is now used by thousands of researchers around the world. The application provides a mature graphical user interface and offers open programming interfaces to facilitate plugins and scripts. Here, we present a guide to using some of the recently added advanced μManager features, including hardware synchronization, simultaneous use of multiple cameras, projection of patterned light onto a specimen, live slide mapping, imaging with multi-well plates, particle localization and tracking, and high-speed imaging. PMID:25606571

  15. Quantitative GSL-glycome analysis of human whole serum based on an EGCase digestion and glycoblotting method

    PubMed Central

    Furukawa, Jun-ichi; Sakai, Shota; Yokota, Ikuko; Okada, Kazue; Hanamatsu, Hisatoshi; Kobayashi, Takashi; Yoshida, Yasunobu; Higashino, Kenichi; Tamura, Tomohiro; Igarashi, Yasuyuki; Shinohara, Yasuro

    2015-01-01

    Glycosphingolipids (GSLs) are lipid molecules linked to carbohydrate units that form the plasma membrane lipid raft, which is clustered with sphingolipids, sterols, and specific proteins, and thereby contributes to membrane physical properties and specific recognition sites for various biological events. These bioactive GSL molecules consequently affect the pathophysiology and pathogenesis of various diseases. Thus, altered expression of GSLs in various diseases may be of importance for disease-related biomarker discovery. However, analysis of GSLs in blood is particularly challenging because GSLs are present at extremely low concentrations in serum/plasma. In this study, we established absolute GSL-glycan analysis of human serum based on endoglycoceramidase digestion and glycoblotting purification. We established two sample preparation protocols, one with and the other without GSL extraction using chloroform/methanol. Similar amounts of GSL-glycans were recovered with the two protocols. Both protocols permitted absolute quantitation of GSL-glycans using as little as 20 μl of serum. Using 10 healthy human serum samples, up to 42 signals corresponding to GSL-glycan compositions could be quantitatively detected, and the total serum GSL-glycan concentration was calculated to be 12.1–21.4 μM. We further applied this method to TLC-prefractionated serum samples. These findings will assist the discovery of disease-related biomarkers by serum GSL-glycomics. PMID:26420879

  16. Comparison of Advanced Distillation Control Methods, Final Technical Report

    SciTech Connect

    Dr. James B. Riggs

    2000-11-30

    Detailed dynamic simulations of three industrial distillation columns (a propylene/propane splitter, a xylene/toluene column, and a depropanizer) have been used to evaluate configuration selections for single-ended and dual-composition control, as well as to compare conventional and advanced control approaches. In addition, a simulator of a main fractionator was used to compare the control performance of conventional and advanced control. For each case considered, the controllers were tuned by using setpoint changes and tested using feed composition upsets. Proportional Integral (PI) control performance was used to evaluate the configuration selection problem. For single ended control, the energy balance configuration was found to yield the best performance. For dual composition control, nine configurations were considered. It was determined that the use of dynamic simulations is required in order to identify the optimum configuration from among the nine possible choices. The optimum configurations were used to evaluate the relative control performance of conventional PI controllers, MPC (Model Predictive Control), PMBC (Process Model-Based Control), and ANN (Artificial Neural Networks) control. It was determined that MPC works best when one product is much more important than the other, while PI was superior when both products were equally important. PMBC and ANN were not found to offer significant advantages over PI and MPC. MPC was found to outperform conventional PI control for the main fractionator. MPC was applied to three industrial columns: one at Phillips Petroleum and two at Union Carbide. In each case, MPC was found to significantly outperform PI controls. The major advantage of the MPC controller is its ability to effectively handle a complex set of constraints and control objectives.
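
    The conventional PI benchmark against which the advanced controllers were compared can be sketched as a discrete proportional-integral loop acting on a first-order process model. The process gain, time constant, and tuning constants below are hypothetical illustration values, not the columns studied in the report.

```python
def simulate_pi(setpoint, n_steps, dt=1.0, kp=0.5, ki=0.1,
                process_gain=2.0, tau=10.0):
    """Run a discrete PI loop on a first-order process and return the
    final value of the controlled variable."""
    y = 0.0          # controlled variable (e.g. product composition)
    integral = 0.0
    for _ in range(n_steps):
        error = setpoint - y
        integral += error * dt
        u = kp * error + ki * integral          # PI control law
        y += dt / tau * (process_gain * u - y)  # first-order process
    return y

# Integral action drives the steady-state error to zero for a setpoint change.
final = simulate_pi(setpoint=1.0, n_steps=500)
```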

  17. The use of electromagnetic induction methods for establishing quantitative permafrost models in West Greenland

    NASA Astrophysics Data System (ADS)

    Ingeman-Nielsen, Thomas; Brandt, Inooraq

    2010-05-01

    The sedimentary settings at West Greenlandic town and infrastructural development sites are dominated by fine-grained marine deposits of late to post glacial origin. Prior to permafrost formation, these materials were leached by percolating precipitation, resulting in depletion of salts. Present day permafrost in these deposits is therefore very ice-rich with ice contents approaching 50-70% vol. in some areas. Such formations are of great concern in building and construction projects in Greenland, as they lose strength and bearing capacity upon thaw. It is therefore of both technical and economical interest to develop methods to precisely investigate and determine parameters such as ice-content and depth to bedrock in these areas. In terms of geophysical methods for near surface investigations, traditional methods such as Electrical Resistivity Tomography (ERT) and Refraction Seismics (RS) have generally been applied with success. The Georadar method usually fails due to very limited penetration depth in the fine-grained materials, and Electromagnetic Induction (EMI) methods are seldom applicable for quantitative interpretation due to the very high resistivities causing low induced currents and thus small secondary fields. Nevertheless, in some areas of Greenland the marine sequence was exposed relatively late, and as a result the sediments may not be completely leached of salts. In such cases, layers with pore water salinity approaching that of sea water may be present below an upper layer of very ice-rich permafrost. The saline pore water causes a freezing-point depression which results in technically unfrozen sediments at permafrost temperatures around -3 °C. Traditional ERT and VES measurements are severely affected by equivalency problems in these settings, practically prohibiting reasonable quantitative interpretation without constraining information. Such prior information may of course be obtained from boreholes, but equipment capable of drilling

  18. Simple absolute quantification method correcting for quantitative PCR efficiency variations for microbial community samples.

    PubMed

    Brankatschk, Robert; Bodenhausen, Natacha; Zeyer, Josef; Bürgmann, Helmut

    2012-06-01

    Real-time quantitative PCR (qPCR) is a widely used technique in microbial community analysis, allowing the quantification of the number of target genes in a community sample. Currently, the standard-curve (SC) method of absolute quantification is widely employed for these kinds of analysis. However, the SC method assumes that the amplification efficiency (E) is the same for both the standard and the sample target template. We analyzed 19 bacterial strains and nine environmental samples in qPCR assays, targeting the nifH and 16S rRNA genes. The E values of the qPCRs differed significantly, depending on the template. This has major implications for the quantification. If the sample and standard differ in their E values, quantification errors of up to orders of magnitude are possible. To address this problem, we propose and test the one-point calibration (OPC) method for absolute quantification. The OPC method corrects for differences in E and was derived from the ΔΔC(T) method with correction for E, which is commonly used for relative quantification in gene expression studies. The SC and OPC methods were compared by quantifying artificial template mixtures from Geobacter sulfurreducens (DSM 12127) and Nostoc commune (Culture Collection of Algae and Protozoa [CCAP] 1453/33), which differ in their E values. While the SC method deviated from the expected nifH gene copy number by 3- to 5-fold, the OPC method quantified the template mixtures with high accuracy. Moreover, analyzing environmental samples, we show that even small differences in E between the standard and the sample can cause significant differences between the copy numbers calculated by the SC and the OPC methods.
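
    A minimal sketch of the efficiency-corrected one-point calibration idea described in this record, under the common convention that a reaction starting from N0 copies reaches N0 * (1 + E)**Ct copies at the threshold cycle, and that this threshold quantity is the same for standard and sample. The Ct and efficiency values below are hypothetical.

```python
def opc_copies(ct_sample, e_sample, ct_standard, e_standard, n0_standard):
    """Absolute copy number of a sample template, correcting for the
    sample's own amplification efficiency (one-point calibration)."""
    # Copy number at the threshold cycle, fixed by one known standard.
    n_threshold = n0_standard * (1.0 + e_standard) ** ct_standard
    # Back-calculate the sample's starting copies with its own E.
    return n_threshold / (1.0 + e_sample) ** ct_sample

# With equal efficiencies (E = 1, perfect doubling) this reduces to the
# familiar standard-curve result: 5 extra cycles means 2**5 fewer copies.
copies = opc_copies(ct_sample=25.0, e_sample=1.0,
                    ct_standard=20.0, e_standard=1.0,
                    n0_standard=1.0e6)
```

    The correction matters precisely when `e_sample` differs from `e_standard`, which the record shows can otherwise bias results by 3- to 5-fold or more.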

  19. An advanced deterministic method for spent fuel criticality safety analysis

    SciTech Connect

    DeHart, M.D.

    1998-01-01

    Over the past two decades, criticality safety analysts have come to rely to a large extent on Monte Carlo methods for criticality calculations. Monte Carlo has become popular because of its capability to model complex, non-orthogonal configurations of fissile materials, typical of real world problems. Over the last few years, however, interest in deterministic transport methods has been revived, due to shortcomings in the stochastic nature of Monte Carlo approaches for certain types of analyses. Specifically, deterministic methods are superior to stochastic methods for calculations requiring accurate neutron density distributions or differential fluxes. Although Monte Carlo methods are well suited for eigenvalue calculations, they lack the localized detail necessary to assess uncertainties and sensitivities important in determining a range of applicability. Monte Carlo methods are also inefficient as a transport solution for multiple pin depletion methods. Discrete ordinates methods have long been recognized as one of the most rigorous and accurate approximations used to solve the transport equation. However, until recently, geometric constraints in finite differencing schemes have made discrete ordinates methods impractical for non-orthogonal configurations such as reactor fuel assemblies. The development of an extended step characteristic (ESC) technique removes the grid structure limitations of traditional discrete ordinates methods. The NEWT computer code, a discrete ordinates code built upon the ESC formalism, is being developed as part of the SCALE code system. This paper will demonstrate the power, versatility, and applicability of NEWT as a state-of-the-art solution for current computational needs.

  20. A validated method for the quantitation of 1,1-difluoroethane using a gas in equilibrium method of calibration.

    PubMed

    Avella, Joseph; Lehrer, Michael; Zito, S William

    2008-10-01

    1,1-Difluoroethane (DFE), also known as Freon 152A, is a member of a class of compounds known as halogenated hydrocarbons. A number of these compounds have gained notoriety because of their ability to induce rapid onset of intoxication after inhalation exposure. Abuse of DFE has necessitated development of methods for its detection and quantitation in postmortem and human performance specimens. Furthermore, methodologies applicable to research studies are required as there have been limited toxicokinetic and toxicodynamic reports published on DFE. This paper describes a method for the quantitation of DFE using a gas chromatography-flame-ionization headspace technique that employs solventless standards for calibration. Two calibration curves using 0.5 mL whole blood calibrators which ranged from A: 0.225-1.350 to B: 9.0-180.0 mg/L were developed. These were evaluated for linearity (0.9992 and 0.9995), limit of detection of 0.018 mg/L, limit of quantitation of 0.099 mg/L (recovery 111.9%, CV 9.92%), and upper limit of linearity of 27,000.0 mg/L. Combined curve recovery results of a 98.0 mg/L DFE control that was prepared using an alternate technique was 102.2% with CV of 3.09%. No matrix interference was observed in DFE enriched blood, urine or brain specimens nor did analysis of variance detect any significant differences (alpha = 0.01) in the area under the curve of blood, urine or brain specimens at three identical DFE concentrations. The method is suitable for use in forensic laboratories because validation was performed on instrumentation routinely used in forensic labs and due to the ease with which the calibration range can be adjusted. Perhaps more importantly it is also useful for research oriented studies because the removal of solvent from standard preparation eliminates the possibility for solvent induced changes to the gas/liquid partitioning of DFE or chromatographic interference due to the presence of solvent in specimens.
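
    The calibration arithmetic behind a validated quantitation like this one can be sketched as an ordinary least-squares line fit to calibrator responses, followed by back-calculation of an unknown from its peak area. The calibrator concentrations and detector areas below are hypothetical, chosen only to span a range like curve B of the record.

```python
def fit_line(x, y):
    """Least-squares slope and intercept for y = m*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return m, my - m * mx

# Hypothetical DFE calibrators (mg/L) and their GC-FID peak areas.
conc = [9.0, 45.0, 90.0, 135.0, 180.0]
area = [18.1, 90.4, 180.2, 270.9, 359.8]

slope, intercept = fit_line(conc, area)

# Back-calculate an unknown specimen from its observed peak area.
unknown = (200.0 - intercept) / slope
```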

  1. Quantitative measurement of ultrasound pressure field by optical phase contrast method and acoustic holography

    NASA Astrophysics Data System (ADS)

    Oyama, Seiji; Yasuda, Jun; Hanayama, Hiroki; Yoshizawa, Shin; Umemura, Shin-ichiro

    2016-07-01

    A fast and accurate measurement of an ultrasound field with various exposure sequences is necessary to ensure the efficacy and safety of various ultrasound applications in medicine. The most common method used to measure an ultrasound pressure field, that is, hydrophone scanning, requires a long scanning time and potentially disturbs the field. This may limit the efficiency of developing applications of ultrasound. In this study, an optical phase contrast method enabling fast and noninterfering measurements is proposed. In this method, the modulated phase of light caused by the focused ultrasound pressure field is measured. Then, a computed tomography (CT) algorithm used to quantitatively reconstruct a three-dimensional (3D) pressure field is applied. For a high-intensity focused ultrasound field, a new approach that combines the optical phase contrast method and acoustic holography was attempted. First, the optical measurement of focused ultrasound was rapidly performed over the field near a transducer. Second, the nonlinear propagation of the measured ultrasound was simulated. The result of the new approach agreed well with that of the measurement using a hydrophone and was improved from that of the phase contrast method alone with phase unwrapping.

  2. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods.

    PubMed

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-15

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).
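
    The paper builds PLS and GA-PLS models; as a simpler illustration of the underlying idea of resolving two overlapping UV absorbers from one mixture spectrum, the sketch below uses classical least squares on Beer's law, where the mixture spectrum is a concentration-weighted sum of pure-component spectra. The spectra are synthetic Gaussian bands, not measured paracetamol/tramadol spectra.

```python
import numpy as np

wavelengths = np.linspace(220.0, 300.0, 81)

def gaussian_band(center, width):
    """Synthetic absorption band (absorbance per unit concentration)."""
    return np.exp(-((wavelengths - center) / width) ** 2)

# Hypothetical pure-component spectra for the two analytes.
spec_a = gaussian_band(243.0, 12.0)        # "paracetamol-like"
spec_b = 0.6 * gaussian_band(271.0, 9.0)   # "tramadol-like"
K = np.column_stack([spec_a, spec_b])      # Beer's-law design matrix

# A mixture of known composition, to check the inversion.
c_true = np.array([0.8, 0.3])
mixture = K @ c_true

# Estimate concentrations: c = argmin ||K c - mixture||.
c_est, *_ = np.linalg.lstsq(K, mixture, rcond=None)
```

    PLS goes beyond this by handling collinearity and noise through latent variables, which is why it is preferred for real tablet spectra.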

  3. Membrane chromatographic immunoassay method for rapid quantitative analysis of specific serum antibodies.

    PubMed

    Ghosh, Raja

    2006-02-05

    This paper discusses a membrane chromatographic immunoassay method for rapid detection and quantitative analysis of specific serum antibodies. A type of polyvinylidene fluoride (PVDF) microfiltration membrane was used in the method for its ability to reversibly and specifically bind IgG antibodies from antiserum samples by hydrophobic interaction. Using this form of selective antibody binding and enrichment, an affinity membrane with antigen binding ability was obtained in situ. This was done by passing a pulse of diluted antiserum sample through a stack of microporous PVDF membranes. The affinity membrane thus formed was challenged with a pulse of antigen solution and the amount of antigen bound was accurately determined using chromatographic methods. The antigen binding correlated well with the antibody loading on the membrane. This method is direct, rapid and accurate, does not involve any chemical reaction, and uses very few reagents. Moreover, the same membrane could be repeatedly used for sequential immunoassays on account of the reversible nature of the antibody binding. Proof of concept of this method is provided using human hemoglobin as model antigen and rabbit antiserum against human hemoglobin as the antibody source.

  4. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods

    NASA Astrophysics Data System (ADS)

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-01

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).

  5. Validation procedures for quantitative gluten ELISA methods: AOAC allergen community guidance and best practices.

    PubMed

    Koerner, Terry B; Abbott, Michael; Godefroy, Samuel Benrejeb; Popping, Bert; Yeung, Jupiter M; Diaz-Amigo, Carmen; Roberts, James; Taylor, Steve L; Baumert, Joseph L; Ulberth, Franz; Wehling, Paul; Koehler, Peter

    2013-01-01

    The food allergen analytical community is endeavoring to create harmonized guidelines for the validation of food allergen ELISA methodologies to help protect food-sensitive individuals and promote consumer confidence. This document provides additional guidance to existing method validation publications for quantitative food allergen ELISA methods. The gluten-specific criterion provided in this document is divided into sections for information required by the method developer about the assay and information for the implementation of the multilaboratory validation study. Many of these recommendations and guidance are built upon the widely accepted Codex Alimentarius definitions and recommendations for gluten-free foods. The information in this document can be used as the basis of a harmonized validation protocol for any ELISA method for gluten, whether proprietary or nonproprietary, that will be submitted to AOAC and/or regulatory authorities or other bodies for status recognition. Future work is planned for the implementation of this guidance document for the validation of gluten methods and the creation of gluten reference materials.

  6. Development of a quantitative method to monitor the effect of a tooth whitening agent.

    PubMed

    Amaechi, Bennett T; Higham, Susan M

    2002-01-01

    This study demonstrated a quantitative method for assessing the effect of a tooth whitening agent. Forty human teeth were stained with a tea solution, and randomly assigned to two groups (A, B) of twenty teeth. The teeth were subsequently treated with either sodium hypochlorite (NaOCL) or deionized distilled water (DDW) by intermittent immersion (60 seconds on each occasion) in a 1:10 dilution of NaOCL (group A) or DDW (group B). Prior to whitening and following each immersion, the color of the teeth at the stained spot was measured using ShadeEye-Ex Dental Chroma Meter and quantitative light-induced fluorescence (QLF). ShadeEye-Ex instantly gave a numerical value for the stain intensity, chroma (C), which is the average of three measurements taken automatically by the machine. QLF gave a quantitative value for the stain, delta Q (% mm2), following analysis of the fluorescence image of the tooth. Immersion was stopped after four readings when one specimen, in group A, was observed to have regained its natural color. There was a good correlation between C and delta Q with either NaOCL (Pearson correlation coefficient (r) = 0.974; p < 0.05) or DDW (r = 0.978; p < 0.05). With NaOCL, an inverse relationship observed between stain measurements, C (Linear fit correlation (R) = -0.982; p < 0.05) or delta Q (R = -0.988; p < 0.05) and exposure time correlated to a linear fit, but not with DDW. ANOVA showed a significant difference between the means (n = 20) of the reading at the measurement intervals (0, 60, 120 and 180 seconds) for both C (p < 0.001) and delta Q (p < 0.001) with NaOCL but not with DDW. In conclusion, the study highlighted the potential of ShadeEye-Ex Dental Chroma Meter as a tool for the quantitative assessment of the gradual change in shade of discolored teeth by tooth whitening products.

  7. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research.
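
    One of the designs mentioned above, an agreement study without a reference standard, is often summarized Bland-Altman style: the bias between two algorithms and the 95% limits of agreement of their paired differences. The paired measurements below are hypothetical (e.g. a volume in mL measured by two QIB algorithms on the same six cases).

```python
import statistics

algo_a = [10.2, 15.1, 8.9, 20.4, 12.0, 17.8]
algo_b = [10.5, 14.7, 9.3, 20.9, 11.6, 18.1]

diffs = [a - b for a, b in zip(algo_a, algo_b)]
bias = statistics.mean(diffs)        # systematic difference between algorithms
sd = statistics.stdev(diffs)         # spread of the paired differences
loa_low = bias - 1.96 * sd           # lower 95% limit of agreement
loa_high = bias + 1.96 * sd          # upper 95% limit of agreement
```

    Whether the resulting limits are acceptable is a clinical judgment, which is one reason the review argues for standardized comparison frameworks.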

  8. Characterization of a method for quantitating food consumption for mutation assays in Drosophila

    SciTech Connect

    Thompson, E.D.; Reeder, B.A.; Bruce, R.D.

    1991-01-01

    Quantitation of food consumption is necessary when determining mutation responses to multiple chemical exposures in the sex-linked recessive lethal assay in Drosophila. One method proposed for quantitating food consumption by Drosophila is to measure the incorporation of 14C-leucine into the flies during the feeding period. Three sources of variation in the technique of Thompson and Reeder have been identified and characterized. First, the amount of food consumed by individual flies differed by almost 30% in a 24 hr feeding period. Second, the variability from vial to vial (each containing multiple flies) was around 15%. Finally, the amount of food consumed in identical feeding experiments performed over the course of 1 year varied nearly 2-fold. The use of chemical consumption values in place of exposure levels provided a better means of expressing the combined mutagenic response. In addition, the kinetics of food consumption over a 3 day feeding period for exposures to cyclophosphamide which produce lethality were compared to non-lethal exposures. Extensive characterization of lethality induced by exposures to cyclophosphamide demonstrate that the lethality is most likely due to starvation, not chemical toxicity.
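
    The incorporation arithmetic behind the technique reads, in minimal form: the volume of food a fly consumed is the radioactivity recovered in the fly divided by the activity concentration of the labeled food. The dpm values below are hypothetical illustration numbers.

```python
def food_consumed_ul(dpm_in_fly, dpm_per_ul_food, background_dpm=0.0):
    """Microliters of 14C-leucine labeled food consumed by one fly."""
    return (dpm_in_fly - background_dpm) / dpm_per_ul_food

# One vial of flies: per-fly consumption and the vial mean, whose
# fly-to-fly (~30%) and vial-to-vial (~15%) variability is discussed above.
vial = [food_consumed_ul(d, dpm_per_ul_food=500.0)
        for d in (950.0, 1250.0, 800.0, 1100.0)]
mean_ul = sum(vial) / len(vial)
```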

  9. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Dorssaeler, Alain Van; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-03-01

    This data article describes a controlled, spiked proteomic dataset for which the "ground truth" of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tools parameters, but also testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in detail in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tools results for the detection of variant proteins with different absolute expression levels and fold change values.
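
    The kind of benchmarking such a dataset enables can be sketched as follows: given replicate label-free intensities at two spike levels, flag "variant" proteins by fold change, since spiked UPS1 proteins should change while the yeast background should not. The protein names and intensities below are hypothetical, and a real pipeline would pair the fold change with a significance test and FDR control.

```python
import math

# Hypothetical replicate intensities at two UPS1 spike levels.
low_spike = {"UPS1_P01": [110.0, 95.0, 105.0],
             "YEAST_A": [500.0, 480.0, 510.0]}
high_spike = {"UPS1_P01": [420.0, 390.0, 410.0],
              "YEAST_A": [495.0, 505.0, 490.0]}

def log2_fold_change(protein):
    """log2 ratio of mean intensities between the two spike levels."""
    mean_low = sum(low_spike[protein]) / len(low_spike[protein])
    mean_high = sum(high_spike[protein]) / len(high_spike[protein])
    return math.log2(mean_high / mean_low)

# Flag proteins changing by more than 2-fold in either direction.
variants = [p for p in low_spike if abs(log2_fold_change(p)) > 1.0]
```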

  10. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  11. Smartphone based hand-held quantitative phase microscope using the transport of intensity equation method.

    PubMed

    Meng, Xin; Huang, Huachuan; Yan, Keding; Tian, Xiaolin; Yu, Wei; Cui, Haoyang; Kong, Yan; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-12-20

    In order to realize high-contrast imaging with portable devices for potential mobile healthcare, we demonstrate a hand-held, smartphone-based quantitative phase microscope using the transport of intensity equation method. With a cost-effective illumination source and compact microscope system, multi-focal images of samples can be captured by the smartphone's camera via manual focusing. Phase retrieval is performed using a self-developed Android application, which calculates sample phases from multi-plane intensities by solving the Poisson equation. We test the portable microscope using a random phase plate with known phases, and to further demonstrate its performance, a red blood cell smear, a Pap smear, and monocot root and broad bean epidermis sections are also successfully imaged. Considering its advantages as an accurate, high-contrast, cost-effective and field-portable device, the smartphone-based hand-held quantitative phase microscope is a promising tool that can be adopted in remote healthcare and medical diagnosis.
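The phase-retrieval step described above solves the transport-of-intensity equation; under a near-uniform-intensity approximation it reduces to a Poisson equation that can be solved with FFTs. A minimal NumPy sketch of that step (the function name and the uniform-intensity simplification are ours, not the authors' Android implementation):

```python
import numpy as np

def tie_phase(I_under, I_over, I_focus, dz, wavelength, pixel):
    """Recover the sample phase from a through-focus intensity stack via
    the transport-of-intensity equation. With near-uniform intensity I0,
    -k * dI/dz = I0 * laplacian(phi), a Poisson equation solved by FFT."""
    k = 2.0 * np.pi / wavelength
    dIdz = (I_over - I_under) / (2.0 * dz)           # central axial derivative
    ny, nx = I_focus.shape
    fx = np.fft.fftfreq(nx, d=pixel)
    fy = np.fft.fftfreq(ny, d=pixel)
    FX, FY = np.meshgrid(fx, fy)
    q2 = (2.0 * np.pi) ** 2 * (FX ** 2 + FY ** 2)    # squared spatial frequency
    q2[0, 0] = 1.0                                   # avoid division by zero at DC
    rhs = -k * dIdz / I_focus.mean()                 # laplacian of the phase
    F = np.fft.fft2(rhs) / (-q2)
    F[0, 0] = 0.0                                    # mean phase is arbitrary
    return np.real(np.fft.ifft2(F))
```

In practice, under- and over-focus frames come from the manual-focus capture, and more robust variants regularize the low-frequency division.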

  12. A novel image-based quantitative method for the characterization of NETosis

    PubMed Central

    Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.

    2015-01-01

    NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex-vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624
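The morphometric gating such an imaging-flow-cytometry workflow performs can be illustrated with a toy rule combining nuclear decondensation and granule-marker co-localization. The feature names and cutoffs below are entirely hypothetical illustrations, not the paper's validated parameters:

```python
def is_net_event(dna_area_um2, dna_mpo_similarity,
                 area_cutoff=100.0, similarity_cutoff=1.0):
    """Toy morphometric gate: score an imaging-flow-cytometry event as
    NET-positive if its nuclear image is decondensed (large DNA area)
    and the DNA signal co-localizes with a granule marker such as MPO.
    All feature names and cutoffs here are hypothetical."""
    return dna_area_um2 > area_cutoff and dna_mpo_similarity > similarity_cutoff
```

A real analysis would calibrate such cutoffs against stimulated and unstimulated controls before scoring ex vivo samples.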

  13. Qualitative and quantitative characterization of protein-phosphoinositide interactions with liposome-based methods.

    PubMed

    Busse, Ricarda A; Scacioc, Andreea; Hernandez, Javier M; Krick, Roswitha; Stephan, Milena; Janshoff, Andreas; Thumm, Michael; Kühnel, Karin

    2013-05-01

    We characterized phosphoinositide binding of the S. cerevisiae PROPPIN Hsv2 qualitatively with density flotation assays and quantitatively through isothermal titration calorimetry (ITC) measurements using liposomes. We discuss the design of these experiments and show with liposome flotation assays that Hsv2 binds with high specificity to both PtdIns3P and PtdIns(3,5)P2. We propose liposome flotation assays as a more accurate alternative to the commonly used PIP strips for the characterization of phosphoinositide-binding specificities of proteins. We further quantitatively characterized PtdIns3P binding of Hsv2 with ITC measurements and determined a dissociation constant of 0.67 µM and a stoichiometry of 2:1 for PtdIns3P binding to Hsv2. PtdIns3P is crucial for the biogenesis of autophagosomes and their precursors. Besides the PROPPINs, there are other PtdIns3P-binding proteins linked to autophagy, including the FYVE-domain-containing proteins ZFYVE1/DFCP1 and WDFY3/ALFY and the PX-domain-containing proteins Atg20 and Snx4/Atg24. The methods described could be useful tools for the characterization of these and other phosphoinositide-binding proteins.
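The reported dissociation constant translates into site occupancy through the standard single-site binding isotherm. This is a simplification for illustration only: the authors' ITC analysis fit a model with 2:1 stoichiometry, which the one-site formula below does not capture.

```python
def fraction_bound(ligand_uM, kd_uM=0.67):
    """Single-site equilibrium occupancy, theta = [L] / (Kd + [L]),
    using the Kd of 0.67 uM reported for PtdIns3P binding to Hsv2.
    Illustrative only; the authors fit a 2:1 stoichiometry model."""
    return ligand_uM / (kd_uM + ligand_uM)
```

At a free PtdIns3P concentration equal to Kd (0.67 µM), half the sites are occupied; at ten times Kd, occupancy exceeds 90%.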

  14. A quantitative method for risk assessment of agriculture due to climate change

    NASA Astrophysics Data System (ADS)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2016-11-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks because of its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help to deal proactively with climate change and ensure food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the amplitude of yield change. The probability of occurrence can be calculated with the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method is shown to be feasible and practical using spring wheat in Wuchuan County, Inner Mongolia, as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. For the maximum temperature increase of 88.3%, the maximum yield decrease and its probability were 3.5 and 64.6%, respectively, and the resulting risk was 2.2%. For the maximum precipitation decrease of 35.2%, the maximum yield decrease and its probability were 14.1 and 56.1%, respectively, and the risk was 7.9%. For the combined impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. Without appropriate adaptation strategies, both the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will increase, and the risk will grow accordingly.
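The risk definition used here (degree of loss times probability of occurrence) can be sketched directly from the reported figures:

```python
def climate_risk(loss_pct, probability_pct):
    """Risk = degree of loss x probability of occurrence, the definition
    used in the abstract (both inputs in percent; result in percent)."""
    return loss_pct * probability_pct / 100.0

# Reported spring-wheat cases for Wuchuan County:
for factor, (loss, prob) in {
    "temperature":   (3.5, 64.6),    # max yield decrease %, probability %
    "precipitation": (14.1, 56.1),
    "combined":      (17.6, 53.4),
}.items():
    print(f"{factor}: risk = {climate_risk(loss, prob):.1f}%")
```

This reproduces the reported precipitation (7.9%) and combined (9.4%) risks; the temperature case rounds to 2.3% here, suggesting the published 2.2% was computed from unrounded inputs.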

  15. A quantitative method for zoning of protected areas and its spatial ecological implications.

    PubMed

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

    Zoning is a key prescriptive tool for the administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and we evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. The main innovations involve a quadratic function of distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e. no polynomial-time exact algorithm is known for it [Nemhauser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. We therefore applied a simulated annealing heuristic implemented as a FORTRAN routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; non-reciprocal land use weightings better represented management decisions and influenced mainly the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many alternative zonation schemes with minimum cost, time and effort. This ability provides a new tool to
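The simulated-annealing approach to such an assignment problem can be sketched generically. This is an illustrative toy (a 1-D transect, two uses, an invented cost with an inverse-squared-distance mixing penalty), not the authors' FORTRAN implementation of the modified Bos model:

```python
import math
import random

def anneal(n_units, uses, cost, iters=20000, t0=5.0, cooling=0.9995, seed=1):
    """Generic simulated annealing for a zoning-style assignment: each
    land unit receives one use; a move reassigns a random unit and is
    accepted by the Metropolis criterion."""
    rng = random.Random(seed)
    assign = [rng.choice(uses) for _ in range(n_units)]
    cur_cost = cost(assign)
    best, best_cost = list(assign), cur_cost
    t = t0
    for _ in range(iters):
        i = rng.randrange(n_units)
        old = assign[i]
        assign[i] = rng.choice(uses)
        new_cost = cost(assign)
        if new_cost <= cur_cost or rng.random() < math.exp((cur_cost - new_cost) / t):
            cur_cost = new_cost
            if cur_cost < best_cost:
                best, best_cost = list(assign), cur_cost
        else:
            assign[i] = old                  # reject: undo the move
        t *= cooling                         # geometric cooling schedule
    return best, best_cost

def zoning_cost(assign):
    """Toy objective on a 1-D transect: incompatible use pairs are
    penalized by inverse squared distance (mixing nearby units is
    worst), plus a penalty for deviating from a 50/50 split."""
    mix = sum(1.0 / (j - i) ** 2
              for i in range(len(assign)) for j in range(i + 1, len(assign))
              if assign[i] != assign[j])
    balance = 10.0 * abs(assign.count("core") - len(assign) // 2)
    return mix + balance

best, best_cost = anneal(6, ["core", "tourism"], zoning_cost)
```

On this toy instance the optimum groups each use into one contiguous block of three units, mirroring how the distance term in the real model favors compact core zones.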

  16. A Novel Method of Quantitative Anterior Chamber Depth Estimation Using Temporal Perpendicular Digital Photography

    PubMed Central

    Zamir, Ehud; Kong, George Y.X.; Kowalski, Tanya; Coote, Michael; Ang, Ghee Soon

    2016-01-01

    Purpose We hypothesize that: (1) Anterior chamber depth (ACD) is correlated with the relative anteroposterior position of the pupillary image, as viewed from the temporal side. (2) Such a correlation may be used as a simple quantitative tool for estimation of ACD. Methods Two hundred sixty-six phakic eyes had lateral digital photographs taken from the temporal side, perpendicular to the visual axis, and underwent optical biometry (Nidek AL scanner). The relative anteroposterior position of the pupillary image was expressed using the ratio between: (1) the lateral photographic temporal limbus-to-pupil distance (“E”) and (2) the lateral photographic temporal limbus-to-cornea distance (“Z”). In the first chronological half of patients (Correlation Series), the E:Z ratio (EZR) was correlated with optical biometric ACD. The correlation equation was then used to predict ACD in the second half of patients (Prediction Series) and compared with their biometric ACD for agreement analysis. Results A strong linear correlation was found between EZR and ACD, R = −0.91, R2 = 0.81. Bland-Altman analysis showed good agreement between the ACD predicted using this method and the optical biometric ACD. The mean error was −0.013 mm (range −0.377 to 0.336 mm), standard deviation 0.166 mm. The 95% limits of agreement were ±0.33 mm. Conclusions Lateral digital photography and EZR calculation constitute a novel method to quantitatively estimate ACD, requiring minimal equipment and training. Translational Relevance The EZ ratio may be employed in screening for angle-closure glaucoma. It may also be helpful in outpatient medical clinic settings, where doctors need to judge the safety of topical or systemic pupil-dilating medications against their risk of triggering acute angle-closure glaucoma. Similarly, non-ophthalmologists may use it to estimate the likelihood of acute angle-closure glaucoma in emergency presentations. PMID:27540496
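The EZR computation and the Bland-Altman agreement analysis used to validate it can be sketched as follows. The coefficients of the paper's correlation equation are not given in the abstract, so the sketch stops at the generic calculations (the arrays in the usage note are hypothetical data, not the study's measurements):

```python
import numpy as np

def ez_ratio(limbus_to_pupil_mm, limbus_to_cornea_mm):
    """E:Z ratio from a temporal lateral photograph:
    E = limbus-to-pupil distance, Z = limbus-to-cornea distance."""
    return limbus_to_pupil_mm / limbus_to_cornea_mm

def bland_altman(predicted_acd_mm, biometric_acd_mm):
    """Mean error and 95% limits of agreement between ACD predicted
    from the EZR correlation and optical-biometry ACD."""
    diff = np.asarray(predicted_acd_mm) - np.asarray(biometric_acd_mm)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)     # 95% limits of agreement
    return bias, (bias - half_width, bias + half_width)
```

For example, `bland_altman([3.0, 3.1, 2.9], [3.0, 3.0, 3.0])` returns a zero bias with limits of agreement of about ±0.20 mm on those hypothetical values.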

  17. Advances in rapid detection methods for foodborne pathogens.

    PubMed

    Zhao, Xihong; Lin, Chii-Wann; Wang, Jun; Oh, Deog Hwan

    2014-03-28

    Food safety is increasingly becoming an important public health issue, as foodborne diseases present a widespread and growing public health problem in both developed and developing countries. Rapid and precise monitoring and detection of foodborne pathogens are among the most effective ways to control and prevent human foodborne infections. Traditional microbiological detection and identification methods for foodborne pathogens are well known to be time-consuming and laborious, and they are increasingly perceived as insufficient to meet the demands of rapid food testing. Recently, various rapid detection, identification, and monitoring methods have been developed for foodborne pathogens, including nucleic-acid-based, immunological, and biosensor-based methods. This article reviews the principles, characteristics, and applications of recent rapid detection methods for foodborne pathogens.

  18. A Perspective on Implementing a Quantitative Systems Pharmacology Platform for Drug Discovery and the Advancement of Personalized Medicine

    PubMed Central

    Stern, Andrew M.; Schurdak, Mark E.; Bahar, Ivet; Berg, Jeremy M.; Taylor, D. Lansing

    2016-01-01

    Drug candidates exhibiting well-defined pharmacokinetic and pharmacodynamic profiles that are otherwise safe often fail to demonstrate proof-of-concept in phase II and III trials. Innovation in drug discovery and development has been identified as a critical need for improving the efficiency of drug discovery, especially through collaborations between academia, government agencies, and industry. To address the innovation challenge, we describe a comprehensive, unbiased, integrated, and iterative quantitative systems pharmacology (QSP)–driven drug discovery and development strategy and platform that we have implemented at the University of Pittsburgh Drug Discovery Institute. Intrinsic to QSP is its integrated use of multiscale experimental and computational methods to identify mechanisms of disease progression and to test predicted thera