Sample records for "provide quantitative comparisons"

  1. The Local Geometry of Multiattribute Tradeoff Preferences

    PubMed Central

    McGeachie, Michael; Doyle, Jon

    2011-01-01

    Existing representations for multiattribute ceteris paribus preference statements have provided useful treatments and clear semantics for qualitative comparisons, but have not provided similarly clear representations or semantics for comparisons involving quantitative tradeoffs. We use directional derivatives and other concepts from elementary differential geometry to interpret conditional multiattribute ceteris paribus preference comparisons that state bounds on quantitative tradeoff ratios. This semantics extends the familiar economic notion of marginal rate of substitution to multiple continuous or discrete attributes. The same geometric concepts also provide means for interpreting statements about the relative importance of different attributes. PMID:21528018
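
    For context, the marginal rate of substitution mentioned above has a compact form for a differentiable value function u over attributes (general background, not notation taken from the paper itself):

    ```latex
    \mathrm{MRS}_{ij}(x) \;=\; \left.-\frac{dx_j}{dx_i}\right|_{u=\text{const}}
    \;=\; \frac{\partial u/\partial x_i}{\partial u/\partial x_j}
    ```

    A conditional statement bounding this ratio, e.g. MRS_ij(x) >= r, asserts that at the point x one unit of attribute x_i is worth at least r units of attribute x_j, which is the kind of quantitative tradeoff ratio the abstract describes generalizing to multiple continuous or discrete attributes.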

  2. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.

  3. Standardizing Quality Assessment of Fused Remotely Sensed Images

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the chosen criteria. Depending on the criteria and indices, the result varies. It is therefore necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.

  4. A framework for organizing and selecting quantitative approaches for benefit-harm assessment.

    PubMed

    Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M

    2012-11-19

    Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.

  5. A framework for organizing and selecting quantitative approaches for benefit-harm assessment

    PubMed Central

    2012-01-01

    Background Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. Conclusion The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches. PMID:23163976

  6. Choice of Intravenous Agents and Intubation Neuromuscular Blockers by Anesthesia Providers

    DTIC Science & Technology

    1996-09-01

    of this study to determine if experience of the provider made a difference in the agent chosen. Both quantitative and qualitative methods were...comparison of quantitative and qualitative data of induction and intubation agents collected from CRNAs and MDAs according to experience of both types of...providers was analyzed to provide meaningful data. The difference in choice of agents by experience was found not to be significant. IV CHOICE OF

  7. Methodological triangulation in a study of social support for siblings of children with cancer.

    PubMed

    Murray, J S

    1999-10-01

    Triangulation is an approach to research that is becoming increasingly popular among nurse researchers. Five types of triangulation are used in nursing research: data, methodological, theoretical, researcher, and analytical triangulation. Methodological triangulation is an attempt to improve validity by combining various techniques in one study. In this article, an example of quantitative and qualitative triangulation is discussed to illustrate the procedures used and the results achieved. The secondary data used as an example are from a previous study that was conducted by the researcher and investigated nursing interventions used by pediatric oncology nurses to provide social support to siblings of children with cancer. Results show that methodological triangulation was beneficial in this study for three reasons. First, the careful comparison of quantitative and qualitative data added support for the social support variables under investigation. Second, the comparison showed more in-depth dimensions about pediatric oncology nurses providing social support to siblings of children with cancer. Finally, the use of methodological triangulation provided insight into revisions for the quantitative instrument.

  8. Determining absolute protein numbers by quantitative fluorescence microscopy.

    PubMed

    Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry

    2014-01-01

    Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers--fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition. © 2014 Elsevier Inc. All rights reserved.

  9. Quantitative comparison of 3D third harmonic generation and fluorescence microscopy images.

    PubMed

    Zhang, Zhiqing; Kuzmin, Nikolay V; Groot, Marie Louise; de Munck, Jan C

    2018-01-01

    Third harmonic generation (THG) microscopy is a label-free imaging technique that shows great potential for rapid pathology of brain tissue during brain tumor surgery. However, the interpretation of THG brain images should be quantitatively linked to images of more standard imaging techniques, which so far has been done only qualitatively. We establish here such a quantitative link between THG images of mouse brain tissue and all-nuclei-highlighted fluorescence images, acquired simultaneously from the same tissue area. For quantitative comparison of a substantial pair of images, we present a segmentation workflow that is applicable to both THG and fluorescence images, with precisions of 91.3 % and 95.8 % achieved, respectively. We find that the correspondence between the main features of the two imaging modalities amounts to 88.9 %, providing quantitative evidence for the interpretation of dark holes as brain cells. Moreover, 80 % of the bright objects in THG images overlap with nuclei highlighted in the fluorescence images, and they are 2 times smaller than the dark holes, showing that cells of different morphologies can be recognized in THG images. We expect that the described quantitative comparison is applicable to other types of brain tissue and with more specific staining experiments for cell type identification. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
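
    As a generic illustration of how such overlap percentages between two segmentations can be computed (hypothetical random masks and thresholds, not the THG/fluorescence pipeline described above):

    ```python
    import numpy as np

    def overlap_fraction(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
        """Fraction of pixels labeled in mask_a that are also labeled in mask_b."""
        a = mask_a.astype(bool)
        b = mask_b.astype(bool)
        return np.logical_and(a, b).sum() / a.sum()

    # Hypothetical binary segmentations standing in for THG bright objects
    # and fluorescence-highlighted nuclei; values are invented for illustration.
    rng = np.random.default_rng(0)
    thg_objects = rng.random((128, 128)) > 0.8
    nuclei = rng.random((128, 128)) > 0.7

    print(f"overlap: {100 * overlap_fraction(thg_objects, nuclei):.1f} %")
    ```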

  10. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  11. Identifying persistent and characteristic features in firearm tool marks on cartridge cases

    NASA Astrophysics Data System (ADS)

    Ott, Daniel; Soons, Johannes; Thompson, Robert; Song, John

    2017-12-01

    Recent concerns about subjectivity in forensic firearm identification have motivated the development of algorithms to compare firearm tool marks that are imparted on ammunition and to generate quantitative measures of similarity. In this paper, we describe an algorithm that identifies impressed tool marks on a cartridge case that are both consistent between firings and contribute strongly to a surface similarity metric. The result is a representation of the tool mark topography that emphasizes both significant and persistent features across firings. This characteristic surface map is useful for understanding the variability and persistence of the tool marks created by a firearm and can provide improved discrimination between the comparison scores of samples fired from the same firearm and the scores of samples fired from different firearms. The algorithm also provides a convenient method for visualizing areas of similarity that may be useful in providing quantitative support for visual comparisons by trained examiners.
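
    For background, one widely used family of similarity metrics for comparing two registered tool-mark topographies A and B is the normalized cross-correlation (a generic definition; the exact metric used in the paper may differ):

    ```latex
    \mathrm{NCC}(A,B) \;=\;
    \frac{\sum_{x,y}\bigl(A(x,y)-\bar A\bigr)\bigl(B(x,y)-\bar B\bigr)}
         {\sqrt{\sum_{x,y}\bigl(A(x,y)-\bar A\bigr)^{2}\,\sum_{x,y}\bigl(B(x,y)-\bar B\bigr)^{2}}}
    ```

    with the maximum over relative translations and rotations of B typically reported as the comparison score.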

  12. Lunar mineral feedstocks from rocks and soils: X-ray digital imaging in resource evaluation

    NASA Technical Reports Server (NTRS)

    Chambers, John G.; Patchen, Allan; Taylor, Lawrence A.; Higgins, Stefan J.; Mckay, David S.

    1994-01-01

    The rocks and soils of the Moon provide raw materials essential to the successful establishment of a lunar base. Efficient exploitation of these resources requires accurate characterization of mineral abundances, sizes/shapes, and association of 'ore' and 'gangue' phases, as well as the technology to generate high-yield/high-grade feedstocks. Only recently have x-ray mapping and digital imaging techniques been applied to lunar resource evaluation. The topics covered include inherent differences between lunar basalts and soils and quantitative comparison of rock-derived and soil-derived ilmenite concentrates. It is concluded that x-ray digital-imaging characterization of lunar raw materials provides a quantitative comparison that is unattainable by traditional petrographic techniques. These data are necessary for accurately determining mineral distributions of soil and crushed rock material. Application of these techniques will provide an important link to choosing the best raw material for mineral beneficiation.

  13. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  14. Performance comparison of various seal coat grades used in Texas.

    DOT National Transportation Integrated Search

    2012-07-01

    This report documents research efforts to provide comparative quantitative performance information for various grades of seal coat aggregate available in the Texas Department of Transportation's standard specifications. Length of service before rep...

  15. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  16. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  17. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  18. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  19. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  20. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  1. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.

  2. A comparison of two IPv4/IPv6 transition mechanisms - OpenVPN and IVI

    NASA Astrophysics Data System (ADS)

    Vu, Cong Tuan; Tran, Quang Anh; Jiang, Frank

    2012-09-01

    This document presents a comparison of two IPv4/IPv6 transition mechanisms, OpenVPN and IVI. While OpenVPN is based on tunneling technology, IVI is a stateless IPv4/IPv6 translation technique developed by the China Education and Research Network (CERNET). This research focuses on the quantitative and qualitative comparison of these two mechanisms: how they are applied in practical situations by Internet Service Providers, as well as their advantages and drawbacks.

  3. Quantitative comparison between full-spectrum and filter-based imaging in hyperspectral fluorescence microscopy

    PubMed Central

    GAO, L.; HAGEN, N.; TKACZYK, T.S.

    2012-01-01

    Summary We implement a filterless illumination scheme on a hyperspectral fluorescence microscope to achieve full-range spectral imaging. The microscope employs polarisation filtering, spatial filtering and spectral unmixing filtering to replace the role of traditional filters. Quantitative comparisons between full-spectrum and filter-based microscopy are provided in the context of signal dynamic range and accuracy of measured fluorophores’ emission spectra. To show potential applications, a five-colour cell immunofluorescence imaging experiment is theoretically simulated. Simulation results indicate that the proposed full-spectrum imaging technique may yield a threefold improvement in signal dynamic range compared to that achievable with filter-based imaging. PMID:22356127

  4. Devising tissue ingrowth metrics: a contribution to the computational characterization of engineered soft tissue healing.

    PubMed

    Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin

    2018-03-14

    The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently questions the value of histopathologic methods in the evaluation of biological changes. To date, the available tools of evaluation are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple and inexpensive quantitative digital approach that provides key metrics for structural and compositional characterization of regenerated tissues. For example, the metrics provide the tissue ingrowth rate (TIR), which integrates two separate indicators, the cell ingrowth rate (CIR) and the total collagen content (TCC), as featured in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality) and the optical anisotropy of the collagen (a maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell specific) or Picrosirius Red F3BA (collagen specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that, quantitatively, the healing tissue would be significantly delayed and of poor quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats with the assumption that modulation of the ingrown tissue would be quantitatively evident in comparison to the 3D scaffold-healthy group. After 21 days of implantation, both hypotheses were verified by use of this novel computerized approach. When the two methods were run in parallel, the quantitative results revealed fine details and differences not detected by the semi-quantitative assessment, demonstrating the importance of quantitative analysis in the performance evaluation of soft tissue healing. This automated and supervised method reduced operator dependency and proved to be simple, sensitive, cost-effective and time-effective. It supports objective therapeutic comparisons and helps to elucidate regeneration and the dynamics of a functional tissue.
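
    As a purely illustrative sketch (hypothetical pixel counts and function names, not the authors' image-analysis pipeline), the additive relation TIR% = CIR% + TCC% could be computed from segmented section images as follows:

    ```python
    # Hypothetical example: compute ingrowth metrics from segmented pixel counts.
    # The counts would come from the two stained serial sections described above
    # (Feulgen & Rossenbeck for cells, Picrosirius Red for collagen); the numbers
    # used here are invented for illustration.

    def ingrowth_metrics(cell_pixels: int, collagen_pixels: int, roi_pixels: int):
        """Return (CIR%, TCC%, TIR%) for one region of interest."""
        cir = 100.0 * cell_pixels / roi_pixels      # cell ingrowth rate
        tcc = 100.0 * collagen_pixels / roi_pixels  # total collagen content
        tir = cir + tcc                             # tissue ingrowth rate: TIR% = CIR% + TCC%
        return cir, tcc, tir

    cir, tcc, tir = ingrowth_metrics(cell_pixels=18_500, collagen_pixels=42_300, roi_pixels=120_000)
    print(f"CIR = {cir:.1f}%, TCC = {tcc:.1f}%, TIR = {tir:.1f}%")
    ```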

  5. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
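
    The paired t-test comparison described above can be reproduced for any two matched sets of measurements; a minimal sketch (with made-up concentrations, not the study's data) using SciPy:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical gamma-oryzanol concentrations (mg/g) for the same samples
    # measured by the two methods; values are invented for illustration.
    densitometric = np.array([12.1, 11.8, 12.5, 13.0, 12.2, 11.9])
    image_analysis = np.array([12.0, 11.9, 12.4, 13.2, 12.1, 12.0])

    t_stat, p_value = stats.ttest_rel(densitometric, image_analysis)
    print(f"paired t = {t_stat:.3f}, p = {p_value:.3f}")
    # A p-value above 0.05 would indicate no statistically significant difference
    # between the two quantification methods, as reported in the abstract.
    ```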

  6. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282

  7. Comparison of numerical model simulations and SFO wake vortex windline measurements

    DOT National Transportation Integrated Search

    2003-06-23

    To provide quantitative support for the Simultaneous Offset Instrument Approach (SOIA) procedure, an extensive data collection effort was undertaken at San Francisco International Airport by the Federal Aviation Administration (FAA, U.S. Dept. of Tra...

  8. Comparison of Dynamic Contrast Enhanced MRI and Quantitative SPECT in a Rat Glioma Model

    PubMed Central

    Skinner, Jack T.; Yankeelov, Thomas E.; Peterson, Todd E.; Does, Mark D.

    2012-01-01

    Pharmacokinetic modeling of dynamic contrast enhanced (DCE)-MRI data provides measures of the extracellular volume fraction (ve) and the volume transfer constant (Ktrans) in a given tissue. These parameter estimates may be biased, however, by confounding issues such as contrast agent and tissue water dynamics, or assumptions of vascularization and perfusion made by the commonly used model. In contrast to MRI, radiotracer imaging with SPECT is insensitive to water dynamics. A quantitative dual-isotope SPECT technique was developed to obtain an estimate of ve in a rat glioma model for comparison to the corresponding estimates obtained using DCE-MRI with a vascular input function (VIF) and reference region model (RR). Both DCE-MRI methods produced consistently larger estimates of ve in comparison to the SPECT estimates, and several experimental sources were postulated to contribute to these differences. PMID:22991315
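
    For context, the "commonly used model" yielding Ktrans and ve in DCE-MRI is most often the standard Tofts model; its general form (background knowledge, not an equation quoted from this paper) is:

    ```latex
    C_t(t) \;=\; K^{\mathrm{trans}} \int_0^{t} C_p(\tau)\,
    \exp\!\Bigl(-\tfrac{K^{\mathrm{trans}}}{v_e}\,(t-\tau)\Bigr)\, d\tau
    ```

    where C_t and C_p are the tissue and arterial-plasma contrast agent concentrations; the reference region (RR) approach mentioned above replaces the measured arterial input with the signal from a reference tissue.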

  9. Normalized Temperature Contrast Processing in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

    The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, as are methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape and test object are imaged simultaneously. Methods of assessing other quantitative parameters such as the emissivity of the object, afterglow heat flux, reflection temperature change and surface temperature during flash thermography are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.

  10. Global, long-term surface reflectance records from Landsat

    USDA-ARS?s Scientific Manuscript database

    Global, long-term monitoring of changes in Earth’s land surface requires quantitative comparisons of satellite images acquired under widely varying atmospheric conditions. Although physically based estimates of surface reflectance (SR) ultimately provide the most accurate representation of Earth’s s...

  11. A comparison study of image features between FFDM and film mammogram images

    PubMed Central

    Jing, Hao; Yang, Yongyi; Wernick, Miles N.; Yarusso, Laura M.; Nishikawa, Robert M.

    2012-01-01

    Purpose: This work is to provide a direct, quantitative comparison of image features measured by film and full-field digital mammography (FFDM). The purpose is to investigate whether there is any systematic difference between film and FFDM in terms of quantitative image features and their influence on the performance of a computer-aided diagnosis (CAD) system. Methods: The authors make use of a set of matched film-FFDM image pairs acquired from cadaver breast specimens with simulated microcalcifications consisting of bone and teeth fragments using both a GE digital mammography system and a screen-film system. To quantify the image features, the authors consider a set of 12 textural features of lesion regions and six image features of individual microcalcifications (MCs). The authors first conduct a direct comparison on these quantitative features extracted from film and FFDM images. The authors then study the performance of a CAD classifier for discriminating between MCs and false positives (FPs) when the classifier is trained on images of different types (film, FFDM, or both). Results: For all the features considered, the quantitative results show a high degree of correlation between features extracted from film and FFDM, with the correlation coefficients ranging from 0.7326 to 0.9602 for the different features. Based on a Fisher sign rank test, there was no significant difference observed between the features extracted from film and those from FFDM. For both MC detection and discrimination of FPs from MCs, FFDM had a slight but statistically significant advantage in performance; however, when the classifiers were trained on different types of images (acquired with FFDM or SFM) for discriminating MCs from FPs, there was little difference. Conclusions: The results indicate good agreement between film and FFDM in quantitative image features. While FFDM images provide better detection performance in MCs, FFDM and film images may be interchangeable for the purposes of training CAD algorithms, and a single CAD algorithm may be applied to either type of images. PMID:22830771

  12. Multivariate Qst–Fst Comparisons: A Neutrality Test for the Evolution of the G Matrix in Structured Populations

    PubMed Central

    Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme

    2008-01-01

    Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
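
    Restating the neutral expectation quoted above in display form (directly from the relation given in the abstract, with G and D the within- and among-population covariance matrices):

    ```latex
    \mathbf{D} \;=\; \frac{2\,F_{ST}}{1 - F_{ST}}\,\mathbf{G}
    ```

    Hence the dual test: (i) D and G are proportional, and (ii) the estimated proportionality coefficient equals 2F_ST/(1 - F_ST).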

  13. [The development of a computer model in the quantitative assessment of thallium-201 myocardial scintigraphy].

    PubMed

    Raineri, M; Traina, M; Rotolo, A; Candela, B; Lombardo, R M; Raineri, A A

    1993-05-01

    Thallium-201 scintigraphy is a widely used noninvasive procedure for the detection and prognostic assessment of patients with suspected or proven coronary artery disease. Thallium uptake can be evaluated by visual analysis or by quantitative interpretation. Quantitative scintigraphy enhances disease detection in individual coronary arteries and provides a more precise estimate of the amount of ischemic myocardium, distinguishing scar from hypoperfused tissue. Because of the large amount of data, the analysis, interpretation and comparison of thallium uptake can be very complex. We designed a computer-based system for the interpretation of quantitative thallium-201 scintigraphy uptake data. We used a database (DataEase 4.2-DataEase Italia). Our software has the following functions: data storage; calculation; conversion of numerical data into different definitions classifying myocardial perfusion; uptake data comparison; automatic conclusions; comparison of different scintigrams for the same patient. Our software is made up of 4 sections: numeric analysis, descriptive analysis, automatic conclusion, and clinical remarks. We introduced into the computer system appropriate information, "logical paths", that use "IF ... THEN" rules. The software executes these rules in order to analyze the myocardial regions in the 3 phases of scintigraphic analysis (stress, redistribution, re-injection) and in the 3 projections (LAO 45 degrees, LAT, ANT), considering our uptake cutoff and finally producing the automatic conclusions. For these reasons, our computer-based system could be considered a real "expert system".
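
    As a schematic illustration of the "IF ... THEN" rules described above (hypothetical cutoff value and labels, not the actual DataEase rule base):

    ```python
    def classify_segment(stress_uptake: float, redistribution_uptake: float,
                         reinjection_uptake: float, cutoff: float = 0.7) -> str:
        """Toy rule-based interpretation of normalized thallium uptake in one
        myocardial segment across the three acquisition phases. The cutoff and
        labels are invented for illustration."""
        if stress_uptake >= cutoff:
            return "normal perfusion"
        if redistribution_uptake >= cutoff or reinjection_uptake >= cutoff:
            return "reversible defect (hypoperfused but viable tissue)"
        return "fixed defect (scar)"

    print(classify_segment(0.55, 0.75, 0.80))  # -> reversible defect (...)
    ```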

  14. Comparison of multipoint linkage analyses for quantitative traits in the CEPH data: parametric LOD scores, variance components LOD scores, and Bayes factors.

    PubMed

    Sung, Yun Ju; Di, Yanming; Fu, Audrey Q; Rothstein, Joseph H; Sieh, Weiva; Tong, Liping; Thompson, Elizabeth A; Wijsman, Ellen M

    2007-01-01

    We performed multipoint linkage analyses with multiple programs and models for several gene expression traits in the Centre d'Etude du Polymorphisme Humain families. All analyses provided consistent results for both peak location and shape. Variance-components (VC) analysis gave wider peaks and Bayes factors gave fewer peaks. Among programs from the MORGAN package, lm_multiple performed better than lm_markers, resulting in less Markov-chain Monte Carlo (MCMC) variability between runs, and the program lm_twoqtl provided higher LOD scores by also including either a polygenic component or an additional quantitative trait locus.

  15. Comparison of multipoint linkage analyses for quantitative traits in the CEPH data: parametric LOD scores, variance components LOD scores, and Bayes factors

    PubMed Central

    Sung, Yun Ju; Di, Yanming; Fu, Audrey Q; Rothstein, Joseph H; Sieh, Weiva; Tong, Liping; Thompson, Elizabeth A; Wijsman, Ellen M

    2007-01-01

    We performed multipoint linkage analyses with multiple programs and models for several gene expression traits in the Centre d'Etude du Polymorphisme Humain families. All analyses provided consistent results for both peak location and shape. Variance-components (VC) analysis gave wider peaks and Bayes factors gave fewer peaks. Among programs from the MORGAN package, lm_multiple performed better than lm_markers, resulting in less Markov-chain Monte Carlo (MCMC) variability between runs, and the program lm_twoqtl provided higher LOD scores by also including either a polygenic component or an additional quantitative trait locus. PMID:18466597

  16. Perceived mood, health, and burden in female Mexican American family cancer caregivers.

    PubMed

    Wells, Jo Nell; Cagle, Carolyn Spence; Marshall, David; Hollen, Mary Luna

    2009-07-01

    Female family caregivers of various global cultures provide basic care in health, social, emotional, and financial domains for family members with cancer and may sacrifice their own health to do so. To learn about role-related mood, health status self-perceptions, and burden of one cultural group, we used qualitative and quantitative approaches to study 34 Mexican American (MA) women who provided care for an ill family member with cancer. We report quantitative data on study variables and make comparisons with caregiver qualitative reports. Implications for health planning, service delivery, and future research with underserved, minority female caregivers are presented.

  17. Accuracy and precision of pseudo-continuous arterial spin labeling perfusion during baseline and hypercapnia: a head-to-head comparison with ¹⁵O H₂O positron emission tomography.

    PubMed

    Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J

    2014-05-15

    Measurements of the cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure the CBF, whereas additional hypercapnic pCASL measurements are currently showing great promise to quantitatively assess the CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of the exact accuracy and precision compared to the gold standard. (15)O H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is one of the more invasive methods as well. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with (15)O H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to (15)O H₂O PET with a comparable precision. These results pave the way for quantitative usage of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. A quantitative comparison of corrective and perfective maintenance

    NASA Technical Reports Server (NTRS)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  19. Guidance for Product Category Rule Development, Version 1.0

    EPA Science Inventory

    Environmental claims based on life cycle assessment (LCA) can provide quantitative, full life cycle information on products in a format that can permit comparisons and thereby inform purchasing decisions. In recent years, a number of standards and guides have emerged for making b...

  20. Quantitative Comparison and Metabolite Profiling of Saponins in Different Parts of the Root of Panax notoginseng

    PubMed Central

    2015-01-01

    Although both rhizome and root of Panax notoginseng are officially utilized as notoginseng in “Chinese Pharmacopoeia”, individual parts of the root were differently used in practice. To provide chemical evidence for the differentiated usage, quantitative comparison and metabolite profiling of different portions derived from the whole root, as well as commercial samples, were carried out, showing an overall higher content of saponins in rhizome, followed by main root, branch root, and fibrous root. Ginsenoside Rb2 was proposed as a potential marker with a content of 0.5 mg/g as a threshold value for differentiating rhizome from other parts. Multivariate analysis of the metabolite profile further suggested 32 saponins as potential markers for the discrimination of different parts of notoginseng. Collectively, the study provided comprehensive chemical evidence for the distinct usage of different parts of notoginseng and, hence, is of great importance for the rational application and exploitation of individual parts of notoginseng. PMID:25118819

  1. Comparison of methodologic quality and study/report characteristics between quantitative clinical nursing and nursing education research articles.

    PubMed

    Schneider, Barbara St Pierre; Nicholas, Jennifer; Kurrus, Jeffrey E

    2013-01-01

    To compare the methodologic quality and study/report characteristics between quantitative clinical nursing and nursing education research articles. The methodologic quality of quantitative nursing education research needs to advance to a higher level. Clinical research can provide guidance for nursing education to reach this level. One hundred quantitative clinical research articles from high-impact journals published in 2007 and 37 education research articles from high-impact journals published in 2006 to 2007 were chosen for analysis. Clinical articles had significantly higher quality scores than education articles in three domains: number of institutions studied, type of data, and outcomes. The findings indicate three ways in which nursing education researchers can strengthen the methodologic quality of their quantitative research. With this approach, greater funding may be secured for advancing the science of nursing education.

  2. A Method to Measure and Estimate Normalized Contrast in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

    The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, as are methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape and test object are imaged simultaneously. Methods of assessing other quantitative parameters such as the emissivity of the object, afterglow heat flux, reflection temperature change and surface temperature during flash thermography are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.

  3. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods and interpretation of the results obtained by these methods shall be connected to the education background. In this connecting process, the issues of educational models are often raised. Many widely used statistical methods do not make assumptions on the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are also other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumption and parameter estimation, and are complicated mathematically. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods under physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. The dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. The two theories are both applied to Force Concept Inventory data obtained from students enrolled in The Ohio State University. Effort was made to examine the similarity and difference between the two theories, and the possible explanation to the difference. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on the measure of association for binary data. In quantitative assessment, binary data is often encountered because of its simplicity. The current popular measures of association fail under some extremely unbalanced conditions. However, the occurrence of these conditions is not rare in educational data. Two popular association measures, the Pearson's correlation and the tetrachoric correlation are examined. A new method, model based association is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. 
Results show that all the methods have their advantages and disadvantages. Special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed. Typical misunderstanding and misusage of EFA are explored. The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment on scientific reasoning skills. The reasoning ability structures for U.S. and Chinese students at different educational levels are given by the analysis. A final discussion on the advanced quantitative assessment methodology and the pure mathematical methodology is presented at the end.
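
    As a small illustration of the classical-test-theory side of this comparison (an invented 0/1 response matrix, not the Force Concept Inventory data analyzed in the dissertation), item difficulty, item-total discrimination, and the Pearson/phi association between two binary items can be computed directly:

    ```python
    import numpy as np

    # Hypothetical 0/1 response matrix: rows = students, columns = items.
    responses = np.array([
        [1, 1, 0, 1],
        [0, 1, 0, 0],
        [1, 1, 1, 1],
        [1, 0, 0, 1],
        [0, 0, 0, 1],
        [1, 1, 1, 0],
    ])

    # Classical test theory item statistics
    difficulty = responses.mean(axis=0)          # proportion correct per item
    total = responses.sum(axis=1)
    discrimination = np.array([                  # corrected item-total (point-biserial) correlation
        np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
        for j in range(responses.shape[1])
    ])

    # The Pearson correlation between two binary items equals the phi coefficient,
    # one of the association measures for binary data discussed above.
    phi_01 = np.corrcoef(responses[:, 0], responses[:, 1])[0, 1]

    print("difficulty:", np.round(difficulty, 2))
    print("discrimination:", np.round(discrimination, 2))
    print("phi(item0, item1):", round(phi_01, 2))
    ```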

  4. Science & Engineering Indicators--1989.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. National Science Board.

    This volume is the ninth in the biennial "Science Indicators" series initiated by the National Science Board. The series provides a broad base of quantitative information about the structure and function of United States science and technology and comparisons with other advanced industrial countries. An overview of science and technology…

  5. In Vitro Comparison of Adipokine Export Signals.

    PubMed

    Sharafi, Parisa; Kocaefe, Y Çetin

    2016-01-01

    Mammalian cells are widely used for recombinant protein production in research and biotechnology. The utilization of export signals significantly facilitates production and purification processes. Thirty-five years after the discovery of the mammalian export machinery, there are still uncertainties regarding the efficiency of export signals. The aim of this study was the comparative evaluation of the efficiency of selected export signals using adipocytes as a cell model. Adipocytes have a large capacity for protein secretion, including several enzymes, adipokines, and other signaling molecules, providing a valid system for a quantitative evaluation. Constructs bearing N-terminal fusion export signals were generated to express Enhanced Green Fluorescence Protein (EGFP) as a reporter for quantitative and qualitative evaluation. Furthermore, fluorescence microscopy was used to trace the intracellular traffic of the reporter. The export efficiency of six selected proteins secreted from adipocytes was evaluated. Quantitative comparison of intracellular and exported fractions of the recombinant constructs demonstrated a similar efficiency among the studied sequences, with minor variations. The export signal of Retinol Binding Protein (RBP4) exhibited the highest efficiency. This study presents the first quantitative data showing variations among export signals in adipocytes, which will help in the optimization of recombinant protein distribution.

  6. Developing a Multiplexed Quantitative Cross-Linking Mass Spectrometry Platform for Comparative Structural Analysis of Protein Complexes.

    PubMed

    Yu, Clinton; Huszagh, Alexander; Viner, Rosa; Novitsky, Eric J; Rychnovsky, Scott D; Huang, Lan

    2016-10-18

    Cross-linking mass spectrometry (XL-MS) represents a recently popularized hybrid methodology for defining protein-protein interactions (PPIs) and analyzing structures of large protein assemblies. In particular, XL-MS strategies have been demonstrated to be effective in elucidating molecular details of PPIs at the peptide resolution, providing a complementary set of structural data that can be utilized to refine existing complex structures or direct de novo modeling of unknown protein structures. To study structural and interaction dynamics of protein complexes, quantitative cross-linking mass spectrometry (QXL-MS) strategies based on isotope-labeled cross-linkers have been developed. Although successful, these approaches are mostly limited to pairwise comparisons. In order to establish a robust workflow enabling comparative analysis of multiple cross-linked samples simultaneously, we have developed a multiplexed QXL-MS strategy, namely, QMIX (Quantitation of Multiplexed, Isobaric-labeled cross (X)-linked peptides) by integrating MS-cleavable cross-linkers with isobaric labeling reagents. This study has established a new analytical platform for quantitative analysis of cross-linked peptides, which can be directly applied for multiplexed comparisons of the conformational dynamics of protein complexes and PPIs at the proteome scale in future studies.

  7. KEY COMPARISON: CCQM-K61: Quantitation of a linearised plasmid DNA, based on a matched standard in a matrix of non-target DNA

    NASA Astrophysics Data System (ADS)

    Woolford, Alison; Holden, Marcia; Salit, Marc; Burns, Malcolm; Ellison, Stephen L. R.

    2009-01-01

    Key comparison CCQM-K61 was performed to demonstrate and document the capability of interested national metrology institutes in the determination of the quantity of a specific DNA target in an aqueous solution. The study provides support for the following measurement claim: "Quantitation of a linearised plasmid DNA, based on a matched standard in a matrix of non-target DNA". The comparison was an activity of the Bioanalysis Working Group (BAWG) of the Comité Consultatif pour la Quantité de Matière and was coordinated by NIST (Gaithersburg, USA) and LGC (Teddington, UK). The following laboratories (in alphabetical order) participated in this key comparison: DMSC (Thailand); IRMM (European Union); KRISS (Republic of Korea); LGC (UK); NIM (China); NIST (USA); NMIA (Australia); NMIJ (Japan); VNIIM (Russian Federation). Good agreement was observed between the reported results of all nine of the participants. Uncertainty estimates did not account fully for the dispersion of results, even after allowance for possible inhomogeneity in calibration materials. Preliminary studies suggest that the effects of fluorescence threshold setting might contribute to the excess dispersion, and further study of this topic is suggested. The final report appears in Appendix B of the BIPM key comparison database (kcdb.bipm.org) and has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).

  8. Effect of vision, touch and stance on cerebellar vermian-related sway and tremor: a quantitative physiological and MRI study.

    PubMed

    Sullivan, Edith V; Rose, Jessica; Pfefferbaum, Adolf

    2006-08-01

    Postural balance is impaired in individuals with pathology of the anterior superior vermis of the cerebellum. Chronic alcoholism, with its known vermian pathology, provides a viable model for studying the relationship between cerebellar pathology and postural stability. Decades of separate study of recovering alcoholics and post-mortem neuroanatomical analysis have demonstrated vermian pathology, but few studies have used quantitative posturography, acquired concurrently with quantitative neuroimaging, to establish whether this brain structure-function relationship is selective in vivo. Here, 30 healthy men and 39 chronic alcoholic men, abstinent from alcohol for several months, underwent MRI for volumetric quantitation of the cerebellar vermis and three comparison brain regions: the cerebellar hemispheres, supratentorial cortex and corpus callosum. All subjects also participated in an experiment involving a force platform that measured sway path length and tremor during static standing balance under four sensory conditions and two stance conditions. Three novel findings emerged: (i) sway path length, a physiological index of postural control, was selectively related to volume of the cerebellar vermis and not to any comparison brain region in the alcoholics; (ii) spectral analysis revealed sway prominence in the 2-5 Hz band, another physiological sign of vermian lesions and also selectively related to vermian volume in the alcoholics; and (iii) despite substantial postural sway in the patients, they successfully used vision, touch and stance to normalize sway and reduce tremor. The selective relationship of sway path to vermian but not lateral cerebellar volume provides correlational evidence for functional differentiation of these cerebellar regions. Improvement to virtually normal levels in balance and reduction in sway and tremor with changes in vision, touch and stance provide evidence that adaptive mechanisms recruiting sensorimotor integration can be invoked to compensate for underlying cerebellar vermian-related dysfunction.
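
    As a rough illustration of the posturographic quantities discussed above (sway path length and spectral power in the 2-5 Hz band), the following sketch uses synthetic center-of-pressure traces and a Welch periodogram; the sampling rate, trial length, and data are assumptions, not values from the study.

```python
# Hedged sketch (assumed data layout): sway path length and 2-5 Hz spectral power
# from center-of-pressure (COP) coordinates sampled by a force platform.
import numpy as np
from scipy.signal import welch

fs = 100.0                                # assumed sampling rate in Hz
t = np.arange(0, 30, 1 / fs)              # 30 s quiet-standing trial
rng = np.random.default_rng(1)
cop_x = np.cumsum(rng.normal(0, 0.02, t.size))   # synthetic COP traces (cm)
cop_y = np.cumsum(rng.normal(0, 0.02, t.size))

# Sway path length: total distance travelled by the COP.
path_length = np.sum(np.hypot(np.diff(cop_x), np.diff(cop_y)))

# Power in the 2-5 Hz band, where vermian-related tremor is reported.
freqs, psd = welch(cop_x - cop_x.mean(), fs=fs, nperseg=1024)
band = (freqs >= 2.0) & (freqs <= 5.0)
band_power = np.trapz(psd[band], freqs[band])

print(f"sway path length: {path_length:.1f} cm")
print(f"2-5 Hz band power: {band_power:.4f} cm^2")
```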

  9. On Quantitative Comparative Research in Communication and Language Evolution

    PubMed Central

    Oller, D. Kimbrough; Griebel, Ulrike

    2014-01-01

    Quantitative comparison of human language and natural animal communication requires improved conceptualizations. We argue that an infrastructural approach to development and evolution incorporating an extended interpretation of the distinctions among illocution, perlocution, and meaning (Austin 1962; Oller and Griebel 2008) can help place the issues relevant to quantitative comparison in perspective. The approach can illuminate the controversy revolving around the notion of functional referentiality as applied to alarm calls, for example in the vervet monkey. We argue that referentiality offers a poor point of quantitative comparison across language and animal communication in the wild. Evidence shows that even newborn human cry could be deemed to show functional referentiality according to the criteria typically invoked by advocates of referentiality in animal communication. Exploring the essence of the idea of illocution, we illustrate an important realm of commonality among animal communication systems and human language, a commonality that opens the door to more productive, quantifiable comparisons. Finally, we delineate two examples of infrastructural communicative capabilities that should be particularly amenable to direct quantitative comparison across humans and our closest relatives. PMID:25285057

  10. On Quantitative Comparative Research in Communication and Language Evolution.

    PubMed

    Oller, D Kimbrough; Griebel, Ulrike

    2014-09-01

    Quantitative comparison of human language and natural animal communication requires improved conceptualizations. We argue that an infrastructural approach to development and evolution incorporating an extended interpretation of the distinctions among illocution, perlocution, and meaning (Austin 1962; Oller and Griebel 2008) can help place the issues relevant to quantitative comparison in perspective. The approach can illuminate the controversy revolving around the notion of functional referentiality as applied to alarm calls, for example in the vervet monkey. We argue that referentiality offers a poor point of quantitative comparison across language and animal communication in the wild. Evidence shows that even newborn human cry could be deemed to show functional referentiality according to the criteria typically invoked by advocates of referentiality in animal communication. Exploring the essence of the idea of illocution, we illustrate an important realm of commonality among animal communication systems and human language, a commonality that opens the door to more productive, quantifiable comparisons. Finally, we delineate two examples of infrastructural communicative capabilities that should be particularly amenable to direct quantitative comparison across humans and our closest relatives.

  11. Test-Analysis Correlation for Space Shuttle External Tank Foam Impacting RCC Wing Leading Edge Component Panels

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.

    2008-01-01

    The Space Shuttle Columbia Accident Investigation Board recommended that NASA develop, validate, and maintain a modeling tool capable of predicting the damage threshold for debris impacts on the Space Shuttle Reinforced Carbon-Carbon (RCC) wing leading edge and nosecap assembly. The results presented in this paper are one part of a multi-level approach that supported the development of the predictive tool used to recertify the shuttle for flight following the Columbia Accident. The assessment of predictive capability was largely based on test analysis comparisons for simpler component structures. This paper provides comparisons of finite element simulations with test data for external tank foam debris impacts onto 6-in. square RCC flat panels. Both quantitative displacement and qualitative damage assessment correlations are provided. The comparisons show good agreement and provided the Space Shuttle Program with confidence in the predictive tool.

  12. OdorMapComparer: an application for quantitative analyses and comparisons of fMRI brain odor maps.

    PubMed

    Liu, Nian; Xu, Fuqiang; Miller, Perry L; Shepherd, Gordon M

    2007-01-01

    Brain odor maps are reconstructed flat images that describe the spatial activity patterns in the glomerular layer of the olfactory bulbs in animals exposed to different odor stimuli. We have developed a software application, OdorMapComparer, to carry out quantitative analyses and comparisons of the fMRI odor maps. This application is an open-source Windows program that first loads two odor map images being compared. It allows image transformations including scaling, flipping, rotating, and warping so that the two images can be appropriately aligned to each other. It performs simple subtraction, addition, and averaging of signals in the two images. It also provides comparative statistics including the normalized correlation (NC) and spatial correlation coefficient. Experimental studies showed that the rodent fMRI odor maps for aliphatic aldehydes displayed spatial activity patterns that are similar in gross outlines but somewhat different in specific subregions. Analyses with OdorMapComparer indicate that the similarity between odor maps decreases with increasing difference in the length of carbon chains. For example, the map of butanal is more closely related to that of pentanal (NC = 0.617) than to that of octanal (NC = 0.082), which is consistent with animal behavioral studies. The study also indicates that fMRI odor maps are statistically odor-specific and repeatable across both the intra- and intersubject trials. OdorMapComparer thus provides a tool for quantitative, statistical analyses and comparisons of fMRI odor maps in a fashion that is integrated with the overall odor mapping techniques.
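
    A minimal sketch of the normalized correlation (NC) statistic mentioned above is given below; it assumes the two odor maps have already been aligned to a common pixel grid, and the synthetic maps stand in for real fMRI data.

```python
# Minimal sketch of the normalized correlation (NC) statistic described above,
# assuming two odor maps have already been aligned to the same pixel grid.
import numpy as np

def normalized_correlation(map_a, map_b):
    """Mean-subtracted, norm-scaled correlation between two aligned odor maps."""
    a = map_a.astype(float) - map_a.mean()
    b = map_b.astype(float) - map_b.mean()
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Synthetic example: two maps sharing a gross pattern plus independent noise.
rng = np.random.default_rng(2)
base = rng.random((64, 128))
map_butanal = base + 0.2 * rng.random((64, 128))
map_pentanal = base + 0.2 * rng.random((64, 128))
print("NC =", round(normalized_correlation(map_butanal, map_pentanal), 3))
```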

  13. Systematic Comparison of Label-Free, Metabolic Labeling, and Isobaric Chemical Labeling for Quantitative Proteomics on LTQ Orbitrap Velos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhou; Adams, Rachel M; Chourey, Karuna

    2012-01-01

    A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially the quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification. Isobaric chemical labeling surpasses metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.

  14. 78 FR 72891 - Agency Information Collection Activities; Submission to OMB for Review and Approval; Public...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-04

    ...' preferences about specific long-term care insurance features. In the DCE, respondents will complete a series... of hypothetical comparisons provide quantitative data on the relative preferences and importance of... research study, the study purpose, procedures, duration of the survey, possible risks or discomforts from...

  15. Comparison of Transformational Leadership Practices: Implications for School Districts and Principal Preparation

    ERIC Educational Resources Information Center

    Quin, Jeff; Deris, Aaron; Bischoff, Greg; Johnson, James T.

    2015-01-01

    The purpose of this study was to determine the leadership practices needed to improve academic achievement and generate positive change in school organizations. The study was also conducted to provide insight to principal preparation programs and school districts about effective transformational leadership practices. A quantitative research method…

  16. Investigating Methodological Differences in the Assessment of Dendritic Morphology of Basolateral Amygdala Principal Neurons-A Comparison of Golgi-Cox and Neurobiotin Electroporation Techniques.

    PubMed

    Klenowski, Paul M; Wright, Sophie E; Mu, Erica W H; Noakes, Peter G; Lavidis, Nickolas A; Bartlett, Selena E; Bellingham, Mark C; Fogarty, Matthew J

    2017-12-19

    Quantitative assessments of neuronal subtypes in numerous brain regions show large variations in dendritic arbor size. A critical experimental factor is the method used to visualize neurons. We chose to investigate quantitative differences in basolateral amygdala (BLA) principal neuron morphology using two of the most common visualization methods: Golgi-Cox staining and neurobiotin (NB) filling. We show in 8-week-old Wistar rats that NB-filling reveals significantly larger dendritic arbors and different spine densities, compared to Golgi-Cox-stained BLA neurons. Our results demonstrate important differences and provide methodological insights into quantitative disparities of BLA principal neuron morphology reported in the literature.

  17. Abundances of Neutral and Ionized PAH Along The Lines-of-Sight of Diffuse and Translucent Interstellar Clouds

    NASA Technical Reports Server (NTRS)

    Salama, Farid; Galazutdinov, Gazinur; Krelowski, Jacek; Biennier, Ludovic; Beletsky, Yuri; Song, In-Ok

    2013-01-01

    The spectra of neutral and ionized PAHs isolated in the gas phase at low temperature have been measured in the laboratory under conditions that mimic interstellar conditions and are compared with a set of astronomical spectra of reddened, early type stars. The comparisons of astronomical and laboratory data provide upper limits for the abundances of neutral PAH molecules and ions along specific lines-of-sight, something that is not attainable from infrared observations. We present the characteristics of the laboratory facility (COSmIC) that was developed for this study and discuss the findings resulting from the comparison of the laboratory data with high resolution, high S/N ratio astronomical observations. COSmIC combines a supersonic jet expansion with discharge plasma and cavity ringdown spectroscopy and provides experimental conditions that closely mimic the interstellar conditions. The column densities of the individual PAH molecules and ions probed in these surveys are derived from the comparison of the laboratory data with high resolution, high S/N ratio astronomical observations. The comparisons of astronomical and laboratory data lead to clear conclusions regarding the expected abundances for PAHs in the interstellar environments probed in the surveys. Band profile comparisons between laboratory and astronomical spectra lead to information regarding the molecular structures and characteristics associated with the DIB carriers in the corresponding lines-of-sight. These quantitative surveys of neutral and ionized PAHs in the optical range open the way for quantitative searches of PAHs and complex organics in a variety of interstellar and circumstellar environments.

  18. Quantitative laser speckle flowmetry of the in vivo microcirculation using sidestream dark field microscopy

    PubMed Central

    Nadort, Annemarie; Woolthuis, Rutger G.; van Leeuwen, Ton G.; Faber, Dirk J.

    2013-01-01

    We present integrated Laser Speckle Contrast Imaging (LSCI) and Sidestream Dark Field (SDF) flowmetry to provide real-time, non-invasive and quantitative measurements of speckle decorrelation times related to microcirculatory flow. Using a multi exposure acquisition scheme, precise speckle decorrelation times were obtained. Applying SDF-LSCI in vitro and in vivo allows direct comparison between speckle contrast decorrelation and flow velocities, while imaging the phantom and microcirculation architecture. This resulted in a novel analysis approach that distinguishes decorrelation due to flow from other additive decorrelation sources. PMID:24298399
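
    For orientation, the sketch below computes the basic laser speckle contrast quantity (standard deviation divided by mean within a local window) that underlies LSCI flow maps; the multi-exposure fitting used in the paper to extract decorrelation times is not reproduced, and the speckle frame is synthetic.

```python
# Hedged sketch: local speckle contrast K = sigma/mean over a sliding window,
# the basic quantity behind LSCI flow maps (the paper's multi-exposure
# decorrelation analysis is not reproduced here).
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw, window=7):
    """Return the local contrast image K = std/mean over a square window."""
    mean = uniform_filter(raw.astype(float), size=window)
    mean_sq = uniform_filter(raw.astype(float) ** 2, size=window)
    var = np.clip(mean_sq - mean ** 2, 0, None)
    return np.sqrt(var) / np.maximum(mean, 1e-12)

rng = np.random.default_rng(3)
frame = rng.gamma(shape=4.0, scale=50.0, size=(256, 256))  # synthetic speckle frame
K = speckle_contrast(frame)
print("mean contrast:", round(float(K.mean()), 3))
```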

  19. Comparison of GEANT4 very low energy cross section models with experimental data in water.

    PubMed

    Incerti, S; Ivanchenko, A; Karamitros, M; Mantero, A; Moretto, P; Tran, H N; Mascialino, B; Champion, C; Ivanchenko, V N; Bernal, M A; Francis, Z; Villagrasa, C; Baldacchin, G; Guèye, P; Capra, R; Nieminen, P; Zacharatou, C

    2010-09-01

    The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. The GEANT4-DNA physics models available in the GEANT4 toolkit have been compared in this article to available experimental data in the water vapor phase as well as to several published recommendations on the mass stopping power. These models represent a first step in the extension of the GEANT4 Monte Carlo toolkit to the simulation of biological effects of ionizing radiation.
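
    The statistical comparison described above relies in part on the Kolmogorov-Smirnov test. The fragment below is an illustrative stand-in, not the authors' dedicated statistical toolkit: it applies a two-sample KS test to hypothetical simulated and measured cross-section values.

```python
# Illustrative sketch: a two-sample Kolmogorov-Smirnov test comparing simulated
# and measured cross-section values, as one example of the kind of
# goodness-of-fit statistic mentioned above.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(4)
simulated_xs = rng.lognormal(mean=0.0, sigma=0.3, size=200)   # hypothetical model values
measured_xs = rng.lognormal(mean=0.05, sigma=0.3, size=150)   # hypothetical experimental values

statistic, p_value = ks_2samp(simulated_xs, measured_xs)
print(f"KS statistic = {statistic:.3f}, p-value = {p_value:.3f}")
# A small p-value would flag a statistically significant discrepancy between
# the model and the reference data set.
```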

  20. Quantitative force measurements in liquid using frequency modulation atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Uchihashi, Takayuki; Higgins, Michael J.; Yasuda, Satoshi; Jarvis, Suzanne P.; Akita, Seiji; Nakayama, Yoshikazu; Sader, John E.

    2004-10-01

    The measurement of short-range forces with the atomic force microscope (AFM) typically requires implementation of dynamic techniques to maintain sensitivity and stability. While frequency modulation atomic force microscopy (FM-AFM) is used widely for high-resolution imaging and quantitative force measurements in vacuum, quantitative force measurements using FM-AFM in liquids have proven elusive. Here we demonstrate that the formalism derived for operation in vacuum can also be used in liquids, provided certain modifications are implemented. To facilitate comparison with previous measurements taken using surface forces apparatus, we choose a model system (octamethylcyclotetrasiloxane) that is known to exhibit short-ranged structural ordering when confined between two surfaces. Force measurements obtained are found to be in excellent agreement with previously reported results. This study therefore establishes FM-AFM as a powerful tool for the quantitative measurement of forces in liquid.
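
    For context, the standard small-amplitude FM-AFM relation (a textbook result, not a formula quoted from the paper above) connects the measured frequency shift to the tip-sample force gradient:

```latex
% Small-amplitude FM-AFM approximation: frequency shift vs. force gradient.
\Delta f \;\approx\; -\frac{f_0}{2k}\,\frac{\partial F_{\mathrm{ts}}}{\partial z}
\qquad\Longrightarrow\qquad
\frac{\partial F_{\mathrm{ts}}}{\partial z} \;\approx\; -\frac{2k\,\Delta f}{f_0}
```

    Here f_0 is the cantilever resonance frequency, k its spring constant, and F_ts the tip-sample force; quantitative force recovery at larger oscillation amplitudes requires a full inversion formalism of the kind originally developed for vacuum operation.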

  1. Investing in Education: Analysis of the 1999 World Education Indicators. Education and Skills.

    ERIC Educational Resources Information Center

    Organisation for Economic Cooperation and Development, Paris (France).

    This Organisation for Economic Cooperation and Development report documents the growing demand for learning around the world. A quantitative description of the functioning of education systems allows for international comparisons and the identification of the strengths and weaknesses of various approaches to providing quality education. Chapter 1,…

  2. A Comparison of Community College Full-Time and Adjunct Faculties' Perceptions of Factors Associated with Grade Inflation

    ERIC Educational Resources Information Center

    Schutz, Kelly R.; Drake, Brent M.; Lessner, Janet; Hughes, Gail F.

    2015-01-01

    Grades historically have indicated student performance in college. Previous studies in the higher education literature, primarily conducted at four-year teaching institutions, have suggested reasons for grade inflation but have provided little supporting empirical data. This quantitative, non-experimental, comparative study used survey research to…

  3. Implications of the Java language on computer-based patient records.

    PubMed

    Pollard, D; Kucharz, E; Hammond, W E

    1996-01-01

    The growth of the utilization of the World Wide Web (WWW) as a medium for the delivery of computer-based patient records (CBPR) has created a new paradigm in which clinical information may be delivered. Until recently the authoring tools and environment for application development on the WWW have been limited to Hyper Text Markup Language (HTML) utilizing common gateway interface scripts. While, at times, this provides an effective medium for the delivery of CBPR, it is a less than optimal solution. The server-centric dynamics and low levels of interactivity do not provide for a robust application which is required in a clinical environment. The emergence of Sun Microsystems' Java language is a solution to the problem. In this paper we examine the Java language and its implications to the CBPR. A quantitative and qualitative assessment was performed. The Java environment is compared to HTML and Telnet CBPR environments. Qualitative comparisons include level of interactivity, server load, client load, ease of use, and application capabilities. Quantitative comparisons include data transfer time delays. The Java language has demonstrated promise for delivering CBPRs.

  4. Quantitative comparison of a human cancer cell surface proteome between interphase and mitosis.

    PubMed

    Özlü, Nurhan; Qureshi, Mohammad H; Toyoda, Yusuke; Renard, Bernhard Y; Mollaoglu, Gürkan; Özkan, Nazlı E; Bulbul, Selda; Poser, Ina; Timm, Wiebke; Hyman, Anthony A; Mitchison, Timothy J; Steen, Judith A

    2015-01-13

    The cell surface is the cellular compartment responsible for communication with the environment. The interior of mammalian cells undergoes dramatic reorganization when cells enter mitosis. These changes are triggered by activation of the CDK1 kinase and have been studied extensively. In contrast, very little is known of the cell surface changes during cell division. We undertook a quantitative proteomic comparison of cell surface-exposed proteins in human cancer cells that were tightly synchronized in mitosis or interphase. Six hundred and twenty-eight surface and surface-associated proteins in HeLa cells were identified; of these, 27 were significantly enriched at the cell surface in mitosis and 37 in interphase. Using imaging techniques, we confirmed the mitosis-selective cell surface localization of protocadherin PCDH7, a member of a family with anti-adhesive roles in embryos. We show that PCDH7 is required for development of full mitotic rounding pressure at the onset of mitosis. Our analysis provided basic information on how cell cycle progression affects the cell surface. It also provides potential pharmacodynamic biomarkers for anti-mitotic cancer chemotherapy. © 2014 The Authors.

  5. Quantitative comparison of a human cancer cell surface proteome between interphase and mitosis

    PubMed Central

    Özlü, Nurhan; Qureshi, Mohammad H; Toyoda, Yusuke; Renard, Bernhard Y; Mollaoglu, Gürkan; Özkan, Nazlı E; Bulbul, Selda; Poser, Ina; Timm, Wiebke; Hyman, Anthony A; Mitchison, Timothy J; Steen, Judith A

    2015-01-01

    The cell surface is the cellular compartment responsible for communication with the environment. The interior of mammalian cells undergoes dramatic reorganization when cells enter mitosis. These changes are triggered by activation of the CDK1 kinase and have been studied extensively. In contrast, very little is known of the cell surface changes during cell division. We undertook a quantitative proteomic comparison of cell surface-exposed proteins in human cancer cells that were tightly synchronized in mitosis or interphase. Six hundred and twenty-eight surface and surface-associated proteins in HeLa cells were identified; of these, 27 were significantly enriched at the cell surface in mitosis and 37 in interphase. Using imaging techniques, we confirmed the mitosis-selective cell surface localization of protocadherin PCDH7, a member of a family with anti-adhesive roles in embryos. We show that PCDH7 is required for development of full mitotic rounding pressure at the onset of mitosis. Our analysis provided basic information on how cell cycle progression affects the cell surface. It also provides potential pharmacodynamic biomarkers for anti-mitotic cancer chemotherapy. PMID:25476450

  6. Comparison of two laboratory-based systems for evaluation of halos in intraocular lenses

    PubMed Central

    Alexander, Elsinore; Wei, Xin; Lee, Shinwook

    2018-01-01

    Purpose: Multifocal intraocular lenses (IOLs) can be associated with unwanted visual phenomena, including halos. Predicting potential for halos is desirable when designing new multifocal IOLs. Halo images from 6 IOL models were compared using the Optikos modulation transfer function bench system and a new high dynamic range (HDR) system. Materials and methods: One monofocal, 1 extended depth of focus, and 4 multifocal IOLs were evaluated. An off-the-shelf optical bench was used to simulate a distant (>50 m) car headlight and record images. A custom HDR system was constructed using an imaging photometer to simulate headlight images and to measure quantitative halo luminance data. A metric was developed to characterize halo luminance properties. Clinical relevance was investigated by correlating halo measurements to visual outcomes questionnaire data. Results: The Optikos system produced halo images useful for visual comparisons; however, measurements were relative and not quantitative. The HDR halo system provided objective and quantitative measurements used to create a metric from the area under the curve (AUC) of the logarithmic normalized halo profile. This proposed metric differentiated between IOL models, and linear regression analysis found strong correlations between AUC and subjective clinical ratings of halos. Conclusion: The HDR system produced quantitative, preclinical metrics that correlated to patients’ subjective perception of halos. PMID:29503526
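
    As a hedged sketch of the kind of metric described (the area under the curve of a logarithmic, normalized halo profile), the code below integrates the log of a peak-normalized radial luminance profile; the profiles are synthetic and the exact normalization applied to the HDR photometer data may differ.

```python
# Hedged sketch of an AUC-style halo metric: area under log10 of the
# peak-normalized radial luminance profile. Profile values and radii are synthetic.
import numpy as np

def halo_auc(radius_deg, luminance):
    """Area under log10 of the peak-normalized halo luminance profile;
    a larger (less negative) value indicates more halo energy."""
    profile = np.log10(np.clip(luminance / luminance.max(), 1e-6, None))
    return float(np.trapz(profile, radius_deg))

radius = np.linspace(0.1, 3.0, 200)                                 # visual angle (degrees)
mono = 1e4 * np.exp(-radius / 0.15)                                 # monofocal-like profile
multi = 1e4 * np.exp(-radius / 0.15) + 50 * np.exp(-radius / 1.2)   # added halo ring energy

print("AUC monofocal :", round(halo_auc(radius, mono), 2))
print("AUC multifocal:", round(halo_auc(radius, multi), 2))
```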

  7. Direct comparison of low- and mid-frequency Raman spectroscopy for quantitative solid-state pharmaceutical analysis.

    PubMed

    Lipiäinen, Tiina; Fraser-Miller, Sara J; Gordon, Keith C; Strachan, Clare J

    2018-02-05

    This study considers the potential of low-frequency (terahertz) Raman spectroscopy in the quantitative analysis of ternary mixtures of solid-state forms. Direct comparison between low-frequency and mid-frequency spectral regions for quantitative analysis of crystal form mixtures, without confounding sampling and instrumental variations, is reported for the first time. Piroxicam was used as a model drug, and the low-frequency spectra of piroxicam forms β, α2 and monohydrate are presented for the first time. These forms show clear spectral differences in both the low- and mid-frequency regions. Both spectral regions provided quantitative models suitable for predicting the mixture compositions using partial least squares regression (PLSR), but the low-frequency data gave better models, based on lower errors of prediction (2.7, 3.1 and 3.2% root-mean-square errors of prediction [RMSEP] values for the β, α2 and monohydrate forms, respectively) than the mid-frequency data (6.3, 5.4 and 4.8%, for the β, α2 and monohydrate forms, respectively). The better performance of low-frequency Raman analysis was attributed to larger spectral differences between the solid-state forms, combined with a higher signal-to-noise ratio. Copyright © 2017 Elsevier B.V. All rights reserved.
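
    The quantitative step described above, partial least squares regression scored by RMSEP, can be sketched as follows; the spectra and compositions are simulated stand-ins rather than the piroxicam data, and the number of latent variables is an arbitrary choice.

```python
# Hedged sketch: PLS regression predicting mixture composition from spectra,
# scored by root-mean-square error of prediction (RMSEP). Spectra are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_samples, n_wavenumbers = 60, 300
pure = rng.random((3, n_wavenumbers))                  # three "solid-state forms"
fractions = rng.dirichlet(np.ones(3), size=n_samples)  # mixture compositions
spectra = fractions @ pure + rng.normal(0, 0.01, (n_samples, n_wavenumbers))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, fractions, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5).fit(X_train, y_train)
pred = pls.predict(X_test)
rmsep = np.sqrt(np.mean((pred - y_test) ** 2, axis=0)) * 100  # percent of total
print("RMSEP per form (%):", np.round(rmsep, 1))
```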

  8. Obesity prevention: Comparison of techniques and potential solution

    NASA Astrophysics Data System (ADS)

    Zulkepli, Jafri; Abidin, Norhaslinda Zainal; Zaibidi, Nerda Zura

    2014-12-01

    Over the years, obesity prevention has been a broadly studied subject among both academics and practitioners. It is one of the most serious public health issues, as it can cause numerous chronic health and psychosocial problems. Research is needed to suggest a population-based strategy for obesity prevention. In the academic environment, the importance of obesity prevention has triggered various problem-solving approaches. A good obesity prevention model should capture and address all of the complex and dynamic issues involved. Hence, the main purpose of this paper is to discuss the qualitative and quantitative approaches to obesity prevention and to provide an extensive literature review of recent modelling techniques for obesity prevention. Based on this literature, the quantitative and qualitative approaches are compared, and the use of the system dynamics technique to address obesity at the population level is justified. Lastly, a potential framework solution based on system dynamics modelling is proposed.

  9. Abundances of Neutral and Ionized PAH Along The Lines-of-Sight of Diffuse and Translucent Interstellar Clouds

    NASA Astrophysics Data System (ADS)

    Salama, Farid; Galazutdinov, G.; Krelowski, J.; Biennier, L.; Beletsky, Y.; Song, I.

    2013-06-01

    The spectra of neutral and ionized PAHs isolated in the gas phase at low temperature have been measured in the laboratory under conditions that mimic interstellar conditions and are compared with a set of astronomical spectra of reddened, early type stars. The comparisons of astronomical and laboratory data provide upper limits for the abundances of neutral PAH molecules and ions along specific lines-of-sight, something that is not attainable from infrared observations. We present the characteristics of the laboratory facility (COSmIC) that was developed for this study and discuss the findings resulting from the comparison of the laboratory data with high resolution, high S/N ratio astronomical observations. COSmIC combines a supersonic jet expansion with discharge plasma and cavity ringdown spectroscopy and provides experimental conditions that closely mimic the interstellar conditions. The column densities of the individual PAH molecules and ions probed in these surveys are derived from the comparison of the laboratory data with high resolution, high S/N ratio astronomical observations. The comparisons of astronomical and laboratory data lead to clear conclusions regarding the expected abundances for PAHs in the interstellar environments probed in the surveys. Band profile comparisons between laboratory and astronomical spectra lead to information regarding the molecular structures and characteristics associated with the DIB carriers in the corresponding lines-of-sight. These quantitative surveys of neutral and ionized PAHs in the optical range open the way for quantitative searches of PAHs and complex organics in a variety of interstellar and circumstellar environments. Acknowledgements: F.S. acknowledges the support of the Astrophysics Research and Analysis Program of the NASA Space Mission Directorate and the technical support provided by R. Walker at NASA ARC. J.K. acknowledges the financial support of the Polish State. The authors are deeply grateful to the ESO archive as well as to the ESO staff members for their active support.

  10. Quantitative molecular analysis in mantle cell lymphoma.

    PubMed

    Brízová, H; Hilská, I; Mrhalová, M; Kodet, R

    2011-07-01

    Molecular analysis has three major roles in modern oncopathology--as an aid in the differential diagnosis, in molecular monitoring of disease, and in estimation of the potential prognosis. In this report we review the application of molecular analysis in a group of patients with mantle cell lymphoma (MCL). We demonstrate that detection of the cyclin D1 mRNA level is a molecular marker in 98% of patients with MCL. Quantitative monitoring of cyclin D1 is specific and sensitive for the differential diagnosis and for the molecular monitoring of the disease in the bone marrow. Moreover, the dynamics of cyclin D1 in bone marrow reflects the disease development and predicts the clinical course. We employed molecular analysis for precise quantitative detection of the proliferation markers Ki-67, topoisomerase IIalpha, and TPX2, which are described as effective prognostic factors. Using the molecular approach it is possible to measure the proliferation rate in a reproducible, standard way, which is an essential prerequisite for using proliferation activity as a routine clinical tool. Compared with immunophenotyping, we may conclude that quantitative PCR-based analysis is a useful, reliable, rapid, reproducible, sensitive and specific method that broadens our diagnostic tools in hematopathology. In comparison to interphase FISH in paraffin sections, quantitative PCR is less technically demanding, less time-consuming, and more sensitive in detecting small changes in the mRNA level. Moreover, quantitative PCR is the only technology which provides precise and reproducible quantitative information about the expression level. Therefore it may be used to demonstrate the decrease or increase of a tumor-specific marker in bone marrow in comparison with a previously aspirated specimen. Thus, it has a powerful potential to monitor the course of the disease in correlation with clinical data.

  11. Recent Progress in DIB Research: Survey of PAHS and DIBS

    NASA Technical Reports Server (NTRS)

    Salama, Farid; Galazutdinov, G.; Krelowski, J.; Biennier, L.; Beletsky, Y.; Song, I.

    2013-01-01

    The spectra of several neutral and ionized PAHs isolated in the gas phase at low temperature have been measured in the laboratory under experimental conditions that mimic interstellar conditions and are compared with an extensive set of astronomical spectra of reddened, early type stars [1, 2]. The comparisons of astronomical and laboratory data provide upper limits for the abundances of specific neutral PAH molecules and ions along specific lines-of-sight, something that is not attainable from infrared observations alone. We present the characteristics of the laboratory facility (COSmIC) that was developed for this study and discuss the findings resulting from the comparison of these unique laboratory data with high resolution, high S/N ratio astronomical observations. COSmIC combines a supersonic free jet expansion with discharge plasma and high-sensitivity cavity ringdown spectroscopy and provides experimental conditions that closely mimic the interstellar conditions. The column densities of the individual neutral PAH molecules and ions probed in these surveys are derived from the comparison of these unique laboratory data with high resolution, high S/N ratio astronomical observations. The comparisons of astronomical and laboratory data lead to clear and unambiguous conclusions regarding the expected abundances for PAHs of various sizes and charge states in the interstellar environments probed in the surveys. Band profile comparisons between laboratory and astronomical spectra lead to information regarding the molecular structures and characteristics associated with the DIB carriers in the corresponding lines-of-sight. These quantitative surveys of neutral and ionized PAHs in the optical range open the way for unambiguous quantitative searches of PAHs and complex organics in a variety of interstellar and circumstellar environments.

  12. Identification of malaria infected red blood samples by digital holographic quantitative phase microscope

    NASA Astrophysics Data System (ADS)

    Patel, Nimit R.; Chhaniwal, Vani K.; Javidi, Bahram; Anand, Arun

    2015-07-01

    Development of devices for automatic identification of diseases is desirable, especially in developing countries. In the case of malaria, even today the gold standard is the inspection of chemically treated blood smears through a microscope. This requires a trained technician/microscopist to identify the cells in the field of view to which the labeling chemicals attach. Bright-field microscopes provide only low-contrast 2D images of red blood cells, and the cell thickness distribution cannot be obtained. Quantitative phase contrast microscopes can provide both intensity and phase profiles of the cells under study. The phase information can be used to determine the thickness profile of the cell. Since cell morphology is available, many parameters pertaining to the 3D shape of the cell can be computed. These parameters in turn can be used to decide on the state of health of the cell, leading to disease diagnosis. Here we describe investigations using a digital holographic microscope, which provides quantitative phase images, to compare parameters obtained from the 3D shape profiles of cells and thereby identify diseased samples.

  13. English Engagement Markers: A Comparison of Humanities and Science Journal Articles

    ERIC Educational Resources Information Center

    Sahragard, Rahman; Yazdanpanahi, Solmaz

    2017-01-01

    Engagement markers (hereafter, EMs) are crucial interpersonal devices to interact with readers through texts. However, little is known about the differences of EMs use in Humanities and Science journal research articles (hereafter, RAs), as well as the changes in markers use over the passage of time. The present study provides a quantitative and…

  14. A Comparison of Urban School- and Community-Based Dental Clinics

    ERIC Educational Resources Information Center

    Larsen, Charles D.; Larsen, Michael D.; Handwerker, Lisa B.; Kim, Maile S.; Rosenthal, Murray

    2009-01-01

    Background: The objective of the study was to quantitatively compare school- and community-based dental clinics in New York City that provide dental services to children in need. It was hypothesized that the school-based clinics would perform better in terms of several measures. Methods: We reviewed billing and visit data derived from encounter…

  15. The Effect of Studying Tech Prep in High School and College Academic Performance

    ERIC Educational Resources Information Center

    Ray, Larry A.

    2011-01-01

    This study examined the academic performance of Tech Prep students (referred to as participants) in comparison to non-Tech Prep students (referred to as non-participants) entering a two-year community college from sixteen different high schools in Stark County, Ohio. This study provided a quantitative analysis of students' academic experiences to…

  16. Cultural Consensus and Cultural Diversity: A Mixed Methods Investigation of Human Service Providers' Models of Domestic Violence

    ERIC Educational Resources Information Center

    Collins, Cyleste C.; Dressler, William W.

    2008-01-01

    This study uses mixed methods and theory from cognitive anthropology to examine the cultural models of domestic violence among domestic violence agency workers, welfare workers, nurses, and a general population comparison group. Data collection and analysis uses quantitative and qualitative techniques, and the findings are integrated for…

  17. A multiplexed system for quantitative comparisons of chromatin landscapes

    PubMed Central

    van Galen, Peter; Viny, Aaron D.; Ram, Oren; Ryan, Russell J.H.; Cotton, Matthew J.; Donohue, Laura; Sievers, Cem; Drier, Yotam; Liau, Brian B.; Gillespie, Shawn M.; Carroll, Kaitlin M.; Cross, Michael B.; Levine, Ross L.; Bernstein, Bradley E.

    2015-01-01

    Genome-wide profiling of histone modifications can provide systematic insight into the regulatory elements and programs engaged in a given cell type. However, conventional chromatin immunoprecipitation and sequencing (ChIP-seq) does not capture quantitative information on histone modification levels, requires large amounts of starting material, and involves tedious processing of each individual sample. Here we address these limitations with a technology that leverages DNA barcoding to profile chromatin quantitatively and in multiplexed format. We concurrently map relative levels of multiple histone modifications across multiple samples, each comprising as few as a thousand cells. We demonstrate the technology by monitoring dynamic changes following inhibition of P300, EZH2 or KDM5, by linking altered epigenetic landscapes to chromatin regulator mutations, and by mapping active and repressive marks in purified human hematopoietic stem cells. Hence, this technology enables quantitative studies of chromatin state dynamics across rare cell types, genotypes, environmental conditions and drug treatments. PMID:26687680

  18. Accurate virus quantitation using a Scanning Transmission Electron Microscopy (STEM) detector in a scanning electron microscope.

    PubMed

    Blancett, Candace D; Fetterer, David P; Koistinen, Keith A; Morazzani, Elaine M; Monninger, Mitchell K; Piper, Ashley E; Kuehl, Kathleen A; Kearney, Brian J; Norris, Sarah L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G

    2017-10-01

    A method for accurate quantitation of virus particles has long been sought, but a perfect method still eludes the scientific community. Electron Microscopy (EM) quantitation is a valuable technique because it provides direct morphology information and counts of all viral particles, whether or not they are infectious. In the past, EM negative stain quantitation methods have been cited as inaccurate, non-reproducible, and with detection limits that were too high to be useful. To improve accuracy and reproducibility, we have developed a method termed Scanning Transmission Electron Microscopy - Virus Quantitation (STEM-VQ), which simplifies sample preparation and uses a high throughput STEM detector in a Scanning Electron Microscope (SEM) coupled with commercially available software. In this paper, we demonstrate STEM-VQ with an alphavirus stock preparation to present the method's accuracy and reproducibility, including a comparison of STEM-VQ to viral plaque assay and the ViroCyt Virus Counter. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  19. Solar Occultation Satellite Data and Derived Meteorological Products: Sampling Issues and Comparisons with Aura MLS

    NASA Technical Reports Server (NTRS)

    Manney, Gloria; Daffer, William H.; Zawodny, Joseph M.; Bernath, Peter F.; Hoppel, Karl W.; Walker, Kaley A.; Knosp, Brian W.; Boone, Chris; Remsberg, Ellis E.; Santee, Michelle L.; et al.

    2007-01-01

    Derived Meteorological Products (DMPs, including potential temperature (theta), potential vorticity, equivalent latitude (EqL), horizontal winds and tropopause locations) have been produced for the locations and times of measurements by several solar occultation (SO) instruments and the Aura Microwave Limb Sounder (MLS). DMPs are calculated from several meteorological analyses for the Atmospheric Chemistry Experiment-Fourier Transform Spectrometer, Stratospheric Aerosol and Gas Experiment II and III, Halogen Occultation Experiment, and Polar Ozone and Aerosol Measurement II and III SO instruments and MLS. Time-series comparisons of MLS version 1.5 and SO data using DMPs show good qualitative agreement in time evolution of O3, N2O, H2O, CO, HNO3, HCl and temperature; quantitative agreement is good in most cases. EqL-coordinate comparisons of MLS version 2.2 and SO data show good quantitative agreement throughout the stratosphere for most of these species, with significant biases for a few species in localized regions. Comparisons in EqL coordinates of MLS and SO data, and of SO data with geographically coincident MLS data, provide insight into where and how sampling effects are important in interpretation of the sparse SO data, thus assisting in fully utilizing the SO data in scientific studies and comparisons with other sparse datasets. The DMPs are valuable for scientific studies and to facilitate validation of non-coincident measurements.

  20. Cost benefit analysis of the transfer of NASA remote sensing technology to the state of Georgia

    NASA Technical Reports Server (NTRS)

    Zimmer, R. P. (Principal Investigator); Wilkins, R. D.; Kelly, D. L.; Brown, D. M.

    1977-01-01

    The author has identified the following significant results. First-order benefits can generally be quantified, thus allowing quantitative comparisons of candidate land cover data systems. A meaningful dollar evaluation of LANDSAT can be made by a cost comparison with equally effective data systems. Users of LANDSAT data can be usefully categorized as performing three general functions: planning, permitting, and enforcing. The value of LANDSAT data to the State of Georgia is most sensitive to three parameters: discount rate, digitization cost, and photo acquisition cost. Under a constrained budget, LANDSAT could provide digitized land cover information roughly seven times more frequently than could otherwise be obtained. Thus, on one hand, the services derived from LANDSAT data have a positive net present value in comparison to the baseline system; on the other hand, under a constrained budget, more frequent information could be provided using the LANDSAT system than could otherwise be obtained.
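
    Since the analysis hinges on net present value and its sensitivity to the discount rate, a minimal sketch of that calculation is shown below; the cash flows and rates are hypothetical, not figures from the study.

```python
# Minimal sketch (hypothetical numbers): net present value of an annual benefit
# stream under a given discount rate, the kind of sensitivity driver noted above.
def npv(rate, cash_flows):
    """Discount a list of yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

flows = [-250_000] + [60_000] * 10      # hypothetical setup cost, then yearly benefits
for rate in (0.05, 0.10, 0.15):
    print(f"discount rate {rate:.0%}: NPV = {npv(rate, flows):,.0f}")
```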

  1. Validation Process for LEWICE Coupled by Use of a Navier-stokes Solver

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    2016-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth for many meteorological conditions for any aircraft surface. This report will present results from the latest LEWICE release, version 3.5. This program differs from previous releases in its ability to model mixed phase and ice crystal conditions such as those encountered inside an engine. It also has expanded capability to use structured grids and a new capability to use results from unstructured grid flow solvers. An extensive comparison of the results in a quantifiable manner against the database of ice shapes that have been generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. This paper will show the differences in ice shape between LEWICE 3.5 and experimental data. In addition, comparisons will be made between the lift and drag calculated on the ice shapes from experiment and those produced by LEWICE. This report will also provide a description of both programs. Quantitative geometric comparisons are shown for horn height, horn angle, icing limit, area and leading edge thickness. Quantitative comparisons of calculated lift and drag will also be shown. The results show that the predicted results are within the accuracy limits of the experimental data for the majority of cases.

  2. DOT/NASA comparative assessment of Brayton engines for guideway vehicle and buses. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The Department of Transportation requested that the NASA Office of Aeronautics and Space Technology evaluate and assess the potential of several types of gas turbine engines and fuels for the on-board power and propulsion of a future heavy-duty ground transportation system. The purpose of the investigation was threefold: (1) to provide a definition of the potential for turbine engines to minimize pollution, energy consumption, and noise; (2) to provide a useful means of comparison of the types of engine based on consistent assumptions and a common analytical approach; and (3) to provide a compendium of comparative performance data that would serve as the technical basis for future planning. Emphasis was on establishing comparison trends rather than on absolute values and a definitive engine selection. The primary value of this study is intended to be usefulness of the results to provide a quantitative basis for future judgement.

  3. EvolQG - An R package for evolutionary quantitative genetics

    PubMed Central

    Melo, Diogo; Garcia, Guilherme; Hubbe, Alex; Assis, Ana Paula; Marroig, Gabriel

    2016-01-01

    We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the EvolQG package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification. PMID:27785352
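
    One of the matrix-comparison ideas named above, comparison via correlations of responses to random selection gradients (the "random skewers" approach), can be sketched outside R as follows; this is a generic illustration in Python and does not reproduce the EvolQG function names or interfaces.

```python
# Hedged Python analog (not the EvolQG R API) of the random skewers idea:
# correlate the responses of two covariance matrices to shared random
# selection gradients.
import numpy as np

def random_skewers(G1, G2, n_skewers=1000, seed=0):
    """Mean vector correlation between responses of two covariance matrices
    to shared, unit-length random selection gradients."""
    rng = np.random.default_rng(seed)
    p = G1.shape[0]
    beta = rng.normal(size=(n_skewers, p))
    beta /= np.linalg.norm(beta, axis=1, keepdims=True)
    r1, r2 = beta @ G1, beta @ G2          # predicted responses (delta-z = G beta)
    cos = np.sum(r1 * r2, axis=1) / (
        np.linalg.norm(r1, axis=1) * np.linalg.norm(r2, axis=1))
    return float(cos.mean())

# Two hypothetical 4-trait covariance matrices differing by a small perturbation.
rng = np.random.default_rng(6)
A = rng.normal(size=(4, 4))
G1 = A @ A.T
G2 = G1 + 0.1 * np.eye(4)
print("random skewers similarity:", round(random_skewers(G1, G2), 3))
```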

  4. CSGRqtl: A Comparative Quantitative Trait Locus Database for Saccharinae Grasses.

    PubMed

    Zhang, Dong; Paterson, Andrew H

    2017-01-01

    Conventional biparental quantitative trait locus (QTL) mapping has led to some successes in the identification of causal genes in many organisms. QTL likelihood intervals not only provide "prior information" for finer-resolution approaches such as GWAS but also provide better statistical power than GWAS to detect variants with low/rare frequency in a natural population. Here, we describe a new element of an ongoing effort to provide online resources to facilitate study and improvement of the important Saccharinae clade. The primary goal of this new resource is the anchoring of published QTLs for this clade to the Sorghum genome. Genetic map alignments translate a wealth of genomic information from sorghum to Saccharum spp., Miscanthus spp., and other taxa. In addition, genome alignments facilitate comparison of the Saccharinae QTL sets to those of other taxa that enjoy comparable resources, exemplified herein by rice.

  5. Assessing browse trend at the landscape level Part 2: Monitoring

    USGS Publications Warehouse

    Keigley, R.B.; Frisina, M.R.; Fager, C.W.

    2002-01-01

    In Part 1, we assessed browse trend across a wide geographic area of Mt. Haggin Wildlife Management Area by conducting surveys of browsing-related architectures. Those data were qualitative. Below we describe the periodic collection of quantitative data from permanently marked locations; we refer to this phase of the trend assessment program as "monitoring." Trend was monitored by three methods: (1) repeat photography; (2) comparison of the height of live stems with the height of stems killed by browsing (LD Index); and (3) net annual stem growth rate (NAGRL3). The photography provides an assessment of trend from the comparison of photographs taken at intervals of a few years. The LD Index and NAGRL3 measurements provide an immediate assessment of trend.

  6. Interpreting comprehensive two-dimensional gas chromatography using peak topography maps with application to petroleum forensics.

    PubMed

    Ghasemi Damavandi, Hamidreza; Sen Gupta, Ananya; Nelson, Robert K; Reddy, Christopher M

    2016-01-01

    Comprehensive two-dimensional gas chromatography (GC×GC) provides high-resolution separations across hundreds of compounds in a complex mixture, thus unlocking unprecedented information for intricate quantitative interpretation. We exploit this compound diversity across the GC×GC topography to provide quantitative compound-cognizant interpretation beyond target compound analysis with petroleum forensics as a practical application. We focus on the GC×GC topography of biomarker hydrocarbons, hopanes and steranes, as they are generally recalcitrant to weathering. We introduce peak topography maps (PTM) and topography partitioning techniques that consider a notably broader and more diverse range of target and non-target biomarker compounds compared to traditional approaches that consider approximately 20 biomarker ratios. Specifically, we consider a range of 33-154 target and non-target biomarkers with highest-to-lowest peak ratio within an injection ranging from 4.86 to 19.6 (precise numbers depend on biomarker diversity of individual injections). We also provide a robust quantitative measure for directly determining "match" between samples, without necessitating training data sets. We validate our methods across 34 GC×GC injections from a diverse portfolio of petroleum sources, and provide quantitative comparison of performance against established statistical methods such as principal components analysis (PCA). Our data set includes a wide range of samples collected following the 2010 Deepwater Horizon disaster that released approximately 160 million gallons of crude oil from the Macondo well (MW). Samples that were clearly collected following this disaster exhibit statistically significant match [Formula: see text] using PTM-based interpretation against other closely related sources. PTM-based interpretation also provides higher differentiation between closely correlated but distinct sources than obtained using PCA-based statistical comparisons. In addition to results based on this experimental field data, we also provide extensive perturbation analysis of the PTM method over numerical simulations that introduce random variability of peak locations over the GC×GC biomarker ROI image of the MW pre-spill sample (sample [Formula: see text] in Additional file 4: Table S1). We compare the robustness of the cross-PTM score against peak location variability in both dimensions and compare the results against PCA analysis over the same set of simulated images. Detailed description of the simulation experiment and discussion of results are provided in Additional file 1: Section S8. We provide a peak-cognizant informational framework for quantitative interpretation of GC×GC topography. Proposed topographic analysis enables GC×GC forensic interpretation across target petroleum biomarkers, while including the nuances of lesser-known non-target biomarkers clustered around the target peaks. This allows potential discovery of hitherto unknown connections between target and non-target biomarkers.
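
    The PCA baseline against which the peak topography maps are compared can be sketched as follows; the biomarker-ratio feature matrix here is synthetic, and the number of samples merely echoes the 34 injections mentioned above.

```python
# Illustrative sketch of a PCA-based comparison: samples described by biomarker
# peak ratios are projected onto principal components to gauge similarity
# between oils. Feature values are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
n_samples, n_biomarker_ratios = 34, 40
source_signature = rng.random(n_biomarker_ratios)
X = source_signature + rng.normal(0, 0.05, (n_samples, n_biomarker_ratios))
X[-5:] += rng.normal(0.3, 0.05, (5, n_biomarker_ratios))   # a distinct source

scores = PCA(n_components=2).fit_transform(X - X.mean(axis=0))
print("first two PC scores of sample 0 :", np.round(scores[0], 2))
print("first two PC scores of sample 33:", np.round(scores[-1], 2))
```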

  7. "They're Not Girly Girls": An Exploration of Quantitative and Qualitative Data on Engineering and Gender in Higher Education

    ERIC Educational Resources Information Center

    Barnard, S.; Hassan, T.; Bagilhole, B.; Dainty, A.

    2012-01-01

    Despite sustained efforts to promote engineering careers to young women, it remains the most male-dominated academic discipline in Europe. This paper will provide an overview of UK data and research on women in engineering higher education, within the context of Europe. Comparisons between data from European countries representing various regions…

  8. Method comparison for forest soil carbon and nitrogen estimates in the Delaware River basin

    Treesearch

    B. Xu; Yude Pan; A.H. Johnson; A.F. Plante

    2016-01-01

    The accuracy of forest soil C and N estimates is hampered by forest soils that are rocky, inaccessible, and spatially heterogeneous. A composite coring technique is the standard method used in Forest Inventory and Analysis, but its accuracy has been questioned. Quantitative soil pits provide direct measurement of rock content and soil mass from a larger, more...

  9. One Model Fits All: Explaining Many Aspects of Number Comparison within a Single Coherent Model-A Random Walk Account

    ERIC Educational Resources Information Center

    Reike, Dennis; Schwarz, Wolf

    2016-01-01

    The time required to determine the larger of 2 digits decreases with their numerical distance, and, for a given distance, increases with their magnitude (Moyer & Landauer, 1967). One detailed quantitative framework to account for these effects is provided by random walk models. These chronometric models describe how number-related noisy…
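
    A random-walk account of digit comparison can be sketched in a few lines: evidence accumulates with a drift that grows with numerical distance and shrinks with overall magnitude, and the decision time is the first passage to a threshold. The drift rule, threshold, and noise level below are illustrative assumptions, not the authors' fitted model.

```python
# Minimal sketch of a random-walk (drift-diffusion) account of digit comparison.
import numpy as np

def compare_rt(a, b, threshold=30.0, noise=1.0, max_steps=10_000, rng=None):
    """Simulated decision time (in steps) for deciding which of a, b is larger."""
    rng = rng or np.random.default_rng()
    drift = abs(a - b) / (a + b)          # distance effect, attenuated by magnitude
    evidence = 0.0
    for t in range(1, max_steps + 1):
        evidence += drift + rng.normal(0.0, noise)
        if abs(evidence) >= threshold:
            return t
    return max_steps

rng = np.random.default_rng(1)
for pair in [(1, 2), (1, 9), (7, 8)]:     # same distance for (1,2) and (7,8), larger magnitude for (7,8)
    rts = [compare_rt(*pair, rng=rng) for _ in range(200)]
    print(pair, round(float(np.mean(rts)), 1))
```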

  10. A quantitative dynamic systems model of health-related quality of life among older adults

    PubMed Central

    Roppolo, Mattia; Kunnen, E Saskia; van Geert, Paul L; Mulasso, Anna; Rabaglietti, Emanuela

    2015-01-01

    Health-related quality of life (HRQOL) is a person-centered concept. The analysis of HRQOL is highly relevant in the aged population, which is generally suffering from health decline. Starting from a conceptual dynamic systems model that describes the development of HRQOL in individuals over time, this study aims to develop and test a quantitative dynamic systems model, in order to reveal the possible dynamic trends of HRQOL among older adults. The model is tested in different ways: first, with a calibration procedure to test whether the model produces theoretically plausible results, and second, with a preliminary validation procedure using empirical data of 194 older adults. This first validation tested the prediction that given a particular starting point (first empirical data point), the model will generate dynamic trajectories that lead to the observed endpoint (second empirical data point). The analyses reveal that the quantitative model produces theoretically plausible trajectories, thus providing support for the calibration procedure. Furthermore, the analyses of validation show a good fit between empirical and simulated data. In fact, no differences were found in the comparison between empirical and simulated final data for the same subgroup of participants, whereas the comparison between different subgroups of people resulted in significant differences. These data provide an initial basis of evidence for the dynamic nature of HRQOL during the aging process. Therefore, these data may give new theoretical and applied insights into the study of HRQOL and its development with time in the aging population. PMID:26604722

  11. A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.

    PubMed

    Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao

    2015-06-15

    ChIP-seq is a powerful technology to measure the protein binding or histone modification strength in the whole genome scale. Although there are a number of methods available for single ChIP-seq data analysis (e.g. 'peak detection'), rigorous statistical method for quantitative comparison of multiple ChIP-seq datasets with the considerations of data from control experiment, signal to noise ratios, biological variations and multiple-factor experimental designs is under-developed. In this work, we develop a statistical method to perform quantitative comparison of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks from all datasets and then union them to form a single set of candidate regions. The read counts from IP experiment at the candidate regions are assumed to follow Poisson distribution. The underlying Poisson rates are modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through the hypothesis testing procedure in a linear model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results compared with existing ones. An R software package ChIPComp is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
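
    The core modeling idea, a Poisson model for IP counts with a covariate for biological condition and an experiment-specific normalization term, can be sketched for a single candidate region as below. This is a generic Poisson GLM stand-in, not the ChIPComp implementation; the counts, library sizes, and design are illustrative assumptions.

```python
# Minimal sketch: Poisson GLM test for differential binding at one candidate
# region, with log library size as an offset standing in for the
# experiment-specific artifact term.
import numpy as np
import statsmodels.api as sm

counts = np.array([180, 165, 60, 72])            # IP read counts at one region
lib_size = np.array([2.0e7, 1.8e7, 2.2e7, 1.9e7])
condition = np.array([1, 1, 0, 0])               # 1 = treatment, 0 = control

X = sm.add_constant(condition.astype(float))
fit = sm.GLM(counts, X, family=sm.families.Poisson(), offset=np.log(lib_size)).fit()

# The coefficient on `condition` is the log fold-change in binding strength;
# its Wald p-value is the per-region differential-binding test.
print("log fold-change:", fit.params[1], "p-value:", fit.pvalues[1])
```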

  12. Least squares QR-based decomposition provides an efficient way of computing optimal regularization parameter in photoacoustic tomography.

    PubMed

    Shaw, Calvin B; Prakash, Jaya; Pramanik, Manojit; Yalavarthy, Phaneendra K

    2013-08-01

    A computationally efficient approach that computes the optimal regularization parameter for the Tikhonov-minimization scheme is developed for photoacoustic imaging. This approach is based on the least squares-QR decomposition which is a well-known dimensionality reduction technique for a large system of equations. It is shown that the proposed framework is effective in terms of quantitative and qualitative reconstructions of initial pressure distribution enabled via finding an optimal regularization parameter. The computational efficiency and performance of the proposed method are shown using a test case of numerical blood vessel phantom, where the initial pressure is exactly known for quantitative comparison.
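
    The general recipe, Tikhonov-damped least squares solved with LSQR while scanning the regularization parameter, can be sketched as follows. The forward matrix, noise level, and the discrepancy-principle selection rule below are illustrative assumptions, not the authors' specific parameter-choice method.

```python
# Minimal sketch: damped least squares via LSQR with a simple scan for the
# regularization parameter. The system matrix and data are random placeholders.
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
A = rng.normal(size=(400, 200))                 # placeholder forward model
x_true = np.zeros(200)
x_true[90:110] = 1.0                            # "initial pressure" support
b = A @ x_true + rng.normal(scale=0.5, size=400)

best = None
for lam in np.logspace(-3, 2, 20):
    x = lsqr(A, b, damp=lam)[0]                 # `damp` applies Tikhonov damping
    # Discrepancy principle: residual power should match the noise power.
    score = abs(np.linalg.norm(A @ x - b) ** 2 - 0.5 ** 2 * len(b))
    if best is None or score < best[0]:
        best = (score, lam, x)

print("chosen regularization parameter:", best[1])
```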

  13. TH-AB-209-09: Quantitative Imaging of Electrical Conductivity by VHF-Induced Thermoacoustics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patch, S; Hull, D; See, W

    Purpose: To demonstrate that very high frequency (VHF) induced thermoacoustics has the potential to provide quantitative images of electrical conductivity in Siemens/meter, much as shear wave elastography provides tissue stiffness in kPa. Quantitatively imaging a large organ requires exciting thermoacoustic pulses throughout the volume and broadband detection of those pulses because tomographic image reconstruction preserves frequency content. Applying the half-wavelength limit to a 200-micron inclusion inside a 7.5 cm diameter organ requires measurement sensitivity to frequencies ranging from 4 MHz down to 10 kHz, respectively. VHF irradiation provides superior depth penetration over near infrared used in photoacoustics. Additionally, VHF signal production is proportional to electrical conductivity, and prostate cancer is known to suppress electrical conductivity of prostatic fluid. Methods: A dual-transducer system utilizing a P4-1 array connected to a Verasonics V1 system augmented by a lower frequency focused single element transducer was developed. Simultaneous acquisition of VHF-induced thermoacoustic pulses by both transducers enabled comparison of transducer performance. Data from the clinical array generated a stack of 96 images with separation of 0.3 mm, whereas the single element transducer imaged only in a single plane. In-plane resolution and quantitative accuracy were measured at isocenter. Results: The array provided volumetric imaging capability with superior resolution whereas the single element transducer provided superior quantitative accuracy. Combining axial images from both transducers preserved resolution of the P4-1 array and improved image contrast. Neither transducer was sensitive to frequencies below 50 kHz, resulting in a DC offset and low-frequency shading over fields of view exceeding 15 mm. Fresh human prostates were imaged ex vivo and volumetric reconstructions reveal structures rarely seen in diagnostic images. Conclusion: Quantitative whole-organ thermoacoustic tomography will be feasible by sparsely interspersing transducer elements sensitive to the low end of the ultrasonic range.

  14. The mummified brain of a Pleistocene woolly mammoth (Mammuthus primigenius) compared with the brain of the extant African elephant (Loxodonta africana).

    PubMed

    Kharlamova, Anastasia S; Saveliev, Sergei V; Protopopov, Albert V; Maseko, Busisiwe C; Bhagwandin, Adhil; Manger, Paul R

    2015-11-01

    This study presents the results of an examination of the mummified brain of a Pleistocene woolly mammoth (Mammuthus primigenius) recovered from the Yakutian permafrost in Siberia, Russia. This unique specimen (from 39,440-38,850 years BP) provides the rare opportunity to compare the brain morphology of this extinct species with a related extant species, the African elephant (Loxodonta africana). An anatomical description of the preserved brain of the woolly mammoth is provided, along with a series of quantitative analyses of various brain structures. These descriptions are based on visual inspection of the actual specimen as well as qualitative and quantitative comparison of computed tomography imaging data obtained for the woolly mammoth in comparison with magnetic resonance imaging data from three African elephant brains. In general, the brain of the woolly mammoth specimen examined, estimated to weigh between 4,230 and 4,340 g, showed the typical shape, size, and gross structures observed in extant elephants. Quantitative comparative analyses of various features of the brain, such as the amygdala, corpus callosum, cerebellum, and gyrencephalic index, all indicate that the brain of the woolly mammoth specimen examined has many similarities with that of modern African elephants. The analysis provided here indicates that a specific brain type representative of the Elephantidae is likely to be a feature of this mammalian family. In addition, the extensive similarities between the woolly mammoth brain and the African elephant brain indicate that the specializations observed in the extant elephant brain are likely to have been present in the woolly mammoth. © 2015 Wiley Periodicals, Inc.

  15. Drift mobility of photo-electrons in organic molecular crystals: Quantitative comparison between theory and experiment

    NASA Astrophysics Data System (ADS)

    Reineker, P.; Kenkre, V. M.; Kühne, R.

    1981-08-01

    A quantitative comparison of a simple theoretical prediction for the drift mobility of photo-electrons in organic molecular crystals, calculated within the model of the coupled band-like and hopping motion, with experiments in naphthalene by Schein et al. and Karl et al., is given.

  16. Quantitative analysis and comparative study of four cities green pattern in API system on the background of big data

    NASA Astrophysics Data System (ADS)

    Xin, YANG; Si-qi, WU; Qi, ZHANG

    2018-05-01

    Beijing, London, Paris, and New York are representative world cities, so a comparative study of their green patterns is important for identifying gaps and advantages and for learning from one another. The paper provides a basis and new ideas for the development of metropolises in China. Against the background of big data, API (Application Programming Interface) systems can provide extensive and accurate basic data for studying urban green patterns across different geographical environments at home and abroad. On this basis, the Average Nearest Neighbor, Kernel Density, and Standard Ellipse tools in the ArcGIS platform can process and summarize the data and support quantitative analysis of green patterns. The paper summarizes the uniqueness of the four cities' green patterns, and the reasons for their formation, on the basis of numerical comparison.

  17. Recent trends in high spin sensitivity magnetic resonance

    NASA Astrophysics Data System (ADS)

    Blank, Aharon; Twig, Ygal; Ishay, Yakir

    2017-07-01

    Magnetic resonance is a very powerful methodology that has been employed successfully in many applications for about 70 years now, resulting in a wealth of scientific, technological, and diagnostic data. Despite its many advantages, one major drawback of magnetic resonance is its relatively poor sensitivity and, as a consequence, its bad spatial resolution when examining heterogeneous samples. Contemporary science and technology often make use of very small amounts of material and examine heterogeneity on a very small length scale, both of which are well beyond the current capabilities of conventional magnetic resonance. It is therefore very important to significantly improve both the sensitivity and the spatial resolution of magnetic resonance techniques. The quest for higher sensitivity led in recent years to the development of many alternative detection techniques that seem to rival and challenge the conventional "old-fashioned" induction-detection approach. The aim of this manuscript is to briefly review recent advances in the field, and to provide a quantitative as well as qualitative comparison between various detection methods with an eye to future potential advances and developments. We first offer a common definition of sensitivity in magnetic resonance to enable proper quantitative comparisons between various detection methods. Following that, up-to-date information about the sensitivity capabilities of the leading recently-developed detection approaches in magnetic resonance is provided, accompanied by a critical comparison between them and induction detection. Our conclusion from this comparison is that induction detection is still indispensable, and as such, it is very important to look for ways to significantly improve it. To do so, we provide expressions for the sensitivity of induction-detection, derived from both classical and quantum mechanics, that identify its main limiting factors. Examples from current literature, as well as a description of new ideas, show how these limiting factors can be mitigated to significantly improve the sensitivity of induction detection. Finally, we outline some directions for the possible applications of high-sensitivity induction detection in the field of electron spin resonance.

  18. A Comparison of Temporal Dominance of Sensation (TDS) and Quantitative Descriptive Analysis (QDA™) to Identify Flavors in Strawberries.

    PubMed

    Oliver, Penelope; Cicerale, Sara; Pang, Edwin; Keast, Russell

    2018-04-01

    Temporal dominance of sensations (TDS) is a rapid descriptive method that offers a different magnitude of information to traditional descriptive analysis methodologies. This methodology considers the dynamic nature of eating, assessing sensory perception of foods as they change throughout the eating event. Limited research has applied the TDS methodology to strawberries and subsequently validated the results against Quantitative Descriptive Analysis (QDA™). The aim of this research is to compare the TDS methodology using an untrained consumer panel to the results obtained via QDA™ with a trained sensory panel. The trained panelists (n = 12, minimum 60 hr each panelist) were provided with six strawberry samples (three cultivars at two maturation levels) and applied QDA™ techniques to profile each strawberry sample. Untrained consumers (n = 103) were provided with six strawberry samples (three cultivars at two maturation levels) and required to use TDS methodology to assess the dominant sensations for each sample as they change over time. Results revealed moderately comparable product configurations produced via TDS in comparison to QDA™ (RV coefficient = 0.559), as well as similar application of the sweet attribute (correlation coefficient of 0.895 at first bite). The TDS methodology, however, was not in agreement with the QDA™ methodology regarding more complex flavor terms. These findings support the notion that the lack of training on the definition of terms, together with the limitations of the methodology to ignore all attributes other than those dominant, provides a different magnitude of information than the QDA™ methodology. A comparison of TDS to traditional descriptive analysis indicates that TDS provides additional information to QDA™ regarding the lingering component of eating. The QDA™ results, however, provide more precise detail regarding singular attributes. Therefore, the TDS methodology has an application in industry when it is important to understand the lingering profile of products. However, this methodology should not be employed as a replacement for traditional descriptive analysis methods. © 2018 Institute of Food Technologists®.

  19. Cost and Efficacy Assessment of an Alternative Medication Compliance Urine Drug Testing Strategy.

    PubMed

    Doyle, Kelly; Strathmann, Frederick G

    2017-02-01

    This study investigates the frequency at which quantitative results provide additional clinical benefit compared to qualitative results alone. A comparison between alternative urine drug screens and conventional screens including the assessment of cost-to-payer differences, accuracy of prescription compliance or polypharmacy/substance abuse was also included. In a reference laboratory evaluation of urine specimens from across the United States, 213 urine specimens with provided prescription medication information (302 prescriptions) were analyzed by two testing algorithms: 1) conventional immunoassay screen with subsequent reflexive testing of positive results by quantitative mass spectrometry; and 2) a combined immunoassay/qualitative mass-spectrometry screen that substantially reduced the need for subsequent testing. The qualitative screen was superior to immunoassay with reflex to mass spectrometry in confirming compliance per prescription (226/302 vs 205/302), and identifying non-prescription abuse (97 vs 71). Pharmaceutical impurities and inconsistent drug metabolite patterns were detected in only 3.8% of specimens, suggesting that quantitative results have limited benefit. The percentage difference between the conventional testing algorithm and the alternative screen was projected to be 55%, and a 2-year evaluation of test utilization as a measure of test order volume follows an exponential trend for alternative screen test orders over conventional immunoassay screens that require subsequent confirmation testing. Alternative, qualitative urine drug screens provide a less expensive, faster, and more comprehensive evaluation of patient medication compliance and drug abuse. The vast majority of results were interpretable with qualitative results alone indicating a reduced need to automatically reflex to quantitation or provide quantitation for the majority of patients. This strategy highlights a successful approach using an alternative strategy for both the laboratory and physician to align clinical needs while being mindful of costs.

  20. Comparison of the signal-to-noise characteristics of quantum versus thermal ghost imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Sullivan, Malcolm N.; Chan, Kam Wai Clifford; Boyd, Robert W.

    2010-11-15

    We present a theoretical comparison of the signal-to-noise characteristics of quantum versus thermal ghost imaging. We first calculate the signal-to-noise ratio of each process in terms of its controllable experimental conditions. We show that a key distinction is that a thermal ghost image always resides on top of a large background; the fluctuations in this background constitute an intrinsic noise source for thermal ghost imaging. In contrast, there is a negligible intrinsic background to a quantum ghost image. However, for practical reasons involving achievable illumination levels, acquisition times for thermal ghost images are often much shorter than those for quantum ghost images. We provide quantitative predictions for the conditions under which each process provides superior performance. Our conclusion is that each process can provide useful functionality, although under complementary conditions.

  1. Tree Testing of Hierarchical Menu Structures for Health Applications

    PubMed Central

    Le, Thai; Chaudhuri, Shomir; Chung, Jane; Thompson, Hilaire J; Demiris, George

    2014-01-01

    To address the need for greater evidence-based evaluation of Health Information Technology (HIT) systems we introduce a method of usability testing termed tree testing. In a tree test, participants are presented with an abstract hierarchical tree of the system taxonomy and asked to navigate through the tree in completing representative tasks. We apply tree testing to a commercially available health application, demonstrating a use case and providing a comparison with more traditional in-person usability testing methods. Online tree tests (N=54) and in-person usability tests (N=15) were conducted from August to September 2013. Tree testing provided a method to quantitatively evaluate the information structure of a system using various navigational metrics including completion time, task accuracy, and path length. The results of the analyses compared favorably to the results seen from the traditional usability test. Tree testing provides a flexible, evidence-based approach for researchers to evaluate the information structure of HITs. In addition, remote tree testing provides a quick, flexible, and high volume method of acquiring feedback in a structured format that allows for quantitative comparisons. With the diverse nature and often large quantities of health information available, addressing issues of terminology and concept classifications during the early development process of a health information system will improve navigation through the system and save future resources. Tree testing is a usability method that can be used to quickly and easily assess information hierarchy of health information systems. PMID:24582924
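
    The navigational metrics named above (completion time, task accuracy, path length) reduce to simple aggregates over per-participant task logs. A minimal sketch with hypothetical log records is shown below; it is not the output format of any particular tree-testing tool.

```python
# Minimal sketch: summarizing tree-test logs into the three navigational metrics.
import statistics

# Each record: (participant, seconds_to_complete, correct_node_reached, nodes_visited)
logs = [
    ("p01", 14.2, True, 4),
    ("p02", 22.8, False, 7),
    ("p03", 9.6, True, 3),
    ("p04", 17.1, True, 5),
]

completion_time = statistics.mean(t for _, t, _, _ in logs)
task_accuracy = sum(ok for _, _, ok, _ in logs) / len(logs)
path_length = statistics.mean(n for _, _, _, n in logs)

print(f"mean time {completion_time:.1f}s, accuracy {task_accuracy:.0%}, mean path length {path_length:.1f}")
```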

  2. iMet-Q: A User-Friendly Tool for Label-Free Metabolomics Quantitation Using Dynamic Peak-Width Determination

    PubMed Central

    Chang, Hui-Yin; Chen, Ching-Tai; Lih, T. Mamie; Lynn, Ke-Shiuan; Juo, Chiun-Gung; Hsu, Wen-Lian; Sung, Ting-Yi

    2016-01-01

    Efficient and accurate quantitation of metabolites from LC-MS data has become an important topic. Here we present an automated tool, called iMet-Q (intelligent Metabolomic Quantitation), for label-free metabolomics quantitation from high-throughput MS1 data. By performing peak detection and peak alignment, iMet-Q provides a summary of quantitation results and reports ion abundance at both replicate level and sample level. Furthermore, it gives the charge states and isotope ratios of detected metabolite peaks to facilitate metabolite identification. An in-house standard mixture and a public Arabidopsis metabolome data set were analyzed by iMet-Q. Three public quantitation tools, including XCMS, MetAlign, and MZmine 2, were used for performance comparison. From the mixture data set, seven standard metabolites were detected by the four quantitation tools, for which iMet-Q had a smaller quantitation error of 12% in both profile and centroid data sets. Our tool also correctly determined the charge states of seven standard metabolites. By searching the mass values for those standard metabolites against Human Metabolome Database, we obtained a total of 183 metabolite candidates. With the isotope ratios calculated by iMet-Q, 49% (89 out of 183) metabolite candidates were filtered out. From the public Arabidopsis data set reported with two internal standards and 167 elucidated metabolites, iMet-Q detected all of the peaks corresponding to the internal standards and 167 metabolites. Meanwhile, our tool had small abundance variation (≤0.19) when quantifying the two internal standards and had higher abundance correlation (≥0.92) when quantifying the 167 metabolites. iMet-Q provides user-friendly interfaces and is publicly available for download at http://ms.iis.sinica.edu.tw/comics/Software_iMet-Q.html. PMID:26784691

  3. Nanoscale Structure of Type I Collagen Fibrils: Quantitative Measurement of D-spacing

    PubMed Central

    Erickson, Blake; Fang, Ming; Wallace, Joseph M.; Orr, Bradford G.; Les, Clifford M.; Holl, Mark M. Banaszak

    2012-01-01

    This paper details a quantitative method to measure the D-periodic spacing of Type I collagen fibrils using Atomic Force Microscopy coupled with analysis using a 2D Fast Fourier Transform approach. Instrument calibration, data sampling and data analysis are all discussed and comparisons of the data to the complementary methods of electron microscopy and X-ray scattering are made. Examples of the application of this new approach to the analysis of Type I collagen morphology in disease models of estrogen depletion and Osteogenesis Imperfecta are provided. We demonstrate that it is the D-spacing distribution, not the D-spacing mean, that showed statistically significant differences in estrogen depletion associated with early stage Osteoporosis and Osteogenesis Imperfecta. The ability to quantitatively characterize nanoscale morphological features of Type I collagen fibrils will provide important structural information regarding Type I collagen in many research areas, including tissue aging and disease, tissue engineering, and gene knock out studies. Furthermore, we also envision potential clinical applications including evaluation of tissue collagen integrity under the impact of diseases or drug treatments. PMID:23027700
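
    The essence of the FFT-based D-spacing measurement is recovering the dominant spatial period of a fibril height profile from the location of the corresponding Fourier peak. The sketch below works in 1-D on a synthetic profile with an assumed 67 nm period and pixel size; the authors' pipeline operates on 2-D AFM images and is not reproduced here.

```python
# Minimal sketch: estimate a collagen-like D-period from the Fourier spectrum
# of a synthetic height profile. Period, pixel size, and noise are assumptions.
import numpy as np

pixel_nm = 2.0                                   # assumed AFM pixel size
x = np.arange(0, 2048) * pixel_nm                # scan line positions in nm
true_d = 67.0
profile = np.sin(2 * np.pi * x / true_d) + 0.3 * np.random.default_rng(0).normal(size=x.size)

spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
freqs = np.fft.rfftfreq(x.size, d=pixel_nm)      # spatial frequencies in cycles per nm

peak = np.argmax(spectrum[1:]) + 1               # skip the DC bin
d_spacing = 1.0 / freqs[peak]
print(f"estimated D-spacing: {d_spacing:.1f} nm")
```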

  4. Development and characterization of a dynamic lesion phantom for the quantitative evaluation of dynamic contrast-enhanced MRI.

    PubMed

    Freed, Melanie; de Zwart, Jacco A; Hariharan, Prasanna; Myers, Matthew R; Badano, Aldo

    2011-10-01

    To develop a dynamic lesion phantom that is capable of producing physiological kinetic curves representative of those seen in human dynamic contrast-enhanced MRI (DCE-MRI) data. The objective of this phantom is to provide a platform for the quantitative comparison of DCE-MRI protocols to aid in the standardization and optimization of breast DCE-MRI. The dynamic lesion consists of a hollow, plastic mold with inlet and outlet tubes to allow flow of a contrast agent solution through the lesion over time. Border shape of the lesion can be controlled using the lesion mold production method. The configuration of the inlet and outlet tubes was determined using fluid transfer simulations. The total fluid flow rate was determined using x-ray images of the lesion for four different flow rates (0.25, 0.5, 1.0, and 1.5 ml/s) to evaluate the resultant kinetic curve shape and homogeneity of the contrast agent distribution in the dynamic lesion. High spatial and temporal resolution x-ray measurements were used to estimate the true kinetic curve behavior in the dynamic lesion for benign and malignant example curves. DCE-MRI example data were acquired from the dynamic phantom using a clinical protocol. The optimal inlet and outlet tube configuration for the lesion molds was two inlet tubes separated by 30° and a single outlet tube directly between the two inlet tubes. X-ray measurements indicated that 1.0 ml/s was an appropriate total fluid flow rate and provided truth for comparison with MRI data of kinetic curves representative of benign and malignant lesions. DCE-MRI data demonstrated the ability of the phantom to produce realistic kinetic curves. The authors have constructed a dynamic lesion phantom, demonstrated its ability to produce physiological kinetic curves, and provided estimations of its true kinetic curve behavior. This lesion phantom provides a tool for the quantitative evaluation of DCE-MRI protocols, which may lead to improved discrimination of breast cancer lesions.

  5. Garlic (Allium sativum L.) fertility: transcriptome and proteome analyses provide insight into flower and pollen development

    PubMed Central

    Shemesh-Mayer, Einat; Ben-Michael, Tomer; Rotem, Neta; Rabinowitch, Haim D.; Doron-Faigenboim, Adi; Kosmala, Arkadiusz; Perlikowski, Dawid; Sherman, Amir; Kamenetsky, Rina

    2015-01-01

    Commercial cultivars of garlic, a popular condiment, are sterile, making genetic studies and breeding of this plant challenging. However, recent fertility restoration has enabled advanced physiological and genetic research and hybridization in this important crop. Morphophysiological studies, combined with transcriptome and proteome analyses and quantitative PCR validation, enabled the identification of genes and specific processes involved in gametogenesis in fertile and male-sterile garlic genotypes. Both genotypes exhibit normal meiosis at early stages of anther development, but in the male-sterile plants, tapetal hypertrophy after microspore release leads to pollen degeneration. Transcriptome analysis and global gene-expression profiling showed that >16,000 genes are differentially expressed in the fertile vs. male-sterile developing flowers. Proteome analysis and quantitative comparison of 2D-gel protein maps revealed 36 significantly different protein spots, 9 of which were present only in the male-sterile genotype. Bioinformatic and quantitative PCR validation of 10 candidate genes exhibited significant expression differences between male-sterile and fertile flowers. A comparison of morphophysiological and molecular traits of fertile and male-sterile garlic flowers suggests that respiratory restrictions and/or non-regulated programmed cell death of the tapetum can lead to energy deficiency and consequent pollen abortion. Potential molecular markers for male fertility and sterility in garlic are proposed. PMID:25972879

  6. Comparison of detection limits in environmental analysis--is it possible? An approach on quality assurance in the lower working range by verification.

    PubMed

    Geiss, S; Einax, J W

    2001-07-01

    Detection limit, reporting limit, and limit of quantitation are analytical parameters that describe the power of analytical methods. These parameters are used internally for quality assurance and externally for competitive comparisons, especially in the case of trace analysis in environmental compartments. The wide variety of possibilities for computing or obtaining these measures in the literature and in legislative rules makes any comparison difficult. Additionally, a host of terms have been used within the analytical community to describe detection and quantitation capabilities. Without trying to impose order on this variety of terms, this paper aims to provide a practical proposal for answering the analyst's main questions concerning the quality measures above. These main questions and the related parameters are explained and graphically demonstrated. Estimation and verification of these parameters are the two steps needed to obtain realistic measures. A rule for practical verification is given in a table from which the analyst can read what to measure, what to estimate, and which criteria have to be fulfilled. Verified in this manner, the detection limit, reporting limit, and limit of quantitation become comparable, and the analyst is responsible for the unambiguity and reliability of these measures.
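
    For orientation, one widely used calibration-based estimate of these limits (roughly 3.3 and 10 times the residual scatter divided by the slope) can be sketched as below. This is a generic illustration, not the verification rule proposed in the paper, and the calibration data are invented.

```python
# Minimal sketch: detection limit and limit of quantitation from a calibration line.
import numpy as np

conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])        # e.g. micrograms per litre
signal = np.array([0.8, 2.1, 3.3, 5.9, 11.2, 21.6])     # instrument response

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
s_res = residuals.std(ddof=2)                            # residual standard deviation

lod = 3.3 * s_res / slope    # detection limit
loq = 10.0 * s_res / slope   # limit of quantitation
print(f"LOD ~ {lod:.2f}, LOQ ~ {loq:.2f} (same units as conc)")
```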

  7. Comparison of the radiological and chemical toxicity of lead

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beitel, G.A.; Mott, S.

    1995-03-01

    This report estimates the worst-case radiological dose to an individual from ingested lead containing picocurie levels of radionuclides and then compares the calculated radiological health effects to the chemical toxic effects from that same lead. This comparison provides an estimate of the consequences of inadvertently recycling, in the commercial market, lead containing nominally undetectable concentrations of radionuclides. Quantitative expressions for the radiological and chemical toxicities of lead are based on concentrations of lead in the blood stream. The result shows that the chemical toxicity of lead is a greater health hazard, by orders of magnitude, than any probable companion radiation dose.

  8. What We Talk About When We Talk About Drought: Tree-ring Perspectives on Model-Data Comparisons in Hydroclimate Research

    NASA Astrophysics Data System (ADS)

    Cook, B.; Anchukaitis, K. J.

    2017-12-01

    Comparative analyses of paleoclimate reconstructions and climate model simulations can provide valuable insights into past and future climate events. Conducting meaningful and quantitative comparisons, however, can be difficult for a variety of reasons. Here, we use tree-ring based hydroclimate reconstructions to discuss some best practices for paleoclimate-model comparisons, highlighting recent studies that have successfully used this approach. These analyses have improved our understanding of the Medieval-era megadroughts, ocean forcing of large scale drought patterns, and even climate change contributions to future drought risk. Additional work is needed, however, to better reconcile and formalize uncertainties across observed, modeled, and reconstructed variables. In this regard, process based forward models of proxy-systems will likely be a critical tool moving forward.

  9. Comparison of four glycosyl residue composition methods for effectiveness in detecting sugars from cell walls of dicot and grass tissues.

    PubMed

    Biswal, Ajaya K; Tan, Li; Atmodjo, Melani A; DeMartini, Jaclyn; Gelineo-Albersheim, Ivana; Hunt, Kimberly; Black, Ian M; Mohanty, Sushree S; Ryno, David; Wyman, Charles E; Mohnen, Debra

    2017-01-01

    The effective use of plant biomass for biofuel and bioproduct production requires a comprehensive glycosyl residue composition analysis to understand the different cell wall polysaccharides present in the different biomass sources. Here we compared four methods side-by-side for their ability to measure the neutral and acidic sugar composition of cell walls from herbaceous, grass, and woody model plants and bioenergy feedstocks. Arabidopsis, Populus, rice, and switchgrass leaf cell walls, as well as cell walls from Populus wood, rice stems, and switchgrass tillers, were analyzed by (1) gas chromatography-mass spectrometry (GC-MS) of alditol acetates combined with a total uronic acid assay; (2) carbodiimide reduction of uronic acids followed by GC-MS of alditol acetates; (3) GC-MS of trimethylsilyl (TMS) derivatives; and (4) high-pressure, anion-exchange chromatography (HPAEC). All four methods gave comparable abundance ranking of the seven neutral sugars, and three of the methods were able to quantify unique acidic sugars. The TMS, HPAEC, and carbodiimide methods provided comparable quantitative results for the specific neutral and acidic sugar content of the biomass, with the TMS method providing slightly greater yield of specific acidic sugars and high total sugar yields. The alditol acetate method, while providing comparable information on the major neutral sugars, did not provide the requisite quantitative information on the specific acidic sugars in plant biomass. Thus, the alditol acetate method is the least informative of the four methods. This work provides a side-by-side comparison of the efficacy of four different established glycosyl residue composition analysis methods in the analysis of the glycosyl residue composition of cell walls from both dicot (Arabidopsis and Populus) and grass (rice and switchgrass) species. Both primary wall-enriched leaf tissues and secondary wall-enriched wood/stem tissues were analyzed for mol% and mass yield of the non-cellulosic sugars. The TMS, HPAEC, and carbodiimide methods were shown to provide comparable quantitative data on the nine neutral and acidic sugars present in all plant cell walls.

  10. Identification of common coexpression modules based on quantitative network comparison.

    PubMed

    Jo, Yousang; Kim, Sanghyeon; Lee, Doheon

    2018-06-13

    Finding common molecular interactions across different samples is essential for understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes. Therefore, identification of common coexpression networks or modules may reveal the molecular mechanisms of complex disease or the relationships between biological processes. However, there has been no quantitative comparison method for coexpression networks, and the methods developed for other network types cannot be applied to them. We therefore aimed to propose quantitative comparison methods for coexpression networks and to use them to find common biological mechanisms between Huntington's disease and brain aging. We proposed two similarity measures for quantitative comparison of coexpression networks. We then performed experiments using known coexpression networks, showed the validity of the two measures, and evaluated threshold values for similar coexpression network pairs. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and identified similar Huntington's disease-aging module pairs. These modules are related to brain development, cell death, and immune response, suggesting that up-regulated cell-signalling-related cell death and immune/inflammation responses may be common molecular mechanisms in the pathophysiology of HD and normal brain aging in the frontal cortex.
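
    The paper's two similarity measures are not defined in the abstract, so the sketch below uses two generic stand-ins, an edge-set Jaccard index and a correlation of edge weights, to show what a quantitative coexpression-network comparison can look like. The expression matrices, correlation cutoff, and module structure are illustrative assumptions.

```python
# Minimal sketch: two generic similarity measures between coexpression networks.
import numpy as np

def coexpression_adjacency(expr, cutoff=0.7):
    """expr: genes x samples -> (thresholded |correlation| network, correlation matrix)."""
    corr = np.corrcoef(expr)
    np.fill_diagonal(corr, 0.0)
    return (np.abs(corr) >= cutoff).astype(int), corr

def jaccard(adj_a, adj_b):
    a, b = adj_a.astype(bool), adj_b.astype(bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

rng = np.random.default_rng(0)
latent = rng.normal(size=30)                       # shared co-expression driver
expr_a = rng.normal(size=(100, 30))
expr_a[:20] += 3.0 * latent                        # genes 0-19 form a co-expressed module
expr_b = expr_a + rng.normal(scale=0.8, size=(100, 30))   # perturbed version of network A

adj_a, corr_a = coexpression_adjacency(expr_a)
adj_b, corr_b = coexpression_adjacency(expr_b)
upper = np.triu_indices(100, k=1)

print("edge Jaccard:", jaccard(adj_a, adj_b))
print("weight correlation:", np.corrcoef(corr_a[upper], corr_b[upper])[0, 1])
```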

  11. Effects of forest practices on peak flows and consequent channel response: a state-of-science report for western Oregon and Washington.

    Treesearch

    Gordon E. Grant; Sarah L. Lewis; Frederick J. Swanson; John H. Cissel; Jeffrey J. McDonnell

    2008-01-01

    This is a state-of-the-science synthesis of the effects of forest harvest activities on peak flows and channel morphology in the Pacific Northwest, with a specific focus on western Oregon and Washington. We develop a database of relevant studies reporting peak flow data across rain-, transient-, and snow-dominated hydrologic zones, and provide a quantitative comparison...

  12. Stimulation Induced Changes in Frog Neuromuscular Junctions: A Quantitative Ultrastructural Comparison of Rapid-Frozen and Chemically Fixed Nerve Terminals

    DTIC Science & Technology

    1984-03-06

    study was conducted to determine the presynaptic morphological changes due to neural activity in rapidly stimulated neuromuscular junctions...Control preparations were unstimulated and preserved either by chemical fixation or rapid-freezing. This study provides evidence that most of the...tissue. The rapid-frozen preparations in the present study showed, in addition, that rapid stimulation produces an increase in synaptic vesicle

  13. Comparison of Diagnostic Performance of Semi-Quantitative Knee Ultrasound and Knee Radiography with MRI: Oulu Knee Osteoarthritis Study.

    PubMed

    Podlipská, Jana; Guermazi, Ali; Lehenkari, Petri; Niinimäki, Jaakko; Roemer, Frank W; Arokoski, Jari P; Kaukinen, Päivi; Liukkonen, Esa; Lammentausta, Eveliina; Nieminen, Miika T; Tervonen, Osmo; Koski, Juhani M; Saarakkala, Simo

    2016-03-01

    Osteoarthritis (OA) is a common degenerative musculoskeletal disease highly prevalent in aging societies worldwide. Traditionally, knee OA is diagnosed using conventional radiography. However, structural changes of articular cartilage or menisci cannot be directly evaluated using this method. On the other hand, ultrasound is a promising tool able to provide direct information on soft tissue degeneration. The aim of our study was to systematically determine the site-specific diagnostic performance of semi-quantitative ultrasound grading of knee femoral articular cartilage, osteophytes and meniscal extrusion, and of radiographic assessment of joint space narrowing and osteophytes, using MRI as a reference standard. Eighty asymptomatic and 79 symptomatic subjects with mean age of 57.7 years were included in the study. Ultrasound performed best in the assessment of femoral medial and lateral osteophytes, and medial meniscal extrusion. In comparison to radiography, ultrasound performed better or at least equally well in identification of tibio-femoral osteophytes, medial meniscal extrusion and medial femoral cartilage morphological degeneration. Ultrasound provides relevant additional diagnostic information on tissue-specific morphological changes not depicted by conventional radiography. Consequently, the use of ultrasound as a complementary imaging tool along with radiography may enable more accurate and cost-effective diagnostics of knee osteoarthritis at the primary healthcare level.

  14. Investigating Children's Abilities to Count and Make Quantitative Comparisons

    ERIC Educational Resources Information Center

    Lee, Joohi; Md-Yunus, Sham'ah

    2016-01-01

    This study was designed to investigate children's abilities to count and make quantitative comparisons. In addition, this study utilized reasoning questions (i.e., how did you know?). Thirty-four preschoolers, mean age 4.5 years old, participated in the study. According to the results, 89% of the children (n = 30) were able to do rote counting and…

  15. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    PubMed

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

    Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit quantitative analysis of experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.
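
    The quantitation step described above boils down to forming light/heavy intensity ratios per transition and summarizing them per peptide. A minimal sketch with hypothetical peptides, transitions, and integrated areas is given below; it does not parse mzXML or reproduce MRMer's interface.

```python
# Minimal sketch: light/heavy ratios per peptide from integrated transition areas.
import numpy as np

integrated_areas = {
    # (peptide, transition): (light_area, heavy_area) -- hypothetical values
    ("ELVISK", "y4"): (1.2e6, 2.5e6),
    ("ELVISK", "y5"): (0.9e6, 1.8e6),
    ("LIVESK", "y3"): (3.1e5, 3.0e5),
    ("LIVESK", "y6"): (2.8e5, 2.9e5),
}

peptides = {}
for (pep, _), (light, heavy) in integrated_areas.items():
    peptides.setdefault(pep, []).append(light / heavy)

for pep, ratios in peptides.items():
    print(f"{pep}: light/heavy = {np.mean(ratios):.2f} (CV {np.std(ratios) / np.mean(ratios):.1%})")
```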

  16. Calibration with MCNP of NaI detector for the determination of natural radioactivity levels in the field.

    PubMed

    Cinelli, Giorgia; Tositti, Laura; Mostacci, Domiziano; Baré, Jonathan

    2016-05-01

    In view of assessing natural radioactivity with on-site quantitative gamma spectrometry, efficiency calibration of NaI(Tl) detectors is investigated. A calibration based on Monte Carlo simulation of detector response is proposed, to render reliable quantitative analysis practicable in field campaigns. The method is developed with reference to contact geometry, in which measurements are taken placing the NaI(Tl) probe directly against the solid source to be analyzed. The Monte Carlo code used for the simulations was MCNP. Experimental verification of the calibration goodness is obtained by comparison with appropriate standards, as reported. On-site measurements yield a quick quantitative assessment of natural radioactivity levels present ((40)K, (238)U and (232)Th). On-site gamma spectrometry can prove particularly useful insofar as it provides information on materials from which samples cannot be taken. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
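
    MCNP performs full photon transport; as a much simpler illustration of Monte-Carlo-based efficiency estimation, the sketch below samples emission points and directions for a probe face pressed against a planar source and multiplies the resulting geometric factor by an assumed intrinsic efficiency. All dimensions and the intrinsic efficiency are invented for illustration.

```python
# Toy sketch of Monte-Carlo efficiency estimation for contact geometry
# (solid-angle factor only; no photon transport as in MCNP).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
det_radius = 2.5          # cm, radius of the NaI(Tl) crystal face (assumed)
source_half_width = 10.0  # cm, half-width of the sampled source surface (assumed)

# Sample emission points on the source plane and isotropic emission directions.
x = rng.uniform(-source_half_width, source_half_width, n)
y = rng.uniform(-source_half_width, source_half_width, n)
cos_theta = rng.uniform(-1.0, 1.0, n)   # cosine of the polar emission angle

# Contact geometry: the crystal face lies in the source plane, centered at the
# origin, so a photon is intercepted only if it is emitted into the upper
# hemisphere from a point directly beneath the face.
intercepted = (cos_theta > 0) & (x**2 + y**2 <= det_radius**2)
geometric = intercepted.mean()

intrinsic = 0.3           # assumed full-energy intrinsic efficiency at this energy
print(f"estimated full-energy efficiency ~ {geometric * intrinsic:.4f}")
```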

  17. Investigation of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian A.

    2005-01-01

    Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical model. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. Excellent agreement is achieved between the predicted and measured results, thereby quantitatively validating the numerical tool.

  18. A Quantitative Comparison of Calibration Methods for RGB-D Sensors Using Different Technologies.

    PubMed

    Villena-Martínez, Víctor; Fuster-Guilló, Andrés; Azorín-López, Jorge; Saval-Calvo, Marcelo; Mora-Pascual, Jeronimo; Garcia-Rodriguez, Jose; Garcia-Garcia, Alberto

    2017-01-27

    RGB-D (Red Green Blue and Depth) sensors are devices that can provide color and depth information from a scene at the same time. Recently, they have been widely used in many solutions due to their commercial growth from the entertainment market to many diverse areas (e.g., robotics, CAD, etc.). In the research community, these devices have had good uptake due to their acceptable level of accuracy for many applications and their low cost, but in some cases, they work at the limit of their sensitivity, near to the minimum feature size that can be perceived. For this reason, calibration processes are critical in order to increase their accuracy and enable them to meet the requirements of such kinds of applications. To the best of our knowledge, there is not a comparative study of calibration algorithms evaluating their results on multiple RGB-D sensors. Specifically, in this paper, a comparison of the three most used calibration methods has been applied to three different RGB-D sensors based on structured light and time-of-flight. The comparison of methods has been carried out by a set of experiments to evaluate the accuracy of depth measurements. Additionally, an object reconstruction application has been used as an example of an application for which the sensor works at the limit of its sensitivity. The obtained results of reconstruction have been evaluated through visual inspection and quantitative measurements.

  19. Satellite Derived Volcanic Ash Product Inter-Comparison in Support to SCOPE-Nowcasting

    NASA Astrophysics Data System (ADS)

    Siddans, Richard; Thomas, Gareth; Pavolonis, Mike; Bojinski, Stephan

    2016-04-01

    In support of aeronautical meteorological services, WMO organized a satellite-based volcanic ash retrieval algorithm inter-comparison activity, to improve the consistency of quantitative volcanic ash products from satellites, under the Sustained, Coordinated Processing of Environmental Satellite Data for Nowcasting (SCOPE-Nowcasting) initiative (http://www.wmo.int/pages/prog/sat/scope-nowcasting_en.php). The aims of the intercomparison were as follows: 1. Select cases (Sarychev Peak 2009, Eyjafjallajökull 2010, Grimsvötn 2011, Puyehue-Cordón Caulle 2011, Kirishimayama 2011, Kelut 2014), and quantify the differences between satellite-derived volcanic ash cloud properties derived from different techniques and sensors; 2. Establish a basic validation protocol for satellite-derived volcanic ash cloud properties; 3. Document the strengths and weaknesses of different remote sensing approaches as a function of satellite sensor; 4. Standardize the units and quality flags associated with volcanic cloud geophysical parameters; 5. Provide recommendations to Volcanic Ash Advisory Centers (VAACs) and other users on how best to utilize quantitative satellite products in operations; 6. Create a "road map" for future volcanic ash related scientific developments and inter-comparison/validation activities that can also be applied to SO2 clouds and emergent volcanic clouds. Volcanic ash satellite remote sensing experts from operational and research organizations were encouraged to participate in the inter-comparison activity, to establish the plans for the inter-comparison and to submit data sets. RAL was contracted by EUMETSAT to perform a systematic inter-comparison of all submitted datasets, and results were reported at the WMO International Volcanic Ash Inter-comparison Meeting held on 29 June - 2 July 2015 in Madison, WI, USA (http://cimss.ssec.wisc.edu/meetings/vol_ash14). 26 different data sets were submitted, from a range of passive imagers and spectrometers, and these were inter-compared against each other and against validation data such as CALIPSO lidar, ground-based lidar and aircraft observations. Results of the comparison exercise will be presented together with the conclusions and recommendations arising from the activity.

  20. Four-point bending as a method for quantitatively evaluating spinal arthrodesis in a rat model.

    PubMed

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-02-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague-Dawley rat spines after single-level posterolateral fusion procedures at L4-L5. Segments were classified as 'not fused,' 'restricted motion,' or 'fused' by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4-L5 motion segment, and stiffness was measured as the slope of the moment-displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery.
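
    The stiffness reported above is the slope of the moment-displacement curve in the constant-moment region of the four-point bend. A minimal sketch of that calculation, with hypothetical load-displacement data and an assumed moment arm, is shown below.

```python
# Minimal sketch: stiffness as the slope of the moment-displacement curve.
import numpy as np

displacement_mm = np.array([0.00, 0.05, 0.10, 0.15, 0.20, 0.25])
load_N = np.array([0.0, 0.9, 1.8, 2.6, 3.5, 4.3])   # total actuator load (hypothetical)
moment_arm_mm = 6.0                                 # outer support to inner roller (assumed)

# Between the inner rollers the bending moment is constant: M = (P/2) * a,
# where P is the total load and a the moment arm.
moment_Nmm = 0.5 * load_N * moment_arm_mm

stiffness = np.polyfit(displacement_mm, moment_Nmm, 1)[0]   # slope of the linear region
print(f"flexural stiffness ~ {stiffness:.2f} N*mm per mm of displacement")
```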

  1. An anthropomorphic phantom for quantitative evaluation of breast MRI.

    PubMed

    Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo

    2011-02-01

    In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. This phantom provides a platform for the optimization and standardization of breast MRI imaging protocols for lesion detection and characterization.
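
    The covariance comparison described above can be sketched generically: estimate a spatial covariance matrix along one image direction for phantom and patient images and summarize how similar the two matrices are. The images and the scalar similarity summary below are illustrative stand-ins for the authors' analysis.

```python
# Minimal sketch: directional spatial covariance matrices and a similarity summary.
import numpy as np

def directional_covariance(image, axis=0):
    """Covariance of pixel positions along `axis`, using the other axis as samples."""
    data = image if axis == 0 else image.T   # rows = positions along the chosen direction
    return np.cov(data)

rng = np.random.default_rng(0)
phantom = rng.normal(size=(64, 64))          # placeholder for a phantom image
patient = rng.normal(size=(64, 64))          # placeholder for a patient image

cov_phantom = directional_covariance(phantom, axis=0)
cov_patient = directional_covariance(patient, axis=0)

# A simple scalar summary of how similar the two covariance structures are.
upper = np.triu_indices(64)
similarity = np.corrcoef(cov_phantom[upper], cov_patient[upper])[0, 1]
print(f"covariance similarity (correlation of entries): {similarity:.2f}")
```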

  2. Comparability analysis of protein therapeutics by bottom-up LC-MS with stable isotope-tagged reference standards

    PubMed Central

    Manuilov, Anton V; Radziejewski, Czeslaw H

    2011-01-01

    Comparability studies lie at the heart of assessments that evaluate differences amongst manufacturing processes and stability studies of protein therapeutics. Low resolution chromatographic and electrophoretic methods facilitate quantitation, but do not always yield detailed insight into the effect of the manufacturing change or environmental stress. Conversely, mass spectrometry (MS) can provide high resolution information on the molecule, but conventional methods are not very quantitative. This gap can be reconciled by use of a stable isotope-tagged reference standard (SITRS), a version of the analyte protein that is uniformly labeled with 13C6-arginine and 13C6-lysine. The SITRS serves as an internal control that is trypsin-digested and analyzed by liquid chromatography (LC)-MS with the analyte sample. The ratio of the ion intensities of each unlabeled and labeled peptide pair is then compared to that of other sample(s). A comparison of these ratios provides a readily accessible way to spot even minute differences among samples. In a study of a monoclonal antibody (mAb) spiked with varying amounts of the same antibody bearing point mutations, peptides containing the mutations were readily identified and quantified at concentrations as low as 2% relative to unmodified peptides. The method was robust, reproducible and produced a linear response for every peptide that was monitored. The method was also successfully used to distinguish between two batches of a mAb that were produced in two different cell lines while two batches produced from the same cell line were found to be highly comparable. Finally, the use of the SITRS method in the comparison of two stressed mAb samples enabled the identification of sites susceptible to deamidation and oxidation, as well as their quantitation. The experimental results indicate that use of a SITRS in a peptide mapping experiment with MS detection enables sensitive and quantitative comparability studies of proteins at high resolution. PMID:21654206

  3. Comparability analysis of protein therapeutics by bottom-up LC-MS with stable isotope-tagged reference standards.

    PubMed

    Manuilov, Anton V; Radziejewski, Czeslaw H; Lee, David H

    2011-01-01

    Comparability studies lie at the heart of assessments that evaluate differences amongst manufacturing processes and stability studies of protein therapeutics. Low resolution chromatographic and electrophoretic methods facilitate quantitation, but do not always yield detailed insight into the effect of the manufacturing change or environmental stress. Conversely, mass spectrometry (MS) can provide high resolution information on the molecule, but conventional methods are not very quantitative. This gap can be reconciled by use of a stable isotope-tagged reference standard (SITRS), a version of the analyte protein that is uniformly labeled with (13)C6-arginine and (13)C6-lysine. The SITRS serves as an internal control that is trypsin-digested and analyzed by liquid chromatography (LC)-MS with the analyte sample. The ratio of the ion intensities of each unlabeled and labeled peptide pair is then compared to that of other sample(s). A comparison of these ratios provides a readily accessible way to spot even minute differences among samples. In a study of a monoclonal antibody (mAb) spiked with varying amounts of the same antibody bearing point mutations, peptides containing the mutations were readily identified and quantified at concentrations as low as 2% relative to unmodified peptides. The method was robust, reproducible, and produced a linear response for every peptide that was monitored. The method was also successfully used to distinguish between two batches of a mAb that were produced in two different cell lines while two batches produced from the same cell line were found to be highly comparable. Finally, the use of the SITRS method in the comparison of two stressed mAb samples enabled the identification of sites susceptible to deamidation and oxidation, as well as their quantitation. The experimental results indicate that use of a SITRS in a peptide mapping experiment with MS detection enables sensitive and quantitative comparability studies of proteins at high resolution.

  4. Deep Learning for Magnetic Resonance Fingerprinting: A New Approach for Predicting Quantitative Parameter Values from Time Series.

    PubMed

    Hoppe, Elisabeth; Körzdörfer, Gregor; Würfl, Tobias; Wetzl, Jens; Lugauer, Felix; Pfeuffer, Josef; Maier, Andreas

    2017-01-01

    The purpose of this work is to evaluate methods from deep learning for application to Magnetic Resonance Fingerprinting (MRF). MRF is a recently proposed measurement technique for generating quantitative parameter maps. In MRF a non-steady state signal is generated by a pseudo-random excitation pattern. A comparison of the measured signal in each voxel with the physical model yields quantitative parameter maps. Currently, the comparison is done by matching a dictionary of simulated signals to the acquired signals. To accelerate the computation of quantitative maps we train a Convolutional Neural Network (CNN) on simulated dictionary data. As a proof of principle we show that the neural network implicitly encodes the dictionary and can replace the matching process.
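
    The dictionary-matching step that the trained CNN replaces can be sketched as a maximum-correlation search over simulated signals. The signal model and the (T1, T2) grid below are toy placeholders, not a Bloch or EPG simulation.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical dictionary: toy fingerprints on a coarse (T1, T2) grid in ms.
      t1_grid = np.linspace(200, 2000, 40)
      t2_grid = np.linspace(20, 300, 30)
      params = np.array([(t1, t2) for t1 in t1_grid for t2 in t2_grid if t2 < t1])
      t = np.arange(1, 501)  # 500 time points

      def toy_signal(t1, t2):
          # Stand-in for a Bloch/EPG simulation -- NOT the real MRF signal model.
          return np.exp(-t / t2) * (1 - np.exp(-t / t1))

      dictionary = np.array([toy_signal(t1, t2) for t1, t2 in params])
      dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

      # "Measured" voxel signal: one dictionary entry plus noise.
      truth = params[len(params) // 2]
      voxel = toy_signal(*truth) + 0.01 * rng.standard_normal(t.size)

      # Conventional matching = inner product with every normalized entry; a CNN
      # trained on (fingerprint, parameter) pairs predicts (T1, T2) directly
      # and replaces this exhaustive search.
      best = np.argmax(dictionary @ (voxel / np.linalg.norm(voxel)))
      print("true (T1, T2):", truth, " matched:", params[best])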

  5. Computer-oriented synthesis of wide-band non-uniform negative resistance amplifiers

    NASA Technical Reports Server (NTRS)

    Branner, G. R.; Chan, S.-P.

    1975-01-01

    This paper presents a synthesis procedure which provides design values for broad-band amplifiers using non-uniform negative resistance devices. Employing a weighted least squares optimization scheme, the technique, based on an extension of procedures for uniform negative resistance devices, is capable of providing designs for a variety of matching network topologies. It also provides, for the first time, quantitative results for predicting the effects of parameter element variations on overall amplifier performance. The technique is also unique in that it employs exact partial derivatives for optimization and sensitivity computation. In comparison with conventional procedures, significantly improved broad-band designs are shown to result.
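
    The weighted least-squares step at the core of such a synthesis procedure can be illustrated with a generic linear model; the polynomial response and weighting below are hypothetical and stand in for the amplifier-specific objective and its exact partial derivatives.

      import numpy as np

      rng = np.random.default_rng(1)

      # Generic weighted least squares: minimize sum_i w_i * (y_i - X_i @ beta)**2.
      x = np.linspace(0.1, 10.0, 50)                     # e.g. normalized frequency points
      X = np.column_stack([np.ones_like(x), x, x ** 2])  # simple polynomial response model
      beta_true = np.array([2.0, -0.5, 0.05])
      y = X @ beta_true + rng.normal(0, 0.2, x.size)

      w = 1.0 / (0.1 + x)   # hypothetical weights emphasizing the low-frequency band
      W = np.diag(w)

      # Closed-form solution of the weighted normal equations.
      beta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
      print("estimated coefficients:", beta_hat)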

  6. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    PubMed Central

    Harper, Sam; Ruder, Eric; Roman, Henry A.; Geggel, Amelia; Nweke, Onyemaechi; Payne-Sturges, Devon; Levy, Jonathan I.

    2013-01-01

    Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative measures of health inequality in other settings, and these measures may be applicable to environmental regulatory analyses. In this paper, we provide information to assist policy decision makers in determining the viability of using measures of health inequality in the context of environmental regulatory analyses. We conclude that quantification of the distribution of inequalities in health outcomes across social groups of concern, considering both within-group and between-group comparisons, would be consistent with both the structure of regulatory analysis and the core definition of environmental justice. Appropriate application of inequality indicators requires thorough characterization of the baseline distribution of exposures and risks, leveraging data generally available within regulatory analyses. Multiple inequality indicators may be applicable to regulatory analyses, and the choice among indicators should be based on explicit value judgments regarding the dimensions of environmental justice of greatest interest. PMID:23999551
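
    One inequality indicator with an exact between-group/within-group decomposition is the Theil index. The sketch below, on hypothetical exposure data for three population groups, is only meant to show how such a decomposition is computed; it is not drawn from the paper.

      import numpy as np

      def theil(x):
          """Theil T inequality index for strictly positive values."""
          x = np.asarray(x, dtype=float)
          mu = x.mean()
          return np.mean((x / mu) * np.log(x / mu))

      # Hypothetical exposure concentrations for three population groups of concern.
      groups = {
          "group_A": np.array([4.0, 5.0, 6.0, 5.5]),
          "group_B": np.array([8.0, 9.5, 7.5, 10.0]),
          "group_C": np.array([5.0, 6.5, 6.0, 5.5]),
      }
      all_x = np.concatenate(list(groups.values()))
      mu, n = all_x.mean(), all_x.size

      total = theil(all_x)
      # Between-group term: each group represented by its mean exposure.
      between = sum((g.size / n) * (g.mean() / mu) * np.log(g.mean() / mu)
                    for g in groups.values())
      # Within-group term: share-weighted Theil indices of the individual groups.
      within = sum((g.size / n) * (g.mean() / mu) * theil(g) for g in groups.values())

      print(f"total={total:.4f}  between-group={between:.4f}  within-group={within:.4f}")
      assert np.isclose(total, between + within)  # the decomposition is exact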

  7. Uncertainties in ecosystem service maps: a comparison on the European scale.

    PubMed

    Schulp, Catharina J E; Burkhard, Benjamin; Maes, Joachim; Van Vliet, Jasper; Verburg, Peter H

    2014-01-01

    Safeguarding the benefits that ecosystems provide to society is increasingly included as a target in international policies. To support such policies, ecosystem service maps are made. However, little attention has been paid to the accuracy of these maps. We made a systematic review and quantitative comparison of ecosystem service maps on the European scale to generate insights into the uncertainty of ecosystem service maps and to discuss the possibilities for quantitative validation. Maps of climate regulation and recreation were reasonably similar, while large uncertainties among maps of erosion protection and flood regulation were observed. Pollination maps had a moderate similarity. Differences among the maps were caused by differences in indicator definition, level of process understanding, mapping aim, data sources and methodology. The absence of suitable observed data on ecosystem service provisioning hampers independent validation of the maps. Consequently, there are, so far, no accurate measures of ecosystem service map quality. Policy makers and other users need to be cautious when applying ecosystem service maps for decision-making. The results illustrate the need for better process understanding and data acquisition to advance ecosystem service mapping, modelling and validation.

  8. Correlating subjective and objective descriptors of ultra high molecular weight wear particles from total joint prostheses.

    PubMed

    McMullin, Brian T; Leung, Ming-Ying; Shanbhag, Arun S; McNulty, Donald; Mabrey, Jay D; Agrawal, C Mauli

    2006-02-01

    A total of 750 images of individual ultra-high molecular weight polyethylene (UHMWPE) particles isolated from periprosthetic failed hip, knee, and shoulder arthroplasties were extracted from archival scanning electron micrographs. Particle size and morphology were subsequently analyzed using computerized image analysis software utilizing five descriptors found in ASTM F1877-98, a standard for quantitative description of wear debris. An online survey application was developed to display particle images and allowed ten respondents to classify particle morphologies according to commonly used terminology as fibers, flakes, or granules. Particles were categorized based on a simple majority of responses. All descriptors were evaluated using a one-way ANOVA and Tukey-Kramer test for all-pairs comparison among each class of particles. A logistic regression model using half of the particles included in the survey was then used to develop a mathematical scheme to predict whether a given particle should be classified as a fiber, flake, or granule based on its quantitative measurements. The validity of the model was then assessed using the other half of the survey particles and compared with human responses. Comparison of the quantitative measurements of isolated particles showed that the morphologies of each particle type classified by respondents were statistically different from one another (p<0.05). The average agreement between mathematical prediction and human respondents was 83.5% (standard error 0.16%). These data suggest that computerized descriptors can be feasibly correlated with subjective terminology, thus providing a basis for a common vocabulary for particle description which can be translated into quantitative dimensions.
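
    A minimal sketch of the split-sample logistic-regression scheme described above, using synthetic shape descriptors loosely modeled on ASTM F1877-style measures (none of the values are from the study):

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(42)

      # Synthetic descriptors loosely following ASTM F1877-style shape measures
      # (equivalent circle diameter, aspect ratio, elongation, roundness, form factor).
      def make_particles(n, centers, label):
          X = rng.normal(loc=centers, scale=0.15, size=(n, len(centers)))
          return X, np.full(n, label)

      X_fib, y_fib = make_particles(250, [0.5, 4.0, 3.5, 0.3, 0.2], 0)  # "fibers"
      X_flk, y_flk = make_particles(250, [1.5, 1.8, 1.5, 0.6, 0.5], 1)  # "flakes"
      X_grn, y_grn = make_particles(250, [1.0, 1.1, 1.0, 0.9, 0.8], 2)  # "granules"

      X = np.vstack([X_fib, X_flk, X_grn])
      y = np.concatenate([y_fib, y_flk, y_grn])

      # Train on half of the particles, validate on the held-out half,
      # mirroring the split-sample design described in the abstract.
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                                random_state=0, stratify=y)
      clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
      agreement = (clf.predict(X_te) == y_te).mean()
      print(f"agreement with (synthetic) reference labels: {agreement:.1%}")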

  9. Correlating subjective and objective descriptors of ultra high molecular weight wear particles from total joint prostheses

    PubMed Central

    McMullin, Brian T.; Leung, Ming-Ying; Shanbhag, Arun S.; McNulty, Donald; Mabrey, Jay D.; Agrawal, C. Mauli

    2014-01-01

    A total of 750 images of individual ultra-high molecular weight polyethylene (UHMWPE) particles isolated from periprosthetic failed hip, knee, and shoulder arthroplasties were extracted from archival scanning electron micrographs. Particle size and morphology were subsequently analyzed using computerized image analysis software utilizing five descriptors found in ASTM F1877-98, a standard for quantitative description of wear debris. An online survey application was developed to display particle images and allowed ten respondents to classify particle morphologies according to commonly used terminology as fibers, flakes, or granules. Particles were categorized based on a simple majority of responses. All descriptors were evaluated using a one-way ANOVA and Tukey–Kramer test for all-pairs comparison among each class of particles. A logistic regression model using half of the particles included in the survey was then used to develop a mathematical scheme to predict whether a given particle should be classified as a fiber, flake, or granule based on its quantitative measurements. The validity of the model was then assessed using the other half of the survey particles and compared with human responses. Comparison of the quantitative measurements of isolated particles showed that the morphologies of each particle type classified by respondents were statistically different from one another (p < 0.05). The average agreement between mathematical prediction and human respondents was 83.5% (standard error 0.16%). These data suggest that computerized descriptors can be feasibly correlated with subjective terminology, thus providing a basis for a common vocabulary for particle description which can be translated into quantitative dimensions. PMID:16112725

  10. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies, which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays, which could assist assay development and validation activities.

  11. Thermal comparison of buried-heterostructure and shallow-ridge lasers

    NASA Astrophysics Data System (ADS)

    Rustichelli, V.; Lemaître, F.; Ambrosius, H. P. M. M.; Brenot, R.; Williams, K. A.

    2018-02-01

    We present finite difference thermal modeling to predict temperature distribution, heat flux, and thermal resistance inside lasers with different waveguide geometries. We provide a quantitative experimental and theoretical comparison of the thermal behavior of shallow-ridge (SR) and buried-heterostructure (BH) lasers. We investigate the influence of a split heat source to describe p-layer Joule heating and nonradiative energy loss in the active layer and the heat-sinking from top as well as bottom when quantifying thermal impedance. From both measured values and numerical modeling we can quantify the thermal resistance for BH lasers and SR lasers, showing an improved thermal performance from 50 K/W to 30 K/W for otherwise equivalent BH laser designs.
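
    For orientation only, a 1-D finite-difference stack gives the flavor of extracting a thermal resistance from a steady-state temperature solution. All layer properties and dimensions are placeholders, and a 1-D model ignores lateral heat spreading, so the value it prints is far larger than the 30-50 K/W quoted above.

      import numpy as np

      # 1-D steady-state conduction from the active region to the heat sink through
      # a stack of layers.  Placeholder thicknesses and conductivities.
      thickness = np.array([2e-6, 1.5e-6, 100e-6])   # m  (p-cladding, contact, substrate)
      conductivity = np.array([5.0, 30.0, 45.0])     # W/(m K)
      area = 300e-6 * 2e-6                           # m^2, heated stripe area (placeholder)
      power = 50e-3                                  # W dissipated in the active region

      n_per_layer = 50                               # finite-difference cells per layer
      cell_res = np.concatenate([
          np.full(n_per_layer, (t / n_per_layer) / (k * area))
          for t, k in zip(thickness, conductivity)
      ])
      # Temperature at each node = sink temperature + power * resistance below it.
      temperature = 300.0 + power * np.cumsum(cell_res[::-1])[::-1]

      r_th = (temperature[0] - 300.0) / power
      # Without 2-D/3-D heat spreading this grossly overestimates R_th for a real laser.
      print(f"1-D thermal resistance ~ {r_th:.0f} K/W")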

  12. Evaluation of a High Intensity Focused Ultrasound-Immobilized Trypsin Digestion and 18O-Labeling Method for Quantitative Proteomics

    PubMed Central

    López-Ferrer, Daniel; Hixson, Kim K.; Smallwood, Heather; Squier, Thomas C.; Petritis, Konstantinos; Smith, Richard D.

    2009-01-01

    A new method that uses immobilized trypsin concomitant with ultrasonic irradiation results in ultra-rapid digestion and thorough 18O labeling for quantitative protein comparisons. The reproducible and highly efficient method provided effective digestions in <1 min with a minimized amount of enzyme required compared to traditional methods. This method was demonstrated for digestion of both simple and complex protein mixtures, including bovine serum albumin, a global proteome extract from the bacteria Shewanella oneidensis, and mouse plasma, as well as 18O labeling of such complex protein mixtures, which validated the application of this method for differential proteomic measurements. This approach is simple, reproducible, cost effective, rapid, and thus well-suited for automation. PMID:19555078

  13. Clinical application of microsampling versus conventional sampling techniques in the quantitative bioanalysis of antibiotics: a systematic review.

    PubMed

    Guerra Valero, Yarmarly C; Wallis, Steven C; Lipman, Jeffrey; Stove, Christophe; Roberts, Jason A; Parker, Suzanne L

    2018-03-01

    Conventional sampling techniques for clinical pharmacokinetic studies often require the removal of large blood volumes from patients. This can result in a physiological or emotional burden, particularly for neonates or pediatric patients. Antibiotic pharmacokinetic studies are typically performed on healthy adults or general ward patients. These may not account for alterations to a patient's pathophysiology and can lead to suboptimal treatment. Microsampling offers an important opportunity for clinical pharmacokinetic studies in vulnerable patient populations, where smaller sample volumes can be collected. This systematic review provides a description of currently available microsampling techniques and an overview of studies reporting the quantitation and validation of antibiotics using microsampling. A comparison of microsampling to conventional sampling in clinical studies is included.

  14. [Effect of algorithms for calibration set selection on quantitatively determining asiaticoside content in Centella total glucosides by near infrared spectroscopy].

    PubMed

    Zhan, Xue-yan; Zhao, Na; Lin, Zhao-zhou; Wu, Zhi-sheng; Yuan, Rui-juan; Qiao, Yan-jiang

    2014-12-01

    The appropriate algorithm for calibration set selection is one of the key technologies for a good NIR quantitative model. There are different algorithms for calibration set selection, such as the Random Sampling (RS) algorithm, the Conventional Selection (CS) algorithm, the Kennard-Stone (KS) algorithm and the Sample set Partitioning based on joint x-y distance (SPXY) algorithm, among others. However, systematic comparisons among these algorithms are lacking. In the present paper, NIR quantitative models to determine the asiaticoside content in Centella total glucosides were established, for which 7 indexes were classified and selected, and the effects of the CS, KS and SPXY algorithms for calibration set selection on the accuracy and robustness of the NIR quantitative models were investigated. The accuracy indexes of NIR quantitative models with calibration sets selected by the SPXY algorithm were significantly different from those with calibration sets selected by the CS or KS algorithm, while the robustness indexes, such as RMSECV and |RMSEP-RMSEC|, were not significantly different. Therefore, the SPXY algorithm for calibration set selection can improve the predictive accuracy of NIR quantitative models for determining asiaticoside content in Centella total glucosides, while having no significant effect on the robustness of the models, which provides a reference for choosing the appropriate calibration set selection algorithm when NIR quantitative models are established for solid systems of traditional Chinese medicine.
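
    A compact sketch of the classical Kennard-Stone max-min selection on a spectral matrix X is shown below (synthetic spectra; SPXY differs only in using a distance that also includes the y variable):

      import numpy as np

      def kennard_stone(X, n_select):
          """Select n_select calibration samples by the Kennard-Stone max-min rule."""
          X = np.asarray(X, dtype=float)
          dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
          # Start from the two mutually most distant samples.
          selected = list(np.unravel_index(np.argmax(dist), dist.shape))
          remaining = [i for i in range(len(X)) if i not in selected]
          while len(selected) < n_select:
              # Distance from each candidate to its nearest already-selected sample...
              d_min = dist[np.ix_(remaining, selected)].min(axis=1)
              # ...and pick the candidate that maximizes that minimum distance.
              nxt = remaining[int(np.argmax(d_min))]
              selected.append(nxt)
              remaining.remove(nxt)
          return selected

      # SPXY uses the same loop on a distance that sums normalized X-space and
      # y-space distances, so the calibration set also spans the response values.
      rng = np.random.default_rng(7)
      spectra = rng.normal(size=(60, 200))   # 60 hypothetical NIR spectra, 200 wavelengths
      cal_idx = kennard_stone(spectra, 20)
      print("calibration set indices:", cal_idx)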

  15. Evaluation of the clinical sensitivity for the quantification of human immunodeficiency virus type 1 RNA in plasma: Comparison of the new COBAS TaqMan HIV-1 with three current HIV-RNA assays--LCx HIV RNA quantitative, VERSANT HIV-1 RNA 3.0 (bDNA) and COBAS AMPLICOR HIV-1 Monitor v1.5.

    PubMed

    Katsoulidou, Antigoni; Petrodaskalaki, Maria; Sypsa, Vana; Papachristou, Eleni; Anastassopoulou, Cleo G; Gargalianos, Panagiotis; Karafoulidou, Anastasia; Lazanas, Marios; Kordossis, Theodoros; Andoniadou, Anastasia; Hatzakis, Angelos

    2006-02-01

    The COBAS TaqMan HIV-1 test (Roche Diagnostics) was compared with the LCx HIV RNA quantitative assay (Abbott Laboratories), the Versant HIV-1 RNA 3.0 (bDNA) assay (Bayer) and the COBAS Amplicor HIV-1 Monitor v1.5 test (Roche Diagnostics), using plasma samples of various viral load levels from HIV-1-infected individuals. In the comparison of TaqMan with LCx, TaqMan identified as positive 77.5% of the 240 samples versus 72.1% identified by LCx assay, while their overall agreement was 94.6% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.91). Similarly, in the comparison of TaqMan with bDNA 3.0, both methods identified 76.3% of the 177 samples as positive, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.95). Finally, in the comparison of TaqMan with Monitor v1.5, TaqMan identified 79.5% of the 156 samples as positive versus 80.1% identified by Monitor v1.5, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.96). In conclusion, the new COBAS TaqMan HIV-1 test showed excellent agreement with other widely used commercially available tests for the quantitation of HIV-1 viral load.
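
    The head-to-head statistics used above (percentage positive, overall agreement on positivity, and correlation of quantitative results) can be computed in a few lines. The paired log10 viral loads and the limit of quantitation below are hypothetical.

      import numpy as np
      from scipy.stats import pearsonr

      rng = np.random.default_rng(3)

      # Hypothetical paired log10 copies/mL from two assays run on the same samples.
      true_load = rng.uniform(1.0, 6.0, 200)
      assay_1 = true_load + rng.normal(0, 0.15, 200)
      assay_2 = true_load + rng.normal(0, 0.15, 200)
      loq = 1.7  # assumed common lower limit of quantitation (log10), placeholder

      pos_1, pos_2 = assay_1 >= loq, assay_2 >= loq
      overall_agreement = np.mean(pos_1 == pos_2)
      both_pos = pos_1 & pos_2
      r, _ = pearsonr(assay_1[both_pos], assay_2[both_pos])

      print(f"assay 1 positive: {pos_1.mean():.1%}, assay 2 positive: {pos_2.mean():.1%}")
      print(f"overall agreement: {overall_agreement:.1%}, r (both positive): {r:.2f}")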

  16. COMPARISON OF GENETIC METHODS TO OPTICAL METHODS IN THE IDENTIFICATION AND ASSESSMENT OF MOLD IN THE BUILT ENVIRONMENT -- COMPARISON OF TAQMAN AND MICROSCOPIC ANALYSIS OF CLADOSPORIUM SPORES RETRIEVED FROM ZEFON AIR-O-CELL TRACES

    EPA Science Inventory

    Recent advances in the sequencing of relevant water intrusion fungi by the EPA, combined with the development of probes and primers have allowed for the unequivocal quantitative and qualitative identification of fungi in selected matrices.

    In this pilot study, quantitative...

  17. Quantitative Comparison of Dense-Core Amyloid Plaque Accumulation in Amyloid-β Protein Precursor Transgenic Mice.

    PubMed

    Liu, Peng; Reichl, John H; Rao, Eshaan R; McNellis, Brittany M; Huang, Eric S; Hemmy, Laura S; Forster, Colleen L; Kuskowski, Michael A; Borchelt, David R; Vassar, Robert; Ashe, Karen H; Zahs, Kathleen R

    2017-01-01

    There exist several dozen lines of transgenic mice that express human amyloid-β protein precursor (AβPP) with Alzheimer's disease (AD)-linked mutations. AβPP transgenic mouse lines differ in the types and amounts of Aβ that they generate and in their spatiotemporal patterns of expression of Aβ assemblies, providing a toolkit to study Aβ amyloidosis and the influence of Aβ aggregation on brain function. More complete quantitative descriptions of the types of Aβ assemblies present in transgenic mice and in humans during disease progression should add to our understanding of how Aβ toxicity in mice relates to the pathogenesis of AD. Here, we provide a direct quantitative comparison of amyloid plaque burdens and plaque sizes in four lines of AβPP transgenic mice. We measured the fraction of cortex and hippocampus occupied by dense-core plaques, visualized by staining with Thioflavin S, in mice from young adulthood through advanced age. We found that the plaque burdens among the transgenic lines varied by an order of magnitude: at 15 months of age, the oldest age studied, the median cortical plaque burden in 5XFAD mice was already ∼4.5 times that of 21-month-old Tg2576 mice and ∼15 times that of 21-24-month-old rTg9191 mice. Plaque-size distributions changed across the lifespan in a line- and region-dependent manner. We also compared the dense-core plaque burdens in the mice to those measured in a set of pathologically-confirmed AD cases from the Nun Study. Cortical plaque burdens in Tg2576, APPSwePS1ΔE9, and 5XFAD mice eventually far exceeded those measured in the human cohort.

  18. Quantitative Comparison of Dense-Core Amyloid Plaque Accumulation in Amyloid-β Precursor Protein Transgenic Mice

    PubMed Central

    Liu, Peng; Reichl, John H.; Rao, Eshaan R.; McNellis, Brittany M.; Huang, Eric S.; Hemmy, Laura S.; Forster, Colleen L.; Kuskowski, Michael A.; Borchelt, David R.; Vassar, Robert; Ashe, Karen H.; Zahs, Kathleen R.

    2016-01-01

    There exist several dozen lines of transgenic mice that express human amyloid-β precursor protein (AβPP) with Alzheimer’s disease (AD)-linked mutations. AβPP transgenic mouse lines differ in the types and amounts of Aβ that they generate and in their spatiotemporal patterns of expression of Aβ assemblies, providing a toolkit to study Aβ amyloidosis and the influence of Aβ aggregation on brain function. More complete quantitative descriptions of the types of Aβ assemblies present in transgenic mice and in humans during disease progression should add to our understanding of how Aβ toxicity in mice relates to the pathogenesis of AD. Here, we provide a direct quantitative comparison of amyloid plaque burdens and plaque sizes in four lines of AβPP transgenic mice. We measured the fraction of cortex and hippocampus occupied by dense-core plaques, visualized by staining with Thioflavin S, in mice from young adulthood through advanced age. We found that the plaque burdens among the transgenic lines varied by an order of magnitude: at 15 months of age, the oldest age studied, the median cortical plaque burden in 5XFAD mice was already ~4.5 times that of 21-month Tg2576 mice and ~15 times that of 21–24-month rTg9191 mice. Plaque-size distributions changed across the lifespan in a line- and region-dependent manner. We also compared the dense-core plaque burdens in the mice to those measured in a set of pathologically-confirmed AD cases from the Nun Study. Cortical plaque burdens in Tg2576, APPSwePS1ΔE9, and 5XFAD mice eventually far exceeded those measured in the human cohort. PMID:28059792

  19. Combining real-time PCR and next-generation DNA sequencing to provide quantitative comparisons of fungal aerosol populations

    NASA Astrophysics Data System (ADS)

    Dannemiller, Karen C.; Lang-Yona, Naama; Yamamoto, Naomichi; Rudich, Yinon; Peccia, Jordan

    2014-02-01

    We examined fungal communities associated with the PM10 mass of Rehovot, Israel outdoor air samples collected in the spring and fall seasons. Fungal communities were described by 454 pyrosequencing of the internal transcribed spacer (ITS) region of the fungal ribosomal RNA encoding gene. To allow for a more quantitative comparison of fungal exposure in humans, the relative abundance values of specific taxa were transformed to absolute concentrations through multiplying these values by the sample's total fungal spore concentration (derived from universal fungal qPCR). Next, the sequencing-based absolute concentrations for Alternaria alternata, Cladosporium cladosporioides, Epicoccum nigrum, and Penicillium/Aspergillus spp. were compared to taxon-specific qPCR concentrations for A. alternata, C. cladosporioides, E. nigrum, and Penicillium/Aspergillus spp. derived from the same spring and fall aerosol samples. Results of these comparisons showed that the absolute concentration values generated from pyrosequencing were strongly associated with the concentration values derived from taxon-specific qPCR (for all four species, p < 0.005, all R > 0.70). The correlation coefficients were greater for species present in higher concentrations. Our microbial aerosol population analyses demonstrated that fungal diversity (number of fungal operational taxonomic units) was higher in the spring compared to the fall (p = 0.02), and principal coordinate analysis showed distinct seasonal differences in taxa distribution (ANOSIM p = 0.004). Among genera containing allergenic and/or pathogenic species, the absolute concentrations of Alternaria, Aspergillus, Fusarium, and Cladosporium were greater in the fall, while Cryptococcus, Penicillium, and Ulocladium concentrations were greater in the spring. The transformation of pyrosequencing fungal population relative abundance data to absolute concentrations can improve next-generation DNA sequencing-based quantitative aerosol exposure assessment.
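
    The abundance transformation described above, relative abundance from sequencing multiplied by the universal-qPCR total concentration, and its comparison against taxon-specific qPCR can be sketched as follows; every number is a made-up placeholder.

      import numpy as np
      from scipy.stats import pearsonr

      # Hypothetical data for six air samples.
      total_spores = np.array([1.2e4, 8.0e3, 2.5e4, 5.0e3, 1.8e4, 9.5e3])  # spores/m^3, universal qPCR
      rel_abundance = np.array([0.12, 0.05, 0.20, 0.02, 0.15, 0.08])       # ITS read fraction for one taxon
      taxon_qpcr = np.array([1.5e3, 4.2e2, 5.1e3, 9.0e1, 2.6e3, 7.0e2])    # spores/m^3, taxon-specific assay

      # Sequencing-derived absolute concentration = relative abundance x total concentration.
      seq_absolute = rel_abundance * total_spores

      r, p = pearsonr(seq_absolute, taxon_qpcr)
      print("sequencing-derived concentrations:", np.round(seq_absolute, 1))
      print(f"correlation with taxon-specific qPCR: R = {r:.2f}, p = {p:.3f}")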

  20. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  1. The Application of SILAC Mouse in Human Body Fluid Proteomics Analysis Reveals Protein Patterns Associated with IgA Nephropathy.

    PubMed

    Zhao, Shilin; Li, Rongxia; Cai, Xiaofan; Chen, Wanjia; Li, Qingrun; Xing, Tao; Zhu, Wenjie; Chen, Y Eugene; Zeng, Rong; Deng, Yueyi

    2013-01-01

    The body fluid proteome is the most informative proteome from a medical viewpoint, but the lack of an accurate quantitation method for complicated body fluids has limited its application in disease research and biomarker discovery. To address this problem, we introduced a novel strategy in which SILAC-labeled mouse serum was used as an internal standard for human serum and urine proteome analysis. The SILAC-labeled mouse serum was mixed with human serum and urine, and multidimensional separation coupled with tandem mass spectrometry (IEF-LC-MS/MS) analysis was performed. The peptides shared between the two species were quantified by their SILAC pairs, and the human-only peptides were quantified by coeluting mouse peptides. The comparison of results from two replicate experiments indicated the high repeatability of our strategy. The urine of treated and untreated Immunoglobulin A nephropathy (IgAN) patients was then compared using this quantitation strategy. Fifty-three peptides were found to be significantly changed between the two groups, including both known diagnostic markers for IgAN and novel candidates, such as Complement C3, Albumin, VDBP, ApoA1, and IGFBP7. In conclusion, we have developed a practical and accurate quantitation strategy for comparison of complicated human body fluid proteomes. The results from such a strategy could provide potential disease-related biomarkers for evaluation of treatment.

  2. Normalized Temperature Contrast Processing in Flash Infrared Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

    The paper presents further development of the normalized contrast processing flash infrared thermography method given by the author in US 8,577,120 B1. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, including conversion of one to the other. Methods of assessing the emissivity of the object, afterglow heat flux, reflection temperature change, and temperature video imaging during flash thermography are provided. Temperature imaging and normalized temperature contrast imaging provide certain advantages over normalized pixel intensity contrast processing by reducing the effect of reflected energy in images and measurements, thus providing better quantitative data. The subject matter for this paper mostly comes from US 9,066,028 B1 by the author. Examples of normalized image processing video images and normalized temperature processing video images are provided. Examples of surface temperature video images, surface temperature rise video images and simple contrast video images are also provided. Temperature video imaging in flash infrared thermography allows better comparison with flash thermography simulation using commercial software that provides temperature video as output. Temperature imaging also allows easy comparison of the surface temperature change with the camera temperature sensitivity, or noise equivalent temperature difference (NETD), to assess the probability of detection (POD) of anomalies.

  3. Wide-field spectrally resolved quantitative fluorescence imaging system: toward neurosurgical guidance in glioma resection

    NASA Astrophysics Data System (ADS)

    Xie, Yijing; Thom, Maria; Ebner, Michael; Wykes, Victoria; Desjardins, Adrien; Miserocchi, Anna; Ourselin, Sebastien; McEvoy, Andrew W.; Vercauteren, Tom

    2017-11-01

    In high-grade glioma surgery, tumor resection is often guided by intraoperative fluorescence imaging. 5-aminolevulinic acid-induced protoporphyrin IX (PpIX) provides fluorescent contrast between normal brain tissue and glioma tissue, thus achieving improved tumor delineation and prolonged patient survival compared with conventional white-light-guided resection. However, commercially available fluorescence imaging systems rely solely on visual assessment of fluorescence patterns by the surgeon, which makes the resection more subjective than necessary. We developed a wide-field spectrally resolved fluorescence imaging system utilizing a Generation II scientific CMOS camera and an improved computational model for the precise reconstruction of the PpIX concentration map. In our model, the tissue's optical properties and illumination geometry, which distort the fluorescent emission spectra, are considered. We demonstrate that the CMOS-based system can detect low PpIX concentration at short camera exposure times, while providing high-pixel resolution wide-field images. We show that total variation regularization improves the contrast-to-noise ratio of the reconstructed quantitative concentration map by approximately twofold. Quantitative comparison between the estimated PpIX concentration and tumor histopathology was also investigated to further evaluate the system.
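
    The reported effect of total-variation regularization on contrast-to-noise ratio can be illustrated generically with scikit-image TV denoising on a synthetic concentration map; this is not the authors' spectral reconstruction model, and the lesion geometry, noise level and TV weight are arbitrary.

      import numpy as np
      from skimage.restoration import denoise_tv_chambolle

      rng = np.random.default_rng(5)

      # Synthetic "concentration map": a bright circular lesion on a dark background.
      yy, xx = np.mgrid[:128, :128]
      lesion = (yy - 64) ** 2 + (xx - 64) ** 2 < 15 ** 2
      img = np.where(lesion, 1.0, 0.0)
      noisy = img + rng.normal(0, 0.4, img.shape)

      def cnr(image, roi, background):
          """Contrast-to-noise ratio between a region of interest and the background."""
          return (image[roi].mean() - image[background].mean()) / image[background].std()

      denoised = denoise_tv_chambolle(noisy, weight=0.3)  # total-variation regularization

      print(f"CNR before TV: {cnr(noisy, lesion, ~lesion):.1f}")
      print(f"CNR after  TV: {cnr(denoised, lesion, ~lesion):.1f}")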

  4. Genetic basis of adaptation in Arabidopsis thaliana: local adaptation at the seed dormancy QTL DOG1.

    PubMed

    Kronholm, Ilkka; Picó, F Xavier; Alonso-Blanco, Carlos; Goudet, Jérôme; de Meaux, Juliette

    2012-07-01

    Local adaptation provides an opportunity to study the genetic basis of adaptation and investigate the allelic architecture of adaptive genes. We study delay of germination 1 (DOG1), a gene controlling natural variation in seed dormancy in Arabidopsis thaliana, and investigate the evolution of dormancy in 41 populations distributed in four regions separated by natural barriers. Using FST and QST comparisons, we compare variation at DOG1 with neutral markers and with quantitative variation in seed dormancy. Patterns of genetic differentiation among populations suggest that the gene DOG1 contributes to local adaptation. Although QST for seed dormancy is not different from FST for neutral markers, a correlation with variation in summer precipitation supports that seed dormancy is adaptive. We characterize dormancy variation in several F2 populations and show that a series of functionally distinct alleles segregates at the DOG1 locus. Theoretical models have shown that the number and effect of alleles segregating at quantitative trait loci (QTL) have important consequences for adaptation. Our results provide support for models postulating a large number of alleles at quantitative trait loci involved in adaptation. © 2012 The Author(s).

  5. Steady-state and transient operation of a heat-pipe radiator system

    NASA Technical Reports Server (NTRS)

    Sellers, J. P.

    1974-01-01

    Data obtained on a VCHP heat-pipe radiator system tested in a vacuum environment were studied. Analyses and interpretation of the steady-state results are presented along with an initial analysis of some of the transient data. Particular emphasis was placed on quantitative comparisons of the experimental data with computer model simulations. The results of the study provide a better understanding of the system but do not provide a complete explanation for the observed low VCHP performance and the relatively flat radiator panel temperature distribution. The results of the study also suggest hardware, software, and testing improvements.

  6. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    PubMed

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
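
    A common way to model POD as a continuous function of concentration is a logistic fit to detect/non-detect outcomes on log concentration; whether this matches the exact parameterization of the POD model described above is not asserted here. The spiking levels and response curve below are synthetic.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(11)

      # Synthetic validation data: 20 replicates at each spiking level (units arbitrary),
      # each replicate scored as detect (1) or non-detect (0).
      levels = np.repeat([0.5, 1.0, 2.0, 5.0, 10.0], 20)
      true_pod = 1 / (1 + np.exp(-2.0 * (np.log(levels) - np.log(2.0))))
      detected = rng.binomial(1, true_pod)

      # Model POD(c) with a logistic curve in log concentration.
      X = np.log(levels).reshape(-1, 1)
      model = LogisticRegression().fit(X, detected)

      for c in (0.5, 1.0, 2.0, 5.0, 10.0):
          pod_hat = model.predict_proba(np.log([[c]]))[:, 1][0]
          print(f"estimated POD at level {c:4.1f}: {pod_hat:.2f}")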

  7. Rapid comparison of properties on protein surface

    PubMed Central

    Sael, Lee; La, David; Li, Bin; Rustamov, Raif; Kihara, Daisuke

    2008-01-01

    The mapping of physicochemical characteristics onto the surface of a protein provides crucial insights into its function and evolution. This information can be further used in the characterization and identification of similarities within protein surface regions. We propose a novel method which quantitatively compares global and local properties on the protein surface. We have tested the method on comparison of electrostatic potentials and hydrophobicity. The method is based on 3D Zernike descriptors, which provides a compact representation of a given property defined on a protein surface. Compactness and rotational invariance of this descriptor enable fast comparison suitable for database searches. The usefulness of this method is exemplified by studying several protein families including globins, thermophilic and mesophilic proteins, and active sites of TIM β/α barrel proteins. In all the cases studied, the descriptor is able to cluster proteins into functionally relevant groups. The proposed approach can also be easily extended to other surface properties. This protein surface-based approach will add a new way of viewing and comparing proteins to conventional methods, which compare proteins in terms of their primary sequence or tertiary structure. PMID:18618695

  8. Rapid comparison of properties on protein surface.

    PubMed

    Sael, Lee; La, David; Li, Bin; Rustamov, Raif; Kihara, Daisuke

    2008-10-01

    The mapping of physicochemical characteristics onto the surface of a protein provides crucial insights into its function and evolution. This information can be further used in the characterization and identification of similarities within protein surface regions. We propose a novel method which quantitatively compares global and local properties on the protein surface. We have tested the method on comparison of electrostatic potentials and hydrophobicity. The method is based on 3D Zernike descriptors, which provides a compact representation of a given property defined on a protein surface. Compactness and rotational invariance of this descriptor enable fast comparison suitable for database searches. The usefulness of this method is exemplified by studying several protein families including globins, thermophilic and mesophilic proteins, and active sites of TIM beta/alpha barrel proteins. In all the cases studied, the descriptor is able to cluster proteins into functionally relevant groups. The proposed approach can also be easily extended to other surface properties. This protein surface-based approach will add a new way of viewing and comparing proteins to conventional methods, which compare proteins in terms of their primary sequence or tertiary structure.

  9. Congruent climate-related genecological responses from molecular markers and quantitative traits for western white pine (Pinus monticola)

    Treesearch

    Bryce A. Richardson; Gerald E. Rehfeldt; Mee-Sook Kim

    2009-01-01

    Analyses of molecular and quantitative genetic data demonstrate the existence of congruent climate-related patterns in western white pine (Pinus monticola). Two independent studies allowed comparisons of amplified fragment length polymorphism (AFLP) markers with quantitative variation in adaptive traits. Principal component analyses...

  10. Chemometric Methods to Quantify 1D and 2D NMR Spectral Differences Among Similar Protein Therapeutics.

    PubMed

    Chen, Kang; Park, Junyong; Li, Feng; Patil, Sharadrao M; Keire, David A

    2018-04-01

    NMR spectroscopy is an emerging analytical tool for measuring complex drug product qualities, e.g., protein higher order structure (HOS) or heparin chemical composition. Most drug NMR spectra have been visually analyzed; however, NMR spectra are inherently quantitative and multivariate and thus suitable for chemometric analysis. Therefore, quantitative measurements derived from chemometric comparisons between spectra could be a key step in establishing acceptance criteria for a new generic drug or a new batch after a manufacturing change. To measure the capability of chemometric methods to differentiate comparator NMR spectra, we calculated inter-spectra difference metrics on 1D/2D spectra of two insulin drugs, Humulin R® and Novolin R®, from different manufacturers. Both insulin drugs have an identical drug substance but differ in formulation. Chemometric methods (i.e., principal component analysis (PCA), 3-way Tucker3 or graph invariant (GI)) were performed to calculate the Mahalanobis distance (DM) between the two brands (inter-brand) and the distance ratio (DR) among the different lots (intra-brand). PCA on the 1D inter-brand spectral comparison yielded a DM value of 213. In comparing 2D spectra, the Tucker3 analysis yielded the highest differentiability value (DM = 305) of the comparisons made, followed by PCA (DM = 255) and then the GI method (DM = 40). In conclusion, drug quality comparisons among different lots might benefit from PCA on 1D spectra for rapidly comparing many samples, while higher-resolution but more time-consuming comparisons based on 2D NMR data, using Tucker3 analysis or PCA, provide a greater level of assurance for evaluating drug structural similarity between brands.
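
    A sketch of the PCA-then-Mahalanobis-distance comparison on synthetic one-dimensional spectra from two hypothetical brands is given below (Tucker3 and the graph-invariant method are not shown, and the spectra are not NMR data):

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(8)

      # Synthetic 1D spectra: 10 lots per brand, 500 points each; brand B is slightly shifted.
      base = np.exp(-0.5 * ((np.arange(500) - 250) / 30.0) ** 2)
      brand_a = base + rng.normal(0, 0.01, (10, 500))
      brand_b = 0.97 * base + 0.02 * np.roll(base, 15) + rng.normal(0, 0.01, (10, 500))

      scores = PCA(n_components=3).fit_transform(np.vstack([brand_a, brand_b]))
      sa, sb = scores[:10], scores[10:]

      # Mahalanobis distance between brand means using the pooled within-brand covariance.
      pooled_cov = 0.5 * (np.cov(sa, rowvar=False) + np.cov(sb, rowvar=False))
      diff = sa.mean(axis=0) - sb.mean(axis=0)
      d_m = float(np.sqrt(diff @ np.linalg.inv(pooled_cov) @ diff))
      print(f"inter-brand Mahalanobis distance DM = {d_m:.1f}")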

  11. Quantitative proteomic characterization of the lung extracellular matrix in chronic obstructive pulmonary disease and idiopathic pulmonary fibrosis.

    PubMed

    Åhrman, Emma; Hallgren, Oskar; Malmström, Lars; Hedström, Ulf; Malmström, Anders; Bjermer, Leif; Zhou, Xiao-Hong; Westergren-Thorsson, Gunilla; Malmström, Johan

    2018-03-01

    Remodeling of the extracellular matrix (ECM) is a common feature in lung diseases such as chronic obstructive pulmonary disease (COPD) and idiopathic pulmonary fibrosis (IPF). Here, we applied a sequential tissue extraction strategy to describe disease-specific remodeling of human lung tissue, using end-stage COPD and IPF. Our strategy was based on quantitative comparison of the disease proteomes, with specific focus on the matrisome, using data-independent acquisition and targeted data analysis (SWATH-MS). Our work provides an in-depth proteomic characterization of human lung tissue during impaired tissue remodeling. In addition, we show important quantitative and qualitative effects of the solubility of matrisome proteins. COPD was characterized by a disease-specific increase in ECM regulators, metalloproteinase inhibitor 3 (TIMP3) and matrix metalloproteinase 28 (MMP-28), whereas for IPF, impairment in cell adhesion proteins, such as collagen VI and laminins, was most prominent. For both diseases, we identified increased levels of proteins involved in the regulation of endopeptidase activity, with several proteins belonging to the serpin family. The established human lung quantitative proteome inventory and the construction of a tissue-specific protein assay library provide a resource for future quantitative proteomic analyses of human lung tissues. We present a sequential tissue extraction strategy to determine changes in the extractability of matrisome proteins in end-stage COPD and IPF compared to healthy control tissue. Extensive quantitative analysis of the proteome changes in the disease states revealed altered solubility of matrisome proteins among ECM regulators and proteins involved in cell-ECM communication. The results highlight disease-specific remodeling mechanisms associated with COPD and IPF. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Surface areas of fractally rough particles studied by scattering

    NASA Astrophysics Data System (ADS)

    Hurd, Alan J.; Schaefer, Dale W.; Smith, Douglas M.; Ross, Steven B.; Le Méhauté, Alain; Spooner, Steven

    1989-05-01

    The small-angle scattering from fractally rough surfaces has the potential to give information on the surface area at a given resolution. By use of quantitative neutron and x-ray scattering, a direct comparison of surface areas of fractally rough powders was made between scattering and adsorption techniques. This study supports a recently proposed correction to the theory for scattering from fractal surfaces. In addition, the scattering data provide an independent calibration of molecular adsorbate areas.

  13. 75 FR 68468 - List of Fisheries for 2011

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ...-existent; therefore, quantitative data on the frequency of incidental mortality and serious injury is... currently available for most of these marine mammals on the high seas, and quantitative comparison of...

  14. SWMPr: An R Package for Retrieving, Organizing, and ...

    EPA Pesticide Factsheets

    The System-Wide Monitoring Program (SWMP) was implemented in 1995 by the US National Estuarine Research Reserve System. This program has provided two decades of continuous monitoring data at over 140 fixed stations in 28 estuaries. However, the increasing quantity of data provided by the monitoring network has complicated broad-scale comparisons between systems and, in some cases, prevented simple trend analysis of water quality parameters at individual sites. This article describes the SWMPr package, which provides several functions that facilitate data retrieval, organization, and analysis of time series data in the reserve estuaries. Previously unavailable functions for estuaries are also provided to estimate rates of ecosystem metabolism using the open-water method. The SWMPr package has facilitated a cross-reserve comparison of water quality trends and links quantitative information with analysis tools that are also useful for more generic applications to environmental time series. The manuscript describes a software package that was recently developed to retrieve, organize, and analyze monitoring data from the National Estuarine Research Reserve System. Functions are explained in detail, including recent applications for trend analysis of ecosystem metabolism.

  15. Improving the seismic small-scale modelling by comparison with numerical methods

    NASA Astrophysics Data System (ADS)

    Pageot, Damien; Leparoux, Donatienne; Le Feuvre, Mathieu; Durand, Olivier; Côte, Philippe; Capdeville, Yann

    2017-10-01

    Experimental seismic modelling at reduced scale provides an intermediate step between numerical tests and geophysical campaigns on field sites. Recent technologies such as laser interferometers make it possible to acquire data without any coupling effects. Such a device is used in the Mesures Ultrasonores Sans Contact (MUSC) measurement bench, where an automated support system makes it possible to generate multisource, multireceiver seismic data at laboratory scale. Experimental seismic modelling becomes a valuable, value-added stage in validating imaging processes if (1) the experimental measurement chain is fully mastered, so that the experimental data can be reproduced with a numerical tool, and (2) the effective source is reproducible across the measurement setup. These two aspects of quantitative validation, for devices combining piezoelectric sources with a laser interferometer, have not yet been quantitatively studied in the published literature. As a new stage in the experimental modelling approach, both issues are therefore addressed in this paper in order to precisely define the quality of the experimental small-scale data provided by the MUSC bench, which are available to the scientific community. The two validation steps are treated independently of any imaging technique, so that geophysicists who want to use these freely delivered data can know their quality precisely before testing an imaging method. First, to avoid the 2-D-to-3-D correction usually applied in seismic processing when comparing 2-D numerical data with 3-D experimental measurements, we refined the comparison between numerical and experimental data by generating accurate experimental line sources, removing the need for a geometrical-spreading correction of 3-D point-source data. The comparison with 2-D and 3-D numerical modelling is based on the Spectral Element Method. The approach demonstrates the relevance of building a line source by sampling several source points, apart from boundary effects at later arrival times. The experimental results reproduce the amplitude behavior and the π/4 phase delay of a line source in the same manner as the numerical data. In contrast, 2-D corrections applied to 3-D data showed discrepancies that are larger for experimental data than for numerical data, owing to the source wavelet shape and interference between different arrivals. The experimental results from the proposed approach show that these discrepancies are avoided, especially for the reflected echoes. Concerning the second point, the experimental reproducibility of the source, correlation coefficients were calculated for recordings from repeated source impacts on a homogeneous model. The resulting coefficients, higher than 0.98, allow a mean source wavelet to be calculated by inversion of a mean data set. Results obtained on a more realistic model simulating clays over limestones confirmed the reproducibility of the source impact.

  16. Estimates of occupational safety and health impacts resulting from large-scale production of major photovoltaic technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, T.; Ungers, L.; Briggs, T.

    1980-08-01

    The purpose of this study is to estimate, both quantitatively and qualitatively, the worker and societal risks attributable to four photovoltaic cell (solar cell) production processes. Quantitative risk values were determined by use of statistics from the California semiconductor industry. The qualitative risk assessment was performed using a variety of both governmental and private sources of data. The occupational health statistics derived from the semiconductor industry were used to predict injury and fatality levels associated with photovoltaic cell manufacturing. The use of these statistics to characterize the two silicon processes described herein is defensible from the standpoint that many of the same process steps and materials are used in both the semiconductor and photovoltaic industries. These health statistics are less applicable to the gallium arsenide and cadmium sulfide manufacturing processes, primarily because of differences in the materials utilized. Although such differences tend to discourage any absolute comparisons among the four photovoltaic cell production processes, certain relative comparisons are warranted. To facilitate a risk comparison of the four processes, the number and severity of process-related chemical hazards were assessed. This qualitative hazard assessment addresses both the relative toxicity and the exposure potential of substances in the workplace. In addition to the worker-related hazards, estimates of process-related emissions and wastes are also provided.

  17. Development and characterization of a dynamic lesion phantom for the quantitative evaluation of dynamic contrast-enhanced MRI

    PubMed Central

    Freed, Melanie; de Zwart, Jacco A.; Hariharan, Prasanna; Myers, Matthew R.; Badano, Aldo

    2011-01-01

    Purpose: To develop a dynamic lesion phantom that is capable of producing physiological kinetic curves representative of those seen in human dynamic contrast-enhanced MRI (DCE-MRI) data. The objective of this phantom is to provide a platform for the quantitative comparison of DCE-MRI protocols to aid in the standardization and optimization of breast DCE-MRI. Methods: The dynamic lesion consists of a hollow, plastic mold with inlet and outlet tubes to allow flow of a contrast agent solution through the lesion over time. The border shape of the lesion can be controlled using the lesion mold production method. The configuration of the inlet and outlet tubes was determined using fluid transfer simulations. The total fluid flow rate was determined using x-ray images of the lesion for four different flow rates (0.25, 0.5, 1.0, and 1.5 ml/s) to evaluate the resultant kinetic curve shape and homogeneity of the contrast agent distribution in the dynamic lesion. High spatial and temporal resolution x-ray measurements were used to estimate the true kinetic curve behavior in the dynamic lesion for benign and malignant example curves. DCE-MRI example data of the dynamic phantom were acquired using a clinical protocol. Results: The optimal inlet and outlet tube configuration for the lesion molds was two inlet tubes separated by 30° and a single outlet tube directly between the two inlet tubes. X-ray measurements indicated that 1.0 ml/s was an appropriate total fluid flow rate and provided truth for comparison with MRI data of kinetic curves representative of benign and malignant lesions. DCE-MRI data demonstrated the ability of the phantom to produce realistic kinetic curves. Conclusions: The authors have constructed a dynamic lesion phantom, demonstrated its ability to produce physiological kinetic curves, and provided estimations of its true kinetic curve behavior. This lesion phantom provides a tool for the quantitative evaluation of DCE-MRI protocols, which may lead to improved discrimination of breast cancer lesions. PMID:21992378

  18. Impact of immersion oils and mounting media on the confocal imaging of dendritic spines

    PubMed Central

    Peterson, Brittni M.; Mermelstein, Paul G.; Meisel, Robert L.

    2015-01-01

    Background: Structural plasticity, such as changes in dendritic spine morphology and density, reflects changes in synaptic connectivity and circuitry. Procedural variables used in different methods for labeling dendritic spines have been quantitatively evaluated for their impact on the ability to resolve individual spines in confocal microscopic analyses. In contrast, there have been discussions, though no quantitative analyses, of the potential effects of choosing specific mounting media and immersion oils on dendritic spine resolution. New Method: Here we provide quantitative data measuring the impact of these variables on resolving dendritic spines in 3D confocal analyses. Medium spiny neurons from the rat striatum and nucleus accumbens are used as examples. Results: Both choice of mounting media and immersion oil affected the visualization of dendritic spines, with choosing the appropriate immersion oil being more imperative. These biologic data are supported by quantitative measures of the 3D diffraction pattern (i.e. point spread function) of a point source of light under the same mounting medium and immersion oil combinations. Comparison with Existing Method: Although not a new method, this manuscript provides quantitative data demonstrating that different mounting media and immersion oils can impact the ability to resolve dendritic spines. These findings highlight the importance of reporting which mounting medium and immersion oil are used in preparations for confocal analyses, especially when comparing published results from different laboratories. Conclusion: Collectively, these data suggest that choosing the appropriate immersion oil and mounting media is critical for obtaining the best resolution, and consequently more accurate measures of dendritic spine densities. PMID:25601477

  19. 3D quantitative analysis of early decomposition changes of the human face.

    PubMed

    Caplova, Zuzana; Gibelli, Daniele Maria; Poppa, Pasquale; Cummaudo, Marco; Obertova, Zuzana; Sforza, Chiarella; Cattaneo, Cristina

    2018-03-01

    Decomposition of the human body and human face is influenced, among other things, by environmental conditions. The early decomposition changes that modify the appearance of the face may hamper the recognition and identification of the deceased. Quantitative assessment of those changes may provide important information for forensic identification. This report presents a pilot 3D quantitative approach of tracking early decomposition changes of a single cadaver in controlled environmental conditions by summarizing the change with weekly morphological descriptions. The root mean square (RMS) value was used to evaluate the changes of the face after death. The results showed a high correlation (r = 0.863) between the measured RMS and the time since death. RMS values of each scan are presented, as well as the average weekly RMS values. The quantification of decomposition changes could improve the accuracy of antemortem facial approximation and potentially could allow the direct comparisons of antemortem and postmortem 3D scans.

  20. Regulation of Glycan Structures in Animal Tissues

    PubMed Central

    Nairn, Alison V.; York, William S.; Harris, Kyle; Hall, Erica M.; Pierce, J. Michael; Moremen, Kelley W.

    2008-01-01

    Glycan structures covalently attached to proteins and lipids play numerous roles in mammalian cells, including protein folding, targeting, recognition, and adhesion at the molecular or cellular level. Regulating the abundance of glycan structures on cellular glycoproteins and glycolipids is a complex process that depends on numerous factors. Most models for glycan regulation hypothesize that transcriptional control of the enzymes involved in glycan synthesis, modification, and catabolism determines glycan abundance and diversity. However, few broad-based studies have examined correlations between glycan structures and transcripts encoding the relevant biosynthetic and catabolic enzymes. Low transcript abundance for many glycan-related genes has hampered broad-based transcript profiling for comparison with glycan structural data. In an effort to facilitate comparison with glycan structural data and to identify the molecular basis of alterations in glycan structures, we have developed a medium-throughput quantitative real time reverse transcriptase-PCR platform for the analysis of transcripts encoding glycan-related enzymes and proteins in mouse tissues and cells. The method employs a comprehensive list of >700 genes, including enzymes involved in sugar-nucleotide biosynthesis, transporters, glycan extension, modification, recognition, catabolism, and numerous glycosylated core proteins. Comparison with parallel microarray analyses indicates a significantly greater sensitivity and dynamic range for our quantitative real time reverse transcriptase-PCR approach, particularly for the numerous low abundance glycan-related enzymes. Mapping of the genes and transcript levels to their respective biosynthetic pathway steps allowed a comparison with glycan structural data and provides support for a model where many, but not all, changes in glycan abundance result from alterations in transcript expression of corresponding biosynthetic enzymes. PMID:18411279

  1. Simulation of FRET dyes allows quantitative comparison against experimental data

    NASA Astrophysics Data System (ADS)

    Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander

    2018-03-01

    Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.
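
    The FRET efficiency computed from simulated dye-dye distances follows the standard Förster relation E = 1/(1 + (r/R0)^6). The sketch below applies it to a hypothetical distance trajectory with an assumed Förster radius; it is not the authors' coarse-grained model.

      import numpy as np

      rng = np.random.default_rng(13)

      R0 = 5.4  # nm, assumed Förster radius for the dye pair (placeholder value)

      # Hypothetical inter-dye distance trajectory (nm), e.g. from a coarse-grained run.
      r = 4.5 + 0.8 * np.sin(np.linspace(0, 20 * np.pi, 5000)) + rng.normal(0, 0.3, 5000)
      r = np.clip(r, 1.0, None)

      # Instantaneous transfer efficiency for each frame, then the trajectory average
      # that would be compared with the measured single-molecule FRET efficiency.
      efficiency = 1.0 / (1.0 + (r / R0) ** 6)
      print(f"<E> over the trajectory = {efficiency.mean():.2f}")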

  2. Reflectance spectroscopy for evaluating hair follicle cycle

    NASA Astrophysics Data System (ADS)

    Liu, Caihua; Guan, Yue; Wang, Jianru; Zhu, Dan

    2014-02-01

    The hair follicle, a mini-organ that perpetually cycles through telogen, anagen, and catagen, provides a valuable experimental model for studying hair and organ regeneration. The transition of the hair follicle from telogen to anagen is a significant sign of successful regeneration. So far, discrimination of the hair follicle stage has mostly relied on canonical histological examination and on empirical speculation from skin color; hardly any method has been proposed to evaluate the hair follicle stage quantitatively. In this work, a commercial optical fiber spectrometer was used to monitor the diffuse reflectance of mouse skin throughout the hair follicle cycle, and the change in reflectance was recorded. Histological examination was used to verify the hair follicle stage. In comparison with the histological findings, the skin diffuse reflectance was relatively high for mice with telogen hair follicles, decreased once hair follicles transitioned to the anagen stage, and then increased again at the catagen stage. This study provides a new method to quantitatively evaluate the hair follicle stage and should be valuable for basic and therapeutic investigations of hair regeneration.

  3. Option generation in the treatment of unstable patients: An experienced-novice comparison study.

    PubMed

    Whyte, James; Pickett-Hauber, Roxanne; Whyte, Maria D

    2016-09-01

    There is a dearth of studies that quantitatively measure nurses' appreciation of stimuli and the subsequent generation of options in practice environments. The purpose of this paper was to examine nurses' ability to solve problems while quantifying the stimuli on which they focus during patient care activities. The study used a quantitative descriptive method that gathered performance data from a simulated task environment using multi-angle video and audio. The videos were coded, and transcripts of all actions that occurred in the scenario, together with the participants' verbal reports, were compiled. The results revealed a pattern of superiority in the experienced exemplar group. Novice actions were characterized by difficulty in following common protocols, inconsistencies in evaluative approach, and a pattern of omissions of key actions. The study provides support for deliberate practice-based programs designed to facilitate higher-level performance in novices. © 2016 John Wiley & Sons Australia, Ltd.

  4. The conventional tuning fork as a quantitative tool for vibration threshold.

    PubMed

    Alanazy, Mohammed H; Alfurayh, Nuha A; Almweisheer, Shaza N; Aljafen, Bandar N; Muayqil, Taim

    2018-01-01

    This study was undertaken to describe a method for quantifying vibration when using a conventional tuning fork (CTF) in comparison to a Rydel-Seiffer tuning fork (RSTF) and to provide reference values. Vibration thresholds at index finger and big toe were obtained in 281 participants. Spearman's correlations were performed. Age, weight, and height were analyzed for their covariate effects on vibration threshold. Reference values at the fifth percentile were obtained by quantile regression. The correlation coefficients between CTF and RSTF values at finger/toe were 0.59/0.64 (P = 0.001 for both). Among covariates, only age had a significant effect on vibration threshold. Reference values for CTF at finger/toe for the age groups 20-39 and 40-60 years were 7.4/4.9 and 5.8/4.6 s, respectively. Reference values for RSTF at finger/toe for the age groups 20-39 and 40-60 years were 6.9/5.5 and 6.2/4.7, respectively. CTF provides quantitative values that are as good as those provided by RSTF. Age-stratified reference data are provided. Muscle Nerve 57: 49-53, 2018. © 2017 Wiley Periodicals, Inc.
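
    A minimal sketch of how age-stratified fifth-percentile reference values might be derived by quantile regression; the synthetic data, column names, and the choice of statsmodels are assumptions for illustration, not the authors' code.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Illustrative data: vibration perception time (s) at the toe versus age.
      rng = np.random.default_rng(0)
      age = rng.uniform(20, 60, 300)
      threshold = 9.0 - 0.05 * age + rng.normal(0, 1.0, age.size)
      df = pd.DataFrame({"age": age, "threshold": threshold})

      # Fifth-percentile (q = 0.05) regression, analogous to lower reference limits.
      model = smf.quantreg("threshold ~ age", df).fit(q=0.05)
      print(model.params)
      print(model.predict(pd.DataFrame({"age": [30, 50]})))  # illustrative ages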

  5. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

    The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems which involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations with computationally expensive analyses including multiple disciplines. We propose a new testing procedure that can provide a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how using open source software development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  6. An Automatic Assessment System of Diabetic Foot Ulcers Based on Wound Area Determination, Color Segmentation, and Healing Score Evaluation.

    PubMed

    Wang, Lei; Pedersen, Peder C; Strong, Diane M; Tulu, Bengisu; Agu, Emmanuel; Ignotz, Ron; He, Qian

    2015-08-07

    For individuals with type 2 diabetes, foot ulcers represent a significant health issue. The aim of this study is to design and evaluate a wound assessment system to help wound clinics assess patients with foot ulcers in a way that complements their current visual examination and manual measurements of their foot ulcers. The physical components of the system consist of an image capture box, a smartphone for wound image capture and a laptop for analyzing the wound image. The wound image assessment algorithms calculate the overall wound area, color segmented wound areas, and a healing score, to provide a quantitative assessment of the wound healing status both for a single wound image and comparisons of subsequent images to an initial wound image. The system was evaluated by assessing foot ulcers for 12 patients in the Wound Clinic at University of Massachusetts Medical School. As performance measures, the Matthews correlation coefficient (MCC) value for the wound area determination algorithm tested on 32 foot ulcer images was .68. The clinical validity of our healing score algorithm relative to the experienced clinicians was measured by Krippendorff's alpha coefficient (KAC) and ranged from .42 to .81. Our system provides a promising real-time method for wound assessment based on image analysis. Clinical comparisons indicate that the optimized mean-shift-based algorithm is well suited for wound area determination. Clinical evaluation of our healing score algorithm shows its potential to provide clinicians with a quantitative method for evaluating wound healing status. © 2015 Diabetes Technology Society.
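
    The Matthews correlation coefficient reported above can be computed directly from pixel-level confusion counts; a minimal sketch, assuming binary wound/background masks rather than the authors' actual segmentation output.

      import numpy as np

      def matthews_corrcoef(y_true, y_pred):
          """MCC for binary arrays (1 = wound pixel, 0 = background)."""
          tp = np.sum((y_true == 1) & (y_pred == 1))
          tn = np.sum((y_true == 0) & (y_pred == 0))
          fp = np.sum((y_true == 0) & (y_pred == 1))
          fn = np.sum((y_true == 1) & (y_pred == 0))
          denom = np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
          return (tp * tn - fp * fn) / denom if denom else 0.0

      # Illustrative ground-truth and predicted masks.
      truth = np.array([1, 1, 0, 0, 1, 0, 1, 0])
      pred = np.array([1, 0, 0, 0, 1, 1, 1, 0])
      print(matthews_corrcoef(truth, pred))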

  7. Quantitation of benzodiazepine receptor binding with PET [11C]iomazenil and SPECT [123I]iomazenil: preliminary results of a direct comparison in healthy human subjects.

    PubMed

    Bremner, J D; Baldwin, R; Horti, A; Staib, L H; Ng, C K; Tan, P Z; Zea-Ponce, Y; Zoghbi, S; Seibyl, J P; Soufer, R; Charney, D S; Innis, R B

    1999-08-31

    Although positron emission tomography (PET) and single photon emission computed tomography (SPECT) are increasingly used for quantitation of neuroreceptor binding, almost no studies to date have involved a direct comparison of the two. One study found a high level of agreement between the two techniques, although there was a systematic 30% increase in measures of benzodiazepine receptor binding in SPECT compared with PET. The purpose of the current study was to directly compare quantitation of benzodiazepine receptor binding in the same human subjects using PET and SPECT with high specific activity [11C]iomazenil and [123I]iomazenil, respectively. All subjects were administered a single bolus of high specific activity iomazenil labeled with 11C or 123I followed by dynamic PET or SPECT imaging of the brain. Arterial blood samples were obtained for measurement of metabolite-corrected radioligand in plasma. Compartmental modeling was used to fit values for kinetic rate constants of transfer of radioligand between plasma and brain compartments. These values were used for calculation of binding potential (BP = Bmax/Kd) and product of BP and the fraction of free non-protein-bound parent compound (V3'). Mean values for V3' in PET and SPECT were as follows: temporal cortex 23+/-5 and 22+/-3 ml/g, frontal cortex 23+/-6 and 22+/-3 ml/g, occipital cortex 28+/-3 and 31+/-5 ml/g, and striatum 4+/-4 and 7+/-4 ml/g. These preliminary findings indicate that PET and SPECT provide comparable results in quantitation of neuroreceptor binding in the human brain.
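
    A minimal sketch of the derived quantities, assuming the standard two-tissue compartment identities in which the specific distribution volume is V3' = (K1/k2)(k3/k4) and V3' = f1·BP, with f1 the free plasma fraction; the rate constants and f1 below are illustrative, not the values fitted in this study.

      # Illustrative kinetic rate constants from a two-tissue compartment fit.
      K1 = 0.35   # ml/g/min, plasma -> free tissue compartment
      k2 = 0.20   # 1/min, free tissue -> plasma
      k3 = 0.25   # 1/min, free -> specifically bound
      k4 = 0.02   # 1/min, specifically bound -> free
      f1 = 0.25   # free (non-protein-bound) parent fraction in plasma

      V3_prime = (K1 / k2) * (k3 / k4)   # specific distribution volume, ml/g
      BP = V3_prime / f1                 # Bmax/Kd, since V3' = f1 * BP (assumed identity)

      print(f"V3' = {V3_prime:.1f} ml/g, BP = {BP:.1f}")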

  8. Comparison of three-way and four-way calibration for the real-time quantitative analysis of drug hydrolysis in complex dynamic samples by excitation-emission matrix fluorescence.

    PubMed

    Yin, Xiao-Li; Gu, Hui-Wen; Liu, Xiao-Lu; Zhang, Shan-Hui; Wu, Hai-Long

    2018-03-05

    Multiway calibration in combination with spectroscopic techniques is an attractive tool for online or real-time monitoring of target analyte(s) in complex samples. However, choosing a suitable multiway calibration method for the resolution of spectroscopic-kinetic data remains a difficult problem in practical applications. In this work, for the first time, three-way and four-way fluorescence-kinetic data arrays were generated during the real-time monitoring of the hydrolysis of irinotecan (CPT-11) in human plasma by excitation-emission matrix fluorescence. Alternating normalization-weighted error (ANWE) and alternating penalty trilinear decomposition (APTLD) were used as three-way calibration for the decomposition of the three-way kinetic data array, whereas alternating weighted residual constraint quadrilinear decomposition (AWRCQLD) and alternating penalty quadrilinear decomposition (APQLD) were applied as four-way calibration to the four-way kinetic data array. The quantitative results of the two kinds of calibration models were fully compared from the perspective of predicted real-time concentrations, spiked recoveries of initial concentration, and analytical figures of merit. The comparison demonstrated that both three-way and four-way calibration models could achieve real-time quantitative analysis of the hydrolysis of CPT-11 in human plasma under certain conditions. However, each was also found to have distinct advantages and shortcomings during dynamic analysis. The conclusions obtained in this paper provide helpful guidance for the reasonable selection of multiway calibration models to achieve real-time quantitative analysis of target analyte(s) in complex dynamic systems. Copyright © 2017 Elsevier B.V. All rights reserved.
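
    The three-way methods compared above are all built on trilinear (PARAFAC-type) decompositions of an excitation x emission x time data cube; the sketch below is a generic alternating-least-squares decomposition in plain numpy, offered only to illustrate the underlying model, not an implementation of ANWE, APTLD, or their four-way counterparts.

      import numpy as np

      def unfold(X, mode):
          """Mode-n unfolding of a 3-way array."""
          return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

      def khatri_rao(U, V):
          """Column-wise Khatri-Rao product."""
          r = U.shape[1]
          return np.einsum("ir,jr->ijr", U, V).reshape(-1, r)

      def cp_als(X, rank, n_iter=200, seed=0):
          """Plain alternating least squares for a rank-R trilinear model."""
          rng = np.random.default_rng(seed)
          A, B, C = (rng.standard_normal((d, rank)) for d in X.shape)
          for _ in range(n_iter):
              A = unfold(X, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
              B = unfold(X, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
              C = unfold(X, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
          return A, B, C

      # Illustrative synthetic rank-2 excitation x emission x time cube.
      rng = np.random.default_rng(1)
      A0, B0, C0 = (np.abs(rng.standard_normal((d, 2))) for d in (20, 30, 15))
      X = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
      A, B, C = cp_als(X, rank=2)
      err = np.linalg.norm(X - np.einsum("ir,jr,kr->ijk", A, B, C)) / np.linalg.norm(X)
      print(f"relative reconstruction error: {err:.2e}")

    In a calibration setting, the loadings of the sample/time mode recovered this way would then be regressed against known analyte concentrations to predict unknowns.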

  9. Leveraging Semantic Labels for Multi-level Abstraction in Medical Process Mining and Trace Comparison.

    PubMed

    Leonardi, Giorgio; Striani, Manuel; Quaglini, Silvana; Cavallini, Anna; Montani, Stefania

    2018-05-21

    Many medical information systems record data about the executed process instances in the form of an event log. In this paper, we present a framework able to convert actions in the event log into higher-level concepts, at different levels of abstraction, on the basis of domain knowledge. Abstracted traces are then provided as input to trace comparison and semantic process discovery. Our abstraction mechanism is able to manage non-trivial situations, such as interleaved actions or delays between two actions that abstract to the same concept. Trace comparison relies on a similarity metric that takes abstraction-phase penalties into account and deals with quantitative and qualitative temporal constraints in abstracted traces. For process discovery, we rely on classical algorithms embedded in the ProM framework, made semantic by the capability of abstracting actions on the basis of their conceptual meaning. The approach has been tested in stroke care, where we adopted abstraction and trace comparison to cluster event logs of different stroke units and to highlight (in)correct behavior while abstracting from details. We also provide process discovery results, showing how the abstraction mechanism yields stroke process models that are more easily interpretable by neurologists. Copyright © 2018. Published by Elsevier Inc.

  10. The Cryosphere Model Comparison Tool (CmCt): Ice Sheet Model Validation and Comparison Tool for Greenland and Antarctica

    NASA Astrophysics Data System (ADS)

    Simon, E.; Nowicki, S.; Neumann, T.; Tyahla, L.; Saba, J. L.; Guerber, J. R.; Bonin, J. A.; DiMarzio, J. P.

    2017-12-01

    The Cryosphere model Comparison tool (CmCt) is a web-based ice sheet model validation tool that is being developed by NASA to facilitate direct comparison between observational data and various ice sheet models. The CmCt allows the user to take advantage of several decades' worth of observations from Greenland and Antarctica. Currently, the CmCt can be used to compare ice sheet models provided by the user with remotely sensed satellite data from ICESat (Ice, Cloud, and land Elevation Satellite) laser altimetry, the GRACE (Gravity Recovery and Climate Experiment) satellite, and radar altimetry (ERS-1, ERS-2, and Envisat). One or more models can be uploaded through the CmCt website and compared with observational data, or compared to each other or to other models. The CmCt calculates statistics on the differences between the model and observations, along with other quantitative and qualitative metrics, which can be used to evaluate the different model simulations against the observations. The qualitative metrics consist of a range of visual outputs, and the quantitative metrics consist of several whole-ice-sheet scalar values that can be used to assign an overall score to a particular simulation. The comparison results from CmCt are useful in quantifying improvements within a specific model (or within a class of models) as a result of differences in model dynamics (e.g., shallow vs. higher-order dynamics approximations), model physics (e.g., representations of ice sheet rheological or basal processes), or model resolution (mesh resolution and/or changes in the spatial resolution of input datasets). The framework and metrics could also serve as a model-to-model intercomparison tool, simply by substituting outputs from another model for the observational datasets. Future versions of the tool will include comparisons with other datasets that are of interest to the modeling community, such as ice velocity, ice thickness, and surface mass balance.
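
    A minimal sketch of the kind of whole-ice-sheet scalar metrics such a comparison yields (mean bias, RMSE, correlation) for gridded model and observed surface elevations; the arrays and function below are illustrative and are not the CmCt implementation.

      import numpy as np

      def comparison_stats(model, obs):
          """Scalar metrics over the valid (non-NaN) overlap of two gridded fields."""
          valid = ~np.isnan(model) & ~np.isnan(obs)
          diff = model[valid] - obs[valid]
          return {
              "mean_bias_m": float(diff.mean()),
              "rmse_m": float(np.sqrt((diff ** 2).mean())),
              "corr": float(np.corrcoef(model[valid], obs[valid])[0, 1]),
              "n_cells": int(valid.sum()),
          }

      # Illustrative elevation grids (metres) with a gap in the observations.
      rng = np.random.default_rng(0)
      obs = 2000 + 50 * rng.standard_normal((100, 100))
      model = obs + 5 + 10 * rng.standard_normal(obs.shape)  # biased, noisier model
      obs[:10, :10] = np.nan                                  # e.g. missing altimetry
      print(comparison_stats(model, obs))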

  11. NTP comparison process

    NASA Technical Reports Server (NTRS)

    Corban, Robert

    1993-01-01

    The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications, and it provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams developing alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum possible amount of quantitative data will be developed and/or validated for use in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.

  12. High pressure rinsing system comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. Sertore; M. Fusetti; P. Michelato

    2007-06-01

    High pressure rinsing (HPR) is a key process for the surface preparation of high-field superconducting cavities. A portable apparatus for water jet characterization, based on the momentum transferred between the water jet and a load cell, has been used in different laboratories. This apparatus allows quantitative parameters characterizing the HPR water jet to be collected. In this paper, we present a quantitative comparison of the different water jets produced by the various nozzles routinely used in different laboratories for the HPR process.
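
    For reference, the momentum-transfer relation underlying such a measurement: an ideal jet of density ρ leaving a nozzle of area A at velocity v delivers a force F = ρQv to the load cell, with v ≈ sqrt(2Δp/ρ) from the rinsing pressure. A minimal sketch with illustrative nozzle diameter and pressure (nozzle losses are neglected):

      import math

      rho = 1000.0        # kg/m^3, ultrapure water
      d_nozzle = 0.6e-3   # m, illustrative nozzle diameter
      delta_p = 100e5     # Pa, illustrative rinsing pressure (100 bar)

      v = math.sqrt(2 * delta_p / rho)       # ideal (Bernoulli) jet velocity
      area = math.pi * (d_nozzle / 2) ** 2
      Q = v * area                           # volumetric flow rate, m^3/s
      force = rho * Q * v                    # momentum flux seen by the load cell, N

      print(f"v ~ {v:.0f} m/s, Q ~ {Q * 60000:.1f} L/min, F ~ {force:.1f} N")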

  13. Field Demonstration Report Applied Innovative Technologies for Characterization of Nitrocellulose- and Nitroglycerine Contaminated Buildings and Soils, Rev 1

    DTIC Science & Technology

    2007-01-05

    ...positive / false negatives. The quantitative on-site methods were evaluated using linear regression analysis and relative percent difference (RPD) comparison. (Search-result excerpt; the report also contains sections on quantitative analysis using CRREL methods and quantitative analysis for NG by GC/TID.)

  14. Development of a mobile, automated toolmark characterization/comparison system

    DOE PAGES

    Chumbley, Scott; Zhang, Song; Morris, Max; ...

    2016-11-16

    Since the development of the striagraph, various attempts have been made to enhance forensic investigation through the use of measuring and imaging equipment. This study describes the development of a prototype system employing an easy-to-use software interface designed to provide forensic examiners with the ability to measure topography of a toolmarked surface and then conduct various comparisons using a statistical algorithm. Acquisition of the data is carried out using a portable 3D optical profilometer, and comparison of the resulting data files is made using software named “MANTIS” (Mark and Tool Inspection Suite). The system has been tested on laboratory-produced markings that include fully striated marks (e.g., screwdriver markings), quasistriated markings produced by shear-cut pliers, impression marks left by chisels, rifling marks on bullets, and cut marks produced by knives. Using the system, an examiner has the potential to (i) visually compare two toolmarked surfaces in a manner similar to a comparison microscope and (ii) use the quantitative information embedded within the acquired data to obtain an objective statistical comparison of the data files. Finally, this study shows that, based on the results from laboratory samples, the system has great potential for aiding examiners in conducting comparisons of toolmarks.

  16. How to compare movement? A review of physical movement similarity measures in geographic information science and beyond.

    PubMed

    Ranacher, Peter; Tzavella, Katerina

    2014-05-27

    In geographic information science, a plethora of different approaches and methods is used to assess the similarity of movement. Some of these approaches term two moving objects similar if they share akin paths. Others require objects to move at similar speed and yet others consider movement similar if it occurs at the same time. We believe that a structured and comprehensive classification of movement comparison measures is missing. We argue that such a classification not only depicts the status quo of qualitative and quantitative movement analysis, but also allows for identifying those aspects of movement for which similarity measures are scarce or entirely missing. In this review paper we, first, decompose movement into its spatial, temporal, and spatiotemporal movement parameters. A movement parameter is a physical quantity of movement, such as speed, spatial path, or temporal duration. For each of these parameters we then review qualitative and quantitative methods of how to compare movement. Thus, we provide a systematic and comprehensive classification of different movement similarity measures used in geographic information science. This classification is a valuable first step toward a GIS toolbox comprising all relevant movement comparison methods.
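
    A minimal illustration of comparing two movement parameters separately, a spatial path measure and a speed measure, after resampling both trajectories to a common number of points; this generic sketch is only meant to make the decomposition concrete and is not one of the measures catalogued in the review.

      import numpy as np

      def resample(traj, n=100):
          """Linearly resample an (m, 2) array of x,y points to n points."""
          t_old = np.linspace(0.0, 1.0, len(traj))
          t_new = np.linspace(0.0, 1.0, n)
          return np.column_stack([np.interp(t_new, t_old, traj[:, i]) for i in range(2)])

      def path_distance(a, b, n=100):
          """Mean pointwise Euclidean distance between resampled paths."""
          ra, rb = resample(a, n), resample(b, n)
          return float(np.linalg.norm(ra - rb, axis=1).mean())

      def speed_difference(a, b, dt=1.0, n=100):
          """Mean absolute difference of speeds along the resampled paths."""
          ra, rb = resample(a, n), resample(b, n)
          sa = np.linalg.norm(np.diff(ra, axis=0), axis=1) / dt
          sb = np.linalg.norm(np.diff(rb, axis=0), axis=1) / dt
          return float(np.abs(sa - sb).mean())

      # Two illustrative GPS-like tracks (x, y in metres).
      t = np.linspace(0, 2 * np.pi, 50)
      track_a = np.column_stack([t, np.sin(t)])
      track_b = np.column_stack([t, np.sin(t) + 0.2])
      print(path_distance(track_a, track_b), speed_difference(track_a, track_b))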

  17. Rapid Analysis of Carbohydrates in Bioprocess Samples: An Evaluation of the CarboPac SA10 for HPAE-PAD Analysis by Interlaboratory Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevcik, R. S.; Hyman, D. A.; Basumallich, L.

    2013-01-01

    A technique for carbohydrate analysis of bioprocess samples has been developed, providing reduced analysis time compared to current practice in the biofuels R&D community. The Thermofisher CarboPac SA10 anion-exchange column enables isocratic separation of monosaccharides, sucrose and cellobiose in approximately 7 minutes. Additionally, use of a low-volume (0.2 mL) injection valve in combination with a high-volume detection cell minimizes the extent of sample dilution required to bring sugar concentrations into the linear range of the pulsed amperometric detector (PAD). Three laboratories, representing academia, industry, and government, participated in an interlaboratory study which analyzed twenty-one opportunistic samples representing biomass pretreatment, enzymatic saccharification, and fermentation samples. The technique's robustness, linearity, and interlaboratory reproducibility were evaluated and showed excellent-to-acceptable characteristics. Additionally, quantitation by the CarboPac SA10/PAD was compared with the current practice method utilizing an HPX-87P/RID. While these two methods showed good agreement, a statistical comparison found significant quantitation differences between them, highlighting the difference between selective and universal detection modes.

  18. How to compare movement? A review of physical movement similarity measures in geographic information science and beyond

    PubMed Central

    Ranacher, Peter; Tzavella, Katerina

    2014-01-01

    In geographic information science, a plethora of different approaches and methods is used to assess the similarity of movement. Some of these approaches term two moving objects similar if they share akin paths. Others require objects to move at similar speed and yet others consider movement similar if it occurs at the same time. We believe that a structured and comprehensive classification of movement comparison measures is missing. We argue that such a classification not only depicts the status quo of qualitative and quantitative movement analysis, but also allows for identifying those aspects of movement for which similarity measures are scarce or entirely missing. In this review paper we, first, decompose movement into its spatial, temporal, and spatiotemporal movement parameters. A movement parameter is a physical quantity of movement, such as speed, spatial path, or temporal duration. For each of these parameters we then review qualitative and quantitative methods of how to compare movement. Thus, we provide a systematic and comprehensive classification of different movement similarity measures used in geographic information science. This classification is a valuable first step toward a GIS toolbox comprising all relevant movement comparison methods. PMID:27019646

  19. Comparison of Quantitative PCR and Droplet Digital PCR Multiplex Assays for Two Genera of Bloom-Forming Cyanobacteria, Cylindrospermopsis and Microcystis

    PubMed Central

    Te, Shu Harn; Chen, Enid Yingru

    2015-01-01

    The increasing occurrence of harmful cyanobacterial blooms, often linked to deteriorated water quality and adverse public health effects, has become a worldwide concern in recent decades. The use of molecular techniques such as real-time quantitative PCR (qPCR) has become increasingly popular in the detection and monitoring of harmful cyanobacterial species. Multiplex qPCR assays that quantify several toxigenic cyanobacterial species have been established previously; however, there is no molecular assay that detects several bloom-forming species simultaneously. Microcystis and Cylindrospermopsis are the two most commonly found genera and are known to be able to produce microcystin and cylindrospermopsin hepatotoxins. In this study, we designed primers and probes which enable quantification of these genera based on the RNA polymerase C1 gene for Cylindrospermopsis species and the c-phycocyanin beta subunit-like gene for Microcystis species. Duplex assays were developed for two molecular techniques—qPCR and droplet digital PCR (ddPCR). After optimization, both qPCR and ddPCR assays have high linearity and quantitative correlations for standards. Comparisons of the two techniques showed that qPCR has higher sensitivity, a wider linear dynamic range, and shorter analysis time and that it was more cost-effective, making it a suitable method for initial screening. However, the ddPCR approach has lower variability and was able to handle the PCR inhibition and competitive effects found in duplex assays, thus providing more precise and accurate analysis for bloom samples. PMID:26025892
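
    A minimal sketch of the two quantification models being compared: qPCR copy numbers interpolated from a standard curve of quantification cycle (Cq) versus log10 copies, and ddPCR concentrations from the Poisson correction of the positive-droplet fraction. All numbers, including the approximate droplet volume, are illustrative assumptions.

      import numpy as np

      # qPCR: fit a standard curve Cq = slope * log10(copies) + intercept.
      std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
      std_cq = np.array([33.1, 29.7, 26.4, 23.0, 19.6])          # illustrative
      slope, intercept = np.polyfit(np.log10(std_copies), std_cq, 1)

      def qpcr_copies(cq):
          return 10 ** ((cq - intercept) / slope)

      print(f"qPCR sample at Cq 27.5 -> {qpcr_copies(27.5):.0f} copies/reaction")

      # ddPCR: Poisson correction of the positive-droplet fraction.
      def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_ul=0.00085):
          lam = -np.log(1.0 - n_positive / n_total)   # mean copies per droplet
          return lam / droplet_volume_ul

      print(f"ddPCR: {ddpcr_copies_per_ul(2300, 15000):.0f} copies/uL")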

  20. [A comparison of convenience sampling and purposive sampling].

    PubMed

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation not by statistical power analysis.

  1. Monte Carlo evaluation of accuracy and noise properties of two scatter correction methods for 201Tl cardiac SPECT

    NASA Astrophysics Data System (ADS)

    Narita, Y.; Iida, H.; Ebert, S.; Nakamura, T.

    1997-12-01

    Two independent scatter correction techniques, transmission-dependent convolution subtraction (TDCS) and the triple-energy window (TEW) method, were evaluated in terms of quantitative accuracy and noise properties using Monte Carlo simulation (EGS4). Emission projections (primary, scatter, and scatter plus primary) were simulated for three numerical phantoms for 201Tl. Data were reconstructed with an ordered-subset EM algorithm including attenuation correction based on noiseless transmission data. Accuracy of the TDCS and TEW scatter corrections was assessed by comparison with the simulated true primary data. The uniform cylindrical phantom simulation demonstrated better quantitative accuracy with TDCS than with TEW (-2.0% vs. 16.7%) and better S/N (6.48 vs. 5.05). A uniform ring myocardial phantom simulation demonstrated better homogeneity with TDCS than with TEW in the myocardium; i.e., anterior-to-posterior wall count ratios were 0.99 and 0.76 with TDCS and TEW, respectively. For the MCAT phantom, TDCS provided good visual and quantitative agreement with the simulated true primary image without noticeably increasing the noise after scatter correction. Overall, TDCS proved to be more accurate and less noisy than TEW, facilitating quantitative assessment of physiological functions with SPECT.
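
    For reference, the TEW correction used as one of the two methods estimates the scatter counts in the photopeak window by trapezoidal interpolation from two narrow windows flanking it; a minimal sketch with illustrative window widths (keV) and counts.

      def tew_primary(c_main, c_low, c_up, w_main=14.0, w_low=3.0, w_up=3.0):
          """Primary counts in the main window after triple-energy-window correction."""
          scatter = (c_low / w_low + c_up / w_up) * w_main / 2.0
          return max(c_main - scatter, 0.0)

      # Illustrative counts for one projection pixel.
      print(tew_primary(c_main=1200.0, c_low=90.0, c_up=40.0))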

  2. A differential mobility spectrometry/mass spectrometry platform for the rapid detection and quantitation of DNA adduct dG-ABP.

    PubMed

    Kafle, Amol; Klaene, Joshua; Hall, Adam B; Glick, James; Coy, Stephen L; Vouros, Paul

    2013-07-15

    There is continued interest in exploring new analytical technologies for the detection and quantitation of DNA adducts, biomarkers which provide direct evidence of exposure and genetic damage in cells. With the goal of reducing clean-up steps and improving sample throughput, a Differential Mobility Spectrometry/Mass Spectrometry (DMS/MS) platform has been introduced for adduct analysis. A DMS/MS platform has been utilized for the analysis of dG-ABP, the deoxyguanosine adduct of the bladder carcinogen 4-aminobiphenyl (4-ABP). After optimization of the DMS parameters, each sample was analyzed in just 30 s following a simple protein precipitation step of the digested DNA. A detection limit of one modification in 10^6 nucleosides has been achieved using only 2 µg of DNA. A brief comparison (quantitative and qualitative) with liquid chromatography/mass spectrometry is also presented highlighting the advantages of using the DMS/MS method as a high-throughput platform. The data presented demonstrate the successful application of a DMS/MS/MS platform for the rapid quantitation of DNA adducts using, as a model analyte, the deoxyguanosine adduct of the bladder carcinogen 4-aminobiphenyl. Copyright © 2013 John Wiley & Sons, Ltd.

  3. Improved FTA methodology and application to subsea pipeline reliability design.

    PubMed

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.

  4. Improved FTA Methodology and Application to Subsea Pipeline Reliability Design

    PubMed Central

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form. PMID:24667681

  5. Quantitative Insights into the Fast Pyrolysis of Extracted Cellulose, Hemicelluloses, and Lignin

    PubMed Central

    Windt, Michael; Ziegler, Bernhard; Appelt, Jörn; Saake, Bodo; Meier, Dietrich; Bridgwater, Anthony

    2017-01-01

    The transformation of lignocellulosic biomass into bio-based commodity chemicals is technically possible. Among thermochemical processes, fast pyrolysis, a relatively mature technology that has now reached a commercial level, produces a high yield of an organic-rich liquid stream. Despite recent efforts to elucidate the degradation paths of biomass during pyrolysis, the selectivity and recovery rates of bio-compounds remain low. In an attempt to clarify the general degradation scheme of biomass fast pyrolysis and provide a quantitative insight, the use of fast pyrolysis microreactors is combined with spectroscopic techniques (i.e., mass spectrometry and NMR spectroscopy) and mixtures of unlabeled and 13C-enriched materials. The first stage of the work aimed to select the type of reactor to use to ensure control of the pyrolysis regime. A comparison of the chemical fragmentation patterns of “primary” fast pyrolysis volatiles detected by using GC-MS between two small-scale microreactors showed the inevitable occurrence of secondary reactions. In the second stage, liquid fractions that are also made of primary fast pyrolysis condensates were analyzed by using quantitative liquid-state 13C NMR spectroscopy to provide a quantitative distribution of functional groups. The compilation of these results into a map that displays the distribution of functional groups according to the individual and main constituents of biomass (i.e., hemicelluloses, cellulose and lignin) confirmed the origin of individual chemicals within the fast pyrolysis liquids. PMID:28644517

  6. Quantitative Insights into the Fast Pyrolysis of Extracted Cellulose, Hemicelluloses, and Lignin.

    PubMed

    Carrier, Marion; Windt, Michael; Ziegler, Bernhard; Appelt, Jörn; Saake, Bodo; Meier, Dietrich; Bridgwater, Anthony

    2017-08-24

    The transformation of lignocellulosic biomass into bio-based commodity chemicals is technically possible. Among thermochemical processes, fast pyrolysis, a relatively mature technology that has now reached a commercial level, produces a high yield of an organic-rich liquid stream. Despite recent efforts to elucidate the degradation paths of biomass during pyrolysis, the selectivity and recovery rates of bio-compounds remain low. In an attempt to clarify the general degradation scheme of biomass fast pyrolysis and provide a quantitative insight, the use of fast pyrolysis microreactors is combined with spectroscopic techniques (i.e., mass spectrometry and NMR spectroscopy) and mixtures of unlabeled and 13 C-enriched materials. The first stage of the work aimed to select the type of reactor to use to ensure control of the pyrolysis regime. A comparison of the chemical fragmentation patterns of "primary" fast pyrolysis volatiles detected by using GC-MS between two small-scale microreactors showed the inevitable occurrence of secondary reactions. In the second stage, liquid fractions that are also made of primary fast pyrolysis condensates were analyzed by using quantitative liquid-state 13 C NMR spectroscopy to provide a quantitative distribution of functional groups. The compilation of these results into a map that displays the distribution of functional groups according to the individual and main constituents of biomass (i.e., hemicelluloses, cellulose and lignin) confirmed the origin of individual chemicals within the fast pyrolysis liquids. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  7. Comparison of clinical semi-quantitative assessment of muscle fat infiltration with quantitative assessment using chemical shift-based water/fat separation in MR studies of the calf of post-menopausal women.

    PubMed

    Alizai, Hamza; Nardo, Lorenzo; Karampinos, Dimitrios C; Joseph, Gabby B; Yap, Samuel P; Baum, Thomas; Krug, Roland; Majumdar, Sharmila; Link, Thomas M

    2012-07-01

    The goal of this study was to compare the semi-quantitative Goutallier classification for fat infiltration with quantitative fat-fraction derived from a magnetic resonance imaging (MRI) chemical shift-based water/fat separation technique. Sixty-two women (age 61 ± 6 years), 27 of whom had diabetes, underwent MRI of the calf using a T1-weighted fast spin-echo sequence and a six-echo spoiled gradient-echo sequence at 3 T. Water/fat images and fat fraction maps were reconstructed using the IDEAL algorithm with T2* correction and a multi-peak model for the fat spectrum. Two radiologists scored fat infiltration on the T1-weighted images using the Goutallier classification in six muscle compartments. Spearman correlations between the Goutallier grades and the fat fraction were calculated; in addition, intra-observer and inter-observer agreement were calculated. A significant correlation between the clinical grading and the fat fraction values was found for all muscle compartments (P < 0.0001, R values ranging from 0.79 to 0.88). Goutallier grades 0-4 had a fat fraction ranging from 3.5 to 19%. Intra-observer and inter-observer agreement values of 0.83 and 0.81 were calculated for the semi-quantitative grading. Semi-quantitative grading of intramuscular fat and quantitative fat fraction were significantly correlated and both techniques had excellent reproducibility. However, the clinical grading was found to overestimate muscle fat. Key points: fat infiltration of muscle commonly occurs in many metabolic and neuromuscular diseases; image-based semi-quantitative classifications for assessing fat infiltration are not well validated; quantitative MRI techniques provide an accurate assessment of muscle fat.
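
    A minimal sketch of the statistical comparison described, a Spearman rank correlation between the ordinal Goutallier grades and the continuous MRI fat fractions; the paired values below are illustrative, not study data.

      import numpy as np
      from scipy import stats

      # Illustrative paired observations for one muscle compartment.
      goutallier_grade = np.array([0, 0, 1, 1, 2, 2, 3, 3, 4, 4])
      fat_fraction_pct = np.array([3.4, 4.1, 5.0, 6.2, 8.5, 9.1, 12.3, 13.8, 18.0, 19.5])

      rho, p = stats.spearmanr(goutallier_grade, fat_fraction_pct)
      print(f"Spearman rho = {rho:.2f}, P = {p:.4f}")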

  8. Cecropia peltata Accumulates Starch or Soluble Glycogen by Differentially Regulating Starch Biosynthetic Genes

    PubMed Central

    Bischof, Sylvain; Umhang, Martin; Eicke, Simona; Streb, Sebastian; Qi, Weihong; Zeeman, Samuel C.

    2013-01-01

    The branched glucans glycogen and starch are the most widespread storage carbohydrates in living organisms. The production of semicrystalline starch granules in plants is more complex than that of small, soluble glycogen particles in microbes and animals. However, the factors determining whether glycogen or starch is formed are not fully understood. The tropical tree Cecropia peltata is a rare example of an organism able to make either polymer type. Electron micrographs and quantitative measurements show that glycogen accumulates to very high levels in specialized myrmecophytic structures (Müllerian bodies), whereas starch accumulates in leaves. Compared with polymers comprising leaf starch, glycogen is more highly branched and has shorter branches—factors that prevent crystallization and explain its solubility. RNA sequencing and quantitative shotgun proteomics reveal that isoforms of all three classes of glucan biosynthetic enzyme (starch/glycogen synthases, branching enzymes, and debranching enzymes) are differentially expressed in Müllerian bodies and leaves, providing a system-wide view of the quantitative programming of storage carbohydrate metabolism. This work will prompt targeted analysis in model organisms and cross-species comparisons. Finally, as starch is the major carbohydrate used for food and industrial applications worldwide, these data provide a basis for manipulating starch biosynthesis in crops to synthesize tailor-made polyglucans. PMID:23632447

  9. Quantitative Assessment of RNA-Protein Interactions with High Throughput Sequencing - RNA Affinity Profiling (HiTS-RAP)

    PubMed Central

    Ozer, Abdullah; Tome, Jacob M.; Friedman, Robin C.; Gheba, Dan; Schroth, Gary P.; Lis, John T.

    2016-01-01

    Because RNA-protein interactions play a central role in a wide-array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the High Throughput Sequencing-RNA Affinity Profiling (HiTS-RAP) assay, which couples sequencing on an Illumina GAIIx with the quantitative assessment of one or several proteins’ interactions with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of EGFP and NELF-E proteins with their corresponding canonical and mutant RNA aptamers. Here, we provide a detailed protocol for HiTS-RAP, which can be completed in about a month (8 days hands-on time) including the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, high-throughput sequencing and protein binding with GAIIx, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, RNA-MaP and RBNS. A successful HiTS-RAP experiment provides the sequence and binding curves for approximately 200 million RNAs in a single experiment. PMID:26182240
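
    A minimal sketch of fitting the per-cluster binding curves that such an experiment produces, assuming a simple one-site model in which the bound fraction is [P]/([P] + Kd); the titration points are synthetic and the fitting routine is an assumed choice.

      import numpy as np
      from scipy.optimize import curve_fit

      def one_site(protein_nm, kd_nm, fmax):
          """Equilibrium fraction bound for a single binding site."""
          return fmax * protein_nm / (protein_nm + kd_nm)

      # Illustrative titration for one RNA cluster.
      protein = np.array([0.5, 1, 2, 5, 10, 20, 50, 100.0])   # nM
      signal = np.array([0.04, 0.09, 0.17, 0.32, 0.48, 0.63, 0.80, 0.88])

      (kd, fmax), _ = curve_fit(one_site, protein, signal, p0=[10.0, 1.0])
      print(f"Kd ~ {kd:.1f} nM, plateau ~ {fmax:.2f}")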

  10. Quantitative real-time in vivo detection of magnetic nanoparticles by their nonlinear magnetization

    NASA Astrophysics Data System (ADS)

    Nikitin, M. P.; Torno, M.; Chen, H.; Rosengart, A.; Nikitin, P. I.

    2008-04-01

    A novel method for highly sensitive quantitative detection of magnetic nanoparticles (MP) in biological tissues and the blood system has been realized and tested in real-time in vivo experiments. The detection method is based on the nonlinear magnetic properties of MP; the related device can record relative variations of the nonlinear magnetic susceptibility as small as 10^-8 at room temperature, providing sensitivity of several nanograms of MP in a 0.1 ml volume. Real-time quantitative in vivo measurements of the dynamics of MP concentration in the blood flow have been performed. A catheter carrying the blood flow of a rat was passed through the measuring device. After an MP injection, the quantity of MP in the circulating blood was continuously recorded. The method has also been used to evaluate the MP distribution among the rat's organs. Its sensitivity was compared with detection of radioactive MP based on the Fe59 isotope. The comparison of magnetic and radioactive signals in the rat's blood and organ samples demonstrated similar sensitivity for both methods. However, the proposed magnetic method is much more convenient, as it is safe, less expensive, and provides real-time measurements in vivo. Moreover, the sensitivity of the method can be further improved by optimizing the device geometry.

  11. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    EPA Science Inventory

    Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative...

  12. Chronic Obstructive Pulmonary Disease: Lobe-based Visual Assessment of Volumetric CT by Using Standard Images—Comparison with Quantitative CT and Pulmonary Function Test in the COPDGene Study

    PubMed Central

    Kim, Song Soo; Lee, Ho Yun; Nevrekar, Dipti V.; Forssen, Anna V.; Crapo, James D.; Schroeder, Joyce D.; Lynch, David A.

    2013-01-01

    Purpose: To provide a new detailed visual assessment scheme of computed tomography (CT) for chronic obstructive pulmonary disease (COPD) by using standard reference images and to compare this visual assessment method with quantitative CT and several physiologic parameters. Materials and Methods: This research was approved by the institutional review board of each institution. CT images of 200 participants in the COPDGene study were evaluated. Four thoracic radiologists performed independent, lobar analysis of volumetric CT images for type (centrilobular, panlobular, and mixed) and extent (on a six-point scale) of emphysema, the presence of bronchiectasis, airway wall thickening, and tracheal abnormalities. Standard images for each finding, generated by two radiologists, were used for reference. The extent of emphysema, airway wall thickening, and luminal area were quantified at the lobar level by using commercial software. Spearman rank test and simple and multiple regression analyses were performed to compare the results of visual assessment with physiologic and quantitative parameters. Results: The type of emphysema, determined by four readers, showed good agreement (κ = 0.63). The extent of the emphysema in each lobe showed good agreement (mean weighted κ = 0.70) and correlated with findings at quantitative CT (r = 0.75), forced expiratory volume in 1 second (FEV1) (r = −0.68), FEV1/forced vital capacity (FVC) ratio (r = −0.74) (P < .001). Agreement for airway wall thickening was fair (mean κ = 0.41), and the number of lobes with thickened bronchial walls correlated with FEV1 (r = −0.60) and FEV1/FVC ratio (r = −0.60) (P < .001). Conclusion: Visual assessment of emphysema and airways disease in individuals with COPD can provide reproducible, physiologically substantial information that may complement that provided by quantitative CT assessment. © RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12120385/-/DC1 PMID:23220894
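
    A minimal sketch of the inter-reader agreement statistic used for the lobar extent scores, a weighted kappa on the six-point scale; the reader scores are synthetic and scikit-learn is an assumed tool choice.

      from sklearn.metrics import cohen_kappa_score

      # Illustrative lobe-level extent scores (0-5) from two readers.
      reader_1 = [0, 1, 1, 2, 3, 3, 4, 5, 2, 0, 1, 4]
      reader_2 = [0, 1, 2, 2, 3, 2, 4, 5, 3, 0, 1, 5]

      # Weighted kappa penalises disagreements by their distance in grade.
      print(cohen_kappa_score(reader_1, reader_2, weights="linear"))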

  13. Comparison of the 1996 and 2001 census data for Aboriginal and non-Aboriginal workers in health care occupations.

    PubMed

    Lecompte, Emily; Baril, Mireille

    2008-01-01

    To meet the unique health needs of Aboriginal peoples (First Nations, Inuit and Métis), it is important to increase and encourage Aboriginal representation in health care. One Federal initiative, the Aboriginal Health Human Resource Initiative (AHHRI) at Health Canada, focuses on: (1) increasing the number of Aboriginal people working in health careers; (2) adapting health care educational curricula to support the development of cultural competencies; and (3) improving the retention of health care workers in Aboriginal communities. A health care system that focuses on understanding the unique challenges, concerns, and needs of Aboriginal people can better respond to this specific population, which suffers disproportionately from ill health in comparison to their non-Aboriginal counterparts. This report examines the supply of Aboriginal health care providers in Canada, based on geographic region, area of residence, Aboriginal identity, and occupation. Findings are drawn from the 1996 and 2001 censuses from Statistics Canada. Quantitative results provide a greater understanding of labour force characteristics of First Nation, Inuit, Métis, and non-Aboriginal health providers.

  14. Quantitative comparison of alternative methods for coarse-graining biological networks

    PubMed Central

    Bowman, Gregory R.; Meng, Luming; Huang, Xuhui

    2013-01-01

    Markov models and master equations are a powerful means of modeling dynamic processes like protein conformational changes. However, these models are often difficult to understand because of the enormous number of components and connections between them. Therefore, a variety of methods have been developed to facilitate understanding by coarse-graining these complex models. Here, we employ Bayesian model comparison to determine which of these coarse-graining methods provides the models that are most faithful to the original set of states. We find that the Bayesian agglomerative clustering engine and the hierarchical Nyström expansion graph (HNEG) typically provide the best performance. Surprisingly, the original Perron cluster cluster analysis (PCCA) method often provides the next best results, outperforming the newer PCCA+ method and the most probable paths algorithm. We also show that the differences between the models are qualitatively significant, rather than being minor shifts in the boundaries between states. The performance of the methods correlates well with the entropy of the resulting coarse-grainings, suggesting that finding states with more similar populations (i.e., avoiding low population states that may just be noise) gives better results. PMID:24089717
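
    A minimal sketch of the entropy measure mentioned at the end, the Shannon entropy of the coarse-grained state populations; the two population vectors are illustrative.

      import numpy as np

      def population_entropy(populations):
          """Shannon entropy (nats) of a normalised state-population vector."""
          p = np.asarray(populations, dtype=float)
          p = p / p.sum()
          p = p[p > 0]
          return float(-(p * np.log(p)).sum())

      # A balanced and a lopsided coarse-graining of the same microstates.
      print(population_entropy([0.25, 0.25, 0.25, 0.25]))   # high entropy (~1.39)
      print(population_entropy([0.94, 0.02, 0.02, 0.02]))   # low entropy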

  15. Comparative Performance of Reagents and Platforms for Quantitation of Cytomegalovirus DNA by Digital PCR

    PubMed Central

    Gu, Z.; Sam, S. S.; Sun, Y.; Tang, L.; Pounds, S.; Caliendo, A. M.

    2016-01-01

    A potential benefit of digital PCR is a reduction in result variability across assays and platforms. Three sets of PCR reagents were tested on two digital PCR systems (Bio-Rad and RainDance) for quantitation of cytomegalovirus (CMV). Both commercial quantitative viral standards and patient samples (n = 16) were tested. Quantitative accuracy (compared to nominal values) and variability were determined based on viral standard testing results. Quantitative correlation and variability were assessed with pairwise comparisons across all reagent-platform combinations for clinical plasma sample results. The three reagent sets, when used to assay quantitative standards on the Bio-Rad system, all showed a high degree of accuracy, low variability, and close agreement with one another. When used on the RainDance system, one of the three reagent sets appeared to have a much better correlation to nominal values than did the other two. Quantitative results for patient samples showed good correlation in most pairwise comparisons, with some showing poorer correlations when testing samples with low viral loads. Digital PCR is a robust method for measuring CMV viral load. Some degree of result variation may be seen, depending on platform and reagents used; this variation appears to be greater in samples with low viral load values. PMID:27535685

  16. Comparison of propidium monoazide-quantitative PCR and reverse transcription quantitative PCR for viability detection of fresh Cryptosporidium oocysts following disinfection and after long-term storage in water samples

    EPA Science Inventory

    Purified oocysts of Cryptosporidium parvum were used to evaluate applicability of two quantitative PCR (qPCR) viability detection methods in raw surface water and disinfection treated water. Propidium monoazide-qPCR targeting hsp70 gene was compared to reverse transcription (RT)-...

  17. Preserving elemental content in adherent mammalian cells for analysis by synchrotron-based x-ray fluorescence microscopy

    DOE PAGES

    Jin, Qiaoling; Paunesku, Tatjana; Lai, Barry; ...

    2016-08-31

    Trace metals play important roles in biological function, and x-ray fluorescence microscopy (XFM) provides a way to quantitatively image their distribution within cells. The faithfulness of these measurements is dependent on proper sample preparation. Using mouse embryonic fibroblast NIH/3T3 cells as an example, we compare various approaches to the preparation of adherent mammalian cells for XFM imaging under ambient temperature. Direct side-by-side comparison shows that plunge-freezing-based cryoimmobilization provides more faithful preservation than conventional chemical fixation for most biologically important elements including P, S, Cl, K, Fe, Cu, Zn and possibly Ca in adherent mammalian cells. Although cells rinsed with fresh media had a great deal of extracellular background signal for Cl and Ca, this approach maintained cells at the best possible physiological status before rapid freezing and it does not interfere with XFM analysis of other elements. If chemical fixation has to be chosen, the combination of 3% paraformaldehyde and 1.5% glutaraldehyde preserves S, Fe, Cu and Zn better than either fixative alone. Lastly, when chemically fixed cells were subjected to a variety of dehydration processes, air drying was proved to be more suitable than other drying methods such as graded ethanol dehydration and freeze drying. This first detailed comparison for x-ray fluorescence microscopy shows how detailed quantitative conclusions can be affected by the choice of cell preparation method.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kienhuis, Anne S.

    Hepatic systems toxicology is the integrative analysis of toxicogenomic technologies, e.g., transcriptomics, proteomics, and metabolomics, in combination with traditional toxicology measures to improve the understanding of mechanisms of hepatotoxic action. Hepatic toxicology studies that have employed toxicogenomic technologies to date have already provided a proof of principle for the value of hepatic systems toxicology in hazard identification. In the present review, acetaminophen is used as a model compound to discuss the application of toxicogenomics in hepatic systems toxicology for its potential role in the risk assessment process, to progress from hazard identification towards hazard characterization. The toxicogenomics-based parallelogram is used to identify current achievements and limitations of acetaminophen toxicogenomic in vivo and in vitro studies for in vitro-to-in vivo and interspecies comparisons, with the ultimate aim to extrapolate animal studies to humans in vivo. This article provides a model for comparison of more species and more in vitro models enhancing the robustness of common toxicogenomic responses and their relevance to human risk assessment. To progress to quantitative dose-response analysis needed for hazard characterization, in hepatic systems toxicology studies, generation of toxicogenomic data of multiple doses/concentrations and time points is required. Newly developed bioinformatics tools for quantitative analysis of toxicogenomic data can aid in the elucidation of dose-responsive effects. The challenge herein is to assess which toxicogenomic responses are relevant for induction of the apical effect and whether perturbations are sufficient for the induction of downstream events, eventually causing toxicity.

  20. The chemiluminescent response of human monocytes to red cells sensitized with monoclonal anti-Rh(D) antibodies.

    PubMed

    Hadley, A G; Kumpel, B M; Merry, A H

    1988-01-01

    Luminol-enhanced chemiluminescence (CL) was used to assess the metabolic response of human monocytes to red cells sensitized with known amounts of anti-Rh(D). Monoclonal antibodies were used to facilitate a comparison between the functional activities of IgG1 and IgG3 subclasses. The detection of CL provided a simple, rapid and semi-quantitative means of measuring monocyte response to sensitized red cells (IgG-RBC). Monocyte response to IgG3-RBC was quantitatively greater, more rapid and less susceptible to inhibition by fluid phase IgG than monocyte response to IgG1-RBC. The minimum levels of sensitization required to elicit CL from monocytes were approximately 2500 IgG3 molecules per red cell, or approximately 5000 IgG1 molecules per cell.

  1. Summary of Quantitative Interpretation of Image Far Ultraviolet Auroral Data

    NASA Technical Reports Server (NTRS)

    Frey, H. U.; Immel, T. J.; Mende, S. B.; Gerard, J.-C.; Hubert, B.; Habraken, S.; Span, J.; Gladstone, G. R.; Bisikalo, D. V.; Shematovich, V. I.

    2002-01-01

    Direct imaging of the magnetosphere by instruments on the IMAGE spacecraft is supplemented by simultaneous observations of the global aurora in three far ultraviolet (FUV) wavelength bands. The purpose of the multi-wavelength imaging is to study the global auroral particle and energy input from the magnetosphere into the atmosphere. This paper describes the method for quantitative interpretation of FUV measurements. The Wide-Band Imaging Camera (WIC) provides broad band ultraviolet images of the aurora with maximum spatial and temporal resolution by imaging the nitrogen lines and bands between 140 and 180 nm wavelength. The Spectrographic Imager (SI), a dual wavelength monochromatic instrument, images both Doppler-shifted Lyman alpha emissions produced by precipitating protons in the SI-12 channel and OI 135.6 nm emissions in the SI-13 channel. From the SI-12 Doppler-shifted Lyman alpha images it is possible to obtain the precipitating proton flux, provided assumptions are made regarding the mean energy of the protons. Knowledge of the proton (flux and energy) component allows the calculation of the contribution produced by protons in the WIC and SI-13 instruments. Comparison of the corrected WIC and SI-13 signals provides a measure of the electron mean energy, which can then be used to determine the electron energy flux. To accomplish this, reliable emission modeling and instrument calibrations are required. In-flight calibration using early-type stars was used to validate the pre-flight laboratory calibrations and determine long-term trends in sensitivity. In general, very reasonable agreement is found between in-situ measurements and remote quantitative determinations.
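
    A minimal sketch of the correction-and-ratio step described above, assuming hypothetical modeled proton contributions and a hypothetical ratio-to-energy lookup curve; the real analysis depends on auroral emission modeling and instrument calibration that are not reproduced here.

```python
import numpy as np

def electron_mean_energy(wic, si13, wic_proton, si13_proton,
                         ratio_table, energy_table):
    """Estimate electron mean energy from proton-corrected WIC/SI-13 images.

    wic, si13               : observed count-rate images (2D arrays)
    wic_proton, si13_proton : modeled proton contributions (2D arrays,
                              hypothetical outputs of a proton-transport model)
    ratio_table, energy_table : monotonically increasing lookup converting the
                              corrected WIC/SI-13 ratio to mean electron energy
                              (placeholder values, not calibration data)
    """
    # Remove the modeled proton contribution; clamp at zero counts.
    wic_e = np.clip(wic - wic_proton, a_min=0.0, a_max=None)
    si13_e = np.clip(si13 - si13_proton, a_min=0.0, a_max=None)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(si13_e > 0, wic_e / si13_e, np.nan)
    # Interpolate mean energy from the (assumed) model-derived ratio curve.
    return np.interp(ratio, ratio_table, energy_table)
```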

  2. A quantitative comparison of leading-edge vortices in incompressible and supersonic flows

    DOT National Transportation Integrated Search

    2002-01-14

    When requiring quantitative data on delta-wing vortices for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of database owing to difficulties that pl...

  3. Variations in optical coherence tomography resolution and uniformity: a multi-system performance comparison

    PubMed Central

    Fouad, Anthony; Pfefer, T. Joshua; Chen, Chao-Wei; Gong, Wei; Agrawal, Anant; Tomlins, Peter H.; Woolliams, Peter D.; Drezek, Rebekah A.; Chen, Yu

    2014-01-01

    Point spread function (PSF) phantoms based on unstructured distributions of sub-resolution particles in a transparent matrix have been demonstrated as a useful tool for evaluating resolution and its spatial variation across image volumes in optical coherence tomography (OCT) systems. Measurements based on PSF phantoms have the potential to become a standard test method for consistent, objective and quantitative inter-comparison of OCT system performance. Towards this end, we have evaluated three PSF phantoms and investigated their ability to compare the performance of four OCT systems. The phantoms are based on 260-nm-diameter gold nanoshells, 400-nm-diameter iron oxide particles and 1.5-micron-diameter silica particles. The OCT systems included spectral-domain and swept source systems in free-beam geometries as well as a time-domain system in both free-beam and fiberoptic probe geometries. Results indicated that iron oxide particles and gold nanoshells were most effective for measuring spatial variations in the magnitude and shape of PSFs across the image volume. The intensity of individual particles was also used to evaluate spatial variations in signal intensity uniformity. Significant system-to-system differences in resolution and signal intensity and their spatial variation were readily quantified. The phantoms proved useful for identification and characterization of irregularities such as astigmatism. Our multi-system results provide evidence of the practical utility of PSF-phantom-based test methods for quantitative inter-comparison of OCT system resolution and signal uniformity. PMID:25071949
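
    As an illustration of how resolution can be extracted from such phantoms, the sketch below fits a Gaussian to the intensity profile of an isolated sub-resolution particle and reports the full width at half maximum (FWHM); particle detection and profile extraction are assumed to have been done upstream, and this is not the specific fitting procedure used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, center, sigma, offset):
    return amplitude * np.exp(-0.5 * ((x - center) / sigma) ** 2) + offset

def psf_fwhm(positions_um, intensities):
    """Fit a 1D Gaussian to a particle's intensity profile and return the FWHM (um)."""
    p0 = [intensities.max() - intensities.min(),      # amplitude guess
          positions_um[np.argmax(intensities)],       # centre guess
          (positions_um[-1] - positions_um[0]) / 10,  # width guess
          intensities.min()]                          # offset guess
    popt, _ = curve_fit(gaussian, positions_um, intensities, p0=p0)
    sigma = abs(popt[2])
    return 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma   # FWHM = 2*sqrt(2 ln 2)*sigma

# Synthetic example: a ~6 um FWHM point response sampled every 0.5 um.
z = np.arange(0, 40, 0.5)
profile = gaussian(z, 1.0, 20.0, 6.0 / 2.355, 0.05) + np.random.normal(0, 0.01, z.size)
print(f"estimated FWHM: {psf_fwhm(z, profile):.2f} um")
```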

  4. CCD TV focal plane guider development and comparison to SIRTF applications

    NASA Technical Reports Server (NTRS)

    Rank, David M.

    1989-01-01

    It is expected that the SIRTF payload will use a CCD TV focal plane fine guidance sensor to provide acquisition of sources and tracking stability of the telescope. Work has been done to develop CCD TV cameras and guiders at Lick Observatory for several years and have produced state of the art CCD TV systems for internal use. NASA decided to provide additional support so that the limits of this technology could be established and a comparison between SIRTF requirements and practical systems could be put on a more quantitative basis. The results of work carried out at Lick Observatory which was designed to characterize present CCD autoguiding technology and relate it to SIRTF applications is presented. Two different design types of CCD cameras were constructed using virtual phase and burred channel CCD sensors. A simple autoguider was built and used on the KAO, Mt. Lemon and Mt. Hamilton telescopes. A video image processing system was also constructed in order to characterize the performance of the auto guider and CCD cameras.

  5. Fitting Multimeric Protein Complexes into Electron Microscopy Maps Using 3D Zernike Descriptors

    PubMed Central

    Esquivel-Rodríguez, Juan; Kihara, Daisuke

    2012-01-01

    A novel computational method for fitting high-resolution structures of multiple proteins into a cryoelectron microscopy map is presented. The method named EMLZerD generates a pool of candidate multiple protein docking conformations of component proteins, which are later compared with a provided electron microscopy (EM) density map to select the ones that fit well into the EM map. The comparison of docking conformations and the EM map is performed using the 3D Zernike descriptor (3DZD), a mathematical series expansion of three-dimensional functions. The 3DZD provides a unified representation of the surface shape of multimeric protein complex models and EM maps, which allows a convenient, fast quantitative comparison of the three dimensional structural data. Out of 19 multimeric complexes tested, near native complex structures with a root mean square deviation of less than 2.5 Å were obtained for 14 cases while medium range resolution structures with correct topology were computed for the additional 5 cases. PMID:22417139

  6. Fitting multimeric protein complexes into electron microscopy maps using 3D Zernike descriptors.

    PubMed

    Esquivel-Rodríguez, Juan; Kihara, Daisuke

    2012-06-14

    A novel computational method for fitting high-resolution structures of multiple proteins into a cryoelectron microscopy map is presented. The method named EMLZerD generates a pool of candidate multiple protein docking conformations of component proteins, which are later compared with a provided electron microscopy (EM) density map to select the ones that fit well into the EM map. The comparison of docking conformations and the EM map is performed using the 3D Zernike descriptor (3DZD), a mathematical series expansion of three-dimensional functions. The 3DZD provides a unified representation of the surface shape of multimeric protein complex models and EM maps, which allows a convenient, fast quantitative comparison of the three-dimensional structural data. Out of 19 multimeric complexes tested, near native complex structures with a root-mean-square deviation of less than 2.5 Å were obtained for 14 cases while medium range resolution structures with correct topology were computed for the additional 5 cases.
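
    Because the 3DZD reduces each surface to a fixed-length vector of rotation-invariant coefficients, the final selection step amounts to a distance comparison between vectors. The sketch below illustrates only that step; the descriptor computation itself, and the exact distance measure used by EMLZerD, are assumed to be handled elsewhere.

```python
import numpy as np

def rank_by_zernike_distance(em_descriptor, model_descriptors):
    """Rank docking models by Euclidean distance between 3D Zernike descriptors.

    em_descriptor     : 1D invariant vector computed from the EM map
    model_descriptors : dict name -> 1D invariant vector of equal length
                        (descriptor computation is assumed to happen upstream)
    """
    em = np.asarray(em_descriptor, dtype=float)
    distances = {
        name: float(np.linalg.norm(np.asarray(vec, dtype=float) - em))
        for name, vec in model_descriptors.items()
    }
    return sorted(distances.items(), key=lambda item: item[1])  # best fit first

# Toy example with 121-component vectors (a length commonly used for order-20 3DZD).
rng = np.random.default_rng(1)
em = rng.random(121)
models = {f"model_{i}": em + rng.normal(0, 0.05 * (i + 1), 121) for i in range(5)}
print(rank_by_zernike_distance(em, models)[0])
```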

  7. Diffusion MRI microstructure models with in vivo human brain Connectome data: results from a multi-group comparison.

    PubMed

    Ferizi, Uran; Scherrer, Benoit; Schneider, Torben; Alipoor, Mohammad; Eufracio, Odin; Fick, Rutger H J; Deriche, Rachid; Nilsson, Markus; Loya-Olivas, Ana K; Rivera, Mariano; Poot, Dirk H J; Ramirez-Manzanares, Alonso; Marroquin, Jose L; Rokem, Ariel; Pötter, Christian; Dougherty, Robert F; Sakaie, Ken; Wheeler-Kingshott, Claudia; Warfield, Simon K; Witzel, Thomas; Wald, Lawrence L; Raya, José G; Alexander, Daniel C

    2017-09-01

    A large number of mathematical models have been proposed to describe the measured signal in diffusion-weighted (DW) magnetic resonance imaging (MRI). However, model comparison to date focuses only on specific subclasses, e.g. compartment models or signal models, and little or no information is available in the literature on how performance varies among the different types of models. To address this deficiency, we organized the 'White Matter Modeling Challenge' during the International Symposium on Biomedical Imaging (ISBI) 2015 conference. This competition aimed to compare a range of different kinds of models in their ability to explain a large range of measurable in vivo DW human brain data. Specifically, we assessed the ability of models to predict the DW signal accurately for new diffusion gradients and b values. We did not evaluate the accuracy of estimated model parameters, as a ground truth is hard to obtain. We used the Connectome scanner at the Massachusetts General Hospital, using gradient strengths of up to 300 mT/m and a broad set of diffusion times. We focused on assessing the DW signal prediction in two regions: the genu in the corpus callosum, where the fibres are relatively straight and parallel, and the fornix, where the configuration of fibres is more complex. The challenge participants had access to three-quarters of the dataset and their models were ranked on their ability to predict the remaining unseen quarter of the data. The challenge provided a unique opportunity for a quantitative comparison of diverse methods from multiple groups worldwide. The comparison of the challenge entries reveals interesting trends that could potentially influence the next generation of diffusion-based quantitative MRI techniques. The first is that signal models do not necessarily outperform tissue models; in fact, of those tested, tissue models rank highest on average. The second is that assuming a non-Gaussian (rather than purely Gaussian) noise model provides little improvement in prediction of unseen data, although it is possible that this may still have a beneficial effect on estimated parameter values. The third is that preprocessing the training data, here by omitting signal outliers, and using signal-predicting strategies, such as bootstrapping or cross-validation, could benefit the model fitting. The analysis in this study provides a benchmark for other models and the data remain available to build up a more complete comparison in the future. Copyright © 2017 The Authors. NMR in Biomedicine Published by John Wiley & Sons, Ltd.
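
    A minimal sketch of the hold-out evaluation idea described above, assuming a hypothetical fit/predict interface for each candidate model; the challenge's actual scoring pipeline is not reproduced here.

```python
import numpy as np

def rank_models(models, train, test):
    """Rank candidate diffusion-signal models by prediction error on unseen data.

    models : dict mapping name -> object with fit(gradients, signal) and
             predict(gradients) methods (hypothetical interface)
    train, test : (gradients, signal) tuples; the held-out quarter goes in `test`
    """
    scores = {}
    for name, model in models.items():
        model.fit(*train)
        prediction = model.predict(test[0])
        scores[name] = np.sqrt(np.mean((prediction - test[1]) ** 2))  # RMSE
    # Lower prediction error ranks higher.
    return sorted(scores.items(), key=lambda item: item[1])
```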

  8. Standardizing evaluation of pQCT image quality in the presence of subject movement: qualitative versus quantitative assessment.

    PubMed

    Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B

    2014-02-01

    Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Agreement between measurers was strong (intraclass correlation coefficient = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25 %. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25 % grouping. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat.
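
    A minimal sketch of the quantitative decision rule described above, assuming the movement artifact and the limb cross-section have already been segmented from the scan (how those areas are obtained is not shown here).

```python
def percent_move(movement_area_mm2, limb_area_mm2, threshold=25.0):
    """Return (%Move, repeat_scan?) for a diaphyseal pQCT scan.

    %Move is the ratio of movement artifact to limb size expressed as a
    percentage; scans above the suggested 25% delineation are flagged for
    repeat. Both area inputs are assumed to come from a prior segmentation.
    """
    pct = 100.0 * movement_area_mm2 / limb_area_mm2
    return pct, pct > threshold

pct, repeat = percent_move(movement_area_mm2=180.0, limb_area_mm2=1200.0)
print(f"%Move = {pct:.1f}  ->  {'repeat scan' if repeat else 'scan viable'}")
```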

  9. Standardizing Evaluation of pQCT Image Quality in the Presence of Subject Movement: Qualitative vs. Quantitative Assessment

    PubMed Central

    Blew, Robert M.; Lee, Vinson R.; Farr, Joshua N.; Schiferl, Daniel J.; Going, Scott B.

    2013-01-01

    Purpose Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale, and (2) establish a quantitative motion assessment methodology. Methods Scans were performed on 506 healthy girls (9–13yr) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n=46) was examined to determine %Move’s impact on bone parameters. Results Agreement between measurers was strong (ICC = .732 for tibia, .812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat or no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement, showed significant differences in the >25% grouping. Conclusions A pQCT visual inspection scale can be a reliable metric of image quality but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat. PMID:24077875

  10. Social Comparison and Body Image in Adolescence: A Grounded Theory Approach

    ERIC Educational Resources Information Center

    Krayer, A.; Ingledew, D. K.; Iphofen, R.

    2008-01-01

    This study explored the use of social comparison appraisals in adolescents' lives with particular reference to enhancement appraisals which can be used to counter threats to the self. Social comparison theory has been increasingly used in quantitative research to understand the processes through which societal messages about appearance influence…

  11. Target Scattering Metrics: Model-Model and Model-Data Comparisons

    DTIC Science & Technology

    2017-12-13

    measured synthetic aperture sonar (SAS) data or from numerical models is investigated. Metrics are needed for quantitative comparisons for signals...candidate metrics for model-model comparisons are examined here with a goal to consider raw data prior to its reduction to data products, which may...be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons.

  12. Target Scattering Metrics: Model-Model and Model Data comparisons

    DTIC Science & Technology

    2017-12-13

    measured synthetic aperture sonar (SAS) data or from numerical models is investigated. Metrics are needed for quantitative comparisons for signals...candidate metrics for model-model comparisons are examined here with a goal to consider raw data prior to its reduction to data products, which may...be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons.

  13. Sustainable Urban Forestry Potential Based Quantitative And Qualitative Measurement Using Geospatial Technique

    NASA Astrophysics Data System (ADS)

    Rosli, A. Z.; Reba, M. N. M.; Roslan, N.; Room, M. H. M.

    2014-02-01

    In order to maintain the stability of natural ecosystems around urban areas, urban forestry is among the best initiatives to maintain and control green space in our country. Integration between remote sensing (RS) and geospatial information systems (GIS) serves as an effective tool for monitoring environmental change and for planning, managing and developing sustainable urbanization. This paper aims to assess the capability of integrated RS and GIS to identify potential urban forest sites based on qualitative and quantitative measures, using priority parameter ranking in the new township of Nusajaya. A SPOT image was used to provide high spatial accuracy, while maps of topography, land use, soil group, hydrology, a Digital Elevation Model (DEM) and soil series data were applied to enhance the satellite image in detecting and locating attributes and features on the ground. The Multi-Criteria Decision Making (MCDM) technique provides structured, pairwise quantification and comparison of elements and criteria for priority ranking for urban forestry purposes. Slope, soil texture, drainage, spatial area, availability of natural resources, and vicinity to urban areas are the criteria considered in this study. The study highlights that priority-ranking MCDM is a cost-effective tool for decision-making in urban forestry planning and landscaping.
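
    As an illustration of the pairwise-comparison step, the sketch below derives priority weights for a few of the listed criteria using the common normalized-column-mean approximation to the principal eigenvector; the comparison values are placeholders, not the weights used in the study.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three criteria
# (slope, soil texture, drainage); entry [i, j] states how much more
# important criterion i is than criterion j on a 1-9 scale.
criteria = ["slope", "soil texture", "drainage"]
pairwise = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Normalize each column, then average across rows: a standard
# approximation to the eigenvector priority weights.
normalized = pairwise / pairwise.sum(axis=0)
weights = normalized.mean(axis=1)

for name, w in zip(criteria, weights):
    print(f"{name:12s} weight = {w:.3f}")
```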

  14. Perceived improvement in integrated management of childhood illness implementation through use of mobile technology: qualitative evidence from a pilot study in Tanzania.

    PubMed

    Mitchell, Marc; Getchell, Maya; Nkaka, Melania; Msellemu, Daniel; Van Esch, Jan; Hedt-Gauthier, Bethany

    2012-01-01

    This study examined health care provider and caretaker perceptions of electronic Integrated Management of Childhood Illness (eIMCI) in diagnosing and treating childhood illnesses. The authors conducted semi-structured interviews among caretakers (n = 20) and health care providers (n = 11) in the Pwani region of Tanzania. This qualitative study was nested within a larger quantitative study measuring impact of eIMCI on provider adherence to IMCI protocols. Caretakers and health care workers involved in the larger study provided their perceptions of eIMCI in comparison with the conventional paper forms. One health care provider from each participating health center participated in qualitative interviews; 20 caretakers were selected from 1 health center involved in the quantitative study. Interviews were conducted in Swahili and lasted 5-10 min each. Providers expressed positive opinions of eIMCI, noting that the personal digital assistants were faster and easier to use than were the paper forms and encouraged adherence to IMCI procedures. Caretakers also held a positive view of eIMCI, noting improved service from providers, more thorough examination of their child, and a perception that providers who used the personal digital assistants were more knowledgeable. Research indicates widespread nonadherence to IMCI guidelines, suggesting improved methods for implementing IMCI are necessary. The authors conclude that eIMCI represents a promising method for improving health care delivery because it improves health care provider and caretaker perception of the clinical encounter. Further investigation into this technology is warranted.

  15. Comparison of Activity Determination of Radium 226 in FUSRAP Soil using Various Energy Lines - 12299

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tucker, Brian; Donakowski, Jough; Hays, David

    2012-07-01

    Gamma spectroscopy is used at the Formerly Utilized Sites Remedial Action Program (FUSRAP) Maywood Superfund Site as the primary radioanalytical tool for quantitation of activities of the radionuclides of concern in site soil. When selecting energy lines in gamma spectroscopy, a number of factors are considered, including assumptions concerning secular equilibrium, interferences, and the strength of the lines. The case of the Maywood radionuclide of concern radium-226 (Ra-226) is considered in this paper. At the FUSRAP Maywood Superfund Site, one of the daughters produced from radioactive decay of Ra-226, lead-214 (Pb-214), is used to quantitate Ra-226. Another Ra-226 daughter, bismuth-214 (Bi-214), may also be used to quantitate Ra-226. In this paper, a comparison of Ra-226 to Pb-214 activities and Ra-226 to Bi-214 activities, obtained using gamma spectrometry for a large number of soil samples, was performed. The Pb-214, Bi-214, and Ra-226 activities were quantitated using the 352 kiloelectronvolt (keV), 609 keV, and 186 keV lines, respectively. The comparisons were made after correcting the Ra-226 activities by a factor of 0.571 and both ignoring and accounting for the contribution of a U-235 interfering line to the Ra-226 line. For the Pb-214 and Bi-214 activities, a mean in-growth factor was employed. The gamma spectrometer was calibrated for efficiency and energy using a mixed gamma standard and an energy range of 59 keV to 1830 keV. The authors expect other sites with Ra-226 contamination in soil may benefit from the discussions and points in this paper. Proper use of correction factors and comparison of the data from three different gamma-emitting radionuclides revealed agreement with expectations and provided confidence that using such correction factors generates quality data. The results indicate that if contamination is low level and due to NORM, the Ra-226 can be measured directly if corrected to subtract the contribution from U-235. If there is any indication that technologically enhanced uranium may be present, the preferred measurement approach for quantitation of Ra-226 activity is detection of one of the Ra-226 daughters, Pb-214 or Bi-214, using a correction factor obtained from an in-growth curve. The results also show that the adjusted Ra-226 results compare very well with both the Pb-214 and Bi-214 results obtained using an in-growth curve correction factor. (authors)
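
    Illustrative arithmetic for the two quantitation routes discussed above, assuming the 0.571 factor removes the U-235 contribution from the shared 186 keV peak and that a single in-growth factor relates daughter activity back to Ra-226; the actual factors come from the site's calibration and in-growth curves, and the sample values below are invented.

```python
def ra226_from_186kev(raw_activity_pci_g, u235_fraction_correction=0.571):
    """Correct the direct 186 keV result for the overlapping U-235 line.

    The 0.571 factor is taken from the text; interpreted here as the fraction
    of the combined 186 keV peak attributable to Ra-226 for natural-abundance
    uranium (an assumption about how the factor is applied).
    """
    return raw_activity_pci_g * u235_fraction_correction

def ra226_from_daughter(daughter_activity_pci_g, ingrowth_factor):
    """Infer Ra-226 from a Pb-214 (352 keV) or Bi-214 (609 keV) measurement.

    ingrowth_factor is the fraction of secular equilibrium reached between
    sealing and counting (from an in-growth curve); dividing by it scales the
    daughter activity back up to the parent Ra-226 activity.
    """
    return daughter_activity_pci_g / ingrowth_factor

# Hypothetical sample: the three routes should agree within uncertainties.
print(ra226_from_186kev(8.8))                          # direct 186 keV line
print(ra226_from_daughter(4.6, ingrowth_factor=0.92))  # via Pb-214
print(ra226_from_daughter(4.5, ingrowth_factor=0.90))  # via Bi-214
```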

  16. Quantitative assessment of RNA-protein interactions with high-throughput sequencing-RNA affinity profiling.

    PubMed

    Ozer, Abdullah; Tome, Jacob M; Friedman, Robin C; Gheba, Dan; Schroth, Gary P; Lis, John T

    2015-08-01

    Because RNA-protein interactions have a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay that couples sequencing on an Illumina GAIIx genome analyzer with the quantitative assessment of protein-RNA interactions. This assay is able to analyze interactions between one or possibly several proteins with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of the EGFP and negative elongation factor subunit E (NELF-E) proteins with their corresponding canonical and mutant RNA aptamers. Here we provide a detailed protocol for HiTS-RAP that can be completed in about a month (8 d hands-on time). This includes the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, HiTS and protein binding with a GAIIx instrument, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, quantitative analysis of RNA on a massively parallel array (RNA-MaP) and RNA Bind-n-Seq (RBNS), for quantitative analysis of RNA-protein interactions.

  17. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    PubMed

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available or even acquirable are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended to both inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
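
    As one concrete example of the techniques mentioned above, the sketch below computes the closed-form least-squares scale factor that maps a model prediction onto data reported only in arbitrary (relative) units; this illustrates optimal scaling in its simplest form and is not any specific method from the review.

```python
import numpy as np

def optimal_scale(model, data):
    """Least-squares scale factor s minimizing ||s*model - data||^2.

    Useful when the data are in arbitrary units (e.g. relative fluorescence)
    and only the shape of the profile constrains the model.
    """
    model = np.asarray(model, dtype=float)
    data = np.asarray(data, dtype=float)
    s = np.dot(model, data) / np.dot(model, model)       # closed-form optimum
    residual = np.sqrt(np.mean((s * model - data) ** 2))  # RMS misfit after scaling
    return s, residual

# Example: a model gradient profile compared with intensities in arbitrary units.
model_profile = np.array([1.0, 0.8, 0.5, 0.2, 0.1])
measured = np.array([210.0, 160.0, 105.0, 45.0, 20.0])
print(optimal_scale(model_profile, measured))
```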

  18. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  19. Science advancements key to increasing management value of life stage monitoring networks for endangered Sacramento River winter-run Chinook salmon in California

    USGS Publications Warehouse

    Johnson, Rachel C.; Windell, Sean; Brandes, Patricia L.; Conrad, J. Louise; Ferguson, John; Goertler, Pascale A. L.; Harvey, Brett N.; Heublein, Joseph; Isreal, Joshua A.; Kratville, Daniel W.; Kirsch, Joseph E.; Perry, Russell W.; Pisciotto, Joseph; Poytress, William R.; Reece, Kevin; Swart, Brycen G.

    2017-01-01

    A robust monitoring network that provides quantitative information about the status of imperiled species at key life stages and geographic locations over time is fundamental for sustainable management of fisheries resources. For anadromous species, management actions in one geographic domain can substantially affect abundance of subsequent life stages that span broad geographic regions. Quantitative metrics (e.g., abundance, movement, survival, life history diversity, and condition) at multiple life stages are needed to inform how management actions (e.g., hatcheries, harvest, hydrology, and habitat restoration) influence salmon population dynamics. The existing monitoring network for endangered Sacramento River winter-run Chinook Salmon (SRWRC, Oncorhynchus tshawytscha) in California’s Central Valley was compared to conceptual models developed for each life stage and geographic region of the life cycle to identify relevant SRWRC metrics. We concluded that the current monitoring network was insufficient to diagnose when (life stage) and where (geographic domain) chronic or episodic reductions in SRWRC cohorts occur, precluding within- and among-year comparisons. The strongest quantitative data exist in the Upper Sacramento River, where abundance estimates are generated for adult spawners and emigrating juveniles. However, once SRWRC leave the upper river, our knowledge of their identity, abundance, and condition diminishes, despite the juvenile monitoring enterprise. We identified six system-wide recommended actions to strengthen the value of data generated from the existing monitoring network to assess resource management actions: (1) incorporate genetic run identification; (2) develop juvenile abundance estimates; (3) collect data for life history diversity metrics at multiple life stages; (4) expand and enhance real-time fish survival and movement monitoring; (5) collect fish condition data; and (6) provide timely public access to monitoring data in open data formats. To illustrate how updated technologies can enhance the existing monitoring to provide quantitative data on SRWRC, we provide examples of how each recommendation can address specific management issues.

  20. Quantitative and Functional Requirements for Bioluminescent Cancer Models.

    PubMed

    Feys, Lynn; Descamps, Benedicte; Vanhove, Christian; Vermeulen, Stefan; Vandesompele, J O; Vanderheyden, Katrien; Messens, Kathy; Bracke, Marc; De Wever, Olivier

    2016-01-01

    Bioluminescent cancer models are widely used but detailed quantification of the luciferase signal and functional comparison with a non-transfected control cell line are generally lacking. In the present study, we provide quantitative and functional tests for luciferase-transfected cells. We quantified the luciferase expression in BLM and HCT8/E11 transfected cancer cells, and examined the effect of long-term luciferin exposure. The present study also investigated functional differences between parental and transfected cancer cells. Our results showed that quantification of different single-cell-derived populations is superior with droplet digital polymerase chain reaction. Quantification of luciferase protein level and luciferase bioluminescent activity is only useful when there is a significant difference in copy number. Continuous exposure of cell cultures to luciferin leads to inhibitory effects on mitochondrial activity, cell growth and bioluminescence. These inhibitory effects correlate with luciferase copy number. Cell culture and mouse xenograft assays showed no significant functional differences between luciferase-transfected and parental cells. Luciferase-transfected cells should be validated by quantitative and functional assays before starting large-scale experiments. Copyright © 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  1. Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods.

    PubMed

    Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C W; Lipiński, Wojciech; Bischof, John C

    2016-07-21

    Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.
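
    For context, one generic way to quantify heat generation in a laser-heating experiment is an energy balance on the initial temperature-rise slope of a well-mixed suspension; the sketch below shows only that estimate and is not the calorimetric or modeling protocol used in the study.

```python
def heat_generation_watts(dT_dt_initial_K_per_s, sample_mass_kg,
                          specific_heat_J_per_kgK=4184.0):
    """Estimate total heat generated by nanoparticles in a cuvette.

    Assumes the suspension is well mixed, losses are negligible over the
    initial linear temperature rise, and the heat capacity is dominated by
    water (c ~ 4184 J kg^-1 K^-1). Energy balance: Q = m * c * dT/dt.
    """
    return sample_mass_kg * specific_heat_J_per_kgK * dT_dt_initial_K_per_s

# Example: 1 g of suspension warming at 0.02 K/s during the first seconds of irradiation.
q = heat_generation_watts(dT_dt_initial_K_per_s=0.02, sample_mass_kg=1.0e-3)
print(f"heat generation ~ {q * 1e3:.1f} mW")
```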

  2. Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods

    NASA Astrophysics Data System (ADS)

    Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C. W.; Lipiński, Wojciech; Bischof, John C.

    2016-07-01

    Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.

  3. Visual investigation on the heat dissipation process of a heat sink by using digital holographic interferometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Bingjing; Zhao, Jianlin, E-mail: jlzhao@nwpu.edu.cn; Wang, Jun

    2013-11-21

    We present a method for visually and quantitatively investigating the heat dissipation process of plate-fin heat sinks by using digital holographic interferometry. A series of phase change maps reflecting the temperature distribution and variation trend of the air field surrounding the heat sink during the heat dissipation process is numerically reconstructed based on double-exposure holographic interferometry. Using the phase unwrapping algorithm and the derived relationship between temperature and phase change of the detection beam, the full-field temperature distributions are quantitatively obtained with reasonably high measurement accuracy. The impact of the heat sink's channel width on the heat dissipation performance in the case of natural convection is then analyzed. In addition, a comparison between simulation and experimental results is given to verify the reliability of this method. The experimental results confirm the feasibility and validity of the presented method for full-field, dynamic, and quantitative measurement of the air field temperature distribution, which provides a basis for analyzing the heat dissipation performance of plate-fin heat sinks.

  4. Numerical formulation for the prediction of solid/liquid change of a binary alloy

    NASA Technical Reports Server (NTRS)

    Schneider, G. E.; Tiwari, S. N.

    1990-01-01

    A computational model is presented for the prediction of solid/liquid phase change energy transport including the influence of free convection fluid flow in the liquid phase region. The computational model considers the velocity components of all non-liquid phase change material control volumes to be zero but fully solves the coupled mass-momentum problem within the liquid region. The thermal energy model includes the entire domain and uses an enthalpy-like model and a recently developed method for handling the phase change interface nonlinearity. Convergence studies are performed and comparisons made with experimental data for two different problem specifications. The convergence studies indicate that grid independence was achieved and the comparison with experimental data indicates excellent quantitative prediction of the melt fraction evolution. Qualitative data is also provided in the form of velocity vector diagrams and isotherm plots for selected times in the evolution of both problems. The computational costs incurred are quite low by comparison with previous efforts on solving these problems.

  5. The early development of stereotypy and self-injury: a review of research methods.

    PubMed

    Symons, F J; Sperry, L A; Dropik, P L; Bodfish, J W

    2005-02-01

    The origin and developmental course of stereotypic and self-injurious behaviour among individuals with developmental disabilities such as intellectual disability (ID) or pervasive development disorders such as autism is not well understood. Twelve studies designed to document the prevalence, nature, or development of stereotypic and/or self-injurious behaviour in children under 5 years of age and identified as at risk for developmental delay or disability were reviewed. Comparisons were made with similar studies with typically developing children. It appears that the onset of naturally occurring rhythmic motor stereotypies is delayed in young at-risk children, but that the sequencing may be similar. A very small database, differences in samples, measures, and designs limited the degree to which comparisons could be made across studies. Future work is needed based on appropriately designed prospective comparison studies and uniform quantitative measures to provide an empirical basis for new knowledge about the early development of one of the most serious behaviour disorders afflicting children with ID and related problems of development.

  6. CPTAC Evaluates Long-Term Reproducibility of Quantitative Proteomics Using Breast Cancer Xenografts | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    Liquid chromatography tandem-mass spectrometry (LC-MS/MS)- based methods such as isobaric tags for relative and absolute quantification (iTRAQ) and tandem mass tags (TMT) have been shown to provide overall better quantification accuracy and reproducibility over other LC-MS/MS techniques. However, large scale projects like the Clinical Proteomic Tumor Analysis Consortium (CPTAC) require comparisons across many genomically characterized clinical specimens in a single study and often exceed the capability of traditional iTRAQ-based quantification.

  7. Evidence from lattice data for a new particle on the worldsheet of the QCD flux tube.

    PubMed

    Dubovsky, Sergei; Flauger, Raphael; Gorbenko, Victor

    2013-08-09

    We propose a new approach for the calculation of the spectrum of excitations of QCD flux tubes. It relies on the fact that the worldsheet theory is integrable at low energies. With this approach, energy levels can be calculated for much shorter flux tubes than was previously possible, allowing for a quantitative comparison with existing lattice data. The improved theoretical control makes it manifest that existing lattice data provides strong evidence for a new pseudoscalar particle localized on the QCD flux tube--the worldsheet axion.

  8. Unveiling Mars nightside mesosphere dynamics by IUVS/MAVEN global images of NO nightglow

    NASA Astrophysics Data System (ADS)

    Stiepen, A.; Jain, S. K.; Schneider, N. M.; Milby, Z.; Deighan, J. I.; Gonzàlez-Galindo, F.; Gérard, J.-C.; Forget, F.; Bougher, S.; Stewart, A. I. F.; Royer, E.; Stevens, M. H.; Evans, J. S.; Chaffin, M. S.; Crismani, M.; McClintock, W. E.; Clarke, J. T.; Holsclaw, G. W.; Montmessin, F.; Lo, D. Y.

    2017-09-01

    We analyze the morphology of the ultraviolet nightglow in the Martian upper atmosphere through Nitric Oxide (NO) δ and γ band emissions observed by the Imaging Ultraviolet Spectrograph instrument on the Mars Atmosphere and Volatile EvolutioN spacecraft. The seasonal dynamics of the Martian thermosphere-mesosphere can be constrained based on the distribution of these emissions. We show evidence for local (emission streaks and splotches) and global (longitudinal and seasonal) variability in brightness of the emission and provide quantitative comparisons to GCM simulations.

  9. Investigation of the microstructure and mineralogical composition of urinary calculi fragments by synchrotron radiation X-ray microtomography: a feasibility study.

    PubMed

    Kaiser, Jozef; Holá, Markéta; Galiová, Michaela; Novotný, Karel; Kanický, Viktor; Martinec, Petr; Sčučka, Jiří; Brun, Francesco; Sodini, Nicola; Tromba, Giuliana; Mancini, Lucia; Kořistková, Tamara

    2011-08-01

    The outcomes from the feasibility study on utilization of synchrotron radiation X-ray microtomography (SR-μCT) to investigate the texture and the quantitative mineralogical composition of selected calcium oxalate-based urinary calculi fragments are presented. The comparison of the results obtained by SR-μCT analysis with those derived from current standard analytical approaches is provided. SR-μCT is shown to be a potentially effective technique for determination of the texture, 3D microstructure, and composition of kidney stones.

  10. Simulating Picosecond X-ray Diffraction from shocked crystals by Post-processing Molecular Dynamics Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimminau, G; Nagler, B; Higginbotham, A

    2008-06-19

    Calculations of the x-ray diffraction patterns from shocked crystals derived from the results of Non-Equilibrium-Molecular-Dynamics (NEMD) simulations are presented. The atomic coordinates predicted by the NEMD simulations combined with atomic form factors are used to generate a discrete distribution of electron density. A Fast-Fourier-Transform (FFT) of this distribution provides an image of the crystal in reciprocal space, which can be further processed to produce quantitative simulated data for direct comparison with experiments that employ picosecond x-ray diffraction from laser-irradiated crystalline targets.
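
    A compact sketch of the post-processing chain described above: bin atomic coordinates onto a regular grid (with the atomic form factor treated as a constant), take the FFT of the resulting density, and use the squared modulus as a reciprocal-space intensity map. Real analyses apply q-dependent form factors and instrument effects that are omitted here.

```python
import numpy as np

def reciprocal_space_intensity(coords, box_lengths, grid=64, form_factor=1.0):
    """Approximate diffraction intensity from MD atomic coordinates.

    coords      : (N, 3) array of atomic positions within the simulation box
    box_lengths : (3,) box dimensions in the same units
    grid        : number of density voxels per axis
    form_factor : constant atomic form factor (q-dependence ignored here)
    """
    coords = np.asarray(coords)
    # Histogram the atoms onto a regular grid to get a discrete electron density.
    edges = [np.linspace(0.0, L, grid + 1) for L in box_lengths]
    density, _ = np.histogramdd(coords, bins=edges)
    density *= form_factor
    # Structure factor via FFT; intensity is its squared modulus.
    structure_factor = np.fft.fftn(density)
    intensity = np.abs(structure_factor) ** 2
    return np.fft.fftshift(intensity)   # centre q = 0 for easier inspection

# Example: a small perfect fcc-like lattice should give sharp reciprocal-lattice peaks.
a, cells = 4.05, 8
base = np.array([[0, 0, 0], [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5]])
offsets = np.array([[i, j, k] for i in range(cells)
                    for j in range(cells) for k in range(cells)])
coords = (offsets[:, None, :] + base[None, :, :]).reshape(-1, 3) * a
I = reciprocal_space_intensity(coords, box_lengths=(cells * a,) * 3)
print(I.shape, I.max())
```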

  11. The Attentional Drift Diffusion Model of Simple Perceptual Decision-Making.

    PubMed

    Tavares, Gabriela; Perona, Pietro; Rangel, Antonio

    2017-01-01

    Perceptual decisions requiring the comparison of spatially distributed stimuli that are fixated sequentially might be influenced by fluctuations in visual attention. We used two psychophysical tasks with human subjects to investigate the extent to which visual attention influences simple perceptual choices, and to test the extent to which the attentional Drift Diffusion Model (aDDM) provides a good computational description of how attention affects the underlying decision processes. We find evidence for sizable attentional choice biases and that the aDDM provides a reasonable quantitative description of the relationship between fluctuations in visual attention, choices and reaction times. We also find that exogenous manipulations of attention induce choice biases consistent with the predictions of the model.
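
    A minimal simulation sketch of the aDDM as it is commonly formulated (the relative decision value drifts toward the attended item while the unattended item's value is discounted by a factor theta); the parameter values and the alternating-fixation rule are placeholders, not fits from this study.

```python
import numpy as np

def simulate_addm(r_left, r_right, d=0.002, theta=0.3, sigma=0.02,
                  barrier=1.0, fixation_ms=400, max_ms=10000, rng=None):
    """Simulate one aDDM trial; returns (choice, reaction_time_ms).

    r_left, r_right : perceptual evidence values for the two stimuli
    d     : drift scaling per ms        theta  : attentional discount (0..1)
    sigma : noise s.d. per ms           barrier: +/- decision threshold
    Fixations simply alternate every `fixation_ms` here; empirical fixation
    patterns would normally be sampled from data.
    """
    if rng is None:
        rng = np.random.default_rng()
    rdv, looking_left = 0.0, rng.random() < 0.5
    for t in range(1, max_ms + 1):
        if t % fixation_ms == 0:                     # switch gaze
            looking_left = not looking_left
        if looking_left:
            drift = d * (r_left - theta * r_right)   # attended item weighted fully
        else:
            drift = d * (theta * r_left - r_right)
        rdv += drift + rng.normal(0.0, sigma)
        if abs(rdv) >= barrier:
            return ("left" if rdv > 0 else "right"), t
    return ("left" if rdv > 0 else "right"), max_ms   # no boundary hit: forced choice

choices = [simulate_addm(3, 2)[0] for _ in range(200)]
print("P(choose left) ~", choices.count("left") / len(choices))
```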

  12. A comparison of manual and quantitative elbow strength testing.

    PubMed

    Shahgholi, Leili; Bengtson, Keith A; Bishop, Allen T; Shin, Alexander Y; Spinner, Robert J; Basford, Jeffrey R; Kaufman, Kenton R

    2012-10-01

    The aim of this study was to compare the clinical ratings of elbow strength obtained by skilled clinicians with objective strength measurement obtained through quantitative testing. A retrospective comparison of subject clinical records with quantitative strength testing results in a motion analysis laboratory was conducted. A total of 110 individuals between the ages of 8 and 65 yrs with traumatic brachial plexus injuries were identified. Patients underwent manual muscle strength testing as assessed on the 5-point British Medical Research Council Scale (5/5, normal; 0/5, absent) and quantitative elbow flexion and extension strength measurements. A total of 92 subjects had elbow flexion testing. Half of the subjects clinically assessed as having normal (5/5) elbow flexion strength on manual muscle testing exhibited less than 42% of their age-expected strength on quantitative testing. Eighty-four subjects had elbow extension strength testing. Similarly, half of those displaying normal elbow extension strength on manual muscle testing were found to have less than 62% of their age-expected values on quantitative testing. Significant differences between manual muscle testing and quantitative findings were not detected for the lesser (0-4) strength grades. Manual muscle testing, even when performed by experienced clinicians, may be more misleading than expected for subjects graded as having normal (5/5) strength. Manual muscle testing estimates for the lesser strength grades (1-4/5) seem reasonably accurate.
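
    A tiny sketch of the comparison performed above: express the quantitative measurement as a percentage of an age-expected reference value and flag the pattern of a normal manual grade despite a large quantitative deficit. The reference value and the 50% cutoff below are illustrative assumptions, not thresholds from the study.

```python
def strength_discrepancy(measured_nm, age_expected_nm, mmt_grade):
    """Compare quantitative elbow strength with the manual muscle testing grade.

    measured_nm     : peak torque from quantitative testing (N*m)
    age_expected_nm : age- and sex-matched reference torque (hypothetical table)
    mmt_grade       : clinician's 0-5 manual muscle testing grade
    """
    pct_expected = 100.0 * measured_nm / age_expected_nm
    # Flag the pattern reported above: a "normal" 5/5 manual grade despite a
    # large quantitative deficit (here, less than half of expected strength).
    overrated = (mmt_grade == 5) and (pct_expected < 50.0)
    return pct_expected, overrated

print(strength_discrepancy(measured_nm=22.0, age_expected_nm=60.0, mmt_grade=5))
```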

  13. An approach to quantitative sustainability assessment in the early stages of process design.

    PubMed

    Tugnoli, Alessandro; Santarelli, Francesco; Cozzani, Valerio

    2008-06-15

    A procedure was developed for the quantitative assessment of key performance indicators suitable for the sustainability analysis of alternative processes, mainly addressing the early stages of process design. The methodology was based on the calculation of a set of normalized impact indices allowing a direct comparison of the additional burden of each process alternative on a selected reference area. Innovative reference criteria were developed to compare and aggregate the impact indicators on the basis of the site-specific impact burden and sustainability policy. An aggregation procedure also allows the calculation of overall sustainability performance indicators and of an "impact fingerprint" of each process alternative. The final aim of the method is to support the decision making process during process development, providing a straightforward assessment of the expected sustainability performances. The application of the methodology to case studies concerning alternative waste disposal processes allowed a preliminary screening of the expected critical sustainability impacts of each process. The methodology was shown to provide useful results to address sustainability issues in the early stages of process design.
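
    A minimal sketch of the normalization-and-aggregation idea: divide each impact indicator by the burden already present in the reference area to obtain a dimensionless index, then combine the indices with policy weights into a single score. The indicators, reference values and weights below are illustrative, not those of the published method.

```python
def sustainability_score(impacts, reference_burdens, weights):
    """Aggregate normalized impact indices for one process alternative.

    impacts           : dict of raw impact values for the alternative
    reference_burdens : dict of the existing burden of the reference area,
                        same keys and units (illustrative values only)
    weights           : dict of policy weights summing to 1
    Returns (per-indicator normalized indices, weighted aggregate score);
    lower is better for every index.
    """
    indices = {k: impacts[k] / reference_burdens[k] for k in impacts}
    aggregate = sum(weights[k] * indices[k] for k in indices)
    return indices, aggregate

alt_a = {"CO2_t_per_y": 1.2e4, "water_m3_per_y": 8.0e5}
ref = {"CO2_t_per_y": 2.0e6, "water_m3_per_y": 5.0e7}
w = {"CO2_t_per_y": 0.6, "water_m3_per_y": 0.4}
print(sustainability_score(alt_a, ref, w))
```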

  14. Evaluation of cultured human dermal- and dermo-epidermal substitutes focusing on extracellular matrix components: Comparison of protein and RNA analysis.

    PubMed

    Oostendorp, Corien; Meyer, Sarah; Sobrio, Monia; van Arendonk, Joyce; Reichmann, Ernst; Daamen, Willeke F; van Kuppevelt, Toin H

    2017-05-01

    Treatment of full-thickness skin defects with split-thickness skin grafts is generally associated with contraction and scar formation and cellular skin substitutes have been developed to improve skin regeneration. The evaluation of cultured skin substitutes is generally based on qualitative parameters focusing on histology. In this study we focused on quantitative evaluation to provide a template for comparison of human bio-engineered skin substitutes between clinical and/or research centers, and to supplement histological data. We focused on extracellular matrix proteins since these components play an important role in skin regeneration. As a model we analyzed the human dermal substitute denovoDerm and the dermo-epidermal skin substitute denovoSkin. The quantification of the extracellular matrix proteins type III collagen and laminin 5 in tissue homogenates using western blotting analysis and ELISA was not successful. The same was true for assaying lysyl oxidase, an enzyme involved in crosslinking of matrix molecules. As an alternative, gene expression levels were measured using qPCR. Various RNA isolation procedures were probed. The gene expression profile for specific dermal and epidermal genes could be measured reliably and reproducibly. Differences caused by changes in the cell culture conditions could easily be detected. The number of cells in the skin substitutes was measured using the PicoGreen dsDNA assay, which was found highly quantitative and reproducible. The (dis)advantages of assays used for quantitative evaluation of skin substitutes are discussed. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.

  15. Comparison of Quantitative PCR and Droplet Digital PCR Multiplex Assays for Two Genera of Bloom-Forming Cyanobacteria, Cylindrospermopsis and Microcystis.

    PubMed

    Te, Shu Harn; Chen, Enid Yingru; Gin, Karina Yew-Hoong

    2015-08-01

    The increasing occurrence of harmful cyanobacterial blooms, often linked to deteriorated water quality and adverse public health effects, has become a worldwide concern in recent decades. The use of molecular techniques such as real-time quantitative PCR (qPCR) has become increasingly popular in the detection and monitoring of harmful cyanobacterial species. Multiplex qPCR assays that quantify several toxigenic cyanobacterial species have been established previously; however, there is no molecular assay that detects several bloom-forming species simultaneously. Microcystis and Cylindrospermopsis are the two most commonly found genera and are known to be able to produce microcystin and cylindrospermopsin hepatotoxins. In this study, we designed primers and probes which enable quantification of these genera based on the RNA polymerase C1 gene for Cylindrospermopsis species and the c-phycocyanin beta subunit-like gene for Microcystis species. Duplex assays were developed for two molecular techniques-qPCR and droplet digital PCR (ddPCR). After optimization, both qPCR and ddPCR assays have high linearity and quantitative correlations for standards. Comparisons of the two techniques showed that qPCR has higher sensitivity, a wider linear dynamic range, and shorter analysis time and that it was more cost-effective, making it a suitable method for initial screening. However, the ddPCR approach has lower variability and was able to handle the PCR inhibition and competitive effects found in duplex assays, thus providing more precise and accurate analysis for bloom samples. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  16. Development and Validation of a Quantitative PCR Assay Using Multiplexed Hydrolysis Probes for Detection and Quantification of Theileria orientalis Isolates and Differentiation of Clinically Relevant Subtypes

    PubMed Central

    Bogema, D. R.; Deutscher, A. T.; Fell, S.; Collins, D.; Eamens, G. J.

    2015-01-01

    Theileria orientalis is an emerging pathogen of cattle in Asia, Australia, and New Zealand. This organism is a vector-borne hemoprotozoan that causes clinical disease characterized by anemia, abortion, and death, as well as persistent subclinical infections. Molecular methods of diagnosis are preferred due to their sensitivity and utility in differentiating between pathogenic and apathogenic genotypes. Conventional PCR (cPCR) assays for T. orientalis detection and typing are laborious and do not provide an estimate of parasite load. Current real-time PCR assays cannot differentiate between clinically relevant and benign genotypes or are only semiquantitative without a defined clinical threshold. Here, we developed and validated a hydrolysis probe quantitative PCR (qPCR) assay which universally detects and quantifies T. orientalis and identifies the clinically associated Ikeda and Chitose genotypes (UIC assay). Comparison of the UIC assay results with previously validated universal and genotype-specific cPCR results demonstrated that qPCR detects and differentiates T. orientalis with high sensitivity and specificity. Comparison of the quantitative results with percent parasitemia (determined via blood film analysis) and with packed cell volume (PCV) revealed significant positive and negative correlations, respectively. One-way analysis of variance (ANOVA) indicated that blood samples from animals with clinical signs of disease contained statistically higher concentrations of T. orientalis DNA than animals with subclinical infections. We propose clinical thresholds to assist in classifying high-, moderate-, and low-level infections and describe how parasite load and the presence of the Ikeda and Chitose genotypes relate to disease. PMID:25588653

  17. A quantitative comparison of transesophageal and epicardial color Doppler echocardiography in the intraoperative assessment of mitral regurgitation.

    PubMed

    Kleinman, J P; Czer, L S; DeRobertis, M; Chaux, A; Maurer, G

    1989-11-15

    Epicardial and transesophageal color Doppler echocardiography are both widely used for the intraoperative assessment of mitral regurgitation (MR); however, it has not been established whether grading of regurgitation is comparable when evaluated by these 2 techniques. MR jet size was quantitatively compared in 29 hemodynamically and temporally matched open-chest epicardial and transesophageal color Doppler echocardiography studies from 22 patients (18 with native and 4 with porcine mitral valves) scheduled to undergo mitral valve repair or replacement. Jet area, jet length and left atrial area were analyzed. Comparison of jet area measurements as assessed by epicardial and transesophageal color flow mapping revealed an excellent correlation between the techniques (r = 0.95, p less than 0.001). Epicardial and transesophageal jet length measurements were also similar (r = 0.77, p less than 0.001). Left atrial area could not be measured in 18 transesophageal studies (62%) due to foreshortening, and in 5 epicardial studies (17%) due to poor image resolution. Acoustic interference with left atrial and color flow mapping signals was noted in all patients with mitral valve prostheses when imaged by epicardial echocardiography, but this did not occur with transesophageal imaging. Thus, in patients undergoing valve repair or replacement, transesophageal and epicardial color flow mapping provide similar quantitative assessment of MR jet size. Jet area to left atrial area ratios have limited applicability in transesophageal color flow mapping, due to foreshortening of the left atrial borders in transesophageal views. Transesophageal color flow mapping may be especially useful in assessing dysfunctional mitral prostheses due to the lack of left atrial acoustic interference.

  18. Quantitative analysis of background parenchymal enhancement in whole breast on MRI: Influence of menstrual cycle and comparison with a qualitative analysis.

    PubMed

    Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee

    2018-06-01

    We quantitatively analyzed background parenchymal enhancement (BPE) in whole breast according to menstrual cycle and compared it with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software with MATLAB. From each voxel of whole breast, the software calculated BPE using following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group showing significant difference (p = .009 for minimal vs. mild, p < 0.001 for other comparisons). Spearman's correlation test showed that there was strong significant correlation between qualitative and quantitative BPE (r = 0.63, p < 0.001). The mean BPE value was 48.7% for patients in the first week of the menstrual cycle, 43.5% in the second week, 49% in the third week, and 49.4% for those in the fourth week. The difference between the second and fourth weeks was significant (p = .005). Median, 90th percentile, and 10th percentile values were also significantly different between the second and fourth weeks but not different in other comparisons (first vs. second, first vs. third, first vs. fourth, second vs. third, or third vs. fourth). Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
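
    A minimal voxel-wise sketch of the BPE formula quoted above, applied to pre- and post-contrast volumes restricted to a breast-tissue mask; segmentation and registration are assumed to have been done beforehand, and the in-house MATLAB implementation is not reproduced here.

```python
import numpy as np

def mean_bpe_percent(pre, post, breast_mask):
    """Voxel-wise BPE = (SI_post - SI_pre) / SI_pre * 100, averaged over the mask.

    pre, post   : baseline and ~90 s post-contrast signal-intensity volumes
    breast_mask : boolean volume selecting fibroglandular tissue
    """
    pre = np.asarray(pre, dtype=float)
    post = np.asarray(post, dtype=float)
    valid = breast_mask & (pre > 0)                  # avoid division by zero
    bpe = (post[valid] - pre[valid]) / pre[valid] * 100.0
    return bpe.mean(), np.percentile(bpe, [10, 50, 90])

# Synthetic example: a volume enhancing by ~40% on average.
rng = np.random.default_rng(0)
pre = rng.uniform(100, 200, size=(32, 32, 16))
post = pre * rng.normal(1.4, 0.1, size=pre.shape)
mask = np.ones(pre.shape, dtype=bool)
print(mean_bpe_percent(pre, post, mask))
```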

  19. The impact of injector-based contrast agent administration in time-resolved MRA.

    PubMed

    Budjan, Johannes; Attenberger, Ulrike I; Schoenberg, Stefan O; Pietsch, Hubertus; Jost, Gregor

    2018-05-01

    Time-resolved contrast-enhanced MR angiography (4D-MRA), which allows the simultaneous visualization of the vasculature and blood-flow dynamics, is widely used in clinical routine. In this study, the impact of two different contrast agent injection methods on 4D-MRA was examined in a controlled, standardized setting in an animal model. Six anesthetized Goettingen minipigs underwent two identical 4D-MRA examinations at 1.5 T in a single session. The contrast agent (0.1 mmol/kg body weight gadobutrol, followed by 20 ml saline) was injected using either manual injection or an automated injection system. A quantitative comparison of vascular signal enhancement and quantitative renal perfusion analyses were performed. Analysis of signal enhancement revealed higher peak enhancements and shorter time to peak intervals for the automated injection. Significantly different bolus shapes were found: automated injection resulted in a compact first-pass bolus shape clearly separated from the recirculation while manual injection resulted in a disrupted first-pass bolus with two peaks. In the quantitative perfusion analyses, statistically significant differences in plasma flow values were found between the injection methods. The results of both qualitative and quantitative 4D-MRA depend on the contrast agent injection method, with automated injection providing more defined bolus shapes and more standardized examination protocols. • Automated and manual contrast agent injection result in different bolus shapes in 4D-MRA. • Manual injection results in an undefined and interrupted bolus with two peaks. • Automated injection provides more defined bolus shapes. • Automated injection can lead to more standardized examination protocols.
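
    As a rough illustration of the signal-enhancement analysis described (peak enhancement and time to peak from a vascular signal-time curve), the sketch below compares a compact first-pass bolus with a split, double-peaked one. The curves and function names are hypothetical; this is not the study's analysis pipeline.

```python
import numpy as np

def bolus_metrics(t, signal, baseline_points=3):
    """Return relative peak enhancement (%) and time to peak (s) for a signal-time curve."""
    s0 = signal[:baseline_points].mean()            # pre-contrast baseline
    enhancement = (signal - s0) / s0 * 100.0        # percent enhancement
    i_peak = int(np.argmax(enhancement))
    return enhancement[i_peak], t[i_peak] - t[0]

# Synthetic first-pass bolus curves (illustrative): compact vs. double-peaked
t = np.arange(0, 60, 1.5)                           # s, one frame per 1.5 s
compact = 100 + 80 * np.exp(-((t - 20) / 5.0) ** 2)
split = 100 + 45 * np.exp(-((t - 18) / 4.0) ** 2) + 40 * np.exp(-((t - 28) / 4.0) ** 2)

for name, curve in [("automated-like", compact), ("manual-like", split)]:
    peak, ttp = bolus_metrics(t, curve)
    print(f"{name}: peak enhancement = {peak:.0f}%, time to peak = {ttp:.1f} s")
```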

  20. Determination of Bifidobacterium and Lactobacillus in breast milk of healthy women by digital PCR.

    PubMed

    Qian, L; Song, H; Cai, W

    2016-09-01

    Breast milk is one of the most important sources of postnatal microbes. Quantitative real-time polymerase chain reaction (qRT-PCR) is currently used for the quantitative analysis of bacterial 16S rRNA genes in breast milk. However, this method relies on the use of standard curves and is imprecise when quantitating target DNA of low abundance. In contrast, droplet digital PCR (DD-PCR) provides absolute quantitation without the need for calibration curves. A comparison between DD-PCR and qRT-PCR was conducted for the quantitation of Bifidobacterium and Lactobacillus 16S rRNA genes in human breast milk, and the impacts of selected maternal factors on the levels of these two bacteria in breast milk were studied. In this study, DD-PCR reported between 0 and 34,460 16S rRNA gene copies of Bifidobacterium and between 1,108 and 634,000 16S rRNA gene copies of Lactobacillus per 1 ml of breast milk. The 16S rRNA gene copy number of Lactobacillus was much greater than that of Bifidobacterium in breast milk. DD-PCR showed a 10-fold lower limit of quantitation compared with qRT-PCR. A higher correlation and agreement were observed between qRT-PCR and DD-PCR for Lactobacillus quantitation than for Bifidobacterium quantitation. Based on our DD-PCR quantitation, a low abundance of Bifidobacterium in breast milk correlated with a higher pre-pregnancy body mass index (BMI). However, no significant difference was observed for these two bacteria in breast milk between mothers who had vaginal deliveries and those who had caesarean deliveries. This study suggests that DD-PCR is a better tool for quantitating the bacterial load of breast milk than the conventional qRT-PCR method. The number of Bifidobacterium bacteria in breast milk is influenced by maternal pre-pregnancy BMI.
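
    The absolute quantitation that distinguishes DD-PCR from qRT-PCR rests on Poisson statistics applied to the fraction of positive droplets. The sketch below shows that standard correction; the droplet volume, droplet counts and the scaling note are illustrative assumptions rather than values from this study.

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Copies of target per microliter of reaction from droplet counts.

    Poisson correction: lambda = -ln(1 - positive/total) copies per droplet.
    """
    p = positive / total
    if p >= 1.0:
        raise ValueError("all droplets positive: above the quantifiable range")
    lam = -math.log(1.0 - p)                    # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)     # nl -> ul

# Illustrative droplet counts (not data from the study)
conc = ddpcr_copies_per_ul(positive=1200, total=15000)
print(f"{conc:.1f} copies/ul of reaction")
# Scaling to 16S rRNA gene copies per ml of milk would further require the
# DNA extraction volume and the template volume loaded per reaction.
```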

  1. Quantitative comparison of cognitive behavioral therapy and music therapy research: a methodological best-practices analysis to guide future investigation for adult psychiatric patients.

    PubMed

    Silverman, Michael J

    2008-01-01

    While the music therapy profession is relatively young and small in size, it can treat a variety of clinical populations and has established a diverse research base. However, although the profession originated in work with persons diagnosed with mental illnesses, there is a considerable lack of quantitative research concerning the effects of music therapy with this population. Music therapy clinicians and researchers have reported on this lack of evidence and the difficulty of conducting psychosocial research on their interventions (Choi, 1997; Silverman, 2003a). While published studies have provided suggestions for future research, no studies have provided detailed proposals for the methodology and design of meticulous, high-quality randomized controlled psychiatric music therapy research. How have other psychotherapies built their research bases, and could the music therapy field borrow from their rigorous "methodological best practices" to strengthen its own literature base? Therefore, because the National Institute of Mental Health identifies cognitive behavioral therapy (CBT) as the treatment of choice among evidence-based psychotherapies, aspects of this psychotherapy's literature base were analyzed. The purpose of this literature analysis was to (a) analyze and identify components of high-quality quantitative CBT research for adult psychiatric consumers, (b) analyze and identify the variables and other elements of existing quantitative psychiatric music therapy research for adult consumers, and (c) compare the two data sets to identify the best methodological designs and variables for future quantitative music therapy research with the mental health population. A table analyzing randomized and thoroughly controlled studies involving the use of CBT for persons with severe mental illnesses is included to determine chief components of high-quality experimental research designs and implementation of quantitative clinical research. The table also shows the same analyzed components for existing quantitative psychiatric music therapy research with adult consumers, thus highlighting potential areas and elements for future investigations. A second table depicts a number of potential dependent measures and their sources to be evaluated in future music therapy studies. A third table providing suggestions for future research is derived from a synthesis of the tables and is included to guide researchers and encourage the advancement and expansion of the current literature base. The body of the paper is a discussion of the results of the literature analysis derived from the tables, meta-analyses, and reviews of literature. It is hoped that this report will lead to the addition of future high-quality quantitative research to the psychiatric music therapy literature base and thus provide evidence-based services to as many persons with mental illnesses as possible.

  2. Evidences of local adaptation in quantitative traits in Prosopis alba (Leguminosae).

    PubMed

    Bessega, C; Pometti, C; Ewens, M; Saidman, B O; Vilardi, J C

    2015-02-01

    Signals of selection on quantitative traits can be detected by comparing the genetic differentiation of neutral molecular markers with that of quantitative traits, by multivariate extensions of the same model, and by examining the additive covariance among relatives. Using three different tests, we studied signals of selection in Prosopis alba populations across 15 quantitative traits: three economically important life history traits (height, basal diameter and biomass), 11 leaf morphology traits that may be related to heat tolerance and physiological responses, and spine length, which is important for silvicultural purposes. We analyzed 172 G1-generation trees growing in a common garden, belonging to 32 open-pollinated families from eight sampling sites in Argentina. The multivariate phenotypes differed significantly among origins, and the highest differentiation corresponded to foliar traits. Molecular genetic markers (SSRs) exhibited significant differentiation and provided convincing evidence that natural selection is responsible for the patterns of morphological differentiation. The heterogeneous selection observed over phenotypic traits suggests different optima in each population and has important implications for gene resource management. The results suggest that the adaptive significance of traits should be considered together with population provenance as a crucial point prior to any breeding or selection program, especially in Prosopis, where the first steps of such programs are under development.
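
    One common way to formalize the first of these comparisons is the Qst-Fst contrast, in which quantitative-trait differentiation is computed from between-population and additive within-population variance components and set against neutral-marker differentiation. The sketch below assumes those variance components have already been estimated (e.g., from the open-pollinated family design); all trait names and numbers are hypothetical.

```python
def qst(var_between, var_additive_within):
    """Qst = sigma2_B / (sigma2_B + 2 * sigma2_A,W) for a quantitative trait."""
    return var_between / (var_between + 2.0 * var_additive_within)

# Illustrative variance components (not values from the study)
traits = {
    "height":        (0.40, 1.10),
    "leaflet width": (0.90, 0.80),
    "spine length":  (0.15, 1.30),
}
fst_neutral = 0.08   # hypothetical SSR-based differentiation

for trait, (vb, va) in traits.items():
    q = qst(vb, va)
    verdict = "Qst > Fst (suggests divergent selection)" if q > fst_neutral else \
              "Qst <= Fst (consistent with drift or uniform selection)"
    print(f"{trait}: Qst = {q:.2f}  -> {verdict}")
```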

  3. Comparison of three quantitative phosphoproteomic strategies to study receptor tyrosine kinase signaling.

    PubMed

    Zhang, Guoan; Neubert, Thomas A

    2011-12-02

    There are three quantitative phosphoproteomic strategies most commonly used to study receptor tyrosine kinase (RTK) signaling. These strategies quantify changes in: (1) all three forms of phosphosites (phosphoserine, phosphothreonine and phosphotyrosine) following enrichment of phosphopeptides by titanium dioxide or immobilized metal affinity chromatography; (2) phosphotyrosine sites following anti-phosphotyrosine antibody enrichment of phosphotyrosine peptides; or (3) phosphotyrosine proteins and their binding partners following anti-phosphotyrosine protein immunoprecipitation. However, it is not clear from the literature which strategy is more effective. In this study, we assessed the utility of these three phosphoproteomic strategies in RTK signaling studies by using EphB receptor signaling as an example. We used all three strategies with stable isotope labeling with amino acids in cell culture (SILAC) to compare changes in phosphoproteomes upon EphB receptor activation. We used bioinformatic analysis to compare results from the three analyses. Our results show that the three strategies provide complementary information about RTK pathways.

  4. Quantitative proteome analysis using isobaric peptide termini labeling (IPTL).

    PubMed

    Arntzen, Magnus O; Koehler, Christian J; Treumann, Achim; Thiede, Bernd

    2011-01-01

    The quantitative comparison of proteome level changes across biological samples has become an essential feature in proteomics that remains challenging. We have recently introduced isobaric peptide termini labeling (IPTL), a novel strategy for isobaric quantification based on the derivatization of peptide termini with complementary isotopically labeled reagents. Unlike non-isobaric quantification methods, sample complexity at the MS level is not increased, providing improved sensitivity and protein coverage. The distinguishing feature of IPTL when comparing it to more established isobaric labeling methods (iTRAQ and TMT) is the presence of quantification signatures in all sequence-determining ions in MS/MS spectra, not only in the low mass reporter ion region. This makes IPTL a quantification method that is accessible to mass spectrometers with limited capabilities in the low mass range. Also, the presence of several quantification points in each MS/MS spectrum increases the robustness of the quantification procedure.

  5. Quantitative description of ion transport via plasma membrane of yeast and small cells.

    PubMed

    Volkov, Vadim

    2015-01-01

    Modeling of ion transport via the plasma membrane requires identification and quantitative understanding of the processes involved. Brief characterization of the main ion transport systems of a yeast cell (Pma1, Ena1, TOK1, Nha1, Trk1, Trk2, non-selective cation conductance) and determination of the exact number of molecules of each transporter per typical cell allow us to predict the corresponding ion flows. In this review, a comparison of ion transport in a small yeast cell and several animal cell types is provided. The importance of the cell volume to surface ratio is emphasized. The role of the cell wall and lipid rafts is discussed with respect to the required increase in the spatial and temporal resolution of measurements. Conclusions are formulated to describe the specific features of ion transport in a yeast cell. Potential directions of future research are outlined based on these assumptions.
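
    The back-of-the-envelope logic behind predicting ion flows from transporter copy numbers can be sketched as follows; the transporter count, turnover rate and cell volume are hypothetical placeholders, and counter-fluxes, buffering and membrane potential are ignored.

```python
AVOGADRO = 6.022e23

def concentration_change_rate(n_transporters, turnover_per_s, cell_volume_fl):
    """Rate of cytosolic concentration change (mM/s) if all transporters of one type
    run at the given turnover; a crude, illustrative estimate only."""
    ions_per_s = n_transporters * turnover_per_s
    volume_l = cell_volume_fl * 1e-15
    return ions_per_s / (AVOGADRO * volume_l) * 1e3   # mol/(L*s) -> mM/s

# Hypothetical numbers for a ~70 fl yeast cell and a single transporter population
rate = concentration_change_rate(n_transporters=2.0e5, turnover_per_s=100.0,
                                 cell_volume_fl=70.0)
print(f"~{rate:.2f} mM/s change from this transporter population alone")
```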

  6. Utilisation of the c-fos immunohistochemical method: a 2004 quantitative study.

    PubMed

    Robert, C; Arreto, C D; Gaudy, J F; Wilson, C S

    2007-10-01

    The aim of this study was to provide a quantitative view of the utilisation of the c-fos immunohistochemical method. Articles including the term "c-fos" in their title, abstract or keywords and published in 2004 were retrieved from the Current Content/Life Sciences or Current Content/Clinical Medicine collection of the SCI database. The 933 article-type documents retained were distributed in almost all the sub-disciplines of the Life Sciences and Clinical Medicine, but were principally published in the field of neuroscience. They were authored by researchers from 44 countries - the most prolific were the USA (435 articles), Japan (135) and the UK (55). The 933 articles were published in 283 different journals; all but one of the top-20 most prolific journals are in the Life Sciences discipline, and their Impact Factors ranged from 2.0 to 7.9. A comparison of the USA and the European Union scientific profiles is also made.

  7. Quantitative description of ion transport via plasma membrane of yeast and small cells

    PubMed Central

    Volkov, Vadim

    2015-01-01

    Modeling of ion transport via the plasma membrane requires identification and quantitative understanding of the processes involved. Brief characterization of the main ion transport systems of a yeast cell (Pma1, Ena1, TOK1, Nha1, Trk1, Trk2, non-selective cation conductance) and determination of the exact number of molecules of each transporter per typical cell allow us to predict the corresponding ion flows. In this review, a comparison of ion transport in a small yeast cell and several animal cell types is provided. The importance of the cell volume to surface ratio is emphasized. The role of the cell wall and lipid rafts is discussed with respect to the required increase in the spatial and temporal resolution of measurements. Conclusions are formulated to describe the specific features of ion transport in a yeast cell. Potential directions of future research are outlined based on these assumptions. PMID:26113853

  8. Digital detection of endonuclease mediated gene disruption in the HIV provirus

    PubMed Central

    Sedlak, Ruth Hall; Liang, Shu; Niyonzima, Nixon; De Silva Feelixge, Harshana S.; Roychoudhury, Pavitra; Greninger, Alexander L.; Weber, Nicholas D.; Boissel, Sandrine; Scharenberg, Andrew M.; Cheng, Anqi; Magaret, Amalia; Bumgarner, Roger; Stone, Daniel; Jerome, Keith R.

    2016-01-01

    Genome editing by designer nucleases is a rapidly evolving technology utilized in a highly diverse set of research fields. Across these fields, the T7 endonuclease mismatch cleavage assay, or Surveyor assay, is the most commonly used tool to assess genomic editing by designer nucleases. This assay, while relatively easy to perform, provides only a semi-quantitative measure of mutation efficiency and lacks sensitivity and accuracy. We demonstrate a simple droplet digital PCR assay that quickly quantitates a range of indel mutations, with detection as low as 0.02% mutant in a wild-type background and with precision (≤6% CV) and accuracy superior to either the mismatch cleavage assay or clonal sequencing when benchmarked against next-generation sequencing. The precision and simplicity of this assay will facilitate the comparison and optimization of gene editing approaches, accelerating progress in this rapidly moving field. PMID:26829887

  9. Index of Theta/Alpha Ratio of the Quantitative Electroencephalogram in Alzheimer's Disease: A Case-Control Study.

    PubMed

    Fahimi, Golshan; Tabatabaei, Seyed Mahmoud; Fahimi, Elnaz; Rajebi, Hamid

    2017-08-01

    Alzheimer's disease (AD) is a devastating neurodegenerative disorder associated with cognitive, behavioral and motor impairments. The main symptom of AD is dementia, which causes difficulties in carrying out daily activities. Brain waves are altered in people with AD, and relative indices of brain waves can be useful in the diagnosis of AD. In this case-control study, 50 patients with AD and 50 matched healthy individuals were enrolled in the case and control groups, respectively. By recording and analyzing brain waves using quantitative electroencephalography (QEEG), the theta/alpha ratio index was assessed in both groups. The theta/alpha ratio index was significantly higher in patients with AD than in healthy individuals (P < 0.05). The theta/alpha ratio index obtained by QEEG provides a non-invasive diagnostic marker of AD, which may be helpful in identifying non-advanced disease in susceptible individuals.
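
    A theta/alpha ratio index of this kind can be computed from the EEG power spectrum. The sketch below uses Welch's method on a single synthetic channel; the band limits (4-8 Hz theta, 8-13 Hz alpha), sampling rate and signal are illustrative assumptions, not the study's recording or processing parameters.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

def band_power(f, psd, lo, hi):
    sel = (f >= lo) & (f < hi)
    return trapezoid(psd[sel], f[sel])

def theta_alpha_ratio(eeg, fs):
    """Theta (4-8 Hz) to alpha (8-13 Hz) power ratio from one EEG channel."""
    f, psd = welch(eeg, fs=fs, nperseg=4 * fs)
    return band_power(f, psd, 4, 8) / band_power(f, psd, 8, 13)

# Synthetic 60 s signal with mixed theta and alpha content (illustrative)
fs = 256
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
eeg = 1.2 * np.sin(2 * np.pi * 6 * t) + 0.8 * np.sin(2 * np.pi * 10 * t) \
      + 0.3 * rng.standard_normal(t.size)
print(f"theta/alpha ratio = {theta_alpha_ratio(eeg, fs):.2f}")
```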

  10. Rapid determination of free fatty acid content in waste deodorizer distillates using single bounce-attenuated total reflectance-FTIR spectroscopy.

    PubMed

    Naz, Saba; Sherazi, Sayed Tufail Hussain; Talpur, Farah N; Mahesar, Sarfaraz A; Kara, Huseyin

    2012-01-01

    A simple, rapid, economical, and environmentally friendly analytical method was developed for the quantitative assessment of free fatty acids (FFAs) present in deodorizer distillates and crude oils by single bounce-attenuated total reflectance-FTIR spectroscopy. Partial least squares regression was applied for the calibration model, based on the carbonyl (C=O) peak region from 1726 to 1664 cm-1 associated with the FFAs. The proposed method entirely avoids the use of organic solvents and costly standards and could be applied easily in the oil processing industry. The accuracy of the method was checked by comparison with the conventional standard American Oil Chemists' Society (AOCS) titrimetric procedure, which showed good correlation (R = 0.99980) with an SD of ±0.05%. Therefore, the proposed method could be used as an alternative to the AOCS titrimetric method for the quantitative determination of FFAs, especially in deodorizer distillates.
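
    The calibration step (partial least squares on the 1664-1726 cm-1 carbonyl region against reference FFA values) can be sketched with scikit-learn as below; the synthetic spectra, band shape and number of latent components are illustrative assumptions rather than the published model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic "spectra" restricted to the carbonyl region (1664-1726 cm-1):
# a single FFA-related band whose intensity scales with FFA content plus noise.
rng = np.random.default_rng(0)
wavenumbers = np.linspace(1664, 1726, 64)
ffa = rng.uniform(0.1, 10.0, size=60)                       # % FFA (reference values)
band = np.exp(-((wavenumbers - 1712) / 8.0) ** 2)           # idealized C=O band shape
X = ffa[:, None] * band[None, :] + 0.02 * rng.standard_normal((60, 64))

pls = PLSRegression(n_components=3)
ffa_pred = cross_val_predict(pls, X, ffa, cv=5).ravel()     # cross-validated predictions
rmsecv = np.sqrt(np.mean((ffa_pred - ffa) ** 2))
r = np.corrcoef(ffa_pred, ffa)[0, 1]
print(f"RMSECV = {rmsecv:.3f} % FFA, R = {r:.4f}")
```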

  11. Mothering occupations when parenting children with feeding concerns: a mixed methods study.

    PubMed

    Winston, Kristin A; Dunbar, Sandra B; Reed, Carol N; Francis-Connolly, Elizabeth

    2010-06-01

    The occupations of mothering have gained attention in occupation-based research and literature; however, many aspects of mothering remain unexplored. The purpose of this study was to gain insight into mothers' perceptions of their occupations when mothering a child with feeding difficulties. The study used a mixed-methods design utilizing the Parental Stress Scale (PSS), the Life Satisfaction Index for Parents (LSI-P), and phenomenological interviews. Comparison of the datasets illuminated the quantitative findings with the words of the women interviewed. Although there was only one statistically significant finding in the quantitative data, concerning satisfaction with leisure and recreation, the qualitative data provided rich descriptions of mothers' perceptions of stress and life satisfaction. Mixed-methods data analysis revealed the complex nature of the interaction between mothering occupations and mothering a child with feeding concerns, as well as how these concerns might influence occupational therapy practice.

  12. The potential of statistical shape modelling for geometric morphometric analysis of human teeth in archaeological research

    PubMed Central

    Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia

    2017-01-01

    This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
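
    The PCA stage of the statistical shape model can be sketched as follows, assuming the specimens have already been digitised, aligned and registered so that each tooth is represented by corresponding point coordinates (the registration itself and the enamel/dentin material parameter are not shown); the random shapes below are placeholders, not the archaeological data.

```python
import numpy as np

def shape_pca(shapes):
    """PCA of registered shapes.

    shapes : array (n_specimens, n_points, 3) of corresponding coordinates.
    Returns mean shape, principal components, per-component variance and PC weights.
    """
    n, p, d = shapes.shape
    X = shapes.reshape(n, p * d)
    mean = X.mean(axis=0)
    Xc = X - mean
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    variance = s ** 2 / (n - 1)
    weights = Xc @ Vt.T                      # PC scores used for inter-sample comparison
    return mean.reshape(p, d), Vt, variance, weights

# Illustrative random "canines": 30 specimens x 200 semi-landmarks with a dominant mode
rng = np.random.default_rng(2)
base = rng.normal(size=(200, 3))
size_factor = rng.normal(0.0, 0.08, size=30)               # a dominant "size-like" mode
shapes = base[None] * (1.0 + size_factor[:, None, None]) \
         + 0.02 * rng.normal(size=(30, 200, 3))

mean, components, var, weights = shape_pca(shapes)
explained = var / var.sum()
print(f"PC1 explains {explained[0]:.1%} of shape variance; PC1 weights range "
      f"{weights[:, 0].min():.3f} to {weights[:, 0].max():.3f}")
```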

  13. Fast quantitative optical detection of heat dissipation by surface plasmon polaritons.

    PubMed

    Möller, Thomas B; Ganser, Andreas; Kratt, Martina; Dickreuter, Simon; Waitz, Reimar; Scheer, Elke; Boneberg, Johannes; Leiderer, Paul

    2018-06-13

    Heat management at the nanoscale is an issue of increasing importance. In optoelectronic devices the transport and decay of plasmons contribute to the dissipation of heat. By comparison of experimental data and simulations we demonstrate that it is possible to gain quantitative information about excitation, propagation and decay of surface plasmon polaritons (SPPs) in a thin gold stripe supported by a silicon membrane. The temperature-dependent optical transmissivity of the membrane is used to determine the temperature distribution around the metal stripe with high spatial and temporal resolution. This method is complementary to techniques where the propagation of SPPs is monitored optically, and provides additional information which is not readily accessible by other means. In particular, we demonstrate that the thermal conductivity of the membrane can also be derived from our analysis. The results presented here show the high potential of this tool for heat management studies in nanoscale devices.

  14. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020

  15. Multisite formative assessment for the Pathways study to prevent obesity in American Indian schoolchildren

    PubMed Central

    Gittelsohn, Joel; Evans, Marguerite; Story, Mary; Davis, Sally M; Metcalfe, Lauve; Helitzer, Deborah L; Clay, Theresa E

    2016-01-01

    We describe the formative assessment process, using an approach based on social learning theory, for the development of a school-based obesity-prevention intervention into which cultural perspectives are integrated. The feasibility phase of the Pathways study was conducted in multiple settings in 6 American Indian nations. The Pathways formative assessment collected both qualitative and quantitative data. The qualitative data identified key social and environmental issues and enabled local people to express their own needs and views. The quantitative, structured data permitted comparison across sites. Both types of data were integrated by using a conceptual and procedural model. The formative assessment results were used to identify and rank the behavioral risk factors that were to become the focus of the Pathways intervention and to provide guidance on developing common intervention strategies that would be culturally appropriate and acceptable to all sites. PMID:10195601

  16. A comparative uncertainty study of the calibration of macrolide antibiotic reference standards using quantitative nuclear magnetic resonance and mass balance methods.

    PubMed

    Liu, Shu-Yu; Hu, Chang-Qin

    2007-10-17

    This study introduces the general method of quantitative nuclear magnetic resonance (qNMR) for the calibration of reference standards of macrolide antibiotics. Several qNMR experimental conditions were optimized, including the delay, an important parameter for quantification. Three kinds of macrolide antibiotics were used to validate the accuracy of the qNMR method by comparison with results obtained by high performance liquid chromatography (HPLC). The purities of five common reference standards of macrolide antibiotics were measured by the 1H qNMR method and the mass balance method, respectively, and the results of the two methods were compared. qNMR is quick and simple to use. In new drug research and development, qNMR provides a reliable method for purity analysis of reference standards.
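
    1H qNMR purity assignment is commonly based on the internal-standard relation between integrated signal areas, proton counts, molar masses and weighed masses. The sketch below implements that generic relation with wholly hypothetical numbers; it is not necessarily the exact calibration model used in the study.

```python
def qnmr_purity(I_analyte, I_std, n_analyte, n_std,
                M_analyte, M_std, m_analyte, m_std, purity_std):
    """Purity of an analyte by 1H qNMR against an internal calibration standard.

    I : integrated signal areas, n : number of protons giving each signal,
    M : molar masses (g/mol), m : weighed masses (mg), purity_std : fractional.
    """
    return (I_analyte / I_std) * (n_std / n_analyte) \
         * (M_analyte / M_std) * (m_std / m_analyte) * purity_std

# Illustrative, hypothetical values only (not measurements from the study)
p = qnmr_purity(I_analyte=1.000, I_std=1.352, n_analyte=3, n_std=4,
                M_analyte=500.0, M_std=200.2, m_analyte=20.0, m_std=8.0,
                purity_std=0.999)
print(f"estimated purity = {100 * p:.1f}%")
```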

  17. Multilayer Markov Random Field models for change detection in optical remote sensing images

    NASA Astrophysics Data System (ADS)

    Benedek, Csaba; Shadaydeh, Maha; Kato, Zoltan; Szirányi, Tamás; Zerubia, Josiane

    2015-09-01

    In this paper, we give a comparative study on three Multilayer Markov Random Field (MRF) based solutions proposed for change detection in optical remote sensing images, called Multicue MRF, Conditional Mixed Markov model, and Fusion MRF. Our purposes are twofold. On the one hand, we highlight the significance of the focused model family and set it against various state-of-the-art approaches through a thematic analysis and quantitative tests. We discuss the advantages and drawbacks of class-comparison versus direct approaches, the usage of training data, the various targeted application fields and the different ways of Ground Truth generation, meanwhile informing the reader about the roles in which the Multilayer MRFs can be efficiently applied. On the other hand, we also emphasize the differences between the three focused models at various levels, considering the model structures, feature extraction, layer interpretation, change concept definition, parameter tuning and performance. We provide qualitative and quantitative comparison results using principally a publicly available change detection database which contains aerial image pairs and Ground Truth change masks. We conclude that the discussed models are competitive against alternative state-of-the-art solutions if one uses them as pre-processing filters in multitemporal optical image analysis. In addition, together they cover a large range of applications, considering the different usage options of the three approaches.
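
    Quantitative tests against Ground Truth change masks typically reduce to pixel-level detection statistics. A minimal sketch of precision, recall and F-measure for binary change masks is given below; the random masks are placeholders, not the benchmark data used in the paper.

```python
import numpy as np

def change_detection_scores(pred, truth):
    """Pixel-level precision, recall and F-measure for binary change masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    tp = np.sum(pred & truth)
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Illustrative masks (not from the benchmark database)
rng = np.random.default_rng(3)
truth = rng.random((256, 256)) < 0.1
pred = truth ^ (rng.random((256, 256)) < 0.02)      # truth with 2% of pixels flipped
print("precision=%.3f recall=%.3f F1=%.3f" % change_detection_scores(pred, truth))
```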

  18. Impact of different meander sizes on the RF transmit performance and coupling of microstrip line elements at 7 T.

    PubMed

    Rietsch, Stefan H G; Quick, Harald H; Orzada, Stephan

    2015-08-01

    In this work, the transmit performance and inter-element coupling characteristics of radio frequency (RF) microstrip line antenna elements are examined in simulations and measurements. The starting point of the simulations is a microstrip line element loaded with a phantom. Meander structures are then introduced at the end of the element. The size of the meanders is increased in fixed steps and the magnetic field is optimized. In further simulations, the coupling between identical elements is evaluated for different element spacings and loading conditions. Verification of the simulation results is accomplished by measuring the coupling between two identical elements for four different meander sizes. Image acquisition on a 7 T magnetic resonance imaging (MRI) system provides qualitative and quantitative comparisons to confirm the simulation results. The simulations indicate an optimal range of meander sizes with respect to coupling in all chosen geometric setups. Coupling measurement results are in good agreement with the simulations. Qualitative and quantitative comparisons of the acquired MRI images substantiate the coupling results. The coupling between coil elements in RF antenna arrays consisting of the investigated element types can be optimized under consideration of the central magnetic field strength or efficiency, depending on the desired application.

  19. Comparison of Pancreas Juice Proteins from Cancer Versus Pancreatitis Using Quantitative Proteomic Analysis

    PubMed Central

    Chen, Ru; Pan, Sheng; Cooke, Kelly; Moyes, Kara White; Bronner, Mary P.; Goodlett, David R.; Aebersold, Ruedi; Brentnall, Teresa A.

    2008-01-01

    Objectives: Pancreatitis is an inflammatory condition of the pancreas. However, it often shares many molecular features with pancreatic cancer. Biomarkers present in pancreatic cancer frequently occur in the setting of pancreatitis. The efforts to develop diagnostic biomarkers for pancreatic cancer have thus been complicated by the false-positive involvement of pancreatitis. Methods: In an attempt to develop protein biomarkers for pancreatic cancer, we previously used quantitative proteomics to identify and quantify the proteins from pancreatic cancer juice. Pancreatic juice is a rich source of proteins that are shed by the pancreatic ductal cells. In this study, we used a similar approach to identify and quantify proteins from pancreatitis juice. Results: In total, 72 proteins were identified and quantified in the comparison of pancreatic juice from pancreatitis patients versus pooled normal control juice. Nineteen of the juice proteins were overexpressed, and 8 were underexpressed in pancreatitis juice by at least 2-fold compared with normal pancreatic juice. Of these 27 differentially expressed proteins in pancreatitis, 9 proteins were also differentially expressed in the pancreatic juice from pancreatic cancer patients. Conclusions: Identification of these differentially expressed proteins from pancreatitis juice provides useful information for future study of specific pancreatitis-associated proteins and for eliminating potential false-positive biomarkers for pancreatic cancer. PMID:17198186

  20. Comparison of pancreas juice proteins from cancer versus pancreatitis using quantitative proteomic analysis.

    PubMed

    Chen, Ru; Pan, Sheng; Cooke, Kelly; Moyes, Kara White; Bronner, Mary P; Goodlett, David R; Aebersold, Ruedi; Brentnall, Teresa A

    2007-01-01

    Pancreatitis is an inflammatory condition of the pancreas. However, it often shares many molecular features with pancreatic cancer. Biomarkers present in pancreatic cancer frequently occur in the setting of pancreatitis. The efforts to develop diagnostic biomarkers for pancreatic cancer have thus been complicated by the false-positive involvement of pancreatitis. In an attempt to develop protein biomarkers for pancreatic cancer, we previously used quantitative proteomics to identify and quantify the proteins from pancreatic cancer juice. Pancreatic juice is a rich source of proteins that are shed by the pancreatic ductal cells. In this study, we used a similar approach to identify and quantify proteins from pancreatitis juice. In total, 72 proteins were identified and quantified in the comparison of pancreatic juice from pancreatitis patients versus pooled normal control juice. Nineteen of the juice proteins were overexpressed, and 8 were underexpressed in pancreatitis juice by at least 2-fold compared with normal pancreatic juice. Of these 27 differentially expressed proteins in pancreatitis, 9 proteins were also differentially expressed in the pancreatic juice from pancreatic cancer patients. Identification of these differentially expressed proteins from pancreatitis juice provides useful information for future study of specific pancreatitis-associated proteins and for eliminating potential false-positive biomarkers for pancreatic cancer.

  1. Inter-model comparison of the landscape determinants of vector-borne disease: implications for epidemiological and entomological risk modeling.

    PubMed

    Lorenz, Alyson; Dhingra, Radhika; Chang, Howard H; Bisanzio, Donal; Liu, Yang; Remais, Justin V

    2014-01-01

    Extrapolating landscape regression models for use in assessing vector-borne disease risk and other applications requires thoughtful evaluation of fundamental model choice issues. To examine implications of such choices, an analysis was conducted to explore the extent to which disparate landscape models agree in their epidemiological and entomological risk predictions when extrapolated to new regions. Agreement between six literature-drawn landscape models was examined by comparing predicted county-level distributions of either Lyme disease or Ixodes scapularis vector using Spearman ranked correlation. AUC analyses and multinomial logistic regression were used to assess the ability of these extrapolated landscape models to predict observed national data. Three models based on measures of vegetation, habitat patch characteristics, and herbaceous landcover emerged as effective predictors of observed disease and vector distribution. An ensemble model containing these three models improved precision and predictive ability over individual models. A priori assessment of qualitative model characteristics effectively identified models that subsequently emerged as better predictors in quantitative analysis. Both a methodology for quantitative model comparison and a checklist for qualitative assessment of candidate models for extrapolation are provided; both tools aim to improve collaboration between those producing models and those interested in applying them to new areas and research questions.
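
    The two quantitative steps described (ranked-correlation agreement between extrapolated models, and AUC against observed distributions) can be sketched as follows; the county-level risk scores, the synthetic "observations" and the simple averaging ensemble are illustrative assumptions, not the six published models.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n_counties = 500

# Hypothetical county-level risk scores from two extrapolated landscape models
risk_a = rng.random(n_counties)
risk_b = np.clip(risk_a + 0.2 * rng.standard_normal(n_counties), 0, 1)

# Between-model agreement: Spearman ranked correlation of predicted distributions
rho, p = spearmanr(risk_a, risk_b)
print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")

# Predictive ability against synthetic presence/absence "observations"
observed = (rng.random(n_counties) < risk_a).astype(int)
ensemble = (risk_a + risk_b) / 2
for name, score in [("model A", risk_a), ("model B", risk_b), ("ensemble", ensemble)]:
    print(f"{name}: AUC = {roc_auc_score(observed, score):.2f}")
```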

  2. Quantitative proteomic profiling of paired cancerous and normal colon epithelial cells isolated freshly from colorectal cancer patients.

    PubMed

    Tu, Chengjian; Mojica, Wilfrido; Straubinger, Robert M; Li, Jun; Shen, Shichen; Qu, Miao; Nie, Lei; Roberts, Rick; An, Bo; Qu, Jun

    2017-05-01

    The heterogeneous structure in tumor tissues from colorectal cancer (CRC) patients excludes an informative comparison between tumors and adjacent normal tissues. Here, we develop and apply a strategy to compare paired cancerous (CEC) versus normal (NEC) epithelial cells enriched from patients and discover potential biomarkers and therapeutic targets for CRC. CEC and NEC cells are respectively isolated from five different tumor and normal locations in the resected colon tissue from each patient (N = 12 patients) using an optimized epithelial cell adhesion molecule (EpCAM)-based enrichment approach. An ion current-based quantitative method is employed to perform comparative proteomic analysis for each patient. A total of 458 altered proteins that are common among >75% of patients are observed and selected for further investigation. Besides known findings such as deregulation of mitochondrial function, tricarboxylic acid cycle, and RNA post-transcriptional modification, functional analysis further revealed RAN signaling pathway, small nucleolar ribonucleoproteins (snoRNPs), and infection by RNA viruses are altered in CEC cells. A selection of the altered proteins of interest is validated by immunohistochemistry analyses. The informative comparison between matched CEC and NEC enhances our understanding of molecular mechanisms of CRC development and provides biomarker candidates and new pathways for therapeutic intervention. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics

    NASA Astrophysics Data System (ADS)

    El Koussaifi, R.; Tikan, A.; Toffoli, A.; Randoux, S.; Suret, P.; Onorato, M.

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.
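
    The input condition common to both experiments (a JONSWAP spectrum with random Fourier phases) can be emulated numerically. The sketch below uses the standard JONSWAP form with peak-enhancement factor gamma = 3.3 and rescales the spectrum to a chosen significant wave height; the spectral parameters and time step are illustrative, not those of the wave tank or fibre experiments.

```python
import numpy as np

def jonswap(f, hs=0.1, tp=1.0, gamma=3.3):
    """JONSWAP variance density spectrum S(f), rescaled to significant wave height hs."""
    fp = 1.0 / tp
    sigma = np.where(f <= fp, 0.07, 0.09)
    peak = gamma ** np.exp(-((f - fp) ** 2) / (2 * sigma ** 2 * fp ** 2))
    s = f ** -5.0 * np.exp(-1.25 * (fp / f) ** 4) * peak
    m0 = np.sum(s) * (f[1] - f[0])              # zeroth spectral moment
    return s * (hs / (4.0 * np.sqrt(m0))) ** 2

def random_phase_record(f, s, duration, dt=0.05, seed=5):
    """Surface elevation with the prescribed spectrum and uniformly random phases."""
    df = f[1] - f[0]
    amp = np.sqrt(2.0 * s * df)
    phases = np.random.default_rng(seed).uniform(0.0, 2.0 * np.pi, f.size)
    t = np.arange(0.0, duration, dt)
    eta = (amp * np.cos(2.0 * np.pi * f * t[:, None] + phases)).sum(axis=1)
    return t, eta

f = np.linspace(0.05, 3.0, 600)                 # Hz
t, eta = random_phase_record(f, jonswap(f), duration=300.0)
print(f"Hs from the record ~ {4.0 * eta.std():.3f} m (target 0.100 m)")
```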

  4. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics.

    PubMed

    El Koussaifi, R; Tikan, A; Toffoli, A; Randoux, S; Suret, P; Onorato, M

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.

  5. Engineering Digestion: Multiscale Processes of Food Digestion.

    PubMed

    Bornhorst, Gail M; Gouseti, Ourania; Wickham, Martin S J; Bakalis, Serafim

    2016-03-01

    Food digestion is a complex, multiscale process that has recently become of interest to the food industry due to the developing links between food and health or disease. Food digestion can be studied by using either in vitro or in vivo models, each having certain advantages or disadvantages. The recent interest in food digestion has resulted in a large number of studies in this area, yet few have provided an in-depth, quantitative description of digestion processes. To provide a framework to develop these quantitative comparisons, a summary is given here relating digestion processes to parallel unit operations in the food and chemical industry. Characterization parameters and phenomena are suggested for each step of digestion. In addition to the quantitative characterization of digestion processes, the multiscale aspect of digestion must also be considered. In both food systems and the gastrointestinal tract, multiple length scales are involved in food breakdown, mixing, and absorption. These different length scales influence digestion processes independently as well as through interrelated mechanisms. To facilitate optimized development of functional food products, a multiscale, engineering approach may be taken to describe food digestion processes. A framework for this approach is described in this review, together with examples that demonstrate the importance of process characterization and of the multiple, interrelated length scales in the digestion process. © 2016 Institute of Food Technologists®

  6. High throughput and quantitative approaches for measuring circadian rhythms in cyanobacteria using bioluminescence

    PubMed Central

    Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.

    2016-01-01

    The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451
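
    Once a bioluminescence time series has been extracted (from either the TopCount or the turntable images), period and amplitude can be estimated by fitting a cosine model. The sketch below does this for a synthetic trace; the sampling interval, trace and initial guesses are illustrative assumptions, and real traces are usually detrended before fitting.

```python
import numpy as np
from scipy.optimize import curve_fit

def cosine_model(t, mesor, amplitude, period, phase):
    return mesor + amplitude * np.cos(2 * np.pi * t / period + phase)

# Synthetic bioluminescence trace: ~24.6 h rhythm sampled every 2 h for 5 days
rng = np.random.default_rng(6)
t = np.arange(0, 120, 2.0)                               # hours
counts = 500 + 200 * np.cos(2 * np.pi * t / 24.6 + 0.8) + 20 * rng.standard_normal(t.size)

p0 = [counts.mean(), counts.std(), 24.0, 0.0]            # initial guesses
(mesor, amp, period, phase), _ = curve_fit(cosine_model, t, counts, p0=p0)
print(f"estimated period = {period:.2f} h, amplitude = {abs(amp):.0f} counts")
```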

  7. Advanced forensic validation for human spermatozoa identification using SPERM HY-LITER™ Express with quantitative image analysis.

    PubMed

    Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko

    2017-07-01

    Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful for detecting human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, because of a lack of evaluation methods, previous studies did not provide objective or quantitative descriptions of the results, nor clear criteria for the decisions reached. In addition, the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. Use of this method enabled objective and specific identification of fluorescent sperm spots and quantitative comparisons of sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated under high humidity and high temperature conditions. Semen with very low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases for application of the kit. Finally, the tolerance of the kit to various acidic and basic environments was analyzed. The validations herein provide useful information, previously unobtainable, for practical applications of the SPERM HY-LITER™ Express kit. Moreover, the versatility of our image analysis method toward various complex cases was demonstrated.
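
    The core of an objective spot-identification step can be as simple as thresholding and connected-component labelling. The sketch below illustrates the idea on a synthetic fluorescence image; the threshold rule, minimum spot area and image are assumptions for illustration, not the authors' established image analysis method.

```python
import numpy as np
from scipy import ndimage

def count_fluorescent_spots(image, k=5.0, min_area=4):
    """Count bright spots: threshold at mean + k*std, then label connected regions."""
    threshold = image.mean() + k * image.std()
    binary = image > threshold
    labels, n = ndimage.label(binary)
    sizes = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    return int(np.sum(sizes >= min_area))

# Synthetic field of view with 12 bright "sperm head" spots on a noisy background
rng = np.random.default_rng(7)
img = 10 + 2 * rng.standard_normal((512, 512))
for _ in range(12):
    y, x = rng.integers(20, 492, size=2)
    img[y - 2:y + 3, x - 2:x + 3] += 100.0
print(f"detected spots: {count_fluorescent_spots(img)}")
```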

  8. Methodology for the evaluation of the Stephanie Alexander Kitchen Garden program.

    PubMed

    Gibbs, L; Staiger, P K; Townsend, M; Macfarlane, S; Gold, L; Block, K; Johnson, B; Kulas, J; Waters, E

    2013-04-01

    Community and school cooking and gardening programs have recently increased internationally. However, despite promising indications, there is limited evidence of their effectiveness. This paper presents the evaluation framework and methods negotiated and developed to meet the information needs of all stakeholders for the Stephanie Alexander Kitchen Garden (SAKG) program, a combined cooking and gardening program implemented in selectively funded primary schools across Australia. The evaluation used multiple aligned theoretical frameworks and models, including a public health ecological approach, principles of effective health promotion and models of experiential learning. The evaluation is a non-randomised comparison of six schools receiving the program (intervention) and six comparison schools (all government-funded primary schools) in urban and rural areas of Victoria, Australia. A mixed-methods approach was used, relying on qualitative measures to understand changes in school cultures and the experiential impacts on children, families, teachers, parents and volunteers, and quantitative measures at baseline and 1 year follow up to provide supporting information regarding patterns of change. The evaluation study design addressed the limitations of many existing evaluation studies of cooking or garden programs. The multistrand approach to the mixed methodology maintained the rigour of the respective methods and provided an opportunity to explore complexity in the findings. Limited sensitivity of some of the quantitative measures was identified, as well as the potential for bias in the coding of the open-ended questions. The SAKG evaluation methodology will address the need for appropriate evaluation approaches for school-based kitchen garden programs. It demonstrates the feasibility of a meaningful, comprehensive evaluation of school-based programs and also demonstrates the central role qualitative methods can have in a mixed-method evaluation. So what? This paper contributes to debate about appropriate evaluation approaches to meet the information needs of all stakeholders and will support the sharing of measures and potential comparisons between program outcomes for comparable population groups and settings.

  9. Dispersal kernel estimation: A comparison of empirical and modelled particle dispersion in a coastal marine system

    NASA Astrophysics Data System (ADS)

    Hrycik, Janelle M.; Chassé, Joël; Ruddick, Barry R.; Taggart, Christopher T.

    2013-11-01

    Early life-stage dispersal influences recruitment and is of significance in explaining the distribution and connectivity of marine species. Motivations for quantifying dispersal range from biodiversity conservation to the design of marine reserves and the mitigation of species invasions. Here we compare estimates of real particle dispersion in a coastal marine environment with similar estimates provided by hydrodynamic modelling. We do so by using a system of magnetically attractive particles (MAPs) and a magnetic-collector array that provides measures of Lagrangian dispersion based on the time-integration of MAPs dispersing through the array. MAPs released as a point source in a coastal marine location dispersed through the collector array over a 5-7 d period. A virtual release and observed (real-time) environmental conditions were used in a high-resolution three-dimensional hydrodynamic model to estimate the dispersal of virtual particles (VPs). The number of MAPs captured throughout the collector array and the number of VPs that passed through each corresponding model location were enumerated and compared. Although VP dispersal reflected several aspects of the observed MAP dispersal, the comparisons demonstrated model sensitivity to the small-scale (random-walk) particle diffusivity parameter (Kp). The one-dimensional dispersal kernel for the MAPs had an e-folding scale estimate in the range of 5.19-11.44 km, while those from the model simulations were comparable at 1.89-6.52 km, and also demonstrated sensitivity to Kp. Variations among comparisons are related to the value of Kp used in modelling and are postulated to be related to MAP losses from the water column and (or) shear dispersion acting on the MAPs; a process that is constrained in the model. Our demonstration indicates a promising new way of 1) quantitatively and empirically estimating the dispersal kernel in aquatic systems, and 2) quantitatively assessing and (or) improving regional hydrodynamic models.
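
    An e-folding scale of the kind reported can be obtained by fitting a one-dimensional exponential kernel to capture counts as a function of distance from the release point. The collector distances and counts below are hypothetical, purely to show the fitting step.

```python
import numpy as np
from scipy.optimize import curve_fit

def exponential_kernel(r, n0, L):
    """One-dimensional exponential dispersal kernel with e-folding scale L (km)."""
    return n0 * np.exp(-r / L)

# Hypothetical collector transect: distance from release point (km) vs. MAPs captured
distance = np.array([0.5, 1.5, 3.0, 5.0, 7.5, 10.0, 15.0, 20.0])
captured = np.array([950., 820., 610., 430., 260., 170., 60., 25.])

(n0, L), cov = curve_fit(exponential_kernel, distance, captured, p0=[1000.0, 5.0])
L_err = np.sqrt(np.diag(cov))[1]
print(f"e-folding scale = {L:.2f} +/- {L_err:.2f} km")
```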

  10. Analysis of longitudinal "time series" data in toxicology.

    PubMed

    Cox, C; Cory-Slechta, D A

    1987-02-01

    Studies focusing on chronic toxicity or on the time course of toxicant effect often involve repeated measurements or longitudinal observations of endpoints of interest. Experimental design considerations frequently necessitate between-group comparisons of the resulting trends. Typically, procedures such as the repeated-measures analysis of variance have been used for statistical analysis, even though the required assumptions may not be satisfied in some circumstances. This paper describes an alternative analytical approach which summarizes curvilinear trends by fitting cubic orthogonal polynomials to individual profiles of effect. The resulting regression coefficients serve as quantitative descriptors which can be subjected to group significance testing. Randomization tests based on medians are proposed to provide a comparison of treatment and control groups. Examples from the behavioral toxicology literature are considered, and the results are compared to more traditional approaches, such as repeated-measures analysis of variance.
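
    A minimal sketch of the proposed approach follows: each subject's longitudinal profile is projected onto an orthogonal (here cubic) polynomial basis, and a coefficient of interest is compared between groups with a randomization test on medians. The synthetic control and treated profiles are placeholders, and the test statistic shown (absolute difference of group medians) is one reasonable choice rather than the paper's exact procedure.

```python
import numpy as np

def orthogonal_poly_basis(t, degree=3):
    """Orthonormal polynomial basis (constant..cubic) over the observation times."""
    V = np.vander(t, degree + 1, increasing=True).astype(float)
    Q, _ = np.linalg.qr(V)
    return Q                                   # columns: degree 0..degree

def polynomial_scores(profiles, t, degree=3):
    """Project each subject's longitudinal profile onto the orthogonal basis."""
    Q = orthogonal_poly_basis(t, degree)
    return profiles @ Q                        # (n_subjects, degree+1) coefficients

def median_randomization_test(a, b, n_perm=10000, seed=0):
    """Two-group randomization test on the absolute difference of medians."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([a, b])
    observed = abs(np.median(a) - np.median(b))
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(np.median(pooled[:len(a)]) - np.median(pooled[len(a):]))
        count += diff >= observed
    return (count + 1) / (n_perm + 1)

# Synthetic example: 10 control and 10 treated animals observed at 8 sessions
rng = np.random.default_rng(8)
t = np.arange(8, dtype=float)
control = 50 + 2.0 * t[None, :] + rng.normal(0, 3, (10, 8))
treated = 50 + 0.8 * t[None, :] + rng.normal(0, 3, (10, 8))   # blunted trend

scores_c = polynomial_scores(control, t)
scores_t = polynomial_scores(treated, t)
p_linear = median_randomization_test(scores_c[:, 1], scores_t[:, 1])
print(f"randomization p-value for the linear trend coefficient: {p_linear:.4f}")
```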

  11. Cone Beam CT vs. Fan Beam CT: A Comparison of Image Quality and Dose Delivered Between Two Differing CT Imaging Modalities.

    PubMed

    Lechuga, Lawrence; Weidlich, Georg A

    2016-09-12

    A comparison of image quality and dose delivered between two differing computed tomography (CT) imaging modalities-fan beam and cone beam-was performed. A literature review of quantitative analyses for various image quality aspects such as uniformity, signal-to-noise ratio, artifact presence, spatial resolution, modulation transfer function (MTF), and low contrast resolution was generated. With these aspects quantified, cone beam computed tomography (CBCT) shows a superior spatial resolution to that of fan beam, while fan beam shows a greater ability to produce clear and anatomically correct images with better soft tissue differentiation. The results indicate that fan beam CT produces images superior to those of on-board imaging (OBI) cone beam CT systems, while delivering a considerably lower dose to the patient.

  12. Cone Beam CT vs. Fan Beam CT: A Comparison of Image Quality and Dose Delivered Between Two Differing CT Imaging Modalities

    PubMed Central

    Weidlich, Georg A.

    2016-01-01

    A comparison of image quality and dose delivered between two differing computed tomography (CT) imaging modalities—fan beam and cone beam—was performed. A literature review of quantitative analyses for various image quality aspects such as uniformity, signal-to-noise ratio, artifact presence, spatial resolution, modulation transfer function (MTF), and low contrast resolution was generated. With these aspects quantified, cone beam computed tomography (CBCT) shows a superior spatial resolution to that of fan beam, while fan beam shows a greater ability to produce clear and anatomically correct images with better soft tissue differentiation. The results indicate that fan beam CT produces images superior to those of on-board imaging (OBI) cone beam CT systems, while delivering a considerably lower dose to the patient. PMID:27752404

  13. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing

    PubMed Central

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-01-01

    Aims A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
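
    The complementary statistics described can be sketched as follows for two paired sets of quantitative assay values: Bland-Altman bias and limits of agreement, plus an ordinary least squares line whose intercept and slope indicate constant and proportional error. The variant-allele-fraction data below are simulated; Deming regression, which allows error in both assays, would require an errors-in-variables routine (e.g., scipy.odr) and is not shown.

```python
import numpy as np
from scipy.stats import linregress

def bland_altman(x, y):
    """Mean bias and 95% limits of agreement between two quantitative assays."""
    diff = y - x
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, bias - loa, bias + loa

# Hypothetical variant allele fractions (%) from a reference assay vs. an NGS assay
rng = np.random.default_rng(9)
reference = rng.uniform(1, 50, 40)
ngs = 0.5 + 1.05 * reference + rng.normal(0, 1.0, 40)   # constant + proportional error

bias, lo, hi = bland_altman(reference, ngs)
fit = linregress(reference, ngs)
print(f"Bland-Altman bias = {bias:.2f}%, limits of agreement [{lo:.2f}, {hi:.2f}]%")
print(f"regression: slope = {fit.slope:.3f} (proportional error), "
      f"intercept = {fit.intercept:.3f} (constant error), R^2 = {fit.rvalue**2:.3f}")
```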

  14. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, while for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fire ball combustion and unconfined vapor cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data actually available for the gas pipelines and the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  15. Comparative study of the dynamics of lipid membrane phase decomposition in experiment and simulation.

    PubMed

    Burger, Stefan; Fraunholz, Thomas; Leirer, Christian; Hoppe, Ronald H W; Wixforth, Achim; Peter, Malte A; Franke, Thomas

    2013-06-25

    Phase decomposition in lipid membranes has been the subject of numerous investigations by both experiment and theoretical simulation, yet quantitative comparisons of the simulated data to the experimental results are rare. In this work, we present a novel way of comparing the temporal development of liquid-ordered domains obtained from numerically solving the Cahn-Hilliard equation and by inducing a phase transition in giant unilamellar vesicles (GUVs). Quantitative comparison is done by calculating the structure factor of the domain pattern. It turns out that the decomposition takes place in three distinct regimes in both experiment and simulation. These regimes are characterized by different rates of growth of the mean domain diameter, and there is quantitative agreement between experiment and simulation as to the duration of each regime and the absolute rate of growth in each regime.
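
    The structure factor used for the comparison can be estimated as the radially averaged power spectrum of the (binarized) domain pattern. The sketch below does this for a synthetic two-phase image; the smoothing kernel and binning are illustrative choices, not the parameters used in the study.

```python
import numpy as np
from scipy.signal import fftconvolve

def radial_structure_factor(pattern, n_bins=64):
    """Radially averaged structure factor S(q) of a 2-D domain pattern."""
    field = pattern - pattern.mean()
    power = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
    ny, nx = pattern.shape
    qy, qx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(ny)),
                         np.fft.fftshift(np.fft.fftfreq(nx)), indexing="ij")
    q = np.hypot(qx, qy).ravel()
    p = power.ravel()
    edges = np.linspace(0.0, q.max(), n_bins + 1)
    idx = np.digitize(q, edges)
    s_q = np.array([p[idx == i].mean() if np.any(idx == i) else 0.0
                    for i in range(1, n_bins + 1)])
    return 0.5 * (edges[:-1] + edges[1:]), s_q

# Synthetic two-phase "domain pattern": a smoothed random field thresholded at zero
rng = np.random.default_rng(10)
gx, gy = np.meshgrid(np.arange(-8, 9), np.arange(-8, 9))
kernel = np.exp(-0.5 * (np.hypot(gx, gy) / 3.0) ** 2)
domains = (fftconvolve(rng.standard_normal((256, 256)), kernel, mode="same") > 0).astype(float)

q, s_q = radial_structure_factor(domains)
q_star = q[1 + np.argmax(s_q[1:])]               # skip the q ~ 0 bin
print(f"characteristic domain length ~ {1.0 / q_star:.1f} pixels")
```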

  16. Comparison of 13C Nuclear Magnetic Resonance and Fourier Transform Infrared spectroscopy for estimating humification and aromatization of soil organic matter

    NASA Astrophysics Data System (ADS)

    Rogers, K.; Cooper, W. T.; Hodgkins, S. B.; Verbeke, B. A.; Chanton, J.

    2017-12-01

    Solid state direct polarization 13C NMR spectroscopy (DP-NMR) is generally considered the most quantitatively reliable method for soil organic matter (SOM) characterization, including determination of the relative abundances of carbon functional groups. These functional abundances can then be used to calculate important soil parameters such as degree of humification and extent of aromaticity that reveal differences in reactivity or compositional changes along gradients (e.g. thaw chronosequence in permafrost). Unfortunately, the DP-NMR experiment is time-consuming, with a single sample often requiring over 24 hours of instrument time. Alternatively, solid state cross polarization 13C NMR (CP-NMR) can circumvent this problem, reducing analysis times to 4-6 hours but with some loss of quantitative reliability. Attenuated Total Reflectance Fourier Transform Infrared spectroscopy (ATR-FTIR) is a quick and relatively inexpensive method for characterizing solid materials, and has been suggested as an alternative to NMR for analysis of soil organic matter and determination of humification (HI) and aromatization (AI) indices. However, the quantitative reliability of ATR-FTIR for SOM analyses has never been verified, nor have any ATR-FTIR data been compared to similar measurements by NMR. In this work we focused on FTIR vibrational bands that correspond to the three functional groups used to calculate HI and AI values: carbohydrates (1030 cm-1), aromatics (1510, 1630 cm-1), and aliphatics (2850, 2920 cm-1). Data from ATR-FTIR measurements were compared to analogous quantitation by DP- and CP-NMR using peat samples from Sweden, Minnesota, and North Carolina. DP- and CP-NMR correlate very strongly, although the correlations are not always 1:1. Direct comparison of relative abundances of the three functional groups determined by NMR and ATR-FTIR yielded satisfactory results for carbohydrates (r2 = 0.78) and aliphatics (r2 = 0.58), but less so for aromatics (r2 = 0.395). ATR-FTIR has to this point been used primarily for relative abundance analyses (e.g. calculating HI and AI values), but these results suggest FTIR can provide quantitative reliability that approaches that of NMR.
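
    A minimal sketch of the kind of comparison reported above: correlating FTIR-derived and NMR-derived relative abundances of one functional group across samples and reporting r2. The abundance values below are placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# hypothetical relative abundances (%) of carbohydrates in the same peat samples
ftir_carb = np.array([42.0, 38.5, 35.1, 30.2, 27.8])
nmr_carb = np.array([45.3, 40.1, 36.8, 31.0, 29.5])

fit = stats.linregress(nmr_carb, ftir_carb)
print(f"slope={fit.slope:.2f}, intercept={fit.intercept:.2f}, r2={fit.rvalue**2:.2f}")
# an r2 (and slope) close to 1 indicates FTIR tracks the NMR quantitation
```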

  17. Radar QPE for hydrological design: Intensity-Duration-Frequency curves

    NASA Astrophysics Data System (ADS)

    Marra, Francesco; Morin, Efrat

    2015-04-01

    Intensity-duration-frequency (IDF) curves are widely used in flood risk management since they provide an easy link between the characteristics of a rainfall event and the probability of its occurrence. They are estimated by analyzing the extreme values of rainfall records, usually based on rain gauge data. This point-based approach raises two issues: first, hydrological design applications generally need IDF information for the entire catchment rather than at a single point; second, the representativeness of point measurements decreases with distance from the measurement location, especially in regions characterized by steep climatological gradients. Weather radar, providing high resolution distributed rainfall estimates over wide areas, has the potential to overcome these issues. Two objections usually restrain this approach: (i) the short length of data records and (ii) the reliability of quantitative precipitation estimation (QPE) of the extremes. This work explores the potential use of weather radar estimates for the identification of IDF curves by means of a long radar archive and a combined physical and quantitative adjustment of radar estimates. The Shacham weather radar, located in the eastern Mediterranean area (Tel Aviv, Israel), has archived data since 1990, providing rainfall estimates for 23 years over a region characterized by strong climatological gradients. Radar QPE is obtained by correcting for the effects of pointing errors, ground echoes, beam blockage, attenuation and vertical variations of reflectivity. Quantitative accuracy is then ensured with a range-dependent bias adjustment technique, and the reliability of radar QPE is assessed by comparison with gauge measurements. IDF curves are derived from the radar data using the annual extremes method and compared with gauge-based curves. Results from 14 study cases will be presented, focusing on the effects of record length and QPE accuracy, exploring the potential application of radar IDF curves for ungauged locations and providing insights on the use of radar QPE for hydrological design studies.
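
    A minimal sketch of the annual-extremes approach mentioned above: fit an extreme-value distribution to the annual maximum rainfall depths for one duration and convert the quantiles into intensities. The durations, return periods, example maxima and the choice of a GEV distribution are illustrative assumptions, not specifics of the study.

```python
import numpy as np
from scipy import stats

def idf_intensity(annual_max_depth_mm, duration_h, return_periods_yr):
    """Fit a GEV to annual-maximum depths for one duration and return
    intensities (mm/h) at the requested return periods."""
    shape, loc, scale = stats.genextreme.fit(annual_max_depth_mm)
    depth = stats.genextreme.isf(1.0 / np.asarray(return_periods_yr, float),
                                 shape, loc=loc, scale=scale)
    return depth / duration_h

# e.g. 23 annual maxima (mm) of 1-hour rainfall for one radar pixel (values illustrative)
maxima = np.array([18, 25, 31, 22, 40, 28, 35, 19, 45, 30, 27, 33,
                   24, 38, 29, 21, 36, 26, 41, 23, 32, 37, 20])
print(idf_intensity(maxima, duration_h=1.0, return_periods_yr=[2, 10, 25, 50]))
```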

  18. Inter- and intra-observer agreement of BI-RADS-based subjective visual estimation of amount of fibroglandular breast tissue with magnetic resonance imaging: comparison to automated quantitative assessment.

    PubMed

    Wengert, G J; Helbich, T H; Woitek, R; Kapetas, P; Clauser, P; Baltzer, P A; Vogl, W-D; Weber, M; Meyer-Baese, A; Pinker, Katja

    2016-11-01

    To evaluate the inter-/intra-observer agreement of BI-RADS-based subjective visual estimation of the amount of fibroglandular tissue (FGT) with magnetic resonance imaging (MRI), and to investigate whether FGT assessment benefits from an automated, observer-independent, quantitative MRI measurement by comparing both approaches. Eighty women with no imaging abnormalities (BI-RADS 1 and 2) were included in this institutional review board (IRB)-approved prospective study. All women underwent unenhanced breast MRI. Four radiologists independently assessed FGT with MRI by subjective visual estimation according to BI-RADS. Automated observer-independent quantitative measurement of FGT with MRI was performed using a previously described measurement system. Inter-/intra-observer agreement of qualitative and quantitative FGT measurements was assessed using Cohen's kappa (k). For subjective visual estimation of FGT, inexperienced readers achieved moderate inter-/intra-observer agreement, whereas experienced readers achieved substantial inter-observer and perfect intra-observer agreement. Practice and experience reduced observer dependency. Automated observer-independent quantitative measurement of FGT was successfully performed and revealed only fair to moderate agreement (k = 0.209-0.497) with subjective visual estimations of FGT. Subjective visual estimation of FGT with MRI shows moderate intra-/inter-observer agreement, which can be improved by practice and experience. Automated observer-independent quantitative measurements of FGT are necessary to allow a standardized risk evaluation. • Subjective FGT estimation with MRI shows moderate intra-/inter-observer agreement in inexperienced readers. • Inter-observer agreement can be improved by practice and experience. • Automated observer-independent quantitative measurements can provide reliable and standardized assessment of FGT with MRI.
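
    A minimal sketch of the agreement statistic used above (Cohen's kappa), in Python with scikit-learn; the two readers' BI-RADS FGT categories are made-up placeholder data, not values from the study.

```python
from sklearn.metrics import cohen_kappa_score

# hypothetical BI-RADS FGT categories (a-d) assigned by two readers to the same 10 exams
reader1 = list("aabbccddbc")
reader2 = list("abbbccdcbc")

print(cohen_kappa_score(reader1, reader2))
# conventional interpretation: 0.41-0.60 moderate, 0.61-0.80 substantial, >0.80 (near-)perfect
```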

  19. Single Laboratory Comparison of Quantitative Real-Time PCR Assays for the Detection of Human Fecal Pollution

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) methods available to detect and enumerate human fecal pollution in ambient waters. Each assay employs distinct primers and/or probes and many target different genes and microorganisms leading to potential variations in method ...

  20. Employment from Solar Energy: A Bright but Partly Cloudy Future.

    ERIC Educational Resources Information Center

    Smeltzer, K. K.; Santini, D. J.

    A comparison of quantitative and qualitative employment effects of solar and conventional systems can prove the increased employment postulated as one of the significant secondary benefits of a shift from conventional to solar energy use. Current quantitative employment estimates show solar technology-induced employment to be generally greater…

  1. Using Facebook as a LMS?

    ERIC Educational Resources Information Center

    Arabacioglu, Taner; Akar-Vural, Ruken

    2014-01-01

    The main purpose of this research was to compare the communication media in terms of effective teaching. For this purpose, a mixed-methods approach, including quantitative and qualitative data collection techniques, was applied in the research. For the quantitative part of the research, the static group comparison design was implemented as one of the…

  2. Single Laboratory Comparison of Quantitative Real-time PCR Assays for the Detection of Fecal Pollution

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) assays available to detect and enumerate fecal pollution in ambient waters. Each assay employs distinct primers and probes that target different rRNA genes and microorganisms leading to potential variations in concentration es...

  3. Comparison of genetic diversity and population structure of Pacific Coast whitebark pine across multiple markers

    Treesearch

    Andrew D. Bower; Bryce A. Richardson; Valerie Hipkins; Regina Rochefort; Carol Aubry

    2011-01-01

    Analysis of "neutral" molecular markers and "adaptive" quantitative traits are common methods of assessing genetic diversity and population structure. Molecular markers typically reflect the effects of demographic and stochastic processes but are generally assumed to not reflect natural selection. Conversely, quantitative (or "adaptive")...

  4. Geological characterization and statistical comparison of outcrop and subsurface facies: Shannon Shelf sand ridges

    NASA Astrophysics Data System (ADS)

    Jackson, S.; Szpaklewicz, M.; Tomutsa, L.

    1987-09-01

    The primary objective of this research is to develop a methodology for constructing accurate quantitative models of reservoir heterogeneities. The resulting models are expected to improve predictions of flow patterns, spatial distribution of residual oil after secondary and tertiary recovery operations, and ultimate oil recovery. The purpose of this study is to provide preliminary evaluation of the usefulness of outcrop information in characterizing analogous reservoirs and to develop research techniques necessary for model development. The Shannon Sandstone, a shelf sand ridge deposit in the Powder River Basin, Wyoming, was studied. Sedimentologic and petrophysical features of an outcrop exposure of the High-Energy Ridge-Margin facies (HERM) within the Shannon were compared with those from a Shannon sandstone reservoir in Teapot Dome field. Comparisons of outcrop and subsurface permeability and porosity histograms, cumulative distribution functions, correlation lengths and natural logarithm of permeability versus porosity plots indicate a strong similarity between Shannon outcrop and Teapot Dome HERM facies petrophysical properties. Permeability classes found in outcrop samples can be related to crossbedded zones and shaley, rippled, and bioturbated zones. Similar permeability classes related to similar sedimentologic features were found in Teapot Dome field. The similarities of outcrop and Teapot Dome petrophysical properties, which are from the same geologic facies but from different depositional episodes, suggest that rocks deposited under similar depositional processes within a given deposystem have similar reservoir properties. The results of the study indicate that the use of quantitative outcrop information in characterizing reservoirs may provide a significant improvement in reservoir characterization.
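
    The kind of quantitative outcrop-versus-subsurface comparison described above can be sketched as follows; a two-sample Kolmogorov-Smirnov test stands in here for the histogram/CDF comparison, and linear fits give the ln(permeability) versus porosity trends. All values are illustrative placeholders, not Shannon data.

```python
import numpy as np
from scipy import stats

# hypothetical core-plug measurements (permeability in mD, porosity as a fraction)
k_outcrop = np.array([120, 85, 240, 60, 150], float)
phi_outcrop = np.array([0.21, 0.18, 0.24, 0.16, 0.22])
k_subsurf = np.array([100, 95, 210, 70, 170], float)
phi_subsurf = np.array([0.20, 0.19, 0.23, 0.17, 0.22])

# distribution similarity of permeability between outcrop and subsurface samples
ks = stats.ks_2samp(k_outcrop, k_subsurf)

# ln(k) versus porosity trend, one fit per data set
fit_out = stats.linregress(phi_outcrop, np.log(k_outcrop))
fit_sub = stats.linregress(phi_subsurf, np.log(k_subsurf))
print(ks.pvalue, fit_out.slope, fit_sub.slope)
```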

  5. Testing a Preliminary Live with Love Conceptual Framework for cancer couple dyads: A mixed-methods study.

    PubMed

    Li, Qiuping; Xu, Yinghua; Zhou, Huiya; Loke, Alice Yuen

    2015-12-01

    The purpose of this study was to test the previously proposed Preliminary Live with Love Conceptual Framework (P-LLCF), which focuses on spousal caregiver-patient couples in their journey of coping with cancer as dyads. A mixed-methods study that included qualitative and quantitative approaches was conducted. Methods of concept and theory analysis, and structural equation modeling (SEM), were applied in testing the P-LLCF. In the qualitative approach to testing the concepts included in the P-LLCF, a comparison was made between the P-LLCF and a preliminary conceptual framework derived from focus group interviews among Chinese couples coping with cancer. The comparison showed that the concepts identified in the P-LLCF are relevant to the phenomenon under scrutiny, and the attributes of the concepts are consistent with those identified among Chinese cancer couple dyads. In the quantitative study, 117 cancer couples were recruited. The findings showed that inter-relationships exist among the components included in the P-LLCF (event situation, dyadic mediators, dyadic appraisal, dyadic coping, and dyadic outcomes), in that the event situation impacts the dyadic outcomes directly or indirectly through the dyadic mediators. The dyadic mediators, dyadic appraisal, and dyadic coping are interrelated and work together to benefit the dyadic outcomes. This study provides evidence that supports the interlinked components and relationships included in the P-LLCF. The findings of this study are important in that they provide healthcare professionals with guidance and direction, according to the P-LLCF, on how to plan supportive programs for couples coping with cancer. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Evaluation of methods to reduce background using the Python-based ELISA_QC program.

    PubMed

    Webster, Rose P; Cohen, Cinder F; Saeed, Fatima O; Wetzel, Hanna N; Ball, William J; Kirley, Terence L; Norman, Andrew B

    2018-05-01

    Almost all immunological approaches [immunohistochemistry, enzyme-linked immunosorbent assay (ELISA), Western blot] that are used to quantitate specific proteins have had to address high backgrounds due to non-specific reactivity. We report here for the first time a quantitative comparison of methods for reduction of the background of commercial biotinylated antibodies using the Python-based ELISA_QC program. This is demonstrated using a recombinant humanized anti-cocaine monoclonal antibody. Several approaches, such as adjustment of the incubation time and the concentration of blocking agent, as well as the dilution of secondary antibodies, have been explored to address this issue. In this report, systematic comparisons of two different methods, contrasted with other, more traditional methods for addressing this problem, are provided. Addition of heparin (HP) at 1 μg/ml to the wash buffer prior to addition of the secondary biotinylated antibody reduced the elevated background absorbance values (from a mean of 0.313 ± 0.015 to 0.137 ± 0.002). A novel immunodepletion (ID) method also reduced the background (from a mean of 0.331 ± 0.010 to 0.146 ± 0.013). Overall, at each concentration of the ELISA standard curve, the ID method generated results more similar to those obtained with standard lot 1 than did the HP method, as analyzed by the Python-based ELISA_QC program. We conclude that the ID method, while more laborious, provides the best solution to resolve the high background seen with specific lots of biotinylated secondary antibody. Copyright © 2018. Published by Elsevier B.V.

  7. Spatially Regularized Machine Learning for Task and Resting-state fMRI

    PubMed Central

    Song, Xiaomu; Panych, Lawrence P.; Chen, Nan-kuei

    2015-01-01

    Background: Reliable mapping of brain function across sessions and/or subjects in task- and resting-state has been a critical challenge for quantitative fMRI studies although it has been intensively addressed in the past decades. New Method: A spatially regularized support vector machine (SVM) technique was developed for the reliable brain mapping in task- and resting-state. Unlike most existing SVM-based brain mapping techniques, which implement supervised classifications of specific brain functional states or disorders, the proposed method performs a semi-supervised classification for the general brain function mapping where spatial correlation of fMRI is integrated into the SVM learning. The method can adapt to intra- and inter-subject variations induced by fMRI nonstationarity, and identify a true boundary between active and inactive voxels, or between functionally connected and unconnected voxels in a feature space. Results: The method was evaluated using synthetic and experimental data at the individual and group level. Multiple features were evaluated in terms of their contributions to the spatially regularized SVM learning. Reliable mapping results in both task- and resting-state were obtained from individual subjects and at the group level. Comparison with Existing Methods: A comparison study was performed with independent component analysis, general linear model, and correlation analysis methods. Experimental results indicate that the proposed method can provide a better or comparable mapping performance at the individual and group level. Conclusions: The proposed method can provide accurate and reliable mapping of brain function in task- and resting-state, and is applicable to a variety of quantitative fMRI studies. PMID:26470627

  8. Testing process predictions of models of risky choice: a quantitative model comparison approach

    PubMed Central

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
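
    As a rough sketch of the specific heuristic model named above, the function below paraphrases the priority heuristic for choices between two two-outcome gambles with gains; it omits the rounding of aspiration levels to prominent numbers described by Brandstätter et al. (2006) and should be read as an assumption-laden illustration, not the authors' implementation.

```python
def priority_heuristic(g1, g2):
    """Choose between two two-outcome gambles with gains.

    Each gamble is (min_gain, p_min, max_gain), where p_min is the probability
    of the minimum gain. Simplified sketch; aspiration-level rounding omitted.
    """
    aspiration = 0.1 * max(g1[2], g2[2])      # one tenth of the maximum gain
    if abs(g1[0] - g2[0]) >= aspiration:      # reason 1: minimum gains
        return g1 if g1[0] > g2[0] else g2
    if abs(g1[1] - g2[1]) >= 0.1:             # reason 2: probabilities of minimum gains
        return g1 if g1[1] < g2[1] else g2
    return g1 if g1[2] > g2[2] else g2        # reason 3: maximum gains

# usage: priority_heuristic((0, 0.2, 4000), (3000, 1.0, 3000)) prefers the sure 3000
```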

  9. Interpretation of protein quantitation using the Bradford assay: comparison with two calculation models.

    PubMed

    Ku, Hyung-Keun; Lim, Hyuk-Min; Oh, Kyong-Hwa; Yang, Hyo-Jin; Jeong, Ji-Seon; Kim, Sook-Kyung

    2013-03-01

    The Bradford assay is a simple method for protein quantitation, but variation in the results between proteins is a matter of concern. In this study, we compared and normalized quantitative values from two models for protein quantitation, where the residues in the protein that bind to anionic Coomassie Brilliant Blue G-250 comprise either Arg and Lys (Method 1, M1) or Arg, Lys, and His (Method 2, M2). Use of the M2 model yielded much more consistent quantitation values compared with use of the M1 model, which exhibited marked overestimations against protein standards. Copyright © 2012 Elsevier Inc. All rights reserved.
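
    A minimal sketch of the difference between the two calculation models compared above: M1 counts Arg and Lys as dye-binding residues, while M2 additionally counts His. The function and the normalization comment are illustrative assumptions, not the paper's exact formulas.

```python
def dye_binding_residues(sequence, model="M2"):
    """Count residues assumed to bind Coomassie Brilliant Blue G-250.

    M1 counts Arg (R) and Lys (K); M2 also counts His (H). The sequence is
    given in one-letter amino acid code.
    """
    targets = set("RK") if model == "M1" else set("RKH")
    return sum(aa in targets for aa in sequence.upper())

# usage sketch: normalize a Bradford response by the ratio of binding-residue
# counts in the standard protein versus the analyte protein (illustrative only)
print(dye_binding_residues("MKRHLASKRAK", model="M1"),
      dye_binding_residues("MKRHLASKRAK", model="M2"))
```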

  10. Quantitative and Qualitative Differences in Morphological Traits Revealed between Diploid Fragaria Species

    PubMed Central

    SARGENT, DANIEL J.; GEIBEL, M.; HAWKINS, J. A.; WILKINSON, M. J.; BATTEY, N. H.; SIMPSON, D. W.

    2004-01-01

    • Background and Aims The aims of this investigation were to highlight the qualitative and quantitative diversity apparent between nine diploid Fragaria species and produce interspecific populations segregating for a large number of morphological characters suitable for quantitative trait loci analysis. • Methods A qualitative comparison of eight described diploid Fragaria species was performed and measurements were taken of 23 morphological traits from 19 accessions including eight described species and one previously undescribed species. A principal components analysis was performed on 14 mathematically unrelated traits from these accessions, which partitioned the species accessions into distinct morphological groups. Interspecific crosses were performed with accessions of species that displayed significant quantitative divergence and, from these, populations that should segregate for a range of quantitative traits were raised. • Key Results Significant differences between species were observed for all 23 morphological traits quantified and three distinct groups of species accessions were observed after the principal components analysis. Interspecific crosses were performed between these groups, and F2 and backcross populations were raised that should segregate for a range of morphological characters. In addition, the study highlighted a number of distinctive morphological characters in many of the species studied. • Conclusions Diploid Fragaria species are morphologically diverse, yet remain highly interfertile, making the group an ideal model for the study of the genetic basis of phenotypic differences between species through map-based investigation using quantitative trait loci. The segregating interspecific populations raised will be ideal for such investigations and could also provide insights into the nature and extent of genome evolution within this group. PMID:15469944
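
    A minimal sketch of the principal components analysis step described above, applied to a standardized accessions-by-traits matrix with scikit-learn; the random matrix is only a placeholder for the 19 accessions x 14 traits data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# rows = accessions, columns = 14 mathematically unrelated morphological traits
traits = np.random.default_rng(0).normal(size=(19, 14))   # placeholder data

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(traits))
# plotting the first two principal-component scores reveals groupings of accessions
print(scores[:3])
```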

  11. Interdisciplinary research on patient-provider communication: a cross-method comparison.

    PubMed

    Chou, Wen-Ying Sylvia; Han, Paul; Pilsner, Alison; Coa, Kisha; Greenberg, Larrie; Blatt, Benjamin

    2011-01-01

    Patient-provider communication, a key aspect of healthcare delivery, has been assessed through multiple methods for purposes of research, education, and quality control. Common techniques include satisfaction ratings and quantitatively- and qualitatively-oriented direct observations. Identifying the strengths and weaknesses of different approaches is critically important in determining the appropriate assessment method for a specific research or practical goal. Analyzing ten videotaped simulated encounters between medical students and Standardized Patients (SPs), this study compared three existing assessment methods through the same data set. Methods included: (1) dichotomized SP ratings on students' communication skills; (2) Roter Interaction Analysis System (RIAS) analysis; and (3) inductive discourse analysis informed by sociolinguistic theories. The large dichotomous contrast between good and poor ratings in (1) was not evidenced in any of the other methods. Following a discussion of strengths and weaknesses of each approach, we pilot-tested a combined assessment done by coders blinded to results of (1)-(3). This type of integrative approach has the potential of adding a quantifiable dimension to qualitative, discourse-based observations. Subjecting the same data set to separate analytic methods provides an excellent opportunity for methodological comparisons with the goal of informing future assessment of clinical encounters.

  12. Primary production in the Delta: Then and now

    USGS Publications Warehouse

    Cloern, James E.; Robinson, April; Richey, Amy; Grenier, Letitia; Grossinger, Robin; Boyer, Katharyn E.; Burau, Jon; Canuel, Elizabeth A.; DeGeorge, John F.; Drexler, Judith Z.; Enright, Chris; Howe, Emily R.; Kneib, Ronald; Mueller-Solger, Anke; Naiman, Robert J.; Pinckney, James L.; Safran, Samuel M.; Schoellhamer, David H.; Simenstad, Charles A.

    2016-01-01

    To evaluate the role of restoration in the recovery of the Delta ecosystem, we need to have clear targets and performance measures that directly assess ecosystem function. Primary production is a crucial ecosystem process, which directly limits the quality and quantity of food available for secondary consumers such as invertebrates and fish. The Delta has a low rate of primary production, but it is unclear whether this was always the case. Recent analyses from the Historical Ecology Team and Delta Landscapes Project provide quantitative comparisons of the areal extent of 14 habitat types in the modern Delta versus the historical Delta (pre-1850). Here we describe an approach for using these metrics of land use change to: (1) produce the first quantitative estimates of how Delta primary production and the relative contributions from five different producer groups have been altered by large-scale drainage and conversion to agriculture; (2) convert these production estimates into a common currency so the contributions of each producer group reflect their food quality and efficiency of transfer to consumers; and (3) use simple models to discover how tidal exchange between marshes and open water influences primary production and its consumption. Application of this approach could inform Delta management in two ways. First, it would provide a quantitative estimate of how large-scale conversion to agriculture has altered the Delta's capacity to produce food for native biota. Second, it would provide restoration practitioners with a new approach—based on ecosystem function—to evaluate the success of restoration projects and gauge the trajectory of ecological recovery in the Delta region.

  13. Analytical method for the accurate determination of trichothecenes in grains using LC-MS/MS: a comparison between MRM transition and MS3 quantitation.

    PubMed

    Lim, Chee Wei; Tai, Siew Hoon; Lee, Lin Min; Chan, Sheot Harn

    2012-07-01

    The current food crisis demands unambiguous determination of mycotoxin contamination in staple foods to achieve safer food for consumption. This paper describes the first accurate LC-MS/MS method developed to analyze trichothecenes in grains by applying multiple reaction monitoring (MRM) transition and MS(3) quantitation strategies in tandem. The trichothecenes are nivalenol, deoxynivalenol, deoxynivalenol-3-glucoside, fusarenon X, 3-acetyl-deoxynivalenol, 15-acetyldeoxynivalenol, diacetoxyscirpenol, and HT-2 and T-2 toxins. Acetic acid and ammonium acetate were used to convert the analytes into their respective acetate adducts and ammonium adducts under negative and positive MS polarity conditions, respectively. The mycotoxins were separated by reversed-phase LC in a 13.5-min run, ionized using electrospray ionization, and detected by tandem mass spectrometry. Analyte-specific mass-to-charge (m/z) ratios were used to perform quantitation under MRM transition and MS(3) (linear ion trap) modes. Three experiments were made for each quantitation mode and matrix in batches over 6 days for recovery studies. The matrix effect was investigated at concentration levels of 20, 40, 80, 120, 160, and 200 μg kg(-1) (n = 3) in 5 g corn flour and rice flour. Extraction with acetonitrile provided a good overall recovery range of 90-108% (n = 3) at three levels of spiking concentration of 40, 80, and 120 μg kg(-1). A quantitation limit of 2-6 μg kg(-1) was achieved by applying an MRM transition quantitation strategy. Under MS(3) mode, a quantitation limit of 4-10 μg kg(-1) was achieved. Relative standard deviations of 2-10% and 2-11% were reported for MRM transition and MS(3) quantitation, respectively. The successful utilization of MS(3) enabled accurate analyte fragmentation pattern matching and its quantitation, leading to the development of analytical methods in fields that demand both analyte specificity and fragmentation fingerprint-matching capabilities that are unavailable under MRM transition.
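
    A minimal sketch of the recovery and precision (RSD) calculations reported above; the replicate values and spike level are illustrative, not the paper's data.

```python
import numpy as np

def recovery_and_rsd(measured_ug_kg, spiked_ug_kg):
    """Percent recovery and relative standard deviation for replicate spiked samples."""
    measured = np.asarray(measured_ug_kg, float)
    recovery = 100.0 * measured.mean() / spiked_ug_kg
    rsd = 100.0 * measured.std(ddof=1) / measured.mean()
    return recovery, rsd

# e.g. three replicates of deoxynivalenol spiked at 80 ug/kg (values are illustrative)
print(recovery_and_rsd([76.1, 82.3, 79.5], 80.0))
```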

  14. GLM Proxy Data Generation: Methods for Stroke/Pulse Level Inter-Comparison of Ground-Based Lightning Reference Networks

    NASA Technical Reports Server (NTRS)

    Cummins, Kenneth L.; Carey, Lawrence D.; Schultz, Christopher J.; Bateman, Monte G.; Cecil, Daniel J.; Rudlosky, Scott D.; Petersen, Walter Arthur; Blakeslee, Richard J.; Goodman, Steven J.

    2011-01-01

    In order to produce useful proxy data for the GOES-R Geostationary Lightning Mapper (GLM) in regions not covered by VLF lightning mapping systems, we intend to employ data produced by ground-based (regional or global) VLF/LF lightning detection networks. Before using these data in GLM Risk Reduction tasks, it is necessary to have a quantitative understanding of the performance of these networks, in terms of CG flash/stroke DE, cloud flash/pulse DE, location accuracy, and CLD/CG classification error. This information is being obtained through inter-comparison with LMAs and well-quantified VLF/LF lightning networks. One of our approaches is to compare "bulk" counting statistics on the spatial scale of convective cells, in order to both quantify relative performance and observe variations in cell-based temporal trends provided by each network. In addition, we are using microsecond-level stroke/pulse time correlation to facilitate detailed inter-comparisons at a more-fundamental level. The current development status of our ground-based inter-comparison and evaluation tools will be presented, and performance metrics will be discussed through a comparison of Vaisala's Global Lightning Dataset (GLD360) with the NLDN at locations within and outside the U.S.

  15. GLM Proxy Data Generation: Methods for Stroke/Pulse Level Inter-comparison of Ground-based Lightning Reference Networks

    NASA Astrophysics Data System (ADS)

    Cummins, K. L.; Carey, L. D.; Schultz, C. J.; Bateman, M. G.; Cecil, D. J.; Rudlosky, S. D.; Petersen, W. A.; Blakeslee, R. J.; Goodman, S. J.

    2011-12-01

    In order to produce useful proxy data for the GOES-R Geostationary Lightning Mapper (GLM) in regions not covered by VLF lightning mapping systems, we intend to employ data produced by ground-based (regional or global) VLF/LF lightning detection networks. Before using these data in GLM Risk Reduction tasks, it is necessary to have a quantitative understanding of the performance of these networks, in terms of CG flash/stroke DE, cloud flash/pulse DE, location accuracy, and CLD/CG classification error. This information is being obtained through inter-comparison with LMAs and well-quantified VLF/LF lightning networks. One of our approaches is to compare "bulk" counting statistics on the spatial scale of convective cells, in order to both quantify relative performance and observe variations in cell-based temporal trends provided by each network. In addition, we are using microsecond-level stroke/pulse time correlation to facilitate detailed inter-comparisons at a more-fundamental level. The current development status of our ground-based inter-comparison and evaluation tools will be presented, and performance metrics will be discussed through a comparison of Vaisala's Global Lightning Dataset (GLD360) with the NLDN at locations within and outside the U.S.

  16. Noise Maps for Quantitative and Clinical Severity Towards Long-Term ECG Monitoring.

    PubMed

    Everss-Villalba, Estrella; Melgarejo-Meseguer, Francisco Manuel; Blanco-Velasco, Manuel; Gimeno-Blanes, Francisco Javier; Sala-Pla, Salvador; Rojo-Álvarez, José Luis; García-Alberola, Arcadi

    2017-10-25

    Noise and artifacts are inherent contaminating components and are particularly present in Holter electrocardiogram (ECG) monitoring. The presence of noise is even more significant in long-term monitoring (LTM) recordings, as these are collected over several days while patients follow their daily activities; hence, strong artifact components can temporarily impair the clinical measurements from the LTM recordings. Traditionally, noise has been treated as an undesirable component to be removed, guided by quantitative signal metrics such as the signal-to-noise ratio (SNR), but current systems do not provide any information about the true impact of noise on the clinical evaluation of the ECG. As a first step towards an alternative to classical approaches, this work assesses ECG quality under the assumption that an ECG has good quality when it is clinically interpretable. Our hypotheses are therefore that it is possible (a) to create a clinical severity score for the effect of noise on the ECG, (b) to characterize its consistency in terms of its temporal and statistical distribution, and (c) to use it for signal quality evaluation in LTM scenarios. For this purpose, a database of external event recorder (EER) signals is assembled and labeled from a clinical point of view for use as the gold standard of noise severity categorization; these devices are assumed to capture the signal segments most prone to noise corruption during long-term periods. The ECG noise is then characterized by comparing these clinical severity criteria with conventional quantitative metrics taken from traditional noise-removal approaches, and noise maps are proposed as a novel representation tool for this comparison. Our results showed that neither of the benchmarked quantitative noise metrics represents an accurate enough estimate of the clinical severity of the noise. A case study of long-term ECG is reported, showing the statistical and temporal correspondences and properties with respect to the EER signals used to create the clinical noise gold standard. The proposed noise maps, together with the statistical consistency of the clinical severity characterization, pave the way towards systems that provide clinically oriented noise maps, allowing the user to process different ECG segments with different techniques and in terms of different measured clinical parameters.

  17. Visually estimated ejection fraction by two dimensional and triplane echocardiography is closely correlated with quantitative ejection fraction by real-time three dimensional echocardiography.

    PubMed

    Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Ake; Winter, Reidar

    2009-08-25

    Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, the feasibility of visual assessment (eyeballing) is superior. To date, only sparse data are available comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE), using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and TP mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. There was an excellent correlation between eyeballing EF by 2D and TP versus 3DE (r = 0.91 and 0.95, respectively) without any significant bias (-0.5 +/- 3.7% and -0.2 +/- 2.9%, respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TP and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2D and 8.4% for eyeballing TP. Visual estimation of LVEF using both 2D and TP by an experienced reader correlates well with quantitative EF determined by RT3DE. There is an apparent trend towards smaller variability using TP compared with 2D; this was, however, not statistically significant.

  18. Comparison of LEWICE and GlennICE in the SLD Regime

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Potapczuk, Mark G.; Levinson, Laurie H.

    2008-01-01

    A research project is underway at the NASA Glenn Research Center (GRC) to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. This report will present results from two different computer programs. The first program, LEWICE version 3.2.2, has been reported on previously. The second program is GlennICE version 0.1. An extensive quantitative comparison of the results against the database of ice shapes generated in the GRC Icing Research Tunnel (IRT) has also been performed, including additional data taken to extend the database into the Super-cooled Large Drop (SLD) regime. This paper will show the differences in ice shape between LEWICE 3.2.2, GlennICE, and experimental data. This report will also provide a description of both programs. Comparisons are then made to recent additions to the SLD database and selected previous cases. Quantitative comparisons are shown for horn height, horn angle, icing limit, area, and leading edge thickness. The results show that the predicted results for both programs are within the accuracy limits of the experimental data for the majority of cases.

  19. Comparison and quantitative verification of mapping algorithms for whole genome bisulfite sequencing

    USDA-ARS?s Scientific Manuscript database

    Coupling bisulfite conversion with next-generation sequencing (Bisulfite-seq) enables genome-wide measurement of DNA methylation, but poses unique challenges for mapping. However, despite a proliferation of Bisulfite-seq mapping tools, no systematic comparison of their genomic coverage and quantitat...

  20. Cerebral Metabolic Rate of Oxygen (CMRO2) Mapping by Combining Quantitative Susceptibility Mapping (QSM) and Quantitative Blood Oxygenation Level-Dependent Imaging (qBOLD).

    PubMed

    Cho, Junghun; Kee, Youngwook; Spincemaille, Pascal; Nguyen, Thanh D; Zhang, Jingwei; Gupta, Ajay; Zhang, Shun; Wang, Yi

    2018-03-07

    To map the cerebral metabolic rate of oxygen (CMRO2) by estimating the oxygen extraction fraction (OEF) from gradient echo imaging (GRE) using the phase and magnitude of the GRE data. 3D multi-echo gradient echo imaging and perfusion imaging with arterial spin labeling were performed in 11 healthy subjects. CMRO2 and OEF maps were reconstructed by joint quantitative susceptibility mapping (QSM) to process GRE phases and quantitative blood oxygen level-dependent (qBOLD) modeling to process GRE magnitudes. Comparisons with QSM and qBOLD alone were performed using ROI analysis, paired t-tests, and Bland-Altman plots. The average CMRO2 values in cortical gray matter across subjects were 140.4 ± 14.9, 134.1 ± 12.5, and 184.6 ± 17.9 μmol/100 g/min, with corresponding OEFs of 30.9 ± 3.4%, 30.0 ± 1.8%, and 40.9 ± 2.4% for methods based on QSM, qBOLD, and QSM+qBOLD, respectively. QSM+qBOLD provided the highest CMRO2 contrast between gray and white matter, more uniform OEF than QSM, and less noisy OEF than qBOLD. Quantitative CMRO2 mapping that fits the entire complex GRE data is feasible by combining QSM analysis of phase and qBOLD analysis of magnitude. © 2018 International Society for Magnetic Resonance in Medicine.
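
    For orientation, CMRO2 follows from OEF via the Fick principle, CMRO2 = CaO2 x CBF x OEF; the arterial oxygen content CaO2 (about 8 μmol O2 per mL of blood) and the CBF value below are typical literature numbers used as assumptions, not values from this study.

```python
# Fick-principle estimate of CMRO2 from OEF (illustrative numbers only)
CaO2 = 8.0    # arterial oxygen content, umol O2 per mL blood (typical assumption)
CBF = 50.0    # cerebral blood flow, mL per 100 g per min (typical gray-matter value)
OEF = 0.35    # oxygen extraction fraction

CMRO2 = CaO2 * CBF * OEF   # umol O2 per 100 g per min
print(CMRO2)               # ~140, the same order as the cortical values reported above
```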

  1. A comprehensive screen for volatile organic compounds in biological fluids.

    PubMed

    Sharp, M E

    2001-10-01

    A headspace gas chromatographic (GC) screen for common volatile organic compounds in biological fluids is reported. Common GC phases, DB-1 and DB-WAX, with split injection provide separation and identification of more than 40 compounds in a single 20-min run. In addition, this method easily accommodates quantitation. The screen detects commonly encountered volatile compounds at levels below 4 mg%. A control mixture, providing qualitative and semiquantitative information, is described. For comparison, elution of the volatiles on a specialty phase, DB-624, is reported. This method is an expansion and modification of a screen that had been used for more than 20 years. During its first year of use, the expanded screen has proven to be advantageous in routine forensic casework.

  2. Teleradiology Via The Naval Remote Medical Diagnosis System (RMDS)

    NASA Astrophysics Data System (ADS)

    Rasmussen, Will; Stevens, Ilya; Gerber, F. H.; Kuhlman, Jayne A.

    1982-01-01

    Testing was conducted to obtain qualitative and quantitative (statistical) data on radiology performance using the Remote Medical Diagnosis System (RMDS) Advanced Development Models (ADMs). Based upon data collected during testing with professional radiologists, this analysis addresses the clinical utility of radiographic images transferred through six possible RMDS transmission modes. These radiographs were also viewed under closed-circuit television (CCTV) and lightbox conditions to provide a basis for comparison. The analysis indicates that the RMDS ADM terminals (with a system video resolution of 525 x 256 x 6) would provide satisfactory radiographic images for radiology consultations in emergency cases with gross pathological disorders. However, in cases involving more subtle findings, a system video resolution of 525 x 512 x 8 would be preferable.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thrall, Brian D.; Minard, Kevin R.; Teeguarden, Justin G.

    A Cooperative Research and Development Agreement (CRADA) was sponsored by Battelle Memorial Institute (Battelle, Columbus), to initiate a collaborative research program across multiple Department of Energy (DOE) National Laboratories aimed at developing a suite of new capabilities for predictive toxicology. Predicting the potential toxicity of emerging classes of engineered nanomaterials was chosen as one of two focusing problems for this program. PNNL’s focus toward this broader goal was to refine and apply experimental and computational tools needed to provide quantitative understanding of nanoparticle dosimetry for in vitro cell culture systems, which is necessary for comparative risk estimates for different nanomaterials or biological systems. Research conducted using lung epithelial and macrophage cell models successfully adapted magnetic particle detection and fluorescent microscopy technologies to quantify uptake of various forms of engineered nanoparticles, and provided experimental constraints and test datasets for benchmark comparison against results obtained using an in vitro computational dosimetry model, termed the ISSD model. The experimental and computational approaches developed were used to demonstrate how cell dosimetry is applied to aid in interpretation of genomic studies of nanoparticle-mediated biological responses in model cell culture systems. The combined experimental and theoretical approach provides a highly quantitative framework for evaluating relationships between biocompatibility of nanoparticles and their physical form in a controlled manner.

  4. A Quantitative Theoretical Framework For Protein-Induced Fluorescence Enhancement-Förster-Type Resonance Energy Transfer (PIFE-FRET).

    PubMed

    Lerner, Eitan; Ploetz, Evelyn; Hohlbein, Johannes; Cordes, Thorben; Weiss, Shimon

    2016-07-07

    Single-molecule, protein-induced fluorescence enhancement (PIFE) serves as a molecular ruler at molecular distances inaccessible to other spectroscopic rulers such as Förster-type resonance energy transfer (FRET) or photoinduced electron transfer. In order to provide two simultaneous measurements of two distances on different molecular length scales for the analysis of macromolecular complexes, we and others recently combined measurements of PIFE and FRET (PIFE-FRET) on the single molecule level. PIFE relies on steric hindrance of the fluorophore Cy3, which is covalently attached to a biomolecule of interest, to rotate out of an excited-state trans isomer to the cis isomer through a 90° intermediate. In this work, we provide a theoretical framework that accounts for relevant photophysical and kinetic parameters of PIFE-FRET, show how this framework allows the extraction of the fold-decrease in isomerization mobility from experimental data, and show how these results provide information on changes in the accessible volume of Cy3. The utility of this model is then demonstrated for experimental results on PIFE-FRET measurement of different protein-DNA interactions. The proposed model and extracted parameters could serve as a benchmark to allow quantitative comparison of PIFE effects in different biological systems.

  5. Single Laboratory Comparison of Quantitative Real-Time PCR Assays for the Detection of Human Fecal Pollution - Poster

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) methods available to detect and enumerate human fecal pollution in ambient waters. Each assay employs distinct primers and/or probes and many target different genes and microorganisms leading to potential variations in method p...

  6. Exciting New Images | Lunar Reconnaissance Orbiter Camera

    Science.gov Websites

    Fragmentary web-page text on the comparative study of lunar crater shapes and the quantitative characterization and comparison of impact crater topography (Mahanti, P. et al., 2014, Icarus v. 241).

  7. A Comparison of Learning Cultures in Different Sizes and Types

    ERIC Educational Resources Information Center

    Brown, Paula D.; Finch, Kim S.; MacGregor, Cynthia

    2012-01-01

    This study compared relevant data and information about leadership and learning cultures in different sizes and types of high schools. Research was conducted using a quantitative design with a qualitative element. Quantitative data were gathered using a researcher-created survey. Independent sample t-tests were conducted to analyze the means of…

  8. Does Pre-Service Preparation Matter? Examining an Old Question in New Ways

    ERIC Educational Resources Information Center

    Ronfeldt, Matthew

    2014-01-01

    Background: Over the past decade, most of the quantitative studies on teacher preparation have focused on comparisons between alternative and traditional routes. There has been relatively little quantitative research on specific features of teacher education that might cause certain pathways into teaching to be more effective than others. The vast…

  9. Detection limits and cost comparisons of human- and gull-associated conventional and quantitative PCR assays in artificial and environmental waters

    EPA Science Inventory

    Modern techniques for tracking fecal pollution in environmental waters require investing in DNA-based methods to determine the presence of specific fecal sources. To help water quality managers decide whether to employ routine polymerase chain reaction (PCR) or quantitative PC...

  10. Comparison of quantitative PCR assays for Escherichia coli targeting ribosomal RNA and single copy genes

    EPA Science Inventory

    Aims: Compare specificity and sensitivity of quantitative PCR (qPCR) assays targeting single and multi-copy gene regions of Escherichia coli. Methods and Results: A previously reported assay targeting the uidA gene (uidA405) was used as the basis for comparing the taxono...

  11. Comprehensive Comparison of Self-Administered Questionnaires for Measuring Quantitative Autistic Traits in Adults

    ERIC Educational Resources Information Center

    Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M.

    2014-01-01

    We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype…

  12. Quantitative comparison of tumor delivery for multiple targeted nanoparticles simultaneously by multiplex ICP-MS.

    PubMed

    Elias, Andrew; Crayton, Samuel H; Warden-Rothman, Robert; Tsourkas, Andrew

    2014-07-28

    Given the rapidly expanding library of disease biomarkers and targeting agents, the number of unique targeted nanoparticles is growing exponentially. The high variability and expense of animal testing often make it unfeasible to examine this large number of nanoparticles in vivo. This often leads to the investigation of a single formulation that performed best in vitro. However, nanoparticle performance in vivo depends on many variables, many of which cannot be adequately assessed with cell-based assays. To address this issue, we developed a lanthanide-doped nanoparticle method that allows quantitative comparison of multiple targeted nanoparticles simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with different targeting ligands were created, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectrometry was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood and resected tumor samples.

  13. LCSH and PRECIS in Music: A Comparison.

    ERIC Educational Resources Information Center

    Gabbard, Paula Beversdorf

    1985-01-01

    By studying examples of their applications by two major English language bibliographic agencies, this article compares strengths and weaknesses of PRECIS and Library of Congress Subject Headings for books about music. Highlights include quantitative and qualitative analysis, comparison of number of subject statements, and terminology problems in…

  14. Comparison of Quantitative Antifungal Testing Methods for Textile Fabrics.

    PubMed

    Imoto, Yasuo; Seino, Satoshi; Nakagawa, Takashi; Yamamoto, Takao A

    2017-01-01

     Quantitative antifungal testing methods for textile fabrics under growth-supportive conditions were studied. Fungal growth activities on unfinished textile fabrics and textile fabrics modified with Ag nanoparticles were investigated using the colony counting method and the luminescence method. Morphological changes of the fungi during incubation were investigated by microscopic observation. Comparison of the results indicated that the fungal growth activity values obtained with the colony counting method depended on the morphological state of the fungi on textile fabrics, whereas those obtained with the luminescence method did not. Our findings indicated that unique characteristics of each testing method must be taken into account for the proper evaluation of antifungal activity.

  15. Nanoscale mapping of electromechanical response in ionic conductive ceramics with piezoelectric inclusions

    DOE PAGES

    Seol, Daehee; Seo, Hosung; Jesse, Stephen; ...

    2015-08-19

    Electromechanical (EM) response in ion conductive ceramics with piezoelectric inclusions was spatially explored using strain-based atomic force microscopy. Since the sample is composed of two dominant phases, an ionic phase and a piezoelectric phase, it allows us to explore two different EM responses, the electrically induced ionic response and the piezoresponse, over the same surface. Furthermore, the EM response of the ionic phase, i.e., the electrochemical strain, was quantitatively investigated by comparison with that of the piezoelectric phase, i.e., the piezoresponse. Finally, these results could provide additional information on the EM properties, including the electrochemical strain at the nanoscale.

  16. Nanoscale mapping of electromechanical response in ionic conductive ceramics with piezoelectric inclusions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seol, Daehee; Seo, Hosung; Jesse, Stephen

    Electromechanical (EM) response in ion conductive ceramics with piezoelectric inclusions was spatially explored using strain-based atomic force microscopy. Since the sample is composed of two dominant phases, an ionic phase and a piezoelectric phase, it allows us to explore two different EM responses, the electrically induced ionic response and the piezoresponse, over the same surface. Furthermore, the EM response of the ionic phase, i.e., the electrochemical strain, was quantitatively investigated by comparison with that of the piezoelectric phase, i.e., the piezoresponse. Finally, these results could provide additional information on the EM properties, including the electrochemical strain at the nanoscale.

  17. Nanoscale mapping of electromechanical response in ionic conductive ceramics with piezoelectric inclusions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seol, Daehee; Seo, Hosung; Kim, Yunseok, E-mail: yunseokkim@skku.edu

    Electromechanical (EM) response in ion conductive ceramics with piezoelectric inclusions was spatially explored using strain-based atomic force microscopy. Since the sample is composed of two dominant phases, an ionic phase and a piezoelectric phase, it allows us to explore two different EM responses, the electrically induced ionic response and the piezoresponse, over the same surface. Furthermore, the EM response of the ionic phase, i.e., the electrochemical strain, was quantitatively investigated by comparison with that of the piezoelectric phase, i.e., the piezoresponse. These results could provide additional information on the EM properties, including the electrochemical strain at the nanoscale.

  18. A summary of the vocabulary research with students who are deaf or hard of hearing.

    PubMed

    Luckner, John L; Cooke, Christine

    2010-01-01

    Vocabulary is essential for communicating, reading, thinking, and learning. In comparison to typical hearing peers, students who are deaf or hard of hearing demonstrate vocabulary knowledge that is quantitatively reduced. The authors review and summarize research studies published in peer-reviewed journals between 1967 and 2008 focusing on vocabulary and students who are deaf or hard of hearing. Forty-one studies are examined. A summary of each study is presented in a table, and potential educational implications are described. The authors note the paucity of research to guide instruction and provide suggestions for future research.

  19. Quantitative comparison of measurements of urgent care service quality.

    PubMed

    Qin, Hong; Prybutok, Victor; Prybutok, Gayle

    2016-01-01

    Service quality and patient satisfaction are essential to health care organization success. Parasuraman, Zeithaml, and Berry introduced SERVQUAL, a prominent service quality measure not yet applied to urgent care. We develop an instrument to measure perceived service quality and identify the determinants of patient satisfaction/behavioral intentions. We examine the relationships among perceived service quality, patient satisfaction and behavioral intentions, and demonstrate that measures of urgent care service quality are not equivalent when computed as perceptions only, the difference of expectations minus perceptions, the ratio of perceptions to expectations, or the log of that ratio. Perceptions provide the best measure of urgent care service quality.
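
    A minimal sketch of the four alternative scores compared above (perceptions only, expectation-perception difference, perception/expectation ratio, and its log); the Likert item scores are placeholders, not data from the study.

```python
import numpy as np

# hypothetical 1-7 Likert scores for one respondent across SERVQUAL-style items
perceptions = np.array([6, 5, 6, 4, 7], dtype=float)
expectations = np.array([7, 6, 5, 6, 7], dtype=float)

p_only = perceptions.mean()                           # perceptions-only score
gap = (expectations - perceptions).mean()             # expectations-minus-perceptions score
ratio = (perceptions / expectations).mean()           # perception/expectation ratio score
log_ratio = np.log(perceptions / expectations).mean() # log of the ratio
print(p_only, gap, ratio, log_ratio)
```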

  20. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    PubMed

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  1. Comparison of different approaches to quantitative adenovirus detection in stool specimens of hematopoietic stem cell transplant recipients.

    PubMed

    Kosulin, K; Dworzak, S; Lawitschka, A; Matthes-Leodolter, S; Lion, T

    2016-12-01

    Adenoviruses almost invariably proliferate in the gastrointestinal tract prior to dissemination, and critical threshold concentrations in stool correlate with the risk of viremia. Monitoring of adenovirus loads in stool may therefore be important for timely initiation of treatment in order to prevent invasive infection. We compared a manual DNA extraction kit combined with a validated in-house PCR assay against automated extraction on the NucliSENS-EasyMAG device coupled with the Adenovirus R-gene kit (bioMérieux) for quantitative adenovirus analysis in stool samples. Stool specimens spiked with adenovirus concentrations ranging from 10^2 to 10^11 copies/g and 32 adenovirus-positive clinical stool specimens from pediatric stem cell transplant recipients were tested along with appropriate negative controls. Quantitative analysis of viral load in adenovirus-positive stool specimens revealed a median difference of 0.5 logs (range 0.1-2.2) between the detection systems tested, and a difference of 0.3 logs (range 0.0-1.7) when the comparison was restricted to the PCR assays only. Spiking experiments showed a detection limit of 10^2-10^3 adenovirus copies/g stool, with a somewhat higher sensitivity offered by the automated extraction. The dynamic range of accurate quantitative analysis by both systems investigated was between 10^3 and 10^8 virus copies/g. The differences in quantitative analysis of adenovirus copy numbers between the systems tested were primarily attributable to the DNA extraction method used, while the qPCR assays revealed a high level of concordance. Both systems showed adequate performance for detection and monitoring of adenoviral load in stool specimens. Copyright © 2016 Elsevier B.V. All rights reserved.
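
    A minimal sketch of the log-difference comparison used above, applied to hypothetical paired viral loads from two quantification systems (the values are invented):

    ```python
    import numpy as np

    # Hypothetical adenovirus loads (copies/g stool) for the same specimens measured by two systems
    system_a = np.array([3.2e4, 1.1e6, 8.5e7, 4.0e5, 2.3e8, 6.7e3])   # manual extraction + in-house qPCR
    system_b = np.array([1.9e4, 9.0e5, 3.1e7, 5.2e5, 1.1e8, 4.1e3])   # automated extraction + commercial kit

    log_diff = np.abs(np.log10(system_a) - np.log10(system_b))
    print(f"median log10 difference: {np.median(log_diff):.2f}")
    print(f"range: {log_diff.min():.2f}-{log_diff.max():.2f}")
    ```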

  2. Position-Specific Hydrogen and Carbon Isotope Fractionations of Light Hydrocarbons by Quantitative NMR

    NASA Astrophysics Data System (ADS)

    Liu, C.; Mcgovern, G. P.; Horita, J.

    2015-12-01

    Traditional isotope ratio mass spectrometry methods measure 2H/1H and 13C/12C ratios of organic molecules only as averages over the whole molecule. In that measurement process, valuable information on position-specific isotope fractionation (PSIF) between non-equivalent H and C positions is lost, even though it can provide additional, very useful information about the origins and history of organic molecules. Quantitative nuclear magnetic resonance (NMR) spectrometry can measure 2H and 13C PSIF of organic molecules without destroying them. The 2H and 13C signals from different positions of a given molecule appear as distinct peaks in an NMR spectrum, and their peak areas are proportional to the 2H and 13C populations at each position. Moreover, quantitative NMR can be applied to a wide variety of organic molecules. We have been developing quantitative NMR methods to determine 2H and 13C PSIF of light hydrocarbons (propane, butane and pentane), using J-Young and custom-made high-pressure NMR cells. With careful conditioning of the NMR spectrometer (e.g. tuning, shimming) and effective 1H-13C decoupling, precisions better than ±10‰ (2H) and ±1‰ (13C) are readily attainable after several hours of acquisition. Measurement time depends on the relaxation time of the nucleus of interest and the total number of scans needed for high signal-to-noise ratios. Our data for commercial, pure hydrocarbon samples showed that 2H PSIF in the hydrocarbons can be larger than 60‰ and that 13C PSIF can be as large as 15‰. Comparison with theoretical calculations indicates that the PSIF patterns of some hydrocarbon samples reflect non-equilibrium processes in their production.
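
    The idea of deriving position-specific fractionation from quantitative NMR peak areas can be sketched as follows; the propane peak areas, the per-mil reference to the molecular mean, and the absence of any instrument corrections are simplifying assumptions for illustration only.

    ```python
    import numpy as np

    # Hypothetical quantitative 13C NMR peak areas for propane (two terminal CH3 vs. one central CH2),
    # used to express the position-specific deviation from the molecular average in per mil.
    positions = {"CH3 (C1+C3)": {"area": 2.015, "n_carbons": 2},
                 "CH2 (C2)":    {"area": 0.985, "n_carbons": 1}}

    per_carbon = {k: v["area"] / v["n_carbons"] for k, v in positions.items()}
    mean = np.average(list(per_carbon.values()),
                      weights=[v["n_carbons"] for v in positions.values()])
    psif_permil = {k: (v / mean - 1.0) * 1e3 for k, v in per_carbon.items()}
    print(psif_permil)   # deviation of each position from the molecular average, in per mil
    ```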

  3. The role of PET quantification in cardiovascular imaging.

    PubMed

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

    Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. Myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (18F-FDG) and rest perfusion imaging. Myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in an automated and rapid fashion have been developed for 13N-ammonia, 15O-water and 82Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of 82Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as 18F-flurpiridaz, may allow further improvements in disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of 18F-FDG and 18F-sodium fluoride tracers in carotids, aorta and coronary arteries has been demonstrated.

  4. Assessing agreement between preclinical magnetic resonance imaging and histology: An evaluation of their image qualities and quantitative results

    PubMed Central

    Elschner, Cindy; Korn, Paula; Hauptstock, Maria; Schulz, Matthias C.; Range, Ursula; Jünger, Diana; Scheler, Ulrich

    2017-01-01

    One consequence of demographic change is the increasing demand for biocompatible materials for use in implants and prostheses. This is accompanied by a growing number of experimental animals, because the interactions between new biomaterials and their host tissue have to be investigated. To evaluate novel materials and engineered tissues, the use of non-destructive imaging modalities has been identified as a strategic priority. This provides the opportunity to study interactions repeatedly in individual animals, along with the advantages of reduced biological variability and a decreased number of laboratory animals. However, histological techniques are still the gold standard in preclinical biomaterial research. The present article demonstrates a detailed method comparison between histology and magnetic resonance imaging (MRI). This includes the presentation of their image qualities as well as the detailed statistical analysis for assessing agreement between quantitative measures. As an example, the bony ingrowth of tissue-engineered bone substitutes for treatment of a cleft-like maxillary bone defect was evaluated. Using a graphical concordance analysis, the mean difference between MRI results and histomorphometrical measures was examined. The analysis revealed a slight but significant bias in the case of the bone volume (bias Histo-MRI, bone volume = 2.40%, p < 0.005) and a clearly significant deviation for the remaining defect width (bias Histo-MRI, defect width = -6.73%, p << 0.005). However, the study also showed a considerable effect of the analyzed section position on the quantitative result: the bias of the data sets originated less from the imaging modalities than from the evaluation of different slice positions. The article demonstrates that method comparisons do not always require an additional, independent animal study. PMID:28666026

  5. Evaluating reporter genes of different luciferases for optimized in vivo bioluminescence imaging of transplanted neural stem cells in the brain.

    PubMed

    Mezzanotte, Laura; Aswendt, Markus; Tennstaedt, Annette; Hoeben, Rob; Hoehn, Mathias; Löwik, Clemens

    2013-01-01

    Bioluminescence imaging (BLI) has become the method of choice for optical tracking of cells in small laboratory animals. However, the use of luciferases from different species, which depend on different substrates and emit at distinct wavelengths, has not been optimized for sensitive neuroimaging. In order to identify the most suitable luciferase, this quantitative study compared the luciferases Luc2, CBG99, PpyRE9 and hRluc. Human embryonic kidney (HEK-293) cells and mouse neural stem cells were transduced by lentiviral vector-mediated transfer to express one of the four luciferases, together with copGFP. A T2A peptide linker promoted stoichiometric expression of both imaging reporters and allowed the comparison of cell populations by flow cytometry. Cell dilution series showed the highest BLI sensitivity in vitro for Luc2; however, coelenterazine h-dependent hRluc signals clearly exceeded d-luciferin-dependent BLI in vitro. For the quantitative in vivo analysis, cells were transplanted into mouse brain and BLI was performed, including the recording of emission kinetics and spectral characteristics. Differences in light kinetics were observed for d-luciferin vs coelenterazine h. The emission spectra of Luc2 and PpyRE9 remained almost unchanged, while the emission spectrum of CBG99 became biphasic. Most importantly, photon emission decreased in the order of Luc2, CBG99, PpyRE9 to hRluc. The feasibility of combining different luciferases for dual-color and dual-substrate neuroimaging was tested and discussed. This investigation provides the first complete quantitative comparison of different luciferases expressed by neural stem cells and results in a clear recommendation of Luc2 as the luciferase of choice for in vivo neuroimaging. Copyright © 2013 John Wiley & Sons, Ltd.

  6. Infrared thermography quantitative image processing

    NASA Astrophysics Data System (ADS)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that provides a map of the temperature distribution of an object's surface. It is used in a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body's surface can contribute to the diagnosis and follow-up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and of temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI), and a number of indices have been developed by researchers to that end. In this study a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first-order statistical parameters describing the temperature distribution. Analysis and comparison of these indices result in evaluating the temperature distribution pattern of the back trunk expected in subjects who are healthy with respect to spinal problems.
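
    A minimal sketch of the kind of first-order statistical indices described here, computed for two hypothetical symmetric ROIs; the temperature values and the specific indices are assumptions, not those used by the authors.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical temperature maps (°C) of two symmetric ROIs on a lower-back thermogram
    roi_left = rng.normal(32.5, 0.35, size=(40, 40))
    roi_right = rng.normal(33.1, 0.45, size=(40, 40))

    def first_order_indices(roi):
        """First-order statistical descriptors of the ROI temperature distribution."""
        return {"mean": roi.mean(), "std": roi.std(ddof=1),
                "min": roi.min(), "max": roi.max(), "range": roi.max() - roi.min()}

    left, right = first_order_indices(roi_left), first_order_indices(roi_right)
    delta_t = right["mean"] - left["mean"]   # simple left/right asymmetry index
    print(f"mean left/right: {left['mean']:.2f} / {right['mean']:.2f} °C, ΔT = {delta_t:.2f} °C")
    print(f"std  left/right: {left['std']:.2f} / {right['std']:.2f} °C")
    ```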

  7. A typology of street patterns.

    PubMed

    Louf, Rémi; Barthelemy, Marc

    2014-12-06

    We propose a quantitative method to classify cities according to their street pattern. We use the conditional probability distribution of shape factor of blocks with a given area and define what could constitute the 'fingerprint' of a city. Using a simple hierarchical clustering method, these fingerprints can then serve as a basis for a typology of cities. We apply this method to a set of 131 cities in the world, and at an intermediate level of the dendrogram, we observe four large families of cities characterized by different abundances of blocks of a certain area and shape. At a lower level of the classification, we find that most European cities and American cities in our sample fall in their own sub-category, highlighting quantitatively the differences between the typical layouts of cities in both regions. We also show with the example of New York and its different boroughs, that the fingerprint of a city can be seen as the sum of the ones characterizing the different neighbourhoods inside a city. This method provides a quantitative comparison of urban street patterns, which could be helpful for a better understanding of the causes and mechanisms behind their distinct shapes. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
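
    A rough sketch of the fingerprint-and-clustering idea, under the simplifying assumptions that the fingerprint is a row-normalised 2D histogram of block shape factor conditioned on block area and that Euclidean distance with average-linkage clustering is adequate; the synthetic block statistics are invented.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(2)

    def city_fingerprint(areas, shape_factors, area_bins, phi_bins):
        """2D histogram of block shape factor conditioned on block area, normalised per area bin."""
        hist, _, _ = np.histogram2d(areas, shape_factors, bins=[area_bins, phi_bins])
        row_sums = hist.sum(axis=1, keepdims=True)
        return (hist / np.where(row_sums == 0, 1, row_sums)).ravel()

    area_bins = np.logspace(2, 5, 8)     # block areas in m^2 (hypothetical binning)
    phi_bins = np.linspace(0, 1, 11)     # shape factor in [0, 1]

    # Three synthetic "cities" with different block-shape statistics
    cities = {name: city_fingerprint(rng.lognormal(8, 1, 2000),
                                     rng.beta(a, b, 2000), area_bins, phi_bins)
              for name, (a, b) in {"A": (5, 2), "B": (2, 5), "C": (5, 2.2)}.items()}

    X = np.vstack(list(cities.values()))
    Z = linkage(pdist(X, metric="euclidean"), method="average")
    labels = fcluster(Z, t=2, criterion="maxclust")
    print(dict(zip(cities.keys(), labels)))   # cities A and C should fall in the same family
    ```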

  8. Detailed Quantitative Classifications of Galaxy Morphology

    NASA Astrophysics Data System (ADS)

    Nair, Preethi

    2018-01-01

    Understanding the physical processes responsible for the growth of galaxies is one of the key challenges in extragalactic astronomy. The assembly history of a galaxy is imprinted in its detailed morphology. The bulge-to-total ratio of galaxies and the presence or absence of bars, rings, spiral arms, tidal tails, etc., all have implications for the past merger, star formation, and feedback history of a galaxy. However, current quantitative galaxy classification schemes are only useful for broad binning; they cannot classify or exploit the wide variety of galaxy structures seen in nature. Therefore, comparisons of observations with theoretical predictions of secular structure formation have only been conducted on small samples of visually classified galaxies, whereas large samples are needed to disentangle the complex physical processes of galaxy formation. With the advent of large surveys like the Sloan Digital Sky Survey (SDSS) and the upcoming Large Synoptic Survey Telescope (LSST) and WFIRST, the problem of statistics will be resolved. However, the need for a robust quantitative classification scheme will remain. Here I present early results on promising machine learning algorithms that provide detailed classifications, identifying bars, rings, multi-armed spiral galaxies, and Hubble type.

  9. Background of SAM atom-fraction profiles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ernst, Frank

    Atom-fraction profiles acquired by SAM (scanning Auger microprobe) have important applications, e.g. in the context of alloy surface engineering by infusion of carbon or nitrogen through the alloy surface. However, such profiles often exhibit an artifact in the form of a background whose level anti-correlates with the local atom fraction. This article presents a theory explaining this phenomenon as a consequence of the way in which random noise in the spectrum propagates into the discretized, differentiated spectrum that is used for quantification. The resulting model of "energy channel statistics" leads to a useful semi-quantitative background reduction procedure, which is validated by applying it to simulated data. Subsequently, the procedure is applied to an example of experimental SAM data. The analysis leads to conclusions regarding optimum experimental acquisition conditions. The proposed method of background reduction is based on general principles and should be useful for a broad variety of applications. Highlights: • Atom-fraction-depth profiles of carbon measured by scanning Auger microprobe. • A strong background varies with the local carbon concentration and needs correction, e.g. for quantitative comparison with simulations. • A quantitative theory explains the background and provides a background removal strategy and practical advice for acquisition.

  10. Verification of continuum drift kinetic equation solvers in NIMROD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Held, E. D.; Ji, J.-Y.; Kruger, S. E.

    Verification of continuum solutions to the electron and ion drift kinetic equations (DKEs) in NIMROD [C. R. Sovinec et al., J. Comp. Phys. 195, 355 (2004)] is demonstrated through comparison with several neoclassical transport codes, most notably NEO [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)]. The DKE solutions use NIMROD's spatial representation: 2D finite elements in the poloidal plane and a 1D Fourier expansion in toroidal angle. For the 2D velocity space, a novel 1D expansion in finite elements is applied for the pitch-angle dependence and a collocation grid is used for the normalized speed coordinate. The full, linearized Coulomb collision operator is kept and shown to be important for obtaining quantitative results. Bootstrap currents, parallel ion flows, and radial particle and heat fluxes show quantitative agreement between NIMROD and NEO for a variety of tokamak equilibria. In addition, velocity-space distribution function contours for ions and electrons show nearly identical detailed structure and agree quantitatively. A Θ-centered, implicit time discretization and a block-preconditioned, iterative linear algebra solver provide efficient electron and ion DKE solutions that ultimately will be used to obtain closures for NIMROD's evolving fluid model.

  11. Morphology enabled dipole inversion (MEDI) from a single-angle acquisition: comparison with COSMOS in human brain imaging.

    PubMed

    Liu, Tian; Liu, Jing; de Rochefort, Ludovic; Spincemaille, Pascal; Khalidov, Ildar; Ledoux, James Robert; Wang, Yi

    2011-09-01

    Magnetic susceptibility varies among brain structures and provides insights into the chemical and molecular composition of brain tissues. However, the determination of an arbitrary susceptibility distribution from the measured MR signal phase is a challenging, ill-conditioned inverse problem. Although a previous method named calculation of susceptibility through multiple orientation sampling (COSMOS) has solved this inverse problem both theoretically and experimentally, it requires multiple-angle acquisitions that are often impractical to carry out on human subjects. Recently, the feasibility of calculating the brain susceptibility distribution from a single-angle acquisition was demonstrated using morphology enabled dipole inversion (MEDI). In this study, we further improved the original MEDI method by sparsifying those edges in the quantitative susceptibility map that do not have a corresponding edge in the magnitude image. Quantitative susceptibility maps generated by the improved MEDI were compared qualitatively and quantitatively with those generated by COSMOS. The results show a high degree of agreement between MEDI and COSMOS, and the practicality of MEDI allows many potential clinical applications. Copyright © 2011 Wiley-Liss, Inc.

  12. Mapping the function of neuronal ion channels in model and experiment

    PubMed Central

    Podlaski, William F; Seeholzer, Alexander; Groschner, Lukas N; Miesenböck, Gero; Ranjan, Rajnish; Vogels, Tim P

    2017-01-01

    Ion channel models are the building blocks of computational neuron models. Their biological fidelity is therefore crucial for the interpretation of simulations. However, the number of published models, and the lack of standardization, make the comparison of ion channel models with one another and with experimental data difficult. Here, we present a framework for the automated large-scale classification of ion channel models. Using annotated metadata and responses to a set of voltage-clamp protocols, we assigned 2378 models of voltage- and calcium-gated ion channels coded in NEURON to 211 clusters. The IonChannelGenealogy (ICGenealogy) web interface provides an interactive resource for the categorization of new and existing models and experimental recordings. It enables quantitative comparisons of simulated and/or measured ion channel kinetics, and facilitates field-wide standardization of experimentally-constrained modeling. DOI: http://dx.doi.org/10.7554/eLife.22152.001 PMID:28267430

  13. Comparison of selected analytical techniques for protein sizing, quantitation and molecular weight determination.

    PubMed

    Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C

    2004-09-30

    Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques, which are used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare advantages and limitations of each technique in respect to different application areas, analysis time, protein sizing and quantitation performance.

  14. Computational comparison of aortic root stresses in presence of stentless and stented aortic valve bio-prostheses.

    PubMed

    Nestola, M G C; Faggiano, E; Vergara, C; Lancellotti, R M; Ippolito, S; Antona, C; Filippi, S; Quarteroni, A; Scrofani, R

    2017-02-01

    We provide a computational comparison of the performance of stentless and stented aortic prostheses, in terms of aortic root displacements and internal stresses. To this aim, we consider three real patients; for each of them, we draw the two prostheses configurations, which are characterized by different mechanical properties and we also consider the native configuration. For each of these scenarios, we solve the fluid-structure interaction problem arising between blood and aortic root, through Finite Elements. In particular, the Arbitrary Lagrangian-Eulerian formulation is used for the numerical solution of the fluid-dynamic equations and a hyperelastic material model is adopted to predict the mechanical response of the aortic wall and the two prostheses. The computational results are analyzed in terms of aortic flow, internal wall stresses and aortic wall/prosthesis displacements; a quantitative comparison of the mechanical behavior of the three scenarios is reported. The numerical results highlight a good agreement between stentless and native displacements and internal wall stresses, whereas higher/non-physiological stresses are found for the stented case.

  15. Validation metrics for turbulent plasma transport

    DOE PAGES

    Holland, C.

    2016-06-22

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and to the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.
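
    One simple way to build an uncertainty-aware local-flux metric of the kind advocated here is sketched below; the flux values, uncertainties, and the specific normalisation by the combined standard deviations are illustrative assumptions, not the metric definitions used in the paper.

    ```python
    import numpy as np

    # Hypothetical comparison of predicted vs. measured local heat fluxes at several radii,
    # with experimental and simulation uncertainties combined in the normalisation.
    rho       = np.array([0.3, 0.4, 0.5, 0.6, 0.7])
    q_exp     = np.array([1.8, 2.5, 3.4, 4.6, 6.0])    # MW/m^2 (measured)
    sigma_exp = np.array([0.3, 0.4, 0.5, 0.6, 0.8])
    q_sim     = np.array([1.5, 2.3, 3.8, 5.5, 7.9])    # MW/m^2 (predicted)
    sigma_sim = np.array([0.2, 0.3, 0.4, 0.6, 1.0])

    # Normalised deviation per channel and a composite metric (smaller means better agreement)
    d = np.abs(q_sim - q_exp) / np.sqrt(sigma_exp**2 + sigma_sim**2)
    print("per-point deviations (in combined-sigma units):", np.round(d, 2))
    print(f"composite validation metric: {d.mean():.2f}")
    ```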

  16. Comparison of manual and homogenizer methods for preparation of tick-derived stabilates of Theileria parva: equivalence testing using an in vitro titration model.

    PubMed

    Mbao, V; Speybroeck, N; Berkvens, D; Dolan, T; Dorny, P; Madder, M; Mulumba, M; Duchateau, L; Brandt, J; Marcotty, T

    2005-07-01

    Theileria parva sporozoite stabilates are used in the infection and treatment method of immunization, a widely accepted control option for East Coast fever in cattle. T. parva sporozoites are extracted from infected adult Rhipicephalus appendiculatus ticks either manually, using a pestle and a mortar, or by use of an electric homogenizer. A comparison of the two methods as a function of stabilate infectivity has never been documented. This study was designed to provide a quantitative comparison of stabilates produced by the two methods. The approach was to prepare batches of stabilate by both methods and then subject them to in vitro titration. Equivalence testing was then performed on the average effective doses (ED). The ratio of infective sporozoites yielded by the two methods was found to be 1.14 in favour of the manually ground stabilate with an upper limit of the 95% confidence interval equal to 1.3. We conclude that the choice of method rests more on costs, available infrastructure and standardization than on which method produces a richer sporozoite stabilate.
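
    The equivalence-testing logic can be sketched as follows, assuming log-normally distributed effective doses, a normal approximation for the confidence interval, and an invented equivalence margin of 1.5-fold; none of these numbers come from the study.

    ```python
    import numpy as np

    # Hypothetical log-transformed effective doses (ED) from replicate in vitro titrations
    rng = np.random.default_rng(3)
    log_ed_manual = np.log(rng.normal(1.00, 0.10, 8))        # arbitrary ED units
    log_ed_homogenizer = np.log(rng.normal(1.14, 0.12, 8))

    diff = log_ed_homogenizer.mean() - log_ed_manual.mean()
    se = np.sqrt(log_ed_manual.var(ddof=1) / 8 + log_ed_homogenizer.var(ddof=1) / 8)
    ci = np.exp([diff - 1.96 * se, diff + 1.96 * se])        # 95% CI for the ED ratio
    margin = (1 / 1.5, 1.5)                                  # pre-specified equivalence margin (assumed)

    equivalent = margin[0] < ci[0] and ci[1] < margin[1]
    print(f"ED ratio {np.exp(diff):.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f}), equivalent: {equivalent}")
    ```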

  18. A Database of Reaction Monitoring Mass Spectrometry Assays for Elucidating Therapeutic Response in Cancer

    PubMed Central

    Remily-Wood, Elizabeth R.; Liu, Richard Z.; Xiang, Yun; Chen, Yi; Thomas, C. Eric; Rajyaguru, Neal; Kaufman, Laura M.; Ochoa, Joana E.; Hazlehurst, Lori; Pinilla-Ibarz, Javier; Lancet, Jeffrey; Zhang, Guolin; Haura, Eric; Shibata, David; Yeatman, Timothy; Smalley, Keiran S.M.; Dalton, William S.; Huang, Emina; Scott, Ed; Bloom, Gregory C.; Eschrich, Steven A.; Koomen, John M.

    2012-01-01

    Purpose: The Quantitative Assay Database (QuAD), http://proteome.moffitt.org/QUAD/, facilitates widespread implementation of quantitative mass spectrometry in cancer biology and clinical research through sharing of methods and reagents for monitoring protein expression and modification. Experimental Design: Liquid chromatography coupled to multiple reaction monitoring mass spectrometry (LC-MRM) assays are developed using SDS-PAGE fractionated lysates from cancer cell lines. Pathway maps created using GeneGO Metacore provide the biological relationships between proteins and illustrate concepts for multiplexed analysis; each protein can be selected to examine assay development at the protein and peptide level. Results: The coupling of SDS-PAGE and LC-MRM screening has been used to detect 876 peptides from 218 cancer-related proteins in model systems including colon, lung, melanoma, leukemias, and myeloma, which has led to the development of 95 quantitative assays including stable-isotope labeled peptide standards. Methods are published online and peptide standards are made available to the research community. Protein expression measurements for heat shock proteins, including a comparison with ELISA and monitoring response to the HSP90 inhibitor 17-DMAG, are used to illustrate the components of the QuAD and its potential utility. Conclusions and Clinical Relevance: This resource enables quantitative assessment of protein components of signaling pathways and biological processes and holds promise for systematic investigation of treatment responses in cancer. PMID:21656910

  19. Coupling of oceanic carbon and nitrogen: A window to spatially resolved quantitative reconstruction of nitrate inventories

    NASA Astrophysics Data System (ADS)

    Glock, N.; Liebetrau, V.; Gorb, S.; Wallmann, K. J. G.; Erdem, Z.; Schönfeld, J.; Eisenhauer, A.

    2017-12-01

    Anthropogenic impact has led to a severe acceleration of the global nitrogen cycle. Every second nitrogen atom in the biosphere may now originate from anthropogenic sources such as chemical fertilizers and the burning of fossil fuels. A quantitative reconstruction of past reactive nitrogen inventories is invaluable for projections of future scenarios, and calibrations for such paleoproxies should be performed as long as the natural signature is still visible. Here we present a first quantitative reconstruction of nitrate concentrations at intermediate water depths of the Peruvian oxygen minimum zone over the last deglaciation, using the pore density in the benthic foraminiferal species Bolivina spissa. A comparison of the nitrate reconstruction to the stable carbon isotope (δ13C) record reveals a strong coupling between the carbon and nitrogen cycles. The linear correlation between δ13C and nitrate availability remained stable over the last 22,000 years, facilitating the use of δ13C records as a quantitative nitrate proxy. The combination of the pore density record with δ13C records shows an elevated oceanic nitrate inventory during the Last Glacial Maximum as compared to the Holocene. Our novel proxy approach is consistent with the results of previous δ15N-based biogeochemical modeling studies, and thus provides sound estimates of the nitrate inventory in the glacial and deglacial ocean.
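
    A minimal sketch of how a stable δ13C-nitrate relationship can be turned into a calibration and applied to a downcore record; the paired values and the linear form are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical paired data: pore-density-derived bottom-water nitrate (µmol/kg) and benthic δ13C (‰)
    d13c = np.array([-0.2, 0.1, 0.3, 0.5, 0.8, 1.0, 1.2])
    nitrate = np.array([44.0, 41.5, 39.8, 37.6, 35.0, 33.2, 31.5])

    fit = stats.linregress(d13c, nitrate)
    print(f"NO3 = {fit.slope:.2f} * d13C + {fit.intercept:.2f}  (r = {fit.rvalue:.3f})")

    # Apply the calibration to a downcore δ13C record to reconstruct past nitrate concentrations
    d13c_record = np.array([0.9, 0.6, 0.2, -0.1])
    print(np.round(fit.slope * d13c_record + fit.intercept, 1))
    ```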

  20. Examining Provider Perspectives within Housing First and Traditional Programs

    PubMed Central

    Henwood, Benjamin F.; Shinn, Marybeth; Tsemberis, Sam; Padgett, Deborah K.

    2014-01-01

    Pathways’ Housing First represents a radical departure from traditional programs that serve individuals experiencing homelessness and co-occurring psychiatric and substance use disorders. This paper considered two federally funded comparison studies of Pathways’ Housing First and traditional programs to examine whether differences were reflected in the perspectives of frontline providers. Both quantitative analysis of responses to structured questions with close-ended responses and qualitative analysis of open-ended responses to semistructured questions showed that Pathways providers had greater endorsement of consumer values, lesser endorsement of systems values, and greater tolerance for abnormal behavior that did not result in harm to others than their counterparts in traditional programs. Comparing provider perspectives also revealed an “implementation paradox”; traditional providers were inhibited from engaging consumers in treatment and services without housing, whereas HF providers could focus on issues other than securing housing. As programs increasingly adopt a Housing First approach, implementation challenges remain due to an existing workforce habituated to traditional services. PMID:24659925

  1. Quantitative Hydraulic Models Of Early Land Plants Provide Insight Into Middle Paleozoic Terrestrial Paleoenvironmental Conditions

    NASA Astrophysics Data System (ADS)

    Wilson, J. P.; Fischer, W. W.

    2010-12-01

    Fossil plants provide useful proxies of Earth’s climate because plants are closely connected, through physiology and morphology, to the environments in which they lived. Recent advances in quantitative hydraulic models of plant water transport provide new insight into the history of climate by allowing fossils to speak directly to environmental conditions based on preserved internal anatomy. We report results of a quantitative hydraulic model applied to one of the earliest terrestrial plants preserved in three dimensions, the ~396 million-year-old vascular plant Asteroxylon mackei. This model combines equations describing the rate of fluid flow through plant tissues with detailed observations of plant anatomy; this allows quantitative estimates of two critical aspects of plant function. First and foremost, results from these models quantify the supply of water to evaporative surfaces; second, results describe the ability of plant vascular systems to resist tensile damage from extreme environmental events, such as drought or frost. This approach permits quantitative comparisons of functional aspects of Asteroxylon with other extinct and extant plants, informs the quality of plant-based environmental proxies, and provides concrete data that can be input into climate models. Results indicate that despite their small size, water transport cells in Asteroxylon could supply a large volume of water to the plant's leaves--even greater than cells from some later-evolved seed plants. The smallest Asteroxylon tracheids have conductivities exceeding 0.015 m^2 / MPa * s, whereas Paleozoic conifer tracheids do not reach this threshold until they are three times wider. However, this increase in conductivity came at the cost of little to no adaptations for transport safety, placing the plant’s vegetative organs in jeopardy during drought events. Analysis of the thickness-to-span ratio of Asteroxylon’s tracheids suggests that environmental conditions of reduced relative humidity (<20%) combined with elevated temperatures (>25°C) could cause sufficient cavitation to reduce hydraulic conductivity by 50%. This suggests that the Early Devonian environments that supported the earliest vascular plants were not subject to prolonged midseason droughts, or, alternatively, that the growing season was short. This places minimum constraints on water availability (e.g., groundwater hydration, relative humidity) in locations where Asteroxylon fossils are found; these environments must have had high relative humidities, comparable to tropical riparian environments. Given these constraints, biome-scale paleovegetation models that place early vascular plants distal to water sources can be revised to account for reduced drought tolerance. Paleoclimate proxies that treat early terrestrial plants as functionally interchangeable can incorporate physiological differences in a quantitatively meaningful way. Application of hydraulic models to fossil plants provides an additional perspective on the 475 million-year history of terrestrial photosynthetic environments and has potential to corroborate other plant-based paleoclimate proxies.

  2. Visually estimated ejection fraction by two dimensional and triplane echocardiography is closely correlated with quantitative ejection fraction by real-time three dimensional echocardiography

    PubMed Central

    Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Åke; Winter, Reidar

    2009-01-01

    Background: Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, the feasibility of visual assessment (eyeballing) is superior. To date, only sparse data are available comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE), using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Methods: Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and the triplane mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. Results: There was an excellent correlation between eyeballing EF by 2DE and TPE vs 3DE (r = 0.91 and 0.95, respectively) without any significant bias (-0.5 ± 3.7% and -0.2 ± 2.9%, respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TPE and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2DE and 8.4% for eyeballing TPE. Conclusion: Visual estimation of LVEF both by 2DE and TPE by an experienced reader correlates well with quantitative EF determined by RT3DE. There is an apparent trend towards smaller variability using TPE in comparison to 2DE; this was, however, not statistically significant. PMID:19706183

  3. Quantitative chemical imaging of the intracellular spatial distribution of fundamental elements and light metals in single cells.

    PubMed

    Malucelli, Emil; Iotti, Stefano; Gianoncelli, Alessandra; Fratini, Michela; Merolle, Lucia; Notargiacomo, Andrea; Marraccini, Chiara; Sargenti, Azzurra; Cappadone, Concettina; Farruggia, Giovanna; Bukreeva, Inna; Lombardo, Marco; Trombini, Claudio; Maier, Jeanette A; Lagomarsino, Stefano

    2014-05-20

    We report a method that allows a complete quantitative characterization of whole single cells, assessing the total amount of carbon, nitrogen, oxygen, sodium, and magnesium and providing submicrometer maps of element molar concentration, cell density, mass, and volume. This approach allows quantifying elements down to 10^6 atoms/μm^3. This result was obtained by applying a multimodal fusion approach that combines synchrotron radiation microscopy techniques with off-line atomic force microscopy. The proposed method permits us to determine element concentrations in addition to mass fractions and provides a deeper and more complete knowledge of cell composition. We performed measurements on LoVo human colon cancer cells sensitive (LoVo-S) and resistant (LoVo-R) to doxorubicin. The comparison of LoVo-S and LoVo-R revealed different patterns in the maps of Mg concentration, with higher values within the nucleus in LoVo-R cells and in the perinuclear region in LoVo-S cells. This feature was not as evident for the other elements, suggesting that Mg compartmentalization could be a significant trait of the drug-resistant cells.
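
    For orientation, the quoted sensitivity of about 10^6 atoms/μm^3 corresponds to a low-millimolar concentration, as the small conversion below shows (physical constants only; no data from the study):

    ```python
    # Conversion from an atom count per unit volume to a molar concentration,
    # e.g. the ~10^6 atoms/µm^3 sensitivity quoted in the abstract.
    AVOGADRO = 6.02214076e23      # atoms per mole
    LITERS_PER_UM3 = 1e-15        # 1 µm^3 = 1e-15 L

    def atoms_per_um3_to_molar(n_atoms_per_um3: float) -> float:
        """Convert atoms/µm^3 to mol/L."""
        return n_atoms_per_um3 / (AVOGADRO * LITERS_PER_UM3)

    print(f"{atoms_per_um3_to_molar(1e6) * 1e3:.2f} mmol/L")   # ~1.66 mM
    ```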

  4. X-ray Phase Contrast Allows Three Dimensional, Quantitative Imaging of Hydrogel Implants

    DOE PAGES

    Appel, Alyssa A.; Larson, Jeffrey C.; Jiang, Bin; ...

    2015-10-20

    Three-dimensional imaging techniques are needed for the evaluation and assessment of biomaterials used for tissue engineering and drug delivery applications. Hydrogels are a particularly popular class of materials for medical applications but are difficult to image in tissue using most available imaging modalities. Imaging techniques based on X-ray Phase Contrast (XPC) have shown promise for tissue engineering applications due to their ability to provide image contrast based on multiple X-ray properties. In this manuscript we describe results using XPC to image a model hydrogel and soft tissue structure. Porous fibrin-loaded poly(ethylene glycol) hydrogels were synthesized and implanted in a rodent subcutaneous model. Samples were explanted, imaged with an analyzer-based XPC technique, and processed and stained for histology for comparison. Both hydrogel and soft tissue structures could be identified in XPC images. Structure in adjacent skeletal muscle could be visualized and invading fibrovascular tissue could be quantified. In the quantitative results, there were no differences between XPC and the gold-standard histological measurements. These results provide evidence of the significant potential of techniques based on XPC for 3D imaging of hydrogel structure and the local tissue response.

  5. EmbryoMiner: A new framework for interactive knowledge discovery in large-scale cell tracking data of developing embryos.

    PubMed

    Schott, Benjamin; Traub, Manuel; Schlagenhauf, Cornelia; Takamiya, Masanari; Antritter, Thomas; Bartschat, Andreas; Löffler, Katharina; Blessing, Denis; Otte, Jens C; Kobitski, Andrei Y; Nienhaus, G Ulrich; Strähle, Uwe; Mikut, Ralf; Stegmaier, Johannes

    2018-04-01

    State-of-the-art light-sheet and confocal microscopes allow recording of entire embryos in 3D and over time (3D+t) for many hours. Fluorescently labeled structures can be segmented and tracked automatically in these terabyte-scale 3D+t images, resulting in thousands of cell migration trajectories that provide detailed insights to large-scale tissue reorganization at the cellular level. Here we present EmbryoMiner, a new interactive open-source framework suitable for in-depth analyses and comparisons of entire embryos, including an extensive set of trajectory features. Starting at the whole-embryo level, the framework can be used to iteratively focus on a region of interest within the embryo, to investigate and test specific trajectory-based hypotheses and to extract quantitative features from the isolated trajectories. Thus, the new framework provides a valuable new way to quantitatively compare corresponding anatomical regions in different embryos that were manually selected based on biological prior knowledge. As a proof of concept, we analyzed 3D+t light-sheet microscopy images of zebrafish embryos, showcasing potential user applications that can be performed using the new framework.

  6. Preliminary Comparison of Multi-scale and Multi-model Direct Inversion Algorithms for 3T MR Elastography.

    PubMed

    Yoshimitsu, Kengo; Shinagawa, Yoshinobu; Mitsufuji, Toshimichi; Mutoh, Emi; Urakawa, Hiroshi; Sakamoto, Keiko; Fujimitsu, Ritsuko; Takano, Koichi

    2017-01-10

    The aim of this study was to elucidate whether any differences, qualitative or quantitative, are present between the stiffness maps obtained with a multiscale direct inversion algorithm (MSDI) and a multimodel direct inversion algorithm (MMDI). The MR elastography (MRE) data of 37 consecutive patients who underwent liver MRE between September and October 2014 were retrospectively analyzed using both MSDI and MMDI. Two radiologists qualitatively assessed the stiffness maps for image quality in consensus, and the measured liver stiffness and measurable areas were quantitatively compared between MSDI and MMDI. MMDI provided a stiffness map of better image quality, with comparable or slightly fewer artifacts. The measurable area with MMDI (43.7 ± 17.8 cm^2) was larger than that with MSDI (37.5 ± 14.7 cm^2) (P < 0.05). Liver stiffness measured by MMDI (4.51 ± 2.32 kPa) was slightly (7%) but significantly lower than that measured by MSDI (4.86 ± 2.44 kPa) (P < 0.05). MMDI can provide a stiffness map of better image quality and slightly lower stiffness values compared to MSDI at 3T MRE, which radiologists should be aware of.

  7. Comparison of Quantitative and Qualitative Research Traditions: Epistemological, Theoretical, and Methodological Differences

    ERIC Educational Resources Information Center

    Yilmaz, Kaya

    2013-01-01

    There has been much discussion about quantitative and qualitative approaches to research in different disciplines. In the behavioural and social sciences, these two paradigms are compared to reveal their relative strengths and weaknesses. But the debate about both traditions has commonly taken place in academic books. It is hard to find an article…

  8. Multi-laboratory comparison of quantitative PCR assays for detection and quantification of Fusarium virguliforme from soybean roots and soil

    USDA-ARS?s Scientific Manuscript database

    Accurate identification and quantification of Fusarium virguliforme, the cause of sudden death syndrome (SDS) in soybean, within root tissue and soil are important tasks. Several quantitative PCR (qPCR) assays have been developed but there are no reports comparing their use in sensitive and specific...

  9. COMPARISON OF POPULATIONS OF MOULD SPECIES IN HOMES IN THE UK AND US USING MOLD-SPECIFIC QUANTITATIVE PCR (MSQPCR)

    EPA Science Inventory

    The goal of this research was to compare the populations of 81 mold species in homes in USA and UK using mould specific quantitative polymerase chain reaction (MSQPCR) technology. Dust samples were obtained from randomly selected homes in Great Britain (n=11). The mould populat...

  10. Identities and Transformational Experiences for Quantitative Problem Solving: Gender Comparisons of First-Year University Science Students

    ERIC Educational Resources Information Center

    Hudson, Peter; Matthews, Kelly

    2012-01-01

    Women are underrepresented in science, technology, engineering and mathematics (STEM) areas in university settings; however this may be the result of attitude rather than aptitude. There is widespread agreement that quantitative problem-solving is essential for graduate competence and preparedness in science and other STEM subjects. The research…

  11. COMPARISON OF ENTEROCOCCUS MEASUREMENTS IN FRESHWATER AT TWO RECREATIONAL BEACHES BY QUANTITATIVE POLYMERASE CHAIN REACTION AND MEMBRANE FILTER CULTURE ANALYSIS

    EPA Science Inventory

    Cell densities of the fecal pollution indicator genus, Enterococcus, were determined by a rapid (2-3 hr) quantitative PCR (QPCR) analysis based method in 100 ml water samples collected from recreational beaches on Lake Michigan and Lake Erie during the summer of 2003. Enumeration...

  12. Examining the Inclusion of Quantitative Research in a Meta-Ethnographic Review

    ERIC Educational Resources Information Center

    Booker, Rhae-Ann Richardson

    2010-01-01

    This study explored how one might extend meta-ethnography to quantitative research for the advancement of interpretive review methods. Using the same population of 139 studies on racial-ethnic matching as data, my investigation entailed an extended meta-ethnography (EME) and comparison of its results to a published meta-analysis (PMA). Adhering to…

  13. Inclusion and Student Learning: A Quantitative Comparison of Special and General Education Student Performance Using Team and Solo-Teaching

    ERIC Educational Resources Information Center

    Jamison, Joseph A.

    2013-01-01

    This quantitative study sought to determine whether there were significant statistical differences between the performance scores of special education and general education students' scores when in team or solo-teaching environments as may occur in inclusively taught classrooms. The investigated problem occurs because despite education's stated…

  14. Quantitative Chemical Imaging and Unsupervised Analysis Using Hyperspectral Coherent Anti-Stokes Raman Scattering Microscopy

    PubMed Central

    2013-01-01

    In this work, we report a method to acquire and analyze hyperspectral coherent anti-Stokes Raman scattering (CARS) microscopy images of organic materials and biological samples resulting in an unbiased quantitative chemical analysis. The method employs singular value decomposition on the square root of the CARS intensity, providing an automatic determination of the components above noise, which are retained. Complex CARS susceptibility spectra, which are linear in the chemical composition, are retrieved from the CARS intensity spectra using the causality of the susceptibility by two methods, and their performance is evaluated by comparison with Raman spectra. We use non-negative matrix factorization applied to the imaginary part and the nonresonant real part of the susceptibility with an additional concentration constraint to obtain absolute susceptibility spectra of independently varying chemical components and their absolute concentration. We demonstrate the ability of the method to provide quantitative chemical analysis on known lipid mixtures. We then show the relevance of the method by imaging lipid-rich stem-cell-derived mouse adipocytes as well as differentiated embryonic stem cells with a low density of lipids. We retrieve and visualize the most significant chemical components with spectra given by water, lipid, and proteins segmenting the image into the cell surrounding, lipid droplets, cytosol, and the nucleus, and we reveal the chemical structure of the cells, with details visualized by the projection of the chemical contrast into a few relevant channels. PMID:24099603
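
    A highly simplified sketch of the denoise-then-unmix pipeline described here, which skips the phase-retrieval step and applies non-negative matrix factorization directly to SVD-denoised synthetic intensity data; the spectra, noise model, and singular-value threshold are assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(4)

    # Synthetic hyperspectral stack: 400 pixels, 100 spectral points, two chemical components
    nu = np.linspace(0, 1, 100)
    spectra = np.vstack([np.exp(-((nu - 0.3) / 0.05) ** 2),     # component 1
                         np.exp(-((nu - 0.7) / 0.08) ** 2)])    # component 2
    conc = rng.random((400, 2))
    intensity = conc @ spectra + 0.01 * rng.random((400, 100))  # non-negative "CARS-like" data

    # Step 1: SVD on the square root of the intensity; keep components above a crude noise floor
    u, s, vt = np.linalg.svd(np.sqrt(intensity), full_matrices=False)
    n_keep = int(np.sum(s > 5 * np.median(s)))                  # noise threshold (assumption)
    denoised = (u[:, :n_keep] * s[:n_keep]) @ vt[:n_keep]

    # Step 2: non-negative matrix factorization into concentration maps and component spectra
    model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
    concentrations = model.fit_transform(np.clip(denoised, 0, None) ** 2)
    components = model.components_
    print("retained SVD components:", n_keep,
          "| concentrations:", concentrations.shape, "| spectra:", components.shape)
    ```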

  15. Quantitative investigation of the edge enhancement in in-line phase contrast projections and tomosynthesis provided by distributing microbubbles on the interface between two tissues: a phantom study

    NASA Astrophysics Data System (ADS)

    Wu, Di; Donovan Wong, Molly; Li, Yuhua; Fajardo, Laurie; Zheng, Bin; Wu, Xizeng; Liu, Hong

    2017-12-01

    The objective of this study was to quantitatively investigate the ability to distribute microbubbles along the interface between two tissues, in an effort to improve the edge and/or boundary features in phase contrast imaging. The experiments were conducted by employing a custom designed tissue simulating phantom, which also simulated a clinical condition where the ligand-targeted microbubbles are self-aggregated on the endothelium of blood vessels surrounding malignant cells. Four different concentrations of microbubble suspensions were injected into the phantom: 0%, 0.1%, 0.2%, and 0.4%. A time delay of 5 min was implemented before image acquisition to allow the microbubbles to become distributed at the interface between the acrylic and the cavity simulating a blood vessel segment. For comparison purposes, images were acquired using three system configurations for both projection and tomosynthesis imaging with a fixed radiation dose delivery: conventional low-energy contact mode, low-energy in-line phase contrast and high-energy in-line phase contrast. The resultant images illustrate the edge feature enhancements in the in-line phase contrast imaging mode when the microbubble concentration is extremely low. The quantitative edge-enhancement-to-noise ratio calculations not only agree with the direct image observations, but also indicate that the edge feature enhancement can be improved by increasing the microbubble concentration. In addition, high-energy in-line phase contrast imaging provided better performance in detecting low-concentration microbubble distributions.

  16. Quantitative Outline-based Shape Analysis and Classification of Planetary Craterforms using Supervised Learning Models

    NASA Astrophysics Data System (ADS)

    Slezak, Thomas Joseph; Radebaugh, Jani; Christiansen, Eric

    2017-10-01

    The shapes of craterforms on planetary surfaces provide rich information about their origins and evolution. While morphologic information offers clear visual clues to geologic processes and properties, communicating this information quantitatively is less easily accomplished. This study examines the morphology of craterforms using the quantitative outline-based shape methods of geometric morphometrics, commonly used in biology and paleontology. We examine and compare landforms on planetary surfaces using shape, a property of morphology that is invariant to translation, rotation, and size. We quantify the shapes of paterae on Io, martian calderas, terrestrial basaltic shield calderas, terrestrial ash-flow calderas, and lunar impact craters using elliptic Fourier analysis (EFA) and the Zahn and Roskies (Z-R) shape function, or tangent-angle approach, to produce multivariate shape descriptors. These shape descriptors are subjected to multivariate statistical analysis, including canonical variate analysis (CVA), a multiple-comparison variant of discriminant analysis, to investigate the link between craterform shape and classification. Paterae on Io are most similar in shape to terrestrial ash-flow calderas, and the shapes of terrestrial basaltic shield volcanoes are most similar to martian calderas. The shapes of lunar impact craters, including simple, transitional, and complex morphologies, are classified with a 100% rate of success in all models. Multiple CVA models effectively predict and classify different craterforms using shape-based identification and demonstrate significant potential for use in the analysis of planetary surfaces.
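
    The outline-based workflow can be sketched with the Zahn-Roskies tangent-angle function and linear discriminant analysis standing in for the full EFA/CVA treatment; the synthetic "caldera-like" and "impact-like" outlines are invented and the classifier is evaluated only on its training data.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(5)

    def zr_shape_function(x, y, n_samples=64):
        """Zahn-Roskies tangent-angle function of a closed outline, relative to a circle."""
        dx, dy = np.diff(np.r_[x, x[0]]), np.diff(np.r_[y, y[0]])
        theta = np.unwrap(np.arctan2(dy, dx))
        s = np.cumsum(np.hypot(dx, dy))
        s = s / s[-1]                                  # normalised arc length in [0, 1]
        grid = np.linspace(0, 1, n_samples, endpoint=False)
        return np.interp(grid, s, theta - theta[0]) - 2 * np.pi * grid

    def synthetic_outline(n_lobes, noise):
        """Hypothetical craterform outline: a circle perturbed by n_lobes sinusoidal lobes."""
        t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
        r = 1 + 0.15 * np.sin(n_lobes * t) + noise * rng.normal(size=t.size)
        return r * np.cos(t), r * np.sin(t)

    # Two hypothetical craterform classes with different outline complexity
    X, labels = [], []
    for label, n_lobes in [("caldera", 5), ("impact", 2)]:
        for _ in range(30):
            X.append(zr_shape_function(*synthetic_outline(n_lobes, 0.01)))
            labels.append(label)

    cva = LinearDiscriminantAnalysis()                 # stands in for canonical variate analysis
    scores = cva.fit(X, labels).transform(X)
    print("CV scores shape:", scores.shape, "| training accuracy:", cva.score(X, labels))
    ```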

  17. Analyses of Disruption of Cerebral White Matter Integrity in Schizophrenia with MR Diffusion Tensor Fiber Tracking Method

    NASA Astrophysics Data System (ADS)

    Yamamoto, Utako; Kobayashi, Tetsuo; Kito, Shinsuke; Koga, Yoshihiko

    We have analyzed cerebral white matter using magnetic resonance diffusion tensor imaging (MR-DTI) to measure the diffusion anisotropy of water molecules. The goal of this study is the quantitative evaluation of schizophrenia. Diffusion tensor images were acquired for patients with schizophrenia and healthy comparison subjects, group-matched for age, sex, and handedness. Fiber tracking was performed on the superior longitudinal fasciculus for the comparison between the patient and comparison groups. We analyzed and compared the cross-sectional area on the starting coronal plane and the mean and standard deviation of the fractional anisotropy and the apparent diffusion coefficient along fibers in the right and left hemispheres. In the right hemisphere, the cross-sectional areas in the patient group are significantly smaller than those in the comparison group. Furthermore, in the comparison group, the cross-sectional areas in the right hemisphere are significantly larger than those in the left hemisphere, whereas there is no significant difference in the patient group. These results suggest that the disruption of white matter integrity in schizophrenic patients may be evaluated quantitatively by comparing the cross-sectional area of the superior longitudinal fasciculus in the right and left hemispheres.
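
    The two along-fiber quantities compared here, fractional anisotropy (FA) and the apparent diffusion coefficient (ADC), follow directly from the diffusion tensor eigenvalues; the sketch below uses hypothetical eigenvalues for a few voxels along a tract.

    ```python
    import numpy as np

    def fa_and_adc(eigenvalues):
        """Fractional anisotropy and apparent diffusion coefficient from tensor eigenvalues."""
        lam = np.asarray(eigenvalues, dtype=float)
        md = lam.mean()                                      # mean diffusivity, used here as ADC
        fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
        return fa, md

    # Hypothetical eigenvalues (x10^-3 mm^2/s) sampled along a tracked fiber bundle
    voxels = [(1.7, 0.30, 0.25), (1.6, 0.35, 0.30), (1.5, 0.40, 0.35), (1.8, 0.28, 0.22)]
    fa_values, adc_values = zip(*(fa_and_adc(v) for v in voxels))
    print(f"FA  mean±sd: {np.mean(fa_values):.3f} ± {np.std(fa_values, ddof=1):.3f}")
    print(f"ADC mean±sd: {np.mean(adc_values):.3f} ± {np.std(adc_values, ddof=1):.3f} x10^-3 mm^2/s")
    ```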

  18. A Backscatter-Lidar Forward-Operator

    NASA Astrophysics Data System (ADS)

    Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Vogel, Bernhard; Mattis, Ina; Flentje, Harald; Förstner, Jochen; Potthast, Roland

    2015-04-01

    We have developed a forward operator which is capable of calculating virtual lidar profiles from atmospheric state simulations. The operator allows us to compare lidar measurements and model simulations based on the same measurement parameter: the lidar backscatter profile. This method simplifies qualitative comparisons and also makes quantitative comparisons possible, including statistical error quantification. Implemented into an aerosol-capable model system, the operator will act as a component for assimilating backscatter-lidar measurements. As many weather services already maintain networks of backscatter lidars, such data are already acquired operationally. To estimate and quantify errors due to missing or uncertain aerosol information, we started sensitivity studies on several scattering parameters, such as the aerosol size and both the real and imaginary parts of the complex index of refraction. Furthermore, quantitative and statistical comparisons between measurements and virtual measurements are shown in this study, i.e. applying the backscatter-lidar forward operator to model output.
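
    In its most basic elastic form, such a forward operator maps model profiles of backscatter and extinction to the attenuated backscatter a lidar would record. The sketch below assumes a ground-based, upward-looking geometry, an idealised molecular atmosphere, a single Gaussian aerosol layer, and an aerosol lidar ratio of 50 sr; it omits overlap, multiple scattering, and instrument effects.

    ```python
    import numpy as np

    # Minimal elastic backscatter-lidar forward model: from model profiles of backscatter
    # and extinction to the attenuated backscatter a virtual lidar would observe.
    z = np.arange(0.0, 10000.0, 30.0)                        # height grid (m)

    beta_mol = 1.5e-6 * np.exp(-z / 8000.0)                  # molecular backscatter (1/(m sr)), idealised
    alpha_mol = beta_mol * 8.0 * np.pi / 3.0                 # molecular extinction via Rayleigh lidar ratio

    beta_aer = 4.0e-6 * np.exp(-0.5 * ((z - 1500.0) / 400.0) ** 2)   # hypothetical aerosol layer
    alpha_aer = beta_aer * 50.0                              # assumed aerosol lidar ratio of 50 sr

    beta = beta_mol + beta_aer
    alpha = alpha_mol + alpha_aer

    # Two-way transmission and attenuated backscatter (what the virtual lidar "measures")
    tau = np.cumsum(alpha) * 30.0                            # optical depth integrated from the ground up
    beta_att = beta * np.exp(-2.0 * tau)
    print(f"peak attenuated backscatter: {beta_att.max():.2e} 1/(m sr) at {z[beta_att.argmax()]:.0f} m")
    ```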

  19. Three-dimensional structural modelling and calculation of electrostatic potentials of HLA Bw4 and Bw6 epitopes to explain the molecular basis for alloantibody binding: toward predicting HLA antigenicity and immunogenicity.

    PubMed

    Mallon, Dermot H; Bradley, J Andrew; Winn, Peter J; Taylor, Craig J; Kosmoliaptsis, Vasilis

    2015-02-01

    We have previously shown that qualitative assessment of surface electrostatic potential of HLA class I molecules helps explain serological patterns of alloantibody binding. We have now used a novel computational approach to quantitate differences in surface electrostatic potential of HLA B-cell epitopes and applied this to explain HLA Bw4 and Bw6 antigenicity. Protein structure models of HLA class I alleles expressing either the Bw4 or Bw6 epitope (defined by sequence motifs at positions 77 to 83) were generated using comparative structure prediction. The electrostatic potential in 3-dimensional space encompassing the Bw4/Bw6 epitope was computed by solving the Poisson-Boltzmann equation and quantitatively compared in a pairwise, all-versus-all fashion to produce distance matrices that cluster epitopes with similar electrostatics properties. Quantitative comparison of surface electrostatic potential at the carboxyl terminal of the α1-helix of HLA class I alleles, corresponding to amino acid sequence motif 77 to 83, produced clustering of HLA molecules in 3 principal groups according to Bw4 or Bw6 epitope expression. Remarkably, quantitative differences in electrostatic potential reflected known patterns of serological reactivity better than Bw4/Bw6 amino acid sequence motifs. Quantitative assessment of epitope electrostatic potential allowed the impact of known amino acid substitutions (HLA-B*07:02 R79G, R82L, G83R) that are critical for antibody binding to be predicted. We describe a novel approach for quantitating differences in HLA B-cell epitope electrostatic potential. Proof of principle is provided that this approach enables better assessment of HLA epitope antigenicity than amino acid sequence data alone, and it may allow prediction of HLA immunogenicity.
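
    The pairwise, all-versus-all comparison step lends itself to a compact sketch: compute a distance between electrostatic potential grids sampled over the same epitope region and cluster alleles with similar electrostatics. The code below uses a plain Euclidean distance and average-linkage clustering on randomly generated grids; the authors' specific distance measure and clustering settings may differ, and all values here are hypothetical.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def potential_distance(phi_a, phi_b):
    """Euclidean distance between two electrostatic potential grids of equal
    shape sampled over the same epitope region."""
    return np.linalg.norm(phi_a.ravel() - phi_b.ravel())

# Hypothetical potentials: one small 3-D grid per allele around residues 77-83.
rng = np.random.default_rng(0)
alleles = {name: rng.normal(mu, 1.0, size=(8, 8, 8))
           for name, mu in [("Bw4_a", 2.0), ("Bw4_b", 2.2),
                            ("Bw6_a", -1.5), ("Bw6_b", -1.3)]}
names = list(alleles)
n = len(names)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        dist[i, j] = dist[j, i] = potential_distance(alleles[names[i]],
                                                     alleles[names[j]])

# Cluster alleles with similar electrostatics (expect a Bw4 and a Bw6 group).
labels = fcluster(linkage(squareform(dist), method="average"),
                  t=2, criterion="maxclust")
print(dict(zip(names, labels)))
```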

  20. Gyrokinetic modeling of impurity peaking in JET H-mode plasmas

    NASA Astrophysics Data System (ADS)

    Manas, P.; Camenen, Y.; Benkadda, S.; Weisen, H.; Angioni, C.; Casson, F. J.; Giroud, C.; Gelfusa, M.; Maslov, M.

    2017-06-01

    Quantitative comparisons are presented between gyrokinetic simulations and experimental values of the carbon impurity peaking factor in a database of JET H-modes during the carbon wall era. These plasmas feature strong NBI heating and hence high values of toroidal rotation and corresponding gradient. Furthermore, the carbon profiles present particularly interesting shapes for fusion devices, i.e., hollow in the core and peaked near the edge. Dependencies of the experimental carbon peaking factor (R/L_nC) on plasma parameters are investigated via multilinear regressions. A marked correlation between R/L_nC and the normalised toroidal rotation gradient is observed in the core, which suggests an important role of the rotation in establishing hollow carbon profiles. The carbon peaking factor is then computed with the gyrokinetic code GKW, using a quasi-linear approach, supported by a few non-linear simulations. The comparison of the quasi-linear predictions to the experimental values at mid-radius reveals two main regimes. At low normalised collisionality, ν*, and T_e/T_i < 1, the gyrokinetic simulations quantitatively recover experimental carbon density profiles, provided that rotodiffusion is taken into account. In contrast, at higher ν* and T_e/T_i > 1, the very hollow experimental carbon density profiles are never predicted by the simulations and the carbon density peaking is systematically overestimated. This points to a possible missing ingredient in this regime.

  1. Grating-based tomography applications in biomedical engineering

    NASA Astrophysics Data System (ADS)

    Schulz, Georg; Thalmann, Peter; Khimchenko, Anna; Müller, Bert

    2017-10-01

    For the investigation of soft tissues or tissues consisting of soft and hard tissues on the microscopic level, hard X-ray phase tomography has become one of the most suitable imaging techniques. Among phase contrast methods, grating interferometry has the advantages of higher sensitivity than inline methods and of providing quantitative results. One disadvantage of the conventional double-grating setup (XDGI) compared to inline methods is the limitation of the spatial resolution. This limitation can be overcome by removing the analyser grating, resulting in a single-grating setup (XSGI). In order to verify the performance of XSGI concerning contrast and spatial resolution, a quantitative comparison of XSGI and XDGI tomograms of a human nerve was performed. Both techniques provide sufficient contrast to allow for the distinction of tissue types. The spatial resolution of the two-fold binned XSGI data set is improved by a factor of two in comparison to XDGI, which underlines its suitability for tomography of soft tissues. Another application for grating-based X-ray phase tomography is the simultaneous visualization of soft and hard tissues of a plaque-containing coronary artery. The simultaneous visualization of both tissues is important for the segmentation of the lumen. The segmented data can be used for flow simulations in order to obtain information about the three-dimensional wall shear stress distribution needed for the optimization of mechano-sensitive nanocontainers used for drug delivery.

  2. Randomized controlled trials and meta-analysis in medical education: what role do they play?

    PubMed

    Cook, David A

    2012-01-01

    Education researchers seek to understand what works, for whom, in what circumstances. Unfortunately, educational environments are complex and research itself is highly context dependent. Faced with these challenges, some have argued that qualitative methods should supplant quantitative methods such as randomized controlled trials (RCTs) and meta-analysis. I disagree. Good qualitative and mixed-methods research is complementary to, rather than exclusive of, quantitative methods. The complexity and challenges we face should not beguile us into ignoring methods that provide strong evidence. What, then, is the proper role for RCTs and meta-analysis in medical education? First, the choice of study design depends on the research question. RCTs and meta-analysis are appropriate for many, but not all, study goals. They have compelling strengths but also numerous limitations. Second, strong methods will not compensate for a pointless question. RCTs do not advance the science when they make confounded comparisons or compare only against no intervention. Third, clinical medicine now faces many of the same challenges we encounter in education. We can learn much from other fields about how to handle complexity in RCTs. Finally, no single study will definitively answer any research question. We need carefully planned, theory-building, programmatic research, reflecting a variety of paradigms and approaches, as we accumulate evidence to change the art and science of education.

  3. A quantitative and qualitative comparison of Illumina MiSeq and 454 amplicon sequencing for genotyping the highly polymorphic major histocompatibility complex (MHC) in a non-model species.

    PubMed

    Razali, Haslina; O'Connor, Emily; Drews, Anna; Burke, Terry; Westerdahl, Helena

    2017-07-28

    High-throughput sequencing enables high-resolution genotyping of extremely duplicated genes. 454 amplicon sequencing (454) has become the standard technique for genotyping the major histocompatibility complex (MHC) genes in non-model organisms. However, Illumina MiSeq amplicon sequencing (MiSeq), which offers a much higher read depth, is now superseding 454. The aim of this study was to quantitatively and qualitatively evaluate the performance of MiSeq in relation to 454 for genotyping MHC class I alleles using a house sparrow (Passer domesticus) dataset with pedigree information. House sparrows provide a good study system for this comparison as their MHC class I genes have been studied previously and, consequently, we had prior expectations concerning the number of alleles per individual. We found that 454 and MiSeq performed equally well in genotyping amplicons with low diversity, i.e. amplicons from individuals that had fewer than 6 alleles. Although there was a higher rate of failure in the 454 dataset in resolving amplicons with higher diversity (6-9 alleles), the same genotypes were identified by both 454 and MiSeq in 98% of cases. We conclude that low diversity amplicons are equally well genotyped using either 454 or MiSeq, but the higher coverage afforded by MiSeq can lead to this approach outperforming 454 in amplicons with higher diversity.

  4. Are Neurodynamic Organizations A Fundamental Property of Teamwork?

    PubMed Central

    Stevens, Ronald H.; Galloway, Trysha L.

    2017-01-01

    When performing a task it is important for teams to optimize their strategies and actions to maximize value and avoid the cost of surprise. The decisions teams make sometimes have unintended consequences and they must then reorganize their thinking, roles and/or configuration into corrective structures more appropriate for the situation. In this study we ask: What are the neurodynamic properties of these reorganizations and how do they relate to the moment-by-moment, and longer, performance outcomes of teams? We describe an information-organization approach for detecting and quantitating the fluctuating neurodynamic organizations in teams. Neurodynamic organization is the propensity of team members to enter into prolonged (minutes) metastable neurodynamic relationships as they encounter and resolve disturbances to their normal rhythms. Team neurodynamic organizations were detected and modeled by transforming the physical units of each team member's EEG power levels into Shannon entropy-derived information units about the team's organization and synchronization. Entropy is a measure of the variability or uncertainty of information in a data stream. This physical unit to information unit transformation bridges micro level social coordination events with macro level expert observations of team behavior, allowing multimodal comparisons across the neural, cognitive and behavioral time scales of teamwork. The measures included the entropy of each team member's data stream, the overall team entropy and the mutual information between dyad pairs of the team. Mutual information can be thought of as capturing periods of team member synchrony. Comparisons between individual entropy and mutual information levels for the dyad combinations of three-person teams provided quantitative estimates of the proportion of a person's neurodynamic organizations that represented periods of synchrony with other team members, which in aggregate provided measures of the overall degree of neurodynamic interactions of the team. We propose that increased neurodynamic organization occurs when a team's operating rhythm can no longer support the complexity of the task and the team needs to expend energy to re-organize into structures that better minimize the “surprise” in the environment. Consistent with this hypothesis, the frequency and magnitude of neurodynamic organizations were lower in experienced military and healthcare teams than they were in more junior teams. Similar dynamical properties of neurodynamic organization were observed in models of the EEG data streams of military, healthcare and high school science teams suggesting that neurodynamic organization may be a common property of teamwork. The innovation of this study is the potential it raises for developing globally applicable quantitative models of team dynamics that will allow comparisons to be made across teams, tasks and training protocols. PMID:28512438
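
    The entropy and mutual information quantities used above are standard and can be sketched directly. The example below discretizes two hypothetical team members' EEG power streams into three symbolic levels and computes Shannon entropy (bits) and dyadic mutual information as a synchrony proxy; the symbol streams and the three-level discretization are assumptions for illustration, not the authors' modeling pipeline.

```python
import numpy as np
from collections import Counter

def entropy(symbols):
    """Shannon entropy (bits) of a discrete symbol stream."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    """Mutual information (bits) between two aligned symbol streams,
    used here as a proxy for dyadic synchrony."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

# Hypothetical streams: each member's EEG power discretized into three levels
# (0 = low, 1 = mid, 2 = high) per epoch; member B partly tracks member A.
rng = np.random.default_rng(1)
member_a = rng.integers(0, 3, size=500)
member_b = np.where(rng.random(500) < 0.6,
                    member_a, rng.integers(0, 3, size=500))
print(entropy(member_a), mutual_information(member_a, member_b))
```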

  5. Quantitative comparison of some aesthetic factors among rivers

    USGS Publications Warehouse

    Leopold, Luna Bergere

    1969-01-01

    It is difficult to evaluate the factors contributing to aesthetic or nonmonetary aspects of a landscape. In contrast, aspects which lend themselves to cost-benefit comparisons are now treated in a routine way. As a result, nonmonetary values are described either in emotion-loaded words or else are mentioned and thence forgotten. The present report is a preliminary attempt to quantify some elements of aesthetic appeal while eliminating, insofar as possible, value judgments or personal preferences. If methods of recording such factors can be developed, the results promise to be a useful, new kind of basic data needed in many planning and decision-making circumstances. Such data would be especially useful when choices must be made among alternative courses of action. Such data would tend to provide a more prominent consideration of the nonmonetary aspects of a landscape. Assignment of quantitative estimates to aesthetic factors leads not so much to ratios of value as to relative rank positions. In fact, value itself tends to carry a connotation of preference, whereas ranking can more easily be used for categorization without attribution of preference, and thus it tends to avoid the introduction at too early a stage of differences in preference. Because the Federal Power Commission has been studying an application for a permit to construct one or more additional hydropower dams in the vicinity of Hells Canyon of the Snake River, the localities studied for the present discussion are in that region of Idaho. Hopefully, the data collected will provide some useful information on factors related to nonmonetary values in the region. The present discussion has been kept free of the preference judgments of the writer, and throughout the discussions observations are treated as facts.

  6. Comparison of Grid Nudging and Spectral Nudging Techniques for Dynamical Climate Downscaling within the WRF Model

    NASA Astrophysics Data System (ADS)

    Fan, X.; Chen, L.; Ma, Z.

    2010-12-01

    Climate downscaling has been an active research and application area in the past several decades, focusing on regional climate studies. Dynamical downscaling, in addition to statistical methods, has been widely used as advanced numerical weather and regional climate models have emerged. The use of numerical models ensures that a full set of climate variables is generated in the downscaling process and that these variables are dynamically consistent due to the constraints of physical laws. While generating high-resolution regional climate, the large-scale climate patterns should be retained. To serve this purpose, nudging techniques, including grid analysis nudging and spectral nudging, have been used in different models. There are studies demonstrating the benefits and advantages of each nudging technique; however, the results are sensitive to many factors, such as the nudging coefficients and the amount of information nudged to, and thus the conclusions remain controversial. Building on a companion work that develops approaches for quantitative assessment of the downscaled climate, in this study the two nudging techniques are subjected to extensive experiments in the Weather Research and Forecasting (WRF) model. Using the same model provides fair comparability, and applying the quantitative assessments provides an objective comparison. Three types of downscaling experiments were performed for a selected month. The first type serves as a baseline in which large-scale information is communicated through the lateral boundary conditions only; the second uses grid analysis nudging; and the third uses spectral nudging. Emphasis is given to experiments with different nudging coefficients and with nudging to different variables in the grid analysis nudging, while in spectral nudging we focus on testing the nudging coefficients and the wave numbers nudged on different model levels.

  7. A comparison of visual and quantitative methods to identify interstitial lung abnormalities.

    PubMed

    Kliment, Corrine R; Araki, Tetsuro; Doyle, Tracy J; Gao, Wei; Dupuis, Josée; Latourelle, Jeanne C; Zazueta, Oscar E; Fernandez, Isis E; Nishino, Mizuki; Okajima, Yuka; Ross, James C; Estépar, Raúl San José; Diaz, Alejandro A; Lederer, David J; Schwartz, David A; Silverman, Edwin K; Rosas, Ivan O; Washko, George R; O'Connor, George T; Hatabu, Hiroto; Hunninghake, Gary M

    2015-10-29

    Evidence suggests that individuals with interstitial lung abnormalities (ILA) on a chest computed tomogram (CT) may have an increased risk of developing a clinically significant interstitial lung disease (ILD). Although methods used to identify individuals with ILA on chest CT have included both automated quantitative and qualitative visual inspection methods, there has been no direct comparison between these two methods. To investigate this relationship, we created lung density metrics and compared these to visual assessments of ILA. To provide a comparison with ILA detection based on visual assessment, we generated measures of high attenuation areas (HAAs, defined by attenuation values between -600 and -250 Hounsfield units) in >4500 participants from both the COPDGene and Framingham Heart Study (FHS) cohorts. Linear and logistic regressions were used for analyses. Increased measures of HAAs (in ≥10% of the lung) were significantly associated with ILA defined by visual inspection in both cohorts (P < 0.0001); however, the positive predictive values were not very high (19% in COPDGene and 13% in the FHS). In COPDGene, the association between HAAs and ILA defined by visual assessment was modified by the percentage of emphysema and body mass index. Although increased HAAs were associated with reductions in total lung capacity in both cohorts, there was no evidence for an association between measurement of HAAs and MUC5B promoter genotype in the FHS. Our findings demonstrate that increased measures of lung density may be helpful in determining the severity of lung volume reduction but, alone, are not strongly predictive of ILA defined by visual assessment. Moreover, HAAs were not associated with MUC5B promoter genotype.
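
    The HAA metric itself reduces to a simple voxel count. The sketch below computes the fraction of lung voxels falling in the -600 to -250 HU window and applies the 10% threshold mentioned above; the CT values and the all-ones lung mask are synthetic stand-ins, not data from either cohort.

```python
import numpy as np

def high_attenuation_fraction(hu, lung_mask, lo=-600, hi=-250):
    """Fraction of lung voxels with attenuation between lo and hi HU
    (the 'high attenuation area' measure described above)."""
    lung = hu[lung_mask]
    return np.mean((lung >= lo) & (lung <= hi))

# Hypothetical CT volume (HU values) and a pretend whole-lung mask.
rng = np.random.default_rng(2)
ct = rng.normal(-800, 150, size=(64, 64, 64))
mask = np.ones(ct.shape, dtype=bool)
haa = high_attenuation_fraction(ct, mask)
print(f"HAA = {haa:.1%} -> flagged: {haa >= 0.10}")   # >=10% threshold used above
```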

  8. Quantitative Validation of the Integrated Medical Model (IMM) for ISS Missions

    NASA Technical Reports Server (NTRS)

    Young, Millennia; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.

    2016-01-01

    Lifetime Surveillance of Astronaut Health (LSAH) provided observed medical event data on 33 ISS and 111 STS person-missions for use in further improving and validating the Integrated Medical Model (IMM). Using only the crew characteristics from these observed missions, the newest development version, IMM v4.0, will simulate these missions to predict medical events and outcomes. Comparing IMM predictions to the actual observed medical event counts will provide external validation and identify areas of possible improvement. In an effort to improve the power of detecting differences in this validation study, the totals over each program (ISS and STS) will serve as the main quantitative comparison objectives, specifically the following parameters: total medical events (TME), probability of loss of crew life (LOCL), and probability of evacuation (EVAC). Scatter plots of observed versus median predicted TMEs (with error bars reflecting the simulation intervals) will graphically display comparisons, while linear regression will serve as the statistical test of agreement. Two scatter plots will be analyzed: 1) where each point reflects a mission and 2) where each point reflects a condition-specific total number of occurrences. The coefficient of determination (R²) resulting from a linear regression with no intercept bias (intercept fixed at zero) will serve as an overall metric of agreement between IMM and the real world system (RWS). In an effort to identify as many discrepancies as possible for further inspection, the α-level for all statistical tests comparing IMM predictions to observed data will be set to 0.1. This less stringent criterion, along with the multiple testing being conducted, should detect all perceived differences including many false positive signals resulting from random variation. The results of these analyses will reveal areas of the model requiring adjustment to improve overall IMM output, which will thereby provide better decision support for mission critical applications.
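
    The zero-intercept agreement metric can be sketched in a few lines. The example below fits observed ~ slope * predicted with the intercept fixed at zero and reports the slope and R²; the per-mission totals are invented, and the convention of computing R² about zero for a no-intercept model is noted explicitly since it differs from the usual definition.

```python
import numpy as np

def r2_through_origin(observed, predicted):
    """Slope and coefficient of determination for a linear fit with the
    intercept fixed at zero: observed ~ slope * predicted."""
    x = np.asarray(predicted, dtype=float)
    y = np.asarray(observed, dtype=float)
    slope = np.sum(x * y) / np.sum(x ** 2)
    residuals = y - slope * x
    # For a no-intercept model, R^2 is conventionally taken about zero,
    # not about the mean of y.
    return slope, 1.0 - np.sum(residuals ** 2) / np.sum(y ** 2)

# Hypothetical per-mission totals: observed vs. median predicted medical events.
observed = [4, 7, 2, 9, 5]
predicted = [3.5, 6.8, 2.4, 8.1, 5.6]
print(r2_through_origin(observed, predicted))
```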

  9. dCLIP: a computational approach for comparative CLIP-seq analyses

    PubMed Central

    2014-01-01

    Although comparison of RNA-protein interaction profiles across different conditions has become increasingly important to understanding the function of RNA-binding proteins (RBPs), few computational approaches have been developed for quantitative comparison of CLIP-seq datasets. Here, we present an easy-to-use command line tool, dCLIP, for quantitative CLIP-seq comparative analysis. The two-stage method implemented in dCLIP, including a modified MA normalization method and a hidden Markov model, is shown to be able to effectively identify differential binding regions of RBPs in four CLIP-seq datasets, generated by HITS-CLIP, iCLIP and PAR-CLIP protocols. dCLIP is freely available at http://qbrc.swmed.edu/software/. PMID:24398258
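
    As background for the first stage, the sketch below shows a generic MA-style normalization of two binned read-count vectors: compute the per-bin log ratio M, take its median as the global depth offset, and rescale one sample accordingly. This is a simplified, generic illustration under invented counts, not dCLIP's exact modified MA procedure.

```python
import numpy as np

def ma_normalize(x, y, pseudocount=1.0):
    """Generic MA normalization of two read-count vectors over common bins:
    compute M = log2(x) - log2(y) per bin, subtract the median M so the bulk
    of bins show no difference, and return the rescaled counts for y."""
    lx = np.log2(np.asarray(x, dtype=float) + pseudocount)
    ly = np.log2(np.asarray(y, dtype=float) + pseudocount)
    offset = np.median(lx - ly)        # global depth difference between samples
    return y * 2.0 ** offset, offset

# Hypothetical binned CLIP-seq counts: same signal, double sequencing depth.
rng = np.random.default_rng(3)
cond1 = rng.poisson(20, size=1000)
cond2 = rng.poisson(40, size=1000)
print(ma_normalize(cond1, cond2)[1])   # offset close to -1 (i.e. 2x depth)
```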

  10. Comparing Observations, 1st Experimental Edition.

    ERIC Educational Resources Information Center

    Butts, David P.

    Objectives for this module include the ability to: (1) order objects by comparing a property which the objects have in common (such as length, area, volume or mass), (2) describe objects (length, area, volume, mass, etc.) by comparing them quantitatively using either arbitrary units of comparison or standard units of comparison, and (3) describe…

  11. A Comparison of Behavioral and Emotional Characteristics in Children with Autism, Prader-Willi Syndrome, and Williams Syndrome

    ERIC Educational Resources Information Center

    Dimitropoulos, Anastasia; Ho, Alan Y.; Klaiman, Cheryl; Koenig, Kathy; Schultz, Robert T.

    2009-01-01

    In order to investigate unique and shared characteristics and to determine factors predictive of group classification, quantitative comparisons of behavioral and emotional problems were assessed using the Developmental Behavior Checklist (DBC-P) and the Vineland Adaptive Behavior Scales in autistic disorder, Williams syndrome (WS), and…

  12. Quantitative Monitoring of Microbial Species during Bioleaching of a Copper Concentrate.

    PubMed

    Hedrich, Sabrina; Guézennec, Anne-Gwenaëlle; Charron, Mickaël; Schippers, Axel; Joulian, Catherine

    2016-01-01

    Monitoring of the microbial community in bioleaching processes is essential in order to control process parameters and enhance the leaching efficiency. Suitable methods are, however, limited as they are usually not adapted to bioleaching samples and often no taxon-specific assays are available in the literature for these types of consortia. Therefore, our study focused on the development of novel quantitative real-time PCR (qPCR) assays for the quantification of Acidithiobacillus caldus, Leptospirillum ferriphilum, Sulfobacillus thermosulfidooxidans, and Sulfobacillus benefaciens and comparison of the results with data from other common molecular monitoring methods in order to evaluate their accuracy and specificity. Stirred tank bioreactors for the leaching of copper concentrate, housing a consortium of acidophilic, moderately thermophilic bacteria, relevant in several bioleaching operations, served as a model system. The microbial community analysis via qPCR allowed a precise monitoring of the evolution of total biomass as well as abundance of specific species. Data achieved by the standard fingerprinting methods, terminal restriction fragment length polymorphism (T-RFLP) and capillary electrophoresis single strand conformation polymorphism (CE-SSCP) on the same samples followed the same trend as qPCR data. The main added value of qPCR was, however, to provide quantitative data for each species whereas only relative abundance could be deduced from T-RFLP and CE-SSCP profiles. Additional value was obtained by applying two further quantitative methods which do not require nucleic acid extraction, total cell counting after SYBR Green staining and metal sulfide oxidation activity measurements via microcalorimetry. Overall, these complementary methods allow for an efficient quantitative microbial community monitoring in various bioleaching operations.
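
    The absolute quantification underlying taxon-specific qPCR assays rests on a standard-curve fit, sketched below: fit Ct against log10 copy number for a dilution series, derive the amplification efficiency, and invert the curve for unknowns. All dilution-series values are hypothetical and this is not the assay data from the study above.

```python
import numpy as np

# Hypothetical dilution series of a plasmid standard for one taxon-specific
# assay: known log10 gene copies vs. measured Ct values.
log10_copies = np.array([7, 6, 5, 4, 3, 2], dtype=float)
ct_values = np.array([13.1, 16.5, 19.9, 23.4, 26.8, 30.2])

# Fit Ct = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(log10_copies, ct_values, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0   # ~1.0 corresponds to 100% efficiency

def copies_from_ct(ct):
    """Invert the standard curve to estimate gene copies in an unknown sample."""
    return 10.0 ** ((ct - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.0%}, "
      f"unknown at Ct 21.7 -> {copies_from_ct(21.7):.2e} copies")
```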

  13. Quantitative Monitoring of Microbial Species during Bioleaching of a Copper Concentrate

    PubMed Central

    Hedrich, Sabrina; Guézennec, Anne-Gwenaëlle; Charron, Mickaël; Schippers, Axel; Joulian, Catherine

    2016-01-01

    Monitoring of the microbial community in bioleaching processes is essential in order to control process parameters and enhance the leaching efficiency. Suitable methods are, however, limited as they are usually not adapted to bioleaching samples and often no taxon-specific assays are available in the literature for these types of consortia. Therefore, our study focused on the development of novel quantitative real-time PCR (qPCR) assays for the quantification of Acidithiobacillus caldus, Leptospirillum ferriphilum, Sulfobacillus thermosulfidooxidans, and Sulfobacillus benefaciens and comparison of the results with data from other common molecular monitoring methods in order to evaluate their accuracy and specificity. Stirred tank bioreactors for the leaching of copper concentrate, housing a consortium of acidophilic, moderately thermophilic bacteria, relevant in several bioleaching operations, served as a model system. The microbial community analysis via qPCR allowed a precise monitoring of the evolution of total biomass as well as abundance of specific species. Data achieved by the standard fingerprinting methods, terminal restriction fragment length polymorphism (T-RFLP) and capillary electrophoresis single strand conformation polymorphism (CE-SSCP) on the same samples followed the same trend as qPCR data. The main added value of qPCR was, however, to provide quantitative data for each species whereas only relative abundance could be deduced from T-RFLP and CE-SSCP profiles. Additional value was obtained by applying two further quantitative methods which do not require nucleic acid extraction, total cell counting after SYBR Green staining and metal sulfide oxidation activity measurements via microcalorimetry. Overall, these complementary methods allow for an efficient quantitative microbial community monitoring in various bioleaching operations. PMID:28066365

  14. Information-theoretic model comparison unifies saliency metrics

    PubMed Central

    Kümmerer, Matthias; Wallis, Thomas S. A.; Bethge, Matthias

    2015-01-01

    Learning the properties of an image associated with human gaze placement is important both for understanding how biological systems explore the environment and for computer vision applications. There is a large literature on quantitative eye movement models that seeks to predict fixations from images (sometimes termed “saliency” prediction). A major problem known to the field is that existing model comparison metrics give inconsistent results, causing confusion. We argue that the primary reason for these inconsistencies is because different metrics and models use different definitions of what a “saliency map” entails. For example, some metrics expect a model to account for image-independent central fixation bias whereas others will penalize a model that does. Here we bring saliency evaluation into the domain of information by framing fixation prediction models probabilistically and calculating information gain. We jointly optimize the scale, the center bias, and spatial blurring of all models within this framework. Evaluating existing metrics on these rephrased models produces almost perfect agreement in model rankings across the metrics. Model performance is separated from center bias and spatial blurring, avoiding the confounding of these factors in model comparison. We additionally provide a method to show where and how models fail to capture information in the fixations on the pixel level. These methods are readily extended to spatiotemporal models of fixation scanpaths, and we provide a software package to facilitate their use. PMID:26655340
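
    The information-gain evaluation can be sketched compactly: treat the model and a baseline (e.g. a center-bias map) as probability distributions over pixels and average the log-likelihood difference at fixated locations. The maps and fixation coordinates below are synthetic, and the jointly optimized scaling, center bias, and blurring described above are omitted for brevity.

```python
import numpy as np

def information_gain(model_map, baseline_map, fixations):
    """Average information gain (bits per fixation) of a probabilistic
    fixation model over a baseline, evaluated at the fixated pixels."""
    p_model = model_map / model_map.sum()
    p_base = baseline_map / baseline_map.sum()
    fix = np.asarray(fixations)
    r, c = fix[:, 0], fix[:, 1]
    return np.mean(np.log2(p_model[r, c]) - np.log2(p_base[r, c]))

# Hypothetical maps: a centre-bias baseline and a model adding image-driven
# structure, plus a handful of fixation locations given as (row, col).
h, w = 60, 80
yy, xx = np.mgrid[0:h, 0:w]
baseline = np.exp(-(((yy - h / 2) / 15.0) ** 2 + ((xx - w / 2) / 20.0) ** 2))
rng = np.random.default_rng(4)
model = baseline * (1.0 + 0.5 * rng.random((h, w)))
fixations = [(30, 40), (28, 45), (35, 38), (20, 50)]
print(information_gain(model, baseline, fixations))
```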

  15. FT-IR imaging for quantitative determination of liver fat content in non-alcoholic fatty liver.

    PubMed

    Kochan, K; Maslak, E; Chlopicki, S; Baranska, M

    2015-08-07

    In this work we apply FT-IR imaging of large areas of liver tissue cross-section samples (∼5 cm × 5 cm) for quantitative assessment of steatosis in murine model of Non-Alcoholic Fatty Liver (NAFLD). We quantified the area of liver tissue occupied by lipid droplets (LDs) by FT-IR imaging and Oil Red O (ORO) staining for comparison. Two alternative FT-IR based approaches are presented. The first, straightforward method, was based on average spectra from tissues and provided values of the fat content by using a PLS regression model and the reference method. The second one – the chemometric-based method – enabled us to determine the values of the fat content, independently of the reference method by means of k-means cluster (KMC) analysis. In summary, FT-IR images of large size liver sections may prove to be useful for quantifying liver steatosis without the need of tissue staining.
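
    A calibration of the first kind (reference values regressed on spectra with PLS) can be sketched as follows, using scikit-learn's PLSRegression as a stand-in; the synthetic spectra, the fat-area reference values, the component count, and the cross-validation scheme are all assumptions, not the authors' calibration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical data: mean FT-IR spectra (absorbance at 400 wavenumber points)
# for 30 liver sections, and reference fat-area percentages from ORO staining.
rng = np.random.default_rng(5)
n_samples, n_wavenumbers = 30, 400
fat_reference = rng.uniform(1, 25, n_samples)                  # % area from ORO
lipid_band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 120) / 10) ** 2)
spectra = (fat_reference[:, None] * lipid_band[None, :]
           + rng.normal(0, 0.5, (n_samples, n_wavenumbers)))   # synthetic spectra

# Calibrate a PLS model and check it by cross-validation.
pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, spectra, fat_reference, cv=5).ravel()
rmse = np.sqrt(np.mean((predicted - fat_reference) ** 2))
print(f"cross-validated RMSE = {rmse:.2f} % fat area")
```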

  16. Quantitative intact specimen magnetic resonance microscopy at 3.0 T.

    PubMed

    Bath, Kevin G; Voss, Henning U; Jing, Deqiang; Anderson, Stewart; Hempstead, Barbara; Lee, Francis S; Dyke, Jonathan P; Ballon, Douglas J

    2009-06-01

    In this report, we discuss the application of a methodology for high-contrast, high-resolution magnetic resonance microscopy (MRM) of murine tissue using a 3.0-T imaging system. We employed a threefold strategy that included customized specimen preparation to maximize image contrast, three-dimensional data acquisition to minimize scan time and custom radiofrequency resonator design to maximize signal sensitivity. Images had a resolution of 100 × 78 × 78 μm³ with a signal-to-noise ratio per voxel greater than 25:1 and excellent contrast-to-noise ratios over a 30-min acquisition. We quantitatively validated the methods through comparisons of neuroanatomy across two lines of genetically engineered mice. Specifically, we were able to detect volumetric differences of as little as 9% between genetically engineered mouse strains in multiple brain regions that were predictive of underlying impairments in brain development. The overall methodology was straightforward to implement and provides ready access to basic MRM at field strengths that are widely available in both the laboratory and the clinic.

  17. Positive visualization of implanted devices with susceptibility gradient mapping using the original resolution.

    PubMed

    Varma, Gopal; Clough, Rachel E; Acher, Peter; Sénégas, Julien; Dahnke, Hannes; Keevil, Stephen F; Schaeffter, Tobias

    2011-05-01

    In magnetic resonance imaging, implantable devices are usually visualized with a negative contrast. Recently, positive contrast techniques have been proposed, such as susceptibility gradient mapping (SGM). However, SGM reduces the spatial resolution making positive visualization of small structures difficult. Here, a development of SGM using the original resolution (SUMO) is presented. For this, a filter is applied in k-space and the signal amplitude is analyzed in the image domain to determine quantitatively the susceptibility gradient for each pixel. It is shown in simulations and experiments that SUMO results in a better visualization of small structures in comparison to SGM. SUMO is applied to patient datasets for visualization of stent and prostate brachytherapy seeds. In addition, SUMO also provides quantitative information about the number of prostate brachytherapy seeds. The method might be extended to application for visualization of other interventional devices, and, like SGM, it might also be used to visualize magnetically labelled cells. Copyright © 2010 Wiley-Liss, Inc.

  18. Quantitative assessment of groundwater quality using a biological indicator: some preliminary observations.

    PubMed

    Pfeil, R M; Venkat, J A; Plimmer, J R; Sham, S; Davis, K; Nair, P P

    1994-02-01

    The genotoxicity of groundwater was evaluated, using a novel application of the SOS microplate assay (SOSMA). Organic residues were extracted from groundwater samples from Maryland, Pennsylvania, and Delaware by using C-18 bonded silica solid phase extraction tubes. Total organic carbon content (TOC) of water samples was also determined. The genotoxicity of the extracts was determined by the SOSMA. Relative activity (RA) as determined by the SOSMA is a quantitative measure of genotoxicity based on a comparison to the activity of the mutagen, 4-nitroquinoline oxide. Low levels of RA (about 2x background) were detected in waters from sites within these states. There was considerable temporal and spatial variation in the observed RA, but no definite patterns were observed in the variation. Between sampling sites there was a positive correlation between RA and TOC; however, this relationship appeared to be reversed occasionally within a sampling site. The extraction and bioassay methods provide an easy and relatively inexpensive means of determining water quality.

  19. AESOP: A Python Library for Investigating Electrostatics in Protein Interactions.

    PubMed

    Harrison, Reed E S; Mohan, Rohith R; Gorham, Ronald D; Kieslich, Chris A; Morikis, Dimitrios

    2017-05-09

    Electric fields often play a role in guiding the association of protein complexes. Such interactions can be further engineered to accelerate complex association, resulting in protein systems with increased productivity. This is especially true for enzymes where reaction rates are typically diffusion limited. To facilitate quantitative comparisons of electrostatics in protein families and to describe electrostatic contributions of individual amino acids, we previously developed a computational framework called AESOP. We now implement this computational tool in Python with increased usability and the capability of performing calculations in parallel. AESOP utilizes PDB2PQR and Adaptive Poisson-Boltzmann Solver to generate grid-based electrostatic potential files for protein structures provided by the end user. There are methods within AESOP for quantitatively comparing sets of grid-based electrostatic potentials in terms of similarity or generating ensembles of electrostatic potential files for a library of mutants to quantify the effects of perturbations in protein structure and protein-protein association. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  20. An Overview of data science uses in bioimage informatics.

    PubMed

    Chessel, Anatole

    2017-02-15

    This review aims at providing a practical overview of the use of statistical features and associated data science methods in bioimage informatics. To achieve a quantitative link between images and biological concepts, one typically replaces an object coming from an image (a segmented cell or intracellular object, a pattern of expression or localisation, even a whole image) by a vector of numbers. They range from carefully crafted biologically relevant measurements to features learnt through deep neural networks. This replacement allows for the use of practical algorithms for visualisation, comparison and inference, such as the ones from machine learning or multivariate statistics. While originating mainly, for biology, in high content screening, those methods are integral to the use of data science for the quantitative analysis of microscopy images to gain biological insight, and they are sure to gather more interest as the need to make sense of the increasing amount of acquired imaging data grows more pressing. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Monitoring Peptidase Activities in Complex Proteomes by MALDI-TOF Mass Spectrometry

    PubMed Central

    Villanueva, Josep; Nazarian, Arpi; Lawlor, Kevin; Tempst, Paul

    2009-01-01

    Measuring enzymatic activities in biological fluids is a form of activity-based proteomics and may be utilized as a means of developing disease biomarkers. Activity-based assays allow amplification of output signals, thus potentially visualizing low-abundant enzymes on a virtually transparent whole-proteome background. The protocol presented here describes a semi-quantitative in vitro assay of proteolytic activities in complex proteomes by monitoring breakdown of designer peptide-substrates using robotic extraction and a MALDI-TOF mass spectrometric read-out. Relative quantitation of the peptide metabolites is done by comparison with spiked internal standards, followed by statistical analysis of the resulting mini-peptidome. Partial automation provides reproducibility and throughput essential for comparing large sample sets. The approach may be employed for diagnostic or predictive purposes and enables profiling of 96 samples in 30 hours. It could be tailored to many diagnostic and pharmaco-dynamic purposes, as a read-out of catalytic and metabolic activities in body fluids or tissues. PMID:19617888

  2. Towards quantitative off-axis electron holographic mapping of the electric field around the tip of a sharp biased metallic needle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beleggia, M.; Helmholtz-Zentrum Berlin für Materialien und Energie, Berlin; Kasama, T.

    We apply off-axis electron holography and Lorentz microscopy in the transmission electron microscope to map the electric field generated by a sharp biased metallic tip. A combination of experimental data and modelling provides quantitative information about the potential and the field around the tip. Close to the tip apex, we measure a maximum field intensity of 82 MV/m, corresponding to a field k factor of 2.5, in excellent agreement with theory. In order to verify the validity of the measurements, we use the inferred charge density distribution in the tip region to generate simulated phase maps and Fresnel (out-of-focus) images for comparison with experimental measurements. While the overall agreement is excellent, the simulations also highlight the presence of an unexpected astigmatic contribution to the intensity in a highly defocused Fresnel image, which is thought to result from the geometry of the applied field.

  3. Surface temperature/heat transfer measurement using a quantitative phosphor thermography system

    NASA Technical Reports Server (NTRS)

    Buck, G. M.

    1991-01-01

    A relative-intensity phosphor thermography technique developed for surface heating studies in hypersonic wind tunnels is described. A direct relationship between relative emission intensity and phosphor temperature is used for quantitative surface temperature measurements in time. The technique provides global surface temperature-time histories using a 3-CCD (Charge Coupled Device) video camera and digital recording system. A current history of technique development at Langley is discussed. Latest developments include a phosphor mixture for a greater range of temperature sensitivity and use of castable ceramics for inexpensive test models. A method of calculating surface heat-transfer from thermal image data in blowdown wind tunnels is included in an appendix, with an analysis of material thermal heat-transfer properties. Results from tests in the Langley 31-Inch Mach 10 Tunnel are presented for a ceramic orbiter configuration and a four-inch diameter hemisphere model. Data include windward heating for bow-shock/wing-shock interactions on the orbiter wing surface, and a comparison with prediction for hemisphere heating distribution.
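
    The heat-transfer step mentioned in the appendix is commonly handled with a one-dimensional semi-infinite-solid relation; the sketch below uses the standard Cook-Felderman discretization, which may differ in detail from the reduction used by the author. The material properties and the surface temperature history are invented for illustration.

```python
import numpy as np

def cook_felderman_heat_flux(t, T_s, rho, c, k):
    """Surface heat flux history from a surface temperature history on a
    semi-infinite solid (1-D conduction), using the piecewise-linear
    Cook-Felderman discretization."""
    coeff = 2.0 * np.sqrt(rho * c * k / np.pi)
    q = np.zeros_like(T_s)
    for n in range(1, len(t)):
        i = np.arange(1, n + 1)
        q[n] = coeff * np.sum(
            (T_s[i] - T_s[i - 1])
            / (np.sqrt(t[n] - t[i]) + np.sqrt(t[n] - t[i - 1])))
    return q

# Hypothetical run: a ceramic model (assumed rho, c, k) whose surface
# temperature rises during a short blowdown.
t = np.linspace(0.0, 2.0, 201)            # time [s]
T_s = 300.0 + 40.0 * np.sqrt(t)           # illustrative surface temperature [K]
q = cook_felderman_heat_flux(t, T_s, rho=2200.0, c=750.0, k=1.4)
print(q[-1])                              # heat flux [W/m^2] near the end of the run
```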

  4. NIST Efforts to Quality-Assure Gunpowder Measurements

    NASA Technical Reports Server (NTRS)

    MacCrehan, William A.; Reardon, Michelle R.

    2000-01-01

    In the past few years, the National Institute of Standards and Technology (NIST) has been promoting the idea of quantitatively determining the additives in smokeless gunpowder using micellar capillary electrophoresis as a means of investigating the criminal use of handguns and pipe bombs. As a part of this effort, we have evaluated both supercritical fluid and ultrasonic solvent extractions for the quantitative recovery of nitroglycerin (NG), diphenylamine (DPA), N-nitrosodiphenylamine (NnDPA), and ethyl centralite (EC) from gunpowder. Recoveries were evaluated by repeat extraction and matrix spiking experiments. The final extraction protocol provides greater than 95 percent recoveries. To help other researchers validate their own analytical methods for additive determinations, NIST is exploring the development of a standard reference material, Additives in Smokeless Gunpowder. The evaluated method is being applied to two double-base (NG-containing) powders, one stabilized with diphenylamine and the other with ethyl centralite. As part of this reference material development effort, we are conducting an interlaboratory comparison exercise among the forensic and military gunpowder measurement community.

  5. New Insights Toward Quantitative Relationships between Lignin Reactivity to Monomers and Their Structural Characteristics.

    PubMed

    Ma, Ruoshui; Zhang, Xiumei; Wang, Yi; Zhang, Xiao

    2018-04-27

    The heterogeneous and complex structural characteristics of lignin present a significant challenge to predicting its processability (e.g., depolymerization and modification) into valuable products. This study provides a detailed characterization and comparison of structural properties of seven representative biorefinery lignin samples derived from forest and agricultural residues, which were subjected to representative pretreatment methods. A range of wet chemistry and spectroscopy methods were applied to determine specific lignin structural characteristics such as functional groups, inter-unit linkages and peak molecular weight. In parallel, oxidative depolymerization of these lignin samples to either monomeric phenolic compounds or dicarboxylic acids was conducted, and the product yields were quantified. Based on these results (lignin structural characteristics and monomer yields), we demonstrate for the first time the application of a multiple-variable linear estimation (MVLE) approach, using R statistics, to gain insight into quantitative correlations between lignin structural properties and their reactivity toward oxidative depolymerization to monomers. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Study on Quality Standard of Processed Curcuma Longa Radix

    PubMed Central

    Zhao, Yongfeng; Quan, Liang; Zhou, Haiting; Cao, Dong; Li, Wenbing; Yang, Zhuo

    2017-01-01

    To control the quality of Curcuma Longa Radix by establishing quality standards, this paper adds determinations of the extract and volatile oil contents. Meanwhile, curcumin was selected as the internal marker, and the relative correlation factors (RCFs) of demethoxycurcumin and bisdemethoxycurcumin were established by high performance liquid chromatography (HPLC). The contents of the multiple components were calculated based on their RCFs. The rationality and feasibility of the methods were evaluated by comparison of the quantitative results between the external standard method (ESM) and quantitative analysis of multi-components by a single marker (QAMS). Ethanol extracts ranged from 9.749 to 15.644% and the mean value was 13.473%. The volatile oil ranged from 0.45 to 0.90 mL/100 g and the mean value was 0.66 mL/100 g. This method was accurate and feasible and could provide a reference for further comprehensive and effective control of the quality standard of Curcuma Longa Radix and its processed products. PMID:29375640

  7. Automatic spatiotemporal matching of detected pleural thickenings

    NASA Astrophysics Data System (ADS)

    Chaisaowong, Kraisorn; Keller, Simon Kai; Kraus, Thomas

    2014-01-01

    Pleural thickenings can be found in the lungs of asbestos-exposed patients. Non-invasive diagnosis including CT imaging can detect aggressive malignant pleural mesothelioma in its early stage. In order to create a quantitative documentation of automatically detected pleural thickenings over time, the differences in volume and thickness of the detected thickenings have to be calculated. Physicians usually estimate the change of each thickening via visual comparison, which provides neither quantitative nor qualitative measures. In this work, automatic spatiotemporal matching techniques for the detected pleural thickenings at two points in time, based on semi-automatic registration, have been developed, implemented, and tested so that the same thickening can be compared fully automatically. As a result, the mapping technique using principal component analysis turns out to be more advantageous than the feature-based mapping using the centroid and mean Hounsfield units of each thickening, since the sensitivity improved from 42.19% to 98.46%, while the accuracy of the feature-based mapping is only slightly higher (84.38% versus 76.19%).

  8. Some aspects of robotics calibration, design and control

    NASA Technical Reports Server (NTRS)

    Tawfik, Hazem

    1990-01-01

    The main objective is to introduce techniques in the areas of testing and calibration, design, and control of robotic systems. A statistical technique is described that analyzes a robot's performance and provides quantitative three-dimensional evaluation of its repeatability, accuracy, and linearity. Based on this analysis, a corrective action should be taken to compensate for any existing errors and enhance the robot's overall accuracy and performance. A comparison between robotics simulation software packages that were commercially available (SILMA, IGRIP) and that of Kennedy Space Center (ROBSIM) is also included. These computer codes simulate the kinematics and dynamics patterns of various robot arm geometries to help the design engineer in sizing and building the robot manipulator and control system. A brief discussion on an adaptive control algorithm is provided.

  9. Maintenance = reuse-oriented software development

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1989-01-01

    Maintenance is viewed as a reuse process. In this context, a set of models that can be used to support the maintenance process is discussed. A high level reuse framework is presented that characterizes the object of reuse, the process for adapting that object for its target application, and the reused object within its target application. Based upon this framework, a qualitative comparison is offered of the three maintenance process models with regard to their strengths and weaknesses and the circumstances in which they are appropriate. To provide a more systematic, quantitative approach for evaluating the appropriateness of the particular maintenance model, a measurement scheme is provided, based upon the reuse framework, in the form of an organized set of questions that need to be answered. To support the reuse perspective, a set of reuse enablers are discussed.

  10. Counseling Persons with Comorbid Disorders: A Quantitative Comparison of Counselor Active Rehabilitation Service and Standard Rehabilitation Counseling Approaches

    ERIC Educational Resources Information Center

    Ferdinandi, Andrew D.; Li, Ming Hui

    2007-01-01

    The purpose of this quantitative study was to investigate the effect of counselor active rehabilitation service compared with the effect of standard rehabilitation counseling in assisting individuals with coexisting psychiatric and substance abuse disorders in attaining desired life roles. This study was conducted during a 6-month period in a…

  11. Clinical applications of a quantitative analysis of regional left ventricular wall motion

    NASA Technical Reports Server (NTRS)

    Leighton, R. F.; Rich, J. M.; Pollack, M. E.; Altieri, P. I.

    1975-01-01

    Observations were summarized which may have clinical application. These were obtained from a quantitative analysis of wall motion that was used to detect both hypokinesis and tardokinesis in left ventricular cineangiograms. The method was based on statistical comparisons with normal values for regional wall motion derived from the cineangiograms of patients who were found not to have heart disease.

  12. An Exploration of a Quantitative Reasoning Instructional Approach to Linear Equations in Two Variables with Community College Students

    ERIC Educational Resources Information Center

    Belue, Paul T.; Cavey, Laurie Overman; Kinzel, Margaret T.

    2017-01-01

    In this exploratory study, we examined the effects of a quantitative reasoning instructional approach to linear equations in two variables on community college students' conceptual understanding, procedural fluency, and reasoning ability. This was done in comparison to the use of a traditional procedural approach for instruction on the same topic.…

  13. Changes in landscape patterns and associated forest succession on the western slope of the Rocky Mountains, Colorado

    Treesearch

    Daniel J. Manier; Richard D. Laven

    2001-01-01

    Using repeat photography, we conducted a qualitative and quantitative analysis of changes in forest cover on the western slope of the Rocky Mountains in Colorado. For the quantitative analysis, both images in a pair were classified using remote sensing and geographic information system (GIS) technologies. Comparisons were made using three landscape metrics: total...

  14. Evaluation of revised polymerase chain reaction primers for more inclusive quantification of ammonia-oxidizing archaea and bacteria.

    PubMed

    Meinhardt, Kelley A; Bertagnolli, Anthony; Pannu, Manmeet W; Strand, Stuart E; Brown, Sally L; Stahl, David A

    2015-04-01

    Ammonia-oxidizing archaea (AOA) and bacteria (AOB) fill key roles in the nitrogen cycle. Thus, well-vetted methods for characterizing their distribution are essential for framing studies of their significance in natural and managed systems. Quantification of the gene coding for one subunit of the ammonia monooxygenase (amoA) by polymerase chain reaction is frequently employed to enumerate the two groups. However, variable amplification of sequence variants comprising this conserved genetic marker for ammonia oxidizers potentially compromises within- and between-system comparisons. We compared the performance of newly designed non-degenerate quantitative polymerase chain reaction primer sets to existing primer sets commonly used to quantify the amoA of AOA and AOB using a collection of plasmids and soil DNA samples. The new AOA primer set provided improved quantification of model mixtures of different amoA sequence variants and increased detection of amoA in DNA recovered from soils. Although both primer sets for the AOB provided similar results for many comparisons, the new primers demonstrated increased detection in environmental application. Thus, the new primer sets should provide a useful complement to primers now commonly used to characterize the environmental distribution of AOA and AOB. © 2014 Society for Applied Microbiology and John Wiley & Sons Ltd.

  15. Cold Season QPF: Sensitivities to Snow Parameterizations and Comparisons to NASA CloudSat Observations

    NASA Technical Reports Server (NTRS)

    Molthan, A. L.; Haynes, J. A.; Jedlovec, G. L.; Lapenta, W. M.

    2009-01-01

    As operational numerical weather prediction is performed at increasingly finer spatial resolution, precipitation traditionally represented by sub-grid scale parameterization schemes is now being calculated explicitly through the use of single- or multi-moment, bulk water microphysics schemes. As computational resources grow, the real-time application of these schemes is becoming available to a broader audience, ranging from national meteorological centers to their component forecast offices. A need for improved quantitative precipitation forecasts has been highlighted by the United States Weather Research Program, which advised that gains in forecasting skill will draw upon improved simulations of clouds and cloud microphysical processes. Investments in space-borne remote sensing have produced the NASA A-Train of polar orbiting satellites, specially equipped to observe and catalog cloud properties. The NASA CloudSat instrument, a recent addition to the A-Train and the first 94 GHz radar system operated in space, provides a unique opportunity to compare observed cloud profiles to their modeled counterparts. Comparisons are available through the use of a radiative transfer model (QuickBeam), which simulates 94 GHz radar returns based on the microphysics of cloudy model profiles and the prescribed characteristics of their constituent hydrometeor classes. CloudSat observations of snowfall are presented for a case in the central United States, with comparisons made to precipitating clouds as simulated by the Weather Research and Forecasting Model and the Goddard single-moment microphysics scheme. An additional forecast cycle is performed with a temperature-based parameterization of the snow distribution slope parameter, with comparisons to CloudSat observations provided through the QuickBeam simulator.

  16. Nontargeted quantitation of lipid classes using hydrophilic interaction liquid chromatography-electrospray ionization mass spectrometry with single internal standard and response factor approach.

    PubMed

    Cífková, Eva; Holčapek, Michal; Lísa, Miroslav; Ovčačíková, Magdaléna; Lyčka, Antonín; Lynen, Frédéric; Sandra, Pat

    2012-11-20

    The identification and quantitation of a wide range of lipids in complex biological samples is an essential requirement for lipidomic studies. High-performance liquid chromatography-mass spectrometry (HPLC/MS) has the highest potential to obtain detailed information on the whole lipidome, but the reliable quantitation of multiple lipid classes is still a challenging task. In this work, we describe a new method for the nontargeted quantitation of polar lipid classes separated by hydrophilic interaction liquid chromatography (HILIC) followed by positive-ion electrospray ionization mass spectrometry (ESI-MS) using a single internal lipid standard to which all class-specific response factors (RFs) are related. The developed method enables the nontargeted quantitation of lipid classes and molecules inside these classes in contrast to the conventional targeted quantitation, which is based on predefined selected reaction monitoring (SRM) transitions for selected lipids only. In the nontargeted quantitation method described here, concentrations of lipid classes are obtained by the peak integration in HILIC chromatograms multiplied by their RFs related to the single internal standard (i.e., sphingosyl PE, d17:1/12:0) used as a common reference for all polar lipid classes. The accuracy, reproducibility and robustness of the method have been checked by various means: (1) the comparison with conventional lipidomic quantitation using SRM scans on a triple quadrupole (QqQ) mass analyzer, (2) ³¹P nuclear magnetic resonance (NMR) quantitation of the total lipid extract, (3) method robustness test using subsequent measurements by three different persons, (4) method transfer to different HPLC/MS systems using different chromatographic conditions, and (5) comparison with previously published results for identical samples, especially human reference plasma from the National Institute of Standards and Technology (NIST human plasma). Results on human plasma, egg yolk and porcine liver extracts are presented and discussed.
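
    The arithmetic of the single-internal-standard scheme is simple and is sketched below: each class concentration is the ratio of the class peak area to the internal-standard area, scaled by the spiked concentration and a class-specific response factor. The peak areas, RF values, and the convention that the RF multiplies (rather than divides) the ratio are assumptions for illustration, not values from the paper.

```python
# Hypothetical HILIC/ESI-MS peak areas for one plasma extract. The internal
# standard (IS) is spiked at a known concentration; each lipid class has a
# previously determined response factor (RF) relative to that IS.
is_concentration = 5.0          # nmol/mL of the spiked internal standard
is_area = 1.2e6                 # integrated peak area of the IS

class_areas = {"PC": 9.6e6, "PE": 2.4e6, "SM": 1.8e6}      # per-class peak areas
response_factors = {"PC": 1.10, "PE": 0.85, "SM": 0.95}    # assumed class RFs

# Concentration of each class = (class area / IS area) * IS concentration * RF.
concentrations = {
    cls: (area / is_area) * is_concentration * response_factors[cls]
    for cls, area in class_areas.items()
}
print(concentrations)           # nmol/mL per lipid class
```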

  17. Assessment of myocardial viability: comparison of echocardiography versus cardiac magnetic resonance imaging in the current era.

    PubMed

    Tomlinson, David R; Becher, Harald; Selvanayagam, Joseph B

    2008-06-01

    Detecting viable myocardium, whether hibernating or stunned, is of clinical significance in patients with coronary artery disease and left ventricular dysfunction. Echocardiographic assessments of myocardial thickening and endocardial excursion during dobutamine infusion provide a highly specific marker for myocardial viability, but with relatively less sensitivity. The additional modalities of myocardial contrast echocardiography and tissue Doppler have recently been proposed to provide further, quantitative measures of myocardial viability assessment. Cardiac magnetic resonance (CMR) has become popular for the assessment of myocardial viability as it can assess cardiac function, volumes, myocardial scar, and perfusion with high-spatial resolution. Both 'delayed enhancement' CMR and dobutamine stress CMR have important roles in the assessment of patients with ischaemic cardiomyopathy. This article reviews the recent advances in both echocardiography and CMR for the clinical assessment of myocardial viability. It attempts to provide a pragmatic approach toward the patient-specific assessment of this important clinical problem.

  18. Thermal infrared reflectance and emission spectroscopy of quartzofeldspathic glasses

    USGS Publications Warehouse

    Byrnes, J.M.; Ramsey, M.S.; King, P.L.; Lee, R.J.

    2007-01-01

    This investigation seeks to better understand the thermal infrared (TIR) spectral characteristics of naturally-occurring amorphous materials through laboratory synthesis and analysis of glasses. Because spectra of glass phases differ markedly from their mineral counterparts, examination of glasses is important to accurately determine the composition of amorphous surface materials using remote sensing datasets. Quantitatively characterizing TIR (5-25 µm) spectral changes that accompany structural changes between glasses and mineral crystals provides the means to understand natural glasses on Earth and Mars. A suite of glasses with compositions analogous to common terrestrial volcanic glasses was created and analyzed using TIR reflectance and emission techniques. Documented spectral characteristics provide a basis for comparison with TIR spectra of other amorphous materials (glasses, clays, etc.). Our results provide the means to better detect and characterize glasses associated with terrestrial volcanoes, as well as contribute toward understanding the nature of amorphous silicates detected on Mars. Copyright 2007 by the American Geophysical Union.

  19. Numerical and Qualitative Contrasts of Two Statistical Models ...

    EPA Pesticide Factsheets

    Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and products. This study provided an empirical and qualitative comparison of both models using 29 years of data for two discrete time series of chlorophyll-a (chl-a) in the Patuxent River estuary. Empirical descriptions of each model were based on predictive performance against the observed data, ability to reproduce flow-normalized trends with simulated data, and comparisons of performance with validation datasets. Between-model differences were apparent but minor and both models had comparable abilities to remove flow effects from simulated time series. Both models similarly predicted observations for missing data with different characteristics. Trends from each model revealed distinct mainstem influences of the Chesapeake Bay with both models predicting a roughly 65% increase in chl-a over time in the lower estuary, whereas flow-normalized predictions for the upper estuary showed a more dynamic pattern, with a nearly 100% increase in chl-a in the last 10 years. Qualitative comparisons highlighted important differences in the statistical structure, available products, and characteristics of the data and desired analysis. This manuscript describes a quantitative comparison of two recently-
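
    The sketch below is neither weighted regression on time, discharge, and season nor a generalized additive model; it only illustrates, on synthetic data, the kind of hold-out comparison of two trend models described above, using a simple harmonic regression of log chl-a on time, season, and log-discharge:

    ```python
    # A highly simplified stand-in for comparing two trend models on a chl-a time
    # series (NOT the models used in the study; synthetic data only).
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 29, 29 * 12)                      # ~29 years of monthly data
    log_flow = rng.normal(size=t.size)                   # synthetic log-discharge
    chla = 2 + 0.02 * t + 0.5 * np.sin(2 * np.pi * t) - 0.3 * log_flow \
           + rng.normal(scale=0.3, size=t.size)          # synthetic log chl-a

    def design(t, log_flow, seasonal_harmonics):
        cols = [np.ones_like(t), t, log_flow]
        for k in range(1, seasonal_harmonics + 1):
            cols += [np.sin(2 * np.pi * k * t), np.cos(2 * np.pi * k * t)]
        return np.column_stack(cols)

    # "Model A" uses one seasonal harmonic, "Model B" uses three, as a crude proxy
    # for two models of differing flexibility.
    train, test = slice(0, 300), slice(300, None)
    for name, k in [("Model A", 1), ("Model B", 3)]:
        X = design(t, log_flow, k)
        beta, *_ = np.linalg.lstsq(X[train], chla[train], rcond=None)
        rmse = np.sqrt(np.mean((X[test] @ beta - chla[test]) ** 2))
        print(f"{name}: hold-out RMSE = {rmse:.3f}")
    ```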

  20. Experimental comparison of landmark-based methods for 3D elastic registration of pre- and postoperative liver CT data

    NASA Astrophysics Data System (ADS)

    Lange, Thomas; Wörz, Stefan; Rohr, Karl; Schlag, Peter M.

    2009-02-01

    The qualitative and quantitative comparison of pre- and postoperative image data is an important means of validating surgical procedures, in particular if computer-assisted planning and/or navigation is performed. Due to deformations after surgery, partially caused by the removal of tissue, a non-rigid registration scheme is a prerequisite for a precise comparison. Interactive landmark-based schemes are a suitable approach if high accuracy and reliability are difficult to achieve with automatic registration approaches. Incorporation of a priori knowledge about the anatomical structures to be registered may help to reduce interaction time and improve accuracy. For pre- and postoperative CT data of oncological liver resections, the intrahepatic vessels are suitable anatomical structures. In addition to using branching landmarks for registration, we here introduce quasi landmarks at vessel segments with high localization precision perpendicular to the vessels and low precision along the vessels. A comparison of interpolating thin-plate splines (TPS), interpolating Gaussian elastic body splines (GEBS) and approximating GEBS on landmarks at vessel branchings, as well as approximating GEBS on the introduced vessel segment landmarks, is performed. It turns out that the segment landmarks provide registration accuracies as good as branching landmarks and can improve accuracy if combined with branching landmarks. For a low number of landmarks, segment landmarks are even superior.
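
    A minimal sketch of interpolating thin-plate-spline (TPS) landmark registration using SciPy's RBFInterpolator follows; the landmark coordinates are synthetic, and the Gaussian elastic body splines evaluated in the paper have no off-the-shelf SciPy counterpart, so they are not shown:

    ```python
    # Interpolating TPS registration fitted to corresponding 3D landmark sets
    # (synthetic coordinates; smoothing > 0 would give an approximating transform).
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(1)
    pre_landmarks = rng.uniform(0, 100, size=(20, 3))        # e.g., vessel branchings (pre-op CT)
    true_shift = np.array([2.0, -1.5, 0.5])
    post_landmarks = pre_landmarks + true_shift + rng.normal(scale=0.2, size=(20, 3))

    # Vector-valued TPS mapping pre-op coordinates to post-op coordinates.
    tps = RBFInterpolator(pre_landmarks, post_landmarks,
                          kernel="thin_plate_spline", smoothing=0.0)

    # Transform points and check residuals at the landmarks themselves.
    residuals = np.linalg.norm(tps(pre_landmarks) - post_landmarks, axis=1)
    print("max landmark residual:", residuals.max())         # ~0 for an interpolating fit
    ```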

  1. Quantitative Measurements of Nitric Oxide Concentration in High-Pressure, Swirl-Stabilized Spray Flames

    NASA Technical Reports Server (NTRS)

    Cooper, Clayton S.; Laurendeau, Normand M.; Hicks, Yolanda R. (Technical Monitor)

    2000-01-01

    Lean direct-injection (LDI) spray flames offer the possibility of reducing NO(sub x) emissions from gas turbines by rapid mixing of the liquid fuel and air so as to drive the flame structure toward partially-premixed conditions. We consider the technical approaches required to utilize laser-induced fluorescence methods for quantitatively measuring NO concentrations in high-pressure LDI spray flames. In the progression from atmospheric to high-pressure measurements, the LIF method requires a shift from the saturated to the linear regime of fluorescence measurements. As such, we discuss quantitative, spatially resolved laser-saturated fluorescence (LSF), linear laser-induced fluorescence (LIF), and planar laser-induced fluorescence (PLIF) measurements of NO concentration in LDI spray flames. Spatially-resolved LIF measurements of NO concentration (ppm) are reported for preheated, LDI spray flames at pressures of two to five atmospheres. The spray is produced by a hollow-cone, pressure-atomized nozzle supplied with liquid heptane. NO is excited via the Q(sub 2)(26.5) transition of the gamma(0,0) band. Detection is performed in a two nanometer region centered on the gamma(0,1) band. A complete scheme is developed by which quantitative NO concentrations in high-pressure LDI spray flames can be measured by applying linear LIF. NO is doped into the reactants and convected through the flame with no apparent destruction, thus allowing a NO fluorescence calibration to be taken inside the flame environment. The in-situ calibration scheme is validated by comparisons to a reference flame. Quantitative NO profiles are presented and analyzed so as to better understand the operation of lean-direct injectors for gas turbine combustors. Moreover, parametric studies are provided for variations in pressure, air-preheat temperature, and equivalence ratio. Similar parametric studies are performed for lean, premixed-prevaporized flames to permit comparisons to those for LDI flames. Finally, PLIF is expanded to high pressure in an effort to quantify the detected fluorescence image for LDI flames. Success is achieved by correcting the PLIF calibration via a single-point LIF measurement. This procedure removes the influence of any preferential background that occurs in the PLIF detection window. In general, both the LIF and PLIF measurements verify that the LDI strategy could be used to reduce NO(sub x) emissions in future gas turbine combustors.

  2. Quantitative Comparison of Tumor Delivery for Multiple Targeted Nanoparticles Simultaneously by Multiplex ICP-MS

    PubMed Central

    Elias, Andrew; Crayton, Samuel H.; Warden-Rothman, Robert; Tsourkas, Andrew

    2014-01-01

    Given the rapidly expanding library of disease biomarkers and targeting agents, the number of unique targeted nanoparticles is growing exponentially. The high variability and expense of animal testing often make it unfeasible to examine this large number of nanoparticles in vivo. This often leads to the investigation of a single formulation that performed best in vitro. However, nanoparticle performance in vivo depends on many variables, many of which cannot be adequately assessed with cell-based assays. To address this issue, we developed a lanthanide-doped nanoparticle method that allows quantitative comparison of multiple targeted nanoparticles simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with different targeting ligands were created, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectrometry was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood and resected tumor samples. PMID:25068300

  3. Comparison of quantitative and qualitative tests for glucose-6-phosphate dehydrogenase deficiency.

    PubMed

    LaRue, Nicole; Kahn, Maria; Murray, Marjorie; Leader, Brandon T; Bansil, Pooja; McGray, Sarah; Kalnoky, Michael; Zhang, Hao; Huang, Huiqiang; Jiang, Hui; Domingo, Gonzalo J

    2014-10-01

    A barrier to eliminating Plasmodium vivax malaria is inadequate treatment of infected patients. 8-Aminoquinoline-based drugs clear the parasite; however, people with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk of hemolysis from these drugs. Understanding the performance of G6PD deficiency tests is critical for patient safety. Two quantitative assays and two qualitative tests were evaluated. The comparison of the quantitative assays gave a Pearson correlation coefficient of 0.7585, with a significant difference in mean G6PD activity, highlighting the need to adhere to a single reference assay. Both qualitative tests had high sensitivity and negative predictive value at a cutoff G6PD value of 40% of normal activity if interpreted conservatively and performed under laboratory conditions. The performance of both tests dropped at a cutoff level of 45%. Cytochemical staining of specimens confirmed that heterozygous females with > 50% G6PD-deficient cells can appear normal on phenotypic tests. © The American Society of Tropical Medicine and Hygiene.
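
    A hedged numerical sketch of the two comparisons reported above, using synthetic activity values rather than the study data: the Pearson correlation between two quantitative assays, and the sensitivity and negative predictive value of a qualitative test against a 40%-of-normal reference cutoff:

    ```python
    # Illustrative only: correlate two quantitative G6PD assays and evaluate a
    # qualitative test at a 40%-of-normal activity cutoff (synthetic numbers).
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(2)
    ref_activity = rng.uniform(0, 150, 200)                        # % of normal, reference assay
    second_assay = 0.9 * ref_activity + rng.normal(scale=15, size=200)
    r, _ = pearsonr(ref_activity, second_assay)
    print(f"Pearson r between quantitative assays: {r:.3f}")

    deficient = ref_activity < 40                                  # reference classification
    qualitative_positive = second_assay < 45                       # hypothetical qualitative read-out
    tp = np.sum(deficient & qualitative_positive)
    fn = np.sum(deficient & ~qualitative_positive)
    tn = np.sum(~deficient & ~qualitative_positive)
    sensitivity = tp / (tp + fn)
    npv = tn / (tn + fn)
    print(f"sensitivity = {sensitivity:.2f}, NPV = {npv:.2f}")
    ```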

  4. Detached rock evaluation device

    DOEpatents

    Hanson, David R.

    1986-01-01

    A rock detachment evaluation device (10) having an energy transducer unit (11) for sensing vibrations imparted to a subject rock (172) and converting the sensed vibrations into electrical signals, a low band pass filter unit (12) for receiving the electrical signals and transmitting only a low frequency segment thereof, a high band pass filter unit (13) for receiving the electrical signals and transmitting only a high frequency segment thereof, a comparison unit (14) for receiving the low frequency and high frequency signals and determining the difference in power between the signals, and a display unit (16) for displaying indicia of the difference, which provides a quantitative measure of rock detachment.
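
    The band-power comparison at the core of the device can be sketched as follows; the sampling rate, filter orders, and band edges are illustrative assumptions, not the patent's specification:

    ```python
    # Split a vibration signal into low- and high-frequency bands and compare their
    # power (synthetic signal; filter choices are assumptions for illustration).
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 5000.0                                    # sampling rate, Hz (assumed)
    t = np.arange(0, 1.0, 1 / fs)
    rng = np.random.default_rng(3)
    signal = np.sin(2 * np.pi * 150 * t) + 0.4 * np.sin(2 * np.pi * 1200 * t) \
             + 0.1 * rng.normal(size=t.size)

    def band_power(x, low, high, fs):
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
        y = filtfilt(b, a, x)
        return np.mean(y ** 2)

    low_power = band_power(signal, 50, 400, fs)      # "low band pass" segment
    high_power = band_power(signal, 800, 2000, fs)   # "high band pass" segment
    print("power difference (low - high):", low_power - high_power)
    ```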

  5. System architectures for telerobotic research

    NASA Technical Reports Server (NTRS)

    Harrison, F. Wallace

    1989-01-01

    Several activities related to the definition and creation of telerobotic systems are described. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of this process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.

  6. Climate change lessons from a warm world

    USGS Publications Warehouse

    Dowsett, Harry J.

    2010-01-01

    From the early 1970s to the early 1980s, Soviet climatologists were making comparisons to past intervals of warmth in the geologic record and suggesting that these intervals could be possible analogs for 21st century “greenhouse” conditions. Some saw regional warming as a benefit to the Soviet Union and made comments along the lines of “Set fire to the coal mines!” These sentiments were alarming to some, and the United States Geological Survey (USGS) leadership thought they could provide a more quantitative analysis of the data the Soviets were using for the most recent of these warm intervals, the Early Pliocene.

  7. The feeling of the story: Narrating to regulate anger and sadness.

    PubMed

    Pasupathi, Monisha; Wainryb, Cecilia; Mansfield, Cade D; Bourne, Stacia

    2017-04-01

    Admonitions to tell one's story in order to feel better reflect the belief that narrative is an effective emotion regulation tool. The present studies evaluate the effectiveness of narrative for regulating sadness and anger, and provide quantitative comparisons of narrative with distraction, reappraisal, and reexposure. The results for sadness (n = 93) and anger (n = 89) reveal that narrative is effective at down-regulating negative emotions, particularly when narratives place events in the past tense and include positive emotions. The results suggest that if people tell the "right" kind of story about their experiences, narrative reduces emotional distress linked to those experiences.

  8. The Feeling of the Story: Narrating to Regulate Anger and Sadness

    PubMed Central

    Pasupathi, Monisha; Wainryb, Cecilia; Mansfield, Cade D.; Bourne, Stacia

    2017-01-01

    Admonitions to tell one’s story in order to feel better reflect the belief that narrative is an effective emotion regulation tool. The present studies evaluate the effectiveness of narrative for regulating sadness and anger, and provide quantitative comparisons of narrative with distraction, reappraisal, and reexposure. The results for sadness (n = 93) and anger (n = 89) reveal that narrative is effective at down-regulating negative emotions, particularly when narratives place events in the past tense and include positive emotions. The results suggest that if people tell the “right” kind of story about their experiences, narrative reduces emotional distress linked to those experiences. PMID:26745208

  9. Applications of the hybrid coordinate method to the TOPS autopilot

    NASA Technical Reports Server (NTRS)

    Fleischer, G. E.

    1978-01-01

    Preliminary results are presented from the application of the hybrid coordinate method to modeling TOPS (thermoelectric outer planet spacecraft) structural dynamics. Computer-simulated responses of the vehicle are included which illustrate the interaction of relatively flexible appendages with an autopilot control system. Comparisons were made between simplified single-axis models of the control loop, with spacecraft flexibility represented by hinged rigid bodies, and a very detailed three-axis spacecraft model whose flexible portions are described by modal coordinates. While single-axis system root loci provided reasonable qualitative indications of stability margins in this case, they were quantitatively optimistic when matched against responses of the detailed model.

  10. Simbol-X Core Science in a Context

    NASA Astrophysics Data System (ADS)

    Fiore, F.; Arnaud, M.; Briel, U.; Cavazzuti, E.; Cledassou, R.; Counil, J. L.; Comastri, A.; Ferrando, P.; Giommi, P.; Goldwurm, A.; Lamarle, O.; Lanzuisi, G.; Laurent, P.; Lebrun, F.; Malaguti, G.; Mereghetti, S.; Micela, G.; Pareschi, G.; Piconcelli, E.; Piermaria, M.; Puccetti, S.; Roques, J.-P.; Tagliaferri, G.; Vignali, C.

    2009-05-01

    Taking advantage of emerging technology in mirror manufacturing and spacecraft formation flying, Simbol-X will push grazing incidence imaging up to ~80 keV, providing an improvement of roughly three orders of magnitude in sensitivity and angular resolution compared to all instruments that have operated so far above 10 keV. This will open a new window in X-ray astronomy, allowing breakthrough studies of black hole physics and census, and of particle acceleration mechanisms. We discuss here synergies between Simbol-X and the main multiwavelength facilities that will operate in the next decade, and present a quantitative comparison between Simbol-X and its main competitors, NuStar and Astro-H.

  11. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study provides a detailed discussion of the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
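
    The dissertation's Monte Carlo model is not reproduced here; the sketch below only illustrates the general idea of propagating assumed input distributions through a hypothetical linear index model to obtain a predicted satisfaction-index baseline with an uncertainty interval:

    ```python
    # Minimal Monte Carlo sketch: propagate uncertainty in survey driver scores to a
    # predicted index score (weights and distributions are invented assumptions).
    import numpy as np

    rng = np.random.default_rng(4)
    N = 100_000

    # Hypothetical driver scores (0-100 scale) with assumed uncertainty
    quality = rng.normal(82, 4, N)
    expectations = rng.normal(78, 5, N)
    value = rng.normal(75, 6, N)

    # Hypothetical linear weighting of drivers into an index score
    weights = np.array([0.5, 0.2, 0.3])
    index = np.column_stack([quality, expectations, value]) @ weights

    print(f"predicted index: mean = {index.mean():.1f}, "
          f"90% interval = [{np.percentile(index, 5):.1f}, {np.percentile(index, 95):.1f}]")
    ```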

  12. Comparison of symptomatology and performance degradation for motion and radiation sickness. Technical report, 6 January 1984-31 March 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClellan, G.E.; Wiker, S.F.

    1985-05-31

    This report quantifies for the first time the relationship between the signs and symptoms of acute radiation sickness and those of motion sickness. With this relationship, a quantitative comparison is made between data on human performance degradation during motion sickness and estimates of performance degradation during radiation sickness. The comparison validates estimates made by the Intermediate Dose Program on the performance degradation from acute radiation sickness.

  13. Leadership in Dental Hygiene Degree Completion Programs: A Pilot Study Comparing Stand-Alone Leadership Courses and Leadership-Infused Curricula.

    PubMed

    Smith, Michelle L; Gurenlian, JoAnn R; Freudenthal, Jacqueline J; Farnsworth, Tracy J

    2016-05-01

    The aim of this study was to define the extent to which leadership and leadership skills are taught in dental hygiene degree completion programs by comparing stand-alone leadership courses/hybrid programs with programs that infuse leadership skills throughout the curricula. The study involved a mixed-methods approach using qualitative and quantitative data. Semi-structured interviews were conducted with program directors and faculty members who teach a stand-alone leadership course, a hybrid program, or leadership-infused courses in these programs. A quantitative comparison of course syllabi determined differences in the extent of leadership content and experiences between stand-alone leadership courses and leadership-infused curricula. Of the 53 U.S. dental hygiene programs that offer degree completion programs, 49 met the inclusion criteria, and 19 programs provided course syllabi. Of the program directors and faculty members who teach a stand-alone leadership course or leadership-infused curriculum, 16 participated in the interview portion of the study. The results suggested that competencies related to leadership were not clearly defined or measurable in current teaching. Reported barriers to incorporating a stand-alone leadership course included overcrowded curricula, limited qualified faculty, and lack of resources. The findings of this study provide a synopsis of leadership content and gaps in leadership education for degree completion programs. Suggested changes included defining a need for leadership competencies and providing additional resources to educators such as courses provided by the American Dental Education Association and the American Dental Hygienists' Association.

  14. SU-E-I-15: Quantitative Evaluation of Dose Distributions From Axial, Helical and Cone-Beam CT Imaging by Measurement Using a Two-Dimensional Diode-Array Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chacko, M; Aldoohan, S; Sonnad, J

    2015-06-15

    Purpose: To quantitatively evaluate dose distributions from helical, axial and cone-beam CT clinical imaging techniques by measurement using a two-dimensional (2D) diode-array detector. Methods: 2D dose distributions from selected clinical protocols used for axial, helical and cone-beam CT imaging were measured using a diode-array detector (MapCheck2). The MapCheck2 is composed of solid-state diode detectors arranged in horizontal and vertical lines with a spacing of 10 mm. A GE LightSpeed CT simulator was used to acquire axial and helical CT images, and a kV on-board imager integrated with a Varian TrueBeam-STx machine was used to acquire cone-beam CT (CBCT) images. Results: The dose distributions from axial, helical and cone-beam CT were non-uniform over the region of interest, with strong spatial and angular dependence. In axial CT, a large dose gradient was measured that decreased from the lateral sides to the middle of the phantom, due to the large superficial dose at the sides of the phantom in comparison with the larger beam attenuation at the center. The dose decreased at the superior and inferior regions in comparison to the center of the phantom in axial CT. An asymmetry was found between the right-left and superior-inferior sides of the phantom, possibly due to angular dependence in the dose distributions. The dose level and distribution varied from one imaging technique to another. For the pelvis technique, axial CT deposited a mean dose of 3.67 cGy, helical CT deposited a mean dose of 1.59 cGy, and CBCT deposited a mean dose of 1.62 cGy. Conclusions: MapCheck2 provides a robust tool to directly measure 2D dose distributions for CT imaging with high-spatial-resolution detectors, in comparison with an ionization chamber, which provides a single-point measurement or an average dose to the phantom. The dose distributions measured with MapCheck2 account for medium heterogeneity and can represent patient-specific dose.

  15. CCQM-K126: low polarity organic in water: carbamazepine in surface water

    NASA Astrophysics Data System (ADS)

    Wai-mei Sin, Della; Wong, Yiu-chung; Lehmann, Andreas; Schneider, Rudolf J.; Kakoulides, Elias; Tang Lin, Teo; Qinde, Liu; Cabillic, Julie; Lardy-fontan, Sophie; Nammoonnoy, Jintana; Prevoo-Franzsen, Désirée; López, Eduardo Emilio; Alberti, Cecilia; Su, Fuhai

    2017-01-01

    The key comparison CCQM-K126, low polarity organic in water: carbamazepine in surface water, was coordinated by the Government Laboratory Hong Kong under the auspices of the Organic Analysis Working Group (OAWG) of the Comité Consultatif pour la Quantité de Matière (CCQM). Eight National Metrology Institutes or Designated Institutes participated, and participants were requested to report the mass fraction of carbamazepine in a surface water study material. The surface water sample was collected in Hong Kong and was gravimetrically spiked with a standard solution. This study provided the means for assessing measurement capabilities for the determination of low molecular weight (mass range 100-500), low polarity (pKOW <= -2) analytes in an aqueous matrix. Nine NMIs/DIs registered for the key comparison and one withdrew before results were submitted. Nine results were submitted by the eight participants. Eight results applied an LC-MS/MS method with an Isotope Dilution Mass Spectrometry approach for quantification. BAM additionally submitted a result from ELISA that was not included in the key comparison reference value (KCRV) calculation and is provided in the report for information. One further result was not included because the participant withdrew it from the calculation after further analysis. The KCRV was assigned as the median of the seven remaining results, 250.2 ng/kg, with a combined standard uncertainty of 3.6 ng/kg. The k-factor for the estimation of the expanded uncertainty of the KCRV was chosen as k = 2. The degree of equivalence with the KCRV and its uncertainty was calculated for each result. Seven of the participants were able to demonstrate the ability to quantitatively determine a low-polarity analyte in an aqueous matrix at a very low level by applying the LC-MS/MS technique.

  16. Quantitative Comparison of Three Standardization Methods Using a One-Way ANOVA for Multiple Mean Comparisons

    ERIC Educational Resources Information Center

    Barrows, Russell D.

    2007-01-01

    A one-way ANOVA experiment is performed to determine whether or not the three standardization methods are statistically different in determining the concentration of the three paraffin analytes. The laboratory exercise asks students to combine the three methods in a single analytical procedure of their own design to determine the concentration of…
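
    A minimal sketch of the one-way ANOVA comparison of three standardization methods; the method names and concentration values are placeholders for the student-generated data the exercise calls for:

    ```python
    # One-way ANOVA across three calibration/standardization methods (synthetic
    # analyte concentrations; method names are assumptions for illustration).
    from scipy.stats import f_oneway

    external_standard  = [10.2, 10.5, 10.1, 10.4, 10.3]
    internal_standard  = [10.6, 10.8, 10.5, 10.7, 10.6]
    standard_additions = [10.3, 10.4, 10.2, 10.5, 10.3]

    f_stat, p_value = f_oneway(external_standard, internal_standard, standard_additions)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    # A small p-value would indicate that at least one method yields a different mean.
    ```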

  17. Longitudinal evaluation of corticospinal tract in patients with resected brainstem cavernous malformations using high-definition fiber tractography and diffusion connectometry analysis: preliminary experience.

    PubMed

    Faraji, Amir H; Abhinav, Kumar; Jarbo, Kevin; Yeh, Fang-Cheng; Shin, Samuel S; Pathak, Sudhir; Hirsch, Barry E; Schneider, Walter; Fernandez-Miranda, Juan C; Friedlander, Robert M

    2015-11-01

    Brainstem cavernous malformations (CMs) are challenging due to a higher symptomatic hemorrhage rate and potential morbidity associated with their resection. The authors aimed to preoperatively define the relationship of CMs to the perilesional corticospinal tracts (CSTs) by obtaining qualitative and quantitative data using high-definition fiber tractography. These data were examined postoperatively by using longitudinal scans and in relation to patients' symptomatology. The extent of involvement of the CST was further evaluated longitudinally using the automated "diffusion connectometry" analysis. Fiber tractography was performed with DSI Studio using a quantitative anisotropy (QA)-based generalized deterministic tracking algorithm. Qualitatively, CST was classified as being "disrupted" and/or "displaced." Quantitative analysis involved obtaining mean QA values for the CST and its perilesional and nonperilesional segments. The contralateral CST was used for comparison. Diffusion connectometry analysis included comparison of patients' data with a template from 90 normal subjects. Three patients (mean age 22 years) with symptomatic pontomesencephalic hemorrhagic CMs and varying degrees of hemiparesis were identified. The mean follow-up period was 37.3 months. Qualitatively, CST was partially disrupted and displaced in all. Direction of the displacement was different in each case and progressively improved corresponding with the patient's neurological status. No patient experienced neurological decline related to the resection. The perilesional mean QA percentage decreases supported tract disruption and decreased further over the follow-up period (Case 1, 26%-49%; Case 2, 35%-66%; and Case 3, 63%-78%). Diffusion connectometry demonstrated rostrocaudal involvement of the CST consistent with the quantitative data. Hemorrhagic brainstem CMs can disrupt and displace perilesional white matter tracts with the latter occurring in unpredictable directions. This requires the use of tractography to accurately define their orientation to optimize surgical entry point, minimize morbidity, and enhance neurological outcomes. Observed anisotropy decreases in the perilesional segments are consistent with neural injury following hemorrhagic insults. A model using these values in different CST segments can be used to longitudinally monitor its craniocaudal integrity. Diffusion connectometry is a complementary approach providing longitudinal information on the rostrocaudal involvement of the CST.

  18. Quantification of HCV RNA in Liver Tissue by bDNA Assay.

    PubMed

    Dailey, P J; Collins, M L; Urdea, M S; Wilber, J C

    1999-01-01

    With this statement, Sherlock and Dooley have described two of the three major challenges involved in quantitatively measuring any analyte in tissue samples: the distribution of the analyte in the tissue; and the standard of reference, or denominator, with which to make comparisons between tissue samples. The third challenge for quantitative measurement of an analyte in tissue is to ensure reproducible and quantitative recovery of the analyte on extraction from tissue samples. This chapter describes a method that can be used to measure HCV RNA quantitatively in liver biopsy and tissue samples using the bDNA assay. All three of these challenges-distribution, denominator, and recovery-apply to the measurement of HCV RNA in liver biopsies.

  19. Big fish in a big pond: a study of academic self concept in first year medical students.

    PubMed

    Jackman, Kirsty; Wilson, Ian G; Seaton, Marjorie; Craven, Rhonda G

    2011-07-27

    Big-fish-little-pond effect (BFLPE) research has demonstrated that students in high-ability environments have lower academic self-concepts than equally able students in low-ability settings. Research has shown low academic self-concepts to be associated with negative educational outcomes. Social comparison processes have been implicated as fundamental to the BFLPE. Twenty first-year students in an Australian medical school completed a survey that included academic self-concept and social comparison measures, before and after their first written assessments. Focus groups were also conducted with a separate group of students to explore students' perceptions of competence, the medical school environment, and social comparison processes. The quantitative study did not reveal any changes in academic self-concept or self-evaluation. The qualitative study suggested that the attributions that students used when discussing performance were those that have been demonstrated to negatively affect self-concept. Students reported that the environment was slightly competitive and they used social comparison to evaluate their performance. Although the BFLPE was not evident in the quantitative study, results from the qualitative study suggest that the BFLPE might be operating, in that students were using attributions that are associated with lower self-concepts, the environment was slightly competitive, and social comparisons were used for evaluation.

  20. The Impact of a Virtual Public Charter School Program on the Learning Outcomes of Students with Disabilities: A Quantitative Study

    ERIC Educational Resources Information Center

    Epps, Sucari

    2017-01-01

    This quantitative study investigated the learning outcomes of students with disabilities in comparison to their non-disabled peers in a TK-12th grade school that offers a sixth-twelfth grade virtual public charter school program that currently serves students in the state of California. No differences were found between groups indicating…

  1. Comparison of Maximum Likelihood Estimation Approach and Regression Approach in Detecting Quantitative Trait Loci Using RAPD Markers

    Treesearch

    Changren Weng; Thomas L. Kubisiak; C. Dana Nelson; James P. Geaghan; Michael Stine

    1999-01-01

    Single marker regression and single marker maximum likelihood estimation were used to detect quantitative trait loci (QTLs) controlling the early height growth of longleaf pine and slash pine using a ((longleaf pine x slash pine) x slash pine) BC1 population consisting of 83 progeny. Maximum likelihood estimation was found to be more powerful than regression and could...

  2. Robust and transferable quantification of NMR spectral quality using IROC analysis

    NASA Astrophysics Data System (ADS)

    Zambrello, Matthew A.; Maciejewski, Mark W.; Schuyler, Adam D.; Weatherby, Gerard; Hoch, Jeffrey C.

    2017-12-01

    Non-Fourier methods are increasingly utilized in NMR spectroscopy because of their ability to handle nonuniformly-sampled data. However, non-Fourier methods present unique challenges due to their nonlinearity, which can produce nonrandom noise and render conventional metrics for spectral quality such as signal-to-noise ratio unreliable. The lack of robust and transferable metrics (i.e. applicable to methods exhibiting different nonlinearities) has hampered comparison of non-Fourier methods and nonuniform sampling schemes, preventing the identification of best practices. We describe a novel method, in situ receiver operating characteristic analysis (IROC), for characterizing spectral quality based on the Receiver Operating Characteristic curve. IROC utilizes synthetic signals added to empirical data as "ground truth", and provides several robust scalar-valued metrics for spectral quality. This approach avoids problems posed by nonlinear spectral estimates, and provides a versatile quantitative means of characterizing many aspects of spectral quality. We demonstrate applications to parameter optimization in Fourier and non-Fourier spectral estimation, critical comparison of different methods for spectrum analysis, and optimization of nonuniform sampling schemes. The approach will accelerate the discovery of optimal approaches to nonuniform sampling experiment design and non-Fourier spectrum analysis for multidimensional NMR.
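
    The sketch below is not the authors' IROC implementation; it only illustrates the underlying idea on synthetic data: signals injected at known positions act as ground truth, and the ROC AUC scores how well spectral intensity separates true signal positions from empty regions:

    ```python
    # ROC-based spectral quality scoring against injected synthetic "ground truth"
    # signals (generic illustration, not the published IROC method).
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(5)
    n_points = 2048
    spectrum = rng.normal(scale=1.0, size=n_points)          # baseline "noise" spectrum

    true_positions = rng.choice(n_points, size=20, replace=False)
    spectrum[true_positions] += rng.uniform(3, 8, size=20)   # injected synthetic signals

    labels = np.zeros(n_points, dtype=int)
    labels[true_positions] = 1
    auc = roc_auc_score(labels, spectrum)                    # 1.0 = perfect recovery
    print(f"ROC AUC of the spectral estimate: {auc:.3f}")
    ```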

  3. Statistical differences between relative quantitative molecular fingerprints from microbial communities.

    PubMed

    Portillo, M C; Gonzalez, J M

    2008-08-01

    Molecular fingerprints of microbial communities are a common method for the analysis and comparison of environmental samples. The significance of differences between microbial community fingerprints was analyzed considering the presence of different phylotypes and their relative abundance. A method is proposed that simulates coverage of the analyzed communities as a function of sampling size and applies a Cramér-von Mises statistic. Comparisons were performed by a Monte Carlo testing procedure. As an example, this procedure was used to compare several sediment samples from freshwater ponds using a relative quantitative PCR-DGGE profiling technique. The method was able to discriminate among different samples based on their molecular fingerprints, and confirmed the lack of differences between aliquots from a single sample.
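
    A generic stand-in for the proposed procedure (not the paper's exact implementation): a two-sample Cramér-von Mises statistic evaluated against a Monte Carlo permutation distribution, applied to synthetic relative-abundance fingerprints:

    ```python
    # Monte Carlo (permutation) comparison of two relative-abundance fingerprints
    # using a two-sample Cramér-von Mises statistic (synthetic band intensities).
    import numpy as np
    from scipy.stats import cramervonmises_2samp

    rng = np.random.default_rng(6)
    sample_a = rng.dirichlet(np.ones(25)) * 100      # relative band intensities, sample A
    sample_b = rng.dirichlet(np.ones(25) * 2) * 100  # sample B with different evenness

    observed = cramervonmises_2samp(sample_a, sample_b).statistic

    pooled = np.concatenate([sample_a, sample_b])
    n_perm, count = 2000, 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        stat = cramervonmises_2samp(pooled[:25], pooled[25:]).statistic
        count += stat >= observed
    print(f"Monte Carlo p-value: {(count + 1) / (n_perm + 1):.3f}")
    ```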

  4. Mapping of thermal injury in biologic tissues using quantitative pathologic techniques

    NASA Astrophysics Data System (ADS)

    Thomsen, Sharon L.

    1999-05-01

    Qualitative and quantitative pathologic techniques can be used for (1) mapping of thermal injury, (2) comparison of lesion sizes and configurations for different instruments or heating sources, and (3) comparison of treatment effects. Concentric zones of thermal damage form around a single volume heat source. The boundaries between some of these zones are distinct and measurable. Depending on the energy deposition, heating times and tissue type, the zones can include the following, beginning at the hotter center and progressing to the cooler periphery: (1) tissue ablation, (2) carbonization, (3) tissue water vaporization, (4) structural protein denaturation (thermal coagulation), (5) vital enzyme protein denaturation, (6) cell membrane disruption, (7) hemorrhage, hemostasis and hyperemia, (8) tissue necrosis, and (9) wound organization and healing.

  5. Creation of a bovine herpes virus 1 (BoHV-1) quantitative particle standard by transmission electron microscopy and comparison with established standards for use in real-time PCR.

    PubMed

    Hoferer, Marc; Braun, Anne; Sting, Reinhard

    2017-07-01

    Standards are pivotal for pathogen quantification by real-time PCR (qPCR); however, the creation of a complete and universally applicable virus particle standard is challenging. In the present study, a procedure based on purification of bovine herpes virus type 1 (BoHV-1) and subsequent quantification by transmission electron microscopy (TEM) is described. Accompanying quantitative quality controls of the TEM preparation procedure using qPCR yielded recovery rates of more than 95% of the BoHV-1 virus particles on the grid used for virus counting, which was attributed to pre-treatment of the grid with 5% bovine albumin. To assess the value of the new virus particle standard for use in qPCR, virus counter-based quantification and established pure DNA standards, represented by a plasmid and an oligonucleotide, were included for comparison. It could be shown that the numbers of virus particles, plasmid and oligonucleotide equivalents were within one log10 range, as determined on the basis of standard curves, indicating that the different approaches provide comparable quantitative values. However, only virus particles represent a complete, universally applicable quantitative virus standard that meets the high requirements of an RNA and DNA virus gold standard. In contrast, standards based on pure DNA have to be considered sub-standard due to their limited applications. Copyright © 2017 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
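
    Quantification against a standard curve, as used for the qPCR comparisons above, can be sketched as follows; the Ct values and copy numbers are synthetic, not data from the study:

    ```python
    # Fit a qPCR standard curve (Ct vs log10 copies) for a dilution series, then
    # back-calculate an unknown sample (all numbers are illustrative assumptions).
    import numpy as np

    log10_copies = np.array([3, 4, 5, 6, 7], dtype=float)     # standard dilution series
    ct_values    = np.array([30.1, 26.8, 23.4, 20.2, 16.9])   # measured Ct (synthetic)

    slope, intercept = np.polyfit(log10_copies, ct_values, 1)
    efficiency = 10 ** (-1 / slope) - 1
    print(f"slope = {slope:.2f}, PCR efficiency = {efficiency:.0%}")

    unknown_ct = 22.0
    estimated_copies = 10 ** ((unknown_ct - intercept) / slope)
    print(f"estimated copies in unknown: {estimated_copies:.2e}")
    ```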

  6. Efficacy of Pitfall Trapping, Winkler and Berlese Extraction Methods for Measuring Ground-Dwelling Arthropods in Moist-Deciduous Forests in the Western Ghats

    PubMed Central

    Sabu, Thomas K.; Shiju, Raj T.

    2010-01-01

    The present study provides data to decide on the most appropriate method for sampling ground-dwelling arthropods in a moist-deciduous forest in the Western Ghats in South India. The abundance of ground-dwelling arthropods was compared among large numbers of samples obtained using pitfall trapping, Berlese and Winkler extraction methods. The highest abundance and frequency of most of the represented taxa indicated pitfall trapping as the ideal method for sampling ground-dwelling arthropods. However, with its possible bias towards surface-active taxa, pitfall-trapping data are inappropriate for quantitative studies, and Berlese extraction is the better alternative. Berlese extraction is a better method for quantitative measurements than the other two methods, whereas pitfall trapping would be appropriate for qualitative measurements. A comparison of the Berlese and Winkler extraction data shows that, in a quantitative multigroup approach, Winkler extraction was inferior to Berlese extraction because the total number of arthropods caught was the lowest and many of the taxa that were caught from an identical sample via the Berlese extraction method were not caught. A significantly greater frequency and higher abundance of arthropods belonging to Orthoptera, Blattaria, and Diptera occurred in pitfall-trapped samples, and of Psocoptera and Acariformes in Berlese-extracted samples, than were obtained with the other methods, indicating that both methods are useful, one complementing the other, eliminating a chance for possible under-representation of taxa in quantitative studies. PMID:20673122

  7. Genetic dissection of the maize (Zea mays L.) MAMP response.

    PubMed

    Zhang, Xinye; Valdés-López, Oswaldo; Arellano, Consuelo; Stacey, Gary; Balint-Kurti, Peter

    2017-06-01

    Loci associated with variation in maize responses to two microbe-associated molecular patterns (MAMPs) were identified. MAMP responses were correlated. No relationship between MAMP responses and quantitative disease resistance was identified. Microbe-associated molecular patterns (MAMPs) are highly conserved molecules commonly found in microbes that can be recognized by plant pattern recognition receptors. Recognition triggers a suite of responses including the production of reactive oxygen species (ROS) and nitric oxide (NO) and expression changes of defense-related genes. In this study, we used two well-studied MAMPs (flg22 and chitooctaose) to challenge different maize lines to determine whether there was variation in the level of responses to these MAMPs, to dissect the genetic basis underlying that variation, and to understand the relationship between MAMP response and quantitative disease resistance (QDR). Naturally occurring quantitative variation in ROS and NO production and defense gene expression levels triggered by MAMPs was observed. A major quantitative trait locus (QTL) associated with variation in the ROS production response to both flg22 and chitooctaose was identified on chromosome 2 in a recombinant inbred line (RIL) population derived from the maize inbred lines B73 and CML228. Minor QTLs associated with variation in the flg22 ROS response were identified on chromosomes 1 and 4. Comparison of these results with data previously obtained for variation in QDR and the defense response in the same RIL population did not provide any evidence for a common genetic basis controlling variation in these traits.

  8. Quantitative impedimetric monitoring of cell migration under the stimulation of cytokine or anti-cancer drug in a microfluidic chip

    PubMed Central

    Xiao, Xia; Lei, Kin Fong; Huang, Chia-Hao

    2015-01-01

    Cell migration is a cellular response involved in various biological processes such as cancer metastasis, which is the primary cause of death for cancer patients. Quantitative investigation of the correlation between cell migration and extracellular stimulation is essential for developing effective therapeutic strategies for controlling invasive cancer cells. The conventional method of determining cell migration rate, based on comparison of successive images, may not be an objective approach. In this work, a microfluidic chip embedded with measurement electrodes has been developed to quantitatively monitor cell migration activity based on the impedimetric measurement technique. A damage-free wound was constructed using a microfluidic phenomenon, and cell migration activity under the stimulation of a cytokine and an anti-cancer drug, i.e., interleukin-6 and doxorubicin, respectively, was investigated. Impedance measurement was performed concurrently during the cell migration process. The impedance change was directly correlated to the cell migration activity; therefore, the migration rate could be calculated. In addition, a good match was found between impedance measurement and conventional imaging analysis, but the impedimetric measurement technique provides an objective and quantitative measurement. Based on our technique, cell migration rates were calculated to be 8.5, 19.1, and 34.9 μm/h under stimulation with cytokine at concentrations of 0 (control), 5, and 10 ng/ml. This technique has high potential to be developed into a powerful analytical platform for cancer research. PMID:26180566
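
    A hedged sketch of converting an impedance read-out into a migration rate, under the simplifying assumption (ours, not necessarily the authors') that normalized impedance recovery is proportional to the re-covered fraction of a wound of known width:

    ```python
    # Estimate a migration rate from a monotonic impedance time course
    # (synthetic values; wound width and proportionality are assumptions).
    import numpy as np

    wound_width_um = 500.0                               # assumed initial wound width
    time_h = np.array([0, 4, 8, 12, 16, 20, 24], dtype=float)
    impedance_norm = np.array([0.0, 0.14, 0.27, 0.41, 0.55, 0.68, 0.82])  # 0 = open wound, 1 = confluent

    gap_um = wound_width_um * (1 - impedance_norm)       # remaining cell-free width
    closure_rate, _ = np.polyfit(time_h, gap_um, 1)      # negative slope: gap is closing
    print(f"migration rate per front: {abs(closure_rate) / 2:.1f} um/h")  # two fronts close the wound
    ```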

  9. ICP-MS Analysis of Lanthanide-Doped Nanoparticles as a Non-Radiative, Multiplex Approach to Quantify Biodistribution and Blood Clearance

    PubMed Central

    Crayton, Samuel H.; Elias, Andrew; Al-Zaki, Ajlan; Cheng, Zhiliang; Tsourkas, Andrew

    2011-01-01

    Recent advances in material science and chemistry have led to the development of nanoparticles with diverse physicochemical properties, e.g. size, charge, shape, and surface chemistry. Evaluating which physicochemical properties are best for imaging and therapeutic studies is challenging not only because of the multitude of samples to evaluate, but also because of the large experimental variability associated with in vivo studies (e.g. differences in tumor size, injected dose, subject weight, etc.). To address this issue, we have developed a lanthanide-doped nanoparticle system and analytical method that allows for the quantitative comparison of multiple nanoparticle compositions simultaneously. Specifically, superparamagnetic iron oxide (SPIO) with a range of different sizes and charges were synthesized, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectroscopy (ICP-MS) was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood samples and the resected tumor and organs. The method proved generalizable to other nanoparticle platforms, including dendrimers, liposomes, and polymersomes. This approach provides a simple, cost-effective, and non-radiative method to quantitatively compare tumor localization, biodistribution, and blood clearance of more than 10 nanoparticle compositions simultaneously, removing subject-to-subject variability. PMID:22100983

  10. Comparison of high-performance liquid chromatography and supercritical fluid chromatography using evaporative light scattering detection for the determination of plasticizers in medical devices.

    PubMed

    Lecoeur, Marie; Decaudin, Bertrand; Guillotin, Yoann; Sautou, Valérie; Vaccher, Claude

    2015-10-23

    Recently, interest in supercritical fluid chromatography (SFC) has increased due to its high throughput and the development of new systems improving chromatographic performance. However, most papers have dealt with fundamental studies and chiral applications, and only a few works have described the validation process of SFC methods. Likewise, evaporative light scattering detection (ELSD) has been widely employed in liquid chromatography, but only a few recent works have presented its quantitative performance hyphenated with SFC apparatus. The present paper discusses the quantitative performance of SFC-ELSD compared to HPLC-ELSD for the determination of plasticizers (ATBC, DEHA, DEHT and TOTM) in PVC tubing used as medical devices. After the development of HPLC-ELSD, both methods were evaluated based on the total error approach using accuracy profiles. The results show that HPLC-ELSD was more precise than SFC-ELSD, but lower limits of quantitation were obtained by SFC. Hence, HPLC was validated within the ±10% acceptance limits, whereas SFC lacked the accuracy to quantify plasticizers. Finally, both methods were used to determine the composition of plasticized-PVC medical devices. Results demonstrated that SFC and HPLC, both hyphenated with ELSD, provided similar results. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Novel ionic liquid matrices for qualitative and quantitative detection of carbohydrates by matrix assisted laser desorption/ionization mass spectrometry.

    PubMed

    Zhao, Xiaoyong; Shen, Shanshan; Wu, Datong; Cai, Pengfei; Pan, Yuanjiang

    2017-09-08

    Analysis of carbohydrates by matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is still challenging, and researchers have been devoting themselves to the discovery of efficient matrices. In the present study, the design, synthesis, and qualitative and quantitative performance of non-derivative ionic liquid matrices (ILMs) are reported. DHB/N-methylaniline (N-MA) and DHB/N-ethylaniline (N-EA), which performed best for carbohydrate detection, were screened out. The limits of detection for oligosaccharides provided by DHB/N-MA and DHB/N-EA were as low as 10 fmol. DHB/N-MA and DHB/N-EA showed significantly higher ion generation efficiency than DHB. A comparison of the capacity to probe polysaccharides between these two ILMs and DHB also revealed their powerful potential. Their outstanding performance was probably due to lower proton affinities and stronger UV absorption at λ = 355 nm. Moreover, taking DHB/N-MA as an example, quantitative analysis of fructo-oligosaccharide mixtures extracted and identified from rice noodles was accomplished sensitively using an internal standard method. Overall, DHB/N-MA and DHB/N-EA exhibited excellent performance and might be significant candidates as carbohydrate matrices. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Retrobiosynthetic nuclear magnetic resonance analysis of amino acid biosynthesis and intermediary metabolism. Metabolic flux in developing maize kernels.

    PubMed

    Glawischnig, E; Gierl, A; Tomas, A; Bacher, A; Eisenreich, W

    2001-03-01

    Information on metabolic networks could provide the basis for the design of targets for metabolic engineering. To study metabolic flux in cereals, developing maize (Zea mays) kernels were grown in sterile culture on medium containing [U-(13)C(6)]glucose or [1,2-(13)C(2)]acetate. After growth, amino acids, lipids, and sitosterol were isolated from kernels as well as from the cobs, and their (13)C isotopomer compositions were determined by quantitative nuclear magnetic resonance spectroscopy. The highly specific labeling patterns were used to analyze the metabolic pathways leading to amino acids and the triterpene on a quantitative basis. The data show that serine is generated from phosphoglycerate, as well as from glycine. Lysine is formed entirely via the diaminopimelate pathway and sitosterol is synthesized entirely via the mevalonate route. The labeling data of amino acids and sitosterol were used to reconstruct the labeling patterns of key metabolic intermediates (e.g. acetyl-coenzyme A, pyruvate, phosphoenolpyruvate, erythrose 4-phosphate, and Rib 5-phosphate) that revealed quantitative information about carbon flux in the intermediary metabolism of developing maize kernels. Exogenous acetate served as an efficient precursor of sitosterol, as well as of amino acids of the aspartate and glutamate family; in comparison, metabolites formed in the plastidic compartments showed low acetate incorporation.

  13. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays.

    PubMed

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-11-01

    Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.

  14. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  15. Comparing quantitative values of two generations of laser-assisted indocyanine green dye angiography systems: can we predict necrosis?

    PubMed

    Phillips, Brett T; Fourman, Mitchell S; Rivara, Andrew; Dagum, Alexander B; Huston, Tara L; Ganz, Jason C; Bui, Duc T; Khan, Sami U

    2014-01-01

    Several devices exist today to assist the intraoperative determination of skin flap perfusion. Laser-Assisted Indocyanine Green Dye Angiography (LAICGA) has been shown to accurately predict mastectomy skin flap necrosis using quantitative perfusion values. The laser properties of the latest LAICGA device (SPY Elite) differ significantly from those of its predecessor system (SPY 2001), preventing direct translation of previously published data. The purpose of this study was to establish a mathematical relationship between the perfusion values of these 2 devices. Breast reconstruction patients were prospectively enrolled into a clinical trial where skin flap evaluation and excision were based on quantitative SPY Q values previously established in the literature. Initial study patients underwent mastectomy skin flap evaluation using both SPY systems simultaneously. Absolute perfusion unit (APU) values at identical locations on the breast were then compared graphically. 210 data points were identified on the same patients (n = 4) using both SPY systems. A linear relationship (y = 2.9883x + 12.726) was identified with a high level of correlation (R² = 0.744). Previously published values using the SPY 2001 (APU 3.7) corresponded to a value of 23.8 APU on the SPY Elite. In addition, postoperative necrosis in these patients correlated to regions of skin identified with the SPY Elite with APU less than 23.8. Intraoperative comparison of LAICGA systems has provided direct correlation of perfusion values predictive of necrosis that were previously established in the literature. An APU value of 3.7 from the SPY 2001 correlates to a SPY Elite APU value of 23.8.
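
    Applying the reported cross-calibration directly reproduces the quoted threshold conversion:

    ```python
    # Convert a SPY 2001 absolute perfusion unit (APU) threshold to its SPY Elite
    # equivalent using the linear relationship reported in the abstract.
    def spy2001_to_elite(apu_2001: float) -> float:
        return 2.9883 * apu_2001 + 12.726

    print(round(spy2001_to_elite(3.7), 1))   # -> 23.8, the necrosis threshold quoted above
    ```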

  16. Comparison of Quantitative Mass Spectrometry Platforms for Monitoring Kinase ATP Probe Uptake in Lung Cancer.

    PubMed

    Hoffman, Melissa A; Fang, Bin; Haura, Eric B; Rix, Uwe; Koomen, John M

    2018-01-05

    Recent developments in instrumentation and bioinformatics have led to new quantitative mass spectrometry platforms including LC-MS/MS with data-independent acquisition (DIA) and targeted analysis using parallel reaction monitoring mass spectrometry (LC-PRM), which provide alternatives to well-established methods, such as LC-MS/MS with data-dependent acquisition (DDA) and targeted analysis using multiple reaction monitoring mass spectrometry (LC-MRM). These tools have been used to identify signaling perturbations in lung cancers and other malignancies, supporting the development of effective kinase inhibitors and, more recently, providing insights into therapeutic resistance mechanisms and drug repurposing opportunities. However, detection of kinases in biological matrices can be challenging; therefore, activity-based protein profiling enrichment of ATP-utilizing proteins was selected as a test case for exploring the limits of detection of low-abundance analytes in complex biological samples. To examine the impact of different MS acquisition platforms, quantification of kinase ATP uptake following kinase inhibitor treatment was analyzed by four different methods: LC-MS/MS with DDA and DIA, LC-MRM, and LC-PRM. For discovery data sets, DIA increased the number of identified kinases by 21% and reduced missingness when compared with DDA. In this context, MRM and PRM were most effective at identifying global kinome responses to inhibitor treatment, highlighting the value of a priori target identification and manual evaluation of quantitative proteomics data sets. We compare results for a selected set of desthiobiotinylated peptides from PRM, MRM, and DIA and identify considerations for selecting a quantification method and postprocessing steps that should be used for each data acquisition strategy.

  17. Comparing Quantitative Values of Two Generations of Laser-Assisted Indocyanine Green Dye Angiography Systems: Can We Predict Necrosis?

    PubMed Central

    Phillips, Brett T.; Fourman, Mitchell S.; Rivara, Andrew; Dagum, Alexander B.; Huston, Tara L.; Ganz, Jason C.; Bui, Duc T.; Khan, Sami U.

    2014-01-01

    Objective: Several devices exist today to assist the intraoperative determination of skin flap perfusion. Laser-Assisted Indocyanine Green Dye Angiography (LAICGA) has been shown to accurately predict mastectomy skin flap necrosis using quantitative perfusion values. The laser properties of the latest LAICGA device (SPY Elite) differ significantly from its predecessor system (SPY 2001), preventing direct translation of previously published data. The purpose of this study was to establish a mathematical relationship of perfusion values between these 2 devices. Methods: Breast reconstruction patients were prospectively enrolled into a clinical trial where skin flap evaluation and excision was based on quantitative SPY Q values previously established in the literature. Initial study patients underwent mastectomy skin flap evaluation using both SPY systems simultaneously. Absolute perfusion unit (APU) values at identical locations on the breast were then compared graphically. Results: 210 data points were identified on the same patients (n = 4) using both SPY systems. A linear relationship (y = 2.9883x + 12.726) was identified with a high level of correlation (R² = 0.744). Previously published values using SPY 2001 (APU 3.7) provided a value of 23.8 APU on the SPY Elite. In addition, postoperative necrosis in these patients correlated to regions of skin identified with the SPY Elite with APU less than 23.8. Conclusion: Intraoperative comparison of LAICGA systems has provided direct correlation of perfusion values predictive of necrosis that were previously established in the literature. An APU value of 3.7 from the SPY 2001 correlates to a SPY Elite APU value of 23.8. PMID:25525483

  18. Indicators of Family Care for Development for Use in Multicountry Surveys

    PubMed Central

    Kariger, Patricia; Engle, Patrice; Britto, Pia M. Rebello; Sywulka, Sara M.; Menon, Purnima

    2012-01-01

    Indicators of family care for development are essential for ascertaining whether families are providing their children with an environment that leads to positive developmental outcomes. This project aimed to develop indicators from a set of items, measuring family care practices and resources important for caregiving, for use in epidemiologic surveys in developing countries. A mixed method (quantitative and qualitative) design was used for item selection and evaluation. Qualitative and quantitative analyses were conducted to examine the validity of candidate items in several country samples. Qualitative methods included the use of global expert panels to identify and evaluate the performance of each candidate item as well as in-country focus groups to test the content validity of the items. The quantitative methods included analyses of item-response distributions, using bivariate techniques. The selected items measured two family care practices (support for learning/stimulating environment and limit-setting techniques) and caregiving resources (adequacy of the alternate caregiver when the mother worked). Six play-activity items, indicative of support for learning/stimulating environment, were included in the core module of UNICEF's Multiple Indicator Cluster Survey 3. The other items were included in optional modules. This project provided, for the first time, a globally relevant set of items for assessing family care practices and resources in epidemiological surveys. These items have multiple uses, including national monitoring and cross-country comparisons of the status of family care for development. The information obtained will reinforce attention to efforts to improve support for children's development. PMID:23304914

  19. Quantitative Analysis of Science and Chemistry Textbooks for Indicators of Reform: A complementary perspective

    NASA Astrophysics Data System (ADS)

    Kahveci, Ajda

    2010-07-01

    In this study, multiple thematically based and quantitative analysis procedures were utilized to explore the effectiveness of Turkish chemistry and science textbooks in terms of their reflection of reform. The themes gender equity, questioning level, science vocabulary load, and readability level provided the conceptual framework for the analyses. An unobtrusive research method, content analysis, was used by coding the manifest content and counting the frequency of words, photographs, drawings, and questions by cognitive level. The context was an undergraduate chemistry teacher preparation program at a large public university in a metropolitan area in northwestern Turkey. Forty preservice chemistry teachers were guided to analyze 10 middle school science and 10 high school chemistry textbooks. Overall, the textbooks included unfair gender representations, a considerably higher number of input- and processing-level than output-level questions, and a high load of science terminology. The textbooks failed to provide sufficient empirical evidence to be considered gender equitable and inquiry-based. The quantitative approach employed for evaluation contrasts with a more interpretive approach, and has the potential to depict textbook profiles more reliably, complementing the commonly employed qualitative procedures. Implications suggest that further work along this line is needed to calibrate the analysis procedures with science textbooks used in different international settings. The procedures could be modified and improved to meet specific evaluation needs. In the Turkish context, next-step research may concern the analysis of science textbooks being rewritten for the reform-based curricula, to make cross-comparisons and evaluate a possible progression.

  20. A whole-cell bioreporter assay for quantitative genotoxicity evaluation of environmental samples.

    PubMed

    Jiang, Bo; Li, Guanghe; Xing, Yi; Zhang, Dayi; Jia, Jianli; Cui, Zhisong; Luan, Xiao; Tang, Hui

    2017-10-01

    Whole-cell bioreporters have emerged as promising tools for genotoxicity evaluation, due to their rapidity, cost-effectiveness, sensitivity and selectivity. In this study, a method for detecting genotoxicity in environmental samples was developed using the bioluminescent whole-cell bioreporter Escherichia coli recA::luxCDABE. To further test its performance in a real-world scenario, the E. coli bioreporter was applied in two cases: i) soil samples collected from chromium(VI) contaminated sites; ii) crude oil contaminated seawater collected after the Jiaozhou Bay oil spill which occurred in 2013. The chromium(VI) contaminated soils were pretreated by water extraction, and directly exposed to the bioreporter in two phases: aqueous soil extraction (water phase) and soil supernatant (solid phase). The results indicated that both extractable and soil-particle-fixed chromium(VI) were bioavailable to the bioreporter, and the solid-phase contact bioreporter assay provided a more precise evaluation of soil genotoxicity. For crude oil contaminated seawater, the response of the bioreporter clearly illustrated the spatial and temporal changes in genotoxicity around the spill site, suggesting that the crude oil degradation process decreased the genotoxic risk to the ecosystem. In addition, the performance of the bioreporter was simulated by a modified cross-regulation gene expression model, which quantitatively described the DNA damage response of the E. coli bioreporter. Accordingly, the bioluminescent response of the bioreporter was calculated as the mitomycin C equivalent, enabling quantitative comparison of genotoxicities between different environmental samples. This bioreporter assay provides a rapid and sensitive screening tool for direct genotoxicity assessment of environmental samples.

  1. Diagnostic accuracy of stress perfusion CMR in comparison with quantitative coronary angiography: fully quantitative, semiquantitative, and qualitative assessment.

    PubMed

    Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E

    2014-01-01

    This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). Fully quantitative stress perfusion CMR has high diagnostic accuracy for detecting obstructive coronary artery disease. QP outperforms semiquantitative measures of perfusion and qualitative methods that incorporate a combination of cine, perfusion, and late gadolinium enhancement imaging. These findings suggest a potential clinical role for quantitative stress perfusion CMR.
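
    To make the reported operating point concrete, the sketch below applies the QP criterion (endocardial flow below 50% of mean epicardial flow) to toy per-territory data and summarizes agreement with an angiographic reference as sensitivity and specificity; all names and numbers are illustrative assumptions, not data from the study.

```python
import numpy as np

def qp_positive(endo_flow, mean_epi_flow, threshold=0.5):
    """Flag a territory as ischemic when endocardial flow falls below
    `threshold` times the mean epicardial flow (the QP criterion above)."""
    return np.asarray(endo_flow) < threshold * np.asarray(mean_epi_flow)

def sensitivity_specificity(test_positive, disease_present):
    test_positive = np.asarray(test_positive, bool)
    disease_present = np.asarray(disease_present, bool)
    tp = np.sum(test_positive & disease_present)
    tn = np.sum(~test_positive & ~disease_present)
    fp = np.sum(test_positive & ~disease_present)
    fn = np.sum(~test_positive & disease_present)
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative toy data: territory flows and an angiographic reference (>=70% stenosis).
endo = np.array([0.9, 1.2, 0.4, 0.5, 1.1])
epi = np.array([1.5, 1.4, 1.3, 1.5, 1.4])
stenosis = np.array([False, False, True, True, False])
sens, spec = sensitivity_specificity(qp_positive(endo, epi), stenosis)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```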

  2. Towards Extending Forward Kinematic Models on Hyper-Redundant Manipulator to Cooperative Bionic Arms

    NASA Astrophysics Data System (ADS)

    Singh, Inderjeet; Lakhal, Othman; Merzouki, Rochdi

    2017-01-01

    Forward kinematics is a stepping stone towards finding an inverse solution and subsequently a dynamic model of a robot, so a study and comparison of various Forward Kinematic Models (FKMs) is necessary for robot design. This paper compares three FKMs on the same hyper-redundant Compact Bionic Handling Assistant (CBHA) manipulator under the same conditions, with the aim of informing the modeling of cooperative bionic manipulators. Two of the methods, the Arc Geometry HTM (Homogeneous Transformation Matrix) method and the Dual Quaternion method, are quantitative, while the third is a hybrid method that combines quantitative and qualitative approaches. The methods are compared theoretically, and experimental results are discussed to add further insight to the comparison. HTM, the most widely used and accepted technique, is taken as the reference, and the trajectory deviations of the other techniques are compared against it to determine which method most accurately captures the kinematic behavior of the CBHA under real-time control.
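
    For readers unfamiliar with HTM-based forward kinematics, the sketch below chains 4×4 homogeneous transforms for a continuum arm under the common piecewise-constant-curvature assumption; it is a generic illustration, not the paper's Arc Geometry HTM model of the CBHA, and all section parameters are made up.

```python
import numpy as np

def arc_htm(kappa, phi, length):
    """4x4 homogeneous transform of one constant-curvature section:
    curvature kappa (1/m), bending-plane angle phi (rad), arc length (m).
    A generic piecewise-constant-curvature approximation, not the CBHA-specific model."""
    theta = kappa * length                                  # total bending angle
    if abs(kappa) < 1e-9:                                   # straight-section limit
        p = np.array([0.0, 0.0, length])
    else:
        p = np.array([(1 - np.cos(theta)) / kappa, 0.0, np.sin(theta) / kappa])
    c, s = np.cos(theta), np.sin(theta)
    bend = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])     # rotation within the bending plane
    cp, sp = np.cos(phi), np.sin(phi)
    rot_phi = np.array([[cp, -sp, 0], [sp, cp, 0], [0, 0, 1]])  # orientation of the bending plane
    T = np.eye(4)
    T[:3, :3] = rot_phi @ bend @ rot_phi.T
    T[:3, 3] = rot_phi @ p
    return T

def forward_kinematics(sections):
    """Compose section transforms base-to-tip to obtain the end-effector pose."""
    T = np.eye(4)
    for kappa, phi, length in sections:
        T = T @ arc_htm(kappa, phi, length)
    return T

print(forward_kinematics([(2.0, 0.3, 0.2), (1.5, -0.5, 0.2)]))
```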

  3. Reduction of magnetic field fluctuations in powered magnets for NMR using inductive measurements and sampled-data feedback control.

    PubMed

    Li, Mingzhou; Schiano, Jeffrey L; Samra, Jenna E; Shetty, Kiran K; Brey, William W

    2011-10-01

    Resistive and hybrid (resistive/superconducting) magnets provide substantially higher magnetic fields than those available in low-temperature superconducting magnets, but their relatively low spatial homogeneity and temporal field fluctuations are unacceptable for high resolution NMR. While several techniques for reducing temporal fluctuations have demonstrated varying degrees of success, this paper restricts attention to methods that utilize inductive measurements and feedback control to actively cancel the temporal fluctuations. In comparison to earlier studies using analog proportional control, this paper shows that shaping the controller frequency response results in significantly higher reductions in temporal fluctuations. Measurements of temporal fluctuation spectra and the frequency response of the instrumentation that cancels the temporal fluctuations guide the controller design. In particular, we describe a sampled-data phase-lead-lag controller that utilizes the internal model principle to selectively attenuate magnetic field fluctuations caused by the power supply ripple. We present a quantitative comparison of the attenuation in temporal fluctuations afforded by the new design and a proportional control design. Metrics for comparison include measurements of the temporal fluctuations using Faraday induction and observations of the effect that the fluctuations have on nuclear resonance measurements.
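
    A minimal sketch of the general idea, a sampled-data lead-lag compensator acting on an inductively measured fluctuation signal, is given below; the pole/zero placement, gain, sample rate, and 60 Hz ripple model are illustrative assumptions and do not reproduce the controller designed in the paper.

```python
import numpy as np
from scipy import signal

fs = 10_000.0  # loop sample rate in Hz (assumed)
# Illustrative continuous-time lead-lag compensator:
# C(s) = K (s + z_lead)(s + z_lag) / ((s + p_lead)(s + p_lag))
K = 5.0
z_lead, p_lead = 2 * np.pi * 20, 2 * np.pi * 200
z_lag, p_lag = 2 * np.pi * 120, 2 * np.pi * 12
num = K * np.polymul([1.0, z_lead], [1.0, z_lag])
den = np.polymul([1.0, p_lead], [1.0, p_lag])
bz, az = signal.bilinear(num, den, fs)       # discretize (Tustin) for sampled-data use

# Apply the discrete controller to a toy fluctuation signal dominated by 60 Hz ripple.
t = np.arange(0, 1.0, 1 / fs)
ripple = 1e-6 * np.sin(2 * np.pi * 60 * t)   # inductively measured field fluctuation (toy)
correction = signal.lfilter(bz, az, ripple)  # correction waveform for the cancellation coil
print(correction[:5])
```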

  4. Hierarchical mutual information for the comparison of hierarchical community structures in complex networks

    NASA Astrophysics Data System (ADS)

    Perotti, Juan Ignacio; Tessone, Claudio Juan; Caldarelli, Guido

    2015-12-01

    The quest for a quantitative characterization of community and modular structure of complex networks has produced a variety of methods and algorithms to classify different networks. However, it is not clear if such methods provide consistent, robust, and meaningful results when considering hierarchies as a whole. Part of the problem is the lack of a similarity measure for the comparison of hierarchical community structures. In this work we contribute by introducing the hierarchical mutual information, which is a generalization of the traditional mutual information and makes it possible to compare hierarchical partitions and hierarchical community structures. The normalized version of the hierarchical mutual information should behave analogously to the traditional normalized mutual information. Here the correct behavior of the hierarchical mutual information is corroborated on an extensive battery of numerical experiments. The experiments are performed on artificial hierarchies and on the hierarchical community structure of artificial and empirical networks. Furthermore, the experiments illustrate some of the practical applications of the hierarchical mutual information, namely the comparison of different community detection methods and the study of the consistency, robustness, and temporal evolution of the hierarchical modular structure of networks.
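
    As a point of reference for the flat case that the hierarchical mutual information generalizes, the sketch below computes an ordinary normalized mutual information between two non-hierarchical partitions; the normalization by the mean entropy is one common convention and is not taken from the paper.

```python
import numpy as np
from collections import Counter

def normalized_mutual_information(labels_a, labels_b):
    """Flat NMI between two partitions of the same node set,
    normalized by the arithmetic mean of the two entropies."""
    n = len(labels_a)
    pa, pb = Counter(labels_a), Counter(labels_b)
    pab = Counter(zip(labels_a, labels_b))
    h_a = -sum((c / n) * np.log(c / n) for c in pa.values())
    h_b = -sum((c / n) * np.log(c / n) for c in pb.values())
    mi = sum((c / n) * np.log((c / n) / ((pa[a] / n) * (pb[b] / n)))
             for (a, b), c in pab.items())
    return mi / (0.5 * (h_a + h_b)) if (h_a + h_b) > 0 else 1.0

# Identical partitions of six nodes give NMI = 1; a partial relabeling gives less.
print(normalized_mutual_information([0, 0, 1, 1, 2, 2], [0, 0, 1, 1, 2, 2]))
print(normalized_mutual_information([0, 0, 1, 1, 2, 2], [0, 1, 1, 0, 2, 2]))
```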

  5. Validation metrics for turbulent plasma transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, C., E-mail: chholland@ucsd.edu

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. The utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak [J. L. Luxon, Nucl. Fusion 42, 614 (2002)], as part of a multi-year transport model validation activity.

  6. How weeds emerge: a taxonomic and trait-based examination using United States data

    PubMed Central

    Kuester, Adam; Conner, Jeffrey K; Culley, Theresa; Baucom, Regina S

    2014-01-01

    Weeds can cause great economic and ecological harm to ecosystems. Despite their importance, comparisons of the taxonomy and traits of successful weeds often focus on a few specific comparisons – for example, introduced versus native weeds. We used publicly available inventories of US plant species to make comprehensive comparisons of the factors that underlie weediness. We quantitatively examined taxonomy to determine if certain genera are overrepresented by introduced, weedy or herbicide-resistant species, and we compared phenotypic traits of weeds to those of nonweeds, whether introduced or native. We uncovered genera that have more weeds and introduced species than expected by chance and plant families that have more herbicide-resistant species than expected by chance. Certain traits, generally related to fast reproduction, were more likely to be associated with weedy plants regardless of species’ origins. We also found stress tolerance traits associated with either native or introduced weeds compared with native or introduced nonweeds. Weeds and introduced species have significantly smaller genomes than nonweeds and native species. These results support trends for weedy plants reported from other floras, suggest that native and introduced weeds have different stress adaptations, and provide a comprehensive survey of trends across weeds within the USA. PMID:24494694

  7. An environmental intervention aimed at increasing physical activity levels in low-income women.

    PubMed

    Speck, Barbara J; Hines-Martin, Vicki; Stetson, Barbara A; Looney, Stephen W

    2007-01-01

    Regular physical activity is a health promotion and disease prevention behavior. Of all demographic groups, low-income women report the lowest levels of physical activity. The purpose of this study was to test an intervention aimed at reducing community environmental barriers to physical activity in low-income women. The research design was mixed methodology: (1) quantitative (quasi-experimental, pretest-posttest, cohort design in which no treatment partitioning was possible) and (2) qualitative (focus groups). The setting was a church-sponsored community center centrally located in a low-income urban neighborhood. The comparison group was recruited first followed by the intervention group to control for setting. The sample consisted of 104 women (comparison group, n = 53; intervention group, n = 51) between the ages of 18 and 63 years who were residents of neighborhoods served by the community center. No between-group differences were found for physical activity behavior. Significant between-group differences in cholesterol (P = .007) and perception of physical activity (P = .033) were observed. Significant intervention group increases from pretest to posttest were found related to advanced registered nurse practitioner support, friend support, and more positive physical activity environment at the community center. Qualitative data supported and enriched the quantitative data. Physical activity levels were not significantly different between the groups. In a sample of low-income women who have multiple barriers, improving attitudes, expanding their knowledge of community resources, and providing physical activity opportunities in their neighborhoods are important intermediate steps toward initiation and maintenance of regular physical activity.

  8. Quantitative Proteomic Analysis Reveals Populus cathayana Females Are More Sensitive and Respond More Sophisticatedly to Iron Deficiency than Males.

    PubMed

    Zhang, Sheng; Zhang, Yunxiang; Cao, Yanchun; Lei, Yanbao; Jiang, Hao

    2016-03-04

    Previous studies have shown that there are significant sexual differences in the morphological and physiological responses of Populus cathayana Rehder to nitrogen and phosphorus deficiencies, but little is known about the sex-specific differences in responses to iron deficiency. In this study, the effects of iron deficiency on the morphology, physiology, and proteome of P. cathayana males and females were investigated. The results showed that iron deficiency (25 days) significantly decreased height growth, photosynthetic rate, chlorophyll content, and tissue iron concentration in both sexes. A comparison between the sexes indicated that iron-deficient males had less height inhibition and photosynthesis system II or chloroplast ultrastructural damage than iron-deficient females. iTRAQ-based quantitative proteomic analysis revealed that 144 and 68 proteins were decreased in abundance (e.g., proteins involved in photosynthesis, carbohydrate and energy metabolism, and gene expression regulation) and 78 and 39 proteins were increased in abundance (e.g., proteins involved in amino acid metabolism and stress response) according to the criterion of ratio ≥1.5 in females and males, respectively. A comparison between the sexes indicated that iron-deficient females exhibited a greater change in the proteins involved in photosynthesis, carbon and energy metabolism, the redox system, and stress responsive proteins. This study reveals females are more sensitive and have a more sophisticated response to iron deficiency compared with males and provides new insights into differential sexual responses to nutrient deficiency.

  9. Noninvasive radioisotopic technique for detection of platelet deposition in mitral valve prostheses and quantitation of visceral microembolism in dogs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dewanjee, M.K.; Fuster, V.; Rao, S.A.

    1983-05-01

    A noninvasive technique has been developed in the dog model for imaging, with a gamma camera, the platelet deposition on Bjoerk-Shiley mitral valve prostheses early postoperatively. At 25 hours after implantation of the prosthesis and 24 hours after intravenous administration of 400 to 500 microCi of platelets labeled with indium-111, the platelet deposition in the sewing ring and perivalvular cardiac tissue can be clearly delineated in a scintiphotograph. An in vitro technique was also developed for quantitation of visceral microemboli in brain, lungs, kidneys, and other tissues. Biodistribution of the labeled platelets was quantitated, and the tissue/blood radioactivity ratio was determined in 22 dogs in four groups: unoperated normal dogs, sham-operated dogs, prosthesis-implanted dogs, and prosthesis-implanted dogs treated with dipyridamole before and aspirin and dipyridamole immediately after operation. Fifteen to 20% of total platelets were consumed as a consequence of the surgical procedure. On quantitation, we found that platelet deposition on the components of the prostheses was significantly reduced in prosthesis-implanted animals treated with dipyridamole and aspirin when compared with prosthesis-implanted, untreated dogs. All prosthesis-implanted animals considered together had a twofold to fourfold increase in tissue/blood radioactivity ratio in comparison with unoperated and sham-operated animals, an indication that the viscera work as filters and trap platelet microemboli that are presumably produced in the region of the mitral valve prostheses. In the dog model, indium-111-labeled platelets thus provide a sensitive marker for noninvasive imaging of platelet deposition on mechanical mitral valve prostheses, in vitro evaluation of platelet microembolism in viscera, in vitro quantitation of surgical consumption of platelets, and evaluation of platelet-inhibitor drugs.

  10. Detection and quantitation of HPV in genital and oral tissues and fluids by real time PCR

    PubMed Central

    2010-01-01

    Background Human papillomaviruses (HPVs) remain a serious world health problem due to their association with anogenital/oral cancers and warts. While over 100 HPV types have been identified, a subset is associated with malignancy. HPV16 and 18 are the most prevalent oncogenic types, while HPV6 and 11 are most commonly responsible for anogenital warts. While other quantitative PCR (qPCR) assays detect oncogenic HPV, there is no single tube assay distinguishing the most frequent oncogenic types and the most common types found in warts. Results A SYBR Green-based qPCR assay was developed utilizing degenerate primers to the highly conserved HPV E1 gene, theoretically detecting any HPV type. A single tube multiplex qPCR assay was also developed using type-specific primer pairs and TaqMan probes that allowed for detection and quantitation of HPV6, 11, 16, and 18. Each HPV type was detected over a range from 2 × 10¹ to 2 × 10⁶ copies/reaction, providing a reliable method of quantitating type-specific HPV in 140 anogenital/cutaneous/oral benign and malignant specimens. 35 oncogenic and low-risk alpha-genus HPV types were detected. Concordance was detected in previously typed specimens. Comparisons to the gold standard detected an overall sensitivity of 89% (95% CI: 77% - 96%) and specificity of 90% (95% CI: 52% - 98%). Conclusion The qPCR assays described here showed good agreement with standard methods in identifying HPV types in previously typed malignancies. These novel qPCR assays will allow rapid detection and quantitation of HPVs to assess their role in viral pathogenesis. PMID:20723234

  11. A comparative analysis of human plasma and serum proteins by combining native PAGE, whole-gel slicing and quantitative LC-MS/MS: Utilizing native MS-electropherograms in proteomic analysis for discovering structure and interaction-correlated differences.

    PubMed

    Wen, Meiling; Jin, Ya; Manabe, Takashi; Chen, Shumin; Tan, Wen

    2017-12-01

    MS identification has long been used for PAGE-separated protein bands, but global and systematic quantitation utilizing MS after PAGE has remained rare and has not been reported for native PAGE. Here we report a new method combining native PAGE, whole-gel slicing and quantitative LC-MS/MS, aiming at comparative analysis not only of protein abundance but also of protein structures and interactions. A pair of human plasma and serum samples were used as test samples and separated on a native PAGE gel. Six lanes of each sample were cut; each lane was further sliced into thirty-five 1.1 mm × 1.1 mm squares and all the squares were subjected to standardized procedures of in-gel digestion and quantitative LC-MS/MS. The results comprised 958 data rows that each contained abundance values of a protein detected in one square in eleven gel lanes (one plasma lane excluded). The data were evaluated to have satisfactory reproducibility of assignment and quantitation. In total, 315 proteins were assigned, with each protein assigned in 1-28 squares. The abundance distributions in the plasma and serum gel lanes were reconstructed for each protein, termed "native MS-electropherograms". Comparison of the electropherograms revealed significant plasma-versus-serum differences for 33 proteins in 87 squares (fold difference > 2 or < 0.5, p < 0.05). Many of the differences matched accumulated knowledge on protein interactions and proteolysis involved in blood coagulation, complement and wound healing processes. We expect this method to provide more comprehensive information in comparative proteomic analysis, on both quantities and structures/interactions.
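
    The kind of per-protein, per-square comparison described (flagging fold differences > 2 or < 0.5 with p < 0.05 across replicate lanes) could be sketched as follows; the choice of Welch's t-test and the toy abundance values are assumptions, not the authors' exact statistical procedure.

```python
import numpy as np
from scipy import stats

def flag_difference(plasma, serum, fold_cut=2.0, alpha=0.05):
    """Compare replicate abundances of one protein in one gel square between
    plasma and serum lanes; flag if fold difference > 2 or < 0.5 and p < 0.05."""
    plasma, serum = np.asarray(plasma, float), np.asarray(serum, float)
    fold = plasma.mean() / serum.mean()
    _, p = stats.ttest_ind(plasma, serum, equal_var=False)  # Welch's t-test (assumed)
    significant = (fold > fold_cut or fold < 1.0 / fold_cut) and p < alpha
    return fold, p, significant

# Illustrative abundances for one protein in one square (5 plasma vs 6 serum lanes).
print(flag_difference([10.2, 11.0, 9.8, 10.5, 10.1], [4.1, 4.5, 3.9, 4.3, 4.0, 4.2]))
```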

  12. A Functional Model for the Integration of Gains and Losses under Risk: Implications for the Measurement of Subjective Value

    ERIC Educational Resources Information Center

    Viegas, Ricardo G.; Oliveira, Armando M.; Garriga-Trillo, Ana; Grieco, Alba

    2012-01-01

    In order to be treated quantitatively, subjective gains and losses (utilities/disutilities) must be psychologically measured. If legitimate comparisons are sought between them, measurement must be at least interval level, with a common unit. If comparisons of absolute magnitudes across gains and losses are further sought, as in standard…

  13. The Effects of Handwriting Instruction on Reading for Students in Grades 1 and 2

    ERIC Educational Resources Information Center

    Stroik, Linda R.

    2016-01-01

    The purpose of this quantitative quasi-experimental group comparison study using a repeated measures comparison group design with random assignment of subjects to groups was to investigate the effects of handwriting instruction on reading progress for learners in grade 1 and grade 2. At three points in time, the number of words each student read…

  14. Quantitative Analysis of the Shape of the Corpus Callosum in Patients with Autism and Comparison Individuals

    ERIC Educational Resources Information Center

    Casanova, Manuel F.; El-Baz, Ayman; Elnakib, Ahmed; Switala, Andrew E.; Williams, Emily L.; Williams, Diane L.; Minshew, Nancy J.; Conturo, Thomas E.

    2011-01-01

    Multiple studies suggest that the corpus callosum in patients with autism is reduced in size. This study attempts to elucidate the nature of this morphometric abnormality by analyzing the shape of this structure in 17 high-functioning patients with autism and an equal number of comparison participants matched for age, sex, IQ, and handedness. The…

  15. Computational understanding of Li-ion batteries

    NASA Astrophysics Data System (ADS)

    Urban, Alexander; Seo, Dong-Hwa; Ceder, Gerbrand

    2016-03-01

    Over the last two decades, computational methods have made tremendous advances, and today many key properties of lithium-ion batteries can be accurately predicted by first principles calculations. For this reason, computations have become a cornerstone of battery-related research by providing insight into fundamental processes that are not otherwise accessible, such as ionic diffusion mechanisms and electronic structure effects, as well as a quantitative comparison with experimental results. The aim of this review is to provide an overview of state-of-the-art ab initio approaches for the modelling of battery materials. We consider techniques for the computation of equilibrium cell voltages, 0-Kelvin and finite-temperature voltage profiles, ionic mobility and thermal and electrolyte stability. The strengths and weaknesses of different electronic structure methods, such as DFT+U and hybrid functionals, are discussed in the context of voltage and phase diagram predictions, and we review the merits of lattice models for the evaluation of finite-temperature thermodynamics and kinetics. With such a complete set of methods at hand, first principles calculations of ordered, crystalline solids, i.e., of most electrode materials and solid electrolytes, have become reliable and quantitative. However, the description of molecular materials and disordered or amorphous phases remains an important challenge. We highlight recent exciting progress in this area, especially regarding the modelling of organic electrolytes and solid-electrolyte interfaces.
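
    As a small example of the first of these quantities, the average intercalation voltage between two lithiation states follows from 0 K total energies through the standard relation V = -[E(Li_n·Host) - E(Host) - n·E(Li metal)] / n, with energies in eV per formula unit giving volts directly; the numbers below are toy values, not real DFT results.

```python
def average_voltage(e_lithiated, e_delithiated, n_li, e_li_metal=-1.90):
    """Average intercalation voltage (V) between two lithiation states from
    0 K total energies (eV per formula unit). The Li metal reference energy
    used here is an illustrative placeholder, not a computed value."""
    return -(e_lithiated - e_delithiated - n_li * e_li_metal) / n_li

# Toy energies for inserting one Li per formula unit into a host structure.
print(round(average_voltage(e_lithiated=-62.3, e_delithiated=-56.5, n_li=1), 2))  # ~3.9 V
```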

  16. White-Nose Syndrome Disease Severity and a Comparison of Diagnostic Methods.

    PubMed

    McGuire, Liam P; Turner, James M; Warnecke, Lisa; McGregor, Glenna; Bollinger, Trent K; Misra, Vikram; Foster, Jeffrey T; Frick, Winifred F; Kilpatrick, A Marm; Willis, Craig K R

    2016-03-01

    White-nose syndrome is caused by the fungus Pseudogymnoascus destructans and has killed millions of hibernating bats in North America but the pathophysiology of the disease remains poorly understood. Our objectives were to (1) assess non-destructive diagnostic methods for P. destructans infection compared to histopathology, the current gold-standard, and (2) to evaluate potential metrics of disease severity. We used data from three captive inoculation experiments involving 181 little brown bats (Myotis lucifugus) to compare histopathology, quantitative PCR (qPCR), and ultraviolet fluorescence as diagnostic methods of P. destructans infection. To assess disease severity, we considered two histology metrics (wing area with fungal hyphae, area of dermal necrosis), P. destructans fungal load (qPCR), ultraviolet fluorescence, and blood chemistry (hematocrit, sodium, glucose, pCO2, and bicarbonate). Quantitative PCR was most effective for early detection of P. destructans, while all three methods were comparable in severe infections. Correlations among hyphae and necrosis scores, qPCR, ultraviolet fluorescence, blood chemistry, and hibernation duration indicate a multi-stage pattern of disease. Disruptions of homeostasis occurred rapidly in late hibernation. Our results provide valuable information about the use of non-destructive techniques for monitoring, and provide novel insight into the pathophysiology of white-nose syndrome, with implications for developing and implementing potential mitigation strategies.

  17. Quantitative measurement and analysis for detection and treatment planning of developmental dysplasia of the hip

    NASA Astrophysics Data System (ADS)

    Liu, Xin; Lu, Hongbing; Chen, Hanyong; Zhao, Li; Shi, Zhengxing; Liang, Zhengrong

    2009-02-01

    Developmental dysplasia of the hip is a congenital hip joint malformation in which the proximal femur and acetabulum are subluxatable, dislocatable, or dislocated. Conventionally, physicians have made diagnoses and planned treatments based only on findings from two-dimensional (2D) images, manually calculating clinical parameters. However, the anatomical complexity of the disease and the limitations of current standard procedures make accurate diagnosis quite difficult. In this study, we developed a system that provides quantitative measurement of 3D clinical indexes based on computed tomography (CT) images. To extract bone structure from surrounding tissues more accurately, the system first segments the bone using a knowledge-based fuzzy clustering method, formulated by modifying the objective function of the standard fuzzy c-means algorithm with an additive adaptation penalty. The second part of the system automatically calculates the clinical indexes, which are extended from 2D to 3D for accurate description of the spatial relationship between the femurs and the acetabulum. To evaluate system performance, an experimental study of 22 patients with unilaterally or bilaterally affected hips was performed. The 3D acetabular index (AI) values automatically provided by the system were validated by comparison with 2D results measured manually by surgeons. The correlation between the two results was found to be 0.622 (p<0.01).
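
    For context, the sketch below implements the standard fuzzy c-means baseline whose objective function the knowledge-based method modifies; the additive adaptation penalty itself is omitted, and the 1D intensity example is purely illustrative.

```python
import numpy as np

def fuzzy_c_means(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Standard fuzzy c-means on scalar samples x (1D for brevity).
    The paper adds a knowledge-based penalty term to this objective;
    that modification is not included in this baseline sketch."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float).reshape(-1, 1)
    u = rng.dirichlet(np.ones(n_clusters), size=len(x))      # fuzzy membership matrix
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]        # weighted cluster centers
        d = np.abs(x - centers.T) + 1e-12                     # distances to each center
        inv = d ** (-2.0 / (m - 1))
        u = inv / inv.sum(axis=1, keepdims=True)              # membership update
    return centers.ravel(), u

centers, memberships = fuzzy_c_means([10, 12, 11, 95, 100, 98, 50], n_clusters=2)
print(centers)
```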

  18. The scientific challenges to forecasting and nowcasting the magnetospheric response to space weather (Invited)

    NASA Astrophysics Data System (ADS)

    Hesse, M.; Kuznetsova, M. M.; Birn, J.; Pulkkinen, A. A.

    2013-12-01

    Space weather is different from terrestrial weather in an essential way. Terrestrial weather has benefitted from a long history of research, which has led to a deep and detailed level of understanding. In comparison, space weather is scientifically in its infancy. Many key processes in the causal chains from processes on the Sun to space weather effects in various locations in the heliosphere remain either poorly understood or not understood at all. Space weather is therefore, and will remain in the foreseeable future, primarily a research field. Extensive further research efforts are needed before we can reasonably expect the precision and fidelity of weather forecasts. For space weather within the Earth's magnetosphere, the coupling between solar wind and magnetosphere is of crucial importance. While past research has provided answers, often at a qualitative level, to some of the most fundamental questions, answers to others, as well as the ability to predict quantitatively, remain elusive. This presentation will provide an overview of pertinent aspects of solar wind-magnetospheric coupling and its importance for space weather near the Earth, and it will analyze the state of our ability to describe and predict its efficiency. It will conclude with a discussion of research activities aimed at improving our ability to quantitatively forecast coupling processes.

  19. Hybrid, experimental and computational, investigation of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1996-07-01

    Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.

  20. A comparison of the Sensititre® MYCOTB panel and the agar proportion method for the susceptibility testing of Mycobacterium tuberculosis.

    PubMed

    Abuali, M M; Katariwala, R; LaBombardi, V J

    2012-05-01

    The agar proportion method (APM) for determining Mycobacterium tuberculosis susceptibilities is a qualitative method that requires 21 days in order to produce the results. The Sensititre method allows for a quantitative assessment. Our objective was to compare the accuracy, time to results, and ease of use of the Sensititre method to the APM. 7H10 plates in the APM and 96-well microtiter dry MYCOTB panels containing 12 antibiotics at full dilution ranges in the Sensititre method were inoculated with M. tuberculosis and read for colony growth. Thirty-seven clinical isolates were tested using both methods and 26 challenge strains of blinded susceptibilities were tested using the Sensititre method only. The Sensititre method displayed 99.3% concordance with the APM. The APM provided reliable results on day 21, whereas the Sensititre method displayed consistent results by day 10. The Sensititre method provides a more rapid, quantitative, and efficient method of testing both first- and second-line drugs when compared to the gold standard. It will give clinicians a sense of the degree of susceptibility, thus, guiding the therapeutic decision-making process. Furthermore, the microwell plate format without the need for instrumentation will allow its use in resource-poor settings.

  1. Quasiparticle Level Alignment for Photocatalytic Interfaces.

    PubMed

    Migani, Annapaola; Mowbray, Duncan J; Zhao, Jin; Petek, Hrvoje; Rubio, Angel

    2014-05-13

    Electronic level alignment at the interface between an adsorbed molecular layer and a semiconducting substrate determines the activity and efficiency of many photocatalytic materials. Standard density functional theory (DFT)-based methods have proven unable to provide a quantitative description of this level alignment. This requires a proper treatment of the anisotropic screening, necessitating the use of quasiparticle (QP) techniques. However, the computational complexity of QP algorithms has meant a quantitative description of interfacial levels has remained elusive. We provide a systematic study of a prototypical interface, bare and methanol-covered rutile TiO2(110) surfaces, to determine the type of many-body theory required to obtain an accurate description of the level alignment. This is accomplished via a direct comparison with metastable impact electron spectroscopy (MIES), ultraviolet photoelectron spectroscopy (UPS), and two-photon photoemission (2PP) spectroscopy. We consider GGA DFT, hybrid DFT, and G0W0, scQPGW1, scQPGW0, and scQPGW QP calculations. Our results demonstrate that G0W0, or our recently introduced scQPGW1 approach, are required to obtain the correct alignment of both the highest occupied and lowest unoccupied interfacial molecular levels (HOMO/LUMO). These calculations set a new standard in the interpretation of electronic structure probe experiments of complex organic molecule/semiconductor interfaces.

  2. Quantitative sampling of conformational heterogeneity of a DNA hairpin using molecular dynamics simulations and ultrafast fluorescence spectroscopy

    PubMed Central

    Voltz, Karine; Léonard, Jérémie; Touceda, Patricia Tourón; Conyard, Jamie; Chaker, Ziyad; Dejaegere, Annick; Godet, Julien; Mély, Yves; Haacke, Stefan; Stote, Roland H.

    2016-01-01

    Molecular dynamics (MD) simulations and time resolved fluorescence (TRF) spectroscopy were combined to quantitatively describe the conformational landscape of the DNA primary binding sequence (PBS) of the HIV-1 genome, a short hairpin targeted by retroviral nucleocapsid proteins implicated in the viral reverse transcription. Three 2-aminopurine (2AP) labeled PBS constructs were studied. For each variant, the complete distribution of fluorescence lifetimes covering 5 orders of magnitude in timescale was measured and the populations of conformers experimentally observed to undergo static quenching were quantified. A binary quantification permitted the comparison of populations from experimental lifetime amplitudes to populations of aromatically stacked 2AP conformers obtained from simulation. Both populations agreed well, supporting the general assumption that quenching of 2AP fluorescence results from pi-stacking interactions with neighboring nucleobases and demonstrating the success of the proposed methodology for the combined analysis of TRF and MD data. Cluster analysis of the latter further identified predominant conformations that were consistent with the fluorescence decay times and amplitudes, providing a structure-based rationalization for the wide range of fluorescence lifetimes. Finally, the simulations provided evidence of local structural perturbations induced by 2AP. The approach presented is a general tool to investigate fine structural heterogeneity in nucleic acid and nucleoprotein assemblies. PMID:26896800

  3. A Multi-Camera System for Bioluminescence Tomography in Preclinical Oncology Research

    PubMed Central

    Lewis, Matthew A.; Richer, Edmond; Slavine, Nikolai V.; Kodibagkar, Vikram D.; Soesbe, Todd C.; Antich, Peter P.; Mason, Ralph P.

    2013-01-01

    Bioluminescent imaging (BLI) of cells expressing luciferase is a valuable noninvasive technique for investigating molecular events and tumor dynamics in the living animal. Current usage is often limited to planar imaging, but tomographic imaging can enhance the usefulness of this technique in quantitative biomedical studies by allowing accurate determination of tumor size and attribution of the emitted light to a specific organ or tissue. Bioluminescence tomography based on a single camera with source rotation or mirrors to provide additional views has previously been reported. We report here in vivo studies using a novel approach with multiple rotating cameras that, when combined with image reconstruction software, provides the desired representation of point source metastases and other small lesions. Comparison with MRI validated the ability to detect lung tumor colonization in mouse lung. PMID:26824926

  4. Comparison of quantitative myocardial perfusion imaging CT to fluorescent microsphere-based flow from high-resolution cryo-images

    NASA Astrophysics Data System (ADS)

    Eck, Brendan L.; Fahmi, Rachid; Levi, Jacob; Fares, Anas; Wu, Hao; Li, Yuemeng; Vembar, Mani; Dhanantwari, Amar; Bezerra, Hiram G.; Wilson, David L.

    2016-03-01

    Myocardial perfusion imaging using CT (MPI-CT) has the potential to provide quantitative measures of myocardial blood flow (MBF) which can aid the diagnosis of coronary artery disease. We evaluated the quantitative accuracy of MPI-CT in a porcine model of balloon-induced LAD coronary artery ischemia guided by fractional flow reserve (FFR). We quantified MBF at baseline (FFR=1.0) and under moderate ischemia (FFR=0.7) using MPI-CT and compared to fluorescent microsphere-based MBF from high-resolution cryo-images. Dynamic, contrast-enhanced CT images were obtained using a spectral detector CT (Philips Healthcare). Projection-based mono-energetic images were reconstructed and processed to obtain MBF. Three MBF quantification approaches were evaluated: singular value decomposition (SVD) with fixed Tikhonov regularization (ThSVD), SVD with regularization determined by the L-Curve criterion (LSVD), and Johnson-Wilson parameter estimation (JW). The three approaches over-estimated MBF compared to cryo-images. JW produced the most accurate MBF, with average error 33.3 ± 19.2 mL/min/100g, whereas LSVD and ThSVD had greater over-estimation, 59.5 ± 28.3 mL/min/100g and 78.3 ± 25.6 mL/min/100g, respectively. Relative blood flow as assessed by a flow ratio of LAD-to-remote myocardium was strongly correlated between JW and cryo-imaging, with R² = 0.97, compared to R² = 0.88 and 0.78 for LSVD and ThSVD, respectively. We assessed tissue impulse response functions (IRFs) from each approach for sources of error. While JW was constrained to physiologic solutions, both LSVD and ThSVD produced IRFs with non-physiologic properties due to noise. The L-curve provided noise-adaptive regularization but did not eliminate non-physiologic IRF properties or optimize for MBF accuracy. These findings suggest that model-based MPI-CT approaches may be more appropriate for quantitative MBF estimation and that cryo-imaging can support the development of MPI-CT by providing spatial distributions of MBF.
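
    A minimal sketch of the SVD-with-Tikhonov deconvolution idea behind the ThSVD/LSVD approaches is shown below; the convolution-matrix construction, regularization weight, and toy curves are generic assumptions and do not reproduce the study's implementation or the Johnson-Wilson model.

```python
import numpy as np
from scipy.linalg import toeplitz

def mbf_svd_tikhonov(aif, tissue, dt, lam=0.1):
    """Deconvolve C_tissue = dt * (A @ r), where A is the lower-triangular
    convolution matrix built from the arterial input function (AIF) and r is
    the flow-scaled tissue impulse response; the flow estimate is max(r).
    `lam` is a Tikhonov weight relative to the largest singular value."""
    A = dt * toeplitz(aif, np.zeros_like(aif))
    U, s, Vt = np.linalg.svd(A)
    filt = s / (s**2 + (lam * s[0])**2)          # Tikhonov-filtered inverse singular values
    r = Vt.T @ (filt * (U.T @ tissue))
    return r.max()

# Toy curves: a gamma-variate-like AIF and a dispersed, scaled tissue curve.
t = np.arange(0, 30, 1.0)
aif = t * np.exp(-t / 3.0)
tissue = 0.2 * np.convolve(aif, np.exp(-t / 8.0))[:len(t)]
print(round(mbf_svd_tikhonov(aif, tissue, dt=1.0), 3))
```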

  5. Quantitative habitability.

    PubMed

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.

  6. Tracking Expected Improvements of Decadal Prediction in Climate Services

    NASA Astrophysics Data System (ADS)

    Suckling, E.; Thompson, E.; Smith, L. A.

    2013-12-01

    Physics-based simulation models are ultimately expected to provide the best available (decision-relevant) probabilistic climate predictions, as they can capture the dynamics of the Earth System across a range of situations, including situations for which observations for the construction of empirical models are scant if not nonexistent. This fact in itself provides neither evidence that predictions from today's Earth System Models will outperform today's empirical models, nor a guide to the space and time scales on which today's model predictions are adequate for a given purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales. The skill of these forecasts is contrasted with that of state-of-the-art climate models, and the challenges faced by each approach are discussed. The focus is on providing decision-relevant probability forecasts for decision support. An empirical model, known as Dynamic Climatology, is shown to be competitive with CMIP5 climate models on decadal-scale probability forecasts. Contrasting the skill of simulation models not only with each other but also with empirical models can reveal the space and time scales on which a generation of simulation models exploits its physical basis effectively. It can also quantify their ability to add information in the formation of operational forecasts. Difficulties (i) of information contamination, (ii) of the interpretation of probabilistic skill, and (iii) of artificial skill complicate each modelling approach and are discussed. "Physics-free" empirical models provide fixed, quantitative benchmarks for the evaluation of ever more complex climate models, benchmarks that are not available from (inter)comparisons restricted to complex models alone. At present, empirical models can also provide a background term for blending in the formation of probability forecasts from ensembles of simulation models. In weather forecasting this role is filled by the climatological distribution, which can significantly enhance the value of longer lead-time weather forecasts to those who use them. It is suggested that the direct comparison of simulation models with empirical models become a regular component of large model forecast intercomparison and evaluation. This would clarify the extent to which a given generation of state-of-the-art simulation models provides information beyond that available from simpler empirical models. It would also clarify current limitations in using simulation forecasting for decision support. No model-based probability forecast is complete without a quantitative estimate of its own irrelevance; this estimate is likely to increase as a function of lead time. A lack of decision-relevant quantitative skill would not bring the science-based foundation of anthropogenic warming into doubt. Similar levels of skill with empirical models do suggest a clear quantification of limits, as a function of lead time, for spatial and temporal scales on which decisions based on such model output are expected to prove maladaptive. Failing to clearly state such weaknesses of a given generation of simulation models, while clearly stating their strength and their foundation, risks the credibility of science in support of policy in the long term.

  7. The rapid quantitation of the filamentous blue-green alga plectonema boryanum by the luciferase assay for ATP

    NASA Technical Reports Server (NTRS)

    Bush, V. N.

    1974-01-01

    Plectonema boryanum is a filamentous blue green alga. Blue green algae have a procaryotic cellular organization similar to bacteria, but are usually obligate photoautotrophs, obtaining their carbon and energy from photosynthetic mechanism similar to higher plants. This research deals with a comparison of three methods of quantitating filamentous populations: microscopic cell counts, the luciferase assay for ATP and optical density measurements.

  8. Experimental Comparison of Different Carbon Fiber Composites in Reinforcement Layouts for Wooden Beams of Historical Buildings

    PubMed Central

    Rescalvo, Francisco J.; Valverde-Palacios, Ignacio; Gallego, Antolino

    2017-01-01

    This paper offers a detailed, quantitative and exhaustive experimental comparison, in terms of mechanical properties, of three different layouts of carbon composite materials (CFRP) used to strengthen existing old timber beams highly affected by diverse natural defects and biological attacks, testing the use of pultruded laminate attached to the tension side of the element (LR), CFRP fabrics fully wrapping the timber element in a U shape (UR), and the combined use of both reinforcement solutions (UR-P). Moreover, unidirectional and bidirectional fabrics were considered and compared. Timber elements used for the experimental program were extracted from a recent rehabilitation of the roof of the current Faculty of Law building, University of Granada (Spain), catalogued as a historical edifice. Experimental results from bending tests show that in all cases reinforcement provides a clear improvement in terms of bending capacity and stiffness as compared with the control specimens (without reinforcement). However, improvements in terms of ductility differ considerably depending on the kind of layout. PMID:28934116

  9. Quantitative approach for defining basic color terms and color category best exemplars.

    PubMed

    Fider, Nicole; Narens, Louis; Jameson, Kimberly A; Komarova, Natalia L

    2017-08-01

    A new method is presented that identifies basic color terms (BCTs) from color-naming data. A function is defined that measures how well a term is understood by a communicating population. BCTs are then separated from other color terms by a threshold value applied to this function. A new mathematical algorithm is proposed and analyzed for determining the best exemplar associated with each BCT. Using data provided by the World Color Survey, comparisons are made between the paper's methods and those from other studies. These comparisons show that the paper's new definition of "basicness" mostly agrees with the typical definition found in the color categorization literature, which was originally due to Kay and colleagues. The new definition, unlike the typical one, has the advantage of not relying on syntactic or semantic features of languages or color lexicons. This permits the methodology developed to be generalizable and applied to other category domains for which a construct of "basicness" could have an important role.

  10. Testing for nonrandom shape similarity between sister cells using automated shape comparison

    NASA Astrophysics Data System (ADS)

    Guo, Monica; Marshall, Wallace F.

    2009-02-01

    Several reports in the biological literature have indicated that when a living cell divides, the two daughter cells have a tendency to be mirror images of each other in terms of their overall cell shape. This phenomenon would be consistent with inheritance of spatial organization from mother cell to daughters. However, the published data rely on a small number of visually chosen examples, raising potential concerns about inadvertent selection bias. We propose to revisit this issue using automated quantitative shape comparison methods, which would have no contribution from the observer and which would allow statistical testing of similarity in large numbers of cells. In this report we describe a first-order approach to the problem using rigid curve matching. Using test images, we compare a pointwise-correspondence-based distance metric with a chamfer matching strategy and find that the latter provides better correspondence and smaller distances between aligned curves, especially when we allow nonrigid deformation of the outlines in addition to rotation.
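
    Chamfer matching of two sampled outlines can be sketched in a few lines; the toy example below (hypothetical point sets, scipy's k-d tree for nearest neighbours) computes a symmetric mean nearest-neighbour distance of the kind contrasted here with pointwise-correspondence distances. It illustrates the general technique, not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def chamfer_distance(curve_a, curve_b):
    """Symmetric chamfer distance between two curves sampled as (N, 2) point sets:
    mean distance from each point of A to its nearest neighbour in B, plus the reverse."""
    tree_a, tree_b = cKDTree(curve_a), cKDTree(curve_b)
    d_ab, _ = tree_b.query(curve_a)   # nearest B-point for every A-point
    d_ba, _ = tree_a.query(curve_b)   # nearest A-point for every B-point
    return d_ab.mean() + d_ba.mean()

# Hypothetical outlines: a circle and a slightly deformed, rotated copy.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
outline1 = np.c_[np.cos(t), np.sin(t)]
outline2 = 1.05 * np.c_[np.cos(t + 0.3), 0.9 * np.sin(t + 0.3)]
print("chamfer distance:", chamfer_distance(outline1, outline2))
```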

  11. On the predictive ability of mechanistic models for the Haitian cholera epidemic.

    PubMed

    Mari, Lorenzo; Bertuzzo, Enrico; Finger, Flavio; Casagrandi, Renato; Gatto, Marino; Rinaldo, Andrea

    2015-03-06

    Predictive models of epidemic cholera need to resolve at suitable aggregation levels spatial data pertaining to local communities, epidemiological records, hydrologic drivers, waterways, patterns of human mobility and proxies of exposure rates. We address the above issue in a formal model comparison framework and provide a quantitative assessment of the explanatory and predictive abilities of various model settings with different spatial aggregation levels and coupling mechanisms. Reference is made to records of the recent Haiti cholera epidemics. Our intensive computations and objective model comparisons show that spatially explicit models accounting for spatial connections have better explanatory power than spatially disconnected ones for short-to-intermediate calibration windows, while parsimonious, spatially disconnected models perform better with long training sets. On average, spatially connected models show better predictive ability than disconnected ones. We suggest limits and validity of the various approaches and discuss the pathway towards the development of case-specific predictive tools in the context of emergency management. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  12. Monomer volume fraction profiles in pH responsive planar polyelectrolyte brushes

    DOE PAGES

    Mahalik, Jyoti P.; Yang, Yubo; Deodhar, Chaitra V.; ...

    2016-03-06

    Spatial dependencies of monomer volume fraction profiles of pH responsive polyelectrolyte brushes were investigated using field theories and neutron reflectivity experiments. In particular, planar polyelectrolyte brushes in good solvent were studied and direct comparisons between predictions of the theories and experimental measurements are presented. The comparisons between the theories and the experimental data reveal that solvent entropy and ion-pairs resulting from adsorption of counterions from the added salt play key roles in affecting the monomer distribution and must be taken into account in modeling polyelectrolyte brushes. Furthermore, the utility of this physics-based approach based on these theories for the prediction and interpretation of neutron reflectivity profiles in the context of pH responsive planar polyelectrolyte brushes such as polybasic poly(2-(dimethylamino)ethyl methacrylate) (PDMAEMA) and polyacidic poly(methacrylic acid) (PMAA) brushes is demonstrated. The approach provides a quantitative way of estimating molecular weights of the polymers polymerized using surface-initiated atom transfer radical polymerization.

  13. Comparison of blood flow models and acquisitions for quantitative myocardial perfusion estimation from dynamic CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-04-01

    Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivate this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 mL/(min·g); cardiac output = 3, 5, 8 L/min). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features, including heterogeneous microvascular flow, permeability and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. CT acquisitions that reduce radiation exposure were implemented by varying both temporal sampling (1, 2, and 3 s sampling intervals) and tube currents (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (a two-compartment model, an axially-distributed model, and the adiabatic approximation to the tissue homogeneous model) and a qualitative slope-based method. In total, over 11 000 time attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by 47.5% on average, and the quantitative models provided estimates with less than 6.5% average bias and increasing variance with increasing dose reductions. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods across the range of techniques evaluated. This suggests that there is no particular advantage between quantitative estimation methods, nor to performing dose reduction via tube current reduction compared to temporal sampling reduction. These data are important for optimizing implementation of cardiac dynamic CT in clinical practice and in prospective CT MBF trials.
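
    The slope-based method is not defined in the abstract; one common formulation (assumed here, and not necessarily the one used in the study) takes the peak upslope of the tissue time-attenuation curve divided by the peak arterial enhancement, as in this small sketch with synthetic curves.

```python
import numpy as np

def slope_mbf(t, tissue_hu, arterial_hu):
    """Qualitative slope-based perfusion index (one common formulation, assumed here):
    maximum upslope of the tissue time-attenuation curve divided by the peak arterial
    enhancement. Returns a value proportional to flow per unit tissue."""
    upslope = np.max(np.gradient(tissue_hu, t))     # HU per second
    peak_aif = np.max(arterial_hu)                  # HU
    return upslope / peak_aif                       # 1/s; rescaled to mL/(min*g) in practice

# Hypothetical gamma-variate-like curves sampled every 2 s.
t = np.arange(0, 40, 2.0)
aif = 300 * (t / 8.0) ** 2 * np.exp(-t / 4.0)       # arterial input (HU)
tissue = 40 * (t / 12.0) ** 2 * np.exp(-t / 6.0)    # myocardial enhancement (HU)
print("slope index (1/s):", slope_mbf(t, tissue, aif))
```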

  14. The Integral Method, a new approach to quantify bactericidal activity.

    PubMed

    Gottardi, Waldemar; Pfleiderer, Jörg; Nagl, Markus

    2015-08-01

    The bactericidal activity (BA) of antimicrobial agents is generally derived from the results of killing assays. A reliable quantitative characterization and particularly a comparison of these substances, however, are impossible with this information. We here propose a new method that takes into account the course of the complete killing curve for assaying BA and that allows a clear-cut quantitative comparison of antimicrobial agents with only one number. The new Integral Method, based on the reciprocal area below the killing curve, reliably calculates an average BA [log10 CFU/min] and, by implementation of the agent's concentration C, the average specific bactericidal activity SBA = BA/C [log10 CFU/min/mM]. Based on experimental killing data, the pertaining BA and SBA values of exemplary active halogen compounds were established, allowing quantitative assertions. N-chlorotaurine (NCT), chloramine T (CAT), monochloramine (NH2Cl), and iodine (I2) showed extremely diverging SBA values of 0.0020±0.0005, 1.11±0.15, 3.49±0.22, and 291±137 log10 CFU/min/mM, respectively, against Staphylococcus aureus. This immediately demonstrates an approximately 550-fold stronger activity of CAT, 1730-fold of NH2Cl, and 150,000-fold of I2 compared to NCT. The inferred quantitative assertions and conclusions prove the new method suitable for characterizing bactericidal activity. Its application comprises the effect of defined agents on various bacteria, the consequence of temperature shifts, the influence of varying drug structure, dose-effect relationships, ranking of isosteric agents, comparison of competing commercial antimicrobial formulations, and the effect of additives. Copyright © 2015 Elsevier B.V. All rights reserved.
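
    The abstract states that BA derives from the reciprocal of the area below the killing curve but does not reproduce the formula. The sketch below only illustrates the building blocks: trapezoidal integration of a log10 CFU killing curve, its reciprocal, and the stated relation SBA = BA/C. The exact scaling used by the published Integral Method may differ, so the curve, helper names and normalization are assumptions.

```python
import numpy as np

def killing_curve_area(t_min, log10_cfu):
    """Trapezoidal area below a measured killing curve (log10 CFU vs. time).
    The published Integral Method derives BA from the reciprocal of such an area;
    the exact scaling applied there is not reproduced here."""
    return np.trapz(log10_cfu, t_min)          # units: log10 CFU * min

def specific_bactericidal_activity(ba, concentration_mm):
    """SBA = BA / C, as defined in the abstract (log10 CFU/min/mM)."""
    return ba / concentration_mm

# Hypothetical killing curve: a 6-log starting inoculum cleared over 60 min.
t = np.array([0, 5, 10, 20, 30, 45, 60], dtype=float)        # minutes
survivors = np.array([6.0, 5.2, 4.1, 2.8, 1.6, 0.7, 0.0])    # log10 CFU/mL
area = killing_curve_area(t, survivors)
print("area below killing curve:", area, "log10 CFU * min")
print("reciprocal area (Integral Method building block):", 1.0 / area)
# Example of the concentration scaling, using an assumed BA value:
print("SBA for BA = 1.1 at C = 1 mM:", specific_bactericidal_activity(1.1, 1.0))
```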

  15. A comparative proteomics method for multiple samples based on a 18O-reference strategy and a quantitation and identification-decoupled strategy.

    PubMed

    Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin

    2017-08-15

    Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used. For greater numbers of samples, the label-free method has been used, but it has been criticized for low reproducibility and accuracy. An ingenious strategy has been introduced in which each sample is compared against an 18O-labeled reference sample created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem for comparative proteomics of multiple samples is the poor coincidence and reproducibility of protein identification results across samples. In the present study, a method combining the 18O-reference strategy and a quantitation and identification-decoupled strategy was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy had greater accuracy and reliability than previously used comparison methods based on transferring comparisons or label-free strategies. In the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated to identify differentially expressed proteins, according to retention time and accurate mass. This strategy made protein identification possible for all samples using a single pooled sample, and therefore gave good reproducibility in protein identification across multiple samples, and allowed peptide identification to be optimized separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.
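
    The decoupling step matches quantification features from LC-MS runs to identifications from LC-MS/MS of the pooled 18O reference by retention time and accurate mass. A minimal sketch of such a tolerance-based match, with hypothetical tolerances, field names and records, might look like this.

```python
# Minimal sketch of matching MS1 quantification features to MS2 identifications
# by retention time and accurate mass; tolerances and records are hypothetical.
def match_features_to_ids(features, identifications, rt_tol_min=0.5, ppm_tol=10.0):
    matches = []
    for feat in features:
        for ident in identifications:
            ppm_error = abs(feat["mz"] - ident["mz"]) / ident["mz"] * 1e6
            if abs(feat["rt"] - ident["rt"]) <= rt_tol_min and ppm_error <= ppm_tol:
                matches.append({"peptide": ident["peptide"],
                                "intensity": feat["intensity"],
                                "ppm_error": round(ppm_error, 2)})
    return matches

# Hypothetical inputs: two MS1 features and one identification from the pooled reference run.
features = [{"mz": 721.3412, "rt": 35.2, "intensity": 2.4e6},
            {"mz": 802.9110, "rt": 48.7, "intensity": 9.1e5}]
identifications = [{"peptide": "LVNELTEFAK", "mz": 721.3401, "rt": 35.4}]
print(match_features_to_ids(features, identifications))
```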

  16. A peptide-retrieval strategy enables significant improvement of quantitative performance without compromising confidence of identification.

    PubMed

    Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun

    2017-01-30

    Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently identified proteins were recovered, and the peptide-spectrum-match FDR was re-calculated and controlled at a confident level of FDR ≤ 1%, while the protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. An increase of more than 60% in total quantified spectra/peptides was achieved for a spike-in sample set and for a public dataset from CPTAC. Incorporating the peptide retrieval strategy significantly improved the quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity to confidently discover significantly altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification, and it can be readily implemented in a broad range of quantitative proteomics techniques, including label-free or labeling approaches. We hypothesize that more quantifiable spectra and peptides per protein, even including less confident peptides, could help reduce variation and improve protein quantification. Hence the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches. The list of confidently identified proteins obtained with the standard target-decoy search strategy was fixed, and additional, less confident spectra/peptides matched to those confident proteins were retrieved. However, the total peptide-spectrum-match false discovery rate (PSM FDR) after retrieval analysis was still controlled at a confident level of FDR ≤ 1%. As expected, the penalty for occasionally incorporating incorrect peptide identifications is negligible in comparison with the improvements in quantitative performance. More quantifiable peptides, a lower missing-value rate, and better quantitative accuracy and precision were achieved for the same protein identifications by this simple strategy. This strategy is theoretically applicable to any quantitative approach in proteomics and thereby provides more quantitative information, especially on low-abundance proteins. Published by Elsevier B.V.
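
    The retrieval step, as described, keeps the protein list fixed, pulls back lower-scoring PSMs that map to those proteins, and re-estimates the PSM FDR with the usual target-decoy ratio. The schematic sketch below uses hypothetical PSM records and a simplified threshold search; it illustrates the idea rather than the authors' exact procedure.

```python
# Schematic sketch of the retrieval idea: the protein list stays fixed, lower-scoring PSMs
# mapping to those proteins are pulled back, and the target-decoy FDR estimate
# (decoys / targets) is kept <= 1%. PSM records and scores are hypothetical, and decoy
# hits are retained in the candidate list purely to estimate the error rate.
def retrieve_psms(psms, confident_proteins, max_fdr=0.01):
    candidates = sorted(
        (p for p in psms if p["decoy"] or p["protein"] in confident_proteins),
        key=lambda p: p["score"], reverse=True)
    accepted, n_target, n_decoy = [], 0, 0
    for psm in candidates:
        n_decoy += psm["decoy"]
        n_target += not psm["decoy"]
        if n_decoy / max(n_target, 1) > max_fdr:   # estimated PSM FDR at this threshold
            break
        accepted.append(psm)
    return [p for p in accepted if not p["decoy"]]

psms = [{"protein": "P1", "score": 95, "decoy": False},
        {"protein": "P1", "score": 60, "decoy": False},   # previously filtered out
        {"protein": "DECOY_7", "score": 58, "decoy": True},
        {"protein": "P2", "score": 40, "decoy": False}]
print(retrieve_psms(psms, confident_proteins={"P1", "P2"}))
```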

  17. Ultratrace level determination and quantitative analysis of kidney injury biomarkers in patient samples attained by zinc oxide nanorods

    NASA Astrophysics Data System (ADS)

    Singh, Manpreet; Alabanza, Anginelle; Gonzalez, Lorelis E.; Wang, Weiwei; Reeves, W. Brian; Hahm, Jong-In

    2016-02-01

    Determining ultratrace amounts of protein biomarkers in patient samples in a straightforward and quantitative manner is extremely important for early disease diagnosis and treatment. Here, we successfully demonstrate the novel use of zinc oxide nanorods (ZnO NRs) in the ultrasensitive and quantitative detection of two acute kidney injury (AKI)-related protein biomarkers, tumor necrosis factor (TNF)-α and interleukin (IL)-8, directly from patient samples. We first validate the ZnO NRs-based IL-8 results via comparison with those obtained from using a conventional enzyme-linked immunosorbent method in samples from 38 individuals. We further assess the full detection capability of the ZnO NRs-based technique by quantifying TNF-α, whose levels in human urine are often below the detection limits of conventional methods. Using the ZnO NR platforms, we determine the TNF-α concentrations of all 46 patient samples tested, down to the fg per mL level. Subsequently, we screen for TNF-α levels in approximately 50 additional samples collected from different patient groups in order to demonstrate a potential use of the ZnO NRs-based assay in assessing cytokine levels useful for further clinical monitoring. Our research efforts demonstrate that ZnO NRs can be straightforwardly employed in the rapid, ultrasensitive, quantitative, and simultaneous detection of multiple AKI-related biomarkers directly in patient urine samples, providing an unparalleled detection capability beyond those of conventional analysis methods. Additional key advantages of the ZnO NRs-based approach include a fast detection speed, low-volume assay condition, multiplexing ability, and easy automation/integration capability to existing fluorescence instrumentation. Therefore, we anticipate that our ZnO NRs-based detection method will be highly beneficial for overcoming the frequent challenges in early biomarker development and treatment assessment, pertaining to the facile and ultrasensitive quantification of hard-to-trace biomolecules.

  18. 76 FR 5719 - Pattern of Violations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-02

    ... safety and health record of each mine rather than on a strictly quantitative comparison of mines to... several reservations, given the methodological difficulties involved in estimating the compensating wage...

  19. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
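
    As a concrete example of the replicate-number calculation discussed in the review, the sketch below applies a standard normal-approximation formula for a two-group comparison; the effect size and variance values are hypothetical, and the review itself treats more general ANOVA designs.

```python
from math import ceil
from scipy.stats import norm

def replicates_per_group(effect, sd, alpha=0.05, power=0.80):
    """Normal-approximation sample size for detecting a mean difference `effect`
    between two groups with common standard deviation `sd`:
        n = 2 * (z_{1-alpha/2} + z_{power})^2 * sd^2 / effect^2
    A simplification of the designs discussed in the review; values are illustrative."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return ceil(2 * (z_alpha + z_beta) ** 2 * sd ** 2 / effect ** 2)

# Hypothetical: detect a 1.5-fold change (~0.58 on the log2 scale) with biological SD of 0.8.
print(replicates_per_group(effect=0.58, sd=0.8))   # -> about 30 replicates per group
```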

  20. Thermal biology, torpor and behaviour in sugar gliders: a laboratory-field comparison.

    PubMed

    Geiser, Fritz; Holloway, Joanne C; Körtner, Gerhard

    2007-07-01

    Most studies on animal physiology and behaviour are conducted in captivity without verification that data are representative of free-ranging animals. We provide the first quantitative comparison of daily torpor, thermal biology and activity patterns, conducted on two groups of sugar gliders (Petaurus breviceps, Marsupialia) exposed to similar thermal conditions, one in captivity and the other in the field. Our study shows that activity in captive gliders in an outdoor aviary is restricted to the night and largely unaffected by weather, whereas free-ranging gliders omit foraging on cold/wet nights and may also forage in the afternoon. Torpor occurrence in gliders was significantly lower in captivity (8.4% after food deprivation; 1.1% for all observations) than in the field (25.9%), mean torpor bout duration was shorter in captivity (6.9 h) than in the field (13.1 h), and mean body temperatures during torpor were higher in captivity (25.3 degrees C) than in the field (19.6 degrees C). Moreover, normothermic body temperature as a function of air temperature differed between captive and free-ranging gliders, with a >3 degrees C difference at low air temperatures. Our comparison shows that activity patterns, thermal physiology, use of torpor and patterns of torpor may differ substantially between the laboratory and field, and provides further evidence that functional and behavioural data on captive individuals may not necessarily be representative of those living in the wild.

  1. Quantitative Features of Liver Lesions, Lung Nodules, and Renal Stones at Multi-Detector Row CT Examinations: Dependency on Radiation Dose and Reconstruction Algorithm.

    PubMed

    Solomon, Justin; Mileto, Achille; Nelson, Rendon C; Roy Choudhury, Kingshuk; Samei, Ehsan

    2016-04-01

    To determine whether radiation dose and reconstruction algorithm affect the computer-based extraction and analysis of quantitative imaging features in lung nodules, liver lesions, and renal stones at multi-detector row computed tomography (CT). Retrospective analysis of data from a prospective, multicenter, HIPAA-compliant, institutional review board-approved clinical trial was performed by extracting 23 quantitative imaging features (size, shape, attenuation, edge sharpness, pixel value distribution, and texture) of lesions on multi-detector row CT images of 20 adult patients (14 men, six women; mean age, 63 years; range, 38-72 years) referred for known or suspected focal liver lesions, lung nodules, or kidney stones. Data were acquired between September 2011 and April 2012. All multi-detector row CT scans were performed at two different radiation dose levels; images were reconstructed with filtered back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction (MBIR) algorithms. A linear mixed-effects model was used to assess the effect of radiation dose and reconstruction algorithm on extracted features. Among the 23 imaging features assessed, radiation dose had a significant effect on five, three, and four of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Adaptive statistical iterative reconstruction had a significant effect on three, one, and one of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). MBIR reconstruction had a significant effect on nine, 11, and 15 of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Of note, the measured size of lung nodules and renal stones with MBIR differed significantly from that obtained with the other two algorithms (P < .002 for all comparisons). Although lesion texture was significantly affected by the reconstruction algorithm used (an average of 3.33 features affected by MBIR across lesion types; P < .002 for all comparisons), no significant effect of the radiation dose setting was observed for all but one of the texture features (P = .002-.998). Radiation dose settings and reconstruction algorithms affect the extraction and analysis of quantitative imaging features in lesions at multi-detector row CT.
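
    A linear mixed-effects model of the kind described, with patient as a random effect and dose and reconstruction algorithm as fixed effects, can be sketched with statsmodels; the column names and simulated data below are hypothetical and stand in for one extracted feature.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format table: one quantitative feature measured per lesion under each
# dose level and reconstruction algorithm, with patient as the grouping (random) factor.
rng = np.random.default_rng(0)
patients = np.repeat(np.arange(10), 6)
dose = np.tile(["routine", "reduced"], 30)
algo = np.tile(np.repeat(["FBP", "ASIR", "MBIR"], 2), 10)
feature = (5 + 0.3 * (dose == "reduced") + 0.8 * (algo == "MBIR")
           + rng.normal(0, 0.5, 60) + np.repeat(rng.normal(0, 1.0, 10), 6))

df = pd.DataFrame({"patient": patients, "dose": dose, "algo": algo, "feature": feature})

# Random intercept per patient; fixed effects for dose and reconstruction algorithm.
model = smf.mixedlm("feature ~ C(dose) + C(algo)", df, groups=df["patient"])
print(model.fit().summary())
```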

  2. Selection of stable reference genes for quantitative rt-PCR comparisons of mouse embryonic and extra-embryonic stem cells.

    PubMed

    Veazey, Kylee J; Golding, Michael C

    2011-01-01

    Isolation and culture of both embryonic and tissue specific stem cells provide an enormous opportunity to study the molecular processes driving development. To gain insight into the initial events underpinning mammalian embryogenesis, pluripotent stem cells from each of the three distinct lineages present within the preimplantation blastocyst have been derived. Embryonic (ES), trophectoderm (TS) and extraembryonic endoderm (XEN) stem cells possess the developmental potential of their founding lineages and seemingly utilize distinct epigenetic modalities to program gene expression. However, the basis for these differing cellular identities and epigenetic properties remains poorly defined. Quantitative reverse transcription-polymerase chain reaction (qPCR) is a powerful and efficient means of rapidly comparing patterns of gene expression between different developmental stages and experimental conditions. However, careful, empirical selection of appropriate reference genes is essential to accurately measuring transcriptional differences. Here we report the quantitation and evaluation of fourteen commonly used reference genes between ES, TS and XEN stem cells. These included: Actb, B2m, Hsp70, Gapdh, Gusb, H2afz, Hk2, Hprt, Pgk1, Ppia, Rn7sk, Sdha, Tbp and Ywhaz. Utilizing three independent statistical analyses, we identify Pgk1, Sdha and Tbp as the most stable reference genes between each of these stem cell types. Furthermore, we identify Sdha, Tbp and Ywhaz, as well as Ywhaz, Pgk1 and Hk2, as the three most stable reference genes through the in vitro differentiation of embryonic and trophectoderm stem cells, respectively. Understanding the transcriptional and epigenetic regulatory mechanisms controlling cellular identity within these distinct stem cell types provides essential insight into cellular processes controlling both embryogenesis and stem cell biology. Normalizing quantitative RT-PCR measurements using the geometric mean CT values obtained for the identified mRNAs offers a reliable method to assess differing patterns of gene expression between the three founding stem cell lineages present within the mammalian preimplantation embryo.
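
    The normalization step described at the end of the abstract, referencing a target gene against the geometric mean Ct of the stable reference genes, can be sketched as follows; the Ct values are hypothetical and the usual assumption of roughly 100% amplification efficiency is made.

```python
from statistics import geometric_mean

def relative_expression(ct_target, ct_reference_genes, ct_target_control, ct_refs_control):
    """2^-ddCt style normalisation against the geometric mean Ct of several reference genes
    (here imagined as Pgk1, Sdha and Tbp), assuming ~100% amplification efficiency.
    All Ct values below are hypothetical."""
    d_ct_sample = ct_target - geometric_mean(ct_reference_genes)
    d_ct_control = ct_target_control - geometric_mean(ct_refs_control)
    return 2.0 ** -(d_ct_sample - d_ct_control)

# Hypothetical Ct values: a target gene in TS cells vs. ES cells, three reference genes each.
fold_change = relative_expression(
    ct_target=24.1, ct_reference_genes=[18.9, 20.3, 22.4],       # TS sample
    ct_target_control=26.0, ct_refs_control=[19.1, 20.0, 22.6])  # ES control
print(f"relative expression (TS vs ES): {fold_change:.2f}")
```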

  3. Diagnostic value of (99m)Tc-3PRGD2 scintimammography for differentiation of malignant from benign breast lesions: Comparison of visual and semi-quantitative analysis.

    PubMed

    Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie

    2015-01-01

    To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m-polyethylene glycol, 4-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and to investigate the incremental role of the semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749 ± 86 MBq of the radiotracer. Images were evaluated by visual interpretation and semi-quantitative indices of tumor to non-tumor (T/N) ratios, which were compared with pathology results. Receiver operating characteristics (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of semi-quantitative indices, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery, comprising 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions (P<0.05). When visual grade 2 was used as the cut-off value for the detection of primary breast cancer, the sensitivity, specificity and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as the cut-off value, the sensitivity, specificity and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the difference was not statistically significant (P=0.372). Compared with visual analysis or semi-quantitative analysis alone, the sensitivity, specificity and accuracy of visual analysis combined with semi-quantitative analysis in diagnosing primary breast cancer were higher: 87.5%, 82.9%, and 85.4%, respectively. The area under the curve was 0.891. Results of the present study suggest that semi-quantitative and visual analyses gave statistically similar results. The semi-quantitative analysis provided incremental value in addition to visual analysis of (99m)Tc-3PRGD2 SMG for the detection of breast cancer. It seems from our results that, when the tumor was located in the medial part of the breast, the semi-quantitative analysis gave better diagnostic results.
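
    The cut-off evaluation reported here can be reproduced on any scored dataset in a few lines; the sketch below uses hypothetical T/N ratios and histopathology labels together with scikit-learn's ROC utilities.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical T/N ratios and histopathology labels (1 = malignant, 0 = benign).
tn_ratio = np.array([1.2, 1.6, 1.9, 2.3, 2.6, 3.1, 1.4, 2.0, 2.8, 1.8])
malignant = np.array([0,   0,   0,   1,   1,   1,   0,   1,   1,   0])

def sens_spec(scores, labels, cutoff):
    """Sensitivity and specificity when calling scores >= cutoff positive."""
    predicted = scores >= cutoff
    sens = (predicted & (labels == 1)).sum() / (labels == 1).sum()
    spec = (~predicted & (labels == 0)).sum() / (labels == 0).sum()
    return sens, spec

print("AUC:", roc_auc_score(malignant, tn_ratio))
print("sens/spec at T/N >= 2.01:", sens_spec(tn_ratio, malignant, 2.01))
```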

  4. Velocity Measurement in Carotid Artery: Quantitative Comparison of Time-Resolved 3D Phase-Contrast MRI and Image-based Computational Fluid Dynamics

    PubMed Central

    Sarrami-Foroushani, Ali; Nasr Esfahany, Mohsen; Nasiraei Moghaddam, Abbas; Saligheh Rad, Hamidreza; Firouznia, Kavous; Shakiba, Madjid; Ghanaati, Hossein; Wilkinson, Iain David; Frangi, Alejandro Federico

    2015-01-01

    Background: Understanding the hemodynamic environment in vessels is important for understanding the mechanisms leading to vascular pathologies. Objectives: The three-dimensional velocity vector field in the carotid bifurcation is visualized using time-resolved 3D phase-contrast magnetic resonance imaging (TR 3D PC MRI) and computational fluid dynamics (CFD). This study aimed to present a qualitative and quantitative comparison of the velocity vector field obtained by each technique. Subjects and Methods: MR imaging was performed on a healthy 30-year-old male subject. TR 3D PC MRI was performed on a 3 T scanner to measure velocity in the carotid bifurcation. A 3D anatomical model for CFD was created using images obtained from time-of-flight MR angiography. The velocity vector field in the carotid bifurcation was predicted using the CFD and PC MRI techniques, and a statistical analysis was performed to assess the agreement between the two methods. Results: Although the main flow patterns were the same for both techniques, CFD showed greater resolution in mapping the secondary and circulating flows. Overall root mean square (RMS) errors for all corresponding data points in PC MRI and CFD were 14.27% at peak systole and 12.91% at end diastole, relative to the maximum velocity measured at each cardiac phase. Bland-Altman plots showed very good agreement between the two techniques. However, this study did not aim to validate either method; instead, consistency was assessed to highlight the similarities and differences between time-resolved PC MRI and CFD. Conclusion: Both techniques provided quantitatively consistent results for in vivo velocity vector fields in the right internal carotid artery. PC MRI gave a good estimation of the main flow patterns inside the vasculature, which seems acceptable for clinical use. However, the limitations of each technique should be considered when interpreting results. PMID:26793288
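
    Bland-Altman limits of agreement and an RMS error expressed relative to the maximum velocity are both simple to compute; the sketch below uses hypothetical paired velocity values, and normalizing by the maximum of the first series is an assumption standing in for the per-phase maximum used in the study.

```python
import numpy as np

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between two paired measurement sets."""
    diff = np.asarray(a) - np.asarray(b)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

def relative_rmse(a, b):
    """RMS difference expressed as a percentage of the maximum of the first series."""
    a, b = np.asarray(a), np.asarray(b)
    return np.sqrt(np.mean((a - b) ** 2)) / np.max(a) * 100.0

# Hypothetical peak-systole velocity magnitudes (cm/s) at matched points: PC-MRI vs. CFD.
v_mri = np.array([55.0, 62.0, 48.0, 70.0, 35.0, 58.0])
v_cfd = np.array([58.0, 60.0, 52.0, 75.0, 33.0, 61.0])
print("bias and limits of agreement:", bland_altman(v_mri, v_cfd))
print("relative RMSE (% of max):", round(relative_rmse(v_mri, v_cfd), 1))
```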

  5. Different binding motifs of the celiac disease-associated HLA molecules DQ2.5, DQ2.2, and DQ7.5 revealed by relative quantitative proteomics of endogenous peptide repertoires.

    PubMed

    Bergseng, Elin; Dørum, Siri; Arntzen, Magnus Ø; Nielsen, Morten; Nygård, Ståle; Buus, Søren; de Souza, Gustavo A; Sollid, Ludvig M

    2015-02-01

    Celiac disease is caused by intolerance to cereal gluten proteins, and HLA-DQ molecules are involved in the disease pathogenesis by presentation of gluten peptides to CD4(+) T cells. The α- or β-chain sharing HLA molecules DQ2.5, DQ2.2, and DQ7.5 display different risks for the disease. It was recently demonstrated that T cells of DQ2.5 and DQ2.2 patients recognize distinct sets of gluten epitopes, suggesting that these two DQ2 variants select different peptides for display. To explore whether this is the case, we performed a comprehensive comparison of the endogenous self-peptides bound to HLA-DQ molecules of B-lymphoblastoid cell lines. Peptides were eluted from affinity-purified HLA molecules of nine cell lines and subjected to quadrupole orbitrap mass spectrometry and MaxQuant software analysis. Altogether, 12,712 endogenous peptides were identified at very different relative abundances. Hierarchical clustering of normalized quantitative data demonstrated significant differences in repertoires of peptides between the three DQ variant molecules. The neural network-based method, NNAlign, was used to identify peptide-binding motifs. The binding motifs of DQ2.5 and DQ7.5 concurred with previously established binding motifs. The binding motif of DQ2.2 was strikingly different from that of DQ2.5 with position P3 being a major anchor having a preference for threonine and serine. This is notable as three recently identified epitopes of gluten recognized by T cells of DQ2.2 celiac patients harbor serine at position P3. This study demonstrates that relative quantitative comparison of endogenous peptides sampled from our protein metabolism by HLA molecules provides clues to understand HLA association with disease.

  6. Functional ankle instability as a risk factor for osteoarthritis: using T2-mapping to analyze early cartilage degeneration in the ankle joint of young athletes.

    PubMed

    Golditz, T; Steib, S; Pfeifer, K; Uder, M; Gelse, K; Janka, R; Hennig, F F; Welsch, G H

    2014-10-01

    The aim of this study was to investigate, using T2-mapping, the impact of functional instability in the ankle joint on the development of early cartilage damage. Ethical approval for this study was provided. Thirty-six volunteers from the university sports program were divided into three groups according to their ankle status: functional ankle instability (FAI, initial ankle sprain with residual instability); ankle sprain Copers (initial sprain, without residual instability); and controls (without a history of ankle injuries). Quantitative T2-mapping magnetic resonance imaging (MRI) was performed at the beginning ('early-unloading') and at the end ('late-unloading') of the MR examination, with a mean time span of 27 min. Zonal region-of-interest T2-mapping was performed on the talar and tibial cartilage in the deep and superficial layers. The inter-group comparisons of T2-values were analyzed using paired and unpaired t-tests. Statistical analysis of variance was performed. T2-values showed significant to highly significant differences in 11 of 12 regions across the groups. In early-unloading, the FAI-group showed a significant increase in quantitative T2-values in the medial talar regions (P = 0.008, P = 0.027), whereas the Coper-group showed this enhancement in the central-lateral regions (P = 0.05). In particular, the comparison of early-unloading to late-unloading values revealed significantly decreasing T2-values over time laterally and significantly increasing T2-values medially in the FAI-group, which were not present in the Coper- or control-group. Functional instability causes unbalanced loading in the ankle joint, resulting in cartilage alterations as assessed by quantitative T2-mapping. This approach can visualize and localize early cartilage abnormalities, possibly enabling specific treatment options to prevent osteoarthritis in young athletes. Copyright © 2014 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  7. A novel quantitative real-time polymerase chain reaction method for detecting toxigenic Pasteurella multocida in nasal swabs from swine.

    PubMed

    Scherrer, Simone; Frei, Daniel; Wittenbrink, Max Michael

    2016-12-01

    Progressive atrophic rhinitis (PAR) in pigs is caused by toxigenic Pasteurella multocida. In Switzerland, PAR is monitored by selective culture of nasal swabs and subsequent polymerase chain reaction (PCR) screening of bacterial colonies for the P. multocida toxA gene. A panel of 203 nasal swabs from a recent PAR outbreak was used to evaluate a novel quantitative real-time PCR for toxigenic P. multocida in porcine nasal swabs. Compared with the conventional PCR, which has a limit of detection of 100 genome equivalents per PCR reaction, the real-time PCR had a limit of detection of 10 genome equivalents. The real-time PCR detected toxA-positive P. multocida in 101 samples (49.8%), whereas the conventional PCR was less sensitive, with 90 toxA-positive samples (44.3%). Compared with the real-time PCR, 5.4% of the toxA-positive samples gave unevaluable results by conventional PCR. The approach of culture-coupled toxA PCR for the monitoring of PAR in pigs is substantially improved by this novel quantitative real-time PCR.

  8. Mammographic features and subsequent risk of breast cancer: a comparison of qualitative and quantitative evaluations in the Guernsey prospective studies.

    PubMed

    Torres-Mejía, Gabriela; De Stavola, Bianca; Allen, Diane S; Pérez-Gavilán, Juan J; Ferreira, Jorge M; Fentiman, Ian S; Dos Santos Silva, Isabel

    2005-05-01

    Mammographic features are known to be associated with breast cancer, but the magnitude of the effect differs markedly from study to study. Methods to assess mammographic features range from subjective qualitative classifications to computer-automated quantitative measures. We used data from the UK Guernsey prospective studies to examine the relative value of these methods in predicting breast cancer risk. In all, 3,211 women aged ≥35 years who had a mammogram taken in 1986 to 1989 were followed up to the end of October 2003, with 111 developing breast cancer during this period. Mammograms were classified using the subjective qualitative Wolfe classification and several quantitative mammographic features measured using computer-based techniques. Breast cancer risk was positively associated with high-grade Wolfe classification, percent breast density and area of dense tissue, and negatively associated with area of lucent tissue, fractal dimension, and lacunarity. Inclusion of the quantitative measures in the same model identified area of dense tissue and lacunarity as the best predictors of breast cancer, with risk increasing by 59% [95% confidence interval (95% CI), 29-94%] per SD increase in total area of dense tissue but declining by 39% (95% CI, 53-22%) per SD increase in lacunarity, after adjusting for each other and for other confounders. Comparison of models that included both the qualitative Wolfe classification and these two quantitative measures with models that included either the qualitative or the two quantitative variables showed that they all made significant contributions to prediction of breast cancer risk. These findings indicate that breast cancer risk is affected not only by the amount of mammographic density but also by the degree of heterogeneity of the parenchymal pattern and, presumably, by other features captured by the Wolfe classification.

  9. Towards discrete wavelet transform-based human activity recognition

    NASA Astrophysics Data System (ADS)

    Khare, Manish; Jeon, Moongu

    2017-06-01

    Providing accurate recognition of human activities is a challenging problem for visual surveillance applications. In this paper, we present a simple and efficient algorithm for human activity recognition based on a wavelet transform. We adopt discrete wavelet transform (DWT) coefficients as a feature of human objects to obtain advantages of its multiresolution approach. The proposed method is tested on multiple levels of DWT. Experiments are carried out on different standard action datasets including KTH and i3D Post. The proposed method is compared with other state-of-the-art methods in terms of different quantitative performance measures. The proposed method is found to have better recognition accuracy in comparison to the state-of-the-art methods.
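
    A DWT-based feature vector for a single frame can be sketched with PyWavelets; the wavelet, decomposition level and sub-band energy statistic below are illustrative assumptions rather than the paper's exact configuration.

```python
import numpy as np
import pywt

def dwt_feature_vector(frame, wavelet="haar", level=2):
    """Multi-level 2-D DWT of a single frame; the energy of each sub-band is used as a
    simple feature vector. Wavelet choice, level and the energy statistic are assumptions
    for illustration only."""
    coeffs = pywt.wavedec2(frame, wavelet=wavelet, level=level)
    features = [np.sum(coeffs[0] ** 2)]                  # approximation-band energy
    for detail_level in coeffs[1:]:                      # (cH, cV, cD) per level
        features.extend(np.sum(band ** 2) for band in detail_level)
    return np.array(features)

# Hypothetical 64x64 frame standing in for a silhouette from an action dataset.
rng = np.random.default_rng(1)
frame = rng.random((64, 64))
print(dwt_feature_vector(frame).shape)   # -> (7,) for level=2: 1 approx + 2*3 detail bands
```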

  10. Online interactive analysis of protein structure ensembles with Bio3D-web.

    PubMed

    Skjærven, Lars; Jariwala, Shashank; Yao, Xin-Qiu; Grant, Barry J

    2016-11-15

    Bio3D-web is an online application for analyzing the sequence, structure and conformational heterogeneity of protein families. Major functionality is provided for identifying protein structure sets for analysis, their alignment and refined structure superposition, sequence and structure conservation analysis, mapping and clustering of conformations, and the quantitative comparison of their predicted structural dynamics. Bio3D-web is based on the Bio3D and Shiny R packages. All major browsers are supported, and full source code is available under a GPL2 license from http://thegrantlab.org/bio3d-web. Contact: bjgrant@umich.edu or lars.skjarven@uib.no. © The Author 2016. Published by Oxford University Press.

  11. Investigation of Thermal Stress Convection in Nonisothermal Gases under Microgravity Conditions

    NASA Technical Reports Server (NTRS)

    Mackowski, Daniel W.

    1999-01-01

    The project has sought to ascertain the veracity of the Burnett relations, as applied to slow moving, highly nonisothermal gases, by comparison of convection and stress predictions with those generated by the DSMC method. The Burnett equations were found to provide reasonable descriptions of the pressure distribution and normal stress in stationary gases with a 1-D temperature gradient. Continuum/Burnett predictions of thermal stress convection in 2-D heated enclosures, however, are not quantitatively supported by DSMC results. For such situations, it appears that thermal creep flows, generated at the boundaries of the enclosure, will be significantly larger than the flows resulting from thermal stress in the gas.

  12. Enhanced Molecular Sieve CO2 Removal Evaluation

    NASA Technical Reports Server (NTRS)

    Rose, Susan; ElSherif, Dina; MacKnight, Allen

    1996-01-01

    The objective of this research is to quantitatively characterize the performance of two major types of molecular sieves for two-bed regenerative carbon dioxide removal at the conditions compatible with both a spacesuit and station application. One sorbent is a zeolite-based molecular sieve that has been substantially improved over the materials used in Skylab. The second sorbent is a recently developed carbon-based molecular sieve. Both molecular sieves offer the potential of high payoff for future manned missions by reducing system complexity, weight (including consumables), and power consumption in comparison with competing concepts. The research reported here provides the technical data required to improve CO2 removal systems for regenerative life support systems for future IVA and EVA missions.

  13. Soil spectral characterization

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.

    1981-01-01

    The spectral characterization of soils is discussed with particular reference to the bidirectional reflectance factor as a quantitative measure of soil spectral properties, the role of soil color, soil parameters affecting soil reflectance, and field characteristics of soil reflectance. Comparisons between laboratory-measured soil spectra and Landsat MSS data have shown good agreement, especially in discriminating relative drainage conditions and organic matter levels in unvegetated soils. The capacity to measure both visible and infrared soil reflectance provides information on other soil characteristics and makes it possible to predict soil response to different management conditions. Field and laboratory soil spectral characterization helps define the extent to which intrinsic spectral information is available from soils as a consequence of their composition and field characteristics.

  14. Direction-dependent stability of skyrmion lattice in helimagnets induced by exchange anisotropy

    NASA Astrophysics Data System (ADS)

    Hu, Yangfan

    2018-06-01

    Exchange anisotropy provides a direction dependent mechanism for the stability of the skyrmion lattice phase in noncentrosymmetric bulk chiral magnets. Based on the Fourier representation of the skyrmion lattice, we explain the direction dependence of the temperature-magnetic field phase diagram for bulk MnSi through a phenomenological mean-field model incorporating exchange anisotropy. Through quantitative comparison with experimental results, we clarify that the stability of the skyrmion lattice phase in bulk MnSi is determined by a combined effect of negative exchange anisotropy and thermal fluctuation. The effect of exchange anisotropy and the order of Fourier representation on the equilibrium properties of the skyrmion lattice is discussed in detail.

  15. Final report of APMP.QM-S6: clenbuterol in porcine meat

    NASA Astrophysics Data System (ADS)

    Sin, D. W.-M.; Ho, C.; Yip, Y.-C.

    2016-01-01

    At the CCQM Organic Analysis Working Group (OAWG) Meeting held in April 2012 and the APMP TCQM Meeting held in November 2012, an APMP supplementary comparison (APMP.QM-S6) on the determination of clenbuterol in porcine meat was supported by the OAWG and APMP TCQM. This comparison was organized by the Government Laboratory, Hong Kong. In order to accommodate wider participation, a pilot study (APMP.QM-P22) was run in parallel to APMP.QM-S6. This study provided the means for assessing measurement capabilities for the determination of low-polarity measurands in a procedure that requires extraction, clean-up, analytical separation, and selective detection in a food matrix. A total of 7 institutes registered for the supplementary comparison and 6 of them submitted results; 4 results were included in the calculation of the supplementary comparison reference value (SCRV). All participating laboratories applied the isotope dilution liquid chromatography-tandem mass spectrometry (ID-LC-MS/MS) technique with clenbuterol-d9 spiked as internal standard for quantitation in this programme. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  16. Big Fish in a Big Pond: a study of academic self concept in first year medical students

    PubMed Central

    2011-01-01

    Background Big-fish-little-pond effect (BFLPE) research has demonstrated that students in high-ability environments have lower academic self-concepts than equally able students in low-ability settings. Research has shown low academic self-concepts to be associated with negative educational outcomes. Social comparison processes have been implicated as fundamental to the BFLPE. Methods Twenty first-year students in an Australian medical school completed a survey that included academic self-concept and social comparison measures, before and after their first written assessments. Focus groups were also conducted with a separate group of students to explore students' perceptions of competence, the medical school environment, and social comparison processes. Results The quantitative study did not reveal any changes in academic self-concept or self-evaluation. The qualitative study suggested that the attributions that students used when discussing performance were those that have been demonstrated to negatively affect self-concept. Students reported that the environment was slightly competitive and that they used social comparison to evaluate their performance. Conclusions Although the BFLPE was not evident in the quantitative study, results from the qualitative study suggest that the BFLPE might be operating, in that students were using attributions associated with lower self-concepts, the environment was slightly competitive, and social comparisons were used for evaluation. PMID:21794166

  17. A community resource benchmarking predictions of peptide binding to MHC-I molecules.

    PubMed

    Peters, Bjoern; Bui, Huynh-Hoa; Frankild, Sune; Nielson, Morten; Lundegaard, Claus; Kostem, Emrah; Basch, Derek; Lamberth, Kasper; Harndahl, Mikkel; Fleri, Ward; Wilson, Stephen S; Sidney, John; Lund, Ole; Buus, Soren; Sette, Alessandro

    2006-06-09

    Recognition of peptides bound to major histocompatibility complex (MHC) class I molecules by T lymphocytes is an essential part of immune surveillance. Each MHC allele has a characteristic peptide binding preference, which can be captured in prediction algorithms, allowing for the rapid scan of entire pathogen proteomes for peptides likely to bind MHC. Here we make public a large set of 48,828 quantitative peptide-binding affinity measurements relating to 48 different mouse, human, macaque, and chimpanzee MHC class I alleles. We use these data to establish a set of benchmark predictions with one neural network method and two matrix-based prediction methods extensively utilized in our groups. In general, the neural network outperforms the matrix-based predictions mainly due to its ability to generalize even on a small amount of data. We also retrieved predictions from tools publicly available on the internet. While differences in the data used to generate these predictions hamper direct comparisons, we do conclude that tools based on combinatorial peptide libraries perform remarkably well. The transparent prediction evaluation on this dataset provides tool developers with a benchmark for comparison of newly developed prediction methods. In addition, to generate and evaluate our own prediction methods, we have established an easily extensible web-based prediction framework that allows automated side-by-side comparisons of prediction methods implemented by experts. This is an advance over the current practice of tool developers having to generate reference predictions themselves, which can lead to underestimating the performance of prediction methods they are not as familiar with as their own. The overall goal of this effort is to provide a transparent prediction evaluation allowing bioinformaticians to identify promising features of prediction methods and providing guidance to immunologists regarding the reliability of prediction tools.

  18. Validity of the Schizophrenia Diagnosis of the Psychopathology Instrument for Mentally Retarded Adults (PIMRA): A Comparison of Schizophrenic Patients with and without Mental Retardation.

    ERIC Educational Resources Information Center

    Linaker, Olav M.; Helle, Jon

    1994-01-01

    This study found that the schizophrenia subscale of the Psychopathology Instrument for Mentally Retarded Adults was a valid quantitative measure of schizophrenia if one item was removed from the scale. Comparison with a nonretarded population indicated that mentally retarded patients had less delusions and more incoherence and flat affect. They…

  19. Career Maturity and College Students: A Case Study Comparison of Student-Athletes and Non-Athletes at a Division I Institution

    ERIC Educational Resources Information Center

    Tarver, Walter L., III.

    2017-01-01

    This quantitative comparative research study examines the career maturity of student-athletes in comparison to non-athletes at a Division I university. The study also measures differences in career maturity among student-athletes based on gender, class level, race/ethnicity, by sport, by type of sport (revenue/non-revenue), and professional sports…

  20. Quantitative comparisons of cancer induction in humans by internally deposited radionuclides and external radiation.

    PubMed

    Harrison, J D; Muirhead, C R

    2003-01-01

    To compare quantitative estimates of lifetime cancer risk in humans for exposures to internally deposited radionuclides and external radiation. To assess the possibility that risks from radionuclide exposures may be underestimated. Risk estimates following internal exposures can be made for a small number of alpha-particle-emitting nuclides. (1) Lung cancer in underground miners exposed by inhalation to radon-222 gas and its short-lived progeny. Studies of residential (222)Rn exposure are generally consistent with predictions from the miner studies. (2) Liver cancer and leukaemia in patients given intravascular injections of Thorotrast, a thorium-232 oxide preparation that concentrates in liver, spleen and bone marrow. (3) Bone cancer in patients given injections of radium-224, and in workers exposed occupationally to (226)Ra and (228)Ra, mainly by ingestion. (4) Lung cancer in Mayak workers exposed to plutonium-239, mainly by inhalation. Liver and bone cancers were also seen, but the dosimetry is not yet good enough to provide quantitative estimates of risks. Comparisons can be made between risk estimates for radiation-induced cancer derived for radionuclide exposure and those derived for the A-bomb survivors, exposed mainly to low-LET (linear energy transfer) external radiation. Data from animal studies, using dogs and rodents, allow comparisons of cancer induction by a range of alpha- and beta-/gamma-emitting radionuclides. They provide information on relative biological effectiveness (RBE), dose-response relationships, dose-rate effects and the location of target cells for different malignancies. For lung and liver cancer, the estimated values of risk per Sv for internal exposure, assuming an RBE for alpha-particles of 20, are reasonably consistent with estimates for external exposure to low-LET radiation. This also applies to bone cancer when risk is calculated on the basis of average bone dose, but consideration of dose to target cells on bone surfaces suggests a low RBE for alpha-particles. Similarly, for leukaemia, the comparison of risks from alpha-irradiation ((232)Th and progeny) and external radiation suggest a low alpha RBE; this conclusion is supported by animal data. Risk estimates for internal exposure are dependent on the assumptions made in calculating dose. Account is taken of the distribution of radionuclides within tissues and the distribution of target cells for cancer induction. For the lungs and liver, the available human and animal data provide support for current assumptions. However, for bone cancer and leukaemia, it may be that changes are required. Bone cancer risk may be best assessed by calculating dose to a 50 µm layer of marrow adjacent to endosteal (inner) bone surfaces rather than to a single 10 µm cell layer as currently assumed. Target cells for leukaemia may be concentrated towards the centre of marrow cavities so that the risk of leukaemia from bone-seeking radionuclides, particularly alpha emitters, may be overestimated by the current assumption of uniform distribution of target cells throughout red bone marrow. The lifetime risk estimates considered here for exposure to internally deposited radionuclides and to external radiation are subject to uncertainties, arising from the dosimetric assumptions made, from the quality of cancer incidence and mortality data and from aspects of risk modelling, including variations in baseline rates between populations for some cancer types.
Bearing in mind such uncertainties, comparisons of risk estimates for internal emitters and external radiation show good agreement for lung and liver cancers. For leukaemia, the available data suggest that the assumption of an alpha-particle RBE of 20 can result in overestimates of risk. For bone cancer, it also appears that current assumptions will overestimate risks from alpha-particle-emitting nuclides, particularly at low doses.
