Sample records for quantitative performance comparisons

  1. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
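The abstract does not give the paper's exact definition of IDT, so the following is only an illustrative sketch: by analogy with deriving qPCR efficiency from the slope of a Ct-vs-log-concentration standard curve, a doubling time can be read off the slope of time-to-threshold against log2 of input copies. The function name and the synthetic dilution series are assumptions, not the paper's data.

```python
import numpy as np

def isothermal_doubling_time(log2_copies, time_to_threshold):
    """Estimate an isothermal doubling time (IDT) from a dilution series.

    Each extra log2 of input template shifts the time-to-threshold by one
    doubling time, so IDT is the negative slope of a linear fit of
    time-to-threshold against log2(input copies).
    """
    slope, intercept = np.polyfit(log2_copies, time_to_threshold, 1)
    return -slope

# Synthetic dilution series with an assumed IDT of 0.5 min
copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
tt = 20.0 - 0.5 * np.log2(copies)
print(round(isothermal_doubling_time(np.log2(copies), tt), 3))  # 0.5
```

A real assay would fit noisy time-to-threshold values from replicate dilutions, but the slope-based estimate is the same.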

  2. Comparison of symptomatology and performance degradation for motion and radiation sickness. Technical report, 6 January 1984-31 March 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClellan, G.E.; Wiker, S.F.

    1985-05-31

    This report quantifies for the first time the relationship between the signs and symptoms of acute radiation sickness and those of motion sickness. With this relationship, a quantitative comparison is made between data on human performance degradation during motion sickness and estimates of performance degradation during radiation sickness. The comparison validates estimates made by the Intermediate Dose Program on the performance degradation from acute radiation sickness.

  3. Inclusion and Student Learning: A Quantitative Comparison of Special and General Education Student Performance Using Team and Solo-Teaching

    ERIC Educational Resources Information Center

    Jamison, Joseph A.

    2013-01-01

    This quantitative study sought to determine whether there were significant statistical differences between the performance scores of special education and general education students' scores when in team or solo-teaching environments as may occur in inclusively taught classrooms. The investigated problem occurs because despite education's stated…

  4. Comparison of quantitative and qualitative tests for glucose-6-phosphate dehydrogenase deficiency.

    PubMed

    LaRue, Nicole; Kahn, Maria; Murray, Marjorie; Leader, Brandon T; Bansil, Pooja; McGray, Sarah; Kalnoky, Michael; Zhang, Hao; Huang, Huiqiang; Jiang, Hui; Domingo, Gonzalo J

    2014-10-01

    A barrier to eliminating Plasmodium vivax malaria is inadequate treatment of infected patients. 8-Aminoquinoline-based drugs clear the parasite; however, people with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk for hemolysis from these drugs. Understanding the performance of G6PD deficiency tests is critical for patient safety. Two quantitative assays and two qualitative tests were evaluated. The comparison of quantitative assays gave a Pearson correlation coefficient of 0.7585 with a significant difference in mean G6PD activity, highlighting the need to adhere to a single reference assay. Both qualitative tests had high sensitivity and negative predictive value at a cutoff G6PD value of 40% of normal activity if interpreted conservatively and performed under laboratory conditions. The performance of both tests dropped at a cutoff level of 45%. Cytochemical staining of specimens confirmed that heterozygous females with > 50% G6PD-deficient cells can seem normal by phenotypic tests. © The American Society of Tropical Medicine and Hygiene.
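For readers unfamiliar with the metrics in this abstract, a hedged sketch of how a Pearson coefficient and a sensitivity/NPV at a percent-of-normal cutoff are computed follows. The activity values are invented for illustration and are not the study's data.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between two quantitative G6PD assays."""
    return float(np.corrcoef(np.asarray(x, float), np.asarray(y, float))[0, 1])

def sensitivity_npv(activity, deficient, cutoff_pct, normal):
    """Sensitivity and negative predictive value of a qualitative
    deficiency call made at a cutoff expressed as a percentage of
    normal G6PD activity."""
    called = np.asarray(activity, float) < cutoff_pct / 100.0 * normal
    truth = np.asarray(deficient, bool)
    tp = np.sum(called & truth)    # deficient, correctly flagged
    fn = np.sum(~called & truth)   # deficient, missed
    tn = np.sum(~called & ~truth)  # normal, correctly passed
    return tp / (tp + fn), tn / (tn + fn)

# Invented activity values (units of U/g Hb assumed), normal activity 10.0
activity = [1.0, 2.5, 3.0, 7.5, 8.0, 9.5]
deficient = [True, True, True, False, False, False]
sens, npv = sensitivity_npv(activity, deficient, cutoff_pct=40, normal=10.0)
```

With this toy data every truly deficient sample falls below 40% of normal, so both sensitivity and NPV come out as 1.0; real assay data would not separate so cleanly.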

  5. Quantitative comparison of tumor delivery for multiple targeted nanoparticles simultaneously by multiplex ICP-MS.

    PubMed

    Elias, Andrew; Crayton, Samuel H; Warden-Rothman, Robert; Tsourkas, Andrew

    2014-07-28

    Given the rapidly expanding library of disease biomarkers and targeting agents, the number of unique targeted nanoparticles is growing exponentially. The high variability and expense of animal testing often makes it unfeasible to examine this large number of nanoparticles in vivo. This often leads to the investigation of a single formulation that performed best in vitro. However, nanoparticle performance in vivo depends on many variables, many of which cannot be adequately assessed with cell-based assays. To address this issue, we developed a lanthanide-doped nanoparticle method that allows quantitative comparison of multiple targeted nanoparticles simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with different targeting ligands were created, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectrometry was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood and resected tumor samples.

  6. Performance comparison of various seal coat grades used in Texas.

    DOT National Transportation Integrated Search

    2012-07-01

    This report documents research efforts to provide comparative quantitative performance information for various grades of seal coat aggregate available in the Texas Department of Transportation's standard specifications. Length of service before rep...

  7. A comparison study of image features between FFDM and film mammogram images

    PubMed Central

    Jing, Hao; Yang, Yongyi; Wernick, Miles N.; Yarusso, Laura M.; Nishikawa, Robert M.

    2012-01-01

    Purpose: This work is to provide a direct, quantitative comparison of image features measured by film and full-field digital mammography (FFDM). The purpose is to investigate whether there is any systematic difference between film and FFDM in terms of quantitative image features and their influence on the performance of a computer-aided diagnosis (CAD) system. Methods: The authors make use of a set of matched film-FFDM image pairs acquired from cadaver breast specimens with simulated microcalcifications consisting of bone and teeth fragments using both a GE digital mammography system and a screen-film system. To quantify the image features, the authors consider a set of 12 textural features of lesion regions and six image features of individual microcalcifications (MCs). The authors first conduct a direct comparison on these quantitative features extracted from film and FFDM images. The authors then study the performance of a CAD classifier for discriminating between MCs and false positives (FPs) when the classifier is trained on images of different types (film, FFDM, or both). Results: For all the features considered, the quantitative results show a high degree of correlation between features extracted from film and FFDM, with the correlation coefficients ranging from 0.7326 to 0.9602 for the different features. Based on a Fisher sign rank test, there was no significant difference observed between the features extracted from film and those from FFDM. For both MC detection and discrimination of FPs from MCs, FFDM had a slight but statistically significant advantage in performance; however, when the classifiers were trained on different types of images (acquired with FFDM or SFM) for discriminating MCs from FPs, there was little difference. Conclusions: The results indicate good agreement between film and FFDM in quantitative image features. 
While FFDM images provide better MC detection performance, FFDM and film images may be interchangeable for the purpose of training CAD algorithms, and a single CAD algorithm may be applied to either type of image. PMID:22830771

  8. Validation of the NCC Code for Staged Transverse Injection and Computations for a RBCC Combustor

    NASA Technical Reports Server (NTRS)

    Ajmani, Kumud; Liu, Nan-Suey

    2005-01-01

    The NCC code was validated for a case involving staged transverse injection into Mach 2 flow behind a rearward-facing step, using comparisons with experimental data and with solutions from the FPVortex code. The validated code was then used to perform computations studying fuel-air mixing in the combustor of a candidate rocket-based combined cycle engine geometry. Comparisons with a one-dimensional analysis and a three-dimensional code (VULCAN) were performed to assess the qualitative and quantitative performance of the NCC solver.

  9. Identification of common coexpression modules based on quantitative network comparison.

    PubMed

    Jo, Yousang; Kim, Sanghyeon; Lee, Doheon

    2018-06-13

    Finding common molecular interactions from different samples is essential to understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes. Therefore, identification of common coexpression networks or modules may reveal the molecular mechanism of a complex disease or the relationship between biological processes. However, there has been no quantitative comparison method for coexpression networks, and existing methods for other network types cannot be applied to them. We therefore aimed to propose quantitative comparison methods for coexpression networks and to use them to find common biological mechanisms between Huntington's disease and brain aging. We proposed two similarity measures for quantitative comparison of coexpression networks. We then performed experiments using known coexpression networks, showed the validity of the two measures, and evaluated threshold values for identifying similar coexpression network pairs. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and identified similar Huntington's disease-aging coexpression module pairs. These modules are related to brain development, cell death, and immune response, suggesting that up-regulated cell signalling related to cell death and immune/inflammation response may be a common molecular mechanism in the pathophysiology of HD and normal brain aging in the frontal cortex.
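The two similarity measures themselves are not specified in this abstract. Purely as an illustrative stand-in (not the authors' method), a simple edge-overlap measure between two coexpression networks, treating each network as a set of undirected gene-gene edges, could look like:

```python
def edge_jaccard(edges_a, edges_b):
    """Jaccard similarity between two coexpression networks: the fraction
    of undirected gene-gene edges the networks share."""
    normalize = lambda edges: {frozenset(e) for e in edges}  # ignore edge direction
    a, b = normalize(edges_a), normalize(edges_b)
    if not a and not b:
        return 1.0  # two empty networks are trivially identical
    return len(a & b) / len(a | b)

# Two toy networks sharing one of three distinct edges
sim = edge_jaccard([("g1", "g2"), ("g2", "g3")], [("g2", "g1"), ("g3", "g4")])
```

A threshold on such a score, calibrated against known-similar network pairs as the abstract describes, would then decide which module pairs count as "similar".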

  10. Quantitative Comparison of Tumor Delivery for Multiple Targeted Nanoparticles Simultaneously by Multiplex ICP-MS

    PubMed Central

    Elias, Andrew; Crayton, Samuel H.; Warden-Rothman, Robert; Tsourkas, Andrew

    2014-01-01

    Given the rapidly expanding library of disease biomarkers and targeting agents, the number of unique targeted nanoparticles is growing exponentially. The high variability and expense of animal testing often makes it unfeasible to examine this large number of nanoparticles in vivo. This often leads to the investigation of a single formulation that performed best in vitro. However, nanoparticle performance in vivo depends on many variables, many of which cannot be adequately assessed with cell-based assays. To address this issue, we developed a lanthanide-doped nanoparticle method that allows quantitative comparison of multiple targeted nanoparticles simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with different targeting ligands were created, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectrometry was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood and resected tumor samples. PMID:25068300

  11. Quantitative Comparison of Minimum Inductance and Minimum Power Algorithms for the Design of Shim Coils for Small Animal Imaging

    PubMed Central

    HUDSON, PARISA; HUDSON, STEPHEN D.; HANDLER, WILLIAM B.; SCHOLL, TIMOTHY J.; CHRONIK, BLAINE A.

    2010-01-01

    High-performance shim coils are required for high-field magnetic resonance imaging and spectroscopy. Complete sets of high-power and high-performance shim coils were designed using two different methods: the minimum inductance and the minimum power target field methods. A quantitative comparison of shim performance in terms of merit of inductance (ML) and merit of resistance (MR) was made for shim coils designed using the minimum inductance and the minimum power design algorithms. In each design case, the difference in ML and the difference in MR given by the two design methods were <15%. Comparison of wire patterns obtained using the two design algorithms shows that minimum inductance designs tend to feature oscillations within the current density, while minimum power designs tend to feature less rapidly varying current densities and lower power dissipation. Overall, the differences in coil performance obtained by the two methods are relatively small. For the specific case of shim systems customized for small animal imaging, the reduced power dissipation obtained when using the minimum power method is judged to be more significant than the improvements in switching speed obtained from the minimum inductance method. PMID:20411157

  12. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing

    PubMed Central

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-01-01

    Aims A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
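The statistical machinery named in this abstract is standard. As a minimal sketch (not the authors' implementation), Bland-Altman limits of agreement and a Deming regression with an assumed error-variance ratio `lam` can be computed as:

```python
import numpy as np

def bland_altman(x, y):
    """Bias and 95% limits of agreement between two quantitative assays."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    diff = y - x
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

def deming(x, y, lam=1.0):
    """Deming regression slope and intercept. Unlike ordinary least
    squares, it allows measurement error in both assays; lam is the
    assumed ratio of the two error variances (1.0 = equal error)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    sxx = np.sum((x - mx) ** 2)
    syy = np.sum((y - my) ** 2)
    sxy = np.sum((x - mx) * (y - my))
    slope = (syy - lam * sxx
             + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx

# Proportional (slope != 1) and constant (intercept != 0) error both show up
slope, intercept = deming([1.0, 2.0, 3.0], [3.0, 5.0, 7.0])
```

A slope far from 1 indicates proportional error and a nonzero intercept indicates constant error, which is exactly the information the abstract argues R² alone cannot provide.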

  13. A framework for organizing and selecting quantitative approaches for benefit-harm assessment.

    PubMed

    Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M

    2012-11-19

    Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decision-making context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that do or do not use a benefit-harm comparison metric. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provide a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences.
The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.

  14. A framework for organizing and selecting quantitative approaches for benefit-harm assessment

    PubMed Central

    2012-01-01

    Background Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decision-making context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that do or do not use a benefit-harm comparison metric. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provide a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences.
Conclusion The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches. PMID:23163976

  15. Comparison of selected analytical techniques for protein sizing, quantitation and molecular weight determination.

    PubMed

    Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C

    2004-09-30

    Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare the advantages and limitations of each technique with respect to different application areas, analysis time, and protein sizing and quantitation performance.

  16. Quantitative Comparison of Three Standardization Methods Using a One-Way ANOVA for Multiple Mean Comparisons

    ERIC Educational Resources Information Center

    Barrows, Russell D.

    2007-01-01

    A one-way ANOVA experiment is performed to determine whether or not the three standardization methods are statistically different in determining the concentration of the three paraffin analytes. The laboratory exercise asks students to combine the three methods in a single analytical procedure of their own design to determine the concentration of…
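As a hedged illustration of the analysis step this exercise asks for (the data below are synthetic, not the exercise's measurements), the one-way ANOVA F statistic is the between-group mean square over the within-group mean square:

```python
import numpy as np

def one_way_anova(*groups):
    """One-way ANOVA F statistic for comparing k group means.

    Returns (F, (df_between, df_within)); a large F suggests at least
    one group (e.g. one standardization method) differs from the others.
    """
    groups = [np.asarray(g, float) for g in groups]
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), (df_b, df_w)

# Three "methods" reporting identical concentration means: F should be 0
f_stat, (df_b, df_w) = one_way_anova([1.0, 2.0, 3.0],
                                     [1.0, 2.0, 3.0],
                                     [1.0, 2.0, 3.0])
```

In practice the F statistic would be compared against the F distribution with (df_between, df_within) degrees of freedom at the chosen significance level, e.g. via `scipy.stats.f.sf`.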

  17. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  18. Physical similarity or numerical representation counts in same-different, numerical comparison, physical comparison, and priming tasks?

    PubMed

    Zhang, Li; Xin, Ziqiang; Feng, Tingyong; Chen, Yinghe; Szűcs, Denes

    2018-03-01

    Recent studies have highlighted the fact that some tasks used to study symbolic number representations are confounded by judgments about physical similarity. Here, we investigated whether the contribution of physical similarity and numerical representation differed in the often-used symbolic same-different, numerical comparison, physical comparison, and priming tasks. Experiment 1 showed that subjective physical similarity was the best predictor of participants' performance in the same-different task, regardless of simultaneous or sequential presentation. Furthermore, the contribution of subjective physical similarity was larger in a simultaneous presentation than in a sequential presentation. Experiment 2 showed that only numerical representation was involved in numerical comparison. Experiment 3 showed that both subjective physical similarity and numerical representation contributed to participants' physical comparison performance. Finally, only numerical representation contributed to participants' performance in a priming task as revealed by Experiment 4. Taken together, the contribution of physical similarity and numerical representation depends on task demands. Performance primarily seems to rely on numerical properties in tasks that require explicit quantitative comparison judgments (physical or numerical), while physical stimulus properties exert an effect in the same-different task.

  19. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    PubMed

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R 2 ), using R 2 as the primary metric of assay agreement. However, the use of R 2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. Comparison of growth and metabolic regulation between wild, domesticated and transgenic salmonids.

    USDA-ARS's Scientific Manuscript database

    To gain a better understanding of the aspects underlying normal and growth hormone enhanced growth in salmonids, quantitative expression analysis was performed for a number of genes related to muscle growth, metabolism, immunology and energy regulation. This analysis was performed in liver and musc...

  1. Big fish in a big pond: a study of academic self concept in first year medical students.

    PubMed

    Jackman, Kirsty; Wilson, Ian G; Seaton, Marjorie; Craven, Rhonda G

    2011-07-27

    Big-fish-little-pond effect (BFLPE) research has demonstrated that students in high-ability environments have lower academic self-concepts than equally able students in low-ability settings. Research has shown low academic self-concepts to be associated with negative educational outcomes. Social comparison processes have been implicated as fundamental to the BFLPE. Twenty first-year students in an Australian medical school completed a survey that included academic self-concept and social comparison measures, before and after their first written assessments. Focus groups were also conducted with a separate group of students to explore students' perceptions of competence, the medical school environment, and social comparison processes. The quantitative study did not reveal any changes in academic self-concept or self-evaluation. The qualitative study suggested that the attributions that students used when discussing performance were those that have been demonstrated to negatively affect self-concept. Students reported that the environment was slightly competitive and they used social comparison to evaluate their performance. Although the BFLPE was not evident in the quantitative study, results from the qualitative study suggest that the BFLPE might be operating, in that students were using attributions that are associated with lower self-concepts, the environment was slightly competitive, and social comparisons were used for evaluation.

  2. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  3. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  4. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  5. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  6. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  7. A quantitative comparison of soil moisture inversion algorithms

    NASA Technical Reports Server (NTRS)

    Zyl, J. J. van; Kim, Y.

    2001-01-01

    This paper compares the performance of four bare surface radar soil moisture inversion algorithms in the presence of measurement errors. The particular errors considered include calibration errors, system thermal noise, local topography and vegetation cover.

  8. Large size crystalline vs. co-sintered ceramic Yb(3+):YAG disk performance in diode pumped amplifiers.

    PubMed

    Albach, Daniel; Chanteloup, Jean-Christophe

    2015-01-12

A comprehensive experimental benchmarking of Yb(3+):YAG crystalline and co-sintered ceramic disks of similar thickness and doping level is presented in the context of high-average-power laser amplifier operation. The comparison considers quantitative measurements and analysis of gain, depolarization, and wavefront deformation.

  9. Comparative Performance of Reagents and Platforms for Quantitation of Cytomegalovirus DNA by Digital PCR

    PubMed Central

    Gu, Z.; Sam, S. S.; Sun, Y.; Tang, L.; Pounds, S.; Caliendo, A. M.

    2016-01-01

A potential benefit of digital PCR is a reduction in result variability across assays and platforms. Three sets of PCR reagents for quantitation of cytomegalovirus (CMV) were tested on two digital PCR systems (Bio-Rad and RainDance). Both commercial quantitative viral standards and patient samples (n = 16) were tested. Quantitative accuracy (compared to nominal values) and variability were determined from the viral standard results. Quantitative correlation and variability were assessed with pairwise comparisons across all reagent-platform combinations for clinical plasma sample results. The three reagent sets, when used to assay quantitative standards on the Bio-Rad system, all showed a high degree of accuracy, low variability, and close agreement with one another. When used on the RainDance system, one of the three reagent sets showed a much better correlation to nominal values than did the other two. Quantitative results for patient samples showed good correlation in most pairwise comparisons, with poorer correlations for samples with low viral loads. Digital PCR is a robust method for measuring CMV viral load. Some degree of result variation may be seen, depending on the platform and reagents used; this variation appears to be greater in samples with low viral load values. PMID:27535685
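Digital PCR quantitation rests on Poisson correction of the positive-partition fraction; a minimal sketch of that standard calculation (the function name and droplet volume are illustrative, not taken from either vendor's software):

```python
import math

def dpcr_concentration(positive, total, partition_volume_ul):
    """Estimate target copies/uL from digital PCR partition counts.

    Poisson correction: the mean copies per partition is
    lambda = -ln(1 - p), where p is the fraction of positive partitions.
    """
    p = positive / total
    lam = -math.log(1.0 - p)          # mean copies per partition
    return lam / partition_volume_ul  # copies per microliter

# Example: 12,000 of 20,000 droplets positive, 0.00085 uL per droplet
conc = dpcr_concentration(12000, 20000, 0.00085)
```

Because of the logarithm, the estimate loses precision as the positive fraction approaches 0 or 1, which is consistent with the greater variation the study saw at low viral loads.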

  10. Comparison study of two procedures for the determination of emamectin benzoate in medicated fish feed.

    PubMed

    Farer, Leslie J; Hayes, John M

    2005-01-01

    A new method has been developed for the determination of emamectin benzoate in fish feed. The method uses a wet extraction, cleanup by solid-phase extraction, and quantitation and separation by liquid chromatography (LC). In this paper, we compare the performance of this method with that of a previously reported LC assay for the determination of emamectin benzoate in fish feed. Although similar to the previous method, the new procedure uses a different sample pretreatment, wet extraction, and quantitation method. The performance of the new method was compared with that of the previously reported method by analyses of 22 medicated feed samples from various commercial sources. A comparison of the results presented here reveals slightly lower assay values obtained with the new method. Although a paired sample t-test indicates the difference in results is significant, this difference is within the method precision of either procedure.

  11. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
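One design the review covers, the agreement study without a reference standard, is commonly summarized with Bland-Altman limits of agreement between two algorithms; a minimal sketch with illustrative data (not from the paper):

```python
import statistics

def limits_of_agreement(a, b):
    """Bland-Altman 95% limits of agreement between two algorithms'
    measurements of the same cases: mean difference +/- 1.96 SD."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD of the differences
    return bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative paired measurements from two QIB algorithms
alg_a = [10.1, 12.3, 9.8, 15.2, 11.0]
alg_b = [10.4, 12.0, 10.1, 15.6, 11.3]
lo, hi = limits_of_agreement(alg_a, alg_b)
```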

  12. cp-R, an interface to the R programming language for clinical laboratory method comparisons.

    PubMed

    Holmes, Daniel T

    2015-02-01

Clinical scientists frequently need to compare two different bioanalytical methods as part of assay validation/monitoring. As a matter of necessity, regression methods for quantitative comparison in clinical chemistry, hematology, and other clinical laboratory disciplines must allow for error in both the x and y variables. Traditionally, the methods popularized by 1) Deming and 2) Passing and Bablok have been recommended. While commercial tools exist, no simple open-source tool is available. The purpose of this work was to develop an entirely open-source, GUI-driven program for bioanalytical method comparisons capable of performing these regression methods and able to produce highly customized graphical output. The GUI is written in Python and PyQt4, with R scripts performing regression and graphical functions. The program can be run from source code or as a pre-compiled binary executable. The software performs three forms of regression and offers weighting where applicable. Confidence bands of the regression are calculated using bootstrapping for the Deming and Passing-Bablok methods. Users can customize regression plots according to the tools available in R and can produce output in any of jpg, png, tiff, or bmp format at any desired resolution, or in ps and pdf vector formats. Bland-Altman plots and some regression diagnostic plots are also generated. Correctness of regression parameter estimates was confirmed against existing R packages. The program allows for rapid and highly customizable graphical output capable of conforming to the publication requirements of any clinical chemistry journal. Quick method comparisons can also be performed and the results pasted into spreadsheet or word-processing applications. We present a simple and intuitive open-source tool for quantitative method comparison in a clinical laboratory environment. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
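The Deming fit at the core of such comparisons has a closed-form estimator; a minimal sketch in Python of the standard estimator (cp-R itself delegates regression to R scripts, so this illustrates the method, not the program's code):

```python
import statistics

def deming(x, y, lam=1.0):
    """Deming regression (errors in both x and y); lam is the ratio of
    the y to x error variances.  Returns (slope, intercept)."""
    n = len(x)
    mx, my = statistics.mean(x), statistics.mean(y)
    sxx = sum((v - mx) ** 2 for v in x) / (n - 1)
    syy = sum((v - my) ** 2 for v in y) / (n - 1)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    slope = (syy - lam * sxx + ((syy - lam * sxx) ** 2
             + 4 * lam * sxy ** 2) ** 0.5) / (2 * sxy)
    return slope, my - slope * mx

# Illustrative paired method results (method B reads roughly 2x method A)
slope, intercept = deming([1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.1, 7.9])
```

Unlike ordinary least squares, this estimator is symmetric in its treatment of measurement error, which is why it (and Passing-Bablok) is preferred for method-comparison studies.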

  13. Quantitative analysis of peel-off degree for printed electronics

    NASA Astrophysics Data System (ADS)

    Park, Janghoon; Lee, Jongsu; Sung, Ki-Hak; Shin, Kee-Hyun; Kang, Hyunkyoo

    2018-02-01

We suggest a facile methodology for evaluating peel-off degree in printed electronics by image processing. The peeled and printed areas were quantified using open-source programs. To verify the accuracy of the method, we manually removed areas from the measured printed circuit, resulting in 96.3% accuracy. The sintered patterns showed a decreasing tendency as the energy density of an infrared lamp increased, while the peel-off degree increased; a comparison between the two results is presented. Finally, the correlation between performance characteristics was determined by quantitative analysis.
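The peel-off degree itself reduces to a peeled-pixel fraction; a minimal sketch of the idea (the brightness threshold and toy image are our illustrative assumptions, not the paper's procedure):

```python
def peel_off_degree(gray, threshold=128):
    """Fraction of the printed region classified as peeled off, taking
    pixels brighter than `threshold` as exposed substrate (peeled)."""
    pixels = [p for row in gray for p in row]
    peeled = sum(1 for p in pixels if p > threshold)
    return peeled / len(pixels)

# Toy 4x4 grayscale "image": 4 of 16 pixels exceed the threshold
img = [[200, 50, 50, 50],
       [50, 200, 50, 50],
       [50, 50, 200, 50],
       [50, 50, 50, 200]]
degree = peel_off_degree(img)  # 0.25
```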

  14. Big Fish in a Big Pond: a study of academic self concept in first year medical students

    PubMed Central

    2011-01-01

Background Big-fish-little-pond effect (BFLPE) research has demonstrated that students in high-ability environments have lower academic self-concepts than equally able students in low-ability settings. Research has shown low academic self-concepts to be associated with negative educational outcomes. Social comparison processes have been implicated as fundamental to the BFLPE. Methods Twenty first-year students in an Australian medical school completed a survey that included academic self-concept and social comparison measures, before and after their first written assessments. Focus groups were also conducted with a separate group of students to explore students' perceptions of competence, the medical school environment, and social comparison processes. Results The quantitative study did not reveal any changes in academic self-concept or self-evaluation. The qualitative study suggested that the attributions students used when discussing performance were those that have been demonstrated to negatively affect self-concept. Students reported that the environment was slightly competitive and that they used social comparison to evaluate their performance. Conclusions Although the BFLPE was not evident in the quantitative study, results from the qualitative study suggest that the BFLPE might be operating, in that students were using attributions associated with lower self-concepts, the environment was slightly competitive, and social comparisons were used for evaluation. PMID:21794166

  15. Objective comparison of particle tracking methods.

    PubMed

    Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R; Godinez, William J; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E G; Jaldén, Joakim; Blau, Helen M; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P; Dan, Han-Wei; Tsai, Yuh-Show; Ortiz de Solórzano, Carlos; Olivo-Marin, Jean-Christophe; Meijering, Erik

    2014-03-01

    Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Because manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized an open competition in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to notable practical conclusions for users and developers.

  16. a Performance Comparison of Feature Detectors for Planetary Rover Mapping and Localization

    NASA Astrophysics Data System (ADS)

    Wan, W.; Peng, M.; Xing, Y.; Wang, Y.; Liu, Z.; Di, K.; Teng, B.; Mao, X.; Zhao, Q.; Xin, X.; Jia, M.

    2017-07-01

Feature detection and matching are key techniques in computer vision and robotics and have been successfully applied in many fields. So far, there has been no performance comparison of feature detectors and matching methods for planetary mapping and rover localization using rover stereo images. In this research, we present a comprehensive evaluation and comparison of six feature detectors, including Moravec, Förstner, Harris, FAST, SIFT, and SURF, aiming for optimal implementation of feature-based matching in a planetary surface environment. To facilitate quantitative analysis, a series of evaluation criteria, including distribution evenness of matched points, coverage of detected points, and feature matching accuracy, were developed in the research. To perform an exhaustive evaluation, stereo images simulated under different baselines, pitch angles, and intervals between adjacent rover locations were taken as the experimental data source. The comparison results show that SIFT offers the best overall performance; in particular, it is less sensitive to changes between images taken at adjacent locations.
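The first criterion, distribution evenness of matched points, can be scored by tiling the image with a grid and measuring deviation from a uniform split; a sketch of one such score (the grid size and normalization are our illustrative choices, not necessarily the paper's exact formulation):

```python
def evenness(points, width, height, nx=4, ny=4):
    """Distribution-evenness score for matched feature points: 1.0 when
    every grid cell holds an equal share of points, small when all
    points pile into one cell.  Score = 1 - deviation / (2 * n), where
    deviation is the total absolute departure from the uniform share."""
    counts = [[0] * nx for _ in range(ny)]
    for x, y in points:
        i = min(int(x / width * nx), nx - 1)
        j = min(int(y / height * ny), ny - 1)
        counts[j][i] += 1
    n = len(points)
    ideal = n / (nx * ny)
    deviation = sum(abs(c - ideal) for row in counts for c in row)
    return 1.0 - deviation / (2 * n)

# One point per cell of a 4x4 grid on a 100x100 image: perfectly even
pts = [(12.5 + 25 * i, 12.5 + 25 * j) for i in range(4) for j in range(4)]
score = evenness(pts, 100, 100)  # 1.0
```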

  17. A Quantitative Comparison of the Relative Performance of VHF and UHF Broadcast Systems. Technical Monograph Number 1.

    ERIC Educational Resources Information Center

    Rubin, Philip A.; And Others

A study was undertaken to: (1) assess problems with UHF television systems; and (2) identify problem-solving activities on which different broadcast institutions could cooperate. The model for comparing UHF with VHF broadcast/reception services assigned performance disparity figures to each of the following elements: (1) transmitter and…

  18. Academic Performance of Students with Disabilities in Higher Education: Insights from a Study of One Catholic College

    ERIC Educational Resources Information Center

    Wasielewski, Laura M.

    2016-01-01

    The purpose of this study was to determine if students with disabilities perform comparably to students without disabilities academically at a small Catholic liberal arts college. Quantitative results were gathered through the comparison of end of semester and cumulative grade point averages for students with disabilities and students without…

  19. The Effect of Studying Tech Prep in High School and College Academic Performance

    ERIC Educational Resources Information Center

    Ray, Larry A.

    2011-01-01

    This study examined the academic performance of Tech Prep students (referred to as participants) in comparison to non-Tech Prep students (referred to as non-participants) entering a two-year community college from sixteen different high schools in Stark County, Ohio. This study provided a quantitative analysis of students' academic experiences to…

  20. A risk based approach for SSTO/TSTO comparisons

    NASA Astrophysics Data System (ADS)

    Greenberg, Joel S.

    1996-03-01

An approach has been developed for performing early comparisons of transportation architectures that explicitly takes into account quantitative measures of uncertainty and resulting risk. Risk considerations are necessary since the transportation systems are likely to have significantly different levels of risk, both because of differing degrees of freedom in achieving desired performance levels and because of their different states of development and utilization. The approach considers the uncertainty of achieving technology goals, the effect that the achieved technology level will have on transportation system performance, and the relationship between system performance/capability and the ability to accommodate variations in payload mass. The consequences of system performance are developed in terms of nonrecurring and recurring costs and the present value of transportation system life-cycle costs.
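The present value of life-cycle costs mentioned above is standard discounting; a minimal sketch (the discount rate and cost stream are illustrative):

```python
def present_value(costs, rate):
    """Present value of a yearly cost stream: sum of cost_t / (1+r)^t,
    with t = 0 for the first (current-year) entry."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(costs))

# A nonrecurring cost now, then three years of recurring cost, at 8%
pv = present_value([1000.0, 200.0, 200.0, 200.0], 0.08)
```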

  1. A comparison of manual and quantitative elbow strength testing.

    PubMed

    Shahgholi, Leili; Bengtson, Keith A; Bishop, Allen T; Shin, Alexander Y; Spinner, Robert J; Basford, Jeffrey R; Kaufman, Kenton R

    2012-10-01

    The aim of this study was to compare the clinical ratings of elbow strength obtained by skilled clinicians with objective strength measurement obtained through quantitative testing. A retrospective comparison of subject clinical records with quantitative strength testing results in a motion analysis laboratory was conducted. A total of 110 individuals between the ages of 8 and 65 yrs with traumatic brachial plexus injuries were identified. Patients underwent manual muscle strength testing as assessed on the 5-point British Medical Research Council Scale (5/5, normal; 0/5, absent) and quantitative elbow flexion and extension strength measurements. A total of 92 subjects had elbow flexion testing. Half of the subjects clinically assessed as having normal (5/5) elbow flexion strength on manual muscle testing exhibited less than 42% of their age-expected strength on quantitative testing. Eighty-four subjects had elbow extension strength testing. Similarly, half of those displaying normal elbow extension strength on manual muscle testing were found to have less than 62% of their age-expected values on quantitative testing. Significant differences between manual muscle testing and quantitative findings were not detected for the lesser (0-4) strength grades. Manual muscle testing, even when performed by experienced clinicians, may be more misleading than expected for subjects graded as having normal (5/5) strength. Manual muscle testing estimates for the lesser strength grades (1-4/5) seem reasonably accurate.

  2. Statistical differences between relative quantitative molecular fingerprints from microbial communities.

    PubMed

    Portillo, M C; Gonzalez, J M

    2008-08-01

    Molecular fingerprints of microbial communities are a common method for the analysis and comparison of environmental samples. The significance of differences between microbial community fingerprints was analyzed considering the presence of different phylotypes and their relative abundance. A method is proposed by simulating coverage of the analyzed communities as a function of sampling size applying a Cramér-von Mises statistic. Comparisons were performed by a Monte Carlo testing procedure. As an example, this procedure was used to compare several sediment samples from freshwater ponds using a relative quantitative PCR-DGGE profiling technique. The method was able to discriminate among different samples based on their molecular fingerprints, and confirmed the lack of differences between aliquots from a single sample.
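The proposed procedure can be caricatured in a few lines: a Cramér-von Mises-type distance between relative-abundance profiles, with significance from Monte Carlo resampling (a sketch of the idea under our own simplifying assumptions, not the paper's exact coverage-versus-sampling-size simulation):

```python
import random

def cvm_stat(p, q):
    """Cramér-von Mises-type distance between two relative-abundance
    profiles: sum of squared differences of their cumulative sums."""
    cp = cq = stat = 0.0
    for a, b in zip(p, q):
        cp += a
        cq += b
        stat += (cp - cq) ** 2
    return stat

def rel(counts):
    """Convert raw band intensities/counts to relative abundances."""
    total = sum(counts)
    return [c / total for c in counts]

def monte_carlo_p(counts_a, counts_b, trials=2000, seed=1):
    """Monte Carlo test: resample both communities from the pooled
    profile and count how often the resampled distance reaches the
    observed one."""
    rng = random.Random(seed)
    observed = cvm_stat(rel(counts_a), rel(counts_b))
    weights = rel([a + b for a, b in zip(counts_a, counts_b)])
    k = len(weights)
    exceed = 0
    for _ in range(trials):
        ra, rb = [0] * k, [0] * k
        for i in rng.choices(range(k), weights, k=sum(counts_a)):
            ra[i] += 1
        for i in rng.choices(range(k), weights, k=sum(counts_b)):
            rb[i] += 1
        if cvm_stat(rel(ra), rel(rb)) >= observed:
            exceed += 1
    return exceed / trials

# Two DGGE-style fingerprints sharing phylotypes but with shifted abundances;
# a small p-value would indicate the profiles differ
p_value = monte_carlo_p([30, 20, 10], [5, 25, 30], trials=500)
```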

  3. Telerobotics - Display, control, and communication problems

    NASA Technical Reports Server (NTRS)

    Stark, Lawrence; Kim, Won-Soo; Tendick, Frank; Hannaford, Blake; Ellis, Stephen

    1987-01-01

An experimental telerobotics simulation suitable for studying human operator (HO) performance is described. Simple manipulator pick-and-place and tracking tasks allowed quantitative comparison of a number of calligraphic display viewing conditions. An enhanced perspective display was effective with a reference line from target to base, with or without a complex three-dimensional grid framing the view; this was especially true if geometrical display parameters such as azimuth and elevation were arranged to be near optimal. Quantitative comparisons were made possible by control performance measures such as root mean square error. There was a distinct preference for controlling the manipulator in end-effector Cartesian space for the primitive pick-and-place task, rather than controlling joint angles and then, via direct kinematics, the end-effector position. An introduced communication delay was found to produce a decrease in performance. In considerable part, this difficulty could be compensated for by preview control information. The fact that neurological control of normal human movement contains a sampled-data period of 0.2 s may relate to this robustness of HO control to delay.
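The root mean square error used as the control performance measure is a one-line computation; a minimal sketch (the tracking traces are illustrative):

```python
def rms_error(actual, target):
    """Root mean square tracking error between operator and target paths."""
    n = len(actual)
    return (sum((a - t) ** 2 for a, t in zip(actual, target)) / n) ** 0.5

# Sampled cursor vs. target positions; error only at the last sample
err = rms_error([1.0, 2.0, 4.0], [1.0, 2.0, 1.0])  # sqrt(9/3) = sqrt(3)
```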

  4. Statistical issues in the comparison of quantitative imaging biomarker algorithms using pulmonary nodule volume as an example.

    PubMed

    Obuchowski, Nancy A; Barnhart, Huiman X; Buckler, Andrew J; Pennello, Gene; Wang, Xiao-Feng; Kalpathy-Cramer, Jayashree; Kim, Hyun J Grace; Reeves, Anthony P

    2015-02-01

    Quantitative imaging biomarkers are being used increasingly in medicine to diagnose and monitor patients' disease. The computer algorithms that measure quantitative imaging biomarkers have different technical performance characteristics. In this paper we illustrate the appropriate statistical methods for assessing and comparing the bias, precision, and agreement of computer algorithms. We use data from three studies of pulmonary nodules. The first study is a small phantom study used to illustrate metrics for assessing repeatability. The second study is a large phantom study allowing assessment of four algorithms' bias and reproducibility for measuring tumor volume and the change in tumor volume. The third study is a small clinical study of patients whose tumors were measured on two occasions. This study allows a direct assessment of six algorithms' performance for measuring tumor change. With these three examples we compare and contrast study designs and performance metrics, and we illustrate the advantages and limitations of various common statistical methods for quantitative imaging biomarker studies. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
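Repeatability from phantom replicates like those in the first study is typically reported as a repeatability coefficient, RC = 2.77 × the within-case SD; a minimal sketch with illustrative volumes (not the study's data):

```python
import statistics

def repeatability_coefficient(replicates):
    """RC = 2.77 * within-case SD: the difference below which two repeat
    measurements on the same case are expected to fall 95% of the time.
    `replicates` is a list of per-case measurement lists."""
    wvar = statistics.mean(statistics.variance(r) for r in replicates)
    return 2.77 * wvar ** 0.5

# Three phantom nodules, each measured twice (illustrative volumes, mm^3)
rc = repeatability_coefficient([[102.0, 104.0], [98.5, 99.5], [110.0, 111.0]])
```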

  5. Comparison of multipoint linkage analyses for quantitative traits in the CEPH data: parametric LOD scores, variance components LOD scores, and Bayes factors.

    PubMed

    Sung, Yun Ju; Di, Yanming; Fu, Audrey Q; Rothstein, Joseph H; Sieh, Weiva; Tong, Liping; Thompson, Elizabeth A; Wijsman, Ellen M

    2007-01-01

    We performed multipoint linkage analyses with multiple programs and models for several gene expression traits in the Centre d'Etude du Polymorphisme Humain families. All analyses provided consistent results for both peak location and shape. Variance-components (VC) analysis gave wider peaks and Bayes factors gave fewer peaks. Among programs from the MORGAN package, lm_multiple performed better than lm_markers, resulting in less Markov-chain Monte Carlo (MCMC) variability between runs, and the program lm_twoqtl provided higher LOD scores by also including either a polygenic component or an additional quantitative trait locus.

  6. Comparison of multipoint linkage analyses for quantitative traits in the CEPH data: parametric LOD scores, variance components LOD scores, and Bayes factors

    PubMed Central

    Sung, Yun Ju; Di, Yanming; Fu, Audrey Q; Rothstein, Joseph H; Sieh, Weiva; Tong, Liping; Thompson, Elizabeth A; Wijsman, Ellen M

    2007-01-01

    We performed multipoint linkage analyses with multiple programs and models for several gene expression traits in the Centre d'Etude du Polymorphisme Humain families. All analyses provided consistent results for both peak location and shape. Variance-components (VC) analysis gave wider peaks and Bayes factors gave fewer peaks. Among programs from the MORGAN package, lm_multiple performed better than lm_markers, resulting in less Markov-chain Monte Carlo (MCMC) variability between runs, and the program lm_twoqtl provided higher LOD scores by also including either a polygenic component or an additional quantitative trait locus. PMID:18466597

  7. Uncertainty Analysis of Radar and Gauge Rainfall Estimates in the Russian River Basin

    NASA Astrophysics Data System (ADS)

    Cifelli, R.; Chen, H.; Willie, D.; Reynolds, D.; Campbell, C.; Sukovich, E.

    2013-12-01

    Radar Quantitative Precipitation Estimation (QPE) has been a very important application of weather radar since it was introduced and made widely available after World War II. Although great progress has been made over the last two decades, it is still a challenging process especially in regions of complex terrain such as the western U.S. It is also extremely difficult to make direct use of radar precipitation data in quantitative hydrologic forecasting models. To improve the understanding of rainfall estimation and distributions in the NOAA Hydrometeorology Testbed in northern California (HMT-West), extensive evaluation of radar and gauge QPE products has been performed using a set of independent rain gauge data. This study focuses on the rainfall evaluation in the Russian River Basin. The statistical properties of the different gridded QPE products will be compared quantitatively. The main emphasis of this study will be on the analysis of uncertainties of the radar and gauge rainfall products that are subject to various sources of error. The spatial variation analysis of the radar estimates is performed by measuring the statistical distribution of the radar base data such as reflectivity and by the comparison with a rain gauge cluster. The application of mean field bias values to the radar rainfall data will also be described. The uncertainty analysis of the gauge rainfall will be focused on the comparison of traditional kriging and conditional bias penalized kriging (Seo 2012) methods. This comparison is performed with the retrospective Multisensor Precipitation Estimator (MPE) system installed at the NOAA Earth System Research Laboratory. The independent gauge set will again be used as the verification tool for the newly generated rainfall products.
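The mean field bias adjustment mentioned above scales the radar field by the ratio of gauge totals to collocated radar totals; a minimal sketch (rainfall values illustrative):

```python
def mean_field_bias(gauge, radar_at_gauges):
    """Multiplicative mean field bias: ratio of summed gauge rainfall to
    summed radar rainfall at the gauge locations."""
    return sum(gauge) / sum(radar_at_gauges)

def apply_bias(radar_field, bias):
    """Scale every radar pixel by the mean field bias."""
    return [[bias * v for v in row] for row in radar_field]

# Three gauges read 30 mm total where radar estimated 25 mm: bias = 1.2
bias = mean_field_bias([12.0, 8.0, 10.0], [10.0, 6.0, 9.0])
corrected = apply_bias([[5.0, 0.0], [2.5, 10.0]], bias)
```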

  8. Objective comparison of particle tracking methods

    PubMed Central

    Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F.; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R.; Godinez, William J.; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E. G.; Jaldén, Joakim; Blau, Helen M.; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L.; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P.; Dan, Han-Wei; Tsai, Yuh-Show; de Solórzano, Carlos Ortiz; Olivo-Marin, Jean-Christophe; Meijering, Erik

    2014-01-01

    Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Since manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized, for the first time, an open competition, in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to important practical conclusions for users and developers. PMID:24441936

  9. Near-field marking of gold nanostars by ultrashort pulsed laser irradiation: experiment and simulations

    NASA Astrophysics Data System (ADS)

    Møller, Søren H.; Vester-Petersen, Joakim; Nazir, Adnan; Eriksen, Emil H.; Julsgaard, Brian; Madsen, Søren P.; Balling, Peter

    2018-02-01

    Quantitative measurements of the electric near-field distribution of star-shaped gold nanoparticles have been performed by femtosecond laser ablation. Measurements were carried out on and off the plasmon resonance. A detailed comparison with numerical simulations of the electric fields is presented. Semi-quantitative agreement is found, with slight systematic differences between experimentally observed and simulated near-field patterns close to strong electric-field gradients. The deviations are attributed to carrier transport preceding ablation.

  10. Quantitative diagnostic performance of myocardial perfusion SPECT with attenuation correction in women.

    PubMed

    Wolak, Arik; Slomka, Piotr J; Fish, Mathews B; Lorenzo, Santiago; Berman, Daniel S; Germano, Guido

    2008-06-01

Attenuation correction (AC) for myocardial perfusion SPECT (MPS) had not been evaluated separately in women despite specific considerations in this group because of breast photon attenuation. We aimed to evaluate the performance of AC in women by using automated quantitative analysis of MPS to avoid any bias. Consecutive female patients--134 with a low likelihood (LLk) of coronary artery disease (CAD) and 114 with coronary angiography performed within less than 3 mo of MPS--who were referred for rest-stress electrocardiography-gated 99mTc-sestamibi MPS with AC were considered. Imaging data were evaluated for contour quality control. An additional 50 LLk studies in women were used to create equivalent normal limits for studies with AC and with no correction (NC). An experienced technologist unaware of the angiography and other results performed the contour quality control. All other processing was performed in a fully automated manner. Quantitative analysis was performed with the Cedars-Sinai myocardial perfusion analysis package. All automated segmental analyses were performed with the 17-segment, 5-point American Heart Association model. Summed stress scores (SSS) of ≥3 were considered abnormal. CAD (≥70% stenosis) was present in 69 of 114 patients (60%). The normalcy rates were 93% for both NC and AC studies. The SSS for patients with CAD and without CAD for NC versus AC were 10.0 ± 9.0 (mean ± SD) versus 10.2 ± 8.5 and 1.6 ± 2.3 versus 1.8 ± 2.5, respectively; P was not significant (NS) for all comparisons of NC versus AC. The SSS for LLk patients for NC versus AC were 0.51 ± 1.0 versus 0.6 ± 1.1, respectively; P was NS. The specificity for both NC and AC was 73%. The sensitivities for NC and AC were 80% and 81%, respectively, and the accuracies for NC and AC were 77% and 78%, respectively; P was NS for both comparisons. There are no significant diagnostic differences between automated quantitative MPS analyses performed in studies processed with and without AC in women.
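The summed stress score in this study is the sum over the 17 AHA segments, each scored 0 (normal) to 4 (absent uptake); a minimal sketch:

```python
def summed_stress_score(segment_scores):
    """Summed stress score (SSS) for the 17-segment, 5-point AHA model:
    the sum of per-segment perfusion scores, 0 (normal) to 4 (absent)."""
    assert len(segment_scores) == 17
    assert all(0 <= s <= 4 for s in segment_scores)
    return sum(segment_scores)

# Two mildly abnormal segments (scores 2 and 1) in an otherwise normal study
sss = summed_stress_score([0] * 15 + [2, 1])
abnormal = sss >= 3  # abnormal by the study's SSS >= 3 criterion: True
```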

  11. Analyses of Disruption of Cerebral White Matter Integrity in Schizophrenia with MR Diffusion Tensor Fiber Tracking Method

    NASA Astrophysics Data System (ADS)

    Yamamoto, Utako; Kobayashi, Tetsuo; Kito, Shinsuke; Koga, Yoshihiko

We have analyzed cerebral white matter using magnetic resonance diffusion tensor imaging (MR-DTI) to measure the diffusion anisotropy of water molecules. The goal of this study is the quantitative evaluation of schizophrenia. Diffusion tensor images were acquired from patients with schizophrenia and healthy comparison subjects, group-matched for age, sex, and handedness. Fiber tracking was performed on the superior longitudinal fasciculus for comparison between the patient and comparison groups. We analyzed and compared the cross-sectional area on the starting coronal plane and the mean and standard deviation of the fractional anisotropy and the apparent diffusion coefficient along fibers in the right and left hemispheres. In the right hemisphere, the cross-sectional areas in the patient group were significantly smaller than those in the comparison group. Furthermore, in the comparison group, the cross-sectional areas in the right hemisphere were significantly larger than those in the left hemisphere, whereas there was no significant difference in the patient group. These results suggest that we may quantitatively evaluate the disruption of white matter integrity in schizophrenic patients by comparing the cross-sectional areas of the superior longitudinal fasciculus in the right and left hemispheres.

  12. Accuracy and precision of pseudo-continuous arterial spin labeling perfusion during baseline and hypercapnia: a head-to-head comparison with ¹⁵O H₂O positron emission tomography.

    PubMed

    Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J

    2014-05-15

Measurements of cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure CBF, and additional hypercapnic pCASL measurements are currently showing great promise for quantitatively assessing CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of its exact accuracy and precision compared to the gold standard. ¹⁵O H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is one of the more invasive methods as well. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with ¹⁵O H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to ¹⁵O H₂O PET, with comparable precision. These results pave the way for quantitative use of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. Systematic Comparison of Label-Free, Metabolic Labeling, and Isobaric Chemical Labeling for Quantitative Proteomics on LTQ Orbitrap Velos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhou; Adams, Rachel M; Chourey, Karuna

    2012-01-01

    A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially the quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification. Isobaric chemical labeling surpasses metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.
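    Quantification reproducibility comparisons of this kind are typically summarized as coefficients of variation (CV) across replicate measurements. A minimal sketch with invented replicate values (not data from the study):

    ```python
    import statistics

    def cv_percent(values):
        """Coefficient of variation (%) across replicate measurements."""
        return 100 * statistics.stdev(values) / statistics.mean(values)

    # Hypothetical replicate abundances for one protein under each method
    spectral_counts = [12, 18, 9]          # label-free spectral counting
    metabolic = [1.02e6, 1.10e6, 0.98e6]   # metabolic labeling intensities
    isobaric = [2.01e5, 2.05e5, 1.99e5]    # iTRAQ/TMT reporter intensities

    for name, reps in [("spectral counting", spectral_counts),
                       ("metabolic labeling", metabolic),
                       ("isobaric labeling", isobaric)]:
        print(f"{name}: CV = {cv_percent(reps):.1f}%")
    ```

    A lower CV across replicates corresponds to the higher quantification reproducibility the study attributes to the labeling-based approaches.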

  14. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how these relate to process performance. Organizations are starting to embrace the Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as de facto process improvement frameworks for improving business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. 
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
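    A Monte Carlo model of the kind described can be sketched as follows; the driver variables, weights, and distributions here are invented for illustration and are not the study's ACSI model:

    ```python
    import random
    import statistics

    random.seed(42)  # reproducible draws

    def simulate_acsi(n_trials=10_000):
        """Monte Carlo sketch: propagate uncertainty in hypothetical ACSI
        drivers (quality, expectations, value) to a satisfaction score."""
        scores = []
        for _ in range(n_trials):
            quality = random.gauss(8.0, 0.5)       # perceived quality, 0-10
            expectations = random.gauss(7.5, 0.7)  # customer expectations
            value = random.gauss(7.0, 0.6)         # perceived value
            # Illustrative linear driver model scaled to ACSI's 0-100 range
            scores.append(10 * (0.5 * quality + 0.2 * expectations + 0.3 * value))
        return statistics.mean(scores), statistics.stdev(scores)

    mean, sd = simulate_acsi()
    print(f"predicted ACSI: {mean:.1f} +/- {sd:.1f}")
    ```

    Repeated draws like these yield a baseline distribution of outcome scores, which is the sense in which such a model "predicts" future customer satisfaction index scores.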

  15. Comparing Laser Welding Technologies with Friction Stir Welding for Production of Aluminum Tailor-Welded Blanks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hovanski, Yuri; Carsley, John; Carlson, Blair

    2014-01-15

    A comparison of welding techniques was performed to determine the most effective method for producing aluminum tailor-welded blanks for high volume automotive applications. Aluminum sheet was joined with an emphasis on post weld formability, surface quality and weld speed. Comparative results from several laser based welding techniques along with friction stir welding are presented. The results of this study demonstrate a quantitative comparison of weld methodologies in preparing tailor-welded aluminum stampings for high volume production in the automotive industry. Evaluation of nearly a dozen welding variations ultimately led to down selecting a single process based on post-weld quality and performance.

  16. What Value "Value Added"?

    ERIC Educational Resources Information Center

    Richards, Andrew

    2015-01-01

    Two quantitative measures of school performance are currently used, the average points score (APS) at Key Stage 2 and value-added (VA), which measures the rate of academic improvement between Key Stage 1 and 2. These figures are used by parents and the Office for Standards in Education to make judgements and comparisons. However, simple…

  17. Performance evaluation of canine-associated Bacteroidales assays in a multi-laboratory comparison study

    EPA Science Inventory

    The contribution of fecal pollution from dogs in urbanized areas can be significant and is an often underestimated problem. Microbial source tracking methods (MST) utilizing quantitative PCR of dog-associated gene sequences encoding 16S rRNA of Bacteroidales are a useful tool to ...

  18. Quantitative Features of Liver Lesions, Lung Nodules, and Renal Stones at Multi-Detector Row CT Examinations: Dependency on Radiation Dose and Reconstruction Algorithm.

    PubMed

    Solomon, Justin; Mileto, Achille; Nelson, Rendon C; Roy Choudhury, Kingshuk; Samei, Ehsan

    2016-04-01

    To determine if radiation dose and reconstruction algorithm affect the computer-based extraction and analysis of quantitative imaging features in lung nodules, liver lesions, and renal stones at multi-detector row computed tomography (CT). Retrospective analysis of data from a prospective, multicenter, HIPAA-compliant, institutional review board-approved clinical trial was performed by extracting 23 quantitative imaging features (size, shape, attenuation, edge sharpness, pixel value distribution, and texture) of lesions on multi-detector row CT images of 20 adult patients (14 men, six women; mean age, 63 years; range, 38-72 years) referred for known or suspected focal liver lesions, lung nodules, or kidney stones. Data were acquired between September 2011 and April 2012. All multi-detector row CT scans were performed at two different radiation dose levels; images were reconstructed with filtered back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction (MBIR) algorithms. A linear mixed-effects model was used to assess the effect of radiation dose and reconstruction algorithm on extracted features. Among the 23 imaging features assessed, radiation dose had a significant effect on five, three, and four of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Adaptive statistical iterative reconstruction had a significant effect on three, one, and one of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). MBIR reconstruction had a significant effect on nine, 11, and 15 of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Of note, the measured sizes of lung nodules and renal stones with MBIR were significantly different from those for the other two algorithms (P < .002 for all comparisons). Although lesion texture was significantly affected by the reconstruction algorithm used (an average of 3.33 features affected by MBIR across lesion types; P < .002 for all comparisons), no significant effect of the radiation dose setting was observed for all but one of the texture features (P = .002-.998). Radiation dose settings and reconstruction algorithms affect the extraction and analysis of quantitative imaging features in lesions at multi-detector row CT.
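    The mixed-effects idea can be illustrated in miniature: under a random-intercept model, within-patient differences between dose levels cancel each patient's intercept, so the fixed dose effect can be estimated from those differences. This two-stage shortcut is a simplification of a full mixed-model fit, and the feature values below are invented:

    ```python
    import statistics

    # Hypothetical feature measurements (e.g., lesion attenuation, HU)
    # per patient at two radiation dose levels; patient is the random effect.
    measurements = {
        "patient_1": {"full_dose": 52.1, "half_dose": 55.4},
        "patient_2": {"full_dose": 48.7, "half_dose": 50.2},
        "patient_3": {"full_dose": 60.3, "half_dose": 63.9},
    }

    # Within-patient differences remove each patient's random intercept,
    # leaving an estimate of the fixed dose effect.
    diffs = [m["half_dose"] - m["full_dose"] for m in measurements.values()]
    effect = statistics.mean(diffs)
    se = statistics.stdev(diffs) / len(diffs) ** 0.5
    print(f"estimated dose effect: {effect:.2f} (SE {se:.2f})")
    ```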

  19. Quantitation of sugar content in pyrolysis liquids after acid hydrolysis using high-performance liquid chromatography without neutralization.

    PubMed

    Johnston, Patrick A; Brown, Robert C

    2014-08-13

    A rapid method for the quantitation of total sugars in pyrolysis liquids using high-performance liquid chromatography (HPLC) was developed. The method avoids the tedious and time-consuming sample preparation required by current analytical methods. It is possible to directly analyze hydrolyzed pyrolysis liquids, bypassing the neutralization step usually required in determination of total sugars. A comparison with traditional methods was used to determine the validity of the results. The calibration curve coefficient of determination on all standard compounds was >0.999 using a refractive index detector. The relative standard deviation for the new method was 1.13%. The spiked sugar recoveries on the pyrolysis liquid samples were between 104 and 105%. The research demonstrates that it is possible to obtain excellent accuracy and efficiency using HPLC to quantitate glucose after acid hydrolysis of polymeric and oligomeric sugars found in fast pyrolysis bio-oils without neutralization.
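    Validation metrics of the kind reported (calibration-curve R² > 0.999, spike recovery near 105%) can be computed as follows; the standard concentrations, detector areas, and spike values below are invented for illustration:

    ```python
    import statistics

    def fit_line(x, y):
        """Ordinary least-squares slope and intercept."""
        mx, my = statistics.mean(x), statistics.mean(y)
        slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
                 / sum((xi - mx) ** 2 for xi in x))
        return slope, my - slope * mx

    def r_squared(x, y, slope, intercept):
        """Coefficient of determination for a linear calibration curve."""
        ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
        ss_tot = sum((yi - statistics.mean(y)) ** 2 for yi in y)
        return 1 - ss_res / ss_tot

    # Hypothetical glucose standards: concentration (mg/mL) vs RI detector area
    conc = [0.5, 1.0, 2.0, 4.0, 8.0]
    area = [10.2, 20.1, 40.5, 80.3, 160.9]
    slope, intercept = fit_line(conc, area)
    print(f"R^2 = {r_squared(conc, area, slope, intercept):.4f}")

    # Spike recovery: (measured spiked - measured unspiked) / amount added
    recovery = 100 * (6.15 - 4.05) / 2.0   # hypothetical values, %
    print(f"recovery = {recovery:.0f}%")
    ```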

  20. KEY COMPARISON: CCQM-K61: Quantitation of a linearised plasmid DNA, based on a matched standard in a matrix of non-target DNA

    NASA Astrophysics Data System (ADS)

    Woolford, Alison; Holden, Marcia; Salit, Marc; Burns, Malcolm; Ellison, Stephen L. R.

    2009-01-01

    Key comparison CCQM-K61 was performed to demonstrate and document the capability of interested national metrology institutes in the determination of the quantity of a specific DNA target in an aqueous solution. The study provides support for the following measurement claim: "Quantitation of a linearised plasmid DNA, based on a matched standard in a matrix of non-target DNA". The comparison was an activity of the Bioanalysis Working Group (BAWG) of the Comité Consultatif pour la Quantité de Matière and was coordinated by NIST (Gaithersburg, USA) and LGC (Teddington, UK). The following laboratories (in alphabetical order) participated in this key comparison: DMSC (Thailand); IRMM (European Union); KRISS (Republic of Korea); LGC (UK); NIM (China); NIST (USA); NMIA (Australia); NMIJ (Japan); VNIIM (Russian Federation). Good agreement was observed between the reported results of all nine participants. Uncertainty estimates did not fully account for the dispersion of results, even after allowance for possible inhomogeneity in the calibration materials. Preliminary studies suggest that the effects of fluorescence threshold setting might contribute to the excess dispersion, and further study of this topic is suggested. The full text appears in Appendix B of the BIPM key comparison database, kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).

  1. STARS: The Space Transportation Architecture Risk System

    NASA Technical Reports Server (NTRS)

    Greenberg, Joel S.

    1997-01-01

    Comparisons must often be performed between transportation systems that are likely to have significantly different levels of risk, both because of differing degrees of freedom in achieving desired performance levels and because of their different states of development and utilization. An approach has therefore been developed for performing early comparisons of transportation architectures that explicitly takes into account quantitative measures of uncertainty and the resulting risk. The approach considers the uncertainty associated with the achievement of technology goals, the effect that the achieved level of technology will have on transportation system performance, and the relationship between transportation system performance/capability and the ability to accommodate variations in payload mass. The consequences of system performance are developed in terms of expected values and associated standard deviations of nonrecurring cost, recurring cost, and the present value of transportation system life cycle cost. Typical results are presented to illustrate the application of the methodology.

  2. Comparison of Diagnostic Performance of Semi-Quantitative Knee Ultrasound and Knee Radiography with MRI: Oulu Knee Osteoarthritis Study.

    PubMed

    Podlipská, Jana; Guermazi, Ali; Lehenkari, Petri; Niinimäki, Jaakko; Roemer, Frank W; Arokoski, Jari P; Kaukinen, Päivi; Liukkonen, Esa; Lammentausta, Eveliina; Nieminen, Miika T; Tervonen, Osmo; Koski, Juhani M; Saarakkala, Simo

    2016-03-01

    Osteoarthritis (OA) is a common degenerative musculoskeletal disease highly prevalent in aging societies worldwide. Traditionally, knee OA is diagnosed using conventional radiography. However, structural changes of articular cartilage or menisci cannot be directly evaluated using this method. On the other hand, ultrasound is a promising tool able to provide direct information on soft tissue degeneration. The aim of our study was to systematically determine the site-specific diagnostic performance of semi-quantitative ultrasound grading of knee femoral articular cartilage, osteophytes and meniscal extrusion, and of radiographic assessment of joint space narrowing and osteophytes, using MRI as a reference standard. Eighty asymptomatic and 79 symptomatic subjects with mean age of 57.7 years were included in the study. Ultrasound performed best in the assessment of femoral medial and lateral osteophytes, and medial meniscal extrusion. In comparison to radiography, ultrasound performed better or at least equally well in identification of tibio-femoral osteophytes, medial meniscal extrusion and medial femoral cartilage morphological degeneration. Ultrasound provides relevant additional diagnostic information on tissue-specific morphological changes not depicted by conventional radiography. Consequently, the use of ultrasound as a complementary imaging tool along with radiography may enable more accurate and cost-effective diagnostics of knee osteoarthritis at the primary healthcare level.

  3. Simultaneous determination of rhamnose, xylitol, arabitol, fructose, glucose, inositol, sucrose, maltose in jujube (Zizyphus jujube Mill.) extract: comparison of HPLC-ELSD, LC-ESI-MS/MS and GC-MS.

    PubMed

    Sun, Shihao; Wang, Hui; Xie, Jianping; Su, Yue

    2016-01-01

    Jujube extract is commonly used as a food additive and flavoring. The sensory properties of the extract, especially sweetness, are a critical factor determining product quality and therefore consumer acceptability. Small molecular carbohydrates make a major contribution to the sweetness of the jujube extract, and their types and contents in the extract have a direct influence on product quality. An appropriate qualitative and quantitative method for determining these carbohydrates is therefore vitally important for quality control of the product. High performance liquid chromatography-evaporative light scattering detection (HPLC-ELSD), liquid chromatography-electrospray ionization tandem mass spectrometry (LC-ESI-MS/MS), and gas chromatography-mass spectrometry (GC-MS) methods were developed and applied to the determination of small molecular carbohydrates in jujube extract. Eight sugars and alditols were identified from the extract: rhamnose, xylitol, arabitol, fructose, glucose, inositol, sucrose, and maltose. Comparisons were carried out to investigate the performance of the methods. Although all of the methods performed satisfactorily, only three sugars (fructose, glucose and inositol) could be detected by all of them, and similar quantitative results for these three sugars were obtained across the methods. The HPLC-ELSD and LC-ESI-MS/MS methods, with good precision and accuracy, were suitable for quantitative analysis of carbohydrates in jujube extract; although the performance of the GC-MS method for quantitative analysis was inferior to the other methods, it has a wider scope in qualitative analysis. A multi-technique approach should therefore be adopted to obtain a complete profile of the carbohydrates in jujube extract, and the methods should be employed according to the purpose of the analysis.

  4. Quantitative and Qualitative Differences in Morphological Traits Revealed between Diploid Fragaria Species

    PubMed Central

    SARGENT, DANIEL J.; GEIBEL, M.; HAWKINS, J. A.; WILKINSON, M. J.; BATTEY, N. H.; SIMPSON, D. W.

    2004-01-01

    • Background and Aims The aims of this investigation were to highlight the qualitative and quantitative diversity apparent between nine diploid Fragaria species and produce interspecific populations segregating for a large number of morphological characters suitable for quantitative trait loci analysis. • Methods A qualitative comparison of eight described diploid Fragaria species was performed and measurements were taken of 23 morphological traits from 19 accessions including eight described species and one previously undescribed species. A principal components analysis was performed on 14 mathematically unrelated traits from these accessions, which partitioned the species accessions into distinct morphological groups. Interspecific crosses were performed with accessions of species that displayed significant quantitative divergence and, from these, populations that should segregate for a range of quantitative traits were raised. • Key Results Significant differences between species were observed for all 23 morphological traits quantified and three distinct groups of species accessions were observed after the principal components analysis. Interspecific crosses were performed between these groups, and F2 and backcross populations were raised that should segregate for a range of morphological characters. In addition, the study highlighted a number of distinctive morphological characters in many of the species studied. • Conclusions Diploid Fragaria species are morphologically diverse, yet remain highly interfertile, making the group an ideal model for the study of the genetic basis of phenotypic differences between species through map-based investigation using quantitative trait loci. The segregating interspecific populations raised will be ideal for such investigations and could also provide insights into the nature and extent of genome evolution within this group. PMID:15469944
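    The principal components analysis step can be sketched in miniature. The study used 14 mathematically unrelated traits; the mechanics are the same for the two hypothetical traits below, where the leading eigenvalue of the 2x2 covariance matrix has a closed form:

    ```python
    import math
    import statistics

    # Hypothetical trait measurements per accession: (petal length, leaf area)
    traits = [(2.1, 5.3), (2.4, 5.9), (3.8, 9.1), (4.0, 9.6), (1.9, 4.8)]

    # Center each variable, then build the sample covariance matrix.
    xs = [t[0] for t in traits]
    ys = [t[1] for t in traits]
    cx = [x - statistics.mean(xs) for x in xs]
    cy = [y - statistics.mean(ys) for y in ys]
    n = len(traits) - 1
    sxx = sum(v * v for v in cx) / n
    syy = sum(v * v for v in cy) / n
    sxy = sum(a * b for a, b in zip(cx, cy)) / n

    # Leading eigenvalue of [[sxx, sxy], [sxy, syy]] (closed form for 2x2);
    # its share of the trace is the variance explained by PC1.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam1 = tr / 2 + math.sqrt(tr * tr / 4 - det)
    explained = lam1 / tr
    print(f"PC1 explains {100 * explained:.1f}% of variance")
    ```

    Accessions with similar PC scores fall into the same morphological group, which is how the analysis partitions species accessions into distinct clusters.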

  5. A quantitative comparison of corrective and perfective maintenance

    NASA Technical Reports Server (NTRS)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  6. Modeling of electron-specimen interaction in scanning electron microscope for e-beam metrology and inspection: challenges and perspectives

    NASA Astrophysics Data System (ADS)

    Suzuki, Makoto; Kameda, Toshimasa; Doi, Ayumi; Borisov, Sergey; Babin, Sergey

    2018-03-01

    The interpretation of scanning electron microscopy (SEM) images of the latest semiconductor devices is not intuitive and requires comparison with computed images based on theoretical modeling and simulations. For quantitative image prediction and geometrical reconstruction of the specimen structure, the accuracy of the physical model is essential. In this paper, we review the current models of electron-solid interaction and discuss their accuracy. We compare simulated results with our experiments on SEM overlay of under-layers, grain imaging of copper interconnects, and hole-bottom visualization by angular-selective detectors, and show that our model reproduces the experimental results well. Remaining issues for quantitative simulation are also discussed, including the accuracy of the charge dynamics, treatment of the beam skirt, and the explosive increase in computing time.

  7. Comparison of different approaches to quantitative adenovirus detection in stool specimens of hematopoietic stem cell transplant recipients.

    PubMed

    Kosulin, K; Dworzak, S; Lawitschka, A; Matthes-Leodolter, S; Lion, T

    2016-12-01

    Adenoviruses almost invariably proliferate in the gastrointestinal tract prior to dissemination, and critical threshold concentrations in stool correlate with the risk of viremia. Monitoring of adenovirus loads in stool may therefore be important for timely initiation of treatment in order to prevent invasive infection. Comparison of a manual DNA extraction kit in combination with a validated in-house PCR assay with automated extraction on the NucliSENS-EasyMAG device coupled with the Adenovirus R-gene kit (bioMérieux) for quantitative adenovirus analysis in stool samples. Stool specimens spiked with adenovirus concentrations in a range from 10²-10¹¹ copies/g and 32 adenovirus-positive clinical stool specimens from pediatric stem cell transplant recipients were tested along with appropriate negative controls. Quantitative analysis of viral load in adenovirus-positive stool specimens revealed a median difference of 0.5 logs (range 0.1-2.2) between the detection systems tested and a difference of 0.3 logs (range 0.0-1.7) when the comparison was restricted to the PCR assays only. Spiking experiments showed a detection limit of 10²-10³ adenovirus copies/g stool, revealing a somewhat higher sensitivity offered by the automated extraction. The dynamic range of accurate quantitative analysis by both systems investigated was between 10³ and 10⁸ virus copies/g. The differences in quantitative analysis of adenovirus copy numbers between the systems tested were primarily attributable to the DNA extraction method used, while the qPCR assays revealed a high level of concordance. Both systems showed adequate performance for detection and monitoring of adenoviral load in stool specimens. Copyright © 2016 Elsevier B.V. All rights reserved.
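    The "log difference" metric used to compare the two systems is simply the absolute difference of log₁₀ viral loads per specimen; a sketch with invented paired loads:

    ```python
    import math
    import statistics

    # Hypothetical paired adenovirus loads (copies/g stool) measured by two
    # extraction/PCR workflows on the same specimens
    manual =    [3.2e4, 1.1e6, 5.0e7, 8.4e3]
    automated = [7.9e4, 2.0e6, 1.6e8, 1.1e4]

    # Per-specimen difference in log10 units ("logs"), then the median
    log_diffs = [abs(math.log10(a) - math.log10(m))
                 for a, m in zip(automated, manual)]
    print(f"median difference: {statistics.median(log_diffs):.2f} logs")
    ```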

  8. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries.

    PubMed

    Wu, Jemma X; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P

    2016-07-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  9. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries*

    PubMed Central

    Wu, Jemma X.; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P.

    2016-01-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. PMID:27161445

  10. Variations in optical coherence tomography resolution and uniformity: a multi-system performance comparison

    PubMed Central

    Fouad, Anthony; Pfefer, T. Joshua; Chen, Chao-Wei; Gong, Wei; Agrawal, Anant; Tomlins, Peter H.; Woolliams, Peter D.; Drezek, Rebekah A.; Chen, Yu

    2014-01-01

    Point spread function (PSF) phantoms based on unstructured distributions of sub-resolution particles in a transparent matrix have been demonstrated as a useful tool for evaluating resolution and its spatial variation across image volumes in optical coherence tomography (OCT) systems. Measurements based on PSF phantoms have the potential to become a standard test method for consistent, objective and quantitative inter-comparison of OCT system performance. Towards this end, we have evaluated three PSF phantoms and investigated their ability to compare the performance of four OCT systems. The phantoms are based on 260-nm-diameter gold nanoshells, 400-nm-diameter iron oxide particles and 1.5-micron-diameter silica particles. The OCT systems included spectral-domain and swept source systems in free-beam geometries as well as a time-domain system in both free-beam and fiberoptic probe geometries. Results indicated that iron oxide particles and gold nanoshells were most effective for measuring spatial variations in the magnitude and shape of PSFs across the image volume. The intensity of individual particles was also used to evaluate spatial variations in signal intensity uniformity. Significant system-to-system differences in resolution and signal intensity and their spatial variation were readily quantified. The phantoms proved useful for identification and characterization of irregularities such as astigmatism. Our multi-system results provide evidence of the practical utility of PSF-phantom-based test methods for quantitative inter-comparison of OCT system resolution and signal uniformity. PMID:25071949
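    Resolution from a PSF phantom is usually reported as the full width at half maximum (FWHM) of each particle's intensity profile. A minimal sketch with an invented axial profile:

    ```python
    def fwhm(positions, intensities):
        """Full width at half maximum of a sampled point-spread function,
        using linear interpolation at the half-maximum crossings."""
        half = max(intensities) / 2
        crossings = []
        for i in range(len(intensities) - 1):
            a, b = intensities[i], intensities[i + 1]
            if (a - half) * (b - half) < 0:  # sign change -> crossing here
                t = (half - a) / (b - a)
                crossings.append(positions[i] + t * (positions[i + 1] - positions[i]))
        return crossings[-1] - crossings[0]

    # Hypothetical axial profile of one sub-resolution particle (microns, a.u.)
    z = [0, 1, 2, 3, 4, 5, 6, 7, 8]
    signal = [2, 5, 20, 70, 100, 70, 20, 5, 2]
    print(f"axial resolution (FWHM): {fwhm(z, signal):.2f} um")
    ```

    Mapping FWHM values for particles across the image volume is what exposes the spatial variation in resolution and the system-to-system differences the study quantifies.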

  11. The Lifestyles of Blind, Low Vision, and Sighted Youths: A Quantitative Comparison.

    ERIC Educational Resources Information Center

    Wolffe, K.; Sacks, S. Z.

    1997-01-01

    Analysis of interviews and time-diary protocols with 48 students (16 blind, 16 low-vision, and 16 sighted), ages 15-21, and their parents focused on four lifestyle areas: academic involvement and performance, daily living and personal care activities, recreation and leisure activities, and work and vocational experiences. Similarities and…

  12. A Comparison of Community College Full-Time and Adjunct Faculties' Perceptions of Factors Associated with Grade Inflation

    ERIC Educational Resources Information Center

    Schutz, Kelly R.; Drake, Brent M.; Lessner, Janet; Hughes, Gail F.

    2015-01-01

    Grades historically have indicated student performance in college. Previous studies in the higher education literature, primarily conducted at four-year teaching institutions, have suggested reasons for grade inflation but have provided little supporting empirical data. This quantitative, non-experimental, comparative study used survey research to…

  13. Cardiovascular disease testing on the Dimension Vista system: biomarkers of acute coronary syndromes.

    PubMed

    Kelley, Walter E; Lockwood, Christina M; Cervelli, Denise R; Sterner, Jamie; Scott, Mitchell G; Duh, Show-Hong; Christenson, Robert H

    2009-09-01

    Performance characteristics of the LOCI cTnI, CK-MB, MYO, NTproBNP and hsCRP methods on the Dimension Vista System were evaluated. Imprecision (following CLSI EP05-A2 guidelines), limit of quantitation (cTnI), limit of blank, linearity on dilution, serum versus plasma matrix studies (cTnI), and method comparison studies were conducted. Method imprecision of 1.8 to 9.7% (cTnI), 1.8 to 5.7% (CK-MB), 2.1 to 2.2% (MYO), 1.6 to 3.3% (NTproBNP), and 3.5 to 4.2% (hsCRP) were demonstrated. The manufacturer's claimed imprecision, detection limits and upper measurement limits were met. Limit of Quantitation was 0.040 ng/mL for the cTnI assay. Agreement of serum and plasma values for cTnI (r=0.99) was shown. Method comparison study results were acceptable. The Dimension Vista cTnI, CK-MB, MYO, NTproBNP, and hsCRP methods demonstrate acceptable performance characteristics for use as an aid in the diagnosis and risk assessment of patients presenting with suspected acute coronary syndromes.

  14. Method performance and multi-laboratory assessment of a normal phase high pressure liquid chromatography-fluorescence detection method for the quantitation of flavanols and procyanidins in cocoa and chocolate containing samples.

    PubMed

    Robbins, Rebecca J; Leonczak, Jadwiga; Johnson, J Christopher; Li, Julia; Kwik-Uribe, Catherine; Prior, Ronald L; Gu, Liwei

    2009-06-12

    The quantitative parameters and method performance for a normal-phase HPLC separation of flavanols and procyanidins in chocolate and cocoa-containing food products were optimized and assessed. Single laboratory method performance was examined over three months using three separate secondary standards. RSD(r) values were 1.9%, 4.5%, and 9.0% for cocoa powder, liquor, and chocolate samples containing 74.39, 15.47, and 1.87 mg/g flavanols and procyanidins, respectively. Accuracy was determined by comparison to the NIST Standard Reference Material 2384. Inter-laboratory assessment indicated that variability was quite low for seven different cocoa-containing samples, with a RSD(R) of less than 10% for the range of samples analyzed.

  15. Three-phase bone scintigraphy for diagnosis of Charcot neuropathic osteoarthropathy in the diabetic foot - does quantitative data improve diagnostic value?

    PubMed

    Fosbøl, M; Reving, S; Petersen, E H; Rossing, P; Lajer, M; Zerahn, B

    2017-01-01

    To investigate whether the inclusion of quantitative data on blood flow distribution, compared with visual qualitative evaluation, improves the reliability and diagnostic performance of 99mTc-hydroxymethylene diphosphate three-phase bone scintigraphy (TPBS) in patients suspected of Charcot neuropathic osteoarthropathy (CNO) of the foot. A retrospective cohort study of TPBS performed on 148 patients with suspected acute CNO referred from a single specialized diabetes care centre. The quantitative blood flow distribution was calculated based on the method described by Deutsch et al. All scintigraphies were re-evaluated twice by independent, blinded observers, with and without quantitative data on blood flow distribution at ankle and focus level, respectively. The diagnostic validity of TPBS was determined by subsequent review of clinical data and radiological examinations. A total of 90 patients (61%) had a confirmed diagnosis of CNO. The sensitivity, specificity and accuracy of three-phase bone scintigraphy without/with quantitative data were 89%/88%, 58%/62% and 77%/78%, respectively. The intra-observer agreement improved significantly with the addition of quantitative data (kappa value 0.79/0.94). The interobserver agreement was not significantly improved. Adding quantitative data on blood flow distribution to the interpretation of TPBS improves intra-observer variation, whereas no difference in interobserver variation was observed. The sensitivity of TPBS in the diagnosis of CNO is high, but its specificity is limited. Diagnostic performance does not improve with the use of quantitative data in the evaluation. This may be due to the reference intervals applied in the study or the absence of a proper gold-standard diagnostic procedure for comparison. © 2015 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.

  16. On Quantitative Comparative Research in Communication and Language Evolution

    PubMed Central

    Oller, D. Kimbrough; Griebel, Ulrike

    2014-01-01

    Quantitative comparison of human language and natural animal communication requires improved conceptualizations. We argue that an infrastructural approach to development and evolution incorporating an extended interpretation of the distinctions among illocution, perlocution, and meaning (Austin 1962; Oller and Griebel 2008) can help place the issues relevant to quantitative comparison in perspective. The approach can illuminate the controversy revolving around the notion of functional referentiality as applied to alarm calls, for example in the vervet monkey. We argue that referentiality offers a poor point of quantitative comparison across language and animal communication in the wild. Evidence shows that even newborn human cry could be deemed to show functional referentiality according to the criteria typically invoked by advocates of referentiality in animal communication. Exploring the essence of the idea of illocution, we illustrate an important realm of commonality among animal communication systems and human language, a commonality that opens the door to more productive, quantifiable comparisons. Finally, we delineate two examples of infrastructural communicative capabilities that should be particularly amenable to direct quantitative comparison across humans and our closest relatives. PMID:25285057

  17. On Quantitative Comparative Research in Communication and Language Evolution.

    PubMed

    Oller, D Kimbrough; Griebel, Ulrike

    2014-09-01

    Quantitative comparison of human language and natural animal communication requires improved conceptualizations. We argue that an infrastructural approach to development and evolution incorporating an extended interpretation of the distinctions among illocution, perlocution, and meaning (Austin 1962; Oller and Griebel 2008) can help place the issues relevant to quantitative comparison in perspective. The approach can illuminate the controversy revolving around the notion of functional referentiality as applied to alarm calls, for example in the vervet monkey. We argue that referentiality offers a poor point of quantitative comparison across language and animal communication in the wild. Evidence shows that even newborn human cry could be deemed to show functional referentiality according to the criteria typically invoked by advocates of referentiality in animal communication. Exploring the essence of the idea of illocution, we illustrate an important realm of commonality among animal communication systems and human language, a commonality that opens the door to more productive, quantifiable comparisons. Finally, we delineate two examples of infrastructural communicative capabilities that should be particularly amenable to direct quantitative comparison across humans and our closest relatives.

  18. Comparison of the quantitative analysis performance between pulsed voltage atom probe and pulsed laser atom probe.

    PubMed

    Takahashi, J; Kawakami, K; Raabe, D

    2017-04-01

    The difference in quantitative analysis performance between the voltage-mode and laser-mode of a local electrode atom probe (LEAP3000X HR) was investigated using a Fe-Cu binary model alloy. Solute copper atoms in ferritic iron preferentially field evaporate because of their significantly lower evaporation field than the matrix iron, and thus, the apparent concentration of solute copper tends to be lower than the actual concentration. However, in voltage-mode, the apparent concentration was higher than the actual concentration at 40 K or less due to a detection loss of matrix iron, and the concentration decreased with increasing specimen temperature due to the preferential evaporation of solute copper. On the other hand, in laser-mode, the apparent concentration never exceeded the actual concentration, even at lower temperatures (20 K), and this mode showed better quantitative performance over a wide range of specimen temperatures. These results indicate that the pulsed laser atom probe prevents both detection loss and preferential evaporation under a wide range of measurement conditions. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Direct comparison of low- and mid-frequency Raman spectroscopy for quantitative solid-state pharmaceutical analysis.

    PubMed

    Lipiäinen, Tiina; Fraser-Miller, Sara J; Gordon, Keith C; Strachan, Clare J

    2018-02-05

    This study considers the potential of low-frequency (terahertz) Raman spectroscopy in the quantitative analysis of ternary mixtures of solid-state forms. Direct comparison between low-frequency and mid-frequency spectral regions for quantitative analysis of crystal form mixtures, without confounding sampling and instrumental variations, is reported for the first time. Piroxicam was used as a model drug, and the low-frequency spectra of piroxicam forms β, α2 and monohydrate are presented for the first time. These forms show clear spectral differences in both the low- and mid-frequency regions. Both spectral regions provided quantitative models suitable for predicting the mixture compositions using partial least squares regression (PLSR), but the low-frequency data gave better models, based on lower errors of prediction (2.7, 3.1 and 3.2% root-mean-square errors of prediction [RMSEP] values for the β, α2 and monohydrate forms, respectively) than the mid-frequency data (6.3, 5.4 and 4.8%, for the β, α2 and monohydrate forms, respectively). The better performance of low-frequency Raman analysis was attributed to larger spectral differences between the solid-state forms, combined with a higher signal-to-noise ratio. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Fiber tractography using machine learning.

    PubMed

    Neher, Peter F; Côté, Marc-Alexandre; Houde, Jean-Christophe; Descoteaux, Maxime; Maier-Hein, Klaus H

    2017-09-01

    We present a fiber tractography approach based on a random forest classification and voting process, guiding each step of the streamline progression by directly processing raw diffusion-weighted signal intensities. For comparison to the state-of-the-art, i.e. tractography pipelines that rely on mathematical modeling, we performed a quantitative and qualitative evaluation with multiple phantom and in vivo experiments, including a comparison to the 96 submissions of the ISMRM tractography challenge 2015. The results demonstrate the vast potential of machine learning for fiber tractography. Copyright © 2017 Elsevier Inc. All rights reserved.
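The classification-and-voting idea can be illustrated very loosely with synthetic data (a toy 2-D stand-in; the actual pipeline operates on raw diffusion-weighted signals in 3-D and includes a dedicated termination class):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Discretize candidate step directions on the unit circle
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
directions = np.stack([np.cos(angles), np.sin(angles)], axis=1)

# Synthetic "signal" vectors whose largest entry encodes the true direction class
labels = rng.integers(0, 8, size=500)
X = rng.normal(scale=0.3, size=(500, 8))
X[np.arange(500), labels] += 1.0

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)

# One streamline step: the trees vote via predict_proba; take the consensus direction
probs = clf.predict_proba(X[:1])[0]
step = directions[np.argmax(probs)]
```

A streamline would then be grown by repeatedly sampling the signal at the current position, taking the voted direction, and advancing by a small step size.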

  1. Forensic comparison and matching of fingerprints: using quantitative image measures for estimating error rates through understanding and predicting difficulty.

    PubMed

    Kellman, Philip J; Mnookin, Jennifer L; Erlikhman, Gennady; Garrigan, Patrick; Ghose, Tandra; Mettler, Everett; Charlton, David; Dror, Itiel E

    2014-01-01

    Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. 
The results indicate the plausibility of using objective image metrics to predict expert performance and subjective assessment of difficulty in fingerprint comparisons.

  2. Quantitative analysis of thoria phase in Th-U alloys using diffraction studies

    NASA Astrophysics Data System (ADS)

    Thakur, Shital; Krishna, P. S. R.; Shinde, A. B.; Kumar, Raj; Roy, S. B.

    2017-05-01

    In the present study, quantitative phase analysis of bulk Th-U alloys, namely Th-52 wt% U and Th-3 wt% U, has been performed on data obtained from both X-ray diffraction and neutron diffraction using the Rietveld method as implemented in the FULLPROF software. Quantifying the thoria (ThO2) phase present in the bulk of the sample is limited by surface oxidation and the low penetration of X-rays in high-Z materials. A neutron diffraction study probing the bulk of the samples is presented in comparison with the X-ray diffraction study.
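In Rietveld-based quantitative phase analysis, weight fractions follow the standard relation W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j, where S is the refined scale factor, Z the number of formula units per cell, M the formula mass, and V the unit-cell volume. A sketch with illustrative numbers (the scale factors are hypothetical, not refined values from this study):

```python
def rietveld_weight_fractions(phases):
    """phases: list of (scale, Z, M, V) tuples -> list of weight fractions."""
    szmv = [s * z * m * v for s, z, m, v in phases]
    total = sum(szmv)
    return [x / total for x in szmv]

# Hypothetical two-phase mixture: Th metal plus a ThO2 impurity phase
phases = [
    (1.2e-4, 4, 232.04, 132.0),  # fcc Th: illustrative Z, M (g/mol), V (A^3)
    (0.4e-4, 4, 264.04, 175.4),  # fluorite ThO2: illustrative values
]
w = rietveld_weight_fractions(phases)
```

FULLPROF reports these fractions automatically after refinement; the point of the sketch is only that the phase quantification is a simple ratio of scale-factor products.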

  3. Structure of polyacrylic acid and polymethacrylic acid solutions : a small angle neutron scattering study

    NASA Astrophysics Data System (ADS)

    Moussaid, A.; Schosseler, F.; Munch, J. P.; Candau, S. J.

    1993-04-01

    The intensity scattered from polyacrylic acid and polymethacrylic acid solutions has been measured by small angle neutron scattering experiments. The influence of polymer concentration, ionization degree, temperature and salt content has been investigated. Results are in qualitative agreement with a model which predicts the existence of microphases in the unstable region of the phase diagram. Quantitative comparison with the theory is performed by fitting the theoretical structure factor to the experimental data. For a narrow range of ionization degrees, nearly quantitative agreement with the theory is found for the polyacrylic acid system.

  4. A Comparison of Urban School- and Community-Based Dental Clinics

    ERIC Educational Resources Information Center

    Larsen, Charles D.; Larsen, Michael D.; Handwerker, Lisa B.; Kim, Maile S.; Rosenthal, Murray

    2009-01-01

    Background: The objective of the study was to quantitatively compare school- and community-based dental clinics in New York City that provide dental services to children in need. It was hypothesized that the school-based clinics would perform better in terms of several measures. Methods: We reviewed billing and visit data derived from encounter…

  5. Classroom versus Computer-Based CPR Training: A Comparison of the Effectiveness of Two Instructional Methods

    ERIC Educational Resources Information Center

    Rehberg, Robb S.; Gazzillo Diaz, Linda; Middlemas, David A.

    2009-01-01

    Objective: The objective of this study was to determine whether computer-based CPR training is comparable to traditional classroom training. Design and Setting: This study was quantitative in design. Data was gathered from a standardized examination and skill performance evaluation which yielded numerical scores. Subjects: The subjects were 64…

  6. A Comparison of the Act and Frequency of Plagiarism between Technical and Non-Technical Programme Undergraduates

    ERIC Educational Resources Information Center

    BavaHarji, Madhubala; Chetty, Thiba Naraina; Ismail, Zalina Bt; Letchumanan, Krishnaveni

    2016-01-01

    Concerned with intellectual theft, we decided to examine intellectual theft among undergraduates at a private higher education institution. The aim of this study was to compare the act and frequency of plagiarism, particularly between programmes, gender, year of study and academic performance. This study adopted the quantitative approach, using a…

  7. [Noncollagen bone protein use in the composition of osteoplastic material Gapkol modified by vacuum].

    PubMed

    Volozhin, A I; Grigor'ian, A S; Desiatnichenko, K S; Ozhelevskaia, S A; Doktorov, A A; Kurdiumov, S G; Fionova, E V; Gurin, A N; Karakov, K G

    2008-01-01

    In rat experiments, the ability of noncollagen bone proteins (NCBP) incorporated into the modified osteoplastic material Gapkol (not tanned in formalin and subjected to vacuum extraction) to enhance bone repair in comparison with traditional Gapkol was studied. Quantitative evaluation was performed on rat parietal bone and qualitative evaluation on rat mandible. Gapkol with NCBP (not tanned in formalin and subjected to vacuum extraction) was shown to increase reparative osteogenesis.

  8. Devising tissue ingrowth metrics: a contribution to the computational characterization of engineered soft tissue healing.

    PubMed

    Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin

    2018-03-14

    The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently questions the value of histopathologic methods in the evaluation of biological changes. To date, the available tools of evaluation are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple and inexpensive quantitative digital approach that provides key metrics for structural and compositional characterization of the regenerated tissues. For example, the metrics provide the tissue ingrowth rate (TIR), which integrates two separate indicators, the cell ingrowth rate (CIR) and the total collagen content (TCC), as featured in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality) and the optical anisotropy property of the collagen (maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell-specific) or Picrosirius Red F3BA (collagen-specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that the healing tissue would be significantly delayed and of poorer quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats with the assumption that modulation of the ingrown tissue would be quantitatively present in comparison to the 3D scaffold-healthy group. After 21 days of implantation, both hypotheses were verified by use of this novel computerized approach.
When the two methods were run in parallel, the quantitative results revealed fine details and differences not detected by the semi-quantitative assessment, demonstrating the importance of quantitative analysis in the performance evaluation of soft tissue healing. This automated and supervised method reduced operator dependency and proved to be simple, sensitive, cost-effective and time-effective. It supports objective therapeutic comparisons and helps to elucidate regeneration and the dynamics of a functional tissue.
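Assuming CIR and TCC are each expressed as a percentage of the analyzed region, as in the abstract's equation TIR% = CIR% + TCC%, the headline metric reduces to simple area fractions. A minimal sketch with hypothetical pixel counts:

```python
def ingrowth_metrics(cell_pixels, collagen_pixels, region_pixels):
    """Return (CIR%, TCC%, TIR%) from stain-positive pixel counts.

    cell_pixels: Feulgen & Rossenbeck-positive area (cells)
    collagen_pixels: Picrosirius Red-positive area (total collagen)
    region_pixels: total area of the analyzed region
    """
    cir = 100.0 * cell_pixels / region_pixels
    tcc = 100.0 * collagen_pixels / region_pixels
    return cir, tcc, cir + tcc

# Hypothetical counts from one pair of serial sections
cir, tcc, tir = ingrowth_metrics(12_000, 30_000, 100_000)
print(cir, tcc, tir)  # prints 12.0 30.0 42.0
```

The directional-organization, collagen-ratio and anisotropy indicators described in the abstract would require additional image analysis beyond this area-fraction arithmetic.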

  9. Photon-counting-based diffraction phase microscopy combined with single-pixel imaging

    NASA Astrophysics Data System (ADS)

    Shibuya, Kyuki; Araki, Hiroyuki; Iwata, Tetsuo

    2018-04-01

    We propose a photon-counting (PC)-based quantitative-phase imaging (QPI) method for use in diffraction phase microscopy (DPM) that is combined with a single-pixel imaging (SPI) scheme (PC-SPI-DPM). This combination of DPM with the SPI scheme overcomes a low optical throughput problem that has occasionally prevented us from obtaining quantitative-phase images in DPM through use of a high-sensitivity single-channel photodetector such as a photomultiplier tube (PMT). The introduction of a PMT allowed us to perform PC with ease and thus solved a dynamic range problem that was inherent to SPI. As a proof-of-principle experiment, we performed a comparison study of analogue-based SPI-DPM and PC-SPI-DPM for a 125-nm-thick indium tin oxide (ITO) layer coated on a silica glass substrate. We discuss the basic performance of the method and potential future modifications of the proposed system.

  10. Performance of the New Aptima HCV Quant Dx Assay in Comparison to the Cobas TaqMan HCV2 Test for Use with the High Pure System in Detection and Quantification of Hepatitis C Virus RNA in Plasma or Serum.

    PubMed

    Schalasta, Gunnar; Speicher, Andrea; Börner, Anna; Enders, Martin

    2016-04-01

    Quantitating the level of hepatitis C virus (HCV) RNA is the standard of care for monitoring HCV-infected patients during treatment. The performances of commercially available assays differ for precision, limit of detection, and limit of quantitation (LOQ). Here, we compare the performance of the Hologic Aptima HCV Quant Dx assay (Aptima) to that of the Roche Cobas TaqMan HCV test, version 2.0, using the High Pure system (HPS/CTM), considered a reference assay since it has been used in trials defining clinical decision points in patient care. The assays' performance characteristics were assessed using HCV RNA reference panels and plasma/serum from chronically HCV-infected patients. The agreement between the assays for the 3 reference panels was good, with a difference in quantitation values of <0.5 log. High concordance was demonstrated between the assays for 245 clinical samples (kappa = 0.80; 95% confidence interval [CI], 0.720 to 0.881); however, Aptima detected and/or quantitated 20 samples that HPS/CTM did not detect, while Aptima did not detect 1 sample that was quantitated by HPS/CTM. For the 165 samples quantitated by both assays, the values were highly correlated (R = 0.98; P < 0.0001). The linearity of quantitation from concentrations of 1.4 to 6 log was excellent for both assays for all HCV genotypes (GT) tested (GT 1a, 1b, 2b, and 3a) (R(2) > 0.99). The assays had similar levels of total and intra-assay variability across all genotypes at concentrations from 1,000 to 25 IU/ml. Aptima had greater analytical sensitivity, quantitating more than 50% of replicates at the 25-IU/ml target. Aptima showed performance characteristics comparable to those of HPS/CTM and increased sensitivity, making it suitable for use as a clinical diagnostic tool on the fully automated Panther platform. Copyright © 2016 Schalasta et al.

  11. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    PubMed

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    To evaluate a special kind of ultrasound (US) shear wave elastography for differential diagnosis of breast lesions, using a new qualitative analysis (i.e. the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. Areas under the receiver operating characteristic curve (AUROC) were calculated to evaluate the diagnostic performance of both qualitative and quantitative analyses for differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI: 0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI: 0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI: 0.615, 0.731; P = 0.000), and SWS-SD (0.768; 95% CI: 0.712, 0.817; P = 0.074). The qualitative analysis in this study achieved the best diagnostic performance (AUROC 0.871; 95% CI: 0.825, 0.909; P = 0.011 compared with SWS-max, the best quantitative parameter). The new qualitative analysis of shear wave travel time showed superior diagnostic performance in the differentiation of breast lesions in comparison with conventional quantitative analysis.
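The AUROC comparison amounts to scoring each candidate feature against the pathology labels. A minimal sketch with entirely synthetic data (sklearn's roc_auc_score standing in for the study's ROC analysis; the distributions and effect sizes are invented):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

# Synthetic labels: 1 = malignant, 0 = benign
y = rng.integers(0, 2, size=266)

# Fake quantitative SWS-max values (m/s): malignant lesions tend to be stiffer
sws_max = rng.normal(loc=3.0 + 2.0 * y, scale=1.5)

# Fake ordinal elasticity score (1-5), constructed to be slightly more separable
score = np.clip(np.round(2.0 + 1.5 * y + rng.normal(scale=0.8, size=266)), 1, 5)

auc_quant = roc_auc_score(y, sws_max)
auc_qual = roc_auc_score(y, score)
print(auc_quant, auc_qual)
```

Significance testing between two correlated AUROCs (as in the abstract's P = 0.011) would additionally require a paired test such as DeLong's method.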

  12. GLM Proxy Data Generation: Methods for Stroke/Pulse Level Inter-Comparison of Ground-Based Lightning Reference Networks

    NASA Technical Reports Server (NTRS)

    Cummins, Kenneth L.; Carey, Lawrence D.; Schultz, Christopher J.; Bateman, Monte G.; Cecil, Daniel J.; Rudlosky, Scott D.; Petersen, Walter Arthur; Blakeslee, Richard J.; Goodman, Steven J.

    2011-01-01

    In order to produce useful proxy data for the GOES-R Geostationary Lightning Mapper (GLM) in regions not covered by VLF lightning mapping systems, we intend to employ data produced by ground-based (regional or global) VLF/LF lightning detection networks. Before using these data in GLM Risk Reduction tasks, it is necessary to have a quantitative understanding of the performance of these networks, in terms of CG flash/stroke DE, cloud flash/pulse DE, location accuracy, and CLD/CG classification error. This information is being obtained through inter-comparison with LMAs and well-quantified VLF/LF lightning networks. One of our approaches is to compare "bulk" counting statistics on the spatial scale of convective cells, in order to both quantify relative performance and observe variations in cell-based temporal trends provided by each network. In addition, we are using microsecond-level stroke/pulse time correlation to facilitate detailed inter-comparisons at a more-fundamental level. The current development status of our ground-based inter-comparison and evaluation tools will be presented, and performance metrics will be discussed through a comparison of Vaisala's Global Lightning Dataset (GLD360) with the NLDN at locations within and outside the U.S.

  13. GLM Proxy Data Generation: Methods for Stroke/Pulse Level Inter-comparison of Ground-based Lightning Reference Networks

    NASA Astrophysics Data System (ADS)

    Cummins, K. L.; Carey, L. D.; Schultz, C. J.; Bateman, M. G.; Cecil, D. J.; Rudlosky, S. D.; Petersen, W. A.; Blakeslee, R. J.; Goodman, S. J.

    2011-12-01

    In order to produce useful proxy data for the GOES-R Geostationary Lightning Mapper (GLM) in regions not covered by VLF lightning mapping systems, we intend to employ data produced by ground-based (regional or global) VLF/LF lightning detection networks. Before using these data in GLM Risk Reduction tasks, it is necessary to have a quantitative understanding of the performance of these networks, in terms of CG flash/stroke DE, cloud flash/pulse DE, location accuracy, and CLD/CG classification error. This information is being obtained through inter-comparison with LMAs and well-quantified VLF/LF lightning networks. One of our approaches is to compare "bulk" counting statistics on the spatial scale of convective cells, in order to both quantify relative performance and observe variations in cell-based temporal trends provided by each network. In addition, we are using microsecond-level stroke/pulse time correlation to facilitate detailed inter-comparisons at a more-fundamental level. The current development status of our ground-based inter-comparison and evaluation tools will be presented, and performance metrics will be discussed through a comparison of Vaisala's Global Lightning Dataset (GLD360) with the NLDN at locations within and outside the U.S.

  14. Nontargeted quantitation of lipid classes using hydrophilic interaction liquid chromatography-electrospray ionization mass spectrometry with single internal standard and response factor approach.

    PubMed

    Cífková, Eva; Holčapek, Michal; Lísa, Miroslav; Ovčačíková, Magdaléna; Lyčka, Antonín; Lynen, Frédéric; Sandra, Pat

    2012-11-20

    The identification and quantitation of a wide range of lipids in complex biological samples is an essential requirement for lipidomic studies. High-performance liquid chromatography-mass spectrometry (HPLC/MS) has the highest potential to obtain detailed information on the whole lipidome, but the reliable quantitation of multiple lipid classes is still a challenging task. In this work, we describe a new method for the nontargeted quantitation of polar lipid classes separated by hydrophilic interaction liquid chromatography (HILIC) followed by positive-ion electrospray ionization mass spectrometry (ESI-MS), using a single internal lipid standard to which all class-specific response factors (RFs) are related. The developed method enables the nontargeted quantitation of lipid classes and molecules inside these classes, in contrast to conventional targeted quantitation, which is based on predefined selected reaction monitoring (SRM) transitions for selected lipids only. In the nontargeted quantitation method described here, concentrations of lipid classes are obtained from the peak integration in HILIC chromatograms multiplied by their RFs related to the single internal standard (i.e., sphingosyl PE, d17:1/12:0) used as a common reference for all polar lipid classes. The accuracy, reproducibility and robustness of the method have been checked by various means: (1) comparison with conventional lipidomic quantitation using SRM scans on a triple quadrupole (QqQ) mass analyzer, (2) (31)P nuclear magnetic resonance (NMR) quantitation of the total lipid extract, (3) a method robustness test using subsequent measurements by three different persons, (4) method transfer to different HPLC/MS systems using different chromatographic conditions, and (5) comparison with previously published results for identical samples, especially human reference plasma from the National Institute of Standards and Technology (NIST human plasma).
Results on human plasma, egg yolk and porcine liver extracts are presented and discussed.
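The single-internal-standard scheme amounts to scaling each class's integrated peak area by its predetermined response factor relative to the internal standard. A minimal sketch (all areas, amounts and RF values below are hypothetical):

```python
def quantify_class(peak_area, rf, is_area, is_amount):
    """Amount of a lipid class from its HILIC peak area.

    peak_area: integrated area of the lipid-class peak
    rf: class-specific response factor relative to the internal standard
    is_area: peak area of the single internal standard (sphingosyl PE d17:1/12:0)
    is_amount: spiked amount of the internal standard (e.g. nmol)
    """
    return peak_area / is_area * is_amount * rf

# Hypothetical areas and RFs for two lipid classes against one internal standard
pc = quantify_class(peak_area=5.0e6, rf=1.3, is_area=1.0e6, is_amount=10.0)
pe = quantify_class(peak_area=2.0e6, rf=0.9, is_area=1.0e6, is_amount=10.0)
print(pc, pe)  # prints 65.0 18.0
```

The RFs themselves would be determined once per class from calibration standards; after that, every class in a run is quantified against the one spiked internal standard.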

  15. Comparison of the COBAS TAQMAN HIV-1 HPS with VERSANT HIV-1 RNA 3.0 assay (bDNA) for plasma RNA quantitation in different HIV-1 subtypes.

    PubMed

    Gomes, Perpétua; Palma, Ana Carolina; Cabanas, Joaquim; Abecasis, Ana; Carvalho, Ana Patrícia; Ziermann, Rainer; Diogo, Isabel; Gonçalves, Fátima; Lobo, Céu Sousa; Camacho, Ricardo

    2006-08-01

    Quantitation of HIV-1 RNA levels in plasma has an undisputed prognostic value and is extremely important for evaluating response to antiretroviral therapy. The purpose of this study was to evaluate the performance of the real-time PCR COBAS TaqMan 48 analyser, comparing it to the existing VERSANT 3.0 (bDNA) assay for HIV-1 RNA quantitation in plasma of individuals infected with different HIV-1 subtypes (104 blood samples). A positive linear correlation between the two tests (r2 = 0.88) was found. Quantitation by the COBAS TaqMan assay was approximately 0.32 log10 higher than by bDNA. The relationship between the two assays was similar within all subtypes, with a Deming regression slope of <1 and mean differences of <0 in the Bland-Altman plots. Overall, no significant differences were found in plasma viral load quantitation in different HIV-1 subtypes between both assays; therefore these assays are suitable for viral load quantitation of highly genetically diverse HIV-1 plasma samples.
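The agreement statistics quoted above (mean log10 offset, Bland-Altman differences) can be computed directly from paired quantitations. A minimal sketch with made-up paired log10 viral loads simulating the reported +0.32 log offset (not data from this study):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical paired log10 HIV-1 RNA values (copies/mL) from two assays
bdna = rng.uniform(2.0, 6.0, size=104)
taqman = bdna + 0.32 + rng.normal(scale=0.2, size=104)  # simulated offset + noise

diff = taqman - bdna
bias = diff.mean()                       # Bland-Altman bias (mean difference)
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
print(round(bias, 2), [round(x, 2) for x in loa])
```

A Deming regression of taqman on bdna (which allows measurement error in both assays) would complement the Bland-Altman analysis, as in the record above.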

  16. EvolQG - An R package for evolutionary quantitative genetics

    PubMed Central

    Melo, Diogo; Garcia, Guilherme; Hubbe, Alex; Assis, Ana Paula; Marroig, Gabriel

    2016-01-01

    We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the EvolQG package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification. PMID:27785352

  17. Methodological aspects of multicenter studies with quantitative PET.

    PubMed

    Boellaard, Ronald

    2011-01-01

    Quantification of whole-body FDG PET studies is affected by many physiological and physical factors. Much of the variability in reported standardized uptake value (SUV) data seen in the literature results from the variability in methodology applied among these studies, i.e., due to the use of different scanners, acquisition and reconstruction settings, region of interest strategies, SUV normalization, and/or corrections methods. To date, the variability in applied methodology prohibits a proper comparison and exchange of quantitative FDG PET data. Consequently, the promising role of quantitative PET has been demonstrated in several monocentric studies, but these published results cannot be used directly as a guideline for clinical (multicenter) trials performed elsewhere. In this chapter, the main causes affecting whole-body FDG PET quantification and strategies to minimize its inter-institute variability are addressed.

  18. Validation Process for LEWICE Coupled by Use of a Navier-Stokes Solver

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    2016-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth under many meteorological conditions for any aircraft surface. This report presents results from the latest LEWICE release, version 3.5. This program differs from previous releases in its ability to model mixed-phase and ice-crystal conditions such as those encountered inside an engine. It also has expanded capability to use structured grids and a new capability to use results from unstructured-grid flow solvers. An extensive, quantitative comparison of the results against the database of ice shapes generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. This paper shows the differences in ice shape between LEWICE 3.5 and experimental data. In addition, comparisons are made between the lift and drag calculated on the ice shapes from experiment and those produced by LEWICE. This report also provides a description of both programs. Quantitative geometric comparisons are shown for horn height, horn angle, icing limit, area and leading-edge thickness. Quantitative comparisons of calculated lift and drag are also shown. The results show that the predictions are within the accuracy limits of the experimental data for the majority of cases.

  19. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied in ophthalmology, cardiology, and other fields. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal-to-noise ratio (SNR) and the contrast of phase retardation (or birefringence) images introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for quantitative studies. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on SNR was not overcome, so the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurements was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. Based on this, we developed a maximum a posteriori (MAP) estimator and demonstrated quantitative birefringence imaging [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved MAP estimator that takes the stochastic property of SNR into account. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of birefringence and SNR. The PDF was pre-computed before the measurement by a Monte Carlo (MC) simulation based on the mathematical model of JM-OCT. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and by in vivo measurements of the anterior and posterior eye segments, as well as in skin imaging. The new estimator shows superior performance and clearer image contrast.
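The estimate-from-a-precomputed-PDF idea can be illustrated in a few lines. This sketch substitutes a simple additive Gaussian noise model for the full JM-OCT noise model [1]: it builds the PDF table by Monte Carlo before any "measurement", then reads off the retardation value that maximizes the flat-prior posterior for a given observation. All names, grids, and the noise model are illustrative assumptions, not the authors' estimator:

```python
import numpy as np

# Illustrative stand-in: additive Gaussian noise replaces the full JM-OCT
# noise model; the PDF table is pre-computed by Monte Carlo, as in the paper.
rng = np.random.default_rng(0)
true_grid = np.linspace(0.0, 1.0, 101)       # candidate true retardation values
obs_edges = np.linspace(-0.5, 1.5, 201)      # measurement histogram bins
noise_sd = 0.1

pdf = np.empty((true_grid.size, obs_edges.size - 1))
for i, t in enumerate(true_grid):
    samples = t + rng.normal(0.0, noise_sd, 20000)
    pdf[i], _ = np.histogram(samples, bins=obs_edges, density=True)

def map_estimate(measurement):
    """Retardation value maximizing the posterior (flat prior) for one measurement."""
    j = np.searchsorted(obs_edges, measurement) - 1
    return true_grid[np.argmax(pdf[:, j])]

print(map_estimate(0.42))
```

With an unbiased noise model the MAP estimate tracks the measurement; the point of the real estimator is that the JM-OCT noise model is biased and SNR-dependent, so the table lookup corrects the bias that plain averaging leaves in.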

  20. Quantification and Comparison of Anti-Fibrotic Therapies by Polarized SRM and SHG-Based Morphometry in Rat UUO Model

    PubMed Central

    Weldon, Steve M.; Matera, Damian; Lee, ChungWein; Yang, Haichun; Fryer, Ryan M.; Fogo, Agnes B.; Reinhart, Glenn A.

    2016-01-01

    Renal interstitial fibrosis (IF) is an important pathologic manifestation of disease progression in a variety of chronic kidney diseases (CKD). However, quantitative and reproducible analysis of IF remains a challenge, especially in experimental animal models of progressive IF. In this study, we compare traditional polarized Sirius Red morphometry (SRM) to novel Second Harmonic Generation (SHG)-based morphometry of unstained tissues for quantitative analysis of IF in the rat 5-day unilateral ureteral obstruction (UUO) model. To validate the specificity of SHG for detecting fibrillar collagen components in IF, co-localization studies for collagens type I, III, and IV were performed using IHC. In addition, we examined the correlation, dynamic range, sensitivity, and ability of polarized SRM and SHG-based morphometry to detect an anti-fibrotic effect of three different treatment regimens. Comparisons were made across three separate studies in which animals were treated with three mechanistically distinct pharmacologic agents: enalapril (ENA, 15, 30, 60 mg/kg), mycophenolate mofetil (MMF, 2, 20 mg/kg) or the connective tissue growth factor (CTGF) neutralizing antibody, EX75606 (1, 3, 10 mg/kg). Our results demonstrate strong co-localization of the SHG signal with fibrillar collagens I and III but not with non-fibrillar collagen IV. Quantitative IF, calculated as the percent cortical area of fibrosis, showed similar response profiles for both polarized SRM and SHG-based morphometry, and the two methodologies exhibited a strong correlation across all three pharmacology studies (r2 = 0.89–0.96). However, compared with polarized SRM, SHG-based morphometry delivered a greater dynamic range and a larger absolute magnitude of reduction in IF after treatment. In summary, we demonstrate that SHG-based morphometry of unstained kidney tissue is comparable to polarized SRM for quantitation of fibrillar collagens, with enhanced sensitivity for detecting treatment-induced reductions in IF. Thus, SHG-based morphometry on unstained kidney tissue is a reliable alternative to traditional polarized SRM for quantitative analysis of IF. PMID:27257917
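The "percent cortical area of fibrosis" endpoint above is morphometry at its simplest: the fraction of cortex pixels whose fibrillar-collagen signal (SHG or polarized SR) exceeds an intensity threshold. A minimal numpy sketch of that computation; the array names and the fixed threshold are illustrative, whereas the published pipelines use calibrated, channel-specific thresholds:

```python
import numpy as np

def percent_fibrosis(signal, cortex_mask, threshold):
    """Percent cortical area of fibrosis: share of cortex pixels whose
    collagen signal exceeds an intensity threshold."""
    cortex = cortex_mask.astype(bool)
    fibrotic = (signal > threshold) & cortex
    return 100.0 * fibrotic.sum() / cortex.sum()

# Toy image: 20 bright pixels out of a 100-pixel cortex -> 20% fibrosis.
img = np.zeros((10, 10))
img[:2, :] = 5.0
mask = np.ones((10, 10))
print(percent_fibrosis(img, mask, 1.0))  # -> 20.0
```

The two modalities in the study differ in how `signal` is produced (stained birefringence vs. label-free SHG), not in this final area calculation, which is why their response profiles can be compared directly.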

  1. A Study to Determine the Academic Progression between Economically Disadvantaged Students and Their Economically Advantaged Peers on Georgia's Statewide Criteria Referenced Competency Test

    ERIC Educational Resources Information Center

    Warner, Tonya

    2009-01-01

    This quantitative study implemented a non-experimental design that was descriptive, ex post facto, and longitudinal. It examined economically disadvantaged students (EDS) in comparison with non-economically disadvantaged students (non-EDS) and their academic performance on Georgia's Criterion-Referenced Competency Tests (CRCT).…

  2. Development of a Contact Permeation Test Fixture and Method

    DTIC Science & Technology

    2013-04-01

    …direct contact with the skin, indicates the need for a quantitative contact test method. Comparison tests were conducted with VX on a standardized… Guide for the Care and Use of Laboratory Animals (8th ed.; National Research Council: Washington, DC, 2011). This test was also performed in…

  3. Accurate experimental and theoretical comparisons between superconductor-insulator-superconductor mixers showing weak and strong quantum effects

    NASA Technical Reports Server (NTRS)

    Mcgrath, W. R.; Richards, P. L.; Face, D. W.; Prober, D. E.; Lloyd, F. L.

    1988-01-01

    A systematic study of the gain and noise in superconductor-insulator-superconductor mixers employing Ta-based, Nb-based, and Pb-alloy-based tunnel junctions was made. These junctions displayed both weak and strong quantum effects at a signal frequency of 33 GHz. The effects of energy-gap sharpness and subgap current were investigated and are quantitatively related to mixer performance. Detailed comparisons are made of the mixing results with the predictions of a three-port model approximation to the Tucker theory. Mixer performance was measured with a novel test apparatus which is accurate enough to allow the first quantitative tests of theoretical noise predictions. It is found that the three-port model of the Tucker theory underestimates the mixer noise temperature by a factor of about 2 for all of the mixers. In addition, predicted values of available mixer gain are in reasonable agreement with experiment when quantum effects are weak. However, as quantum effects become strong, the predicted available gain diverges to infinity, in sharp contrast to the experimental results. Predictions of coupled gain do not always show such divergences.

  4. Evaluating the More Suitable ISM Frequency Band for IoT-Based Smart Grids: A Quantitative Study of 915 MHz vs. 2400 MHz.

    PubMed

    Sandoval, Ruben M; Garcia-Sanchez, Antonio-Javier; Garcia-Sanchez, Felipe; Garcia-Haro, Joan

    2016-12-31

    IoT has begun to be employed pervasively in industrial environments and critical infrastructures thanks to its positive impact on performance and efficiency. Among these environments, the Smart Grid (SG) excels as the perfect host for this technology, mainly due to its potential to become the motor of the rest of electrically-dependent infrastructures. To make this SG-oriented IoT cost-effective, most deployments employ unlicensed ISM bands, specifically the 2400 MHz one, due to its extended communication bandwidth in comparison with lower bands. This band has been extensively used for years by Wireless Sensor Networks (WSN) and Mobile Ad-hoc Networks (MANET), from which the IoT technologically inherits. However, this work questions and evaluates the suitability of such a "default" communication band in SG environments, compared with the 915 MHz ISM band. A comprehensive quantitative comparison of these bands has been accomplished in terms of: power consumption, average network delay, and packet reception rate. To allow such a study, a dual-band propagation model specifically designed for the SG has been derived, tested, and incorporated into the well-known TOSSIM simulator. Simulation results reveal that only in the absence of other 2400 MHz interfering devices (such as WiFi or Bluetooth) or in small networks, is the 2400 MHz band the best option. In any other case, SG-oriented IoT quantitatively perform better if operating in the 915 MHz band.

  5. Evaluating the More Suitable ISM Frequency Band for IoT-Based Smart Grids: A Quantitative Study of 915 MHz vs. 2400 MHz

    PubMed Central

    Sandoval, Ruben M.; Garcia-Sanchez, Antonio-Javier; Garcia-Sanchez, Felipe; Garcia-Haro, Joan

    2016-01-01

    IoT has begun to be employed pervasively in industrial environments and critical infrastructures thanks to its positive impact on performance and efficiency. Among these environments, the Smart Grid (SG) excels as the perfect host for this technology, mainly due to its potential to become the motor of the rest of electrically-dependent infrastructures. To make this SG-oriented IoT cost-effective, most deployments employ unlicensed ISM bands, specifically the 2400 MHz one, due to its extended communication bandwidth in comparison with lower bands. This band has been extensively used for years by Wireless Sensor Networks (WSN) and Mobile Ad-hoc Networks (MANET), from which the IoT technologically inherits. However, this work questions and evaluates the suitability of such a “default” communication band in SG environments, compared with the 915 MHz ISM band. A comprehensive quantitative comparison of these bands has been accomplished in terms of: power consumption, average network delay, and packet reception rate. To allow such a study, a dual-band propagation model specifically designed for the SG has been derived, tested, and incorporated into the well-known TOSSIM simulator. Simulation results reveal that only in the absence of other 2400 MHz interfering devices (such as WiFi or Bluetooth) or in small networks, is the 2400 MHz band the best option. In any other case, SG-oriented IoT quantitatively perform better if operating in the 915 MHz band. PMID:28042863

  6. Least squares QR-based decomposition provides an efficient way of computing optimal regularization parameter in photoacoustic tomography.

    PubMed

    Shaw, Calvin B; Prakash, Jaya; Pramanik, Manojit; Yalavarthy, Phaneendra K

    2013-08-01

    A computationally efficient approach that computes the optimal regularization parameter for the Tikhonov minimization scheme is developed for photoacoustic imaging. This approach is based on the least-squares QR (LSQR) decomposition, a well-known dimensionality-reduction technique for large systems of equations. The proposed framework is shown to be effective in terms of quantitative and qualitative reconstruction of the initial pressure distribution, enabled by finding an optimal regularization parameter. The computational efficiency and performance of the proposed method are demonstrated using a numerical blood-vessel phantom, where the initial pressure is exactly known, for quantitative comparison.
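The LSQR-with-damping formulation behind this kind of approach is available off the shelf: SciPy's `lsqr` solves min ||Ax - b||^2 + damp^2 ||x||^2 without ever forming A^T A, so candidate regularization parameters can be evaluated cheaply. A small sketch on a random stand-in for the photoacoustic system matrix; the matrix, noise level, and damp grid are assumptions, and scanning against a known answer is only possible here because the phantom is synthetic (the paper's method selects the parameter without knowing the true solution):

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(1)
A = rng.normal(size=(80, 40))            # stand-in for the photoacoustic system matrix
x_true = rng.normal(size=40)             # "known" initial pressure distribution
b = A @ x_true + rng.normal(scale=0.01, size=80)

# lsqr's `damp` argument solves min ||Ax - b||^2 + damp^2 ||x||^2, i.e. the
# Tikhonov problem, in a matrix-free fashion.
errors = {d: float(np.linalg.norm(lsqr(A, b, damp=d)[0] - x_true))
          for d in (0.0, 0.1, 1.0, 10.0)}
best = min(errors, key=errors.get)
print(best, errors[best])
```

On this well-conditioned toy matrix little regularization is needed; for a severely ill-posed system matrix the error curve over `damp` becomes sharply U-shaped, which is exactly what makes choosing the parameter worth automating.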

  7. Quantitative performance characterization of three-dimensional noncontact fluorescence molecular tomography

    NASA Astrophysics Data System (ADS)

    Favicchio, Rosy; Psycharakis, Stylianos; Schönig, Kai; Bartsch, Dusan; Mamalaki, Clio; Papamatheakis, Joseph; Ripoll, Jorge; Zacharakis, Giannis

    2016-02-01

    Fluorescent proteins and dyes are routine tools for biological research to describe the behavior of genes, proteins, and cells, as well as more complex physiological dynamics such as vessel permeability and pharmacokinetics. The use of these probes in whole body in vivo imaging would allow extending the range and scope of current biomedical applications and would be of great interest. In order to comply with a wide variety of application demands, in vivo imaging platform requirements span from wide spectral coverage to precise quantification capabilities. Fluorescence molecular tomography (FMT) detects and reconstructs in three dimensions the distribution of a fluorophore in vivo. Noncontact FMT allows fast scanning of an excitation source and noninvasive measurement of emitted fluorescent light using a virtual array detector operating in free space. Here, a rigorous process is defined that fully characterizes the performance of a custom-built horizontal noncontact FMT setup. Dynamic range, sensitivity, and quantitative accuracy across the visible spectrum were evaluated using fluorophores with emissions between 520 and 660 nm. These results demonstrate that high-performance quantitative three-dimensional visible light FMT allowed the detection of challenging mesenteric lymph nodes in vivo and the comparison of spectrally distinct fluorescent reporters in cell culture.

  8. Comparison of high-performance liquid chromatography and supercritical fluid chromatography using evaporative light scattering detection for the determination of plasticizers in medical devices.

    PubMed

    Lecoeur, Marie; Decaudin, Bertrand; Guillotin, Yoann; Sautou, Valérie; Vaccher, Claude

    2015-10-23

    Recently, interest in supercritical fluid chromatography (SFC) has increased due to its high throughput and the development of new systems with improved chromatographic performance. However, most papers have dealt with fundamental studies and chiral applications, and only a few works have described the validation process for SFC methods. Likewise, evaporative light scattering detection (ELSD) has been widely employed in liquid chromatography, but only a few recent works have presented its quantitative performance when hyphenated with SFC apparatus. The present paper discusses the quantitative performance of SFC-ELSD compared with HPLC-ELSD for the determination of plasticizers (ATBC, DEHA, DEHT and TOTM) in PVC tubing used in medical devices. After the development of the HPLC-ELSD method, both methods were evaluated based on the total-error approach using accuracy profiles. The results show that HPLC-ELSD was more precise than SFC-ELSD, but lower limits of quantitation were obtained by SFC. Hence, HPLC was validated within the ±10% acceptance limits, whereas SFC lacked the accuracy to quantify the plasticizers. Finally, both methods were used to determine the composition of plasticized-PVC medical devices. The results demonstrated that SFC and HPLC, both hyphenated with ELSD, provided similar results. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Novel ionic liquid matrices for qualitative and quantitative detection of carbohydrates by matrix assisted laser desorption/ionization mass spectrometry.

    PubMed

    Zhao, Xiaoyong; Shen, Shanshan; Wu, Datong; Cai, Pengfei; Pan, Yuanjiang

    2017-09-08

    Analysis of carbohydrates by matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is still challenging, and researchers have devoted themselves to the discovery of efficient matrices. In the present study, the design, synthesis, and qualitative and quantitative performance of non-derivative ionic liquid matrices (ILMs) are reported. DHB/N-methylaniline (N-MA) and DHB/N-ethylaniline (N-EA), which performed best for carbohydrate detection, were screened out. The limits of detection for oligosaccharides provided by DHB/N-MA and DHB/N-EA were as low as 10 fmol. DHB/N-MA and DHB/N-EA showed significantly higher ion-generation efficiency than DHB. A comparison of the capacity of these two ILMs and DHB to probe polysaccharides also revealed their powerful potential. Their outstanding performance was probably due to lower proton affinities and stronger UV absorption at λ = 355 nm. Moreover, taking DHB/N-MA as an example, quantitative analysis of fructo-oligosaccharide mixtures extracted and identified from rice noodles was accomplished sensitively using an internal-standard method. Overall, DHB/N-MA and DHB/N-EA exhibited excellent performance and may be significant additions to the carbohydrate matrices. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Evaluating the interior thermal performance of mosques in the tropical environment

    NASA Astrophysics Data System (ADS)

    Nordin, N. I.; Misni, A.

    2018-02-01

    This study introduces the methodology applied in data collection and data analysis. Data collection is the process of gathering and measuring information on targeted variables in an established, systematic way. Qualitative and quantitative methods were combined to collect data from government departments, site experiments, and observation. The indoor thermal performance of the heritage and new mosques was analysed using thermal monitoring tests, with validation against meteorological data. Origin 8 software was used to analyse all the data. Comparison techniques were applied to analyse several factors that influence the indoor thermal performance of mosques, namely elements of the building envelope including floor area, openings, and materials used. Building orientation, location, surrounding vegetation, and water elements were also recorded as supporting primary building data. A comparison of the primary data across these variables for four mosques, comprising heritage and new buildings, is presented.

  11. Comparison of analytical and experimental performance of a wind-tunnel diffuser section

    NASA Technical Reports Server (NTRS)

    Shyne, R. J.; Moore, R. D.; Boldman, D. R.

    1986-01-01

    Wind tunnel diffuser performance is evaluated by comparing experimental data with analytical results predicted by a one-dimensional integration procedure with skin-friction coefficient, a two-dimensional interactive boundary-layer procedure for analyzing conical diffusers, and a two-dimensional, integral, compressible laminar and turbulent boundary-layer code. Pressure, temperature, and velocity data for a 3.25 deg equivalent-cone half-angle diffuser (37.3 in., 94.742 cm outlet diameter) were obtained from the one-tenth-scale Altitude Wind Tunnel modeling program at the NASA Lewis Research Center. The comparison is performed at Mach numbers of 0.162 (Re = 3.097x10(6)), 0.326 (Re = 6.2737x10(6)), and 0.363 (Re = 7.0129x10(6)). The Reynolds numbers are all based on an inlet diffuser diameter of 32.4 in. (82.296 cm), and reasonable quantitative agreement was obtained between the experimental data and the computational codes.

  12. Introduction of an automated user-independent quantitative volumetric magnetic resonance imaging breast density measurement system using the Dixon sequence: comparison with mammographic breast density assessment.

    PubMed

    Wengert, Georg Johannes; Helbich, Thomas H; Vogl, Wolf-Dieter; Baltzer, Pascal; Langs, Georg; Weber, Michael; Bogner, Wolfgang; Gruber, Stephan; Trattnig, Siegfried; Pinker, Katja

    2015-02-01

    The purposes of this study were to introduce and assess an automated user-independent quantitative volumetric (AUQV) breast density (BD) measurement system based on magnetic resonance imaging (MRI) using the Dixon technique, and to compare it with qualitative and quantitative mammographic (MG) BD measurements. Forty-three women with normal mammogram results (Breast Imaging Reporting and Data System 1) were included in this institutional review board-approved prospective study. All participants underwent BD assessment with MRI using a Dixon sequence (repetition time/echo time 1/echo time 2, 6 milliseconds/2.45 milliseconds/2.67 milliseconds; 1-mm isotropic; 3 minutes 38 seconds). To test reproducibility, a second MRI examination was performed after patient repositioning. The AUQV magnetic resonance (MR) BD measurement system automatically calculated percentage (%) BD. The qualitative BD assessment was performed using the American College of Radiology Breast Imaging Reporting and Data System BD categories. Quantitative BD was estimated semiautomatically using the thresholding technique Cumulus4. Appropriate statistical tests were used to assess the agreement between the AUQV MR measurements and to compare them with the qualitative and quantitative MG BD estimations. The AUQV MR BD measurements were successfully performed in all 43 women. There was nearly perfect agreement of the AUQV MR BD measurements between the 2 MR examinations for % BD (P < 0.001; intraclass correlation coefficient, 0.998), with no significant differences (P = 0.384). The AUQV MR BD measurements were significantly lower than the quantitative and qualitative MG BD assessments (P < 0.001). The AUQV MR BD measurement system allows a fully automated, user-independent, robust, reproducible, as well as radiation- and compression-free volumetric quantitative BD assessment through different levels of BD.
The AUQV MR BD measurements were significantly lower than the currently used qualitative and quantitative MG-based approaches, implying that the current assessment might overestimate breast density with MG.
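Dixon imaging separates water (fibroglandular) and fat signal per voxel, which is what makes a fully automated volumetric percent density straightforward in principle: %BD is the water fraction of the total breast volume. An illustrative sketch of that volumetric calculation, not the AUQV system itself; the array names and the binary breast mask are assumptions:

```python
import numpy as np

def percent_density(water, fat, breast_mask):
    """Volumetric breast density from Dixon water/fat volumes:
    fibroglandular (water) share of the total masked breast signal."""
    m = breast_mask.astype(bool)
    w, f = water[m].sum(), fat[m].sum()
    return 100.0 * w / (w + f)

# Toy volumes: uniform water and fat signal, 1:3 ratio -> 25% density.
water = np.full((4, 4, 4), 1.0)
fat = np.full((4, 4, 4), 3.0)
mask = np.ones((4, 4, 4))
print(percent_density(water, fat, mask))  # -> 25.0
```

Because the measurement is volumetric rather than a projection, values computed this way sit systematically below area-based mammographic estimates, consistent with the difference the study reports.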

  13. Performance Evaluation of the Bioneer AccuPower® HIV-1 Quantitative RT-PCR kit: Comparison with the Roche COBAS® AmpliPrep/COBAS TaqMan® HIV-1 Test Ver.2.0 for Quantification of HIV-1 Viral Load in Indonesia.

    PubMed

    Kosasih, Agus Susanto; Sugiarto, Christine; Hayuanta, Hubertus Hosti; Juhaendi, Runingsih; Setiawan, Lyana

    2017-08-08

    Measurement of viral load in human immunodeficiency virus type 1 (HIV-1)-infected patients is essential for establishing a therapeutic strategy. Several qPCR-based assays are available for the measurement of viral load; they differ in sample volume, technology applied, target gene, sensitivity and dynamic range. The Bioneer AccuPower® HIV-1 Quantitative RT-PCR is a novel commercial kit whose performance has not previously been evaluated. This study aimed to evaluate the performance of the Bioneer AccuPower® HIV-1 Quantitative RT-PCR kit. In total, 288 EDTA plasma samples from the Dharmais Cancer Hospital were analyzed with the Bioneer AccuPower® HIV-1 Quantitative RT-PCR kit and the Roche COBAS® AmpliPrep/COBAS® TaqMan® HIV-1 test version 2.0 (CAP/CTM v2.0). The performance of the Bioneer assay was then evaluated against the Roche CAP/CTM v2.0. Overall, there was good agreement between the two assays. The Bioneer assay showed a significant linear correlation with CAP/CTM v2.0 (R2 = 0.963, p < 0.001) for all samples (N = 118) quantified by both assays, with high agreement (94.9%, 112/118) according to the Bland-Altman model. The mean difference between the quantitative values measured by the Bioneer assay and CAP/CTM v2.0 was 0.11 Log10 IU/mL (SD = 0.26). Based on these results, the Bioneer assay can be used to quantify HIV-1 RNA in clinical laboratories.
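The Bland-Altman agreement quoted above (mean difference, SD of differences, and the share of pairs inside the 95% limits) takes only a few lines to compute. A sketch on synthetic paired log-scale measurements; the simulated bias and SD simply mirror the figures in the abstract and are not the study's data:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement: mean difference (bias), 95% limits of
    agreement, and the fraction of pairs falling inside those limits."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias, sd = diff.mean(), diff.std(ddof=1)
    lo, hi = bias - 1.96 * sd, bias + 1.96 * sd
    within = float(np.mean((diff >= lo) & (diff <= hi)))
    return bias, (lo, hi), within

rng = np.random.default_rng(2)
x = rng.normal(5.0, 1.0, 200)                 # e.g. log10 viral load, assay A
y = x + 0.11 + rng.normal(0.0, 0.26, 200)     # assay B with the reported bias/SD
bias, (lo, hi), within = bland_altman(y, x)
print(round(bias, 2), round(within, 3))
```

For approximately normal differences, roughly 95% of pairs land inside the limits by construction; an observed fraction well below that, or a bias that grows with the mean, signals disagreement that correlation alone would hide.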

  14. A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.

    PubMed

    Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao

    2015-06-15

    ChIP-seq is a powerful technology to measure protein binding or histone modification strength on a whole-genome scale. Although a number of methods are available for single ChIP-seq data analysis (e.g. 'peak detection'), rigorous statistical methods for quantitative comparison of multiple ChIP-seq datasets that account for data from control experiments, signal-to-noise ratios, biological variation and multiple-factor experimental designs are under-developed. In this work, we develop a statistical method to perform quantitative comparison of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks in all datasets and then take their union to form a single set of candidate regions. The read counts from the IP experiment at the candidate regions are assumed to follow a Poisson distribution. The underlying Poisson rates are modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through a hypothesis-testing procedure in a linear model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results than existing ones. An R software package, ChIPComp, is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
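The modeling step described above can be miniaturized to a single candidate region: treat the replicate read counts in each condition as Poisson and test a shared rate against condition-specific rates. This sketch uses a plain likelihood-ratio test as a stand-in for the paper's linear-model framework (ChIPComp additionally models control reads, artifacts, and biological variation), with made-up counts:

```python
import numpy as np
from scipy.stats import chi2

def poisson_lrt(counts1, counts2):
    """Likelihood-ratio test of one shared Poisson rate vs. condition-specific
    rates for the replicate read counts at a single candidate region."""
    c1, c2 = np.asarray(counts1, float), np.asarray(counts2, float)
    def ll(c, lam):                       # Poisson log-likelihood up to constants
        return float(np.sum(c * np.log(lam) - lam))
    pooled = np.concatenate([c1, c2])
    l0 = ll(pooled, pooled.mean())        # null: common rate
    l1 = ll(c1, c1.mean()) + ll(c2, c2.mean())  # alternative: separate rates
    stat = 2.0 * (l1 - l0)
    return chi2.sf(stat, df=1)            # one extra free parameter

print(poisson_lrt([50, 55, 48], [120, 110, 130]))  # clearly differential
print(poisson_lrt([50, 55, 48], [52, 49, 54]))     # consistent with no change
```

Repeating such a test over every region in the peak union, with the rate decomposed into artifact and signal terms, is the essence of the differential-binding comparison.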

  15. Forecast errors in dust vertical distributions over Rome (Italy): Multiple particle size representation and cloud contributions

    NASA Astrophysics Data System (ADS)

    Kishcha, P.; Alpert, P.; Shtivelman, A.; Krichak, S. O.; Joseph, J. H.; Kallos, G.; Katsafados, P.; Spyrou, C.; Gobbi, G. P.; Barnaba, F.; Nickovic, S.; Pérez, C.; Baldasano, J. M.

    2007-08-01

    In this study, forecast errors in dust vertical distributions were analyzed. This was carried out by using quantitative comparisons between dust vertical profiles retrieved from lidar measurements over Rome, Italy, performed from 2001 to 2003, and those predicted by models. Three models were used: the four-particle-size Dust Regional Atmospheric Model (DREAM), the older one-particle-size version of the SKIRON model from the University of Athens (UOA), and the pre-2006 one-particle-size Tel Aviv University (TAU) model. SKIRON and DREAM are initialized on a daily basis using the dust concentration from the previous forecast cycle, while the TAU model initialization is based on the Total Ozone Mapping Spectrometer aerosol index (TOMS AI). The quantitative comparison shows that (1) the use of four-particle-size bins in the dust modeling instead of only one-particle-size bins improves dust forecasts; (2) cloud presence could contribute to noticeable dust forecast errors in SKIRON and DREAM; and (3) as far as the TAU model is concerned, its forecast errors were mainly caused by technical problems with TOMS measurements from the Earth Probe satellite. As a result, dust forecast errors in the TAU model could be significant even under cloudless conditions. The DREAM versus lidar quantitative comparisons at different altitudes show that the model predictions are more accurate in the middle part of dust layers than in the top and bottom parts of dust layers.

  16. Grating-based tomography applications in biomedical engineering

    NASA Astrophysics Data System (ADS)

    Schulz, Georg; Thalmann, Peter; Khimchenko, Anna; Müller, Bert

    2017-10-01

    For the investigation of soft tissues, or of tissues consisting of both soft and hard tissues, on the microscopic level, hard X-ray phase tomography has become one of the most suitable imaging techniques. Among the phase-contrast methods, grating interferometry has the advantages of higher sensitivity than inline methods and of quantitative results. One disadvantage of the conventional double-grating setup (XDGI) compared with inline methods is the limitation of the spatial resolution. This limitation can be overcome by removing the analyser grating, resulting in a single-grating setup (XSGI). In order to verify the performance of XSGI with respect to contrast and spatial resolution, a quantitative comparison of XSGI and XDGI tomograms of a human nerve was performed. Both techniques provide sufficient contrast to allow the distinction of tissue types. The spatial resolution of the two-fold binned XSGI data set is improved by a factor of two in comparison with XDGI, which underlines its performance in tomography of soft tissues. Another application of grating-based X-ray phase tomography is the simultaneous visualization of the soft and hard tissues of a plaque-containing coronary artery. The simultaneous visualization of both tissue types is important for the segmentation of the lumen. The segmented data can be used for flow simulations to obtain information about the three-dimensional wall shear stress distribution needed for the optimization of mechano-sensitive nanocontainers used for drug delivery.

  17. A peptide-retrieval strategy enables significant improvement of quantitative performance without compromising confidence of identification.

    PubMed

    Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun

    2017-01-30

    Reliable quantification of low-abundance proteins in complex proteomes is challenging largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently-identified proteins were recovered, and the peptide-spectrum-match FDR were re-calculated and controlled at a confident level of FDR≤1%, while protein FDR maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. >60% increase of total quantified spectra/peptides was respectively achieved for analyzing a spike-in sample set and a public dataset from CPTAC. Incorporating the peptide retrieval strategy significantly improved the quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity of confidently discovering significantly-altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification, which can be readily implemented in a broad range of quantitative proteomics techniques including label-free or labeling approaches. We hypothesize that more quantifiable spectra and peptides in a protein, even including less confident peptides, could help reduce variations and improve protein quantification. Hence the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approach. 
The list of confidently identified proteins obtained with the standard target-decoy search strategy was fixed, and additional, less confident spectra/peptides matching those proteins were retrieved; the total peptide-spectrum-match false discovery rate (PSM FDR) after retrieval was nevertheless still controlled at a confident level of FDR≤1%. As expected, the penalty for occasionally incorporating incorrect peptide identifications is negligible by comparison with the improvements in quantitative performance. This simple strategy yielded more quantifiable peptides, a lower missing-value rate, and significantly better quantitative accuracy and precision for the same protein identifications. The strategy is in principle applicable to any quantitative approach in proteomics and thereby provides more quantitative information, especially on low-abundance proteins. Published by Elsevier B.V.
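The retrieval-and-refilter bookkeeping described in this record can be sketched as follows. This is a hypothetical illustration of standard target-decoy FDR control applied to a retrieved PSM pool (invented data structures; not the authors' code):

```python
# Hypothetical sketch: retrieve filtered-out PSMs that match already-confident
# proteins, then re-filter so the running target-decoy FDR stays <= 1%.
def psm_fdr(n_targets, n_decoys):
    """Standard target-decoy FDR estimate: decoys / targets above a cutoff."""
    return n_decoys / n_targets if n_targets else 0.0

def retrieve_peptides(psms, confident_proteins, fdr_cap=0.01):
    """Keep filtered-out PSMs mapping to confident proteins, walking down the
    score-sorted list until the running FDR estimate would exceed fdr_cap."""
    pool = [p for p in psms if p["protein"] in confident_proteins]
    pool.sort(key=lambda p: p["score"], reverse=True)
    kept, n_t, n_d = [], 0, 0
    for p in pool:
        n_t += p["label"] == "target"
        n_d += p["label"] == "decoy"
        if psm_fdr(n_t, n_d) > fdr_cap:
            break
        kept.append(p)
    return [p for p in kept if p["label"] == "target"]
```

The key design point mirrored here is that the protein list is frozen first; only the PSM-level threshold is relaxed and re-controlled.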

  18. Quantitative Voxel-to-Voxel Comparison of TriBeam and DCT Strontium Titanate Three-Dimensional Data Sets (Postprint)

    DTIC Science & Technology

    2015-02-09

Syha, Melanie (European Synchrotron Radiation Facility); Gumbsch, Peter (Karlsruhe Institute of Technology; Fraunhofer IWM, Woelerstr. 11, 79108 Freiburg, Germany)

  19. Parenting in Families with a Child with Autism Spectrum Disorder and a Typically Developing Child: Mothers' Experiences and Cognitions

    ERIC Educational Resources Information Center

    Meirsschaut, Mieke; Roeyers, Herbert; Warreyn, Petra

    2010-01-01

    The parenting experiences of mothers in a family with a child with autism spectrum disorder (ASD) and a typically developing (TD) child were studied using a qualitative analysis of mothers' perceptions of the impact of autism on family and personal life. An additional quantitative comparison was performed to evaluate the effect of ASD on mothers'…

  20. Diagnostic accuracy of stress perfusion CMR in comparison with quantitative coronary angiography: fully quantitative, semiquantitative, and qualitative assessment.

    PubMed

    Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E

    2014-01-01

This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease, and comparisons to semiquantitative and qualitative methods are limited. Dual-bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve analysis occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively).
Fully quantitative stress perfusion CMR has high diagnostic accuracy for detecting obstructive coronary artery disease. QP outperforms semiquantitative measures of perfusion and qualitative methods that incorporate a combination of cine, perfusion, and late gadolinium enhancement imaging. These findings suggest a potential clinical role for quantitative stress perfusion CMR. Copyright © 2014 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
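The diagnostic criterion reported above (test-positive when endocardial flow falls below 50% of mean epicardial flow, disease-positive when angiography shows ≥70% stenosis) can be turned into sensitivity and specificity as sketched below. The data layout is assumed for illustration; this is not the study's code:

```python
# Illustrative sensitivity/specificity computation for the QP threshold
# criterion (assumed tuple layout; not the study's implementation).
def qp_positive(endo_flow, epi_flow, ratio=0.5):
    """Test positive when endocardial flow < 50% of mean epicardial flow."""
    return endo_flow < ratio * epi_flow

def sensitivity_specificity(cases):
    """cases: iterable of (endo_flow, epi_flow, diseased) tuples, where
    diseased means >=70% stenosis on quantitative coronary angiography."""
    tp = fp = tn = fn = 0
    for endo, epi, diseased in cases:
        pos = qp_positive(endo, epi)
        if diseased and pos:
            tp += 1
        elif diseased:
            fn += 1
        elif pos:
            fp += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)
```

Sweeping `ratio` over a grid and plotting the resulting (1 − specificity, sensitivity) pairs is the receiver-operating characteristic analysis the abstract refers to.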

  1. NHS-based Tandem Mass Tagging of Proteins at the Level of Whole Cells: A Critical Evaluation in Comparison to Conventional TMT-Labeling Approaches for Quantitative Proteome Analysis.

    PubMed

    Megger, Dominik A; Pott, Leona L; Rosowski, Kristin; Zülch, Birgit; Tautges, Stephanie; Bracht, Thilo; Sitek, Barbara

    2017-01-01

Tandem mass tags (TMT) are usually introduced at the level of isolated proteins or peptides. Here, for the first time, we report the labeling of whole cells and a critical evaluation of its performance in comparison to conventional labeling approaches. The results indicate that TMT protein labeling of intact cells is generally possible if it is coupled to subsequent enrichment with an anti-TMT antibody. The quantitative results were similar to those obtained after labeling of isolated proteins, and both were found to be slightly complementary to peptide labeling. Furthermore, when using NHS-based TMT, no specificity towards cell-surface proteins was observed in the case of cell labeling. In summary, the study provides first evidence for the general feasibility of TMT cell labeling and highlights limitations of NHS-based labeling reagents. Future studies should therefore focus on the synthesis and investigation of membrane-impermeable TMTs to increase specificity towards cell-surface proteins.

  2. Extension of nanoconfined DNA: Quantitative comparison between experiment and theory

    NASA Astrophysics Data System (ADS)

    Iarko, V.; Werner, E.; Nyberg, L. K.; Müller, V.; Fritzsche, J.; Ambjörnsson, T.; Beech, J. P.; Tegenfeldt, J. O.; Mehlig, K.; Westerlund, F.; Mehlig, B.

    2015-12-01

    The extension of DNA confined to nanochannels has been studied intensively and in detail. However, quantitative comparisons between experiments and model calculations are difficult because most theoretical predictions involve undetermined prefactors, and because the model parameters (contour length, Kuhn length, effective width) are difficult to compute reliably, leading to substantial uncertainties. Here we use a recent asymptotically exact theory for the DNA extension in the "extended de Gennes regime" that allows us to compare experimental results with theory. For this purpose, we performed experiments measuring the mean DNA extension and its standard deviation while varying the channel geometry, dye intercalation ratio, and ionic strength of the buffer. The experimental results agree very well with theory at high ionic strengths, indicating that the model parameters are reliable. At low ionic strengths, the agreement is less good. We discuss possible reasons. In principle, our approach allows us to measure the Kuhn length and the effective width of a single DNA molecule and more generally of semiflexible polymers in solution.

  3. Environmental Health Practice: Statistically Based Performance Measurement

    PubMed Central

    Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.

    2007-01-01

Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was retained when a modified Bonferroni adjustment for multiple comparisons was applied. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
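The statistical machinery used in this record, a Fisher exact test per category followed by a Bonferroni-style adjustment for multiple comparisons, can be sketched as below. For brevity this implements the one-sided exact test and the plain (unmodified) Bonferroni cutoff; the study's exact variant is not specified in the abstract:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    probability under the hypergeometric null of a table at least as
    extreme (cell `a` as large or larger), with margins fixed."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    return p

def bonferroni(pvals, alpha=0.05):
    """Reject each null hypothesis whose p-value is below alpha / m."""
    cutoff = alpha / len(pvals)
    return [p < cutoff for p in pvals]
```

With four performance categories, each test would be compared against 0.05 / 4 = 0.0125 under the plain Bonferroni rule.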

  4. iMet-Q: A User-Friendly Tool for Label-Free Metabolomics Quantitation Using Dynamic Peak-Width Determination

    PubMed Central

    Chang, Hui-Yin; Chen, Ching-Tai; Lih, T. Mamie; Lynn, Ke-Shiuan; Juo, Chiun-Gung; Hsu, Wen-Lian; Sung, Ting-Yi

    2016-01-01

    Efficient and accurate quantitation of metabolites from LC-MS data has become an important topic. Here we present an automated tool, called iMet-Q (intelligent Metabolomic Quantitation), for label-free metabolomics quantitation from high-throughput MS1 data. By performing peak detection and peak alignment, iMet-Q provides a summary of quantitation results and reports ion abundance at both replicate level and sample level. Furthermore, it gives the charge states and isotope ratios of detected metabolite peaks to facilitate metabolite identification. An in-house standard mixture and a public Arabidopsis metabolome data set were analyzed by iMet-Q. Three public quantitation tools, including XCMS, MetAlign, and MZmine 2, were used for performance comparison. From the mixture data set, seven standard metabolites were detected by the four quantitation tools, for which iMet-Q had a smaller quantitation error of 12% in both profile and centroid data sets. Our tool also correctly determined the charge states of seven standard metabolites. By searching the mass values for those standard metabolites against Human Metabolome Database, we obtained a total of 183 metabolite candidates. With the isotope ratios calculated by iMet-Q, 49% (89 out of 183) metabolite candidates were filtered out. From the public Arabidopsis data set reported with two internal standards and 167 elucidated metabolites, iMet-Q detected all of the peaks corresponding to the internal standards and 167 metabolites. Meanwhile, our tool had small abundance variation (≤0.19) when quantifying the two internal standards and had higher abundance correlation (≥0.92) when quantifying the 167 metabolites. iMet-Q provides user-friendly interfaces and is publicly available for download at http://ms.iis.sinica.edu.tw/comics/Software_iMet-Q.html. PMID:26784691

  5. GLS-Finder: A Platform for Fast Profiling of Glucosinolates in Brassica Vegetables.

    PubMed

    Sun, Jianghao; Zhang, Mengliang; Chen, Pei

    2016-06-01

Mass spectrometry combined with related tandem techniques has become the most popular method for characterizing plant secondary metabolites. We introduce a new strategy, based on in-database searching, mass-fragmentation behavior, and formula prediction, for fast profiling of glucosinolates, a class of important compounds in Brassica vegetables. A MATLAB script-based expert-system computer program, "GLS-Finder", was developed. It is capable of qualitative and semi-quantitative analysis of glucosinolates in samples using data generated by ultrahigh-performance liquid chromatography-high-resolution accurate mass with multi-stage mass fragmentation (UHPLC-HRAM/MS(n)). A suite of bioinformatic tools was integrated into GLS-Finder to perform raw-data deconvolution, peak alignment, putative glucosinolate assignment, semi-quantitation, and unsupervised principal component analysis (PCA). GLS-Finder was successfully applied to identify intact glucosinolates in 49 commonly consumed Brassica vegetable samples in the United States. It is believed that this work introduces a new way of fast data processing and interpretation for qualitative and quantitative analyses of glucosinolates, with greatly improved efficiency compared to manual identification.

  6. Visual investigation on the heat dissipation process of a heat sink by using digital holographic interferometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Bingjing; Zhao, Jianlin, E-mail: jlzhao@nwpu.edu.cn; Wang, Jun

    2013-11-21

We present a method for visually and quantitatively investigating the heat dissipation process of plate-fin heat sinks by using digital holographic interferometry. A series of phase-change maps reflecting the temperature distribution and variation trend of the air field surrounding the heat sink during the heat dissipation process are numerically reconstructed based on double-exposure holographic interferometry. According to the phase unwrapping algorithm and the derived relationship between temperature and phase change of the detection beam, the full-field temperature distributions are quantitatively obtained with reasonably high measurement accuracy. The impact of the heat sink's channel width on the heat dissipation performance in the case of natural convection is then analyzed. In addition, a comparison between simulation and experimental results is given to verify the reliability of this method. The experimental results confirm the feasibility and validity of the presented method for full-field, dynamic, and quantitative measurement of the air-field temperature distribution, which provides a basis for analyzing the heat dissipation performance of plate-fin heat sinks.

  7. Experimental comparison between performance of the PM and LPM methods in computed radiography

    NASA Astrophysics Data System (ADS)

    Kermani, Aboutaleb; Feghhi, Seyed Amir Hossein; Rokrok, Behrouz

    2018-07-01

Scatter degrades image quality and reduces its information efficiency in quantitative measurements when creating projections with ionizing radiation. A variety of methods has therefore been applied to reduce scatter and correct its undesirable effects. As new approaches, the ordinary and localized primary modulation methods have previously been used individually, through experiments and simulations, in medical and industrial computed tomography respectively. The aim of this study is to evaluate the capabilities and limitations of these methods in comparison with each other. To this end, ordinary primary modulation was implemented in computed radiography for the first time, and the potential of both methods was assessed for thickness measurement as well as for determining the scatter-to-primary signal ratio. The comparison results, based on experimental outputs obtained using aluminum specimens and continuous X-ray spectra, favor the localized primary modulation method because of its improved accuracy and higher performance, especially at the edges.

  8. SU-D-204-05: Quantitative Comparison of a High Resolution Micro-Angiographic Fluoroscopic (MAF) Detector with a Standard Flat Panel Detector (FPD) Using the New Metric of Generalized Measured Relative Object Detectability (GM-ROD)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russ, M; Ionita, C; Bednarek, D

Purpose: In endovascular image-guided neuro-interventions, visualization of fine detail is paramount; for example, the interventionist's ability to visualize stent struts depends heavily on x-ray imaging detector performance. Methods: The relative performance of a high-resolution MAF-CMOS detector (pixel size 75 µm, Nyquist frequency 6.6 cycles/mm) and a standard flat panel detector (pixel size 194 µm, Nyquist frequency 2.5 cycles/mm) in imaging a neuro stent was examined using the Generalized Measured Relative Object Detectability (GM-ROD) metric. Low-quantum-noise images of a deployed stent were obtained by averaging 95 frames acquired with each detector without changing other exposure or geometric parameters. The square of the Fourier transform of each image is divided by the generalized normalized noise power spectrum to give an effective measured task-specific signal-to-noise ratio. This expression is then integrated from 0 to each detector's Nyquist frequency, and the GM-ROD value is the ratio of the MAF-CMOS integral to that of the FPD. The lower bound of integration can be raised to emphasize high frequencies in the detector comparison. Results: The MAF-CMOS detector exhibits vastly superior performance over the FPD when integrating over all frequencies, yielding a GM-ROD value of 63.1. The lower bound of integration was stepped up in increments of 0.5 cycles/mm for higher-frequency comparisons; as the lower bound increased, the GM-ROD value grew, reflecting the superior performance of the MAF-CMOS in the high-frequency regime. Conclusion: GM-ROD is a versatile metric that provides quantitative, detector- and task-dependent comparisons that can serve as a basis for detector selection. Supported by NIH Grant 2R01EB002873 and an equipment grant from Toshiba Medical Systems Corporation.
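The GM-ROD construction described above can be illustrated with a schematic one-dimensional version (the study works with 2D Fourier data; the discretization below is an assumed simplification, not the authors' code). The task-specific SNR density |S(f)|²/NNPS(f) is summed from a lower bound up to each detector's Nyquist frequency, and GM-ROD is the ratio of the two sums:

```python
# Schematic 1D GM-ROD sketch (assumed discretization, not the study's code).
def rod_integral(signal_power, nnps, freqs, f_low, nyquist):
    """Sum of |S(f)|^2 / NNPS(f) over frequency samples in [f_low, nyquist]."""
    return sum(s / n for s, n, f in zip(signal_power, nnps, freqs)
               if f_low <= f <= nyquist)

def gm_rod(sig_a, nnps_a, nyq_a, sig_b, nnps_b, nyq_b, freqs, f_low=0.0):
    """Ratio of detector A's integral to detector B's, each taken up to its
    own Nyquist frequency; raising f_low emphasizes high frequencies."""
    return (rod_integral(sig_a, nnps_a, freqs, f_low, nyq_a) /
            rod_integral(sig_b, nnps_b, freqs, f_low, nyq_b))
```

Because each detector's integral runs only to its own Nyquist frequency, a higher-resolution detector accumulates task SNR over a wider band, which is what drives GM-ROD above 1 in the record's comparison.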

  9. Boron concentration measurements by alpha spectrometry and quantitative neutron autoradiography in cells and tissues treated with different boronated formulations and administration protocols.

    PubMed

Bortolussi, Silva; Ciani, Laura; Postuma, Ian; Protti, Nicoletta; Reversi, Luca; Bruschi, Piero; Ferrari, Cinzia; Cansolino, Laura; Panza, Luigi; Ristori, Sandra; Altieri, Saverio

    2014-06-01

The ability to measure boron concentration with high precision in the tissues to be irradiated is a fundamental requirement for safe and effective BNCT treatment. In Pavia, two techniques have been used for this purpose: a quantitative method based on charged-particle spectrometry and boron biodistribution imaging based on neutron autoradiography. A quantitative method to determine boron concentration by neutron autoradiography was recently set up and calibrated for the measurement of biological samples, both solid and liquid, within the BNCT feasibility study. The technique was calibrated, and the results were cross-checked against those of α spectrometry in order to validate them. The comparisons were performed using tissues taken from animals treated with different boron administration protocols. Quantitative neutron autoradiography was subsequently employed to measure osteosarcoma cell samples treated with BPA and with new boronated formulations. © 2013 Published by Elsevier Ltd.

  10. Statistical Issues in the Comparison of Quantitative Imaging Biomarker Algorithms using Pulmonary Nodule Volume as an Example

    PubMed Central

    2014-01-01

    Quantitative imaging biomarkers (QIBs) are being used increasingly in medicine to diagnose and monitor patients’ disease. The computer algorithms that measure QIBs have different technical performance characteristics. In this paper we illustrate the appropriate statistical methods for assessing and comparing the bias, precision, and agreement of computer algorithms. We use data from three studies of pulmonary nodules. The first study is a small phantom study used to illustrate metrics for assessing repeatability. The second study is a large phantom study allowing assessment of four algorithms’ bias and reproducibility for measuring tumor volume and the change in tumor volume. The third study is a small clinical study of patients whose tumors were measured on two occasions. This study allows a direct assessment of six algorithms’ performance for measuring tumor change. With these three examples we compare and contrast study designs and performance metrics, and we illustrate the advantages and limitations of various common statistical methods for QIB studies. PMID:24919828
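The repeatability assessment mentioned for the first phantom study is commonly summarized by the within-subject standard deviation (wSD) and the repeatability coefficient RC = 1.96·√2·wSD, the limit expected to bound the difference between two repeated measurements 95% of the time. A minimal sketch of these textbook estimators (standard definitions, not code from the cited paper):

```python
from math import sqrt
from statistics import mean

def within_subject_sd(replicates):
    """Within-subject SD from replicate measurements.
    replicates: list of per-subject lists of repeated measurements."""
    variances = []
    for reps in replicates:
        m = mean(reps)
        variances.append(sum((x - m) ** 2 for x in reps) / (len(reps) - 1))
    return sqrt(mean(variances))

def repeatability_coefficient(replicates):
    """RC = 1.96 * sqrt(2) * wSD: 95% limit for a test-retest difference."""
    return 1.96 * sqrt(2) * within_subject_sd(replicates)
</```

Bias and reproducibility comparisons across algorithms, as in the second and third studies, build on the same replicate-level variance estimates.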

  11. Assessment of simulation fidelity using measurements of piloting technique in flight

    NASA Technical Reports Server (NTRS)

    Clement, W. F.; Cleveland, W. B.; Key, D. L.

    1984-01-01

    The U.S. Army and NASA joined together on a project to conduct a systematic investigation and validation of a ground based piloted simulation of the Army/Sikorsky UH-60A helicopter. Flight testing was an integral part of the validation effort. Nap-of-the-Earth (NOE) piloting tasks which were investigated included the bob-up, the hover turn, the dash/quickstop, the sidestep, the dolphin, and the slalom. Results from the simulation indicate that the pilot's NOE task performance in the simulator is noticeably and quantifiably degraded when compared with the task performance results generated in flight test. The results of the flight test and ground based simulation experiments support a unique rationale for the assessment of simulation fidelity: flight simulation fidelity should be judged quantitatively by measuring pilot's control strategy and technique as induced by the simulator. A quantitative comparison is offered between the piloting technique observed in a flight simulator and that observed in flight test for the same tasks performed by the same pilots.

  12. Forensic Comparison and Matching of Fingerprints: Using Quantitative Image Measures for Estimating Error Rates through Understanding and Predicting Difficulty

    PubMed Central

    Kellman, Philip J.; Mnookin, Jennifer L.; Erlikhman, Gennady; Garrigan, Patrick; Ghose, Tandra; Mettler, Everett; Charlton, David; Dror, Itiel E.

    2014-01-01

    Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. 
The results indicate the plausibility of using objective image metrics to predict expert performance and subjective assessment of difficulty in fingerprint comparisons. PMID:24788812

  13. Quantitative comparison of PZT and CMUT probes for photoacoustic imaging: Experimental validation.

    PubMed

    Vallet, Maëva; Varray, François; Boutet, Jérôme; Dinten, Jean-Marc; Caliano, Giosuè; Savoia, Alessandro Stuart; Vray, Didier

    2017-12-01

    Photoacoustic (PA) signals are short ultrasound (US) pulses typically characterized by a single-cycle shape, often referred to as N-shape. The spectral content of such wideband signals ranges from a few hundred kilohertz to several tens of megahertz. Typical reception frequency responses of classical piezoelectric US imaging transducers, based on PZT technology, are not sufficiently broadband to fully preserve the entire information contained in PA signals, which are then filtered, thus limiting PA imaging performance. Capacitive micromachined ultrasonic transducers (CMUT) are rapidly emerging as a valid alternative to conventional PZT transducers in several medical ultrasound imaging applications. As compared to PZT transducers, CMUTs exhibit both higher sensitivity and significantly broader frequency response in reception, making their use attractive in PA imaging applications. This paper explores the advantages of the CMUT larger bandwidth in PA imaging by carrying out an experimental comparative study using various CMUT and PZT probes from different research laboratories and manufacturers. PA acquisitions are performed on a suture wire and on several home-made bimodal phantoms with both PZT and CMUT probes. Three criteria, based on the evaluation of pure receive impulse response, signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) respectively, have been used for a quantitative comparison of imaging results. The measured fractional bandwidths of the CMUT arrays are larger compared to PZT probes. Moreover, both SNR and CNR are enhanced by at least 6 dB with CMUT technology. This work highlights the potential of CMUT technology for PA imaging through qualitative and quantitative parameters.
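The SNR and CNR criteria used for the comparison above can be sketched with common ROI-based definitions. The exact estimators used in the study are not given in the abstract, so the forms below are assumptions for illustration only:

```python
from math import log10
from statistics import mean, pstdev

# Assumed ROI-based definitions (illustrative; not the paper's estimators).
def snr_db(signal_roi, noise_roi):
    """SNR in dB: 20*log10(mean signal amplitude / noise standard deviation)."""
    return 20 * log10(mean(signal_roi) / pstdev(noise_roi))

def cnr_db(target_roi, background_roi):
    """CNR in dB: 20*log10(|mean difference| / background standard deviation)."""
    return 20 * log10(abs(mean(target_roi) - mean(background_roi))
                      / pstdev(background_roi))
```

Under definitions of this kind, the record's reported "at least 6 dB" enhancement corresponds to roughly a doubling of the underlying amplitude ratio.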

  14. Evaluation of patients with painful total hip arthroplasty using combined single photon emission tomography and conventional computerized tomography (SPECT/CT) - a comparison of semi-quantitative versus 3D volumetric quantitative measurements.

    PubMed

    Barthassat, Emilienne; Afifi, Faik; Konala, Praveen; Rasch, Helmut; Hirschmann, Michael T

    2017-05-08

The primary purpose of our study was to evaluate the inter- and intra-observer reliability of a standardized SPECT/CT algorithm for evaluating patients with painful primary total hip arthroplasty (THA). The secondary purpose was a comparison of semi-quantitative and 3D volumetric quantification methods for assessing bone tracer uptake (BTU) in these patients. A novel SPECT/CT localization scheme consisting of 14 femoral and 4 acetabular regions on standardized axial and coronal slices was introduced and evaluated in terms of inter- and intra-observer reliability in 37 consecutive patients with hip pain after THA. BTU for each anatomical region was assessed semi-quantitatively using a color-coded Likert-type scale (0-10) and volumetrically quantified using validated software. Two observers interpreted the SPECT/CT findings in all patients twice, in random order, with a six-week interval between interpretations. Semi-quantitative and quantitative measurements were compared in terms of reliability, and the values were correlated using Pearson's correlation. A factorial cluster analysis of BTU was performed to identify clinically relevant regions that should be grouped and analysed together. The localization scheme showed high inter- and intra-observer reliability for all femoral and acetabular regions, independent of the measurement method used (semi-quantitative versus 3D volumetric quantitative measurements). A high to moderate correlation between the two measurement methods was shown for the distal femur, the proximal femur, and the acetabular cup. The factorial cluster analysis showed that the anatomical regions might be summarized into three distinct regions: the proximal femur, the distal femur, and the acetabular cup. The SPECT/CT algorithm for assessment of patients with pain after THA is highly reliable, independent of the measurement method used.
Three clinically relevant anatomical regions (proximal femoral, distal femoral, acetabular) were identified.

  15. Database for LDV Signal Processor Performance Analysis

    NASA Technical Reports Server (NTRS)

    Baker, Glenn D.; Murphy, R. Jay; Meyers, James F.

    1989-01-01

    A comparative and quantitative analysis of various laser velocimeter signal processors is difficult because standards for characterizing signal bursts have not been established. This leaves the researcher to select a signal processor based only on manufacturers' claims without the benefit of direct comparison. The present paper proposes the use of a database of digitized signal bursts obtained from a laser velocimeter under various configurations as a method for directly comparing signal processors.

  16. Evaluating the Contribution of Different Item Features to the Effect Size of the Gender Difference in Three-Dimensional Mental Rotation Using Automatic Item Generation

    ERIC Educational Resources Information Center

    Arendasy, Martin E.; Sommer, Markus

    2010-01-01

    In complex three-dimensional mental rotation tasks males have been reported to score up to one standard deviation higher than females. However, this effect size estimate could be compromised by the presence of gender bias at the item level, which calls the validity of purely quantitative performance comparisons into question. We hypothesized that…

  17. Analytic Guided-Search Model of Human Performance Accuracy in Target- Localization Search Tasks

    NASA Technical Reports Server (NTRS)

    Eckstein, Miguel P.; Beutter, Brent R.; Stone, Leland S.

    2000-01-01

    Current models of human visual search have extended the traditional serial/parallel search dichotomy. Two successful models for predicting human visual search are the Guided Search model and the Signal Detection Theory model. Although these models are inherently different, it has been difficult to compare them because the Guided Search model is designed to predict response time, while Signal Detection Theory models are designed to predict performance accuracy. Moreover, current implementations of the Guided Search model require the use of Monte-Carlo simulations, a method that makes fitting the model's performance quantitatively to human data more computationally time consuming. We have extended the Guided Search model to predict human accuracy in target-localization search tasks. We have also developed analytic expressions that simplify simulation of the model to the evaluation of a small set of equations using only three free parameters. This new implementation and extension of the Guided Search model will enable direct quantitative comparisons with human performance in target-localization search experiments and with the predictions of Signal Detection Theory and other search accuracy models.
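A standard building block of the accuracy predictions this record describes is the equal-variance Gaussian Signal Detection Theory formula for M-alternative localization: P(correct) = ∫ φ(t − d′) Φ(t)^(M−1) dt, the probability that the target location's internal response exceeds those of all M − 1 distractors. The numerical sketch below implements this textbook expression, not the authors' full analytic Guided Search extension:

```python
from math import erf, exp, pi, sqrt

def phi(x):
    """Standard normal density."""
    return exp(-x * x / 2) / sqrt(2 * pi)

def Phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def p_correct(dprime, m, lo=-8.0, hi=8.0, n=4000):
    """Midpoint-rule evaluation of P(correct) = int phi(t - d') Phi(t)^(m-1) dt."""
    dt = (hi - lo) / n
    return sum(phi(lo + (i + 0.5) * dt - dprime)
               * Phi(lo + (i + 0.5) * dt) ** (m - 1)
               for i in range(n)) * dt
```

For m = 2 this reduces to the classic two-alternative result P = Φ(d′/√2), and accuracy falls as the number of locations m grows at fixed d′, the set-size effect such accuracy models aim to capture.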

  18. Accounting for the Multiple Natures of Missing Values in Label-Free Quantitative Proteomics Data Sets to Compare Imputation Strategies.

    PubMed

    Lazar, Cosmin; Gatto, Laurent; Ferro, Myriam; Bruley, Christophe; Burger, Thomas

    2016-04-01

Missing values are a genuine issue in label-free quantitative proteomics. Recent works have surveyed the different statistical methods to conduct imputation, compared them on real or simulated data sets, and recommended a list of missing-value imputation methods for proteomics applications. Although insightful, these comparisons do not account for two important facts: (i) depending on the proteomics data set, the missingness mechanism may be of different natures and (ii) each imputation method is devoted to a specific type of missingness mechanism. As a result, we believe that the question at stake is not to find the most accurate imputation method in general but instead the most appropriate one. We describe a series of comparisons that support our views: For instance, we show that a supposedly "under-performing" method (i.e., giving baseline average results), if applied at the "appropriate" time in the data-processing pipeline (before or after peptide aggregation) on a data set with the "appropriate" nature of missing values, can outperform a blindly applied, supposedly "better-performing" method (i.e., the reference method from the state-of-the-art). This leads us to formulate a few practical guidelines regarding the choice and the application of an imputation method in a proteomics context.
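The record's central point, that the appropriate imputation depends on the missingness mechanism, can be illustrated with a toy rule (a hypothetical sketch, not the authors' method): values missing completely at random (MCAR) are imputed from the observed distribution, while left-censored missing-not-at-random (MNAR) values, typical of low-abundance peptides falling below the detection limit, are imputed with a deliberately low value:

```python
from statistics import mean

# Toy mechanism-aware imputation (hypothetical rule for illustration only).
def impute(intensities, mechanism):
    """intensities: list of numbers with None marking missing values."""
    observed = [x for x in intensities if x is not None]
    if mechanism == "MCAR":
        fill = mean(observed)        # average-style imputation
    elif mechanism == "MNAR":
        fill = min(observed) / 2     # low-value (left-censored) imputation
    else:
        raise ValueError(f"unknown mechanism: {mechanism}")
    return [fill if x is None else x for x in intensities]
```

Applying the MCAR-style fill to left-censored data systematically overestimates low abundances, which is the kind of mismatch the abstract warns against.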

  19. Quantitative framework for prospective motion correction evaluation.

    PubMed

    Pannetier, Nicolas A; Stavrinos, Theano; Ng, Peter; Herbst, Michael; Zaitsev, Maxim; Young, Karl; Matson, Gerald; Schuff, Norbert

    2016-02-01

    To establish a framework for evaluating the performance of prospective motion correction (PMC) in MRI while accounting for motion variability between scans. A framework was developed to obtain quantitative comparisons between different motion correction setups, considering that varying intrinsic motion patterns between acquisitions can induce bias. Intrinsic motion was considered by replaying, in a phantom experiment, the recorded motion trajectories from subjects. T1-weighted MRI on five volunteers and two different marker fixations (mouth guard and nose bridge fixations) were used to test the framework. Two metrics were investigated to quantify the improvement of the image quality with PMC. Motion patterns vary between subjects as well as between repeated scans within a subject. This variability can be approximated by replaying the motion in a distinct phantom experiment and used as a covariate in models comparing motion corrections. We show that considering the intrinsic motion alters the statistical significance in comparing marker fixations. As an example, two marker fixations, a mouth guard and a nose bridge, were evaluated in terms of their effectiveness for PMC. A mouth guard achieved better PMC performance. Intrinsic motion patterns can bias comparisons between PMC configurations and must be considered for robust evaluations. A framework for evaluating intrinsic motion patterns in PMC is presented. © 2015 Wiley Periodicals, Inc.

  20. QUANTITATIVE ASSESSMENT OF INTEGRATED PHRENIC NERVE ACTIVITY

    PubMed Central

    Nichols, Nicole L.; Mitchell, Gordon S.

    2016-01-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability of such measurements have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1G93A Taconic rat groups (an ALS model). Meta-analysis results indicate: 1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; 2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ~1.0; and 3) consistently reduced activity in end-stage SOD1G93A rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. PMID:26724605

  1. Impact of different meander sizes on the RF transmit performance and coupling of microstrip line elements at 7 T.

    PubMed

    Rietsch, Stefan H G; Quick, Harald H; Orzada, Stephan

    2015-08-01

    In this work, the transmit performance and interelement coupling characteristics of radio frequency (RF) antenna microstrip line elements are examined in simulations and measurements. The starting point of the simulations is a microstrip line element loaded with a phantom. Meander structures are then introduced at the end of the element. The size of the meanders is increased in fixed steps and the magnetic field is optimized. In subsequent simulations, the coupling between identical elements is evaluated for different element spacings and loading conditions. Verification of the simulation results is accomplished in measurements of the coupling between two identical elements for four different meander sizes. Image acquisition on a 7 T magnetic resonance imaging (MRI) system provides qualitative and quantitative comparisons to confirm the simulation results. Simulations indicate an optimum range of meander sizes concerning coupling in all chosen geometric setups. Coupling measurement results are in good agreement with the simulations. Qualitative and quantitative comparisons of the acquired MRI images substantiate the coupling results. The coupling between coil elements in RF antenna arrays consisting of the investigated element types can be optimized under consideration of the central magnetic field strength or efficiency depending on the desired application.

  2. UPLC-MS/MS quantitative analysis and structural fragmentation study of five Parmotrema lichens from the Eastern Ghats.

    PubMed

    Kumar, K; Siva, Bandi; Sarma, V U M; Mohabe, Satish; Reddy, A Madhusudana; Boustie, Joel; Tiwari, Ashok K; Rao, N Rama; Babu, K Suresh

    2018-07-15

    Comparative phytochemical analysis of five lichen species [Parmotrema tinctorum (Delise ex Nyl.) Hale, P. andinum (Mull. Arg.) Hale, P. praesorediosum (Nyl.) Hale, P. grayanum (Hue) Hale, P. austrosinense (Zahlbr.) Hale] of the Parmotrema genus was performed using two complementary UPLC-MS systems. The first system consisted of a high-resolution UPLC-QToF-MS/MS spectrometer and the second of a UPLC-MS/MS instrument operated in Multiple Reaction Monitoring (MRM) mode for quantitative analysis of major constituents in the selected lichen species. The individual compounds (47 compounds) were identified using Q-ToF-MS/MS, by comparison of the exact molecular masses from their MS/MS spectra, of literature data, and of retention times with those of standard compounds isolated from the crude extract of the abundant lichen P. tinctorum. The analysis also allowed us to identify unknown peaks/compounds, which were further characterized by mass fragmentation studies. The quantitative MRM analysis allowed better discrimination of species according to their chemical profile. Moreover, determination of antioxidant activities (ABTS + inhibition) and Advanced Glycation Endproducts (AGEs) inhibition carried out for the crude extracts revealed a potential antiglycaemic activity, to be confirmed, for P. austrosinense. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Comparison of three‐dimensional analysis and stereological techniques for quantifying lithium‐ion battery electrode microstructures

    PubMed Central

    TAIWO, OLUWADAMILOLA O.; FINEGAN, DONAL P.; EASTWOOD, DAVID S.; FIFE, JULIE L.; BROWN, LEON D.; DARR, JAWWAD A.; LEE, PETER D.; BRETT, DANIEL J.L.

    2016-01-01

    Summary Lithium‐ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium‐ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3‐D imaging techniques, quantitative assessment of 3‐D microstructures from 2‐D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two‐dimensional (2‐D) data sets. In this study, stereological prediction and three‐dimensional (3‐D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium‐ion battery electrodes were imaged using synchrotron‐based X‐ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2‐D image sections generated from tomographic imaging, whereas direct 3‐D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2‐D image sections is bound to be associated with ambiguity and that volume‐based 3‐D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially‐dependent parameters, such as tortuosity and pore‐phase connectivity. PMID:26999804

  4. Comparison of three-dimensional analysis and stereological techniques for quantifying lithium-ion battery electrode microstructures.

    PubMed

    Taiwo, Oluwadamilola O; Finegan, Donal P; Eastwood, David S; Fife, Julie L; Brown, Leon D; Darr, Jawwad A; Lee, Peter D; Brett, Daniel J L; Shearing, Paul R

    2016-09-01

    Lithium-ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium-ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3-D imaging techniques, quantitative assessment of 3-D microstructures from 2-D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two-dimensional (2-D) data sets. In this study, stereological prediction and three-dimensional (3-D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium-ion battery electrodes were imaged using synchrotron-based X-ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2-D image sections generated from tomographic imaging, whereas direct 3-D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2-D image sections is bound to be associated with ambiguity and that volume-based 3-D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially-dependent parameters, such as tortuosity and pore-phase connectivity. © 2016 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.

  5. Development of an educational simulator system, ECCSIM-Lite, for the acquisition of basic perfusion techniques and evaluation.

    PubMed

    Ninomiya, Shinji; Tokumine, Asako; Yasuda, Toru; Tomizawa, Yasuko

    2007-01-01

    A training system with quantitative evaluation of performance for training perfusionists is valuable for preparation for rare but critical situations. A simulator system, ECCSIM-Lite, for extracorporeal circulation (ECC) training of perfusionists was developed. This system consists of a computer system containing a simulation program of the hemodynamic conditions and the training scenario with instructions, a flow sensor unit, a reservoir with a built-in water level sensor, and an ECC circuit with a soft bag representing the human body. This system is relatively simple, easy to handle, compact, and reasonably inexpensive. Quantitative information is recorded, including the changes in arterial flow by the manipulation of a knob, the changes in venous drainage by handling a clamp, and the change in reservoir level; the time courses of the above parameters are presented graphically. To increase the realism of the training, a numerical-hydraulic circulatory model was applied. Following the instruction and explanation of the scenario in the form of audio and video captions, it is possible for a trainee to undertake self-study without an instructor or a computer operator. To validate the system, a training session was given to three beginners using a simple training scenario; it was possible to record the performance of the perfusion sessions quantitatively. In conclusion, the ECCSIM-Lite system is expected to be useful for perfusion training, since quantitative information about the trainee's performance is recorded and it is possible to use the data for assessment and comparison.

  6. Comparison of strain and shear wave elastography for qualitative and quantitative assessment of breast masses in the same population.

    PubMed

    Kim, Hyo Jin; Kim, Sun Mi; Kim, Bohyoung; La Yun, Bo; Jang, Mijung; Ko, Yousun; Lee, Soo Hyun; Jeong, Heeyeong; Chang, Jung Min; Cho, Nariya

    2018-04-18

    We investigated the addition of strain and shear wave elastography to conventional ultrasonography for the qualitative and quantitative assessment of breast masses; cut-off points were determined for strain ratio, elasticity ratio, and visual score for differentiating between benign and malignant masses. In all, 108 masses from 94 patients were evaluated with strain and shear wave elastography and scored for suspicion of malignancy, visual score, strain ratio, and elasticity ratio. The diagnostic performance between ultrasonography alone and ultrasonography combined with either type of elastography was compared; cut-off points were determined for strain ratio, elasticity ratio, and visual score. Of the 108 masses, 44 were malignant and 64 were benign. The areas under the curves were significantly higher for strain and shear wave elastography-supplemented ultrasonography (0.839 and 0.826, respectively; P = 0.656) than for ultrasonography alone (0.764; P = 0.018 and 0.035, respectively). The diagnostic performances of strain and elasticity ratios were similar when differentiating benign from malignant masses. Cut-off values for strain ratio, elasticity ratio, and visual scores for strain and shear wave elastography were 2.93, 4, 3, and 2, respectively. Both forms of elastography similarly improved the diagnostic performance of conventional ultrasonography in the qualitative and quantitative assessment of breast masses.
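    The areas under the curve compared above can be computed from raw scores with the Mann-Whitney formulation of the AUC. The sketch below is a generic illustration (assuming higher scores indicate malignancy), not the study's statistical software:

```python
import numpy as np

def empirical_auc(scores_benign, scores_malignant):
    """Empirical area under the ROC curve via the Mann-Whitney statistic:
    the probability that a randomly chosen malignant case scores higher
    than a randomly chosen benign case (ties count one half)."""
    b = np.asarray(scores_benign, dtype=float)
    m = np.asarray(scores_malignant, dtype=float)
    greater = (m[:, None] > b[None, :]).sum()
    ties = (m[:, None] == b[None, :]).sum()
    return (greater + 0.5 * ties) / (len(m) * len(b))
```

    An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, which frames the reported improvement from 0.764 to roughly 0.83.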

  7. A Novel HPLC Method for the Concurrent Analysis and Quantitation of Seven Water-Soluble Vitamins in Biological Fluids (Plasma and Urine): A Validation Study and Application

    PubMed Central

    Grotzkyj Giorgi, Margherita; Howland, Kevin; Martin, Colin; Bonner, Adrian B.

    2012-01-01

    An HPLC method was developed and validated for the concurrent detection and quantitation of seven water-soluble vitamins (C, B1, B2, B5, B6, B9, B12) in biological matrices (plasma and urine). Separation was achieved at 30°C on a reversed-phase C18-A column using combined isocratic and linear gradient elution with a mobile phase consisting of 0.01% TFA aqueous and 100% methanol. Total run time was 35 minutes. Detection was performed with diode array set at 280 nm. Each vitamin was quantitatively determined at its maximum wavelength. Spectral comparison was used for peak identification in real samples (24 plasma and urine samples from abstinent alcohol-dependent males). Interday and intraday precision were <4% and <7%, respectively, for all vitamins. Recovery percentages ranged from 93% to 100%. PMID:22536136

  8. A novel HPLC method for the concurrent analysis and quantitation of seven water-soluble vitamins in biological fluids (plasma and urine): a validation study and application.

    PubMed

    Giorgi, Margherita Grotzkyj; Howland, Kevin; Martin, Colin; Bonner, Adrian B

    2012-01-01

    An HPLC method was developed and validated for the concurrent detection and quantitation of seven water-soluble vitamins (C, B(1), B(2), B(5), B(6), B(9), B(12)) in biological matrices (plasma and urine). Separation was achieved at 30°C on a reversed-phase C18-A column using combined isocratic and linear gradient elution with a mobile phase consisting of 0.01% TFA aqueous and 100% methanol. Total run time was 35 minutes. Detection was performed with diode array set at 280 nm. Each vitamin was quantitatively determined at its maximum wavelength. Spectral comparison was used for peak identification in real samples (24 plasma and urine samples from abstinent alcohol-dependent males). Interday and intraday precision were <4% and <7%, respectively, for all vitamins. Recovery percentages ranged from 93% to 100%.

  9. The impact of chief executive officer personality on top management team dynamics: one mechanism by which leadership affects organizational performance.

    PubMed

    Peterson, Randall S; Smith, D Brent; Martorana, Paul V; Owens, Pamela D

    2003-10-01

    This article explores 1 mechanism by which leader personality affects organizational performance. The authors hypothesized and tested the effects of leader personality on the group dynamics of the top management team (TMT) and of TMT dynamics on organizational performance. To test their hypotheses, the authors used the group dynamics q-sort method, which is designed to permit rigorous, quantitative comparisons of data derived from qualitative sources. Results from independent observations of chief executive officer (CEO) personality and TMT dynamics for 17 CEOs supported the authors' hypothesized relationships both between CEO personality and TMT group dynamics and between TMT dynamics and organizational performance.

  10. A generic simulation model to assess the performance of sterilization services in health establishments.

    PubMed

    Di Mascolo, Maria; Gouin, Alexia

    2013-03-01

    The work presented here aims to improve the performance of sterilization services in hospitals. We carried out a survey in a large number of health establishments in the Rhône-Alpes region in France. Based on the results of this survey and a detailed study of a specific service, we have built a generic model. The generic nature of the model relies on a common structure with a high level of detail. This model can be used to improve the performance of a specific sterilization service and/or to dimension its resources. It can also serve for quantitative comparison of performance indicators of various sterilization services.

  11. Visual enhancements in pick-and-place tasks: Human operators controlling a simulated cylindrical manipulator

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Tendick, Frank; Stark, Lawrence

    1989-01-01

    A teleoperation simulator was constructed with a vector display system, joysticks, and a simulated cylindrical manipulator, in order to quantitatively evaluate various display conditions. The first of two experiments conducted investigated the effects of perspective parameter variations on human operators' pick-and-place performance, using a monoscopic perspective display. The second experiment compared visual enhancements of the monoscopic perspective display (an added grid and reference lines) with visual enhancements of a stereoscopic display; results indicate that stereoscopy generally permits superior pick-and-place performance, but that monoscopy nevertheless allows equivalent performance when defined with appropriate perspective parameter values and adequate visual enhancements.

  12. OdorMapComparer: an application for quantitative analyses and comparisons of fMRI brain odor maps.

    PubMed

    Liu, Nian; Xu, Fuqiang; Miller, Perry L; Shepherd, Gordon M

    2007-01-01

    Brain odor maps are reconstructed flat images that describe the spatial activity patterns in the glomerular layer of the olfactory bulbs in animals exposed to different odor stimuli. We have developed a software application, OdorMapComparer, to carry out quantitative analyses and comparisons of the fMRI odor maps. This application is an open-source Windows program that first loads two odor map images being compared. It allows image transformations including scaling, flipping, rotating, and warping so that the two images can be appropriately aligned to each other. It performs simple subtraction, addition, and averaging of signals in the two images. It also provides comparative statistics including the normalized correlation (NC) and spatial correlation coefficient. Experimental studies showed that the rodent fMRI odor maps for aliphatic aldehydes displayed spatial activity patterns that are similar in gross outlines but somewhat different in specific subregions. Analyses with OdorMapComparer indicate that the similarity between odor maps decreases with increasing difference in the length of carbon chains. For example, the map of butanal is more closely related to that of pentanal (with a NC = 0.617) than to that of octanal (NC = 0.082), which is consistent with animal behavioral studies. The study also indicates that fMRI odor maps are statistically odor-specific and repeatable across both the intra- and intersubject trials. OdorMapComparer thus provides a tool for quantitative, statistical analyses and comparisons of fMRI odor maps in a fashion that is integrated with the overall odor mapping techniques.
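    The normalized correlation (NC) reported above can be written as the cosine similarity of mean-centred images. The following is a generic sketch of that statistic, not the application's source code:

```python
import numpy as np

def normalized_correlation(map_a, map_b):
    """Normalized correlation between two odor maps: mean-centre each
    image, then take the cosine of the angle between them as vectors.
    Returns a value in [-1, 1]; identical patterns give 1."""
    a = np.asarray(map_a, dtype=float).ravel()
    b = np.asarray(map_b, dtype=float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

    On this scale the reported values are easy to read: butanal vs. pentanal (NC = 0.617) share most of their spatial pattern, while butanal vs. octanal (NC = 0.082) are nearly uncorrelated.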

  13. Specification and estimation of sources of bias affecting neurological studies in PET/MR with an anatomical brain phantom

    NASA Astrophysics Data System (ADS)

    Teuho, J.; Johansson, J.; Linden, J.; Saunavaara, V.; Tolvanen, T.; Teräs, M.

    2014-01-01

    Selection of reconstruction parameters has an effect on the image quantification in PET, with an additional contribution from a scanner-specific attenuation correction method. For achieving comparable results in inter- and intra-center comparisons, any existing quantitative differences should be identified and compensated for. In this study, a comparison between PET, PET/CT and PET/MR is performed by using an anatomical brain phantom, to identify and measure the bias caused by differences in reconstruction and attenuation correction methods, especially in PET/MR. Differences were estimated by using visual, qualitative and quantitative analysis. The qualitative analysis consisted of a line profile analysis for measuring the reproduction of anatomical structures and the contribution of the number of iterations to image contrast. The quantitative analysis consisted of measurement and comparison of 10 anatomical VOIs, where the HRRT was considered as the reference. All scanners reproduced the main anatomical structures of the phantom adequately, although the image contrast on the PET/MR was inferior when using a default clinical brain protocol. Image contrast was improved by increasing the number of iterations from 2 to 5 while using 33 subsets. Furthermore, a PET/MR-specific bias was detected, which resulted in underestimation of the activity values in anatomical structures closest to the skull, due to the MR-derived attenuation map that ignores the bone. Thus, further improvements for the PET/MR reconstruction and attenuation correction could be achieved by optimization of RAMLA-specific reconstruction parameters and inclusion of bone in the attenuation template.

  14. Inter- and intra-observer agreement of BI-RADS-based subjective visual estimation of amount of fibroglandular breast tissue with magnetic resonance imaging: comparison to automated quantitative assessment.

    PubMed

    Wengert, G J; Helbich, T H; Woitek, R; Kapetas, P; Clauser, P; Baltzer, P A; Vogl, W-D; Weber, M; Meyer-Baese, A; Pinker, Katja

    2016-11-01

    To evaluate the inter-/intra-observer agreement of BI-RADS-based subjective visual estimation of the amount of fibroglandular tissue (FGT) with magnetic resonance imaging (MRI), and to investigate whether FGT assessment benefits from an automated, observer-independent, quantitative MRI measurement by comparing both approaches. Eighty women with no imaging abnormalities (BI-RADS 1 and 2) were included in this institutional review board (IRB)-approved prospective study. All women underwent un-enhanced breast MRI. Four radiologists independently assessed FGT with MRI by subjective visual estimation according to BI-RADS. Automated observer-independent quantitative measurement of FGT with MRI was performed using a previously described measurement system. Inter-/intra-observer agreements of qualitative and quantitative FGT measurements were assessed using Cohen's kappa (k). Inexperienced readers achieved moderate inter-/intra-observer agreement and experienced readers a substantial inter- and perfect intra-observer agreement for subjective visual estimation of FGT. Practice and experience reduced observer-dependency. Automated observer-independent quantitative measurement of FGT was successfully performed and revealed only fair to moderate agreement (k = 0.209-0.497) with subjective visual estimations of FGT. Subjective visual estimation of FGT with MRI shows moderate intra-/inter-observer agreement, which can be improved by practice and experience. Automated observer-independent quantitative measurements of FGT are necessary to allow a standardized risk evaluation. • Subjective FGT estimation with MRI shows moderate intra-/inter-observer agreement in inexperienced readers. • Inter-observer agreement can be improved by practice and experience. • Automated observer-independent quantitative measurements can provide reliable and standardized assessment of FGT with MRI.
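    Cohen's kappa, the agreement statistic used above, corrects raw agreement for the agreement expected by chance. A minimal two-rater implementation (an illustrative sketch with hypothetical categorical grades, not the study's statistics code) is:

```python
import numpy as np

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters assigning categorical grades:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    a = np.asarray(ratings_a)
    b = np.asarray(ratings_b)
    categories = np.union1d(a, b)
    p_o = np.mean(a == b)                                   # observed agreement
    p_e = sum(np.mean(a == c) * np.mean(b == c) for c in categories)  # chance
    return (p_o - p_e) / (1.0 - p_e)
```

    On the commonly used Landis and Koch scale, kappa of 0.41-0.60 is read as moderate and 0.61-0.80 as substantial agreement, matching the abstract's wording.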

  15. Age-class separation of blue-winged ducks

    USGS Publications Warehouse

    Hohman, W.L.; Moore, J.L.; Twedt, D.J.; Mensik, John G.; Logerwell, E.

    1995-01-01

    Accurate determination of age is of fundamental importance to population and life history studies of waterfowl and their management. Therefore, we developed quantitative methods that separate adult and immature blue-winged teal (Anas discors), cinnamon teal (A. cyanoptera), and northern shovelers (A. clypeata) during spring and summer. To assess suitability of discriminant models using 9 remigial measurements, we compared model performance (% agreement between predicted age and age assigned to birds on the basis of definitive cloacal or rectral feather characteristics) in different flyways (Mississippi and Pacific) and between years (1990-91 and 1991-92). We also applied age-classification models to wings obtained from U.S. Fish and Wildlife Service harvest surveys in the Mississippi and Central-Pacific flyways (wing-bees) for which age had been determined using qualitative characteristics (i.e., remigial markings, shape, or wear). Except for male northern shovelers, models correctly aged <90% (range 70-86%) of blue-winged ducks. Model performance varied among species and differed between sexes and years. Proportions of individuals that were correctly aged were greater for males (range 63-86%) than females (range 39-69%). Models for northern shovelers performed better in flyway comparisons within year (1991-92, La. model applied to Calif. birds, and Calif. model applied to La. birds: 90 and 94% for M, and 89 and 76% for F, respectively) than in annual comparisons within the Mississippi Flyway (1991-92 model applied to 1990-91 data: 79% for M, 50% for F). Exclusion of measurements that varied by flyway or year did not improve model performance. Quantitative methods appear to be of limited value for age separation of female blue-winged ducks. Close agreement between predicted age and age assigned to wings from the wing-bees suggests that qualitative and quantitative methods may be equally accurate for age separation of male blue-winged ducks.
We interpret annual and flyway differences in remigial measurements and reduced performance of age classification models as evidence of high variability in size of blue-winged ducks' remiges. Variability in remigial size of these and other small-bodied waterfowl may be related to nutrition during molt.
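    The discriminant models referenced above are linear classifiers on remigial measurements. A generic Fisher linear discriminant sketch (a synthetic two-measurement example with hypothetical data, not the authors' 9-variable models) looks like:

```python
import numpy as np

def fisher_lda_threshold(class_a, class_b):
    """Fisher's linear discriminant for two classes of measurement
    vectors: project onto w = Sw^-1 (mu_a - mu_b), where Sw is the
    pooled within-class scatter, and classify by the midpoint between
    the projected class means."""
    Xa = np.asarray(class_a, dtype=float)
    Xb = np.asarray(class_b, dtype=float)
    mu_a, mu_b = Xa.mean(axis=0), Xb.mean(axis=0)
    Sw = np.cov(Xa.T) * (len(Xa) - 1) + np.cov(Xb.T) * (len(Xb) - 1)
    w = np.linalg.solve(Sw, mu_a - mu_b)
    threshold = w @ (mu_a + mu_b) / 2.0
    return w, threshold   # classify as class_a when x @ w > threshold
```

    The flyway and year effects reported above correspond to the training and test samples being drawn from populations with different measurement distributions, which shifts both w and the threshold.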

  16. Drift mobility of photo-electrons in organic molecular crystals: Quantitative comparison between theory and experiment

    NASA Astrophysics Data System (ADS)

    Reineker, P.; Kenkre, V. M.; Kühne, R.

    1981-08-01

    A quantitative comparison of a simple theoretical prediction for the drift mobility of photo-electrons in organic molecular crystals, calculated within the model of coupled band-like and hopping motion, with the experiments of Schein et al. and Karl et al. on naphthalene is given.

  17. Differential reinforcement and resistance to change of divided-attention performance.

    PubMed

    Podlesnik, Christopher A; Thrailkill, Eric; Shahan, Timothy A

    2012-06-01

    Behavioral momentum theory provides a framework for understanding how conditions of reinforcement influence instrumental response strength under conditions of disruption (i.e., resistance to change). The present experiment examined resistance to change of divided-attention performance when different overall probabilities of reinforcement were arranged across two components of a multiple schedule. Pigeons responded in a delayed-matching-to-sample procedure with compound samples (color + line orientation) and element comparisons (two colors or two line orientations). Reinforcement ratios of 1:9, 1:1, and 9:1 for accurate matches on the two types of comparison trials were examined across conditions using reinforcement probabilities (color/lines) of .9/.1, .5/.5, and .1/.9 in the rich component and .18/.02, .1/.1, and .02/.18 in the lean component. Relative accuracy with color and line comparisons was an orderly function of relative reinforcement, but this relation did not depend on the overall rate of reinforcement between components. The resistance to change of divided-attention performance was greater for both trial types in the rich component with presession feeding and extinction, but not with decreases in sample duration. These findings suggest promise for the applicability of quantitative models of operant behavior to divided-attention performance, but they highlight the need to further explore conditions impacting the resistance to change of attending.

  18. Differential reinforcement and resistance to change of divided-attention performance

    PubMed Central

    Thrailkill, Eric

    2016-01-01

    Behavioral momentum theory provides a framework for understanding how conditions of reinforcement influence instrumental response strength under conditions of disruption (i.e., resistance to change). The present experiment examined resistance to change of divided-attention performance when different overall probabilities of reinforcement were arranged across two components of a multiple schedule. Pigeons responded in a delayed-matching-to-sample procedure with compound samples (color + line orientation) and element comparisons (two colors or two line orientations). Reinforcement ratios of 1:9, 1:1, and 9:1 for accurate matches on the two types of comparison trials were examined across conditions using reinforcement probabilities (color/lines) of .9/.1, .5/.5, and .1/.9 in the rich component and .18/.02, .1/.1, and .02/.18 in the lean component. Relative accuracy with color and line comparisons was an orderly function of relative reinforcement, but this relation did not depend on the overall rate of reinforcement between components. The resistance to change of divided-attention performance was greater for both trial types in the rich component with presession feeding and extinction, but not with decreases in sample duration. These findings suggest promise for the applicability of quantitative models of operant behavior to divided-attention performance, but they highlight the need to further explore conditions impacting the resistance to change of attending. PMID:22038737

  19. A revision of the gamma-evaluation concept for the comparison of dose distributions.

    PubMed

    Bakai, Annemarie; Alber, Markus; Nüsslin, Fridtjof

    2003-11-07

    A method for the quantitative four-dimensional (4D) evaluation of discrete dose data based on gradient-dependent local acceptance thresholds is presented. The method takes into account the local dose gradients of a reference distribution for critical appraisal of misalignment and collimation errors. These contribute to the maximum tolerable dose error at each evaluation point to which the local dose differences between comparison and reference data are compared. As shown, the presented concept is analogous to the gamma-concept of Low et al (1998a Med. Phys. 25 656-61) if extended to (3+1) dimensions. The pointwise dose comparisons of the reformulated concept are easier to perform and speed up the evaluation process considerably, especially for fine-grid evaluations of 3D dose distributions. The occurrences of false negative indications due to the discrete nature of the data are reduced with the method. The presented method was applied to film-measured, clinical data and compared with gamma-evaluations. 4D and 3D evaluations were performed. Comparisons prove that 4D evaluations have to be given priority, especially if complex treatment situations are verified, e.g., non-coplanar beam configurations.
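    For reference, the original gamma concept of Low et al combines a dose-difference criterion ΔD with a distance-to-agreement criterion Δd. The sketch below is a 1-D illustration of that original formulation with assumed 3 mm / 3% tolerances, not the authors' reformulated gradient-dependent local-threshold method:

```python
import numpy as np

def gamma_index(x, dose_ref, dose_eval, dta=3.0, dd=0.03):
    """1-D gamma index: for each reference point, the minimum over all
    evaluated points of sqrt((dx/dta)^2 + (dD/dd)^2), with dta the
    distance-to-agreement tolerance (mm) and dd the dose tolerance
    (fraction of normalized dose). A point passes when gamma <= 1."""
    x = np.asarray(x, dtype=float)
    dose_ref = np.asarray(dose_ref, dtype=float)
    dose_eval = np.asarray(dose_eval, dtype=float)
    gammas = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        spatial = ((x - xi) / dta) ** 2
        dose = ((dose_eval - di) / dd) ** 2
        gammas[i] = np.sqrt(np.min(spatial + dose))
    return gammas
```

    The abstract's pointwise reformulation avoids this minimum search over the evaluated distribution, which is what makes fine-grid 3D evaluations faster.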

  20. Cerebral Metabolic Rate of Oxygen (CMRO2) Mapping by Combining Quantitative Susceptibility Mapping (QSM) and Quantitative Blood Oxygenation Level-Dependent Imaging (qBOLD).

    PubMed

    Cho, Junghun; Kee, Youngwook; Spincemaille, Pascal; Nguyen, Thanh D; Zhang, Jingwei; Gupta, Ajay; Zhang, Shun; Wang, Yi

    2018-03-07

    To map the cerebral metabolic rate of oxygen (CMRO2) by estimating the oxygen extraction fraction (OEF) from gradient echo imaging (GRE) using phase and magnitude of the GRE data. 3D multi-echo gradient echo imaging and perfusion imaging with arterial spin labeling were performed in 11 healthy subjects. CMRO2 and OEF maps were reconstructed by joint quantitative susceptibility mapping (QSM) to process GRE phases and quantitative blood oxygen level-dependent (qBOLD) modeling to process GRE magnitudes. Comparisons with QSM and qBOLD alone were performed using ROI analysis, paired t-tests, and Bland-Altman plots. The average CMRO2 values in cortical gray matter across subjects were 140.4 ± 14.9, 134.1 ± 12.5, and 184.6 ± 17.9 μmol/100 g/min, with corresponding OEFs of 30.9 ± 3.4%, 30.0 ± 1.8%, and 40.9 ± 2.4% for methods based on QSM, qBOLD, and QSM+qBOLD, respectively. QSM+qBOLD provided the highest CMRO2 contrast between gray and white matter, more uniform OEF than QSM, and less noisy OEF than qBOLD. Quantitative CMRO2 mapping that fits the entire complex GRE data is feasible by combining QSM analysis of phase and qBOLD analysis of magnitude. © 2018 International Society for Magnetic Resonance in Medicine.
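Whichever model supplies the OEF estimate, the CMRO2 map itself follows from Fick's principle, CMRO2 = CBF × OEF × Ca. A minimal sketch, where the arterial oxygen content Ca of about 8 µmol O2 per ml blood is an assumed typical value, not one taken from the paper:

```python
def cmro2_fick(cbf, oef, ca_umol_per_ml=8.0):
    """Fick's principle: CMRO2 = CBF * OEF * arterial O2 content (Ca).
    cbf in ml/100 g/min, oef as a fraction; result in umol/100 g/min."""
    return cbf * oef * ca_umol_per_ml

# A typical gray-matter CBF of 50 ml/100 g/min with OEF = 0.35
# gives 140 umol/100 g/min, the same order as the values reported above.
print(cmro2_fick(50.0, 0.35))  # -> 140.0
```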

  1. Semi-quantitative methods yield greater inter- and intraobserver agreement than subjective methods for interpreting 99m technetium-hydroxymethylene-diphosphonate uptake in equine thoracic processi spinosi.

    PubMed

    van Zadelhoff, Claudia; Ehrle, Anna; Merle, Roswitha; Jahn, Werner; Lischer, Christoph

    2018-05-09

    Scintigraphy is a standard diagnostic method for evaluating horses with back pain due to suspected thoracic processus spinosus pathology. Lesion detection is based on subjective or semi-quantitative assessments of increased uptake. This retrospective, analytical study aimed to compare semi-quantitative and subjective methods in the evaluation of scintigraphic images of the processi spinosi in the equine thoracic spine. Scintigraphic images of 20 Warmblood horses, presented for assessment of orthopedic conditions between 2014 and 2016, were included in the study. Randomized, blinded image evaluation was performed by 11 veterinarians using subjective and semi-quantitative methods. Subjective grading was performed for the analysis of red-green-blue and grayscale scintigraphic images, which were presented in full size or as masked images. For the semi-quantitative assessment, observers placed regions of interest over each processus spinosus. The uptake ratio of each processus spinosus relative to a reference region of interest was determined. Subsequently, a modified semi-quantitative calculation was developed whereby only the highest counts-per-pixel for a specified number of pixels were processed. Inter- and intraobserver agreement was calculated using intraclass correlation coefficients. Inter- and intraobserver intraclass correlation coefficients were 41.65% and 71.39%, respectively, for the subjective image assessment. Additionally, a correlation between intraobserver agreement, experience, and grayscale images was identified. Inter- and intraobserver agreement increased significantly with semi-quantitative analysis (97.35% and 98.36%, respectively) and with the modified semi-quantitative calculation (98.61% and 98.82%, respectively). The proposed modified semi-quantitative technique showed higher inter- and intraobserver agreement than the other methods, which makes it a useful tool for the analysis of scintigraphic images. The association of the findings from this study with clinical and radiological examinations requires further investigation. © 2018 American College of Veterinary Radiology.
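The semi-quantitative uptake ratio and its modified top-pixels variant described above can be sketched as follows; the NumPy formulation and names are assumptions, not the authors' implementation:

```python
import numpy as np

def uptake_ratio(roi_counts, reference_counts, top_n=None):
    """Mean counts-per-pixel in a processus spinosus ROI divided by the
    mean in a reference ROI. If top_n is given, only the top_n highest
    counts in the ROI are used (the modified semi-quantitative variant)."""
    roi = np.sort(np.ravel(roi_counts))[::-1]  # descending counts-per-pixel
    if top_n is not None:
        roi = roi[:top_n]
    return float(roi.mean() / np.mean(reference_counts))
```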

  2. Improving Large Cetacean Implantable Satellite Tag Designs to Maximize Tag Robustness and Minimize Health Effects to Individual Animals

    DTIC Science & Technology

    2014-09-30

    ...relatively small for quantitative comparisons and some of the deployed tags are still transmitting, their overall performance appears to have improved.

  3. A Small-Scale 3D Imaging Platform for Algorithm Performance Evaluation

    DTIC Science & Technology

    2007-06-01

    ...between their acquisitions will form the basis for stereo analysis, and thus a 3D perception of the observed scene. ...incident light versus 2% on a photographic-type film [6]. The CCD camera then transforms these patterns of light into electrical signals. ...sources of lux or illumination. Table 1 (Lux/Illumination Quantitative Comparisons): Starlight, 0.00005 lux; Moonlight, 1 lux; 10...

  4. Work Stress Interventions in Hospital Care: Effectiveness of the DISCovery Method

    PubMed Central

    Niks, Irene; Gevers, Josette

    2018-01-01

    Effective interventions to prevent work stress and to improve health, well-being, and performance of employees are of the utmost importance. This quasi-experimental intervention study presents a specific method for diagnosis of psychosocial risk factors at work and subsequent development and implementation of tailored work stress interventions, the so-called DISCovery method. This method aims at improving employee health, well-being, and performance by optimizing the balance between job demands, job resources, and recovery from work. The aim of the study is to quantitatively assess the effectiveness of the DISCovery method in hospital care. Specifically, we used a three-wave longitudinal, quasi-experimental multiple-case study approach with intervention and comparison groups in health care work. Positive changes were found for members of the intervention groups, relative to members of the corresponding comparison groups, with respect to targeted work-related characteristics and targeted health, well-being, and performance outcomes. Overall, results lend support for the effectiveness of the DISCovery method in hospital care. PMID:29438350

  5. Statistical comparison of various interpolation algorithms for reconstructing regional grid ionospheric maps over China

    NASA Astrophysics Data System (ADS)

    Li, Min; Yuan, Yunbin; Wang, Ningbo; Li, Zishen; Liu, Xifeng; Zhang, Xiao

    2018-07-01

    This paper presents a quantitative comparison of several widely used interpolation algorithms, i.e., Ordinary Kriging (OrK), Universal Kriging (UnK), planar fit, and Inverse Distance Weighting (IDW), based on a grid-based single-shell ionosphere model over China. The experimental data were collected from the Crustal Movement Observation Network of China (CMONOC) and the International GNSS Service (IGS), covering days of year 60-90 in 2015. The quality of these interpolation algorithms was assessed by cross-validation in terms of both ionospheric correction performance and Single-Frequency (SF) Precise Point Positioning (PPP) accuracy on an epoch-by-epoch basis. The results indicate that the interpolation models perform better at mid-latitudes than at low latitudes. For the China region, OrK and UnK perform somewhat better than the planar fit and IDW models for estimating ionospheric delay and for positioning. In addition, the computational efficiencies of the IDW and planar fit models are better than those of OrK and UnK.
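Of the interpolators compared, IDW is the simplest to state: a weighted mean of station values with weights proportional to 1/d^p. A minimal sketch (the power parameter and the two-dimensional station layout are illustrative assumptions):

```python
import numpy as np

def idw(stations, values, query, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: weighted mean of station values,
    with weights 1/d**power. stations: (N, 2) array; query: (2,) array."""
    d = np.linalg.norm(stations - query, axis=1)
    if d.min() < eps:                 # query coincides with a station
        return float(values[d.argmin()])
    w = 1.0 / d**power
    return float(np.sum(w * values) / np.sum(w))
```

OrK and UnK replace these fixed distance-based weights with weights derived from a fitted covariance model, which is where both their accuracy advantage and their extra computational cost come from.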

  6. Work Stress Interventions in Hospital Care: Effectiveness of the DISCovery Method.

    PubMed

    Niks, Irene; de Jonge, Jan; Gevers, Josette; Houtman, Irene

    2018-02-13

    Effective interventions to prevent work stress and to improve health, well-being, and performance of employees are of the utmost importance. This quasi-experimental intervention study presents a specific method for diagnosis of psychosocial risk factors at work and subsequent development and implementation of tailored work stress interventions, the so-called DISCovery method. This method aims at improving employee health, well-being, and performance by optimizing the balance between job demands, job resources, and recovery from work. The aim of the study is to quantitatively assess the effectiveness of the DISCovery method in hospital care. Specifically, we used a three-wave longitudinal, quasi-experimental multiple-case study approach with intervention and comparison groups in health care work. Positive changes were found for members of the intervention groups, relative to members of the corresponding comparison groups, with respect to targeted work-related characteristics and targeted health, well-being, and performance outcomes. Overall, results lend support for the effectiveness of the DISCovery method in hospital care.

  7. Comparative evaluation of performance measures for shading correction in time-lapse fluorescence microscopy.

    PubMed

    Liu, L; Kan, A; Leckie, C; Hodgkin, P D

    2017-04-01

    Time-lapse fluorescence microscopy is a valuable technology in cell biology, but it suffers from the inherent problem of intensity inhomogeneity due to uneven illumination or camera nonlinearity, known as shading artefacts. This will lead to inaccurate estimates of single-cell features such as average and total intensity. Numerous shading correction methods have been proposed to remove this effect. In order to compare the performance of different methods, many quantitative performance measures have been developed. However, there is little discussion about which performance measure should be generally applied for evaluation on real data, where the ground truth is absent. In this paper, the state-of-the-art shading correction methods and performance evaluation methods are reviewed. We implement 10 popular shading correction methods on two artificial datasets and four real ones. In order to make an objective comparison between those methods, we employ a number of quantitative performance measures. Extensive validation demonstrates that the coefficient of joint variation (CJV) is the most applicable measure in time-lapse fluorescence images. Based on this measure, we have proposed a novel shading correction method that performs better compared to well-established methods for a range of real data tested. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
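The CJV measure singled out by the validation is straightforward to compute between two tissue-intensity classes; a minimal sketch, assuming the common two-class formulation from the intensity-inhomogeneity literature (lower values indicate better correction):

```python
import numpy as np

def cjv(class_a, class_b):
    """Coefficient of joint variation between two intensity classes:
    (sigma_a + sigma_b) / |mu_a - mu_b|. Shading artefacts broaden each
    class and shrink their separation, so correction should lower CJV."""
    return float((np.std(class_a) + np.std(class_b))
                 / abs(np.mean(class_a) - np.mean(class_b)))
```

Because it needs only the intensities of two segmented classes and no ground-truth shading field, CJV is usable on real data where the true illumination pattern is unknown.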

  8. Satellite Derived Volcanic Ash Product Inter-Comparison in Support to SCOPE-Nowcasting

    NASA Astrophysics Data System (ADS)

    Siddans, Richard; Thomas, Gareth; Pavolonis, Mike; Bojinski, Stephan

    2016-04-01

    In support of aeronautical meteorological services, WMO organized a satellite-based volcanic ash retrieval algorithm inter-comparison activity, to improve the consistency of quantitative volcanic ash products from satellites, under the Sustained, Coordinated Processing of Environmental Satellite Data for Nowcasting (SCOPE-Nowcasting) initiative (http://www.wmo.int/pages/prog/sat/scope-nowcasting_en.php). The aims of the intercomparison were as follows: 1. Select cases (Sarychev Peak 2009, Eyjafjallajökull 2010, Grímsvötn 2011, Puyehue-Cordón Caulle 2011, Kirishimayama 2011, Kelut 2014), and quantify the differences between satellite-derived volcanic ash cloud properties derived from different techniques and sensors; 2. Establish a basic validation protocol for satellite-derived volcanic ash cloud properties; 3. Document the strengths and weaknesses of different remote sensing approaches as a function of satellite sensor; 4. Standardize the units and quality flags associated with volcanic cloud geophysical parameters; 5. Provide recommendations to Volcanic Ash Advisory Centers (VAACs) and other users on how best to utilize quantitative satellite products in operations; 6. Create a "road map" for future volcanic ash related scientific developments and inter-comparison/validation activities that can also be applied to SO2 clouds and emergent volcanic clouds. Volcanic ash satellite remote sensing experts from operational and research organizations were encouraged to participate in the inter-comparison activity, to establish the plans for the inter-comparison and to submit data sets. RAL was contracted by EUMETSAT to perform a systematic inter-comparison of all submitted datasets, and results were reported at the WMO International Volcanic Ash Inter-comparison Meeting held on 29 June - 2 July 2015 in Madison, WI, USA (http://cimss.ssec.wisc.edu/meetings/vol_ash14). 
26 different data sets were submitted, from a range of passive imagers and spectrometers, and these were inter-compared against each other and against validation data such as CALIPSO lidar, ground-based lidar, and aircraft observations. Results of the comparison exercise will be presented together with the conclusions and recommendations arising from the activity.

  9. Vertebral heights and ratios are not only race-specific, but also gender- and region-specific: establishment of reference values for mainland Chinese.

    PubMed

    Ning, Lei; Song, Li-Jiang; Fan, Shun-Wu; Zhao, Xing; Chen, Yi-Lei; Li, Zhao-Zhi; Hu, Zi-Ang

    2017-10-11

    This study established gender-specific reference values in mainland Chinese (MC) and is important for quantitative morphometry in the diagnosis and epidemiological study of osteoporotic vertebral compressive fracture. Comparisons of reference values among different racial populations are then performed to demonstrate the MC-specific characteristics. Osteoporotic vertebral compressive fracture (OVCF) is a common complication of osteoporosis in the elderly population. Clinical diagnosis and epidemiological study of OVCF often employ quantitative morphometry, which relies heavily on the comparison of patients' vertebral parameters to existing reference values derived from the normal population. Thus, reference values are crucial in clinical diagnosis. To our knowledge, this is the first study to establish reference values of the mainland Chinese (MC) for quantitative morphometry. Vertebral heights, including anterior (Ha), middle (Hm), and posterior (Hp) heights and predicted posterior height (pp), from T4 to L5 were obtained, and ratios of Ha/Hp, Hm/Hp, and Hp/pp were calculated from 585 MC (both female and male) for establishing reference values and subsequent comparisons with other studies. Vertebral heights increased progressively from T4 to L3 but then decreased in L4 and L5. Both genders showed similar ratios of vertebral dimensions, but male vertebrae were statistically larger than those of females (P < 0.01). Vertebral size of the MC population was smaller than that of the US and UK populations, but was surprisingly larger than that of Hong Kong Chinese, although these two are commonly considered one race. Data from different racial populations showed similar dimensional ratios in all vertebrae. We established gender-specific reference values for MC. Our results also indicate the necessity of establishing reference values that are not only race- and gender-specific, but also population- or region-specific for accurate quantitative morphometric assessment of OVCF.

  10. Landmark-Based 3D Elastic Registration of Pre- and Postoperative Liver CT Data

    NASA Astrophysics Data System (ADS)

    Lange, Thomas; Wörz, Stefan; Rohr, Karl; Schlag, Peter M.

    The qualitative and quantitative comparison of pre- and postoperative image data is an important means of validating computer-assisted surgical procedures. Due to deformations after surgery, a non-rigid registration scheme is a prerequisite for a precise comparison. Interactive landmark-based schemes are a suitable approach. Incorporating a priori knowledge about the anatomical structures to be registered may help to reduce interaction time and improve accuracy. For pre- and postoperative CT data of oncological liver resections, the intrahepatic vessels are suitable anatomical structures. In addition to using landmarks at vessel branchings, we here introduce quasi-landmarks at vessel segments with anisotropic localization precision. An experimental comparison of interpolating thin-plate splines (TPS) and Gaussian elastic body splines (GEBS) as well as approximating GEBS on both types of landmarks is performed.

  11. Quantitative Tools for Examining the Vocalizations of Juvenile Songbirds

    PubMed Central

    Wellock, Cameron D.; Reeke, George N.

    2012-01-01

    The singing of juvenile songbirds is highly variable and not well stereotyped, a feature that makes it difficult to analyze with existing computational techniques. We present here a method suitable for analyzing such vocalizations, windowed spectral pattern recognition (WSPR). Rather than performing pairwise sample comparisons, WSPR measures the typicality of a sample against a large sample set. We also illustrate how WSPR can be used to perform a variety of tasks, such as sample classification, song ontogeny measurement, and song variability measurement. Finally, we present a novel measure, based on WSPR, for quantifying the apparent complexity of a bird's singing. PMID:22701474

  12. What good is SWIR? Passive day comparison of VIS, NIR, and SWIR

    NASA Astrophysics Data System (ADS)

    Driggers, Ronald G.; Hodgkin, Van; Vollmerhausen, Richard

    2013-06-01

    This paper is the first of three papers on the military benefits of SWIR imaging. It describes the benefits associated with passive daytime operations through comparisons of the SWIR, NIR, and VIS bands and sensors. The paper includes quantitative findings from previously published papers, analysis of open-source data, summaries of various expert analyses, and calculations of notional system performance. We did not accept anecdotal findings as evidence of benefit. Topics include haze and fog penetration, atmospheric transmission, cloud and smoke penetration, target and background contrasts, spectral discrimination, turbulence degradation, and long-range target identification. The second and third papers will address passive night imaging and active night imaging.

  13. Thermal comparison of buried-heterostructure and shallow-ridge lasers

    NASA Astrophysics Data System (ADS)

    Rustichelli, V.; Lemaître, F.; Ambrosius, H. P. M. M.; Brenot, R.; Williams, K. A.

    2018-02-01

    We present finite difference thermal modeling to predict temperature distribution, heat flux, and thermal resistance inside lasers with different waveguide geometries. We provide a quantitative experimental and theoretical comparison of the thermal behavior of shallow-ridge (SR) and buried-heterostructure (BH) lasers. We investigate the influence of a split heat source to describe p-layer Joule heating and nonradiative energy loss in the active layer, and of heat-sinking from the top as well as the bottom, when quantifying thermal impedance. From both measured values and numerical modeling we can quantify the thermal resistance for BH and SR lasers, showing an improvement in thermal resistance from 50 K/W to 30 K/W for otherwise equivalent BH laser designs.
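For intuition about the quoted 30-50 K/W range: in a steady-state 1-D approximation, the thermal resistance of a layer stack is the series sum of t_i/(k_i A). A minimal sketch; the layer thicknesses, conductivities, and footprint below are illustrative assumptions, not the modeled laser structure:

```python
def series_thermal_resistance(layers, area_m2):
    """Series 1-D conduction: R_th = sum(thickness / (conductivity * area)).
    layers: iterable of (thickness_m, conductivity_W_per_mK) tuples."""
    return sum(t / (k * area_m2) for t, k in layers)

# e.g. 2 um of InP (k ~ 68 W/m/K) plus 1 um of a low-conductivity
# (k ~ 0.5 W/m/K) planarization layer over a 100 um x 500 um footprint:
r = series_thermal_resistance([(2e-6, 68.0), (1e-6, 0.5)], 100e-6 * 500e-6)
```

Even this crude stack lands at a few tens of K/W, the same order as the measured values; note how a single low-conductivity layer dominates the total, which is why waveguide geometry matters so much here.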

  14. Sub-band denoising and spline curve fitting method for hemodynamic measurement in perfusion MRI

    NASA Astrophysics Data System (ADS)

    Lin, Hong-Dun; Huang, Hsiao-Ling; Hsu, Yuan-Yu; Chen, Chi-Chen; Chen, Ing-Yi; Wu, Liang-Chi; Liu, Ren-Shyan; Lin, Kang-Ping

    2003-05-01

    In clinical research, non-invasive MR perfusion imaging is capable of investigating brain perfusion phenomena via various hemodynamic measurements, such as cerebral blood volume (CBV), cerebral blood flow (CBF), and mean transit time (MTT). These hemodynamic parameters are useful in diagnosing brain disorders such as stroke, infarction, and peri-infarct ischemia by further semi-quantitative analysis. However, the accuracy of quantitative analysis is usually affected by poor signal-to-noise-ratio image quality. In this paper, we propose a hemodynamic measurement method based upon sub-band denoising and spline curve fitting processes to improve image quality for better hemodynamic quantitative analysis results. Ten sets of perfusion MRI data and corresponding PET images were used to validate the performance. For quantitative comparison, we evaluated the gray/white matter CBF ratio. As a result, the mean gray-to-white-matter CBF ratio from the hemodynamic semi-quantitative analysis is 2.10 +/- 0.34. The ratio evaluated for brain tissues in perfusion MRI is comparable to that of the PET technique, with less than 1% difference on average. Furthermore, the method features excellent noise reduction and boundary preservation in image processing, and short hemodynamic measurement time.

  15. Velocity Measurement in Carotid Artery: Quantitative Comparison of Time-Resolved 3D Phase-Contrast MRI and Image-based Computational Fluid Dynamics

    PubMed Central

    Sarrami-Foroushani, Ali; Nasr Esfahany, Mohsen; Nasiraei Moghaddam, Abbas; Saligheh Rad, Hamidreza; Firouznia, Kavous; Shakiba, Madjid; Ghanaati, Hossein; Wilkinson, Iain David; Frangi, Alejandro Federico

    2015-01-01

    Background: Understanding the hemodynamic environment in vessels is important for understanding the mechanisms leading to vascular pathologies. Objectives: The three-dimensional velocity vector field in the carotid bifurcation is visualized using time-resolved 3D phase-contrast magnetic resonance imaging (TR 3D PC MRI) and computational fluid dynamics (CFD). This study aimed to present a qualitative and quantitative comparison of the velocity vector field obtained by each technique. Subjects and Methods: MR imaging was performed on a normal 30-year-old male subject. TR 3D PC MRI was performed on a 3 T scanner to measure velocity in the carotid bifurcation. A 3D anatomical model for CFD was created using images obtained from time-of-flight MR angiography. The velocity vector field in the carotid bifurcation was predicted using the CFD and PC MRI techniques. A statistical analysis was performed to assess the agreement between the two methods. Results: Although the main flow patterns were the same for both techniques, CFD showed greater resolution in mapping the secondary and circulating flows. Overall root mean square (RMS) errors for all the corresponding data points in PC MRI and CFD were 14.27% at peak systole and 12.91% at end diastole, relative to the maximum velocity measured at each cardiac phase. Bland-Altman plots showed very good agreement between the two techniques. However, this study was not aimed at validating either method; instead, their consistency was assessed to accentuate the similarities and differences between time-resolved PC MRI and CFD. Conclusion: Both techniques provided quantitatively consistent results for in vivo velocity vector fields in the right internal carotid artery (RCA). PC MRI represented a good estimation of the main flow patterns inside the vasculature, which seems to be acceptable for clinical use. However, the limitations of each technique should be considered while interpreting results. PMID:26793288
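The overall error figure quoted (RMS difference over corresponding data points, normalized by the maximum velocity at each cardiac phase) can be sketched as follows; the function name is an assumption:

```python
import numpy as np

def rel_rms_error(v_mri, v_cfd):
    """RMS difference between corresponding velocity samples, expressed
    as a percentage of the maximum measured velocity in that phase."""
    rms = np.sqrt(np.mean((v_mri - v_cfd) ** 2))
    return float(100.0 * rms / np.max(np.abs(v_mri)))
```

Normalizing per cardiac phase keeps the systolic and diastolic figures comparable despite the large difference in peak velocity between the two phases.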

  16. Functional ankle instability as a risk factor for osteoarthritis: using T2-mapping to analyze early cartilage degeneration in the ankle joint of young athletes.

    PubMed

    Golditz, T; Steib, S; Pfeifer, K; Uder, M; Gelse, K; Janka, R; Hennig, F F; Welsch, G H

    2014-10-01

    The aim of this study was to investigate, using T2-mapping, the impact of functional instability in the ankle joint on the development of early cartilage damage. Ethical approval for this study was provided. Thirty-six volunteers from the university sports program were divided into three groups according to their ankle status: functional ankle instability (FAI, initial ankle sprain with residual instability); ankle sprain Copers (initial sprain, without residual instability); and controls (without a history of ankle injuries). Quantitative T2-mapping magnetic resonance imaging (MRI) was performed at the beginning ('early-unloading') and at the end ('late-unloading') of the MR examination, with a mean time span of 27 min. Zonal region-of-interest T2-mapping was performed on the talar and tibial cartilage in the deep and superficial layers. The inter-group comparisons of T2-values were analyzed using paired and unpaired t-tests. Statistical analysis of variance was performed. T2-values showed significant to highly significant differences in 11 of 12 regions across the groups. In early-unloading, the FAI group showed a significant increase in quantitative T2-values in the medial talar regions (P = 0.008, P = 0.027), whereas the Coper group showed this enhancement in the central-lateral regions (P = 0.05). In particular, the comparison of early-unloading to late-unloading values revealed significantly decreasing T2-values over time laterally and significantly increasing T2-values medially in the FAI group, which were not present in the Coper or control group. Functional instability causes unbalanced loading in the ankle joint, resulting in cartilage alterations as assessed by quantitative T2-mapping. This approach can visualize and localize early cartilage abnormalities, possibly enabling specific treatment options to prevent osteoarthritis in young athletes. Copyright © 2014 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  17. Housekeeping genes as internal standards: use and limits.

    PubMed

    Thellin, O; Zorzi, W; Lakaye, B; De Borman, B; Coumans, B; Hennen, G; Grisar, T; Igout, A; Heinen, E

    1999-10-08

    Quantitative studies are commonly performed in biomedical research to compare RNA expression under different experimental or clinical conditions. These quantifications are performed through comparison to the expression of housekeeping gene transcripts such as glyceraldehyde-3-phosphate dehydrogenase (G3PDH), albumin, actins, tubulins, cyclophilin, hypoxanthine phosphoribosyltransferase (HPRT), and L32. 28S and 18S rRNAs are also used as internal standards. In this paper, it is recalled that the commonly used internal standards can vary quantitatively in response to various factors. Possible variations are illustrated using three experimental examples. Preferred types of internal standards are then proposed for each of these samples, and thereafter the general procedure for choosing an internal standard and managing its use is discussed.

  18. Clinical application of microsampling versus conventional sampling techniques in the quantitative bioanalysis of antibiotics: a systematic review.

    PubMed

    Guerra Valero, Yarmarly C; Wallis, Steven C; Lipman, Jeffrey; Stove, Christophe; Roberts, Jason A; Parker, Suzanne L

    2018-03-01

    Conventional sampling techniques for clinical pharmacokinetic studies often require the removal of large blood volumes from patients. This can result in a physiological or emotional burden, particularly for neonates or pediatric patients. Antibiotic pharmacokinetic studies are typically performed on healthy adults or general ward patients. These may not account for alterations to a patient's pathophysiology and can lead to suboptimal treatment. Microsampling offers an important opportunity for clinical pharmacokinetic studies in vulnerable patient populations, where smaller sample volumes can be collected. This systematic review provides a description of currently available microsampling techniques and an overview of studies reporting the quantitation and validation of antibiotics using microsampling. A comparison of microsampling to conventional sampling in clinical studies is included.

  19. PET imaging and quantitation of Internet-addicted patients and normal controls

    NASA Astrophysics Data System (ADS)

    Jeong, Ha-Kyu; Kim, Hee-Joung; Jung, Haijo; Son, Hye-Kyung; Kim, Dong-Hyeon; Yun, Mijin; Shin, Yee-Jin; Lee, Jong-Doo

    2002-04-01

    Internet-addicted patients (IAPs) have increased greatly in number as Internet games have become very popular in daily life. The purpose of this study was to investigate regional brain activation patterns associated with excessive use of Internet games in adolescents. Six normal controls (NCs) and eight IAPs, classified as the addiction group by an adapted version of DSM-IV for pathologic gambling, participated. 18F-FDG PET studies were performed for all adolescents at rest and in an activated condition after 20 minutes of each subject's favorite Internet game. To investigate quantitative metabolic differences between the groups, all possible combinations of group comparison were carried out using Statistical Parametric Mapping (SPM 99). Regional brain activation foci were identified in Talairach coordinates. SPM results showed increased metabolic activation in the occipital lobes for both groups. Higher metabolism was seen at the resting condition in IAPs than in NCs. IAPs showed patterns of regional brain metabolic activation different from those of NCs. This suggests that addictive use of Internet games may result in functional alteration of the developing brain in adolescents.

  20. Implications of the Java language on computer-based patient records.

    PubMed

    Pollard, D; Kucharz, E; Hammond, W E

    1996-01-01

    The growth of the utilization of the World Wide Web (WWW) as a medium for the delivery of computer-based patient records (CBPR) has created a new paradigm in which clinical information may be delivered. Until recently the authoring tools and environment for application development on the WWW have been limited to Hyper Text Markup Language (HTML) utilizing common gateway interface scripts. While, at times, this provides an effective medium for the delivery of CBPR, it is a less than optimal solution. The server-centric dynamics and low levels of interactivity do not provide for a robust application which is required in a clinical environment. The emergence of Sun Microsystems' Java language is a solution to the problem. In this paper we examine the Java language and its implications to the CBPR. A quantitative and qualitative assessment was performed. The Java environment is compared to HTML and Telnet CBPR environments. Qualitative comparisons include level of interactivity, server load, client load, ease of use, and application capabilities. Quantitative comparisons include data transfer time delays. The Java language has demonstrated promise for delivering CBPRs.

  1. [Free crystalline silica: a comparison of methods for its determination in total dust].

    PubMed

    Maciejewska, Aleksandra; Szadkowska-Stańczyk, Irena; Kondratowicz, Grzegorz

    2005-01-01

    The major objective of the study was to compare and investigate the usefulness of quantitative analyses of free crystalline silica (FCS) for the assessment of dust exposure in samples of total dust of varied composition, using three methods: the chemical method in common use in Poland, infrared spectrometry (FT-IR), and X-ray powder diffraction (XRD). Mineral composition and FCS content were investigated in 9 laboratory samples of raw materials, materials, and industrial wastes, containing from about 2 to over 80% crystalline silica and reduced to particle sizes corresponding to total dust. Sample components were identified using the XRD and FT-IR methods. Ten independent determinations of FCS with each of the three study methods were performed on the dust samples. An analysis of linear correlation was applied to investigate the interrelationship between mean FCS determinations. In the analyzed dust samples, along with silica there were numerous minerals that interfere with silica during quantitative analysis. Comparison of mean results of FCS determinations showed that the results obtained using the FT-IR method were 12-13% lower than those obtained with the two other methods. However, the differences observed were within the limits of variability associated with the precision of the results and their dependence on the reference materials used. Assessment of occupational exposure to dusts containing crystalline silica can be performed on the basis of quantitative analysis of FCS in total dust using any of the compared methods. The FT-IR method is most appropriate for FCS determination in samples with a small amount of silica or collected at low dust concentrations; the XRD method for the analysis of multicomponent samples; and the chemical method in the case of medium and high FCS contents in samples or high concentrations of dusts in the work environment.

  2. Comparison analysis between filtered back projection and algebraic reconstruction technique on microwave imaging

    NASA Astrophysics Data System (ADS)

    Ramadhan, Rifqi; Prabowo, Rian Gilang; Aprilliyani, Ria; Basari

    2018-02-01

    The number of victims of acute cancers and tumors grows each year, and cancer has become one of the leading causes of human death in the world. Cancer or tumor tissue cells are cells that grow abnormally, taking over and damaging the surrounding tissues. Cancers or tumors have no definite symptoms in their early stages and can attack tissues deep inside the body, where they are not identifiable by visual human observation. Therefore, an early detection system that is cheap, quick, simple, and portable is essentially required to anticipate the further development of a cancer or tumor. Among the available modalities, microwave imaging is considered a cheaper, simpler, and more portable method. There are at least two simple image reconstruction algorithms, i.e. Filtered Back Projection (FBP) and the Algebraic Reconstruction Technique (ART), which have been adopted in some common modalities. In this paper, both algorithms are compared by reconstructing the image of an artificial tissue model (i.e. a phantom) with two different dielectric distributions. We addressed two performance comparisons, namely qualitative and quantitative analysis. Qualitative analysis includes the smoothness of the image and the success in distinguishing dielectric differences by observing the image with the human eye. Quantitative analysis includes histogram, Structural Similarity Index (SSIM), Mean Squared Error (MSE), and Peak Signal-to-Noise Ratio (PSNR) calculations. As a result, the quantitative parameters of FBP might show better values than those of ART. However, ART is more capable of distinguishing two different dielectric values than FBP, owing to its higher contrast and wider grayscale distribution.
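    The MSE and PSNR figures used in such reconstruction comparisons are straightforward to reproduce. A minimal sketch follows; the toy 4x4 phantom, the 8-bit peak value, and all numbers are illustrative assumptions, not the paper's data:

```python
import numpy as np

def mse(ref, img):
    """Mean squared error between a reference image and a reconstruction."""
    ref, img = np.asarray(ref, dtype=float), np.asarray(img, dtype=float)
    return float(np.mean((ref - img) ** 2))

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means a closer reconstruction."""
    e = mse(ref, img)
    return float("inf") if e == 0 else float(10.0 * np.log10(peak ** 2 / e))

# Toy "phantom" and a reconstruction with a constant error of 2 grey levels.
phantom = np.zeros((4, 4))
phantom[1:3, 1:3] = 200.0
recon = phantom + 2.0
print(mse(phantom, recon))             # 4.0
print(round(psnr(phantom, recon), 2))  # 42.11
```

    SSIM is more involved (local means, variances, and covariances over sliding windows), which is why ready-made implementations are usually preferred for it.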

  3. Comparison of quantitative cytomegalovirus (CMV) PCR in plasma and CMV antigenemia assay: clinical utility of the prototype AMPLICOR CMV MONITOR test in transplant recipients.

    PubMed

    Caliendo, A M; St George, K; Kao, S Y; Allega, J; Tan, B H; LaFontaine, R; Bui, L; Rinaldo, C R

    2000-06-01

    The correlation between the prototype AMPLICOR CMV MONITOR test (Roche Molecular Systems), a quantitative PCR assay, and the cytomegalovirus (CMV) pp65 antigenemia assay was evaluated in transplant recipients. Sequential blood specimens were collected on 29 patients (491 specimens), the leukocyte fraction was tested by CMV antigenemia, and quantitative PCR was performed on plasma specimens. None of the 15 patients (242 specimens) who were antigenemia negative were positive for CMV DNA by PCR, and none of these patients developed active CMV disease. There were 14 antigenemia-positive patients, 8 of whom developed active CMV disease. In all patients, there was a good association between the antigenemia and PCR assays. Ganciclovir-resistant virus was isolated from three patients with active CMV disease. These three patients had persistently elevated levels of antigenemia and CMV DNA by PCR when resistance to ganciclovir developed. This standardized, quantitative CMV PCR assay on plasma has clinical utility for the diagnosis of active disease and in monitoring the response to antiviral therapy in transplant recipients.

  4. Comparison of Quantitative Cytomegalovirus (CMV) PCR in Plasma and CMV Antigenemia Assay: Clinical Utility of the Prototype AMPLICOR CMV MONITOR Test in Transplant Recipients

    PubMed Central

    Caliendo, Angela M.; St. George, Kirsten; Kao, Shaw-Yi; Allega, Jessica; Tan, Ban-Hock; LaFontaine, Robert; Bui, Larry; Rinaldo, Charles R.

    2000-01-01

    The correlation between the prototype AMPLICOR CMV MONITOR test (Roche Molecular Systems), a quantitative PCR assay, and the cytomegalovirus (CMV) pp65 antigenemia assay was evaluated in transplant recipients. Sequential blood specimens were collected on 29 patients (491 specimens), the leukocyte fraction was tested by CMV antigenemia, and quantitative PCR was performed on plasma specimens. None of the 15 patients (242 specimens) who were antigenemia negative were positive for CMV DNA by PCR, and none of these patients developed active CMV disease. There were 14 antigenemia-positive patients, 8 of whom developed active CMV disease. In all patients, there was a good association between the antigenemia and PCR assays. Ganciclovir-resistant virus was isolated from three patients with active CMV disease. These three patients had persistently elevated levels of antigenemia and CMV DNA by PCR when resistance to ganciclovir developed. This standardized, quantitative CMV PCR assay on plasma has clinical utility for the diagnosis of active disease and in monitoring the response to antiviral therapy in transplant recipients. PMID:10834964

  5. Quantitation of repaglinide and metabolites in mouse whole-body thin tissue sections using droplet-based liquid microjunction surface sampling-high-performance liquid chromatography-electrospray ionization tandem mass spectrometry.

    PubMed

    Chen, Weiqi; Wang, Lifei; Van Berkel, Gary J; Kertesz, Vilmos; Gan, Jinping

    2016-03-25

    Herein, quantitation aspects of a fully automated autosampler/HPLC-MS/MS system applied for unattended droplet-based surface sampling of repaglinide dosed thin tissue sections with subsequent HPLC separation and mass spectrometric analysis of parent drug and various drug metabolites were studied. Major organs (brain, lung, liver, kidney and muscle) from whole-body thin tissue sections and corresponding organ homogenates prepared from repaglinide dosed mice were sampled by surface sampling and by bulk extraction, respectively, and analyzed by HPLC-MS/MS. A semi-quantitative agreement between data obtained by surface sampling and that by employing organ homogenate extraction was observed. Drug concentrations obtained by the two methods followed the same patterns for post-dose time points (0.25, 0.5, 1 and 2 h). Drug amounts determined in the specific tissues were typically higher when analyzing extracts from the organ homogenates. In addition, relative comparison of the levels of individual metabolites between the two analytical methods also revealed good semi-quantitative agreement. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Quantitation of repaglinide and metabolites in mouse whole-body thin tissue sections using droplet-based liquid microjunction surface sampling-high-performance liquid chromatography-electrospray ionization tandem mass spectrometry

    DOE PAGES

    Chen, Weiqi; Wang, Lifei; Van Berkel, Gary J.; ...

    2015-11-03

    Herein, quantitation aspects of a fully automated autosampler/HPLC-MS/MS system applied for unattended droplet-based surface sampling of repaglinide dosed thin tissue sections with subsequent HPLC separation and mass spectrometric analysis of parent drug and various drug metabolites were studied. Major organs (brain, lung, liver, kidney, muscle) from whole-body thin tissue sections and corresponding organ homogenates prepared from repaglinide dosed mice were sampled by surface sampling and by bulk extraction, respectively, and analyzed by HPLC-MS/MS. A semi-quantitative agreement between data obtained by surface sampling and that by employing organ homogenate extraction was observed. Drug concentrations obtained by the two methods followed the same patterns for post-dose time points (0.25, 0.5, 1 and 2 h). Drug amounts determined in the specific tissues were typically higher when analyzing extracts from the organ homogenates. Furthermore, relative comparison of the levels of individual metabolites between the two analytical methods also revealed good semi-quantitative agreement.

  7. Quantitation of repaglinide and metabolites in mouse whole-body thin tissue sections using droplet-based liquid microjunction surface sampling-high-performance liquid chromatography-electrospray ionization tandem mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Weiqi; Wang, Lifei; Van Berkel, Gary J.

    Herein, quantitation aspects of a fully automated autosampler/HPLC-MS/MS system applied for unattended droplet-based surface sampling of repaglinide dosed thin tissue sections with subsequent HPLC separation and mass spectrometric analysis of parent drug and various drug metabolites were studied. Major organs (brain, lung, liver, kidney, muscle) from whole-body thin tissue sections and corresponding organ homogenates prepared from repaglinide dosed mice were sampled by surface sampling and by bulk extraction, respectively, and analyzed by HPLC-MS/MS. A semi-quantitative agreement between data obtained by surface sampling and that by employing organ homogenate extraction was observed. Drug concentrations obtained by the two methods followed the same patterns for post-dose time points (0.25, 0.5, 1 and 2 h). Drug amounts determined in the specific tissues were typically higher when analyzing extracts from the organ homogenates. Furthermore, relative comparison of the levels of individual metabolites between the two analytical methods also revealed good semi-quantitative agreement.

  8. A Preliminary Quantitative Comparison of Vibratory Amplitude Using Rigid and Flexible Stroboscopic Assessment.

    PubMed

    Hosbach-Cannon, Carly J; Lowell, Soren Y; Kelley, Richard T; Colton, Raymond H

    2016-07-01

    The purpose of this study was to establish preliminary, quantitative data on amplitude of vibration during stroboscopic assessment in healthy speakers with normal voice characteristics. Amplitude of vocal fold vibration is a core physiological parameter used in diagnosing voice disorders, yet quantitative data are lacking to guide the determination of what constitutes normal vibratory amplitude. Eleven participants were assessed during sustained vowel production using rigid and flexible endoscopy with stroboscopy. Still images were extracted from digital recordings of a sustained /i/ produced at a comfortable pitch and loudness, with F0 controlled so that levels were within ±15% of each participant's comfortable mean level as determined from connected speech. Glottal width (GW), true vocal fold (TVF) length, and TVF width were measured from still frames representing the maximum open phase of the vibratory cycle. To control for anatomic and magnification differences across participants, GW was normalized to TVF length. GW as a ratio of TVF width was also computed for comparison with prior studies. Mean values and standard deviations were computed for the normalized measures. Paired t tests showed no significant differences between rigid and flexible endoscopy methods. Interrater and intrarater reliability values for raw measurements were found to be high (0.89-0.99). These preliminary quantitative data may be helpful in determining normality or abnormality of vocal fold vibration. Results indicate that quantified amplitude of vibration is similar between endoscopic methods, a clinically relevant finding for individuals performing and interpreting stroboscopic assessments. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  9. Investigating Children's Abilities to Count and Make Quantitative Comparisons

    ERIC Educational Resources Information Center

    Lee, Joohi; Md-Yunus, Sham'ah

    2016-01-01

    This study was designed to investigate children's abilities to count and make quantitative comparisons. In addition, this study utilized reasoning questions (i.e., how did you know?). Thirty-four preschoolers, mean age 4.5 years old, participated in the study. According to the results, 89% of the children (n = 30) were able to do rote counting and…

  10. The Relative Performance of High Resolution Quantitative Precipitation Estimates in the Russian River Basin

    NASA Astrophysics Data System (ADS)

    Bytheway, J. L.; Biswas, S.; Cifelli, R.; Hughes, M.

    2017-12-01

    The Russian River carves a 110 mile path through Mendocino and Sonoma counties in western California, providing water for thousands of residents and acres of agriculture as well as a home for several species of endangered fish. The Russian River basin receives almost all of its precipitation during the October through March wet season, and the systems bringing this precipitation are often impacted by atmospheric river events as well as the complex topography of the region. This study will examine the performance of several high resolution (hourly, < 5km) estimates of precipitation from observational products and forecasts over the 2015-2016 and 2016-2017 wet seasons. Comparisons of event total rainfall as well as hourly rainfall will be performed using 1) rain gauges operated by the National Oceanic and Atmospheric Administration (NOAA) Physical Sciences Division (PSD), 2) products from the Multi-Radar/Multi-Sensor (MRMS) QPE dataset, and 3) quantitative precipitation forecasts from the High Resolution Rapid Refresh (HRRR) model at 1, 3, 6, and 12 hour lead times. Further attention will be given to cases or locations representing large disparities between the estimates.

  11. An approach to quantitative sustainability assessment in the early stages of process design.

    PubMed

    Tugnoli, Alessandro; Santarelli, Francesco; Cozzani, Valerio

    2008-06-15

    A procedure was developed for the quantitative assessment of key performance indicators suitable for the sustainability analysis of alternative processes, mainly addressing the early stages of process design. The methodology was based on the calculation of a set of normalized impact indices allowing a direct comparison of the additional burden of each process alternative on a selected reference area. Innovative reference criteria were developed to compare and aggregate the impact indicators on the basis of the site-specific impact burden and sustainability policy. An aggregation procedure also allows the calculation of overall sustainability performance indicators and of an "impact fingerprint" of each process alternative. The final aim of the method is to support the decision making process during process development, providing a straightforward assessment of the expected sustainability performances. The application of the methodology to case studies concerning alternative waste disposal processes allowed a preliminary screening of the expected critical sustainability impacts of each process. The methodology was shown to provide useful results to address sustainability issues in the early stages of process design.

  12. Quantitative comparison of 3D third harmonic generation and fluorescence microscopy images.

    PubMed

    Zhang, Zhiqing; Kuzmin, Nikolay V; Groot, Marie Louise; de Munck, Jan C

    2018-01-01

    Third harmonic generation (THG) microscopy is a label-free imaging technique that shows great potential for rapid pathology of brain tissue during brain tumor surgery. However, the interpretation of THG brain images should be quantitatively linked to images from more standard imaging techniques, which so far has been done only qualitatively. We establish here such a quantitative link between THG images of mouse brain tissue and all-nuclei-highlighted fluorescence images, acquired simultaneously from the same tissue area. For quantitative comparison of a substantial set of image pairs, we present a segmentation workflow applicable to both THG and fluorescence images, achieving precisions of 91.3% and 95.8%, respectively. We find that the correspondence between the main features of the two imaging modalities amounts to 88.9%, providing quantitative evidence for the interpretation of dark holes as brain cells. Moreover, 80% of the bright objects in THG images overlap with nuclei highlighted in the fluorescence images, and they are 2 times smaller than the dark holes, showing that cells of different morphologies can be recognized in THG images. We expect that the described quantitative comparison is applicable to other types of brain tissue and, with more specific staining experiments, to cell type identification. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
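    Cross-modality overlap percentages of the kind reported here reduce to simple binary-mask arithmetic once both images are segmented. A sketch with invented 3x3 masks (not the study's segmentations):

```python
import numpy as np

def overlap_stats(mask_a, mask_b):
    """Fraction of mask_a's foreground also present in mask_b, and the
    symmetric Dice coefficient of the two binary masks."""
    a, b = np.asarray(mask_a, dtype=bool), np.asarray(mask_b, dtype=bool)
    inter = np.logical_and(a, b).sum()
    frac = inter / a.sum() if a.sum() else 0.0
    dice = 2.0 * inter / (a.sum() + b.sum()) if (a.sum() + b.sum()) else 0.0
    return float(frac), float(dice)

# Hypothetical masks: "dark holes" in a THG image vs. nuclei in fluorescence.
thg = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]], dtype=bool)
fluo = np.array([[1, 1, 0], [0, 0, 0], [0, 0, 1]], dtype=bool)
frac, dice = overlap_stats(thg, fluo)
print(round(frac, 3), round(dice, 3))  # 0.667 0.667
```

    In practice the comparison would be done per segmented object (connected component) rather than per pixel, but the counting logic is the same.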

  13. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are of great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generated different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfactory results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
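    The core transformation described, collapsing a binary side-effect profile into one score by a weighted sum, can be sketched as follows; the three effects and their severity weights are invented for illustration:

```python
import numpy as np

def quantitative_score(profile, weights):
    """Weighted sum of a binary side-effect profile: sum_i w_i * x_i."""
    return float(np.dot(np.asarray(profile, dtype=float),
                        np.asarray(weights, dtype=float)))

weights = [0.5, 1.0, 2.0]   # hypothetical weights: mild, moderate, severe
drug_a = [1, 1, 0]          # shows the two milder effects
drug_b = [0, 0, 1]          # shows only the severe effect
print(quantitative_score(drug_a, weights))  # 1.5
print(quantitative_score(drug_b, weights))  # 2.0 -> riskier despite fewer effects
```

    This is why the choice of weights matters, and why the paper checks robustness by repeating the experiments with randomly generated weight vectors.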

  14. The Application of SILAC Mouse in Human Body Fluid Proteomics Analysis Reveals Protein Patterns Associated with IgA Nephropathy.

    PubMed

    Zhao, Shilin; Li, Rongxia; Cai, Xiaofan; Chen, Wanjia; Li, Qingrun; Xing, Tao; Zhu, Wenjie; Chen, Y Eugene; Zeng, Rong; Deng, Yueyi

    2013-01-01

    The body fluid proteome is the most informative proteome from a medical viewpoint, but the lack of an accurate quantitation method for complex body fluids has limited its application in disease research and biomarker discovery. To address this problem, we introduced a novel strategy in which SILAC-labeled mouse serum was used as an internal standard for human serum and urine proteome analysis. The SILAC-labeled mouse serum was mixed with human serum and urine, and multidimensional separation coupled with tandem mass spectrometry (IEF-LC-MS/MS) analysis was performed. Peptides shared between the two species were quantified by their SILAC pairs, and human-only peptides were quantified by coeluting mouse peptides. Comparison of the results from two replicate experiments indicated the high repeatability of our strategy. Urine from treated and untreated immunoglobulin A nephropathy (IgAN) patients was then compared using this quantitation strategy. Fifty-three peptides were found to be significantly changed between the two groups, including both known diagnostic markers for IgAN and novel candidates such as complement C3, albumin, VDBP, ApoA1, and IGFBP7. In conclusion, we have developed a practical and accurate quantitation strategy for comparison of complex human body fluid proteomes. The results from such a strategy could provide potential disease-related biomarkers for the evaluation of treatment.

  15. Application of real-time PCR for total airborne bacterial assessment: Comparison with epifluorescence microscopy and culture-dependent methods

    NASA Astrophysics Data System (ADS)

    Rinsoz, Thomas; Duquenne, Philippe; Greff-Mirguet, Guylaine; Oppliger, Anne

    Traditional culture-dependent methods to quantify and identify airborne microorganisms are limited by factors such as short-duration sampling times and an inability to count non-culturable or non-viable bacteria. Consequently, quantitative assessments of bioaerosols are often underestimates. Use of the real-time quantitative polymerase chain reaction (Q-PCR) to quantify bacteria in environmental samples presents an alternative method that should overcome this problem. The aim of this study was to evaluate the performance of a real-time Q-PCR assay as a simple and reliable way to quantify the airborne bacterial load within poultry houses and sewage treatment plants, in comparison with epifluorescence microscopy and culture-dependent methods. The estimates of bacterial load we obtained from real-time PCR and epifluorescence methods are comparable; however, our analysis of sewage treatment plants indicates that these methods give values 270-290-fold greater than those obtained by the "impaction on nutrient agar" method. The culture-dependent method of air impaction on nutrient agar was also inadequate in poultry houses, as was the impinger-culture method, which gave a bacterial load estimate 32-fold lower than that obtained by Q-PCR. Real-time quantitative PCR thus proves to be a reliable, discerning, and simple method that could be used to estimate the airborne bacterial load in a broad variety of other environments expected to carry high numbers of airborne bacteria.

  16. Quantitative assessment of integrated phrenic nerve activity.

    PubMed

    Nichols, Nicole L; Mitchell, Gordon S

    2016-06-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1(G93A) Taconic rat groups (an ALS model). Meta-analysis results indicate: (1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; (2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ∼1.0; and (3) consistently reduced activity in end-stage SOD1(G93A) rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. A quantitative and qualitative comparison of illumina MiSeq and 454 amplicon sequencing for genotyping the highly polymorphic major histocompatibility complex (MHC) in a non-model species.

    PubMed

    Razali, Haslina; O'Connor, Emily; Drews, Anna; Burke, Terry; Westerdahl, Helena

    2017-07-28

    High-throughput sequencing enables high-resolution genotyping of extremely duplicated genes. 454 amplicon sequencing (454) has become the standard technique for genotyping the major histocompatibility complex (MHC) genes in non-model organisms. However, Illumina MiSeq amplicon sequencing (MiSeq), which offers a much higher read depth, is now superseding 454. The aim of this study was to quantitatively and qualitatively evaluate the performance of MiSeq relative to 454 for genotyping MHC class I alleles, using a house sparrow (Passer domesticus) dataset with pedigree information. House sparrows provide a good study system for this comparison, as their MHC class I genes have been studied previously and, consequently, we had prior expectations concerning the number of alleles per individual. We found that 454 and MiSeq performed equally well in genotyping amplicons with low diversity, i.e. amplicons from individuals with fewer than 6 alleles. Although there was a higher rate of failure in the 454 dataset in resolving amplicons with higher diversity (6-9 alleles), the same genotypes were identified by both 454 and MiSeq in 98% of cases. We conclude that low-diversity amplicons are equally well genotyped using either 454 or MiSeq, but the higher coverage afforded by MiSeq can lead to this approach outperforming 454 for amplicons with higher diversity.

  18. Validity and sensitivity to change of the semi-quantitative OMERACT ultrasound scoring system for tenosynovitis in patients with rheumatoid arthritis.

    PubMed

    Ammitzbøll-Danielsen, Mads; Østergaard, Mikkel; Naredo, Esperanza; Terslev, Lene

    2016-12-01

    The aim was to evaluate the metric properties of the semi-quantitative OMERACT US scoring system vs a novel quantitative US scoring system for tenosynovitis, by testing its intra- and inter-reader reliability, sensitivity to change and comparison with clinical tenosynovitis scoring in a 6-month follow-up study. US and clinical assessments of the tendon sheaths of the clinically most affected hand and foot were performed at baseline, 3 and 6 months in 51 patients with RA. Tenosynovitis was assessed using the semi-quantitative scoring system (0-3) proposed by the OMERACT US group and a new quantitative US evaluation (0-100). A sum for US grey scale (GS), colour Doppler (CD) and pixel index (PI), respectively, was calculated for each patient. In 20 patients, intra- and inter-observer agreement was established between two independent investigators. A binary clinical tenosynovitis score was performed, calculating a sum score per patient. The intra- and inter-observer agreements for US tenosynovitis assessments were very good at baseline and for change for GS and CD, but less good for PI. The smallest detectable change was 0.97 for GS, 0.93 for CD and 30.1 for PI. The sensitivity to change from month 0 to 6 was high for GS and CD, and slightly higher than for clinical tenosynovitis score and PI. This study demonstrated an excellent intra- and inter-reader agreement between two investigators for the OMERACT US scoring system for tenosynovitis and a high ability to detect changes over time. Quantitative assessment by PI did not add further information. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
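    The "smallest detectable change" (SDC) values quoted above follow from test-retest agreement statistics. A sketch using the common formula SDC = 1.96 x sqrt(2) x SEM, with the SEM estimated from paired score differences; the scores below are invented, and the study's exact computation may differ:

```python
import math

def smallest_detectable_change(test, retest):
    """SDC = 1.96 * sqrt(2) * SEM, with SEM = SD(paired differences) / sqrt(2),
    which simplifies to 1.96 * SD of the test-retest differences."""
    diffs = [a - b for a, b in zip(test, retest)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    sem = sd_d / math.sqrt(2)
    return 1.96 * math.sqrt(2) * sem

scores_run1 = [3, 5, 4, 6, 2]   # hypothetical repeat scorings of 5 patients
scores_run2 = [3, 4, 5, 6, 3]
print(round(smallest_detectable_change(scores_run1, scores_run2), 2))  # 1.64
```

    A change smaller than the SDC (e.g. the 0.97 reported for GS) cannot be distinguished from measurement noise, which is what makes it a useful yardstick for sensitivity to change.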

  19. Scanning transmission ion microscopy mass measurements for quantitative trace element analysis within biological samples and validation using atomic force microscopy thickness measurements

    NASA Astrophysics Data System (ADS)

    Devès, Guillaume; Cohen-Bouhacina, Touria; Ortega, Richard

    2004-10-01

    We used the nuclear microprobe techniques micro-PIXE (particle-induced X-ray emission), micro-RBS (Rutherford backscattering spectrometry) and scanning transmission ion microscopy (STIM) to characterize the trace element content and spatial distribution within biological samples (dehydrated cultured cells, tissues). PIXE results are usually normalized to the sample dry mass as determined by micro-RBS recorded simultaneously with micro-PIXE. However, the main limitation of RBS mass measurement is the sample mass loss occurring during irradiation, which can be up to 30% of the initial sample mass. We present here a new methodology for PIXE normalization and quantitative analysis of trace elements within biological samples, based on dry mass measurement performed by means of STIM. STIM cell mass measurements were validated by comparison with AFM sample thickness measurements. The results indicate the reliability of STIM mass measurement performed on biological samples and suggest that STIM should be performed for PIXE normalization. Further information can also be derived from the direct confrontation of AFM and STIM analyses, such as in situ measurement of specific gravity within cell compartments (nucleolus and cytoplasm).

  20. Change Detection Algorithms for Surveillance in Visual IoT: A Comparative Study

    NASA Astrophysics Data System (ADS)

    Akram, Beenish Ayesha; Zafar, Amna; Akbar, Ali Hammad; Wajid, Bilal; Chaudhry, Shafique Ahmad

    2018-01-01

    The VIoT (Visual Internet of Things) connects the virtual information world with real-world objects using sensors and pervasive computing. For video surveillance in VIoT, ChD (Change Detection) is a critical component. ChD algorithms identify regions of change in multiple images of the same scene recorded at different times. This paper presents a performance comparison of histogram-thresholding and classification ChD algorithms for video surveillance in VIoT, using quantitative measures and the salient features of the datasets. The thresholding algorithms (Otsu, Kapur, Rosin) and classification methods (k-means and EM (Expectation Maximization)) were simulated in MATLAB using diverse datasets. For performance evaluation, the quantitative measures used include OSR (Overall Success Rate), YC (Yule's Coefficient), JC (Jaccard's Coefficient), execution time and memory consumption. Experimental results showed that Kapur's algorithm performed better for both indoor and outdoor environments with illumination changes, shadowing, and medium-to-fast-moving objects; however, its performance degraded for small objects producing minor changes. The Otsu algorithm showed better results for indoor environments with slow-to-medium changes and nomadic object mobility. k-means showed good results in indoor environments with small objects producing slow change, no shadowing, and scarce illumination changes.
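    Of the methods compared, Otsu's thresholding is the simplest to reproduce for change detection: threshold the absolute frame difference at the value that maximizes the between-class variance. A self-contained sketch on synthetic frames (the data, bin count, and scene are illustrative assumptions, not the paper's datasets):

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Return the histogram-bin center maximizing the between-class
    variance w0*w1*(mu0 - mu1)^2 (Otsu's criterion)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    mu_total = float((p * centers).sum())
    best_t, best_var, w0, s0 = centers[0], -1.0, 0.0, 0.0
    for i in range(bins - 1):
        w0 += p[i]                      # class-0 weight up to bin i
        s0 += p[i] * centers[i]         # class-0 first moment
        w1 = 1.0 - w0
        if w0 <= 0.0 or w1 <= 0.0:
            continue
        mu0, mu1 = s0 / w0, (mu_total - s0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, centers[i]
    return float(best_t)

# Change detection between two frames of the same synthetic scene.
rng = np.random.default_rng(0)
frame_a = rng.normal(10.0, 2.0, (32, 32))
frame_b = frame_a.copy()
frame_b[8:16, 8:16] += 50.0             # an 8x8 object appears
diff = np.abs(frame_b - frame_a)
change_mask = diff > otsu_threshold(diff.ravel())
print(change_mask.sum())  # 64 changed pixels
```

    Kapur's method differs only in the criterion: it maximizes the sum of the entropies of the two classes instead of the between-class variance.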

  1. Technology efficacy in active prosthetic knees for transfemoral amputees: a quantitative evaluation.

    PubMed

    El-Sayed, Amr M; Hamzaid, Nur Azah; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prostheses. Other studies have examined amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights, that is, a technical examination of the components used with an evaluation of how these improved the gait of their respective users. The study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. It systematically discusses the current technology in active transfemoral prostheses with respect to functional walking performance among above-knee amputee users, to evaluate each system's efficacy in producing close-to-normal user performance. The performance of the actuator, sensory system, and control technique incorporated in each reported system was evaluated separately, and numerical comparisons were conducted based on the percentage deviation of amputees' gait from normal gait profile points. The results identified particular components that contributed closest to normal gait parameters. However, the conclusions are of limited generality owing to the small number of studies. Thus, more clinical validation of active prosthetic knee technology is needed to better understand the extent of each component's contribution to the most functional development.

  2. Quantitative assessment of joint position sense recovery in subacute stroke patients: a pilot study.

    PubMed

    Kattenstroth, Jan-Christoph; Kalisch, Tobias; Kowalewski, Rebecca; Tegenthoff, Martin; Dinse, Hubert R

    2013-11-01

    To assess joint position sense performance in subacute stroke patients using a novel quantitative assessment. Proof-of-principle pilot study with a group of subacute stroke patients, assessed at baseline and after 2 weeks of intervention, with additional data from a healthy age-matched control group. Ten subacute stroke patients (mean age 65.41 years (standard deviation 2.5); 4 female; 2.3 weeks (standard deviation 0.2) post-stroke) received in-patient standard rehabilitation plus repetitive electrical stimulation of the affected hand. Joint position sense was assessed as the ability to correctly perceive the opening angles of the finger joints: patients had to report size differences between polystyrene balls of various sizes whilst the balls were enclosed simultaneously by the affected and the non-affected hand. A total of 21 pairwise size comparisons was used to quantify joint position performance. After 2 weeks of therapeutic intervention a significant improvement in joint position sense performance was observed; however, the performance level was still below that of the healthy control group. The results indicate high feasibility and sensitivity of the joint position test in subacute stroke patients. Testing allowed quantification of both the deficit and the rehabilitation outcome.

  3. Intraindividual Crossover Comparison of Gadoxetic Acid Dose for Liver MRI in Normal Volunteers.

    PubMed

    Motosugi, Utaroh; Bannas, Peter; Hernando, Diego; Salmani Rahimi, Mahdi; Holmes, James H; Reeder, Scott B

    2016-01-01

    We performed a quantitative intraindividual comparison of the performance of 0.025- and 0.05-mmol/kg doses for gadoxetic acid-enhanced liver magnetic resonance (MR) imaging. Eleven healthy volunteers underwent liver MR imaging twice, once with a 0.025- and once with a 0.05-mmol/kg dose of gadoxetic acid. MR spectroscopy and 3-dimensional gradient-echo T1-weighted images (3D-GRE) were obtained before and 3, 10, and 20 min after injection of the contrast medium to measure T1 and T2 values and signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) performance. During the dynamic phase, highly time-resolved 3D-GRE was used to estimate the relative CNR (CNRrel) of the hepatic artery and portal vein (PV) to the liver. We used paired t-tests to compare the results of different doses. During the hepatobiliary phase, we observed shorter T1 values and higher SNRs of the liver (P < 0.001) and higher liver-to-PV and liver-to-muscle CNRs (P < 0.002) using 0.05 mmol/kg compared to 0.025 mmol/kg. Increasing the dose to 0.05 mmol/kg yielded a greater T1-shortening effect at 10 min delay even compared with 0.025 mmol/kg at 20 min (P < 0.001). During the dynamic phase, the peak CNRrel for the hepatic artery and portal vein were higher using 0.05 mmol/kg (P = 0.007 to 0.035). Use of gadoxetic acid at a dose of 0.05 mmol/kg leads to significantly higher SNR and CNR performance than with 0.025 mmol/kg. Quantitatively, a 10-min delay may be feasible for hepatobiliary-phase imaging when using 0.05 mmol/kg of gadoxetic acid.

  4. Scoring severity in trauma: comparison of prehospital scoring systems in trauma ICU patients.

    PubMed

    Llompart-Pou, J A; Chico-Fernández, M; Sánchez-Casado, M; Salaberria-Udabe, R; Carbayo-Górriz, C; Guerrero-López, F; González-Robledo, J; Ballesteros-Sanz, M Á; Herrán-Monge, R; Servià-Goixart, L; León-López, R; Val-Jordán, E

    2017-06-01

    We evaluated the predictive ability of the mechanism, Glasgow coma scale, age and arterial pressure (MGAP) score, the Glasgow coma scale, age and systolic blood pressure (GAP) score, and the triage-revised trauma score (T-RTS) in patients from the Spanish trauma ICU registry, using the trauma and injury severity score (TRISS) as the reference standard. Patients admitted for traumatic disease to the participating ICUs were included. Quantitative data were reported as median [interquartile range (IQR)] and categorical data as number (percentage). Comparisons between groups were performed using Student's t-test for quantitative variables and the chi-square test for categorical variables. We constructed receiver operating characteristic (ROC) curves and evaluated the area under the curve (AUC) with its 95 % confidence interval (CI). Sensitivity, specificity, positive and negative predictive values and accuracy were evaluated for all the scores. A value of p < 0.05 was considered significant. The final sample included 1361 trauma ICU patients. Median age was 45 (30-61) years and 1092 patients (80.3 %) were male. Median ISS was 18 (13-26), median T-RTS 11 (10-12), median GAP 20 (15-22) and median MGAP 24 (20-27). Observed mortality was 17.7 %, whilst predicted mortality using TRISS was 16.9 %. The AUCs were: TRISS 0.897 (95 % CI 0.876-0.918), MGAP 0.860 (95 % CI 0.835-0.886), GAP 0.849 (95 % CI 0.823-0.876) and T-RTS 0.796 (95 % CI 0.762-0.830). Both the MGAP and GAP scores performed better than the T-RTS in predicting hospital mortality in Spanish trauma ICU patients. Since these are easy-to-perform scores, they should be incorporated into clinical practice as triage tools.
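    The AUC values reported above have a simple rank-based interpretation: the probability that a randomly chosen non-survivor receives a higher predicted risk than a randomly chosen survivor. The sketch below (ours, not the registry's code) computes AUC this way via the Mann-Whitney statistic, with scores oriented so that higher means higher predicted risk.

    ```python
    import numpy as np

    def auc_mann_whitney(scores_pos, scores_neg):
        """AUC = P(score_pos > score_neg) + 0.5 * P(tie), over all pos/neg pairs."""
        scores_pos = np.asarray(scores_pos, dtype=float)
        scores_neg = np.asarray(scores_neg, dtype=float)
        gt = (scores_pos[:, None] > scores_neg[None, :]).sum()
        ties = (scores_pos[:, None] == scores_neg[None, :]).sum()
        return (gt + 0.5 * ties) / (len(scores_pos) * len(scores_neg))
    ```

    An AUC of 0.5 corresponds to a score with no discrimination; the TRISS value of 0.897 above means roughly nine out of ten such pairs are ranked correctly.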

  5. The Local Geometry of Multiattribute Tradeoff Preferences

    PubMed Central

    McGeachie, Michael; Doyle, Jon

    2011-01-01

    Existing representations for multiattribute ceteris paribus preference statements have provided useful treatments and clear semantics for qualitative comparisons, but have not provided similarly clear representations or semantics for comparisons involving quantitative tradeoffs. We use directional derivatives and other concepts from elementary differential geometry to interpret conditional multiattribute ceteris paribus preference comparisons that state bounds on quantitative tradeoff ratios. This semantics extends the familiar economic notion of marginal rate of substitution to multiple continuous or discrete attributes. The same geometric concepts also provide means for interpreting statements about the relative importance of different attributes. PMID:21528018

  6. Reinforcer control by comparison-stimulus color and location in a delayed matching-to-sample task.

    PubMed

    Alsop, Brent; Jones, B Max

    2008-05-01

    Six pigeons were trained in a delayed matching-to-sample task involving bright- and dim-yellow samples on a central key, a five-peck response requirement to either sample, a constant 1.5-s delay, and the presentation of comparison stimuli composed of red on the left key and green on the right key or vice versa. Green-key responses were occasionally reinforced following the dimmer-yellow sample, and red-key responses were occasionally reinforced following the brighter-yellow sample. Reinforcer delivery was controlled such that the distribution of reinforcers across both comparison-stimulus color and comparison-stimulus location could be varied systematically and independently across conditions. Matching accuracy was high throughout. The ratio of left to right side-key responses increased as the ratio of left to right reinforcers increased, the ratio of red to green responses increased as the ratio of red to green reinforcers increased, and there was no interaction between these variables. However, side-key biases were more sensitive to the distribution of reinforcers across key location than were comparison-color biases to the distribution of reinforcers across key color. An extension of Davison and Tustin's (1978) model of DMTS performance fit the data well, but the results were also consistent with an alternative theory of conditional discrimination performance (Jones, 2003) that calls for a conceptually distinct quantitative model.

  7. Chemometric Methods to Quantify 1D and 2D NMR Spectral Differences Among Similar Protein Therapeutics.

    PubMed

    Chen, Kang; Park, Junyong; Li, Feng; Patil, Sharadrao M; Keire, David A

    2018-04-01

    NMR spectroscopy is an emerging analytical tool for measuring complex drug product qualities, e.g., protein higher order structure (HOS) or heparin chemical composition. Most drug NMR spectra have been analyzed visually; however, NMR spectra are inherently quantitative and multivariate and thus suitable for chemometric analysis. Quantitative measurements derived from chemometric comparisons between spectra could therefore be a key step in establishing acceptance criteria for a new generic drug or a new batch after a manufacturing change. To measure the capability of chemometric methods to differentiate comparator NMR spectra, we calculated inter-spectra difference metrics on 1D/2D spectra of two insulin drugs, Humulin R® and Novolin R®, from different manufacturers. Both insulin drugs have an identical drug substance but differ in formulation. Chemometric methods (i.e., principal component analysis (PCA), 3-way Tucker3 or graph invariant (GI)) were used to calculate the Mahalanobis distance (DM) between the two brands (inter-brand) and the distance ratio (DR) among different lots (intra-brand). PCA on the 1D inter-brand spectral comparison yielded a DM value of 213. In comparing 2D spectra, the Tucker3 analysis yielded the highest differentiability value (DM = 305), followed by PCA (DM = 255) and then the GI method (DM = 40). In conclusion, drug quality comparisons among different lots might benefit from PCA on 1D spectra for rapidly comparing many samples, while higher-resolution but more time-consuming comparisons based on 2D NMR data, using Tucker3 analysis or PCA, provide a greater level of assurance for evaluating drug structural similarity between brands.
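    As an illustration of the kind of computation involved (ours, with synthetic data, not the authors' pipeline), the Mahalanobis distance between two groups of spectra can be computed in PCA score space with plain NumPy:

    ```python
    import numpy as np

    def pca_scores(X, n_components=2):
        """Project rows of X (one spectrum per row) onto the top principal components."""
        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    def mahalanobis_between_groups(A, B):
        """Distance between group means using the pooled within-group covariance."""
        na, nb = len(A), len(B)
        cov = ((na - 1) * np.cov(A, rowvar=False)
               + (nb - 1) * np.cov(B, rowvar=False)) / (na + nb - 2)
        d = A.mean(axis=0) - B.mean(axis=0)
        return float(np.sqrt(d @ np.linalg.solve(cov, d)))
    ```

    In this setup, `A` and `B` would hold the PCA scores of the Humulin R® and Novolin R® lots; large DM values indicate the two brands' spectra are well separated relative to lot-to-lot scatter.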

  8. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    NASA Technical Reports Server (NTRS)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted, based on pixel latitude, back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversion and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
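    The conversion described, from image DN back to a backscatter coefficient, hinges on the Muhleman scattering law for the average Venus surface. The sketch below uses the commonly quoted Muhleman-law functional form; the DN offset and dB-per-DN scale are placeholder assumptions, not the documented Magellan BIDR constants, so treat this only as an illustration of the structure of the conversion.

    ```python
    import numpy as np

    def muhleman_sigma0(theta_deg):
        """Muhleman-law model backscatter coefficient vs. incidence angle."""
        th = np.radians(theta_deg)
        return 0.0118 * np.cos(th) / (np.sin(th) + 0.111 * np.cos(th)) ** 3

    def dn_to_sigma0(dn, theta_deg, dn_offset=101.0, db_per_dn=0.2):
        """Invert the image scaling, assuming DN encodes backscatter in dB
        relative to the Muhleman law at the pixel's incidence angle.
        dn_offset and db_per_dn are PLACEHOLDER values, not Magellan's."""
        rel_db = (np.asarray(dn, float) - dn_offset) * db_per_dn
        return muhleman_sigma0(theta_deg) * 10.0 ** (rel_db / 10.0)
    ```

    Because incidence angle varies with latitude along the Magellan orbit, the same DN maps to different backscatter coefficients at different latitudes, which is exactly why per-pixel conversion must precede any between-unit statistics.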

  9. Three-Dimensional Registration for Handheld Profiling Systems Based on Multiple Shot Structured Light

    PubMed Central

    Ayaz, Shirazi Muhammad; Kim, Min Young

    2018-01-01

    In this article, a multi-view registration approach for a 3D handheld profiling system based on the multiple-shot structured light technique is proposed. The approach comprises coarse registration followed by point cloud refinement using the iterative closest point (ICP) algorithm. Coarse registration of multiple point clouds was performed using relative orientation and translation parameters estimated via homography-based visual navigation. The proposed system was evaluated using an artificial human skull and a paper box. For quantitative evaluation of the accuracy of a single 3D scan, the paper box was reconstructed, and the mean errors in its height and breadth were found to be 9.4 μm and 23 μm, respectively. A comprehensive quantitative evaluation and comparison of the proposed algorithm against other variants of ICP was also performed. The root mean square error of the ICP algorithm in registering a pair of point clouds of the skull object was found to be less than 1 mm. PMID:29642552
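    Point-to-point ICP of the kind evaluated here reduces, at each iteration, to a nearest-neighbour matching step followed by a closed-form rigid alignment (the Kabsch/SVD solution). A compact NumPy sketch of that core loop, ours rather than the paper's implementation:

    ```python
    import numpy as np

    def best_rigid_transform(P, Q):
        """Least-squares rotation R and translation t with R @ P_i + t ~ Q_i (Kabsch)."""
        cp, cq = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cp).T @ (Q - cq)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # avoid reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = cq - R @ cp
        return R, t

    def icp(P, Q, iters=20):
        """Minimal point-to-point ICP: match each source point to its nearest target."""
        src = P.copy()
        for _ in range(iters):
            # brute-force nearest neighbours (fine for small clouds)
            d2 = ((src[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
            matched = Q[d2.argmin(axis=1)]
            R, t = best_rigid_transform(src, matched)
            src = src @ R.T + t
        return src
    ```

    The homography-based coarse registration in the paper plays the role of the initial guess: ICP converges to the correct alignment only when the clouds start close enough for the nearest-neighbour matches to be mostly correct.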

  10. Option generation in the treatment of unstable patients: An experienced-novice comparison study.

    PubMed

    Whyte, James; Pickett-Hauber, Roxanne; Whyte, Maria D

    2016-09-01

    There is a dearth of studies that quantitatively measure nurses' appreciation of stimuli and the subsequent generation of options in practice environments. The purpose of this paper was to examine nurses' ability to solve problems while quantifying the stimuli upon which they focus during patient care activities. The study used a quantitative descriptive method that gathered performance data from a simulated task environment using multi-angle video and audio. The videos were coded, and transcripts of all actions occurring in the scenario and of the participants' verbal reports were compiled. The results revealed a pattern of superiority in the experienced exemplar group. Novice actions were characterized by difficulty in following common protocols, inconsistency in evaluative approach, and a pattern of omission of key actions. The study provides support for deliberate-practice-based programs designed to facilitate higher-level performance in novices. © 2016 John Wiley & Sons Australia, Ltd.

  11. A new software for dimensional measurements in 3D endodontic root canal instrumentation.

    PubMed

    Sinibaldi, Raffaele; Pecci, Raffaella; Somma, Francesco; Della Penna, Stefania; Bedini, Rossella

    2012-01-01

    The main issue to be faced in obtaining size estimates of 3D modification of the dental canal after endodontic treatment is the co-registration of the image stacks obtained through micro computed tomography (micro-CT) scans before and after treatment. Here, quantitative analysis of micro-CT images was performed by means of new dedicated software targeted at the analysis of the root canal after endodontic instrumentation. This software analytically calculates the best superposition between the pre- and post-treatment structures using the inertia tensor of the tooth. This strategy avoids minimization procedures, which can be user-dependent and time-consuming. Once co-registration had been achieved, dimensional measurements were performed by simultaneous evaluation of quantitative parameters over the two superimposed stacks of micro-CT images. The software automatically calculated the changes in volume, surface and 3D symmetry axes occurring after instrumentation. The calculation is based on direct comparison of the canal and canal branches selected by the user on the pre-treatment image stack.
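    The appeal of the inertia-tensor approach is that a rigid body's centroid and principal axes can be recovered analytically by eigendecomposition, with no iterative minimization. A hedged NumPy sketch of that idea (ours; the software's actual implementation is not described beyond the abstract):

    ```python
    import numpy as np

    def inertia_tensor(pts):
        """Inertia tensor of unit-mass points about their centroid."""
        p = pts - pts.mean(axis=0)
        r2 = (p ** 2).sum(axis=1)
        return np.eye(3) * r2.sum() - p.T @ p

    def principal_frame(pts):
        """Centroid and principal axes (columns), ordered by ascending moment."""
        w, V = np.linalg.eigh(inertia_tensor(pts))  # symmetric eigendecomposition
        if np.linalg.det(V) < 0:                    # keep a right-handed frame
            V[:, -1] *= -1
        return pts.mean(axis=0), V

    def to_principal(pts):
        """Express the points in their own principal-axis frame."""
        c, V = principal_frame(pts)
        return (pts - c) @ V
    ```

    Expressing both the pre- and post-treatment voxel sets in their principal frames superimposes them analytically, since the principal moments are invariant under the rigid motion between the two scans (axis sign/ordering ambiguities still need resolving in practice).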

  12. Neutron activation analysis: A primary method of measurement

    NASA Astrophysics Data System (ADS)

    Greenberg, Robert R.; Bode, Peter; De Nadai Fernandes, Elisabete A.

    2011-03-01

    Neutron activation analysis (NAA), based on the comparator method, has the potential to fulfill the requirements of a primary ratio method as defined in 1998 by the Comité Consultatif pour la Quantité de Matière — Métrologie en Chimie (CCQM, Consultative Committee on Amount of Substance — Metrology in Chemistry). This thesis is evidenced in this paper in three chapters by: demonstration that the method is fully physically and chemically understood; that a measurement equation can be written down in which the values of all parameters have dimensions in SI units and thus having the potential for metrological traceability to these units; that all contributions to uncertainty of measurement can be quantitatively evaluated, underpinning the metrological traceability; and that the performance of NAA in CCQM key-comparisons of trace elements in complex matrices between 2000 and 2007 is similar to the performance of Isotope Dilution Mass Spectrometry (IDMS), which had been formerly designated by the CCQM as a primary ratio method.
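    As a sketch of what such a measurement equation looks like, a simplified form of the relative (comparator) NAA equation, ignoring flux-gradient, geometry and irradiation-timing corrections, is:

    ```latex
    c_x \;=\; c_s \cdot \frac{(A_x / m_x)\, e^{\lambda t_{d,x}}}{(A_s / m_s)\, e^{\lambda t_{d,s}}}
    ```

    where \(c\) is the analyte mass fraction, \(A\) the net gamma peak count rate, \(m\) the mass, \(\lambda\) the decay constant and \(t_d\) the decay time, with subscripts \(x\) and \(s\) for sample and comparator standard; in the full equation each correction factor carries its own quantifiable uncertainty contribution, which is what underpins the claim of metrological traceability.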

  13. Keratinocyte growth factor and the expression of wound-healing-related genes in primary human keratinocytes from burn patients.

    PubMed

    Chomiski, Verônica; Gragnani, Alfredo; Bonucci, Jéssica; Correa, Silvana Aparecida Alves; Noronha, Samuel Marcos Ribeiro de; Ferreira, Lydia Masako

    2016-08-01

    To evaluate the effect of keratinocyte growth factor (KGF) treatment on the expression of wound-healing-related genes in cultured keratinocytes from burn patients. Keratinocytes were cultured and divided into 4 groups (n=4 in each group): TKB (KGF-treated keratinocytes from burn patients), UKB (untreated keratinocytes from burn patients), TKC (KGF-treated keratinocytes from controls), and UKC (untreated keratinocytes from controls). Gene expression analysis using quantitative polymerase chain reaction (qPCR) array was performed to compare (1) TKC versus UKC, (2) UKB versus UKC, (3) TKB versus UKC, (4) TKB versus UKB, (5) TKB versus TKC, and (6) UKB versus TKC. Comparison 1 showed one down-regulated and one up-regulated gene; comparisons 2 and 3 resulted in the same five down-regulated genes; comparison 4 had no significant difference in relative gene expression; comparison 5 showed 26 down-regulated and 7 up-regulated genes; and comparison 6 showed 25 down-regulated and 11 up-regulated genes. There was no differential expression of wound-healing-related genes in cultured primary keratinocytes from burn patients treated with keratinocyte growth factor.
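    Relative expression in qPCR-array comparisons of this kind is commonly reported as a fold change via the Livak 2^(-ΔΔCt) method; the abstract does not state the exact normalization used, so the following is an illustrative sketch only.

    ```python
    def fold_change(ct_target_treated, ct_ref_treated,
                    ct_target_control, ct_ref_control):
        """Livak 2^-ddCt relative expression: treated vs. control,
        each normalized to a reference (housekeeping) gene."""
        d_ct_treated = ct_target_treated - ct_ref_treated
        d_ct_control = ct_target_control - ct_ref_control
        dd_ct = d_ct_treated - d_ct_control
        return 2.0 ** (-dd_ct)
    ```

    A fold change above 1 indicates up-regulation in the treated group; because Ct is a cycle count, each unit of ΔΔCt corresponds to a two-fold expression difference.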

  14. Deep Learning for Magnetic Resonance Fingerprinting: A New Approach for Predicting Quantitative Parameter Values from Time Series.

    PubMed

    Hoppe, Elisabeth; Körzdörfer, Gregor; Würfl, Tobias; Wetzl, Jens; Lugauer, Felix; Pfeuffer, Josef; Maier, Andreas

    2017-01-01

    The purpose of this work is to evaluate methods from deep learning for application to Magnetic Resonance Fingerprinting (MRF). MRF is a recently proposed measurement technique for generating quantitative parameter maps. In MRF, a non-steady-state signal is generated by a pseudo-random excitation pattern. A comparison of the measured signal in each voxel with the physical model yields quantitative parameter maps. Currently, the comparison is done by matching a dictionary of simulated signals to the acquired signals. To accelerate the computation of quantitative maps, we train a Convolutional Neural Network (CNN) on simulated dictionary data. As a proof of principle, we show that the neural network implicitly encodes the dictionary and can replace the matching process.
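    The dictionary-matching step that the CNN replaces is essentially a maximum normalized inner product search over simulated fingerprints. A minimal NumPy sketch with synthetic exponential signals (ours, not the authors' pipeline):

    ```python
    import numpy as np

    def match_dictionary(signals, dictionary, params):
        """For each measured fingerprint (row of signals), return the parameters
        of the dictionary entry with the highest normalized inner product."""
        D = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
        S = signals / np.linalg.norm(signals, axis=1, keepdims=True)
        best = np.abs(S @ D.T).argmax(axis=1)   # abs() tolerates sign/phase flips
        return params[best]
    ```

    The cost of this search grows with the dictionary size and the number of voxels, which is the motivation for replacing it with a single forward pass of a trained network.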

  15. Spatially Regularized Machine Learning for Task and Resting-state fMRI

    PubMed Central

    Song, Xiaomu; Panych, Lawrence P.; Chen, Nan-kuei

    2015-01-01

    Background: Reliable mapping of brain function across sessions and/or subjects in task- and resting-state has been a critical challenge for quantitative fMRI studies, although it has been intensively addressed in the past decades. New Method: A spatially regularized support vector machine (SVM) technique was developed for reliable brain mapping in task- and resting-state. Unlike most existing SVM-based brain mapping techniques, which implement supervised classification of specific brain functional states or disorders, the proposed method performs a semi-supervised classification for general brain function mapping in which the spatial correlation of fMRI is integrated into the SVM learning. The method can adapt to intra- and inter-subject variations induced by fMRI nonstationarity, and identify a true boundary between active and inactive voxels, or between functionally connected and unconnected voxels, in a feature space. Results: The method was evaluated using synthetic and experimental data at the individual and group level. Multiple features were evaluated in terms of their contributions to the spatially regularized SVM learning. Reliable mapping results in both task- and resting-state were obtained from individual subjects and at the group level. Comparison with Existing Methods: A comparison study was performed with independent component analysis, general linear model, and correlation analysis methods. Experimental results indicate that the proposed method can provide better or comparable mapping performance at the individual and group level. Conclusions: The proposed method provides accurate and reliable mapping of brain function in task- and resting-state, and is applicable to a variety of quantitative fMRI studies. PMID:26470627

  16. Flow cytometric immunobead assay for quantitative detection of platelet autoantibodies in immune thrombocytopenia patients.

    PubMed

    Zhai, Juping; Ding, Mengyuan; Yang, Tianjie; Zuo, Bin; Weng, Zhen; Zhao, Yunxiao; He, Jun; Wu, Qingyu; Ruan, Changgeng; He, Yang

    2017-10-23

    Platelet autoantibody detection is critical for immune thrombocytopenia (ITP) diagnosis and prognosis. We therefore aimed to establish a quantitative flow cytometric immunobead assay (FCIA) for the evaluation of ITP platelet autoantibodies. Capture microbeads coupled with anti-GPIX, -GPIb, -GPIIb, -GPIIIa and -P-selectin antibodies were used to bind the platelet-bound autoantibody complexes generated from plasma samples of 250 ITP patients, 163 non-ITP patients and 243 healthy controls; a fluorescein isothiocyanate (FITC)-conjugated secondary antibody was the detector reagent, and mean fluorescence intensity (MFI) signals were recorded by flow cytometry. Intra- and inter-assay variations of the quantitative FCIA assay were assessed, and its specificity, sensitivity and accuracy were compared with those of the qualitative FCIA and the monoclonal antibody immobilization of platelet antigens (MAIPA) assay. Finally, the treatment process was monitored by quantitative FCIA in 8 newly diagnosed ITP patients. The coefficients of variation (CV) of the quantitative FCIA assay were 9.4%, 3.8%, 5.4%, 5.1% and 5.8% for the anti-GPIX, -GPIb, -GPIIIa, -GPIIb and -P-selectin autoantibodies, respectively. Elevated levels of autoantibodies against platelet glycoproteins GPIX, GPIb, GPIIIa, GPIIb and P-selectin were detected by quantitative FCIA in ITP patients compared with non-ITP patients or healthy controls. The sensitivity, specificity and accuracy of the quantitative assay were 73.13%, 81.98% and 78.65%, respectively, when combining all 5 autoantibodies, versus 41.46%, 90.41% and 72.81% for the MAIPA assay. A quantitative FCIA assay was thus established, and reduced platelet autoantibody levels could be confirmed in ITP patients after corticosteroid treatment. The quantitative assay is useful not only for ITP diagnosis but also for monitoring ITP treatment.

  17. Cost benefit analysis of the transfer of NASA remote sensing technology to the state of Georgia

    NASA Technical Reports Server (NTRS)

    Zimmer, R. P. (Principal Investigator); Wilkins, R. D.; Kelly, D. L.; Brown, D. M.

    1977-01-01

    The author has identified the following significant results. First-order benefits can generally be quantified, allowing quantitative comparisons of candidate land cover data systems. A meaningful dollar evaluation of LANDSAT can be made by a cost comparison with equally effective data systems. Users of LANDSAT data can be usefully categorized as performing three general functions: planning, permitting, and enforcing. The value of LANDSAT data to the State of Georgia is most sensitive to three parameters: discount rate, digitization cost, and photo acquisition cost. Under a constrained budget, LANDSAT could provide digitized land cover information roughly seven times more frequently than could otherwise be obtained. Thus, the services derived from LANDSAT data have a positive net present value in comparison to the baseline system, and under a constrained budget LANDSAT could also provide information more frequently than the alternatives.

  18. Evaluation of CAMEL - comprehensive areal model of earthquake-induced landslides

    USGS Publications Warehouse

    Miles, S.B.; Keefer, D.K.

    2009-01-01

    A new comprehensive areal model of earthquake-induced landslides (CAMEL) has been developed to assist planning decisions related to disaster risk reduction. CAMEL provides an integrated framework for modeling all types of earthquake-induced landslides using fuzzy logic systems and geographic information systems. CAMEL is designed to facilitate quantitative and qualitative representation of terrain conditions and of knowledge about how these conditions affect the likely areal concentration of each landslide type. CAMEL has been empirically evaluated with respect to disrupted landslides (Category I) using a case study of the 1989 M = 6.9 Loma Prieta, CA earthquake. In this case, CAMEL performs best for disrupted slides and falls in soil; for disrupted rock falls and slides, its performance was slightly poorer. The model predicted a low occurrence of rock avalanches, when none in fact occurred. A similar comparison with the Loma Prieta case study was also conducted using a simplified Newmark displacement model. The area-under-the-curve method of evaluation was used to draw comparisons between the two models, revealing improved performance with CAMEL. CAMEL should not, however, be viewed as a strict alternative to Newmark displacement models: CAMEL can be used to integrate Newmark displacements with other, previously incompatible, types of knowledge. © 2008 Elsevier B.V.

  19. Usefulness of quantitative susceptibility mapping for the diagnosis of Parkinson disease.

    PubMed

    Murakami, Y; Kakeda, S; Watanabe, K; Ueda, I; Ogasawara, A; Moriya, J; Ide, S; Futatsuya, K; Sato, T; Okada, K; Uozumi, T; Tsuji, S; Liu, T; Wang, Y; Korogi, Y

    2015-06-01

    Quantitative susceptibility mapping overcomes several nonlocal restrictions of susceptibility-weighted and phase imaging and enables quantification of magnetic susceptibility. We compared the diagnostic accuracy of quantitative susceptibility mapping and R2* (1/T2*) mapping for discriminating between patients with Parkinson disease and controls. For 21 patients with Parkinson disease and 21 age- and sex-matched controls, 2 radiologists measured the quantitative susceptibility mapping values and R2* values in 6 brain structures (the thalamus, putamen, caudate nucleus, pallidum, substantia nigra, and red nucleus). The quantitative susceptibility mapping and R2* values of the substantia nigra were significantly higher in patients with Parkinson disease (P < .01); measurements in the other brain regions did not differ significantly between patients and controls. For discriminating patients with Parkinson disease from controls, receiver operating characteristic analysis suggested that the optimal cutoff values for the substantia nigra, based on the Youden index, were >0.210 for quantitative susceptibility mapping and >28.8 for R2*. The sensitivity, specificity, and accuracy of quantitative susceptibility mapping were 90% (19 of 21), 86% (18 of 21), and 88% (37 of 42), respectively; for R2* mapping, they were 81% (17 of 21), 52% (11 of 21), and 67% (28 of 42). Pairwise comparisons showed that the areas under the receiver operating characteristic curves were significantly larger for quantitative susceptibility mapping than for R2* mapping (0.91 versus 0.69, P < .05). Quantitative susceptibility mapping thus showed higher diagnostic performance than R2* mapping for discriminating between patients with Parkinson disease and controls. © 2015 by American Journal of Neuroradiology.
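    The Youden-index cutoff selection used above can be reproduced in a few lines of NumPy. The sketch below is illustrative (ours, with synthetic values), calling a case positive when its measured value exceeds the cutoff, as in the abstract's ">0.210" criterion:

    ```python
    import numpy as np

    def youden_cutoff(patients, controls):
        """Scan candidate cutoffs and return (cutoff, sensitivity, specificity)
        at the maximum of Youden's J = sensitivity + specificity - 1."""
        pats = np.asarray(patients, float)
        ctrls = np.asarray(controls, float)
        cands = np.unique(np.concatenate([pats, ctrls]))
        sens = (pats[None, :] > cands[:, None]).mean(axis=1)
        spec = (ctrls[None, :] <= cands[:, None]).mean(axis=1)
        j = sens + spec - 1.0
        i = j.argmax()
        return cands[i], sens[i], spec[i]
    ```

    With the study's substantia nigra measurements, this procedure yielded the reported cutoff of 0.210 with 90% sensitivity and 86% specificity.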

  20. Comparative Validation of Five Quantitative Rapid Test Kits for the Analysis of Salt Iodine Content: Laboratory Performance, User- and Field-Friendliness

    PubMed Central

    Rohner, Fabian; Kangambèga, Marcelline O.; Khan, Noor; Kargougou, Robert; Garnier, Denis; Sanou, Ibrahima; Ouaro, Bertine D.; Petry, Nicolai; Wirth, James P.; Jooste, Pieter

    2015-01-01

    Background: Iodine deficiency has important health and development consequences, and the introduction of iodized salt as national programs has been a great public health success in the past decades. To render national salt iodization programs sustainable and ensure adequate iodization levels, simple methods to quantitatively assess whether salt is adequately iodized are required. Several methods claim to be simple and reliable, and are available on the market or in development. Objective: This work validated the currently available quantitative rapid test kits (quantRTK) in a comparative manner, for both laboratory performance and ease of use in field settings. Methods: Laboratory performance parameters (linearity, detection and quantification limits, intra- and inter-assay imprecision) were determined for 5 quantRTK. We assessed inter-operator imprecision using salt of different quality, along with a comparison of 59 salt samples from across the globe; measurements were made in both a laboratory and a field setting by technicians and non-technicians. Results from the quantRTK were compared against iodometric titration for validity. An 'ease-of-use' rating system was developed to identify the most suitable quantRTK for a given task. Results: Most of the devices showed acceptable laboratory performance, but for some, use by non-technicians revealed poorer performance under routine conditions. Of the quantRTK tested, the iCheck® and I-Reader® showed the most consistent performance and ease of use, and a newly developed paper-based method (saltPAD) holds promise if developed further. Conclusions: User- and field-friendly devices are now available, and the most appropriate quantRTK can be selected depending on the number of samples and the budget available. PMID:26401655
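    Two of the laboratory parameters assessed, the detection and quantification limits, are commonly estimated from a linear calibration as 3.3·σ/S and 10·σ/S (the ICH-style formulas, with σ the residual standard deviation and S the slope). A hedged sketch of that estimate, not the study's actual procedure:

    ```python
    import numpy as np

    def calibration_limits(conc, response):
        """LOD = 3.3*sd/slope and LOQ = 10*sd/slope from a straight-line fit,
        with sd the residual standard deviation of the calibration."""
        conc = np.asarray(conc, float)
        response = np.asarray(response, float)
        slope, intercept = np.polyfit(conc, response, 1)
        sd = (response - (slope * conc + intercept)).std(ddof=2)  # 2 fitted params
        return 3.3 * sd / slope, 10.0 * sd / slope
    ```

    Linearity would be checked on the same calibration data (e.g., via the correlation coefficient and residual pattern), and intra-/inter-assay imprecision as the CV of repeated measurements.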

  1. Numerical and Qualitative Contrasts of Two Statistical Models ...

    EPA Pesticide Factsheets

    Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and products. This study provided an empirical and qualitative comparison of both models using 29 years of data for two discrete time series of chlorophyll-a (chl-a) in the Patuxent River estuary. Empirical descriptions of each model were based on predictive performance against the observed data, ability to reproduce flow-normalized trends with simulated data, and performance on validation datasets. Between-model differences were apparent but minor, and both models had comparable abilities to remove flow effects from simulated time series. Both models similarly predicted observations for missing data with different characteristics. Trends from each model revealed distinct mainstem influences of the Chesapeake Bay, with both models predicting a roughly 65% increase in chl-a over time in the lower estuary, whereas flow-normalized predictions for the upper estuary showed a more dynamic pattern, with a nearly 100% increase in chl-a in the last 10 years. Qualitative comparisons highlighted important differences in statistical structure, available products, and the characteristics of the data and desired analysis. This manuscript describes a quantitative comparison of two recently-developed statistical models for evaluating water quality trends.

  2. Experimental comparison of landmark-based methods for 3D elastic registration of pre- and postoperative liver CT data

    NASA Astrophysics Data System (ADS)

    Lange, Thomas; Wörz, Stefan; Rohr, Karl; Schlag, Peter M.

    2009-02-01

    The qualitative and quantitative comparison of pre- and postoperative image data is an important possibility to validate surgical procedures, in particular, if computer assisted planning and/or navigation is performed. Due to deformations after surgery, partially caused by the removal of tissue, a non-rigid registration scheme is a prerequisite for a precise comparison. Interactive landmark-based schemes are a suitable approach, if high accuracy and reliability is difficult to achieve by automatic registration approaches. Incorporation of a priori knowledge about the anatomical structures to be registered may help to reduce interaction time and improve accuracy. Concerning pre- and postoperative CT data of oncological liver resections the intrahepatic vessels are suitable anatomical structures. In addition to using branching landmarks for registration, we here introduce quasi landmarks at vessel segments with high localization precision perpendicular to the vessels and low precision along the vessels. A comparison of interpolating thin-plate splines (TPS), interpolating Gaussian elastic body splines (GEBS) and approximating GEBS on landmarks at vessel branchings as well as approximating GEBS on the introduced vessel segment landmarks is performed. It turns out that the segment landmarks provide registration accuracies as good as branching landmarks and can improve accuracy if combined with branching landmarks. For a low number of landmarks segment landmarks are even superior.

  3. A Revised Validation Process for Ice Accretion Codes

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Porter, Christopher E.

    2017-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. This report will present results from the latest LEWICE release, version 3.5. This program differs from previous releases in its ability to model mixed phase and ice crystal conditions such as those encountered inside an engine. It also has expanded capability to use structured grids and a new capability to use results from unstructured grid flow solvers. A quantitative comparison of the results against a database of ice shapes that have been generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. This paper will extend the comparison of ice shapes between LEWICE 3.5 and experimental data from a previous paper. Comparisons of lift and drag are made between experimentally collected data from experimentally obtained ice shapes and simulated (CFD) data on simulated (LEWICE) ice shapes. Comparisons are also made between experimentally collected and simulated performance data on select experimental ice shapes to ensure the CFD solver, FUN3D, is valid within the flight regime. The results show that the predicted results are within the accuracy limits of the experimental data for the majority of cases.

  4. Validation Process for LEWICE by Use of a Navier-Stokes Solver

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Porter, Christopher E.

    2017-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. This report will present results from the latest LEWICE release, version 3.5. This program differs from previous releases in its ability to model mixed phase and ice crystal conditions such as those encountered inside an engine. It also has expanded capability to use structured grids and a new capability to use results from unstructured grid flow solvers. A quantitative comparison of the results against a database of ice shapes that have been generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. This paper will extend the comparison of ice shapes between LEWICE 3.5 and experimental data from a previous paper. Comparisons of lift and drag are made between experimentally collected data from experimentally obtained ice shapes and simulated (CFD) data on simulated (LEWICE) ice shapes. Comparisons are also made between experimentally collected and simulated performance data on select experimental ice shapes to ensure the CFD solver, FUN3D, is valid within the flight regime. The results show that the predicted results are within the accuracy limits of the experimental data for the majority of cases.

  5. Estimates of occupational safety and health impacts resulting from large-scale production of major photovoltaic technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, T.; Ungers, L.; Briggs, T.

    1980-08-01

    The purpose of this study is to estimate, both quantitatively and qualitatively, the worker and societal risks attributable to four photovoltaic cell (solar cell) production processes. Quantitative risk values were determined by use of statistics from the California semiconductor industry. The qualitative risk assessment was performed using a variety of both governmental and private sources of data. The occupational health statistics derived from the semiconductor industry were used to predict injury and fatality levels associated with photovoltaic cell manufacturing. The use of these statistics to characterize the two silicon processes described herein is defensible from the standpoint that many of the same process steps and materials are used in both the semiconductor and photovoltaic industries. These health statistics are less applicable to the gallium arsenide and cadmium sulfide manufacturing processes, primarily because of differences in the materials utilized. Although such differences tend to discourage any absolute comparisons among the four photovoltaic cell production processes, certain relative comparisons are warranted. To facilitate a risk comparison of the four processes, the number and severity of process-related chemical hazards were assessed. This qualitative hazard assessment addresses both the relative toxicity and the exposure potential of substances in the workplace. In addition to the worker-related hazards, estimates of process-related emissions and wastes are also provided.

  6. The photospheric magnetic flux budget

    NASA Technical Reports Server (NTRS)

    Schrijver, C. J.; Harvey, K. L.

    1994-01-01

    The ensemble of bipolar regions and the magnetic network both contain a substantial and strongly variable part of the photospheric magnetic flux at any phase in the solar cycle. The time-dependent distribution of the magnetic flux over and within these components reflects the action of the dynamo operating in the solar interior. We perform a quantitative comparison of the flux emerging in the ensemble of magnetic bipoles with the observed flux content of the solar photosphere. We discuss the photospheric flux budget in terms of flux appearance and disappearance, and argue that a nonlinear dependence exists between the flux present in the photosphere and the rate of flux appearance and disappearance. In this context, we discuss the problem of making quantitative statements about dynamos in cool stars other than the Sun.

  7. AutoQSAR: an automated machine learning tool for best-practice quantitative structure-activity relationship modeling.

    PubMed

    Dixon, Steven L; Duan, Jianxin; Smith, Ethan; Von Bargen, Christopher D; Sherman, Woody; Repasky, Matthew P

    2016-10-01

    We introduce AutoQSAR, an automated machine-learning application to build, validate and deploy quantitative structure-activity relationship (QSAR) models. The process of descriptor generation, feature selection and the creation of a large number of QSAR models has been automated into a single workflow within AutoQSAR. The models are built using a variety of machine-learning methods, and each model is scored using a novel approach. Effectiveness of the method is demonstrated through comparison with literature QSAR models using identical datasets for six end points: protein-ligand binding affinity, solubility, blood-brain barrier permeability, carcinogenicity, mutagenicity and bioaccumulation in fish. AutoQSAR demonstrates similar or better predictive performance as compared with published results for four of the six endpoints while requiring minimal human time and expertise.
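
    AutoQSAR itself is proprietary, but the workflow the abstract describes — descriptor generation, feature selection, building many models, and scoring them to pick a winner — can be sketched generically. The example below is a minimal stand-in using synthetic descriptors, correlation-based feature selection, and a small grid of ridge-regression models scored by held-out R²; none of it reflects AutoQSAR's actual descriptors, learners, or scoring function.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dataset: 200 compounds x 50 descriptors; activity depends on 5 of them
X = rng.normal(size=(200, 50))
w_true = np.zeros(50)
w_true[:5] = [1.5, -1.0, 0.8, 0.5, -0.7]
y = X @ w_true + rng.normal(0, 0.3, 200)

# Feature selection: keep the k descriptors most correlated with activity
k = 10
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
keep = np.argsort(corr)[-k:]

# Train/test split, then score ridge models over a grid of penalties
train, test = np.arange(150), np.arange(150, 200)

def fit_ridge(A, b, lam):
    # Closed-form ridge solution: (A'A + lam*I)^-1 A'b
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

scores = {}
for lam in (0.01, 1.0, 100.0):
    w = fit_ridge(X[np.ix_(train, keep)], y[train], lam)
    pred = X[np.ix_(test, keep)] @ w
    ss_res = np.sum((y[test] - pred) ** 2)
    ss_tot = np.sum((y[test] - y[test].mean()) ** 2)
    scores[lam] = 1 - ss_res / ss_tot        # held-out R^2

best_lam = max(scores, key=scores.get)        # "best model" by validation score
```

    The automation is the point: every step from descriptor pruning to model ranking runs without human intervention, which is what lets such tools approach published models at a fraction of the expert time.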

  8. A Taylor weak-statement algorithm for hyperbolic conservation laws

    NASA Technical Reports Server (NTRS)

    Baker, A. J.; Kim, J. W.

    1987-01-01

    Finite element analysis, applied to computational fluid dynamics (CFD) problem classes, presents a formal procedure for establishing the ingredients of a discrete approximation numerical solution algorithm. A classical Galerkin weak-statement formulation, formed on a Taylor series extension of the conservation law system, is developed herein that embeds a set of parameters eligible for constraint according to specification of suitable norms. The derived family of Taylor weak statements is shown to contain, as special cases, over one dozen independently derived CFD algorithms published over the past several decades for the high speed flow problem class. A theoretical analysis is completed that facilitates direct qualitative comparisons. Numerical results for definitive linear and nonlinear test problems permit direct quantitative performance comparisons.
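
    The construction the abstract refers to can be sketched for a single 1-D conservation law. This is a schematic of the Taylor-series-plus-Galerkin idea with one illustrative embedded parameter β, not the paper's full parameterized family:

```latex
% 1-D conservation law: \partial_t q + \partial_x f(q) = 0, with Jacobian A = \partial f/\partial q.
% Taylor extension in time, substituting q_t = -\partial_x f and
% q_{tt} = \partial_x (A\, \partial_x f):
q^{n+1} = q^{n} - \Delta t\, \partial_x f
        + \beta\, \frac{\Delta t^{2}}{2}\, \partial_x\!\left(A\, \partial_x f\right) + \cdots
% Galerkin weak statement: require the residual to be orthogonal to every test function w_h:
\int_{\Omega} w_h \left[\, q^{n+1} - q^{n} + \Delta t\, \partial_x f
        - \beta\, \frac{\Delta t^{2}}{2}\, \partial_x\!\left(A\, \partial_x f\right) \right] dx = 0
```

    Particular choices of the embedded parameters recover familiar schemes (e.g., Lax-Wendroff-type dissipation), which is how the family subsumes the independently derived algorithms mentioned above.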

  9. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

    The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems which involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations with computationally expensive analyses including multiple disciplines. We propose a new testing procedure that can provide a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework, OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how using open source soft- ware development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  10. Internal tibial torsion correction study. [measurements of strain for corrective rotation of stressed tibia

    NASA Technical Reports Server (NTRS)

    Cantu, J. M.; Madigan, C. M.

    1974-01-01

    A quantitative study of internal torsion in the entire tibial bone was performed by using strain gauges to measure the amount of deformation occurring at different locations. Comparison of strain measurements with physical dimensions of the bone produced the modulus of rigidity and its behavior under increased torque. Computerized analysis of the stress distribution shows that more strain occurs near the torqued ends of the bones, where most of the twisting and fracturing also takes place.
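
    The quantity recovered from such measurements follows from the elastic torsion relation θ = TL/(JG), rearranged for the modulus of rigidity G. A minimal sketch with purely illustrative numbers (none taken from the study), idealizing the bone section as a hollow circular shaft:

```python
import math

# Torsion-test relation: theta = T*L / (J*G), so G = T*L / (J*theta).
# All values below are illustrative, not measurements from the study.
T = 10.0                    # applied torque, N*m
L = 0.30                    # gauge length of the shaft, m
r_o, r_i = 0.012, 0.006     # outer/inner radii of an idealized hollow section, m
theta = math.radians(2.0)   # measured twist over the gauge length, rad

J = math.pi * (r_o**4 - r_i**4) / 2   # polar moment of area, m^4
G = T * L / (J * theta)               # modulus of rigidity, Pa
```

    With strain gauges at several stations, G can be evaluated per station, and its drift with increasing torque indicates departure from linear-elastic behavior.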

  11. Energy deposition dynamics of femtosecond pulses in water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minardi, Stefano, E-mail: stefano@stefanominardi.eu; Pertsch, Thomas; Milián, Carles

    2014-12-01

    We exploit inverse Raman scattering and solvated electron absorption to perform a quantitative characterization of the energy loss and ionization dynamics in water with tightly focused near-infrared femtosecond pulses. A comparison between experimental data and numerical simulations suggests that the ionization energy of water is 8 eV, rather than the commonly used value of 6.5 eV. We also introduce an equation for the Raman gain valid for ultra-short pulses that validates our experimental procedure.

  12. Evaluation of the clinical sensitivity for the quantification of human immunodeficiency virus type 1 RNA in plasma: Comparison of the new COBAS TaqMan HIV-1 with three current HIV-RNA assays--LCx HIV RNA quantitative, VERSANT HIV-1 RNA 3.0 (bDNA) and COBAS AMPLICOR HIV-1 Monitor v1.5.

    PubMed

    Katsoulidou, Antigoni; Petrodaskalaki, Maria; Sypsa, Vana; Papachristou, Eleni; Anastassopoulou, Cleo G; Gargalianos, Panagiotis; Karafoulidou, Anastasia; Lazanas, Marios; Kordossis, Theodoros; Andoniadou, Anastasia; Hatzakis, Angelos

    2006-02-01

    The COBAS TaqMan HIV-1 test (Roche Diagnostics) was compared with the LCx HIV RNA quantitative assay (Abbott Laboratories), the Versant HIV-1 RNA 3.0 (bDNA) assay (Bayer) and the COBAS Amplicor HIV-1 Monitor v1.5 test (Roche Diagnostics), using plasma samples of various viral load levels from HIV-1-infected individuals. In the comparison of TaqMan with LCx, TaqMan identified as positive 77.5% of the 240 samples versus 72.1% identified by LCx assay, while their overall agreement was 94.6% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.91). Similarly, in the comparison of TaqMan with bDNA 3.0, both methods identified 76.3% of the 177 samples as positive, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.95). Finally, in the comparison of TaqMan with Monitor v1.5, TaqMan identified 79.5% of the 156 samples as positive versus 80.1% identified by Monitor v1.5, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.96). In conclusion, the new COBAS TaqMan HIV-1 test showed excellent agreement with other widely used commercially available tests for the quantitation of HIV-1 viral load.
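
    The two summary statistics used throughout this comparison — overall agreement on positivity and Pearson correlation of the quantitative results for doubly-positive samples — can be computed as below. The paired log10 viral loads are simulated with hypothetical detection limits; the numbers are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical paired log10 viral loads from two assays on the same specimens;
# np.nan marks "below detection" for that assay.
true_vl = rng.uniform(1.0, 6.0, 240)
a = np.where(true_vl > 1.7, true_vl + rng.normal(0, 0.2, 240), np.nan)
b = np.where(true_vl > 1.8, true_vl + rng.normal(0, 0.2, 240), np.nan)

pos_a, pos_b = ~np.isnan(a), ~np.isnan(b)
agreement = np.mean(pos_a == pos_b) * 100   # overall % agreement on positivity

both = pos_a & pos_b                        # samples positive by both methods
r = np.corrcoef(a[both], b[both])[0, 1]     # correlation of quantitative results
```

    Discordance concentrates near the detection limits, which is why the study reports agreement on positivity separately from the correlation of the quantitative values.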

  13. COMPARISON OF GENETIC METHODS TO OPTICAL METHODS IN THE IDENTIFICATION AND ASSESSMENT OF MOLD IN THE BUILT ENVIRONMENT -- COMPARISON OF TAQMAN AND MICROSCOPIC ANALYSIS OF CLADOSPORIUM SPORES RETRIEVED FROM ZEFON AIR-O-CELL TRACES

    EPA Science Inventory

    Recent advances in the sequencing of relevant water intrusion fungi by the EPA, combined with the development of probes and primers have allowed for the unequivocal quantitative and qualitative identification of fungi in selected matrices.

    In this pilot study, quantitative...

  14. Quantitative Analysis and Comparison of Four Major Flavonol Glycosides in the Leaves of Toona sinensis (A. Juss.) Roemer (Chinese Toon) from Various Origins by High-Performance Liquid Chromatography-Diode Array Detector and Hierarchical Clustering Analysis

    PubMed Central

    Sun, Xiaoxiang; Zhang, Liting; Cao, Yaqi; Gu, Qinying; Yang, Huan; Tam, James P.

    2016-01-01

    Background: Toona sinensis (A. Juss.) Roemer is an endemic species of Toona genus native to Asian area. Its dried leaves are applied in the treatment of many diseases; however, few investigations have been reported for the quantitative analysis and comparison of major bioactive flavonol glycosides in the leaves harvested from various origins. Objective: To quantitatively analyze four major flavonol glycosides including rutinoside, quercetin-3-O-β-D-glucoside, quercetin-3-O-α-L-rhamnoside, and kaempferol-3-O-α-L-rhamnoside in the leaves from different production sites and classify them according to the content of these glycosides. Materials and Methods: A high-performance liquid chromatography-diode array detector (HPLC-DAD) method for their simultaneous determination was developed and validated for linearity, precision, accuracy, stability, and repeatability. Moreover, the method established was then employed to explore the difference in the content of these four glycosides in raw materials. Finally, a hierarchical clustering analysis was performed to classify 11 voucher specimens. Results: The separation was performed on a Waters XBridge Shield RP18 column (150 mm × 4.6 mm, 3.5 μm) kept at 35°C, and acetonitrile and H2O containing 0.30% trifluoroacetic acid as mobile phase was driven at 1.0 mL/min during the analysis. Ten microliters of solution were injected and 254 nm was selected to monitor the separation. A strong linear relationship between the peak area and concentration of four analytes was observed. And, the method was also validated to be repeatable, stable, precise, and accurate. Conclusion: An efficient and reliable HPLC-DAD method was established and applied in the assays for the samples from 11 origins successfully. Moreover, the content of those flavonol glycosides varied much among different batches, and the flavonoids could be considered as biomarkers to control the quality of Chinese Toon. 
SUMMARY Four major flavonol glycosides in the leaves of Toona sinensis were determined by HPLC-DAD and their contents were compared among various origins by HCA. Abbreviations used: HPLC-DAD: High-performance liquid chromatography-diode array detector, HCA: Hierarchical clustering analysis, MS: Mass spectrometry, RSD: Relative standard deviation. PMID:27279719

  15. Assessment of simple colorimetric procedures to determine smoking status of diabetic subjects.

    PubMed

    Smith, R F; Mather, H M; Ellard, G A

    1998-02-01

    The performance of a simple colorimetric assay for urinary nicotine metabolites to assess smoking status in diabetic subjects (n = 251) was investigated. Several variations of the colorimetric assay and a qualitative extraction procedure were evaluated in comparison with a cotinine immunoassay as the "gold standard." Among these, the best overall performance was achieved with the qualitative test (sensitivity 95%; specificity 100%). The quantitative measurement of total nicotine metabolites performed less well (sensitivity 92%; specificity 97%) but could be improved by incorporating a blank extraction (sensitivity 98%; specificity 98%). Allowance for diuresis appeared to offer no advantage over the other methods. These results support previous findings regarding the use of these colorimetric procedures in nondiabetic subjects and, contrary to other recent observations, their performance was not impaired in diabetic patients.
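
    Sensitivity and specificity as reported here reduce to simple confusion-matrix ratios against the gold standard. A minimal sketch (the 19/20 and 30/30 counts are illustrative, chosen only to reproduce the abstract's 95%/100% figures for the qualitative test):

```python
def sensitivity_specificity(test_pos, gold_pos):
    """Sensitivity and specificity (%) of a test against a gold standard.
    Both arguments are equal-length sequences of booleans."""
    pairs = list(zip(test_pos, gold_pos))
    tp = sum(t and g for t, g in pairs)            # true positives
    tn = sum(not t and not g for t, g in pairs)    # true negatives
    fn = sum(not t and g for t, g in pairs)        # missed smokers
    fp = sum(t and not g for t, g in pairs)        # false alarms
    return 100 * tp / (tp + fn), 100 * tn / (tn + fp)

# Illustrative: 20 gold-standard smokers, 19 detected; 30 nonsmokers, all negative
gold = [True] * 20 + [False] * 30
test = [True] * 19 + [False] * 1 + [False] * 30
sens, spec = sensitivity_specificity(test, gold)   # -> 95.0, 100.0
```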

  16. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  17. Quantitative Measurements of Nitric Oxide Concentration in High-Pressure, Swirl-Stabilized Spray Flames

    NASA Technical Reports Server (NTRS)

    Cooper, Clayton S.; Laurendeau, Normand M.; Hicks, Yolanda R. (Technical Monitor)

    2000-01-01

    Lean direct-injection (LDI) spray flames offer the possibility of reducing NO(sub x) emissions from gas turbines by rapid mixing of the liquid fuel and air so as to drive the flame structure toward partially-premixed conditions. We consider the technical approaches required to utilize laser-induced fluorescence methods for quantitatively measuring NO concentrations in high-pressure LDI spray flames. In the progression from atmospheric to high-pressure measurements, the LIF method requires a shift from the saturated to the linear regime of fluorescence measurements. As such, we discuss quantitative, spatially resolved laser-saturated fluorescence (LSF), linear laser-induced fluorescence (LIF), and planar laser-induced fluorescence (PLIF) measurements of NO concentration in LDI spray flames. Spatially-resolved LIF measurements of NO concentration (ppm) are reported for preheated, LDI spray flames at pressures of two to five atmospheres. The spray is produced by a hollow-cone, pressure-atomized nozzle supplied with liquid heptane. NO is excited via the Q(sub 2)(26.5) transition of the gamma(0,0) band. Detection is performed in a two nanometer region centered on the gamma(0,1) band. A complete scheme is developed by which quantitative NO concentrations in high-pressure LDI spray flames can be measured by applying linear LIF. NO is doped into the reactants and convected through the flame with no apparent destruction, thus allowing a NO fluorescence calibration to be taken inside the flame environment. The in-situ calibration scheme is validated by comparisons to a reference flame. Quantitative NO profiles are presented and analyzed so as to better understand the operation of lean-direct injectors for gas turbine combustors. Moreover, parametric studies are provided for variations in pressure, air-preheat temperature, and equivalence ratio. Similar parametric studies are performed for lean, premixed-prevaporized flames to permit comparisons to those for LDI flames. 
Finally, PLIF is expanded to high pressure in an effort to quantify the detected fluorescence image for LDI flames. Success is achieved by correcting the PLIF calibration via a single-point LIF measurement. This procedure removes the influence of any preferential background that occurs in the PLIF detection window. In general, both the LIF and PLIF measurements verify that the LDI strategy could be used to reduce NO(sub x) emissions in future gas turbine combustors.

  18. A systematic review of quantitative burn wound microbiology in the management of burns patients.

    PubMed

    Halstead, Fenella D; Lee, Kwang Chear; Kwei, Johnny; Dretzke, Janine; Oppenheim, Beryl A; Moiemen, Naiem S

    2018-02-01

    The early diagnosis of infection or sepsis in burns are important for patient care. Globally, a large number of burn centres advocate quantitative cultures of wound biopsies for patient management, since there is assumed to be a direct link between the bioburden of a burn wound and the risk of microbial invasion. Given the conflicting study findings in this area, a systematic review was warranted. Bibliographic databases were searched with no language restrictions to August 2015. Study selection, data extraction and risk of bias assessment were performed in duplicate using pre-defined criteria. Substantial heterogeneity precluded quantitative synthesis, and findings were described narratively, sub-grouped by clinical question. Twenty six laboratory and/or clinical studies were included. Substantial heterogeneity hampered comparisons across studies and interpretation of findings. Limited evidence suggests that (i) more than one quantitative microbiology sample is required to obtain reliable estimates of bacterial load; (ii) biopsies are more sensitive than swabs in diagnosing or predicting sepsis; (iii) high bacterial loads may predict worse clinical outcomes, and (iv) both quantitative and semi-quantitative culture reports need to be interpreted with caution and in the context of other clinical risk factors. The evidence base for the utility and reliability of quantitative microbiology for diagnosing or predicting clinical outcomes in burns patients is limited and often poorly reported. Consequently future research is warranted. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  19. Review of GEM Radiation Belt Dropout and Buildup Challenges

    NASA Astrophysics Data System (ADS)

    Tu, Weichao; Li, Wen; Morley, Steve; Albert, Jay

    2017-04-01

    In Summer 2015 the US NSF GEM (Geospace Environment Modeling) focus group named "Quantitative Assessment of Radiation Belt Modeling" started the "RB dropout" and "RB buildup" challenges, focused on quantitative modeling of radiation belt buildups and dropouts. This is a community effort which includes selecting challenge events, gathering the model inputs required to model radiation belt dynamics during these events (e.g., various magnetospheric waves, plasmapause and density models, electron phase space density data), simulating the challenge events using different types of radiation belt models, and validating the model results by comparison to in situ observations of radiation belt electrons (from Van Allen Probes, THEMIS, GOES, LANL/GEO, etc.). The goal is to quantitatively assess the relative importance of various acceleration, transport, and loss processes in the observed radiation belt dropouts and buildups. Since 2015, the community has selected four "challenge" events under four different categories: "storm-time enhancements", "non-storm enhancements", "storm-time dropouts", and "non-storm dropouts". Model inputs and data for each selected event have been coordinated and shared within the community to establish a common basis for simulations and testing. Modelers within and outside the US with different types of radiation belt models (diffusion-type, diffusion-convection-type, test particle codes, etc.) have participated in our challenge and shared their simulation results and comparisons with spacecraft measurements. Significant progress has been made in quantitative modeling of radiation belt buildups and dropouts, as well as in assessing the models with new measures of model performance. In this presentation, I will review the activities of our "RB dropout" and "RB buildup" challenges and the progress achieved in understanding radiation belt physics and improving model validation and verification.
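
    The diffusion-type models mentioned above typically solve a 1-D radial diffusion equation for the electron phase space density f(L, t). A toy explicit finite-difference sketch, with an illustrative diffusion coefficient and loss lifetime (not a validated magnetospheric model):

```python
import numpy as np

# Toy 1-D radial diffusion for radiation belt phase space density f(L, t):
#   df/dt = L^2 d/dL ( D_LL / L^2 * df/dL ) - f / tau
# All coefficients and the initial condition are illustrative only.
L = np.linspace(3.0, 7.0, 81)            # dimensionless L-shell grid
dL = L[1] - L[0]
Lh = 0.5 * (L[:-1] + L[1:])              # interface (half-grid) points
Dh = 1e-3 * (Lh / 5.0) ** 6              # toy D_LL at interfaces, 1/day
tau = 10.0                               # loss lifetime, days
f = np.exp(-((L - 4.0) / 0.5) ** 2)      # initial distribution, peaked at L = 4

dt = 0.01                                # days; well inside explicit stability limit
for _ in range(500):                     # evolve for 5 days
    g = Dh / Lh**2 * np.diff(f) / dL     # interface flux: (D_LL / L^2) df/dL
    f[1:-1] += dt * (L[1:-1] ** 2 * np.diff(g) / dL - f[1:-1] / tau)
    # boundary values f[0] and f[-1] are held fixed
```

    Diffusion-convection models add advection terms to this equation, and test-particle codes replace the PDE with trajectory integration; the challenge compares all three families against the same in situ observations.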

  20. Comparison of semi-quantitative and quantitative dynamic contrast-enhanced MRI evaluations of vertebral marrow perfusion in a rat osteoporosis model.

    PubMed

    Zhu, Jingqi; Xiong, Zuogang; Zhang, Jiulong; Qiu, Yuyou; Hua, Ting; Tang, Guangyu

    2017-11-14

    This study aims to investigate the technical feasibility of semi-quantitative and quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in the assessment of longitudinal changes of marrow perfusion in a rat osteoporosis model, using bone mineral density (BMD) measured by micro-computed tomography (micro-CT) and histopathology as the gold standards. Fifty rats were randomly assigned to the control group (n=25) and ovariectomy (OVX) group whose bilateral ovaries were excised (n=25). Semi-quantitative and quantitative DCE-MRI, micro-CT, and histopathological examinations were performed on lumbar vertebrae at baseline and 3, 6, 9, and 12 weeks after operation. The differences between the two groups in terms of the semi-quantitative DCE-MRI parameter (maximum enhancement, Emax), quantitative DCE-MRI parameters (volume transfer constant, Ktrans; interstitial volume, Ve; and efflux rate constant, Kep), micro-CT parameter (BMD), and histopathological parameter (microvessel density, MVD) were compared at each of the time points using an independent-sample t test. The differences in these parameters between baseline and other time points in each group were assessed via Bonferroni's multiple comparison test. A Pearson correlation analysis was applied to assess the relationships between DCE-MRI, micro-CT, and histopathological parameters. In the OVX group, the Emax values decreased significantly compared with those of the control group at weeks 6 and 9 (p=0.003 and 0.004, respectively). The Ktrans values decreased significantly compared with those of the control group from week 3 (p<0.05). However, the Ve values decreased significantly only at week 9 (p=0.032), and no difference in Kep was found between the two groups. The BMD values of the OVX group decreased significantly compared with those of the control group from week 3 (p<0.05). Transmission electron microscopy showed tighter gaps between vascular endothelial cells with swollen mitochondria in the OVX group from week 3. The MVD values of the OVX group decreased significantly compared with those of the control group only at week 12 (p=0.023). A weak positive correlation of Emax and a strong positive correlation of Ktrans with MVD were found. Compared with semi-quantitative DCE-MRI, the quantitative DCE-MRI parameter Ktrans is a more sensitive and accurate index for detecting early reduced perfusion in osteoporotic bone.
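
    Ktrans and kep come from fitting a pharmacokinetic model to the tissue enhancement curve; the standard (Tofts) form is Ct(t) = Ktrans ∫₀ᵗ Ca(τ) e^(−kep(t−τ)) dτ. A self-contained sketch that simulates a tissue curve from an assumed arterial input function and recovers the parameters by grid search (all values illustrative; the study's actual fitting procedure is not described in the abstract):

```python
import numpy as np

rng = np.random.default_rng(3)

t = np.linspace(0, 5, 120)              # minutes
dt = t[1] - t[0]
Ca = 5.0 * t * np.exp(-t)               # hypothetical arterial input function

def tofts(Ktrans, kep):
    # Ct(t) = Ktrans * integral_0^t Ca(tau) * exp(-kep*(t - tau)) dtau,
    # evaluated as a discrete convolution
    kernel = np.exp(-kep * t)
    return Ktrans * np.convolve(Ca, kernel)[: t.size] * dt

# Simulate a tissue curve with known parameters, add noise, then fit by grid search
Ct = tofts(0.25, 0.6) + rng.normal(0, 0.01, t.size)
grid_K = np.linspace(0.05, 0.5, 46)     # candidate Ktrans values (1/min)
grid_k = np.linspace(0.2, 1.2, 51)      # candidate kep values (1/min)
sse = [[np.sum((Ct - tofts(K, k)) ** 2) for k in grid_k] for K in grid_K]
iK, ik = np.unravel_index(np.argmin(sse), (grid_K.size, grid_k.size))
K_fit, kep_fit = grid_K[iK], grid_k[ik]
```

    The semi-quantitative Emax, by contrast, is read directly off the curve as its maximum enhancement, which is why it is simpler but less specific to perfusion than Ktrans.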

  1. Real-scale comparison between simple and composite raw sewage sampling

    NASA Astrophysics Data System (ADS)

    Sergio Scalize, Paulo; Moraes Frazão, Juliana

    2018-06-01

    The present study performed a qualitative and quantitative characterization of the raw sewage collected at the entrance of the sewage treatment station of the city of Itumbiara, state of Goiás. Samples were collected every two hours over a period of seven consecutive days. Characterization of both point samples and composite samples was performed. The parameters analyzed were: temperature, pH, alkalinity, chemical oxygen demand, oil and grease, electric conductivity, total phosphorus, settleable solids, ammoniacal nitrogen, total suspended solids, volatile suspended solids, fixed suspended solids and turbidity. These results allowed us to verify that it is possible to perform the collection and analysis of a point sample, instead of a composite sample, as a way of monitoring the efficiency of a sewage treatment plant.
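
    The difference between the two sampling schemes is essentially a weighting choice: a composite assembled from equal aliquots drawn every two hours is the plain mean of the point samples, while a flow-proportional composite weights each aliquot by the discharge at collection time. A sketch with illustrative COD and flow values (not the study's data):

```python
# Twelve point samples over 24 h (every 2 h), as in the study's collection scheme.
# Concentrations and flows below are illustrative only.
cod_points = [310, 420, 580, 640, 510, 380, 290, 260, 300, 450, 560, 480]  # mg/L
flows = [40, 38, 55, 70, 66, 52, 41, 36, 39, 58, 69, 60]                   # L/s

# Equal-aliquot (time-weighted) composite = mean of the point samples
time_weighted = sum(cod_points) / len(cod_points)

# Flow-proportional composite weights each aliquot by discharge
flow_weighted = sum(c * q for c, q in zip(cod_points, flows)) / sum(flows)
```

    When concentration and flow peak together, the flow-weighted value exceeds the simple mean, which is the bias a monitoring program accepts when it substitutes a point sample for a composite.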

  2. Quantity discrimination in canids: Dogs (Canis familiaris) and wolves (Canis lupus) compared.

    PubMed

    Miletto Petrazzini, Maria Elena; Wynne, Clive D L

    2017-11-01

    Accumulating evidence indicates that animals are able to discriminate between quantities. Recent studies have shown that dogs' and coyotes' ability to discriminate between quantities of food items decreases with increasing numerical ratio. Conversely, wolves' performance is not affected by numerical ratio. Cross-species comparisons are difficult because of differences in the methodologies employed, and hence it is still unclear whether domestication altered quantitative abilities in canids. Here we used the same procedure to compare pet dogs and wolves in a spontaneous food choice task. Subjects were presented with two quantities of food items and allowed to choose only one option. Four numerical contrasts of increasing difficulty (range 1-4) were used to assess the influence of numerical ratio on the performance of the two species. Dogs' accuracy was affected by numerical ratio, while no ratio effect was observed in wolves. These results align with previous findings and reinforce the idea of different quantitative competences in dogs and wolves. Although we cannot exclude that other variables might have played a role in shaping quantitative abilities in these two species, our results might suggest that the interspecific differences here reported may have arisen as a result of domestication. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Quantitation of apolipoprotein epsilon gene expression by competitive polymerase chain reaction in a patient with familial apolipoprotein E deficiency.

    PubMed

    Dobmeyer, J M; Rexin, M; Dobmeyer, T S; Klein, S A; Rossol, R; Feussner, G

    1998-06-22

    A simple method of obtaining semiquantitative and reliable data on apolipoprotein (apo) epsilon gene expression is described. We detected apo epsilon specific sequences by reverse transcription (RT)-PCR. For quantitative measurement, an apo epsilon DNA standard was produced, allowing the development of a competitive PCR method. The efficiency of RNA extraction and cDNA synthesis was controlled by quantitation of a housekeeping gene (glyceraldehyde-3-phosphate dehydrogenase, G3PDH) in separate reactions. To imitate a defined induction of apo epsilon gene expression, serial twofold dilutions of total RNA were reversely transcribed and the respective cDNAs used to perform a competitive apo epsilon and G3PDH PCR. The change in apo epsilon cDNA and G3PDH cDNA was 1.7-2.3-fold with an expected value of 2.0-fold. Standard deviations in three independently performed experiments were within a range of <15% of the mean, indicating low intra-assay variation and high reproducibility. To illustrate this method, apo epsilon gene expression was measured in a patient with a complete lack of functionally active apo E in comparison to healthy controls. The method presented here might be valuable in the assessment of apo epsilon gene expression in human disease.

  4. Simultaneous determination of eight major steroids from Polyporus umbellatus by high-performance liquid chromatography coupled with mass spectrometry detections.

    PubMed

    Zhao, Ying-yong; Cheng, Xian-long; Zhang, Yongmin; Zhao, Ye; Lin, Rui-chao; Sun, Wen-ji

    2010-02-01

    Polyporus umbellatus is a widely used diuretic herbal medicine. In this study, a high-performance liquid chromatography coupled with atmospheric pressure chemical ionization-mass spectrometric detection (HPLC-APCI-MS) method was developed for qualitative and quantitative analysis of steroids, as well as for the quality control of Polyporus umbellatus. The selectivity, reproducibility and sensitivity were compared with HPLC with photodiode array detection and evaporative light scattering detection (ELSD). Selective ion monitoring in positive mode was used for qualitative and quantitative analysis of eight major components and beta-ecdysterone was used as the internal standard. Limits of detection and quantification fell in the ranges 7-21 and 18-63 ng/mL for the eight analytes with an injection of 10 microL samples, and all calibration curves showed good linear regression (r(2) > 0.9919) within the test range. The quantitative results demonstrated that samples from different localities showed different qualities. Advantages, in comparison with conventional HPLC-diode array detection and HPLC-ELSD, are that reliable identification of target compounds could be achieved by accurate mass measurements along with characteristic retention time, and the great enhancement in selectivity and sensitivity allows identification and quantification of low levels of constituents in complex Polyporus umbellatus matrixes. (c) 2009 John Wiley & Sons, Ltd.

  5. CNV-ROC: A cost effective, computer-aided analytical performance evaluator of chromosomal microarrays

    PubMed Central

    Goodman, Corey W.; Major, Heather J.; Walls, William D.; Sheffield, Val C.; Casavant, Thomas L.; Darbro, Benjamin W.

    2016-01-01

    Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high throughput, low cost, analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides for a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides for a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis and receiver operator characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs which is the measurement and use of genome-wide true and false negative data for the calculation of performance metrics and comparison of CNV profiles between different microarray experiments. PMID:25595567
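    The per-probe ROC calibration that CNV-ROC performs can be illustrated with a minimal sketch: probes from the lower-resolution array are scored by |log2 ratio|, truth labels come from the higher-resolution array, and the calling threshold is chosen to maximize Youden's J = TPR - FPR. The scores and labels below are invented for illustration, not CNV-ROC's data.

```python
# Per-probe |log2 ratio| scores from a lower-resolution array, with truth
# labels from the higher-resolution array (1 = probe lies in a confirmed CNV).
scores = [1.2, 0.9, 0.8, 0.7, 0.55, 0.5, 0.4, 0.3, 0.25, 0.1]
truth  = [1,   1,   0,   1,   1,    1,   0,   0,   0,    0]

P = sum(truth)          # probes inside true CNVs
N = len(truth) - P      # probes outside true CNVs

def roc_point(threshold):
    """TPR and FPR when probes scoring >= threshold are called CNV."""
    tp = sum(1 for s, t in zip(scores, truth) if s >= threshold and t == 1)
    fp = sum(1 for s, t in zip(scores, truth) if s >= threshold and t == 0)
    return tp / P, fp / N

# Calibrate the log2-ratio threshold by maximizing Youden's J = TPR - FPR.
best_thr = max(scores, key=lambda thr: roc_point(thr)[0] - roc_point(thr)[1])
tpr, fpr = roc_point(best_thr)
```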

  6. Congruent climate-related genecological responses from molecular markers and quantitative traits for western white pine (Pinus monticola)

    Treesearch

    Bryce A. Richardson; Gerald E. Rehfeldt; Mee-Sook Kim

    2009-01-01

    Analyses of molecular and quantitative genetic data demonstrate the existence of congruent climate-related patterns in western white pine (Pinus monticola). Two independent studies allowed comparisons of amplified fragment length polymorphism (AFLP) markers with quantitative variation in adaptive traits. Principal component analyses...

  7. Further Refinement of the LEWICE SLD Model

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    2006-01-01

    A research project is underway at NASA Glenn Research Center to produce computer software that can accurately predict ice growth for any meteorological conditions for any aircraft surface. This report will present results from version 3.2 of this software, which is called LEWICE. This version differs from previous releases in that it incorporates additional thermal analysis capabilities, a pneumatic boot model, interfaces to external computational fluid dynamics (CFD) flow solvers and has an empirical model for the supercooled large droplet (SLD) regime. An extensive comparison against the database of ice shapes and collection efficiencies that have been generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. The complete set of data used for this comparison will eventually be available in a contractor report. This paper will show the differences in collection efficiency and ice shape between LEWICE 3.2 and experimental data. This report will first describe the LEWICE 3.2 SLD model. A semi-empirical approach was used to incorporate first-order physical effects of large droplet phenomena into icing software. Comparisons are then made to every two-dimensional case in the water collection database and the ice shape database. Each collection efficiency condition was run using the following four assumptions: 1) potential flow, no splashing; 2) potential flow, with splashing; 3) Navier-Stokes, no splashing; 4) Navier-Stokes, with splashing. All cases were run with 21-bin drop size distributions and a lift correction (angle of attack adjustment). Quantitative comparisons are shown for impingement limit, maximum water catch, and total collection efficiency. Due to the large number of ice shape cases, comprehensive comparisons were limited to potential flow cases with and without splashing. Quantitative comparisons are shown for horn height, horn angle, icing limit, area, and leading edge thickness. The results show that the predictions for both ice shape and water collection are within the accuracy limits of the experimental data for the majority of cases.

  8. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
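    A comparison in the spirit of LipidQC's benchmarking can be sketched as a degrees-of-equivalence-style check: the difference between an experimental concentration and the consensus mean is judged against their combined uncertainty. All numbers below are hypothetical, not NIST consensus values.

```python
# Hypothetical SRM 1950 lipid concentration vs a consensus benchmark value.
consensus_mean = 1.52      # nmol/mL, invented consensus mean
consensus_u = 0.18         # standard uncertainty of the consensus mean
measured = 1.80            # invented experimental result
measured_u = 0.10          # standard uncertainty of the measurement

# Compare the difference to the combined standard uncertainty; agreement is
# declared within k = 2 (approximately 95 % coverage).
diff = measured - consensus_mean
u_diff = (consensus_u ** 2 + measured_u ** 2) ** 0.5
zeta = diff / u_diff
agrees = abs(zeta) <= 2.0
```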

  9. Comparison of the uncertainties of several European low-dose calibration facilities

    NASA Astrophysics Data System (ADS)

    Dombrowski, H.; Cornejo Díaz, N. A.; Toni, M. P.; Mihelic, M.; Röttger, A.

    2018-04-01

    The typical uncertainty of a low-dose rate calibration of a detector, which is calibrated in a dedicated secondary national calibration laboratory, is investigated, including measurements in the photon field of metrology institutes. Calibrations at low ambient dose equivalent rates (at the level of the natural ambient radiation) are needed when environmental radiation monitors are to be characterised. The uncertainties of calibration measurements in conventional irradiation facilities above ground are compared with those obtained in a low-dose rate irradiation facility located deep underground. Four laboratories quantitatively evaluated the uncertainties of their calibration facilities, in particular for calibrations at low dose rates (250 nSv/h and 1 μSv/h). For the first time, typical uncertainties of European calibration facilities are documented in a comparison and the main sources of uncertainty are revealed. All sources of uncertainties are analysed, including the irradiation geometry, scattering, deviations of real spectra from standardised spectra, etc. As a fundamental metrological consequence, no instrument calibrated in such a facility can have a lower total uncertainty in subsequent measurements. For the first time, the need to perform calibrations at very low dose rates (< 100 nSv/h) deep underground is underpinned on the basis of quantitative data.
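    An uncertainty budget of the kind these laboratories evaluated is conventionally combined GUM-style, with independent components added in quadrature and an expanded uncertainty obtained with a coverage factor. The component names and values below are assumptions for illustration, not any facility's actual budget.

```python
import math

# Illustrative relative standard uncertainties (%) for a low-dose-rate
# calibration; these component values are invented.
components = {
    "reference_dose_rate": 1.5,
    "positioning_geometry": 0.8,
    "scatter_and_spectrum": 1.0,
    "instrument_resolution": 0.6,
}

# Combine independent components in quadrature, then expand with k = 2
# (~95 % coverage). Note the combined value can never fall below the
# largest single component, which is why no instrument calibrated in
# such a facility can claim a lower total uncertainty afterwards.
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
u_expanded = 2 * u_combined
```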

  10. Determining absolute protein numbers by quantitative fluorescence microscopy.

    PubMed

    Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry

    2014-01-01

    Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers--fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition. © 2014 Elsevier Inc. All rights reserved.
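    The ratiometric comparison method reduces to a one-line calibration: fluorescence intensity per fluorophore is derived from a standard of known copy number imaged under identical conditions, then applied to the unknown complex. The copy number and intensity values below are illustrative assumptions.

```python
# Ratiometric counting against a fluorescence standard of known copy number
# (all values invented for illustration).
standard_copies = 306          # known GFP copies in the standard
standard_intensity = 15300.0   # background-subtracted intensity (a.u.)
unknown_intensity = 4100.0     # same imaging conditions as the standard

# Intensity contributed by a single fluorophore calibrates the conversion.
intensity_per_copy = standard_intensity / standard_copies
unknown_copies = round(unknown_intensity / intensity_per_copy)
```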

  11. Quantitative assessment of paretic limb dexterity and interlimb coordination during bilateral arm rehabilitation training.

    PubMed

    Xu, Chang; Li, Siyi; Wang, Kui; Hou, Zengguang; Yu, Ningbo

    2017-07-01

    In neuro-rehabilitation after stroke, the conventional constrained induced movement therapy (CIMT) has been well-accepted. Existing bilateral trainings are mostly on mirrored symmetrical motion. However, complementary bilateral movements are dominantly involved in activities of daily living (ADLs), and functional bilateral therapies may bring better skill transfer from trainings to daily life. Neurophysiological evidence is also growing. In this work, we firstly introduce our bilateral arm training system realized with a haptic interface and a motion sensor, as well as the tasks that have been designed to train both the manipulation function of the paretic arm and coordination of bilateral upper limbs. Then, we propose quantitative measures for functional assessment of complementary bilateral training performance, including kinematic behavior indices, smoothness, submovement and bimanual coordination. After that, we describe the experiments with healthy subjects and the results with respect to these quantitative measures. Feasibility and sensitivity of the proposed indices were evaluated through comparison of unilateral and bilateral training outcomes. The proposed bilateral training system and tasks, as well as the quantitative measures, have been demonstrated effective for training and assessment of unilateral and bilateral arm functions.
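    One widely used kinematic smoothness index (not necessarily the exact formulation used in this study) is the dimensionless integrated squared jerk: a single maximally smooth minimum-jerk reach scores far lower than the same reach split into two sequential submovements. The trajectories below are synthetic.

```python
# Dimensionless integrated squared jerk on synthetic 1-D reach trajectories.
dt = 0.01

def min_jerk(s):
    """Unit minimum-jerk position profile, s in [0, 1]."""
    return 10 * s**3 - 15 * s**4 + 6 * s**5

# A single smooth 0.2 m reach over 1 s ...
x_smooth = [0.2 * min_jerk(i * dt) for i in range(101)]
# ... versus the same reach performed as two sequential submovements.
x_segmented = [
    0.1 * min_jerk(i * dt / 0.5) if i * dt <= 0.5
    else 0.1 + 0.1 * min_jerk((i * dt - 0.5) / 0.5)
    for i in range(101)
]

def jerk_metric(pos, dt):
    """Integrated squared jerk, scaled by duration^5 / amplitude^2."""
    v = [(b - a) / dt for a, b in zip(pos, pos[1:])]
    acc = [(b - a) / dt for a, b in zip(v, v[1:])]
    jerk = [(b - a) / dt for a, b in zip(acc, acc[1:])]
    duration = dt * (len(pos) - 1)
    amplitude = max(pos) - min(pos)
    return sum(j * j * dt for j in jerk) * duration**5 / amplitude**2

m_smooth = jerk_metric(x_smooth, dt)
m_segmented = jerk_metric(x_segmented, dt)
```

    The segmented reach covers the same amplitude in the same time, yet its jerk cost is an order of magnitude higher, which is why such indices are sensitive to the fragmented movements typical of paretic limbs.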

  12. Technical Note: Rod phantom analysis for comparison of PET detector sampling and reconstruction methods.

    PubMed

    Wollenweber, Scott D; Kemp, Brad J

    2016-11-01

    This investigation aimed to develop a scanner quantification performance methodology and compare multiple metrics between two scanners under different imaging conditions. Most PET scanners are designed to work over a wide dynamic range of patient imaging conditions. Clinical constraints, however, often impact the realization of the entitlement performance for a particular scanner design. Using less injected dose and imaging for a shorter time are often key considerations, all while maintaining "acceptable" image quality and quantitative capability. A dual phantom measurement including resolution inserts was used to measure the effects of in-plane (x, y) and axial (z) system resolution between two PET/CT systems with different block detector crystal dimensions. One of the scanners had significantly thinner slices. Several quantitative measures, including feature contrast recovery, max/min value, and feature profile accuracy were derived from the resulting data and compared between the two scanners and multiple phantoms and alignments. At the clinically relevant count levels used, the scanner with thinner slices had improved performance of approximately 2%, averaged over phantom alignments, measures, and reconstruction methods, for the head-sized phantom, mainly demonstrated with the rods aligned perpendicular to the scanner axis. That same scanner had a slightly decreased performance of -1% for the larger body-size phantom, mostly due to an apparent noise increase in the images. Most of the differences in the metrics between the two scanners were less than 10%. Using the proposed scanner performance methodology, it was shown that smaller detector elements and a larger number of image voxels require higher count density in order to demonstrate improved image quality and quantitation. In a body imaging scenario under typical clinical conditions, the potential advantages of the design must overcome increases in noise due to lower count density.
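    The feature contrast recovery metric mentioned above is commonly computed as the ratio of measured to true contrast for a feature against its background; a minimal sketch with illustrative phantom values (not the study's measurements):

```python
# Contrast recovery for a hot rod in a phantom: measured contrast divided
# by the known true contrast (all values invented).
true_feature = 4.0        # known activity concentration in the rod
true_background = 1.0
measured_feature = 3.1    # mean ROI value in the reconstructed image
measured_background = 1.0

cr = (measured_feature / measured_background - 1) / (
    true_feature / true_background - 1
)
# Partial-volume and resolution losses pull cr below 1; thinner slices and
# smaller detector elements push it upward, at the cost of count density.
```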

  13. 75 FR 68468 - List of Fisheries for 2011

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ...-existent; therefore, quantitative data on the frequency of incidental mortality and serious injury is... currently available for most of these marine mammals on the high seas, and quantitative comparison of...

  14. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods and the interpretation of their results should be connected to the educational background. In this connecting process, the issues of educational models are often raised. Many widely used statistical methods do not make assumptions about the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are also other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumptions and parameter estimation, and are mathematically complicated. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods in physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments.
    The dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). Both theories provide test item statistics for educational inferences and decisions, and both are applied to Force Concept Inventory data obtained from students enrolled at The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on the measure of association for binary data. In quantitative assessment, binary data is often encountered because of its simplicity. The current popular measures of association fail under some extremely unbalanced conditions; however, the occurrence of these conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages, and special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, along with typical misunderstandings and misuses of the technique.
    EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment of scientific reasoning skills. The analysis yields reasoning-ability structures for U.S. and Chinese students at different educational levels. A final discussion of the advanced quantitative assessment methodology versus the pure mathematical methodology is presented at the end.
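    The CTT-versus-IRT contrast in the first part of the dissertation can be made concrete: CTT summarizes an item by its proportion correct, while the IRT two-parameter logistic (2PL) model expresses the probability of a correct response as a function of latent ability. The response matrix and 2PL parameters below are invented, not fitted to FCI data.

```python
import math

# Response matrix (rows = students, columns = items); 1 = correct.
# Values are invented for illustration.
responses = [
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 1, 0],
    [1, 1, 1],
]

# CTT item difficulty: proportion correct per item (higher = easier).
n_students = len(responses)
ctt_difficulty = [
    sum(row[j] for row in responses) / n_students for j in range(3)
]

# IRT 2PL item response function: P(correct | ability theta) for an item
# with discrimination a and difficulty b (parameters assumed, not fitted).
def p_2pl(theta, a, b):
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Probability that an average-ability student answers an easy-ish item.
p_avg_student = p_2pl(0.0, a=1.2, b=-0.5)
```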

  15. High-field open versus short-bore magnetic resonance imaging of the spine: a randomized controlled comparison of image quality.

    PubMed

    Enders, Judith; Rief, Matthias; Zimmermann, Elke; Asbach, Patrick; Diederichs, Gerd; Wetz, Christoph; Siebert, Eberhard; Wagner, Moritz; Hamm, Bernd; Dewey, Marc

    2013-01-01

    The purpose of the present study was to compare the image quality of spinal magnetic resonance (MR) imaging performed on a high-field horizontal open versus a short-bore MR scanner in a randomized controlled study setup. Altogether, 93 (80% women, mean age 53) consecutive patients underwent spine imaging after random assignment to a 1-T horizontal open MR scanner with a vertical magnetic field or a 1.5-T short-bore MR scanner. This patient subset was part of a larger cohort. Image quality was assessed by determining qualitative parameters, signal-to-noise (SNR) and contrast-to-noise ratios (CNR), and quantitative contour sharpness. The image quality parameters were higher for short-bore MR imaging. Regarding all sequences, the relative differences were 39% for the mean overall qualitative image quality, 53% for the mean SNR values, and 34-37% for the quantitative contour sharpness (P<0.0001). The CNR values were also higher for images obtained with the short-bore MR scanner. No sequence was of very poor (nondiagnostic) image quality. Scanning times were significantly longer for examinations performed on the open MR scanner (mean: 32±22 min versus 20±9 min; P<0.0001). In this randomized controlled comparison of spinal MR imaging with an open versus a short-bore scanner, short-bore MR imaging revealed considerably higher image quality with shorter scanning times. ClinicalTrials.gov NCT00715806.
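    The SNR and CNR figures reported above are typically computed from region-of-interest (ROI) statistics: mean signal divided by background noise standard deviation, and the mean difference between two tissues divided by that same noise. The ROI placement and pixel values below are illustrative assumptions, not the study's measurements.

```python
import statistics

# Illustrative ROI pixel values from a spine image.
cord_roi = [410.0, 405.0, 398.0, 402.0, 400.0]   # spinal cord
csf_roi = [610.0, 600.0, 605.0, 598.0, 602.0]    # cerebrospinal fluid
air_roi = [12.0, 8.0, 10.0, 9.0, 11.0]           # background noise region

# SNR: mean tissue signal over noise standard deviation.
noise_sd = statistics.stdev(air_roi)
snr_cord = statistics.mean(cord_roi) / noise_sd

# CNR: contrast between two tissues over the same noise estimate.
cnr = (statistics.mean(csf_roi) - statistics.mean(cord_roi)) / noise_sd
```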

  16. High-Field Open versus Short-Bore Magnetic Resonance Imaging of the Spine: A Randomized Controlled Comparison of Image Quality

    PubMed Central

    Zimmermann, Elke; Asbach, Patrick; Diederichs, Gerd; Wetz, Christoph; Siebert, Eberhard; Wagner, Moritz; Hamm, Bernd; Dewey, Marc

    2013-01-01

    Background The purpose of the present study was to compare the image quality of spinal magnetic resonance (MR) imaging performed on a high-field horizontal open versus a short-bore MR scanner in a randomized controlled study setup. Methods Altogether, 93 (80% women, mean age 53) consecutive patients underwent spine imaging after random assignment to a 1-T horizontal open MR scanner with a vertical magnetic field or a 1.5-T short-bore MR scanner. This patient subset was part of a larger cohort. Image quality was assessed by determining qualitative parameters, signal-to-noise (SNR) and contrast-to-noise ratios (CNR), and quantitative contour sharpness. Results The image quality parameters were higher for short-bore MR imaging. Regarding all sequences, the relative differences were 39% for the mean overall qualitative image quality, 53% for the mean SNR values, and 34–37% for the quantitative contour sharpness (P<0.0001). The CNR values were also higher for images obtained with the short-bore MR scanner. No sequence was of very poor (nondiagnostic) image quality. Scanning times were significantly longer for examinations performed on the open MR scanner (mean: 32±22 min versus 20±9 min; P<0.0001). Conclusions In this randomized controlled comparison of spinal MR imaging with an open versus a short-bore scanner, short-bore MR imaging revealed considerably higher image quality with shorter scanning times. Trial Registration ClinicalTrials.gov NCT00715806 PMID:24391767

  17. Development and validation of a generic high-performance liquid chromatography for the simultaneous separation and determination of six cough ingredients: Robustness study on core-shell particles.

    PubMed

    Yehia, Ali Mohamed; Essam, Hebatallah Mohamed

    2016-09-01

    A generally applicable high-performance liquid chromatographic method for the qualitative and quantitative determination of pharmaceutical preparations containing phenylephrine hydrochloride, paracetamol, ephedrine hydrochloride, guaifenesin, doxylamine succinate, and dextromethorphan hydrobromide is developed. Optimization of chromatographic conditions was performed for the gradient elution using different buffer pH values, flow rates and two C18 stationary phases. The method was developed using a Kinetex® C18 column as a core-shell stationary phase with a gradient profile using buffer pH 5.0 and acetonitrile at 2.0 mL/min flow rate. Detection was carried out at 220 nm and linear calibrations were obtained for all components within the studied ranges. The method was fully validated in agreement with ICH guidelines. The proposed method is specific, accurate and precise (RSD% < 3%). Limits of detection are lower than 2.0 μg/mL. Qualitative and quantitative responses were evaluated using experimental design to assess the method's robustness. The method proved to be highly robust against a 10% change in buffer pH and flow rate (RSD% < 10%); however, the flow rate may significantly influence the quantitative responses of phenylephrine, paracetamol, and doxylamine (RSD% > 10%). Satisfactory results were obtained for commercial combinations analyses. Statistical comparison between the proposed chromatographic and official methods revealed no significant difference. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Quantitative head ultrasound measurements to determine thresholds for preterm neonates requiring interventional therapies following intraventricular hemorrhage

    NASA Astrophysics Data System (ADS)

    Kishimoto, Jessica; Fenster, Aaron; Salehi, Fateme; Romano, Walter; Lee, David S. C.; de Ribaupierre, Sandrine

    2016-04-01

    Dilation of the cerebral ventricles is a common condition in preterm neonates with intraventricular hemorrhage (IVH). This post-hemorrhagic ventricle dilation (PHVD) can lead to lifelong neurological impairment through ischemic injury due to increased intracranial pressure and, without treatment, can lead to death. Clinically, 2D ultrasound (US) images are serially acquired through the fontanelles ('soft spots') of the patients to monitor the progression of the ventricle dilation. These images are used to determine when interventional therapies such as needle aspiration of the built-up cerebrospinal fluid (CSF) ('ventricle tap', VT) might be indicated for a patient; however, quantitative measurements of the growth of the ventricles are often not performed. There is no consensus on when a neonate with PHVD should have an intervention, and often interventions are performed after the potential for brain damage is quite high. Previously we have developed and validated a 3D US system to monitor the progression of ventricle volumes (VV) in IVH patients. We will describe the potential utility of quantitative 2D and 3D US to monitor and manage PHVD in neonates. Specifically, we will look to determine image-based measurement thresholds for patients who will require VT in comparison to patients with PHVD who resolve without intervention. Additionally, since many patients who have an initial VT will require subsequent interventions, we look at the potential for US to determine which PHVD patients will require additional VT after the initial one has been performed.

  19. Evaluation of thresholding techniques for segmenting scaffold images in tissue engineering

    NASA Astrophysics Data System (ADS)

    Rajagopalan, Srinivasan; Yaszemski, Michael J.; Robb, Richard A.

    2004-05-01

    Tissue engineering attempts to address the ever widening gap between the demand and supply of organ and tissue transplants using natural and biomimetic scaffolds. The regeneration of specific tissues aided by synthetic materials is dependent on the structural and morphometric properties of the scaffold. These properties can be derived non-destructively using quantitative analysis of high resolution microCT scans of scaffolds. Thresholding of the scanned images into polymeric and porous phase is central to the outcome of the subsequent structural and morphometric analysis. Visual thresholding of scaffolds produced using stochastic processes is inaccurate. Depending on the algorithmic assumptions made, automatic thresholding might also be inaccurate. Hence there is a need to analyze the performance of different techniques and propose alternate ones, if needed. This paper provides a quantitative comparison of different thresholding techniques for segmenting scaffold images. The thresholding algorithms examined include those that exploit spatial information, locally adaptive characteristics, histogram entropy information, histogram shape information, and clustering of gray-level information. The performance of different techniques was evaluated using established criteria, including misclassification error, edge mismatch, relative foreground error, and region non-uniformity. Algorithms that exploit local image characteristics seem to perform much better than those using global information.
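    Misclassification error, the first evaluation criterion listed, is simply the fraction of pixels assigned to the wrong phase relative to ground truth; a minimal sketch on an invented binary scaffold image:

```python
# Ground-truth and thresholded phase labels per pixel (1 = polymer,
# 0 = pore); a flattened toy image, values invented for illustration.
ground_truth = [0, 0, 1, 1, 1, 0, 1, 0, 0, 1]
segmented    = [0, 1, 1, 1, 0, 0, 1, 0, 0, 1]

# Misclassification error: fraction of pixels in the wrong phase.
wrong = sum(1 for g, s in zip(ground_truth, segmented) if g != s)
misclassification_error = wrong / len(ground_truth)
```

    The other criteria (edge mismatch, relative foreground error, region non-uniformity) follow the same pattern of per-pixel comparison against the reference segmentation, each penalizing a different kind of disagreement.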

  20. A practical technique for quantifying the performance of acoustic emission systems on plate-like structures.

    PubMed

    Scholey, J J; Wilcox, P D; Wisnom, M R; Friswell, M I

    2009-06-01

    A model for quantifying the performance of acoustic emission (AE) systems on plate-like structures is presented. Employing a linear transfer function approach, the model is applicable to both isotropic and anisotropic materials. The model requires several inputs including source waveforms, phase velocity and attenuation. It is recognised that these variables may not be readily available; thus, efficient measurement techniques are presented for obtaining phase velocity and attenuation in a form that can be exploited directly in the model. Inspired by previously documented methods, the application of these techniques is examined and some important implications for propagation characterisation in plates are discussed. Example measurements are made on isotropic and anisotropic plates and, where possible, comparisons with numerical solutions are made. By inputting experimentally obtained data into the model, quantitative system metrics are examined for different threshold values and sensor locations. By producing plots describing areas of hit success and source location error, the ability to measure the performance of different AE system configurations is demonstrated. This quantitative approach will help to place AE testing on a more solid foundation, underpinning its use in industrial AE applications.

  1. Effects of complex aural stimuli on mental performance.

    PubMed

    Vij, Mohit; Aghazadeh, Fereydoun; Ray, Thomas G; Hatipkarasulu, Selen

    2003-06-01

    The objective of this study is to investigate the effect of complex aural stimuli on mental performance. A series of experiments were designed to obtain data for two different analyses. The first analysis is a "Stimulus" versus "No-stimulus" comparison for each of the four dependent variables, i.e. quantitative ability, reasoning ability, spatial ability and memory of an individual, by comparing the control treatment with the rest of the treatments. The second analysis is a multivariate analysis of variance for component-level main effects and interactions. The two component factors are tempo of the complex aural stimuli and sound volume level, each administered at three discrete levels for all four dependent variables. Ten experiments were conducted on eleven subjects. It was found that complex aural stimuli influence the quantitative and spatial aspects of the mind, while the reasoning ability was unaffected by the stimuli. Although memory showed a trend to be worse with the presence of complex aural stimuli, the effect was statistically insignificant. Variation in tempo and sound volume level of an aural stimulus did not significantly affect the mental performance of an individual. The results of these experiments can be effectively used in designing work environments.

  2. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    PubMed

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

    To compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying Lake-Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curves (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I levels and creatine kinase were higher in patients with AM compared to those without (both P < .001). There were no significant differences among LLC, T2-weighted short inversion time inversion recovery (STIR) sequences, early (EGE), and late (LGE) gadolinium-enhancement sequences for diagnosis of AM. The AUC for qualitative (T2-weighted STIR 0.92, EGE 0.87 and LGE 0.88) and quantitative (ER 0.89 and global RE 0.80) analyses were also similar. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.

  3. Prediction of the retention of s-triazines in reversed-phase high-performance liquid chromatography under linear gradient-elution conditions.

    PubMed

    D'Archivio, Angelo Antonio; Maggi, Maria Anna; Ruggieri, Fabrizio

    2014-08-01

    In this paper, a multilayer artificial neural network is used to model simultaneously the effect of solute structure and eluent concentration profile on the retention of s-triazines in reversed-phase high-performance liquid chromatography under linear gradient elution. The retention data of 24 triazines, including common herbicides and their metabolites, are collected under 13 different elution modes, covering the following experimental domain: starting acetonitrile volume fraction ranging between 40 and 60% and gradient slope ranging between 0 and 1% acetonitrile/min. The gradient parameters together with five selected molecular descriptors, identified by quantitative structure-retention relationship modelling applied to individual separation conditions, are the network inputs. Predictive performance of this model is evaluated on six external triazines and four unseen separation conditions. For comparison, retention of triazines is modelled by both quantitative structure-retention relationships and response surface methodology, which describe separately the effect of molecular structure and gradient parameters on the retention. Although applied to a wider variable domain, the network provides a performance comparable to that of the above "local" models and retention times of triazines are modelled with accuracy generally better than 7%. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
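
    The "accuracy generally better than 7%" above refers to relative error in predicted retention times. One plausible form of such a metric (the abstract does not give the authors' exact formula) is the worst-case absolute percent error:

```python
def max_abs_percent_error(predicted, observed):
    """Worst-case absolute percent error of predicted vs. measured
    retention times; 'accuracy better than 7%' would mean this value
    stays below 7 for the prediction set."""
    return max(abs(p - o) / o * 100.0
               for p, o in zip(predicted, observed))

# Illustrative retention times (minutes), not data from the study.
print(max_abs_percent_error([10.5, 20.0], [10.0, 21.0]))
```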

  4. Technology Efficacy in Active Prosthetic Knees for Transfemoral Amputees: A Quantitative Evaluation

    PubMed Central

    El-Sayed, Amr M.; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prostheses. Other studies have examined amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights, that is, a technical examination of the components used, with an evaluation of how these improved the gait of their users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discussed the current technology in active transfemoral prostheses with respect to functional walking performance among above-knee amputee users, to evaluate each system's efficacy in producing close-to-normal user performance. The performance of the actuator, sensory system, and control technique incorporated in each reported system was evaluated separately, and numerical comparisons were conducted based on the percentage of amputees' gait deviation from normal gait profile points. The results identified particular components that contributed closest to normal gait parameters. However, these conclusions have limited generalizability because of the small number of studies. Thus, more clinical validation of active prosthetic knee technology is needed to better understand the extent of each component's contribution to the most functional development. PMID:25110727

  5. Corneal topography with high-speed swept source OCT in clinical examination

    PubMed Central

    Karnowski, Karol; Kaluzny, Bartlomiej J.; Szkulmowski, Maciej; Gora, Michalina; Wojtkowski, Maciej

    2011-01-01

    We present the applicability of high-speed swept source (SS) optical coherence tomography (OCT) for quantitative evaluation of corneal topography. A high-speed OCT device acquiring 108,000 lines/s permits dense 3D imaging of the anterior segment in less than a quarter of a second, minimizing the influence of motion artifacts on final images and topographic analysis. The swept laser performance was specially adapted to meet imaging depth requirements. For the first time to our knowledge, the results of a quantitative corneal analysis based on SS OCT are presented for clinical pathologies such as keratoconus, a cornea with a superficial postinfectious scar, and a cornea 5 months after penetrating keratoplasty. Additionally, a comparison with widely used commercial systems, a Placido-based topographer and a Scheimpflug imaging-based topographer, is demonstrated. PMID:21991558

  6. Quantitative genetic-interaction mapping in mammalian cells

    PubMed Central

    Roguev, Assen; Talbot, Dale; Negri, Gian Luca; Shales, Michael; Cagney, Gerard; Bandyopadhyay, Sourav; Panning, Barbara; Krogan, Nevan J

    2013-01-01

    Mapping genetic interactions (GIs) by simultaneously perturbing pairs of genes is a powerful tool for understanding complex biological phenomena. Here we describe an experimental platform for generating quantitative GI maps in mammalian cells using a combinatorial RNA interference strategy. We performed ~11,000 pairwise knockdowns in mouse fibroblasts, focusing on 130 factors involved in chromatin regulation to create a GI map. Comparison of the GI and protein-protein interaction (PPI) data revealed that pairs of genes exhibiting positive GIs and/or similar genetic profiles were predictive of the corresponding proteins being physically associated. The mammalian GI map identified pathways and complexes but also resolved functionally distinct submodules within larger protein complexes. By integrating GI and PPI data, we created a functional map of chromatin complexes in mouse fibroblasts, revealing that the PAF complex is a central player in the mammalian chromatin landscape. PMID:23407553

  7. Detection and characterization of lesions on low-radiation-dose abdominal CT images postprocessed with noise reduction filters.

    PubMed

    Kalra, Mannudeep K; Maher, Michael M; Blake, Michael A; Lucey, Brian C; Karau, Kelly; Toth, Thomas L; Avinash, Gopal; Halpern, Elkan F; Saini, Sanjay

    2004-09-01

    To assess the effect of noise reduction filters on detection and characterization of lesions on low-radiation-dose abdominal computed tomographic (CT) images. Low-dose CT images of abdominal lesions in 19 consecutive patients (11 women, eight men; age range, 32-78 years) were obtained at reduced tube currents (120-144 mAs). These baseline low-dose CT images were postprocessed with six noise reduction filters; the resulting postprocessed images were then randomly assorted with baseline images. Three radiologists performed independent evaluation of randomized images for presence, number, margins, attenuation, conspicuity, calcification, and enhancement of lesions, as well as image noise. Side-by-side comparison of baseline images with postprocessed images was performed by using a five-point scale for assessing lesion conspicuity and margins, image noise, beam hardening, and diagnostic acceptability. Quantitative noise and contrast-to-noise ratio were obtained for all liver lesions. Statistical analysis was performed by using the Wilcoxon signed rank test, Student t test, and kappa test of agreement. Significant reduction of noise was observed in images postprocessed with filter F compared with the noise in baseline nonfiltered images (P =.004). Although the number of lesions seen on baseline images and that seen on postprocessed images were identical, lesions were less conspicuous on postprocessed images than on baseline images. A decrease in quantitative image noise and contrast-to-noise ratio for liver lesions was noted with all noise reduction filters. There was good interobserver agreement (kappa = 0.7). Although the use of currently available noise reduction filters improves image noise and ameliorates beam-hardening artifacts at low-dose CT, such filters are limited by a compromise in lesion conspicuity and appearance in comparison with lesion conspicuity and appearance on baseline low-dose CT images. Copyright RSNA, 2004
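
    The contrast-to-noise ratio measured above for liver lesions is commonly defined as the absolute attenuation difference between lesion and surrounding parenchyma divided by the image noise; the study's exact definition is not given in the abstract. A minimal sketch with illustrative HU values:

```python
def contrast_to_noise_ratio(mean_lesion_hu, mean_liver_hu, noise_sd_hu):
    """CNR as a lesion-conspicuity index: absolute attenuation
    difference between lesion and background divided by the noise
    standard deviation, all in Hounsfield units."""
    if noise_sd_hu <= 0:
        raise ValueError("noise standard deviation must be positive")
    return abs(mean_lesion_hu - mean_liver_hu) / noise_sd_hu

# Hypothetical values, for illustration only.
print(contrast_to_noise_ratio(45.0, 105.0, 15.0))  # 4.0
```

A filter that lowers the noise term raises this ratio only if the lesion-background contrast is preserved; blurring that erodes the contrast term is one route to the CNR decrease the authors report.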

  8. Penetration of pyrotechnic effects with SWIR laser gated viewing in comparison to VIS and thermal IR bands

    NASA Astrophysics Data System (ADS)

    Göhler, Benjamin; Lutzmann, Peter

    2016-10-01

    In this paper, the potential capability of short-wavelength infrared laser gated-viewing for penetrating the pyrotechnic effects smoke and light/heat has been investigated by evaluating data from conducted field trials. The potential of thermal infrared cameras for this purpose has also been considered and the results have been compared to conventional visible cameras as benchmark. The application area is the use in soccer stadiums where pyrotechnics are illegally burned in dense crowds of people obstructing visibility of stadium safety staff and police forces into the involved section of the stadium. Quantitative analyses have been carried out to identify sensor performances. Further, qualitative image comparisons have been presented to give impressions of image quality during the disruptive effects of burning pyrotechnics.

  9. Comparison of blood flow models and acquisitions for quantitative myocardial perfusion estimation from dynamic CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-04-01

    Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivate this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 ml/(min·g); cardiac output = 3, 5, 8 L/min). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features, including heterogeneous microvascular flow, permeability, and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. CT acquisitions that reduce radiation exposure were implemented by varying both temporal sampling (1, 2, and 3 s sampling intervals) and tube currents (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (two-compartment model, an axially-distributed model, and the adiabatic approximation to the tissue homogeneous model) and a qualitative slope-based method. In total, over 11,000 time attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by on average 47.5%, while the quantitative models provided estimates with less than 6.5% average bias and increasing variance with increasing dose reductions. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods and the range of techniques evaluated. 
This suggests that there is no particular advantage between quantitative estimation methods nor to performing dose reduction via tube current reduction compared to temporal sampling reduction. These data are important for optimizing implementation of cardiac dynamic CT in clinical practice and in prospective CT MBF trials.
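
    Slope-based perfusion indices of the kind compared above are commonly computed as the peak upslope of the tissue time-attenuation curve normalized by the peak of the arterial input function; the study's exact implementation is not specified in the abstract. A minimal sketch with synthetic curves:

```python
def slope_perfusion_index(tissue_tac, arterial_tac, dt):
    """Qualitative upslope index: maximum rate of rise of the tissue
    time-attenuation curve divided by the peak arterial enhancement.
    This yields a relative perfusion index, not calibrated
    ml/(min·g); the abstract reports such slope estimates
    systematically underestimate model-based MBF."""
    max_upslope = max((b - a) / dt
                      for a, b in zip(tissue_tac, tissue_tac[1:]))
    return max_upslope / max(arterial_tac)

# Synthetic time-attenuation curves (HU), 1 s sampling, for illustration.
tissue = [0.0, 1.0, 3.0, 6.0, 8.0, 9.0]
arterial = [0.0, 5.0, 10.0, 8.0, 4.0]
print(slope_perfusion_index(tissue, arterial, dt=1.0))
```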

  10. Comparison of GEANT4 very low energy cross section models with experimental data in water.

    PubMed

    Incerti, S; Ivanchenko, A; Karamitros, M; Mantero, A; Moretto, P; Tran, H N; Mascialino, B; Champion, C; Ivanchenko, V N; Bernal, M A; Francis, Z; Villagrasa, C; Baldacchin, G; Guèye, P; Capra, R; Nieminen, P; Zacharatou, C

    2010-09-01

    The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. 
The GEANT4-DNA physics models available in the GEANT4 toolkit have been compared in this article to available experimental data in the water vapor phase as well as to several published recommendations on the mass stopping power. These models represent a first step in the extension of the GEANT4 Monte Carlo toolkit to the simulation of biological effects of ionizing radiation.
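
    The dedicated statistical toolkit mentioned above includes the Kolmogorov-Smirnov test. Its two-sample statistic, the largest gap between the two empirical cumulative distribution functions, can be sketched as:

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the empirical CDFs of the two samples,
    evaluated at every observed value."""
    a = sorted(sample_a)
    b = sorted(sample_b)
    d = 0.0
    for x in a + b:
        cdf_a = sum(v <= x for v in a) / len(a)
        cdf_b = sum(v <= x for v in b) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d
```

In a comparison like the one above, the statistic would be computed between simulated and measured cross-section values, and a small value (relative to the critical value for the sample sizes) indicates compatibility.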

  11. Viterbi decoding for satellite and space communication.

    NASA Technical Reports Server (NTRS)

    Heller, J. A.; Jacobs, I. M.

    1971-01-01

    Convolutional coding and Viterbi decoding, along with binary phase-shift keyed modulation, are presented as an efficient system for reliable communication on power limited satellite and space channels. Performance results, obtained theoretically and through computer simulation, are given for optimum short constraint length codes for a range of code constraint lengths and code rates. System efficiency is compared for hard receiver quantization and 4- and 8-level soft quantization. The effects on performance of varying certain parameters relevant to decoder complexity and cost are examined. Quantitative performance degradation due to imperfect carrier phase coherence is evaluated and compared to that of an uncoded system. As an example of decoder performance versus complexity, a recently implemented 2-Mbit/sec, constraint length 7 Viterbi decoder is discussed. Finally, a comparison is made between Viterbi and sequential decoding in terms of suitability to various system requirements.
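
    As a concrete illustration of the decoding principle (using a textbook rate-1/2, constraint length 3 code with generators (7, 5) octal and hard decisions, not the constraint length 7 hardware discussed above), a minimal encoder and Viterbi decoder can be sketched as:

```python
G = (0b111, 0b101)  # rate-1/2 generator polynomials, constraint length 3

def _branch(state, bit):
    """Coded output pair and next state for one trellis branch."""
    reg = (bit << 2) | state                 # (newest, prev1, prev2)
    out = tuple(bin(reg & g).count("1") & 1 for g in G)
    return out, (bit << 1) | (state >> 1)

def conv_encode(bits):
    state, out = 0, []
    for bit in bits:
        symbols, state = _branch(state, bit)
        out.extend(symbols)
    return out

def viterbi_decode(received, n_info_bits):
    """Hard-decision Viterbi decoding: keep, per state and time step,
    the surviving path with the smallest Hamming distance to the
    received sequence."""
    INF = float("inf")
    metric = [0.0, INF, INF, INF]            # start in state 0
    paths = [[] for _ in range(4)]
    for t in range(n_info_bits):
        r = received[2 * t: 2 * t + 2]
        new_metric = [INF] * 4
        new_paths = [None] * 4
        for state in range(4):
            if metric[state] == INF:
                continue
            for bit in (0, 1):
                out, nxt = _branch(state, bit)
                m = metric[state] + sum(o != x for o, x in zip(out, r))
                if m < new_metric[nxt]:
                    new_metric[nxt] = m
                    new_paths[nxt] = paths[state] + [bit]
        metric, paths = new_metric, new_paths
    best = min(range(4), key=lambda s: metric[s])
    return paths[best]
```

With a free distance of 5, this code lets the decoder recover the message even when a coded bit is flipped, which is the error-correction capability the abstract quantifies for longer constraint lengths.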

  12. Cosmic-ray discrimination capabilities of ΔE-E silicon nuclear telescopes using neural networks

    NASA Astrophysics Data System (ADS)

    Ambriola, M.; Bellotti, R.; Cafagna, F.; Castellano, M.; Ciacio, F.; Circella, M.; Marzo, C. N. D.; Montaruli, T.

    2000-02-01

    An isotope classifier of cosmic-ray events collected by space detectors has been implemented using a multi-layer perceptron neural architecture. In order to handle a great number of different isotopes, a modular architecture of the ``mixture of experts'' type is proposed. The performance of this classifier has been tested on simulated data and compared with a ``classical'' classifying procedure. The quantitative comparison with traditional techniques shows that the neural approach has classification performance comparable, within 1%, with that of the classical one, with an efficiency of the order of 98%. A possible hardware implementation of such a neural architecture in future space missions is considered.

  13. Fast globally optimal segmentation of cells in fluorescence microscopy images.

    PubMed

    Bergeest, Jan-Philip; Rohr, Karl

    2011-01-01

    Accurate and efficient segmentation of cells in fluorescence microscopy images is of central importance for the quantification of protein expression in high-throughput screening applications. We propose a new approach for segmenting cell nuclei which is based on active contours and convex energy functionals. Compared to previous work, our approach determines the global solution. Thus, the approach does not suffer from local minima and the segmentation result does not depend on the initialization. We also suggest a numeric approach for efficiently computing the solution. The performance of our approach has been evaluated using fluorescence microscopy images of different cell types. We have also performed a quantitative comparison with previous segmentation approaches.

  14. Ascertainment bias from imputation methods evaluation in wheat.

    PubMed

    Brandariz, Sofía P; González Reymúndez, Agustín; Lado, Bettina; Malosetti, Marcos; Garcia, Antonio Augusto Franco; Quincke, Martín; von Zitzewitz, Jarislav; Castro, Marina; Matus, Iván; Del Pozo, Alejandro; Castro, Ariel J; Gutiérrez, Lucía

    2016-10-04

    Whole-genome genotyping techniques like genotyping-by-sequencing (GBS) are being used for genetic studies such as genome-wide association (GWAS) and genome-wide selection (GS), and different imputation strategies have been developed for them. Nevertheless, imputation error may lead to poor performance (i.e., lower power or a higher false positive rate) in analyses such as GWAS, where complete data are not required and each marker is tested one at a time. The aim of this study was to compare the performance of GWAS analysis for quantitative trait loci (QTL) of major and minor effect using different imputation methods when no reference panel is available, in a wheat GBS panel. In this study, we compared the power and false positive rate of dissecting quantitative traits for imputed and not-imputed marker score matrices in: (1) a complete molecular marker barley panel array, and (2) a GBS wheat panel with missing data. We found that there is an ascertainment bias in imputation method comparisons. Simulating over a complete matrix and creating missing data at random proved that imputation methods have a poorer performance. Furthermore, we found that when QTL were simulated with imputed data, the imputation methods performed better than the not-imputed ones. On the other hand, when QTL were simulated with not-imputed data, the not-imputed method and one of the imputation methods performed better for dissecting quantitative traits. Moreover, larger differences between imputation methods were detected for QTL of major effect than for QTL of minor effect. We also compared the different marker score matrices for GWAS analysis in a real wheat phenotype dataset and found minimal differences, indicating that imputation did not improve GWAS performance when a reference panel was not available. In summary, poorer performance was found in GWAS analysis when an imputed marker score matrix was used and no reference panel was available, in a wheat GBS panel.
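
    The power and false positive rate compared above have a straightforward simulation-based definition: among markers whose true status (QTL or null) is known, count how many fall below the significance threshold. A minimal sketch, assuming p-values and true QTL labels are available as parallel lists:

```python
def power_and_fpr(pvalues, is_true_qtl, alpha=0.05):
    """Power: fraction of simulated true QTL declared significant at
    alpha.  False positive rate: fraction of null markers declared
    significant.  Inputs are parallel lists, one entry per marker."""
    true_p = [p for p, t in zip(pvalues, is_true_qtl) if t]
    null_p = [p for p, t in zip(pvalues, is_true_qtl) if not t]
    power = sum(p < alpha for p in true_p) / len(true_p)
    fpr = sum(p < alpha for p in null_p) / len(null_p)
    return power, fpr

# Toy example: two true QTL, two null markers.
print(power_and_fpr([0.01, 0.2, 0.03, 0.5],
                    [True, True, False, False]))
```

In the study's design, this calculation would be repeated per imputation method over the simulated QTL to produce the reported comparisons.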

  15. High-pitch dual-source CT angiography without ECG-gating for imaging the whole aorta: intraindividual comparison with standard pitch single-source technique without ECG-gating

    PubMed Central

    Manna, Carmelinda; Silva, Mario; Cobelli, Rocco; Poggesi, Sara; Rossi, Cristina; Sverzellati, Nicola

    2017-01-01

    PURPOSE We aimed to perform intraindividual comparison of computed tomography (CT) parameters, image quality, and radiation exposure between standard CT angiography (CTA) and high-pitch dual source (DS)-CTA, in subjects undergoing serial CTA of thoracoabdominal aorta. METHODS Eighteen subjects with thoracoabdominal CTA by standard technique and high-pitch DS-CTA technique within 6 months of each other were retrieved for intraindividual comparison of image quality in thoracic and abdominal aorta. Quantitative analysis was performed by comparison of mean aortic attenuation, noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Qualitative analysis was performed by visual assessment of motion artifacts and diagnostic confidence. Radiation exposure was quantified by effective dose. Image quality was apportioned to radiation exposure by means of figure of merit. RESULTS Mean aortic attenuation and noise were higher in high-pitch DS-CTA of thoracoabdominal aorta, whereas SNR and CNR were similar in thoracic aorta and significantly lower in high-pitch DS-CTA of abdominal aorta (P = 0.024 and P = 0.016). High-pitch DS-CTA was significantly better in the first segment of thoracic aorta. Effective dose was reduced by 72% in high-pitch DS-CTA. CONCLUSION High-pitch DS-CTA without electrocardiography-gating is an effective technique for imaging aorta with very low radiation exposure and with significant reduction of motion artifacts in ascending aorta; however, the overall quality of high-pitch DS-CTA in abdominal aorta is lower than standard CTA. PMID:28703104

  16. KIN-Nav navigation system for kinematic assessment in anterior cruciate ligament reconstruction: features, use, and perspectives.

    PubMed

    Martelli, S; Zaffagnini, S; Bignozzi, S; Lopomo, N F; Iacono, F; Marcacci, M

    2007-10-01

    In this paper a new navigation system, KIN-Nav, developed for research and used during 80 anterior cruciate ligament (ACL) reconstructions is described. KIN-Nav is a user-friendly navigation system for flexible intraoperative acquisitions of anatomical and kinematic data, suitable for validation of biomechanical hypotheses. It performs real-time quantitative evaluation of antero-posterior, internal-external, and varus-valgus knee laxity at any degree of flexion and provides a new interface for this task, suitable also for comparison of pre-operative and post-operative knee laxity and surgical documentation. In this paper the concept and features of KIN-Nav, which represents a new approach to navigation and allows the investigation of new quantitative measurements in ACL reconstruction, are described. Two clinical studies are reported, as examples of clinical potentiality and correct use of this methodology. In this paper a preliminary analysis of KIN-Nav's reliability and clinical efficacy, performed during blinded repeated measures by three independent examiners, is also given. This analysis is the first assessment of the potential of navigation systems for evaluating knee kinematics.

  17. Molecule kernels: a descriptor- and alignment-free quantitative structure-activity relationship approach.

    PubMed

    Mohr, Johannes A; Jain, Brijnesh J; Obermayer, Klaus

    2008-09-01

    Quantitative structure-activity relationship (QSAR) analysis is traditionally based on extracting a set of molecular descriptors and using them to build a predictive model. In this work, we propose a QSAR approach based directly on the similarity between the 3D structures of a set of molecules, measured by a so-called molecule kernel, which is independent of the spatial prealignment of the compounds. Predictors can be built using the molecule kernel in conjunction with the potential support vector machine (P-SVM), a recently proposed machine learning method for dyadic data. The resulting models make direct use of the structural similarities between the compounds in the test set and a subset of the training set and do not require explicit descriptor construction. We evaluated the predictive performance of the proposed method on one classification and four regression QSAR datasets and compared its results to the results reported in the literature for several state-of-the-art descriptor-based and 3D QSAR approaches. In this comparison, the proposed molecule kernel method performed better than the other QSAR methods.
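
    The key property of such a kernel is invariance to rotation and translation of the compounds. One simple way to obtain that invariance, shown below purely as a toy illustration and not as the paper's molecule kernel, is to compare the multisets of interatomic distances, which are unchanged by rigid motion:

```python
import math

def pairwise_distances(coords):
    """Sorted interatomic distances of a molecule: invariant under
    rotation and translation of the coordinate frame."""
    return sorted(math.dist(a, b)
                  for i, a in enumerate(coords)
                  for b in coords[i + 1:])

def distance_set_kernel(mol_a, mol_b, sigma=1.0):
    """Toy alignment-free 3D similarity: a Gaussian kernel summed over
    all pairs of interatomic distances from the two molecules."""
    return sum(math.exp(-((da - db) ** 2) / (2.0 * sigma ** 2))
               for da in pairwise_distances(mol_a)
               for db in pairwise_distances(mol_b))
```

Because only internal distances enter the computation, a translated or rotated copy of a molecule yields exactly the same similarity, which is the prealignment independence the abstract emphasizes.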

  18. Benchmarking the performance of fixed-image receptor digital radiographic systems part 1: a novel method for image quality analysis.

    PubMed

    Lee, Kam L; Ireland, Timothy A; Bernardo, Michael

    2016-06-01

    This is the first part of a two-part study in benchmarking the performance of fixed digital radiographic general X-ray systems. This paper concentrates on reporting findings related to quantitative analysis techniques used to establish comparative image quality metrics. A systematic technical comparison of the evaluated systems is presented in part two of this study. A novel quantitative image quality analysis method is presented with technical considerations addressed for peer review. The novel method was applied to seven general radiographic systems with four different makes of radiographic image receptor (12 image receptors in total). For the System Modulation Transfer Function (sMTF), the use of grid was found to reduce veiling glare and decrease roll-off. The major contributor in sMTF degradation was found to be focal spot blurring. For the System Normalised Noise Power Spectrum (sNNPS), it was found that all systems examined had similar sNNPS responses. A mathematical model is presented to explain how the use of stationary grid may cause a difference between horizontal and vertical sNNPS responses.

  19. Standardizing Quality Assessment of Fused Remotely Sensed Images

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites has led to the development of many image fusion techniques to provide high spatial, spectral, and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment is done by different criteria. Depending on the criteria and indices, the result varies. It is therefore necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e., Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports the results of the comparison and provides recommendations for future research.

  20. Standardizing evaluation of pQCT image quality in the presence of subject movement: qualitative versus quantitative assessment.

    PubMed

    Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B

    2014-02-01

    Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Agreement between measurers was strong (intraclass correlation coefficient = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25% grouping. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat.
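
    The %Move metric above is a movement-to-limb-size ratio with a repeat threshold of 25%. A minimal sketch of that decision rule, assuming %Move is computed as movement-artifact extent over limb diameter (the study's exact operational measurement is not given in the abstract):

```python
def percent_move(movement_extent_mm, limb_diameter_mm):
    """Movement-artifact extent relative to limb size, as a
    percentage.  The exact measurement of 'movement extent' here is
    an assumption for illustration."""
    return 100.0 * movement_extent_mm / limb_diameter_mm

def needs_repeat(movement_extent_mm, limb_diameter_mm, threshold=25.0):
    """Flag a diaphyseal scan for repeat when %Move exceeds the 25%
    delineation the study's data suggest."""
    return percent_move(movement_extent_mm, limb_diameter_mm) > threshold
```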

  1. Standardizing Evaluation of pQCT Image Quality in the Presence of Subject Movement: Qualitative vs. Quantitative Assessment

    PubMed Central

    Blew, Robert M.; Lee, Vinson R.; Farr, Joshua N.; Schiferl, Daniel J.; Going, Scott B.

    2013-01-01

    Purpose Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale, and (2) establish a quantitative motion assessment methodology. Methods Scans were performed on 506 healthy girls (9–13yr) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n=46) was examined to determine %Move’s impact on bone parameters. Results Agreement between measurers was strong (ICC = .732 for tibia, .812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat or no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement, showed significant differences in the >25% grouping. Conclusions A pQCT visual inspection scale can be a reliable metric of image quality but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat. PMID:24077875

  2. A university teaching simulation facility

    NASA Technical Reports Server (NTRS)

    Stark, Lawrence; Kim, Won-Soo; Tendick, Frank; Tyler, Mitchell; Hannaford, Blake; Barakat, Wissam; Bergengruen, Olaf; Braddi, Louis; Eisenberg, Joseph; Ellis, Stephen

    1987-01-01

    An experimental telerobotics (TR) simulation is described suitable for studying human operator (HO) performance. Simple manipulator pick-and-place and tracking tasks allowed quantitative comparison of a number of calligraphic display viewing conditions. A number of control modes could be compared in this TR simulation, including displacement, rate, and acceleratory control using position and force joysticks. A homeomorphic controller turned out to be no better than joysticks; the adaptive properties of the HO can apparently permit quite good control over a variety of controller configurations and control modes. Training by optimal control example seemed helpful in preliminary experiments.

  3. Parameters for Quantitative Comparison of Two-, Three-, and Four-level Laser Media, Operating Wavelengths, and Temperatures

    DTIC Science & Technology

    2010-08-01

    Performing organization: U.S. Army Research Laboratory, ATTN: RDRL-SEE-O, 2800 Powder Mill Road, Adelphi, MD 20783-1197. Cited work includes Kudryashov and D. Garbuzov, "Resonant pumping and upconversion in 1.6 μm Er lasers," J. Opt. Soc. Amer. B, vol. 24, pp. 2454–2460, Sep. 2007.

  4. High pressure rinsing system comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. Sertore; M. Fusetti; P. Michelato

    2007-06-01

    High pressure rinsing (HPR) is a key process for the surface preparation of high-field superconducting cavities. A portable apparatus for water jet characterization, based on the momentum transferred between the water jet and a load cell, has been used in different laboratories. This apparatus allows the collection of quantitative parameters that characterize the HPR water jet. In this paper, we present a quantitative comparison of the water jets produced by the various nozzles routinely used in different laboratories for the HPR process.
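The load-cell principle described here rests on momentum transfer: a jet of density ρ and volumetric flow rate Q through nozzle area A delivers force F = ρQv = ρQ²/A when fully stopped. A rough sketch under that idealized full-transfer assumption (function name and example values are ours, not from the paper):

```python
import math

RHO_WATER = 1000.0  # kg/m^3

def jet_force(flow_rate_lpm, nozzle_diameter_mm):
    """Force (N) on a load cell that fully stops the jet: F = rho * Q**2 / A."""
    q = flow_rate_lpm / 1000.0 / 60.0                    # L/min -> m^3/s
    area = math.pi * (nozzle_diameter_mm / 2000.0) ** 2  # mm diameter -> m^2 area
    return RHO_WATER * q ** 2 / area
```

For example, 6 L/min through a 0.6 mm nozzle gives a force on the order of 35 N, which is the kind of quantitative parameter such an apparatus can compare across nozzles.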

  5. Field Demonstration Report Applied Innovative Technologies for Characterization of Nitrocellulose- and Nitroglycerine Contaminated Buildings and Soils, Rev 1

    DTIC Science & Technology

    2007-01-05

    The quantitative on-site methods were evaluated using linear regression analysis and relative percent difference (RPD) comparison, alongside assessment of false positives and false negatives; the report also covers quantitative analysis using CRREL methods and quantitative analysis for NG by GC/TID.
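The relative percent difference (RPD) used for such method comparisons is conventionally the absolute difference between two results divided by their mean, expressed as a percentage. A minimal sketch (function name is ours):

```python
def relative_percent_difference(a, b):
    """RPD: absolute difference divided by the mean of the two results, in percent."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0
```

Two measurements of 110 and 90 (mean 100, difference 20) give an RPD of 20%.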

  6. Quantitative analysis of Ni2+/Ni3+ in Li[NixMnyCoz]O2 cathode materials: Non-linear least-squares fitting of XPS spectra

    NASA Astrophysics Data System (ADS)

    Fu, Zewei; Hu, Juntao; Hu, Wenlong; Yang, Shiyu; Luo, Yunfeng

    2018-05-01

    Quantitative analysis of Ni2+/Ni3+ using X-ray photoelectron spectroscopy (XPS) is important for evaluating the crystal structure and electrochemical performance of lithium-nickel-cobalt-manganese oxide (Li[NixMnyCoz]O2, NMC). However, quantitative analysis based on Gaussian/Lorentzian (G/L) peak fitting suffers from challenges of reproducibility and effectiveness. In this study, Ni2+ and Ni3+ standard samples and a series of NMC samples with different Ni doping levels were synthesized. The Ni2+/Ni3+ ratios in NMC were quantitatively analyzed by non-linear least-squares fitting (NLLSF). Two Ni 2p overall spectra of synthesized Li[Ni0.33Mn0.33Co0.33]O2 (NMC111) and bulk LiNiO2 were used as the Ni2+ and Ni3+ reference standards. Compared to G/L peak fitting, the fitting parameters required no adjustment, meaning that the spectral fitting process was free from operator dependence and reproducibility was improved. Comparison of the residual standard deviation (STD) showed that the fitting quality of NLLSF was superior to that of G/L peak fitting. Overall, these findings confirmed the reproducibility and effectiveness of the NLLSF method in XPS quantitative analysis of the Ni2+/Ni3+ ratio in Li[NixMnyCoz]O2 cathode materials.
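Because the NLLSF approach fixes the reference line shapes and fits only their contributions, the core computation reduces to a least-squares fit of the measured spectrum against the two standards. A simplified sketch with synthetic Gaussian "spectra" (the real method operates on measured Ni 2p envelopes and may include background terms):

```python
import numpy as np

def fit_ni_ratio(spectrum, ref_ni2, ref_ni3):
    """Fit a measured Ni 2p envelope as a linear combination of the Ni2+ and
    Ni3+ reference spectra; return the ratio of the two fitted coefficients."""
    A = np.column_stack([ref_ni2, ref_ni3])
    (c2, c3), *_ = np.linalg.lstsq(A, spectrum, rcond=None)
    return c2 / c3

# synthetic check: a 40/60 mixture of two slightly shifted Gaussian "standards"
x = np.linspace(850.0, 885.0, 200)
ref2 = np.exp(-(x - 855.0) ** 2 / 4.0)
ref3 = np.exp(-(x - 857.0) ** 2 / 4.0)
mix = 0.4 * ref2 + 0.6 * ref3
```

With fixed reference shapes there are no line-shape parameters for an operator to adjust, which is the reproducibility advantage the abstract describes.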

  7. Electroencephalography reactivity for prognostication of post-anoxic coma after cardiopulmonary resuscitation: A comparison of quantitative analysis and visual analysis.

    PubMed

    Liu, Gang; Su, Yingying; Jiang, Mengdi; Chen, Weibi; Zhang, Yan; Zhang, Yunzhou; Gao, Daiquan

    2016-07-28

    Electroencephalogram reactivity (EEG-R) is a positive predictive factor for assessing outcomes in comatose patients. Most studies assess the prognostic value of EEG-R utilizing visual analysis; however, this method is prone to subjectivity. We sought to categorize EEG-R with a quantitative approach. We retrospectively studied consecutive comatose patients who had an EEG-R recording performed 1-3 days after cardiopulmonary resuscitation (CPR) or during normothermia after therapeutic hypothermia. EEG-R was assessed via visual analysis and quantitative analysis separately. Clinical outcomes were followed up at 3 months and dichotomized as recovery of awareness or no recovery of awareness. A total of 96 patients met the inclusion criteria, and 38 (40%) patients had recovered awareness at 3-month follow-up. Of the 27 patients with EEG-R on visual analysis, 22 recovered awareness; of the 69 patients who did not demonstrate EEG-R, 16 recovered awareness. The sensitivity and specificity of visually measured EEG-R were 58% and 91%, respectively. The area under the receiver operating characteristic curve for the quantitative analysis was 0.92 (95% confidence interval, 0.87-0.97), with a best cut-off value of 0.10. EEG-R through quantitative analysis might be a good method for predicting the recovery of awareness in patients with post-anoxic coma after CPR. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
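The reported sensitivity and specificity of visual EEG-R follow directly from the counts in the abstract (27 patients showed EEG-R, of whom 22 recovered; 69 did not, of whom 16 recovered). A quick check:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# counts from the abstract: EEG-R present in 27 (22 recovered awareness),
# EEG-R absent in 69 (16 recovered awareness)
tp, fp = 22, 27 - 22
fn, tn = 16, 69 - 16
sens, spec = sensitivity_specificity(tp, fn, tn, fp)
```

This reproduces the stated 58% sensitivity (22/38) and 91% specificity (53/58).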

  8. Digitized hand-wrist radiographs: comparison of subjective and software-derived image quality at various compression ratios.

    PubMed

    McCord, Layne K; Scarfe, William C; Naylor, Rachel H; Scheetz, James P; Silveira, Anibal; Gillespie, Kevin R

    2007-05-01

    The objectives of this study were to assess the effect of JPEG 2000 compression of hand-wrist radiographs on observers' subjective ratings of image quality and to compare these ratings with a software-derived quantitative image quality index. Fifteen hand-wrist radiographs were digitized and saved as TIFF and JPEG 2000 images at 4 levels of compression (20:1, 40:1, 60:1, and 80:1). The images, including rereads, were viewed by 13 orthodontic residents who rated image quality on a scale of 1 to 5. A quantitative analysis was also performed by using readily available software based on the human visual system (Image Quality Measure Computer Program, version 6.2, Mitre, Bedford, Mass). ANOVA was used to determine the optimal compression level (P ≤ .05). When we compared subjective indexes, JPEG compression greater than 60:1 significantly reduced image quality. When we used quantitative indexes, the JPEG 2000 images had lower quality at all compression ratios compared with the original TIFF images. There was excellent correlation (R2 > 0.92) between qualitative and quantitative indexes. Image Quality Measure indexes are more sensitive than subjective image quality assessments in quantifying image degradation with compression. There is potential for this software-based quantitative method in determining the optimal compression ratio for any image without the use of subjective raters.

  9. Diagnostic value of (99m)Tc-3PRGD2 scintimammography for differentiation of malignant from benign breast lesions: Comparison of visual and semi-quantitative analysis.

    PubMed

    Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie

    2015-01-01

    To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m-polyethylene glycol, 4-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and also to investigate the incremental role of the semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749 ± 86 MBq of the radiotracer. Images were evaluated by visual interpretation and semi-quantitative indices of tumor to non-tumor (T/N) ratios, which were compared with pathology results. Receiver operating characteristics (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of the semi-quantitative indices, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery: 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions (P<0.05). When visual grade 2 was used as the cut-off value for the detection of primary breast cancer, the sensitivity, specificity and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as the cut-off value, the sensitivity, specificity and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the difference was not statistically significant (P=0.372). 
Compared with visual analysis or semi-quantitative analysis alone, the sensitivity, specificity and accuracy of visual analysis combined with semi-quantitative analysis in diagnosing primary breast cancer were higher: 87.5%, 82.9%, and 85.4%, respectively. The area under the curve was 0.891. The results of the present study suggest that semi-quantitative and visual analysis performed statistically similarly. The semi-quantitative analysis provided incremental value additive to visual analysis of (99m)Tc-3PRGD2 SMG for the detection of breast cancer. Our results also suggest that, when the tumor was located in the medial part of the breast, the semi-quantitative analysis gave better diagnostic results.

  10. Numerical formulation for the prediction of solid/liquid change of a binary alloy

    NASA Technical Reports Server (NTRS)

    Schneider, G. E.; Tiwari, S. N.

    1990-01-01

    A computational model is presented for the prediction of solid/liquid phase change energy transport, including the influence of free convection fluid flow in the liquid phase region. The computational model sets the velocity components of all non-liquid phase change material control volumes to zero but fully solves the coupled mass-momentum problem within the liquid region. The thermal energy model includes the entire domain and uses an enthalpy-like model and a recently developed method for handling the phase change interface nonlinearity. Convergence studies are performed and comparisons made with experimental data for two different problem specifications. The convergence studies indicate that grid independence was achieved, and the comparison with experimental data indicates excellent quantitative prediction of the melt fraction evolution. Qualitative data are also provided in the form of velocity vector diagrams and isotherm plots for selected times in the evolution of both problems. The computational costs incurred are quite low in comparison with previous efforts on solving these problems.

  11. Haemodynamics of giant cerebral aneurysm: A comparison between the rigid-wall, one-way and two-way FSI models

    NASA Astrophysics Data System (ADS)

    Khe, A. K.; Cherevko, A. A.; Chupakhin, A. P.; Bobkova, M. S.; Krivoshapkin, A. L.; Orlov, K. Yu

    2016-06-01

    In this paper a computer simulation of blood flow in cerebral vessels with a giant saccular aneurysm at the bifurcation of the basilar artery is performed. The modelling is based on patient-specific clinical data (both flow domain geometry and boundary conditions for the inlets and outlets). The hydrodynamic and mechanical parameters are calculated in the frameworks of three models: a rigid-wall assumption, a one-way FSI approach, and a full (two-way) hydroelastic model. A comparison of the numerical solutions shows that mutual fluid-solid interaction can result in qualitative changes in the structure of the fluid flow. Other characteristics of the flow (pressure, stress, strain and displacement) qualitatively agree with each other across the different approaches. However, the quantitative comparison shows that accounting for the flow-vessel interaction, in general, decreases the absolute values of these parameters. Solving the full hydroelasticity problem gives a more detailed solution at the cost of greatly increased computational time.

  12. Comparison of droplet digital PCR with quantitative real-time PCR for determination of zygosity in transgenic maize.

    PubMed

    Xu, Xiaoli; Peng, Cheng; Wang, Xiaofu; Chen, Xiaoyun; Wang, Qiang; Xu, Junfeng

    2016-12-01

    This study evaluated the applicability of droplet digital PCR (ddPCR) as a tool for maize zygosity determination, using quantitative real-time PCR (qPCR) as a reference technology. Quantitative real-time PCR is commonly used to determine transgene copy number or to characterize GMO zygosity. However, its effectiveness is based on identical reaction efficiencies for the transgene and the endogenous reference gene. Additionally, a calibrator sample should be utilized for accuracy. Droplet digital PCR is a DNA molecule counting technique that directly counts the absolute number of target and reference DNA molecules in a sample, independent of assay efficiency or external calibrators. The zygosity of the transgene can be easily determined using the ratio of the quantity of the target gene to that of the reference single-copy endogenous gene. In this study, both the qPCR and ddPCR methods were used to determine insect-resistant transgenic maize IE034 zygosity. Both methods performed well, but the ddPCR method was more convenient because of its absolute quantification property.
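The zygosity call described here is a ratio test: with a single-copy endogenous reference gene (two alleles per diploid genome), a transgene/reference copy ratio near 0.5 indicates a hemizygote and near 1.0 a homozygote. A minimal sketch (function name and tolerance are ours):

```python
def zygosity_from_ratio(transgene_copies, reference_copies, tol=0.15):
    """Classify zygosity from the ddPCR transgene/reference copy ratio,
    assuming a single-copy endogenous gene in a diploid genome."""
    r = transgene_copies / reference_copies
    if abs(r - 0.5) <= 0.5 * tol:
        return "hemizygous"
    if abs(r - 1.0) <= tol:
        return "homozygous"
    return "indeterminate"
```

Because ddPCR counts molecules absolutely, this ratio needs no standard curve or calibrator sample, which is the convenience the abstract notes.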

  13. CNV-ROC: A cost effective, computer-aided analytical performance evaluator of chromosomal microarrays.

    PubMed

    Goodman, Corey W; Major, Heather J; Walls, William D; Sheffield, Val C; Casavant, Thomas L; Darbro, Benjamin W

    2015-04-01

    Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of the various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high-throughput, low-cost analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides for a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis, and receiver operator characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as the log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs: the measurement and use of genome-wide true and false negative data for the calculation of performance metrics and the comparison of CNV profiles between different microarray experiments. Copyright © 2015 Elsevier Inc. All rights reserved.
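The per-probe ROC calibration described here can be illustrated by sweeping a log2-ratio threshold and scoring calls against the higher-resolution array's calls taken as ground truth. This is a hypothetical sketch, not the CNV-ROC implementation itself:

```python
def roc_points(log2_ratios, truths, thresholds):
    """TPR/FPR per candidate |log2 ratio| threshold, scoring each probe's call
    against the higher-resolution array's call treated as ground truth."""
    points = []
    for t in thresholds:
        calls = [abs(r) >= t for r in log2_ratios]
        tp = sum(c and g for c, g in zip(calls, truths))
        fp = sum(c and not g for c, g in zip(calls, truths))
        fn = sum(not c and g for c, g in zip(calls, truths))
        tn = sum(not c and not g for c, g in zip(calls, truths))
        points.append((fp / (fp + tn), tp / (tp + fn)))
    return points
```

Scanning thresholds this way traces out the ROC curve from which an optimal log2-ratio cut-off can be chosen; doing it per probe avoids the arbitrary overlap criteria the abstract mentions.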

  14. Quantitative CT characterization of pediatric lung development using routine clinical imaging

    PubMed Central

    Stein, Jill M.; Walkup, Laura L.; Brody, Alan S.; Fleck, Robert J.

    2016-01-01

    Background The use of quantitative CT analysis in children is limited by a lack of normal values of lung parenchymal attenuation. These values are important because normal lung development yields significant parenchymal attenuation changes as children age. Objective To perform quantitative characterization of normal pediatric lung parenchymal X-ray CT attenuation under routine clinical conditions in order to establish a baseline for comparison to pathological lung conditions. Materials and methods We conducted a retrospective query of normal CT chest examinations in children ages 0–7 years from 2004 to 2014 using a standard clinical protocol. Semi-automated lung parenchymal segmentation was performed on these examinations to measure lung volume and mean lung attenuation. Results We analyzed 42 CT examinations in 39 children, ages 3 days to 83 months (mean ± standard deviation [SD] = 42±27 months). Lung volume ranged 0.10–1.72 liters (L). Mean lung attenuation was much higher in children younger than 12 months, with values as high as −380 Hounsfield units (HU) in neonates (lung volume 0.10 L). Mean lung attenuation decreased to approximately −650 HU by age 2 years (lung volume 0.47 L), with a subsequently slower exponential decrease toward a relatively constant value of −860 HU as age and lung volume increased. Conclusion Normal lung parenchymal X-ray CT attenuation decreases with increasing lung volume and age; lung attenuation decreases rapidly in the first 2 years of life and more slowly thereafter. This change in normal lung attenuation should be taken into account as quantitative CT methods are translated to pediatric pulmonary imaging. PMID:27576458
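The attenuation trajectory described, from about −380 HU at 0.10 L toward a −860 HU plateau, passing roughly −650 HU at 0.47 L, is consistent with a simple exponential decay in lung volume. The sketch below solves such a model through the two reported anchor points; the functional form is our assumption, not the paper's fitted model:

```python
import math

# anchor values read from the abstract: (lung volume in L, mean attenuation in HU)
v1, a1 = 0.10, -380.0
v2, a2 = 0.47, -650.0
a_inf = -860.0  # plateau attenuation at large lung volumes

# solve A(V) = a_inf + C * exp(-V / v0) so it passes through both anchors
v0 = (v2 - v1) / math.log((a1 - a_inf) / (a2 - a_inf))
C = (a1 - a_inf) * math.exp(v1 / v0)

def attenuation(volume_l):
    """Modelled mean lung attenuation (HU) at a given lung volume (L)."""
    return a_inf + C * math.exp(-volume_l / v0)
```

At the largest reported volume (1.72 L) this model sits a little above −860 HU, matching the "relatively constant value" the abstract describes.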

  15. Facile quantitation of free thiols in a recombinant monoclonal antibody by reversed-phase high performance liquid chromatography with hydrophobicity-tailored thiol derivatization.

    PubMed

    Welch, Leslie; Dong, Xiao; Hewitt, Daniel; Irwin, Michelle; McCarty, Luke; Tsai, Christina; Baginski, Tomasz

    2018-06-02

    Free thiol content, and its consistency, is one of the product quality attributes of interest during technical development of manufactured recombinant monoclonal antibodies (mAbs). We describe a new, mid/high-throughput reversed-phase high performance liquid chromatography (RP-HPLC) method, coupled with derivatization of free thiols, for the determination of total free thiol content in an E. coli-expressed therapeutic monovalent monoclonal antibody, mAb1. Initial selection of the derivatization reagent used a hydrophobicity-tailored approach. Maleimide-based thiol-reactive reagents with varying degrees of hydrophobicity were assessed to identify and select one that provided adequate chromatographic resolution and robust quantitation of free thiol-containing mAb1 forms. The method relies on covalent derivatization of free thiols in denatured mAb1 with an N-tert-butylmaleimide (NtBM) label, followed by RP-HPLC separation with UV-based quantitation of native (disulfide-containing) and labeled (free thiol-containing) forms. The method demonstrated good specificity, precision, linearity, accuracy and robustness. Accuracy of the method, for samples with a wide range of free thiol content, was demonstrated using admixtures as well as by comparison to an orthogonal LC-MS peptide mapping method with isotope tagging of free thiols. The developed method has a facile workflow which fits well into both R&D characterization and quality control (QC) testing environments. The hydrophobicity-tailored approach to the selection of the free thiol derivatization reagent is easily applied to the rapid development of free thiol quantitation methods for full-length recombinant antibodies. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Quantitative assessment of ischemia and reactive hyperemia of the dermal layers using multi - spectral imaging on the human arm

    NASA Astrophysics Data System (ADS)

    Kainerstorfer, Jana M.; Amyot, Franck; Demos, Stavros G.; Hassan, Moinuddin; Chernomordik, Victor; Hitzenberger, Christoph K.; Gandjbakhche, Amir H.; Riley, Jason D.

    2009-07-01

    Quantitative assessment of skin chromophores in a non-invasive fashion is often desirable. In particular, pixel-wise assessment of blood volume and blood oxygenation is beneficial for improved diagnostics. We utilized a multi-spectral imaging system for acquiring diffuse reflectance images of healthy volunteers' lower forearms. Ischemia and reactive hyperemia were induced by occluding the upper arm with a pressure cuff at 180 mmHg for 5 min. Multi-spectral images were taken every 30 s before, during and after occlusion. Image reconstruction for blood volume and blood oxygenation was performed using a two-layered skin model. As the images were taken in a non-contact way, strong artifacts related to the shape (curvature) of the arms were observed, making reconstruction of optical/physiological parameters highly inaccurate. We developed a curvature correction method, which is based on extracting the curvature directly from the acquired intensity images and does not require any additional measurements of the imaged object. The effectiveness of the algorithm was demonstrated on reconstruction results of blood volume and blood oxygenation for in vivo data during occlusion of the arm. Pixel-wise assessment of blood volume and blood oxygenation was made possible over the entire image area, and a comparison of occlusion effects between veins and surrounding skin was performed. Induced ischemia during occlusion and reactive hyperemia afterwards were observed and quantitatively assessed. Furthermore, the influence of epidermal thickness on reconstruction results was evaluated, and the need for exact knowledge of this parameter for fully quantitative assessment was pointed out.

  17. A comparison of cosegregation analysis methods for the clinical setting.

    PubMed

    Rañola, John Michael O; Liu, Quanhui; Rosenthal, Elisabeth A; Shirts, Brian H

    2018-04-01

    Quantitative cosegregation analysis can help evaluate the pathogenicity of genetic variants. However, genetics professionals without statistical training often use simple methods, reporting only qualitative findings. We evaluate the potential utility of quantitative cosegregation in the clinical setting by comparing three methods. One thousand pedigrees each were simulated for benign and pathogenic variants in BRCA1 and MLH1 using United States historical demographic data to produce pedigrees similar to those seen in the clinic. These pedigrees were analyzed using two robust methods, full likelihood Bayes factors (FLB) and cosegregation likelihood ratios (CSLR), and a simpler method, counting meioses. Both FLB and CSLR outperform counting meioses when dealing with pathogenic variants, though counting meioses is not far behind. For benign variants, FLB and CSLR greatly outperform counting meioses, which cannot generate evidence in favor of a benign classification. Comparing FLB and CSLR, we find that the two methods perform similarly, indicating that quantitative results from either of these methods could be combined in multifactorial calculations. Combining quantitative information will be important, as isolated use of cosegregation in single families will yield classification for less than 1% of variants. To encourage wider use of robust cosegregation analysis, we present a website (http://www.analyze.myvariant.org) which implements the CSLR, FLB, and counting meioses methods for ATM, BRCA1, BRCA2, CHEK2, MEN1, MLH1, MSH2, MSH6, and PMS2. We also present an R package, CoSeg, which performs the CSLR analysis on any gene with user-supplied parameters. Future variant classification guidelines should allow nuanced inclusion of cosegregation evidence against pathogenicity.
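Counting meioses, the simplest of the three methods, can be read as a likelihood ratio: each informative meiosis in which the variant cosegregates with disease has probability 1/2 under the null, so N such meioses give a ratio of 2^N. This also shows why the method cannot support a benign classification: the ratio never drops below 1. A minimal sketch under that simplified model (function name is ours):

```python
def counting_meioses_lr(n_informative_meioses):
    """Likelihood ratio for pathogenicity under the simple counting-meioses model:
    each informative cosegregating meiosis has chance probability 1/2 under the null."""
    return 2 ** n_informative_meioses
```

Three informative meioses give a likelihood ratio of 8, while zero give 1 (no evidence either way), consistent with the abstract's observation about benign variants.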

  18. Development of a Quantitative Decision Metric for Selecting the Most Suitable Discretization Method for SN Transport Problems

    NASA Astrophysics Data System (ADS)

    Schunert, Sebastian

    In this work we develop a quantitative decision metric for spatial discretization methods of the SN equations. The quantitative decision metric utilizes performance data from selected test problems for computing a fitness score that is used for the selection of the most suitable discretization method for a particular SN transport application. The fitness score is aggregated as a weighted geometric mean of single performance indicators representing various performance aspects relevant to the user. Thus, the fitness function can be adjusted to the particular needs of the code practitioner by adding/removing single performance indicators or changing their importance via the supplied weights. Within this work a special, broad class of methods is considered, referred to as nodal methods. This class naturally comprises the DGFEM methods of all function space families. Within this work it is also shown that the Higher Order Diamond Difference (HODD) method is a nodal method. Building on earlier findings that the Arbitrarily High Order Method of the Nodal type (AHOTN) is also a nodal method, a generalized finite-element framework is created to yield as special cases various methods that were developed independently using profoundly different formalisms. A selection of test problems, each related to a certain performance aspect, is considered: a Method of Manufactured Solutions (MMS) test suite for assessing accuracy and execution time, Lathrop's test problem for assessing resilience against occurrence of negative fluxes, and a simple, homogeneous cube test problem to verify whether a method possesses the thick diffusive limit. The contending methods are implemented as efficiently as possible under a common SN transport code framework to level the playing field for a fair comparison of their computational load. 
Numerical results are presented for all three test problems, and a qualitative rating of each method's performance is provided separately for each aspect: accuracy/efficiency, resilience against negative fluxes, and possession of the thick diffusion limit. The choice of the most efficient method depends on the utilized error norm: in Lp error norms higher order methods such as the AHOTN method of order three perform best, while for computing integral quantities the linear nodal (LN) method is most efficient. The method most resilient against occurrence of negative fluxes is the simple corner balance (SCB) method. A validation of the quantitative decision metric is performed based on the NEA box-in-box suite of test problems. The validation exercise comprises two stages: first, prediction of the contending methods' performance via the decision metric, and second, computation of the actual scores based on data obtained from the NEA benchmark problem. The comparison of predicted and actual scores via a penalty function (ratio of the predicted best performer's score to the actual best score) completes the validation exercise. It is found that the decision metric is capable of very accurate predictions (penalty < 10%) in more than 83% of the considered cases and features penalties up to 20% for the remaining cases. An exception to this rule is the third test case, NEA-III, which was intentionally set up to incorporate a poor match between the benchmark and the "data" problems. However, even under these worst-case conditions the decision metric's suggestions are never detrimental. Suggestions for improving the decision metric's accuracy are to increase the pool of employed data, to refine the mapping of a given configuration to a case in the database, and to better characterize the desired target quantities.
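The fitness-score aggregation described above, a weighted geometric mean of single performance indicators, can be sketched directly (function name and example values are ours):

```python
import math

def fitness_score(indicators, weights):
    """Weighted geometric mean of single performance indicators:
    prod(x_i ** (w_i / sum(w)))."""
    total = sum(weights)
    return math.prod(x ** (w / total) for x, w in zip(indicators, weights))
```

A practitioner can re-weight the indicators to reflect their own priorities, which is exactly the adjustability the abstract attributes to the fitness function. With equal weights, indicators of 4 and 16 aggregate to their geometric mean, 8.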

  19. Can performance-based incentives improve motivation of nurses and midwives in primary facilities in northern Ghana? A quasi-experimental study.

    PubMed

    Aninanya, Gifty Apiung; Howard, Natasha; Williams, John E; Apam, Benjamin; Prytherch, Helen; Loukanova, Svetla; Kamara, Eunice Karanja; Otupiri, Easmon

    2016-01-01

    Lack of an adequate and well-performing health workforce has emerged as the biggest barrier to scaling up health services provision in sub-Saharan Africa. As the global community commits to the Sustainable Development Goals and universal health coverage, health workforce challenges are critical. In northern Ghana, performance-based incentives (PBIs) were introduced to improve health worker motivation and service quality. The goal of this study was to determine the impact of PBIs on maternal health worker motivation in two districts in northern Ghana. A quasi-experimental study design with pre- and post-intervention measurement was used. PBIs were implemented for 2 years in six health facilities in Kassena-Nankana District with six health facilities in Builsa District serving as comparison sites. Fifty pre- and post-intervention structured interviews and 66 post-intervention in-depth interviews were conducted with health workers. Motivation was assessed using constructs for job satisfaction, pride, intrinsic motivation, timeliness/attendance, and organisational commitment. Quantitative data were analysed to determine changes in motivation between intervention and comparison facilities pre- and post-intervention using STATA™ version 13. Qualitative data were analysed thematically using NVivo 10 to explore possible reasons for quantitative findings. PBIs were associated with slightly improved maternal health worker motivation. Mean values for overall motivation between intervention and comparison health workers were 0.6 versus 0.7 at baseline and 0.8 versus 0.7 at end line, respectively. Differences at baseline and end line were 0.1 (p=0.40 and p=0.50, respectively), with an overall 0.01 difference in difference (p=0.90). Qualitative interviews indicated that PBIs encouraged health workers to work harder and be more punctual, increasing reported pride and job satisfaction. 
The results contribute evidence on the effects of PBIs on motivational constructs among maternal health workers in primary care facilities in northern Ghana. PBIs appeared to improve motivation, but not dramatically, and the long-term and unintended effects of their introduction require additional study.
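The difference-in-difference estimate cited above is, in its simplest form, the change in the intervention group minus the change in the comparison group. A bare sketch of that estimator on hypothetical motivation scores (the study's reported value comes from its own adjusted analysis, not this arithmetic):

```python
def difference_in_differences(int_pre, int_post, comp_pre, comp_post):
    """DiD estimate: change in the intervention group minus
    change in the comparison group over the same period."""
    return (int_post - int_pre) - (comp_post - comp_pre)

# hypothetical pre/post motivation scores for the two groups
did = difference_in_differences(0.60, 0.80, 0.70, 0.72)
```

Subtracting the comparison group's change nets out secular trends that affect both groups, which is why the design needs comparison facilities at all.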

  20. Can performance-based incentives improve motivation of nurses and midwives in primary facilities in northern Ghana? A quasi-experimental study

    PubMed Central

    Aninanya, Gifty Apiung; Howard, Natasha; Williams, John E.; Apam, Benjamin; Prytherch, Helen; Loukanova, Svetla; Kamara, Eunice Karanja; Otupiri, Easmon

    2016-01-01

    Background Lack of an adequate and well-performing health workforce has emerged as the biggest barrier to scaling up health services provision in sub-Saharan Africa. As the global community commits to the Sustainable Development Goals and universal health coverage, health workforce challenges are critical. In northern Ghana, performance-based incentives (PBIs) were introduced to improve health worker motivation and service quality. Objective The goal of this study was to determine the impact of PBIs on maternal health worker motivation in two districts in northern Ghana. Design A quasi-experimental study design with pre- and post-intervention measurement was used. PBIs were implemented for 2 years in six health facilities in Kassena-Nankana District with six health facilities in Builsa District serving as comparison sites. Fifty pre- and post-intervention structured interviews and 66 post-intervention in-depth interviews were conducted with health workers. Motivation was assessed using constructs for job satisfaction, pride, intrinsic motivation, timeliness/attendance, and organisational commitment. Quantitative data were analysed to determine changes in motivation between intervention and comparison facilities pre- and post-intervention using STATA™ version 13. Qualitative data were analysed thematically using NVivo 10 to explore possible reasons for quantitative findings. Results PBIs were associated with slightly improved maternal health worker motivation. Mean values for overall motivation between intervention and comparison health workers were 0.6 versus 0.7 at baseline and 0.8 versus 0.7 at end line, respectively. Differences at baseline and end line were 0.1 (p=0.40 and p=0.50, respectively), with an overall 0.01 difference in difference (p=0.90). Qualitative interviews indicated that PBIs encouraged health workers to work harder and be more punctual, increasing reported pride and job satisfaction. 
Conclusions The results contribute evidence on the effects of PBIs on motivational constructs among maternal health workers in primary care facilities in northern Ghana. PBIs appeared to improve motivation, but not dramatically, and the long-term and unintended effects of their introduction require additional study. PMID:27741956
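The pre/post contrast between intervention and comparison facilities described above is a standard difference-in-differences calculation, which can be sketched as follows (the scores in the example are hypothetical, not the study's data):

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Change in the intervention group minus change in the comparison
    group; removes shared time trends from the estimated effect."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical mean motivation scores, for illustration only
did = diff_in_diff(0.60, 0.75, 0.65, 0.70)
```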

  1. Comparison of Activity Determination of Radium 226 in FUSRAP Soil using Various Energy Lines - 12299

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tucker, Brian; Donakowski, Jough; Hays, David

    2012-07-01

    Gamma spectroscopy is used at the Formerly Utilized Sites Remedial Action Program (FUSRAP) Maywood Superfund Site as the primary radioanalytical tool for quantitation of the activities of the radionuclides of concern in site soil. When selecting energy lines in gamma spectroscopy, a number of factors are considered, including assumptions concerning secular equilibrium, interferences, and the strength of the lines. The case of the Maywood radionuclide of concern, radium-226 (Ra-226), is considered in this paper. At the FUSRAP Maywood Superfund Site, one of the daughters produced from radioactive decay of Ra-226, lead-214 (Pb-214), is used to quantitate Ra-226. Another Ra-226 daughter, bismuth-214 (Bi-214), may also be used to quantitate Ra-226. In this paper, a comparison of Ra-226 to Pb-214 activities and Ra-226 to Bi-214 activities, obtained using gamma spectrometry for a large number of soil samples, was performed. The Pb-214, Bi-214, and Ra-226 activities were quantitated using the 352 kilo electron volt (keV), 609 keV, and 186 keV lines, respectively. The comparisons were made after correcting the Ra-226 activities by a factor of 0.571, both ignoring and accounting for the contribution of a U-235 interfering line to the Ra-226 line. For the Pb-214 and Bi-214 activities, a mean in-growth factor was employed. The gamma spectrometer was calibrated for efficiency and energy using a mixed gamma standard and an energy range of 59 keV to 1830 keV. The authors expect other sites with Ra-226 contamination in soil may benefit from the discussion and points in this paper. Proper use of correction factors and comparison of the data from three different gamma-emitting radionuclides revealed agreement with expectations and provided confidence that using such correction factors generates quality data. The results indicate that if contamination is low level and due to NORM (naturally occurring radioactive material), Ra-226 can be measured directly if corrected to subtract the contribution from U-235.
If there is any indication that technologically enhanced uranium may be present, the preferred measurement approach for quantitation of Ra-226 activity is detection of one of the Ra-226 daughters, Pb-214 or Bi-214, using a correction factor obtained from an in-growth curve. The results also show that the adjusted Ra-226 results compare very well with both the Pb-214 and Bi-214 results obtained using an in-growth curve correction factor. (authors)
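The two correction routes described above reduce to simple arithmetic. In the sketch below, the 0.571 factor is the lumped 186 keV correction reported in the paper, while the in-growth factor value used in the example is hypothetical:

```python
def ra226_direct(activity_186keV, correction_factor=0.571):
    """Scale the 186 keV line result to remove the U-235 contribution,
    using the lumped correction factor reported in the paper."""
    return activity_186keV * correction_factor

def ra226_from_daughter(daughter_activity, ingrowth_factor):
    """Back-calculate Ra-226 from a Pb-214 or Bi-214 measurement by
    dividing out the mean in-growth factor taken from an in-growth
    curve (the 0.9 used in the example below is hypothetical)."""
    return daughter_activity / ingrowth_factor
```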

  2. Analytical and clinical performance of thyroglobulin autoantibody assays in thyroid cancer follow-up.

    PubMed

    Katrangi, Waddah; Grebe, Stephan K G; Algeciras-Schimnich, Alicia

    2017-10-26

    While thyroglobulin autoantibodies (TgAb) can result in falsely low serum thyroglobulin (Tg) immunoassay (IA) measurements, they might also be indicators of disease persistence/recurrence. Hence, accurate TgAb measurement, in addition to Tg quantification, is crucial for thyroid cancer monitoring. We compared the analytical and clinical performance of four commonly used TgAb IAs. We measured Tg by mass spectrometry (Tg-MS) and by four pairs of Tg and TgAb IAs (Beckman, Roche, Siemens, Thermo) in 576 samples. Limit of quantitation (LOQ) and manufacturers' upper reference interval cut-off (URI) were used for comparisons. Clinical performance was assessed by receiver operating characteristic (ROC) curve analysis. Quantitative and qualitative agreement between TgAb-IAs was moderate, with R2 of 0.20-0.70 and κ of 0.41-0.66 using the LOQ and 0.47-0.71 using the URI. In samples with TgAb interference, detection rates of TgAb were similar using LOQ and URI for Beckman, Siemens, and Thermo, but much lower for the Roche TgAb-IA when the URI was used. In TgAb-positive cases, the ROC areas under the curve (AUC) for the TgAb-IAs were 0.59 (Beckman), 0.62 (Siemens), 0.59 (Roche), and 0.59 (Thermo), similar to ROC AUCs achieved with Tg. Combining Tg and TgAb measurements improved the ROC AUCs compared to Tg or TgAb alone. TgAb-IAs show significant qualitative and quantitative differences. For 2 of the 4 TgAb-IAs, using the LOQ improves the detection of interfering TgAbs. All assays showed suboptimal clinical performance when used as surrogate markers of disease, with modest improvements when Tg and TgAb were combined.
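The qualitative agreement between assay pairs reported above as κ is Cohen's kappa. A minimal implementation for binary positive/negative calls:

```python
def cohens_kappa(calls_a, calls_b):
    """Chance-corrected agreement between two binary call sets,
    coded 1 = TgAb positive, 0 = TgAb negative."""
    n = len(calls_a)
    observed = sum(a == b for a, b in zip(calls_a, calls_b)) / n
    p_a = sum(calls_a) / n          # positivity rate of assay A
    p_b = sum(calls_b) / n          # positivity rate of assay B
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)   # agreement by chance
    return (observed - expected) / (1 - expected)
```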

  3. Characterization of E 471 food emulsifiers by high-performance thin-layer chromatography-fluorescence detection.

    PubMed

    Oellig, Claudia; Brändle, Klara; Schwack, Wolfgang

    2018-07-13

    Mono- and diacylglycerol (MAG and DAG) emulsifiers, also known as food additive E 471, are widely used to adjust techno-functional properties in various foods. Besides MAGs and DAGs, E 471 emulsifiers additionally comprise different amounts of triacylglycerols (TAGs) and free fatty acids (FFAs). MAGs, DAGs, TAGs and FFAs are generally determined by high-performance liquid chromatography (HPLC) or gas chromatography (GC) coupled to mass selective detection, analyzing the individual representatives of the lipid classes. In this work we present a rapid and sensitive method for the determination of MAGs, DAGs, TAGs and FFAs in E 471 emulsifiers by high-performance thin-layer chromatography with fluorescence detection (HPTLC-FLD), including a response factor system for quantitation. Samples were simply dissolved and diluted with t-butyl methyl ether before a two-fold development was performed on primuline pre-impregnated LiChrospher silica gel plates with diethyl ether and n-pentane/n-hexane/diethyl ether (52:20:28, v/v/v) as the mobile phases to 18 and 75 mm, respectively. For quantitation, the plate was scanned in fluorescence mode at UV 366/>400 nm, using the cumulative signal for each lipid class. Calibration was done with 1,2-distearin, and the amounts of the lipid classes were calculated with response factors and expressed as monostearin, distearin, tristearin and stearic acid. Limits of detection and quantitation were 1 and 4 ng/zone, respectively, for 1,2-distearin. Thus, the HPTLC-FLD approach represents a simple, rapid and convenient screening alternative to HPLC and GC analysis of the individual compounds. Visual detection additionally enables easy characterization and direct comparison of emulsifiers through the lipid class pattern when it is utilized as a fingerprint. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Reduced-cost Chlamydia trachomatis-specific multiplex real-time PCR diagnostic assay evaluated for ocular swabs and use by trachoma research programmes.

    PubMed

    Butcher, Robert; Houghton, Jo; Derrick, Tamsyn; Ramadhani, Athumani; Herrera, Beatriz; Last, Anna R; Massae, Patrick A; Burton, Matthew J; Holland, Martin J; Roberts, Chrissy H

    2017-08-01

    Trachoma, caused by the intracellular bacterium Chlamydia trachomatis (Ct), is the leading infectious cause of preventable blindness. Many commercial platforms are available that provide highly sensitive and specific detection of Ct DNA. However, the majority of these commercial platforms are inaccessible for population-level surveys in resource-limited settings typical of trachoma control programmes. We developed two low-cost quantitative PCR (qPCR) tests for Ct using readily available reagents on standard real-time thermocyclers. Each multiplex qPCR test targets one genomic and one plasmid Ct target in addition to an endogenous positive control for Homo sapiens DNA. The quantitative performance of the qPCR assays in clinical samples was determined by comparison to a previously evaluated droplet digital PCR (ddPCR) test. The diagnostic performance of the qPCR assays was evaluated against a commercial assay (artus C. trachomatis Plus RG PCR, Qiagen) using molecular diagnostics quality control standards and clinical samples. We examined the yield of Ct DNA prepared from five different DNA extraction kits and a cold chain-free dry-sample preservation method using swabs spiked with fixed concentrations of human and Ct DNA. The qPCR assay was highly reproducible (Ct plasmid and genomic targets mean total coefficients of variance 41.5% and 48.3%, respectively). The assay detected 8/8 core specimens upon testing of a quality control panel and performed well in comparison to the commercially marketed comparator test (sensitivity and specificity >90%). Optimal extraction and sample preservation methods for research applications were identified. We describe a pipeline from collection to diagnosis providing the most efficient sample preservation and extraction with significant per-test cost savings over a commercial qPCR diagnostic assay.
The assay and its evaluation should give control programmes wishing to conduct independent research within the context of trachoma control access to an affordable test with defined performance characteristics. Copyright © 2017. Published by Elsevier B.V.

  5. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    EPA Science Inventory

    Abstract: Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative...

  6. Evaluation of exchange-correlation functionals for time-dependent density functional theory calculations on metal complexes.

    PubMed

    Holland, Jason P; Green, Jennifer C

    2010-04-15

    The electronic absorption spectra of a range of copper and zinc complexes have been simulated by using time-dependent density functional theory (TD-DFT) calculations implemented in Gaussian03. In total, 41 exchange-correlation (XC) functionals including first-, second-, and third-generation (meta-generalized gradient approximation) DFT methods were compared in their ability to predict the experimental electronic absorption spectra. Both pure and hybrid DFT methods were tested and differences between restricted and unrestricted calculations were also investigated by comparison of analogous neutral zinc(II) and copper(II) complexes. TD-DFT calculated spectra were optimized with respect to the experimental electronic absorption spectra by use of a Matlab script. Direct comparison of the performance of each XC functional was achieved both qualitatively and quantitatively by comparison of optimized half-band widths, root-mean-squared errors (RMSE), energy scaling factors (epsilon(SF)), and overall quality-of-fit (Q(F)) parameters. Hybrid DFT methods were found to outperform all pure DFT functionals with B1LYP, B97-2, B97-1, X3LYP, and B98 functionals providing the highest quantitative and qualitative accuracy in both restricted and unrestricted systems. Of the functionals tested, B1LYP gave the most accurate results with both average RMSE and overall Q(F) < 3.5% and epsilon(SF) values close to unity (>0.990) for the copper complexes. The XC functional performance in spin-restricted TD-DFT calculations on the zinc complexes was found to be slightly worse. PBE1PBE, mPW1PW91 and B1LYP gave the most accurate results with typical RMSE and Q(F) values between 5.3 and 7.3%, and epsilon(SF) around 0.930. These studies illustrate the power of modern TD-DFT calculations for exploring excited state transitions of metal complexes. 2009 Wiley Periodicals, Inc.
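The fit metrics used above (RMSE between spectra and an energy scaling factor epsilon(SF)) can be sketched as follows. The least-squares definition of the scaling factor shown here is one plausible choice, since the paper's Matlab optimisation script is not reproduced:

```python
def rmse(simulated, experimental):
    """Root-mean-squared error between simulated and experimental intensities."""
    n = len(simulated)
    return (sum((s - e) ** 2 for s, e in zip(simulated, experimental)) / n) ** 0.5

def energy_scaling_factor(simulated, experimental):
    """Least-squares scale factor mapping simulated transition energies
    onto experimental ones; values near unity indicate the functional
    reproduces transition energies well."""
    num = sum(s * e for s, e in zip(simulated, experimental))
    den = sum(s * s for s in simulated)
    return num / den
```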

  7. Physical activity among South Asian women: a systematic, mixed-methods review

    PubMed Central

    2012-01-01

    Introduction The objective of this systematic mixed-methods review is to assess what is currently known about the levels of physical activity (PA) and sedentary time (ST) and to contextualize these behaviors among South Asian women with an immigrant background. Methods A systematic search of the literature was conducted using combinations of the key words PA, ST, South Asian, and immigrant. A mixed-methods approach was used to analyze and synthesize all evidence, both quantitative and qualitative. Twenty-six quantitative and twelve qualitative studies were identified as meeting the inclusion criteria. Results Studies quantifying PA and ST among South Asian women showed low levels of PA compared with South Asian men and with white European comparison populations. However, making valid comparisons between studies was challenging due to a lack of standardized PA measurement. The majority of studies indicated that South Asian women did not meet recommended amounts of PA for health benefits. Few studies assessed ST. Themes emerging from qualitative studies included cultural and structural barriers to PA, faith and education as facilitators, and a lack of understanding of the recommended amounts of PA and its benefits among South Asian women. Conclusions Quantitative and qualitative evidence indicate that South Asian women do not perform the recommended level of PA for health benefits. Both types of studies suffer from limitations due to methods of data collection. More research should be dedicated to standardizing objective PA measurement and to understanding how to utilize the resources of the individuals and communities to increase PA levels and overall health of South Asian women. PMID:23256686

  8. Physical activity among South Asian women: a systematic, mixed-methods review.

    PubMed

    Babakus, Whitney S; Thompson, Janice L

    2012-12-20

    The objective of this systematic mixed-methods review is to assess what is currently known about the levels of physical activity (PA) and sedentary time (ST) and to contextualize these behaviors among South Asian women with an immigrant background. A systematic search of the literature was conducted using combinations of the key words PA, ST, South Asian, and immigrant. A mixed-methods approach was used to analyze and synthesize all evidence, both quantitative and qualitative. Twenty-six quantitative and twelve qualitative studies were identified as meeting the inclusion criteria. Studies quantifying PA and ST among South Asian women showed low levels of PA compared with South Asian men and with white European comparison populations. However, making valid comparisons between studies was challenging due to a lack of standardized PA measurement. The majority of studies indicated that South Asian women did not meet recommended amounts of PA for health benefits. Few studies assessed ST. Themes emerging from qualitative studies included cultural and structural barriers to PA, faith and education as facilitators, and a lack of understanding of the recommended amounts of PA and its benefits among South Asian women. Quantitative and qualitative evidence indicate that South Asian women do not perform the recommended level of PA for health benefits. Both types of studies suffer from limitations due to methods of data collection. More research should be dedicated to standardizing objective PA measurement and to understanding how to utilize the resources of the individuals and communities to increase PA levels and overall health of South Asian women.

  9. Comparison of propidium monoazide-quantitative PCR and reverse transcription quantitative PCR for viability detection of fresh Cryptosporidium oocysts following disinfection and after long-term storage in water samples

    EPA Science Inventory

    Purified oocysts of Cryptosporidium parvum were used to evaluate applicability of two quantitative PCR (qPCR) viability detection methods in raw surface water and disinfection treated water. Propidium monoazide-qPCR targeting hsp70 gene was compared to reverse transcription (RT)-...

  10. Directed differential connectivity graph of interictal epileptiform discharges

    PubMed Central

    Amini, Ladan; Jutten, Christian; Achard, Sophie; David, Olivier; Soltanian-Zadeh, Hamid; Hossein-Zadeh, Gh. Ali; Kahane, Philippe; Minotti, Lorella; Vercueil, Laurent

    2011-01-01

    In this paper, we study temporal couplings between interictal events of spatially remote regions in order to localize the leading epileptic regions from intracerebral electroencephalogram (iEEG). We aim to assess whether quantitative epileptic graph analysis during the interictal period may be helpful to predict the seizure onset zone of ictal iEEG. Using wavelet transform, cross-correlation coefficient, and multiple hypothesis testing, we propose a differential connectivity graph (DCG) to represent the connections that change significantly between epileptic and non-epileptic states as defined by the interictal events. Post-processing steps based on mutual information and multi-objective optimization are proposed to localize the leading epileptic regions through the DCG. The suggested approach is applied on iEEG recordings of five patients suffering from focal epilepsy. Quantitative comparisons of the proposed epileptic regions with ictal onset zones detected by visual inspection and using electrically stimulated seizures reveal good performance of the present method. PMID:21156385
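A minimal sketch of building a differential connectivity graph: compute pairwise coupling of channels in two states and keep the edges whose coupling changes. Plain Pearson correlation and a fixed change threshold stand in for the paper's wavelet cross-correlation and multiple-hypothesis testing:

```python
from itertools import combinations

def pearson(x, y):
    """Pearson correlation between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def differential_edges(state1, state2, threshold):
    """Channel pairs whose coupling differs between the two states
    by more than the threshold; states map channel name -> signal."""
    edges = []
    for i, j in combinations(sorted(state1), 2):
        change = abs(pearson(state1[i], state1[j]) - pearson(state2[i], state2[j]))
        if change > threshold:
            edges.append((i, j))
    return edges
```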

  11. Quantitative evaluation of refolding conditions for a disulfide-bond-containing protein using a concise 18O-labeling technique

    PubMed Central

    Uchimura, Hiromasa; Kim, Yusam; Mizuguchi, Takaaki; Kiso, Yoshiaki; Saito, Kazuki

    2011-01-01

    A concise method was developed for quantifying native disulfide-bond formation in proteins using isotopically labeled internal standards, which were easily prepared with proteolytic 18O-labeling. Because the method estimates the amounts of fragments possessing native disulfide arrangements by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) with much higher throughput than conventional high-performance liquid chromatography (HPLC) analyses, it allows many different experimental conditions to be assessed in a short time. The method was applied to refolding experiments of a recombinant neuregulin 1-β1 EGF-like motif (NRG1-β1), and the optimum conditions for preparing native NRG1-β1 were obtained by quantitative comparisons. Protein disulfide isomerase (PDI) was most effective at a reduced/oxidized glutathione ratio of 2:1 for refolding denatured NRG1-β1 with the native disulfide bonds. PMID:21500299

  12. Digital detection of endonuclease mediated gene disruption in the HIV provirus

    PubMed Central

    Sedlak, Ruth Hall; Liang, Shu; Niyonzima, Nixon; De Silva Feelixge, Harshana S.; Roychoudhury, Pavitra; Greninger, Alexander L.; Weber, Nicholas D.; Boissel, Sandrine; Scharenberg, Andrew M.; Cheng, Anqi; Magaret, Amalia; Bumgarner, Roger; Stone, Daniel; Jerome, Keith R.

    2016-01-01

    Genome editing by designer nucleases is a rapidly evolving technology utilized in a highly diverse set of research fields. Among all fields, the T7 endonuclease mismatch cleavage assay, or Surveyor assay, is the most commonly used tool to assess genomic editing by designer nucleases. This assay, while relatively easy to perform, provides only a semi-quantitative measure of mutation efficiency that lacks sensitivity and accuracy. We demonstrate a simple droplet digital PCR assay that quickly quantitates a range of indel mutations with detection as low as 0.02% mutant in a wild type background and precision (≤6% CV) and accuracy superior to either the mismatch cleavage assay or clonal sequencing when compared to next-generation sequencing. The precision and simplicity of this assay will facilitate comparison of gene editing approaches and their optimization, accelerating progress in this rapidly moving field. PMID:26829887
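Droplet digital PCR quantitation rests on Poisson statistics over droplet counts. A sketch, assuming a typical ~0.85 nl droplet volume (a common instrument figure, not stated in the abstract):

```python
import math

def ddpcr_concentration(positive, total, droplet_volume_nl=0.85):
    """Copies per microlitre via the Poisson correction lambda = -ln(1 - p),
    where p is the fraction of positive droplets. The 0.85 nl droplet
    volume is a typical figure, not taken from the paper."""
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # nl -> ul

def mutant_fraction(mutant_copies, wildtype_copies):
    """Fraction of mutant template in a wild-type background."""
    return mutant_copies / (mutant_copies + wildtype_copies)
```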

  13. QSAR study of anthranilic acid sulfonamides as inhibitors of methionine aminopeptidase-2 using LS-SVM and GRNN based on principal components.

    PubMed

    Shahlaei, Mohsen; Sabet, Razieh; Ziari, Maryam Bahman; Moeinifard, Behzad; Fassihi, Afshin; Karbakhsh, Reza

    2010-10-01

    Quantitative relationships between molecular structure and methionine aminopeptidase-2 inhibitory activity of a series of cytotoxic anthranilic acid sulfonamide derivatives were discovered. We have demonstrated the detailed application of two efficient nonlinear methods for evaluation of quantitative structure-activity relationships of the studied compounds. Components produced by principal component analysis were used as input to the developed nonlinear models. The performance of the developed models, namely PC-GRNN and PC-LS-SVM, was tested by several validation methods. The resulting PC-LS-SVM model had high statistical quality (R(2)=0.91 and R(CV)(2)=0.81) for predicting the cytotoxic activity of the compounds. Comparison between the predictability of PC-GRNN and PC-LS-SVM indicates that the latter method has a higher ability to predict the activity of the studied molecules. Copyright (c) 2010 Elsevier Masson SAS. All rights reserved.
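The modelling pipeline above (principal component analysis for dimension reduction, then a learner on the component scores) can be sketched with an ordinary least-squares stand-in for the nonlinear GRNN/LS-SVM stages:

```python
import numpy as np

def pca_scores(X, n_components):
    """Project mean-centred descriptors onto the leading principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def fit_pc_model(X, y, n_components):
    """Least-squares fit on PC scores plus an intercept -- a linear
    stand-in for the nonlinear PC-GRNN / PC-LS-SVM models in the paper."""
    T = pca_scores(X, n_components)
    A = np.column_stack([T, np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef
```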

  14. Quantifying substrate uptake by individual cells of marine bacterioplankton by catalyzed reporter deposition fluorescence in situ hybridization combined with microautoradiography.

    PubMed

    Sintes, Eva; Herndl, Gerhard J

    2006-11-01

    Catalyzed reporter deposition fluorescence in situ hybridization combined with microautoradiography (MICRO-CARD-FISH) is increasingly being used to obtain qualitative information on substrate uptake by individual members of specific prokaryotic communities. Here we evaluated the potential for using this approach quantitatively by relating the measured silver grain area around cells taking up (3)H-labeled leucine to bulk leucine uptake measurements. The increase in the silver grain area over time around leucine-assimilating cells of coastal bacterial assemblages was linear during 4 to 6 h of incubation. By establishing standardized conditions for specific activity levels and concomitantly performing uptake measurements with the bulk community, MICRO-CARD-FISH can be used quantitatively to determine uptake rates on a single-cell level. Therefore, this approach allows comparisons of single-cell activities for bacterial communities obtained from different sites or growing under different ecological conditions.

  15. Quantifying Substrate Uptake by Individual Cells of Marine Bacterioplankton by Catalyzed Reporter Deposition Fluorescence In Situ Hybridization Combined with Microautoradiography▿

    PubMed Central

    Sintes, Eva; Herndl, Gerhard J.

    2006-01-01

    Catalyzed reporter deposition fluorescence in situ hybridization combined with microautoradiography (MICRO-CARD-FISH) is increasingly being used to obtain qualitative information on substrate uptake by individual members of specific prokaryotic communities. Here we evaluated the potential for using this approach quantitatively by relating the measured silver grain area around cells taking up 3H-labeled leucine to bulk leucine uptake measurements. The increase in the silver grain area over time around leucine-assimilating cells of coastal bacterial assemblages was linear during 4 to 6 h of incubation. By establishing standardized conditions for specific activity levels and concomitantly performing uptake measurements with the bulk community, MICRO-CARD-FISH can be used quantitatively to determine uptake rates on a single-cell level. Therefore, this approach allows comparisons of single-cell activities for bacterial communities obtained from different sites or growing under different ecological conditions. PMID:16950912

  16. A comparative uncertainty study of the calibration of macrolide antibiotic reference standards using quantitative nuclear magnetic resonance and mass balance methods.

    PubMed

    Liu, Shu-Yu; Hu, Chang-Qin

    2007-10-17

    This study introduces the general method of quantitative nuclear magnetic resonance (qNMR) for the calibration of reference standards of macrolide antibiotics. Several qNMR experimental conditions were optimized, including the relaxation delay, an important parameter for quantification. Three kinds of macrolide antibiotics were used to validate the accuracy of the qNMR method by comparison with the results obtained by the high-performance liquid chromatography (HPLC) method. The purities of five common reference standards of macrolide antibiotics were measured by the 1H qNMR method and the mass balance method, respectively, and the analysis results of the two methods were compared. The qNMR method is quick and simple to use. In new-drug research and development, qNMR provides a new and reliable method for purity analysis of reference standards.
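1H qNMR purity assignment against an internal calibrant uses a standard relative-integral equation, which can be sketched as follows (I = signal integrals, N = number of protons behind each signal, M = molar masses, m = weighed masses, P = purity; the example values are hypothetical):

```python
def qnmr_purity(I_s, I_cal, N_s, N_cal, M_s, M_cal, m_s, m_cal, P_cal):
    """Standard 1H qNMR purity equation against an internal calibrant:
    P_s = (I_s/I_cal) * (N_cal/N_s) * (M_s/M_cal) * (m_cal/m_s) * P_cal."""
    return (I_s / I_cal) * (N_cal / N_s) * (M_s / M_cal) * (m_cal / m_s) * P_cal
```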

  17. KEY COMPARISON: Key comparison CCQM-K60: Total selenium and selenomethionine in selenised wheat flour

    NASA Astrophysics Data System (ADS)

    Goenaga Infante, Heidi; Sargent, Mike

    2010-01-01

    Key comparison CCQM-K60 was performed to assess the analytical capabilities of national metrology institutes (NMIs) to accurately quantitate the mass fraction of selenomethionine (SeMet) and total selenium (at low mg kg-1 levels) in selenised wheat flour. It was organized by the Inorganic Analysis Working Group (IAWG) of the Comité Consultatif pour la Quantité de Matière (CCQM) as a follow-up key comparison to the previous pilot study CCQM-P86 on selenised yeast tablets. LGC Limited (Teddington, UK) and the Institute for National Measurement Standards, National Research Council Canada (NRCC, Ottawa, Canada) acted as the coordinating laboratories. CCQM-K60 was organized in parallel with a pilot study (CCQM-P86.1) involving not only NMIs but also expert laboratories worldwide, thus enabling them to assess their capabilities, discover problems and learn how to modify analytical procedures accordingly. Nine results for total Se and four results for SeMet were reported by the participant NMIs. Methods used for sample preparation were microwave assisted acid digestion for total Se and multiple-step enzymatic hydrolysis and hydrolysis with methanesulfonic acid for SeMet. For total Se, detection techniques included inductively coupled plasma mass spectrometry (ICP-MS) with external calibration, standard additions or isotope dilution analysis (IDMS); instrumental neutron activation analysis (INAA); and graphite furnace atomic absorption spectrometry (GFAAS) with external calibration. For determination of SeMet in the wheat flour sample, the four NMIs relied upon measurements using species-specific IDMS (using 76Se-enriched SeMet) with HPLC-ICP-MS. Eight of the nine participating NMIs reported results for total Se within 3.5% deviation from the key comparison reference value (KCRV). For SeMet, the four participating NMIs reported results within 3.2% deviation from the KCRV. 
This shows that the performance of the majority of the CCQM-K60 participants was very good, illustrating their ability to obtain accurate results for such analytes in a complex food matrix containing approximately 17 mg kg-1 Se.

  18. Quantitative analysis of naphthenic acids in water by liquid chromatography-accurate mass time-of-flight mass spectrometry.

    PubMed

    Hindle, Ralph; Noestheden, Matthew; Peru, Kerry; Headley, John

    2013-04-19

    This study details the development of a routine method for quantitative analysis of oil sands naphthenic acids, which are a complex class of compounds found naturally and as contaminants in oil sands process waters from Alberta's Athabasca region. Expanding beyond classical naphthenic acids (CnH2n-zO2), those compounds conforming to the formula CnH2n-zOx (where 2 ≤ x ≤ 4) were examined in commercial naphthenic acid and environmental water samples. HPLC facilitated a five-fold reduction in ion suppression when compared to the more commonly used flow injection analysis. A comparison of 39 model naphthenic acids revealed significant variability in response factors, demonstrating the necessity of using naphthenic acid mixtures for quantitation, rather than model compounds. It was also demonstrated that naphthenic acid heterogeneity (commercial and environmental) necessitates establishing a single NA mix as the standard against which all quantitation is performed. The authors present the first ISO17025-accredited method for the analysis of naphthenic acids in water using HPLC high-resolution accurate-mass time-of-flight mass spectrometry. The method detection limit was 1 mg/L total oxy-naphthenic acids (Sigma technical mix). Copyright © 2013 Elsevier B.V. All rights reserved.
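The CnH2n-zOx formula space described above can be enumerated for accurate-mass matching. A sketch using standard monoisotopic masses; the bounds on x follow the abstract, while the z values chosen in any particular search are up to the analyst:

```python
MONO = {'C': 12.0, 'H': 1.007825, 'O': 15.994915}  # monoisotopic masses, u

def naphthenic_mass(n, z, x):
    """Neutral monoisotopic mass for CnH(2n-z)Ox; this study considered
    2 <= x <= 4, with x = 2 being the classical naphthenic acids and z
    the hydrogen deficiency (0, 2, 4, ...)."""
    h = 2 * n - z
    if h <= 0 or not 2 <= x <= 4:
        raise ValueError("outside the formula space considered")
    return n * MONO['C'] + h * MONO['H'] + x * MONO['O']
```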

  19. Development of an analytical method for quantitative comparison of the e-waste management systems in Thailand, Laos, and China.

    PubMed

    Liang, Li; Sharp, Alice

    2016-11-01

    This study employed a set of quantitative criteria to analyse three parameters, namely policy, process, and practice, of the respective e-waste management systems adopted in Thailand, Laos, and China. Questionnaire surveys were conducted to determine the current status of the three parameters in relation to mobile phones. A total of five, three, and six variables under Policy (P1), Process (P2), and Practice (P3), respectively, were analysed and their weighted averages were calculated. The results showed that among the three countries surveyed, significant differences at p<0.01 were observed in all the P1, P2, and P3 variables, except P305 (sending e-waste to recovery centres) and P306 (treating e-waste by retailers themselves). Based on the quantitative method developed in this study, Laos' e-waste management system received the highest scores in both P1 average (0.130) and P3 average (0.129). However, in the combined Ptotal, China scored the highest (0.141), followed by Laos (0.132) and Thailand (0.121). This method could be used to assist decision makers in performing quantitative analysis of complex issues associated with e-waste management in a country. © The Author(s) 2016.
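The weighted-average scoring of variables within each parameter can be sketched as follows; the weights and the simple mean used for the combined total are illustrative, since the paper's exact weighting scheme is not reproduced here:

```python
def weighted_average(scores, weights):
    """Weighted mean of the variable scores within one parameter
    (P1, P2 or P3); the weights here are illustrative."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

def combined_total(parameter_averages):
    """Simple mean of the parameter averages as a stand-in for Ptotal."""
    return sum(parameter_averages) / len(parameter_averages)
```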

  20. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
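Crossing a fragility curve with a hazard curve reduces, in discretised form, to a total-probability sum over intensity bins. A minimal sketch (the bin rates and fragility values below are hypothetical):

```python
def annual_failure_probability(hazard_rates, fragilities):
    """Total-probability combination of a discretised hazard curve
    (annual rate of each intensity bin) with a fragility curve
    (conditional failure probability at that intensity)."""
    return sum(r * f for r, f in zip(hazard_rates, fragilities))
```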

  1. Clinical performance of the LCx HCV RNA quantitative assay.

    PubMed

    Bertuzis, Rasa; Hardie, Alison; Hottentraeger, Barbara; Izopet, Jacques; Jilg, Wolfgang; Kaesdorf, Barbara; Leckie, Gregor; Leete, Jean; Perrin, Luc; Qiu, Chunfu; Ran, Iris; Schneider, George; Simmonds, Peter; Robinson, John

    2005-02-01

    This study was conducted to assess the performance of the Abbott Laboratories LCx HCV RNA Quantitative Assay (LCx assay) in the clinical setting. Four clinical laboratories measured LCx assay precision, specificity, and linearity. In addition, a method comparison was conducted between the LCx assay and the Roche HCV Amplicor Monitor, version 2.0 (Roche Monitor 2.0) and the Bayer VERSANT HCV RNA 3.0 Assay (Bayer bDNA 3.0) quantitative assays. For precision, the observed LCx assay intra-assay standard deviation (S.D.) was 0.060-0.117 log IU/ml, the inter-assay S.D. was 0.083-0.133 log IU/ml, the inter-lot S.D. was 0.105-0.177 log IU/ml, the inter-site S.D. was 0.099-0.190 log IU/ml, and the total S.D. was 0.113-0.190 log IU/ml. The specificity of the LCx assay was 99.4% (542/545; 95% CI, 98.4-99.9%). For linearity, the mean pooled LCx assay results were linear (r=0.994) over the range of the panel (2.54-5.15 log IU/ml). A method comparison demonstrated a correlation coefficient of 0.881 between the LCx assay and Roche Monitor 2.0, 0.872 between the LCx assay and Bayer bDNA 3.0, and 0.870 between Roche Monitor 2.0 and Bayer bDNA 3.0. The mean LCx assay result was 0.04 log IU/ml (95% CI, -0.08, 0.01) lower than the mean Roche Monitor 2.0 result, but 0.57 log IU/ml (95% CI, 0.53, 0.61) higher than the mean Bayer bDNA 3.0 result. The mean Roche Monitor 2.0 result was 0.60 log IU/ml (95% CI, 0.56, 0.65) higher than the mean Bayer bDNA 3.0 result. The LCx assay quantitated genotypes 1-4 with statistical equivalency. The vast majority (98.9%, 278/281) of paired LCx assay-Roche Monitor 2.0 specimen results were within 1 log IU/ml. Similarly, 86.6% (240/277) of paired LCx assay and Bayer bDNA 3.0 specimen results were within 1 log, as were 85.6% (237/277) of paired Roche Monitor 2.0 and Bayer specimen results. These data demonstrate that the LCx assay may be used for quantitation of HCV RNA in HCV-infected individuals.
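
    The method-comparison statistics reported above (mean log difference with a 95% CI, plus a correlation coefficient between paired assays) can be reproduced with a few lines of arithmetic. A sketch with invented paired viral loads, not the study's specimens:

```python
import math

# Hypothetical paired viral-load results (log IU/ml) from two assays
assay_a = [3.2, 4.1, 5.0, 5.8, 4.6, 3.9, 5.3, 4.4]
assay_b = [3.3, 4.0, 5.1, 5.9, 4.8, 4.0, 5.2, 4.6]
n = len(assay_a)

diffs = [a - b for a, b in zip(assay_a, assay_b)]
bias = sum(diffs) / n                                   # mean log difference (A - B)
sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
ci = (bias - 1.96 * sd / math.sqrt(n),                  # normal-approximation 95% CI
      bias + 1.96 * sd / math.sqrt(n))

# Pearson correlation coefficient between the paired results
ma, mb = sum(assay_a) / n, sum(assay_b) / n
r = sum((a - ma) * (b - mb) for a, b in zip(assay_a, assay_b)) / math.sqrt(
    sum((a - ma) ** 2 for a in assay_a) * sum((b - mb) ** 2 for b in assay_b))
print(f"bias = {bias:+.3f} log IU/ml, 95% CI ({ci[0]:+.3f}, {ci[1]:+.3f}), r = {r:.3f}")
```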

  2. Cone Beam CT vs. Fan Beam CT: A Comparison of Image Quality and Dose Delivered Between Two Differing CT Imaging Modalities.

    PubMed

    Lechuga, Lawrence; Weidlich, Georg A

    2016-09-12

    A comparison of image quality and dose delivered between two differing computed tomography (CT) imaging modalities-fan beam and cone beam-was performed. A literature review of quantitative analyses for various image quality aspects such as uniformity, signal-to-noise ratio, artifact presence, spatial resolution, modulation transfer function (MTF), and low contrast resolution was generated. With these aspects quantified, cone beam computed tomography (CBCT) shows a superior spatial resolution to that of fan beam, while fan beam shows a greater ability to produce clear and anatomically correct images with better soft tissue differentiation. The results indicate that fan beam CT produces superior images to those of on-board imaging (OBI) cone beam CT systems, while delivering a considerably lower dose to the patient.

  3. Cone Beam CT vs. Fan Beam CT: A Comparison of Image Quality and Dose Delivered Between Two Differing CT Imaging Modalities

    PubMed Central

    Weidlich, Georg A.

    2016-01-01

    A comparison of image quality and dose delivered between two differing computed tomography (CT) imaging modalities—fan beam and cone beam—was performed. A literature review of quantitative analyses for various image quality aspects such as uniformity, signal-to-noise ratio, artifact presence, spatial resolution, modulation transfer function (MTF), and low contrast resolution was generated. With these aspects quantified, cone beam computed tomography (CBCT) shows a superior spatial resolution to that of fan beam, while fan beam shows a greater ability to produce clear and anatomically correct images with better soft tissue differentiation. The results indicate that fan beam CT produces superior images to those of on-board imaging (OBI) cone beam CT systems, while delivering a considerably lower dose to the patient. PMID:27752404

  4. Visual Search with Image Modification in Age-Related Macular Degeneration

    PubMed Central

    Wiecek, Emily; Jackson, Mary Lou; Dakin, Steven C.; Bex, Peter

    2012-01-01

    Purpose. AMD results in loss of central vision and a dependence on low-resolution peripheral vision. While many image enhancement techniques have been proposed, there is a lack of quantitative comparison of the effectiveness of enhancement. We developed a natural visual search task that uses patients' eye movements as a quantitative and functional measure of the efficacy of image modification. Methods. Eye movements of 17 patients (mean age = 77 years) with AMD were recorded while they searched for target objects in natural images. Eight different image modification methods were implemented and included manipulations of local image or edge contrast, color, and crowding. In a subsequent task, patients ranked their preference of the image modifications. Results. Within individual participants, there was no significant difference in search duration or accuracy across eight different image manipulations. When data were collapsed across all image modifications, a multivariate model identified six significant predictors for normalized search duration including scotoma size and acuity, as well as interactions among scotoma size, age, acuity, and contrast (P < 0.05). Additionally, an analysis of image statistics showed no correlation with search performance across all image modifications. Rank ordering of enhancement methods based on participants' preference revealed a trend that participants preferred the least modified images (P < 0.05). Conclusions. There was no quantitative effect of image modification on search performance. A better understanding of low- and high-level components of visual search in natural scenes is necessary to improve future attempts at image enhancement for low vision patients. Different search tasks may require alternative image modifications to improve patient functioning and performance. PMID:22930725

  5. Determination of vitamins D2 and D3 in selected food matrices by online high-performance liquid chromatography-gas chromatography-mass spectrometry (HPLC-GC-MS).

    PubMed

    Nestola, Marco; Thellmann, Andrea

    2015-01-01

    An online normal-phase liquid chromatography-gas chromatography-mass spectrometry (HPLC-GC-MS) method was developed for the determination of vitamins D2 and D3 in selected food matrices. Transfer of the sample from HPLC to GC was realized by large volume on-column injection; detection was performed with a time-of-flight mass spectrometer (TOF-MS). Typical GC problems in the determination of vitamin D such as sample degradation or sensitivity issues, previously reported in the literature, were not observed. Determination of total vitamin D content was done by quantitation of its pyro isomer based on an isotopically labelled internal standard (ISTD). Extracted ion traces of the analyte and the ISTD showed cross-contribution, but by selecting appropriate quantifier ions no non-linearity of the calibration curve was observed within the chosen calibration range. Absolute limits of detection (LOD) and quantitation (LOQ) for vitamins D2 and D3 were calculated as approximately 50 and 150 pg, respectively. Repeatability with internal standard correction was below 2%. Good agreement was found between the quantitative results of an established high-performance liquid chromatography with UV detection (HPLC-UV) method and HPLC-GC-MS. Sterol-enriched margarine was subjected to HPLC-GC-MS and HPLC-MS/MS for comparison, because HPLC-UV showed strong matrix interferences. HPLC-GC-MS produced comparable results with less manual sample cleanup. In summary, online hyphenation of HPLC and GC allowed manual sample preparation to be minimized while increasing sample throughput.
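
    LOD and LOQ figures of the kind quoted above are commonly derived from the residual standard deviation and slope of a calibration curve (LOD ≈ 3.3·σ/S, LOQ ≈ 10·σ/S). A generic sketch with hypothetical calibration points, not the paper's data:

```python
# Hypothetical calibration: peak area vs. amount injected (pg)
amounts = [50, 100, 200, 400, 800]
areas   = [120, 245, 490, 985, 1960]

n = len(amounts)
mean_x = sum(amounts) / n
mean_y = sum(areas) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(amounts, areas)) / \
        sum((x - mean_x) ** 2 for x in amounts)
intercept = mean_y - slope * mean_x

# Residual standard deviation of the calibration fit
resid = [y - (slope * x + intercept) for x, y in zip(amounts, areas)]
s_res = (sum(r * r for r in resid) / (n - 2)) ** 0.5

lod = 3.3 * s_res / slope    # limit of detection
loq = 10.0 * s_res / slope   # limit of quantitation
print(f"LOD ≈ {lod:.1f} pg, LOQ ≈ {loq:.1f} pg")
```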

  6. Quantitative comparison between PGNAA measurements and MCNP calculations in view of the characterization of radioactive wastes in Germany and France

    NASA Astrophysics Data System (ADS)

    Mauerhofer, E.; Havenith, A.; Carasco, C.; Payan, E.; Kettler, J.; Ma, J. L.; Perot, B.

    2013-04-01

    The Forschungszentrum Jülich GmbH (FZJ), together with the Rheinisch-Westfälische Technische Hochschule (RWTH) Aachen and the French Alternative Energies and Atomic Energy Commission (CEA Cadarache), is engaged in a cooperation aimed at characterizing toxic and reactive elements in radioactive waste packages by means of Prompt Gamma Neutron Activation Analysis (PGNAA) [1]. The French and German waste management agencies have indeed defined acceptability limits concerning these elements in view of their projected geological repositories. A first measurement campaign was performed in the new PGNAA facility MEDINA, at FZJ, to assess the capture gamma-ray signatures of some elements of interest in large samples, up to waste drums with a volume of 200 liters. MEDINA is the acronym for Multi Element Detection based on Instrumental Neutron Activation. This paper presents MCNP calculations of the MEDINA facility and a quantitative comparison between measurement and simulation. Passive gamma-ray spectra acquired with a high-purity germanium detector and calibration sources are used to qualify the numerical model of the crystal. Active PGNAA spectra of a sodium chloride sample measured with MEDINA then allow for qualifying the global numerical model of the measurement cell. Chlorine indeed constitutes a usual reference with reliable capture gamma-ray production data. The goal is to characterize the entire simulation protocol (geometrical model, nuclear data, and post-processing tools) which will be used for current measurement interpretation, extrapolation of the performances to other types of waste packages or other applications, as well as for the study of future PGNAA facilities.

  7. Occupation-Based Intervention for Addictive Disorders: A Systematic Review.

    PubMed

    Wasmuth, Sally; Pritchard, Kevin; Kaneshiro, Kellie

    2016-03-01

    Addictive disorders disrupt individuals' occupational lives, suggesting that occupational therapists can play a crucial role in addiction rehabilitation. Occupation-based interventions are those in which an occupation is performed, and occupations are defined as those activities a person engages in to structure time and create meaning in one's life. This review asked: In persons with addictive disorders, are occupation-based interventions more effective than treatment as usual in improving short and long-term recovery outcomes? A systematic literature search was performed by a medical librarian in Ovid MEDLINE, PsychINFO, Social Work Abstracts, OTSeeker, HealthSTAR, CINAHL, and ACPJournalClub. Authors screened 1095 articles for inclusion criteria (prospective outcome studies examining the effectiveness of an occupation-based intervention with a sample primarily consisting of a diagnosis of a substance-related or addictive disorder and with at least five participants), and two authors appraised the resulting 66 articles using a standard appraisal tool, yielding 26 articles for qualitative synthesis and 8 with shared outcome measures for quantitative analysis. Occupation-based interventions in the areas of work, leisure, and social participation were found to have been used to treat addictive disorders. Occupation-based interventions in the area of social participation all elicited better outcomes than their respective control/comparison groups. Not all occupation-based interventions in the area of leisure elicited better outcomes than their comparison group, but in the eight articles with shared outcome measures, quantitative analysis demonstrated leisure interventions produced larger effect sizes than social participation interventions. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Ultrasound-guided injection for MR arthrography of the hip: comparison of two different techniques.

    PubMed

    Kantarci, Fatih; Ozbayrak, Mustafa; Gulsen, Fatih; Gencturk, Mert; Botanlioglu, Huseyin; Mihmanli, Ismail

    2013-01-01

    The purpose of this study was to prospectively evaluate two different ultrasound-guided injection techniques for MR arthrography of the hip. Fifty-nine consecutive patients (21 men, 38 women) referred for MR arthrographies of the hip were prospectively included in the study. Three patients underwent bilateral MR arthrography. The two injection techniques were quantitatively and qualitatively compared. Quantitative analysis was performed by comparison of the contrast material volume injected into the hip joint. Qualitative analysis was performed with regard to extraarticular leakage of contrast material into the soft tissues. Extraarticular leakage of contrast material was graded as none, minimal, moderate, or severe according to the MR images. Each patient rated discomfort after the procedure using a visual analogue scale (VAS). The injected contrast material volume was lower with the femoral head puncture technique (mean 8.9 ± 3.4 ml) than with the femoral neck puncture technique (mean 11.2 ± 2.9 ml) (p < 0.05). The chi-squared test showed significantly more contrast leakage with the femoral head puncture technique (p < 0.05). Statistical analysis showed no difference between the head and neck puncture groups in terms of feeling of pain (p = 0.744) or in the body mass index (p = 0.658) of the patients. The femoral neck injection technique provides a higher intraarticular contrast volume and produces less extraarticular contrast leakage than the femoral head injection technique when US guidance is used for MR arthrography of the hip.

  9. CCQM-K104 key comparison (avermectin B1a) on the characterization of organic substances for chemical purity

    NASA Astrophysics Data System (ADS)

    Dai, Xinhua; Zhang, Wei; Li, Hongmei; Huang, Ting; Li, Mengwan; Quan, Can; Zhang, Qinghe; Davies, Stephen R.; Warren, John; Lo, Man-fung; Kakoulides, Elias; Ceyhan Gören, Ahmet; Marbumrung, Sornkrit; Pfeifer, Dietmar; Ün, İlker; Gündüz, Simay; Yilmaz, Hasibe; Kankaew, Pornhatai; Sudsiri, Nittaya; Shearman, Kittiya; Pookrod, Preeyaporn; Polzer, Joachim; Radeck, Wolfgang

    2017-01-01

    Under the Comité Consultatif pour la Quantité de Matière (CCQM), a key comparison, CCQM-K104, was coordinated by the National Institute of Metrology (NIM). The comparison was designed to demonstrate a laboratory's performance in determining the mass fraction of the main component in a complex high purity organic material. Nine NMIs or DIs participated in the comparison; eight participants reported their results. An additional impurity was resolved from the avermectin B1a peak and was tentatively identified as an unknown impurity by NMIA (National Measurement Institute, Australia). It was subsequently identified by NIM as a diastereoisomer of avermectin B1a at the C-26 position. The final reference value (KCRV) was 924.63 mg/g, with a standard uncertainty (k=1) of 3.89 mg/g and an expanded uncertainty of 8.97 mg/g. The degrees of equivalence with the avermectin B1a KCRV for each participant were reported. The measurement results and degrees of equivalence should be indicative of the performance of a laboratory's measurement capability for the purity assignment of organic compounds of high structural complexity (relative molecular mass range of 500 Da to 1000 Da and low polarity, -log KOW <= -2). Main text: the final report appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. It has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
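
    A degree of equivalence is the difference between a participant's result and the KCRV, judged against its expanded uncertainty. The simplified sketch below uses invented laboratory results and ignores the correlation between each participant's result and the KCRV it contributed to, so it is an illustration of the arithmetic rather than the report's formal evaluation:

```python
import math

kcrv, u_kcrv = 924.63, 3.89   # reported KCRV and its standard uncertainty (mg/g)

# Hypothetical participant results: (mass fraction, standard uncertainty) in mg/g
results = {"Lab A": (926.1, 5.0), "Lab B": (919.8, 4.2), "Lab C": (925.0, 6.5)}

equivalence = {}
for lab, (x, u) in results.items():
    d = x - kcrv                                    # degree of equivalence with the KCRV
    big_u = 2.0 * math.sqrt(u ** 2 + u_kcrv ** 2)   # expanded uncertainty of d (k = 2)
    equivalence[lab] = (d, big_u, abs(d) <= big_u)
    print(f"{lab}: d = {d:+.2f} mg/g, U(d) = {big_u:.2f} mg/g, consistent: {abs(d) <= big_u}")
```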

  10. The effect of constructivist teaching strategies on science test scores of middle school students

    NASA Astrophysics Data System (ADS)

    Vaca, James L., Jr.

    International studies show that the United States is lagging behind other industrialized countries in science proficiency. The studies revealed that American students showed little significant gain on standardized tests in science between 1995 and 2005. Little information is available regarding how reform in American science teaching strategies could improve student performance on standardized testing. The purpose of this quasi-experimental quantitative study using a pretest/posttest control group design was to examine how the use of a hands-on, constructivist teaching approach with low-achieving eighth grade science students affected student achievement on the 2007 Ohio Eighth Grade Science Achievement Test posttest (N = 76). The research question asked how using constructivist teaching strategies in the science classroom affected student performance on standardized tests. Two independent samples of 38 students each, consisting of low-achieving science students as identified by seventh grade science scores and scores on the Ohio Eighth Grade Science Half-Length Practice Test pretest, were used. Four comparisons were made between the control group receiving traditional classroom instruction and the experimental group receiving constructivist instruction: (a) pretest/posttest standard comparison, (b) comparison of the number of students who passed the posttest, (c) comparison of the six standards covered on the posttest, and (d) comparison of the posttest sample means. A Mann-Whitney U Test revealed that there was no significant difference between the independent sample distributions for the control group and the experimental group. These findings contribute to positive social change by investigating science teaching strategies that could be used in eighth grade science classes to improve student achievement in science.
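
    The Mann-Whitney U statistic used above compares two independent samples via ranks rather than raw means. A self-contained sketch (hypothetical scores, not the study's data):

```python
def mann_whitney_u(sample_a, sample_b):
    """Mann-Whitney U statistic for two independent samples
    (midranks for ties; the smaller of U_a and U_b is returned)."""
    combined = sorted(sample_a + sample_b)

    def midrank(v):
        # rank of v in the combined ordering, averaging over ties
        less = sum(1 for c in combined if c < v)
        equal = sum(1 for c in combined if c == v)
        return less + (equal + 1) / 2

    r_a = sum(midrank(v) for v in sample_a)      # rank sum of sample A
    n_a, n_b = len(sample_a), len(sample_b)
    u_a = r_a - n_a * (n_a + 1) / 2
    u_b = n_a * n_b - u_a
    return min(u_a, u_b)

# Hypothetical posttest score distributions for control vs. experimental groups
control      = [12, 15, 14, 10, 13, 16, 11]
experimental = [14, 17, 15, 13, 16, 18, 12]
print("U =", mann_whitney_u(control, experimental))
```

    In practice the statistic would be referred to a U table or normal approximation to obtain a p-value.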

  11. Multilayer Markov Random Field models for change detection in optical remote sensing images

    NASA Astrophysics Data System (ADS)

    Benedek, Csaba; Shadaydeh, Maha; Kato, Zoltan; Szirányi, Tamás; Zerubia, Josiane

    2015-09-01

    In this paper, we give a comparative study of three Multilayer Markov Random Field (MRF) based solutions proposed for change detection in optical remote sensing images, called Multicue MRF, Conditional Mixed Markov model, and Fusion MRF. Our purposes are twofold. On one hand, we highlight the significance of the focused model family and set the models against various state-of-the-art approaches through a thematic analysis and quantitative tests. We discuss the advantages and drawbacks of class-comparison vs. direct approaches, the usage of training data, various targeted application fields, and different ways of Ground Truth generation, while informing the reader of the roles in which the multilayer MRFs can be efficiently applied. On the other hand, we also emphasize the differences between the three focused models at various levels, considering the model structures, feature extraction, layer interpretation, change concept definition, parameter tuning, and performance. We provide qualitative and quantitative comparison results using principally a publicly available change detection database which contains aerial image pairs and Ground Truth change masks. We conclude that the discussed models are competitive against alternative state-of-the-art solutions if one uses them as pre-processing filters in multitemporal optical image analysis. In addition, together they cover a large range of applications, considering the different usage options of the three approaches.
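
    Quantitative comparison against Ground Truth change masks typically reduces to pixel-wise precision, recall, and F-measure. A minimal sketch on tiny hypothetical flattened masks (the actual evaluation would run over full image pairs):

```python
# Hypothetical flattened binary masks: 1 = changed pixel, 0 = unchanged
ground_truth = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
detected     = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]

tp = sum(1 for g, d in zip(ground_truth, detected) if g == 1 and d == 1)  # true positives
fp = sum(1 for g, d in zip(ground_truth, detected) if g == 0 and d == 1)  # false alarms
fn = sum(1 for g, d in zip(ground_truth, detected) if g == 1 and d == 0)  # misses

precision = tp / (tp + fp)
recall = tp / (tp + fn)
f_measure = 2 * precision * recall / (precision + recall)
print(f"precision={precision:.2f} recall={recall:.2f} F={f_measure:.2f}")
```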

  12. Detection of tumor markers in prostate cancer and comparison of sensitivity between real time and nested PCR.

    PubMed

    Matsuoka, Takayuki; Shigemura, Katsumi; Yamamichi, Fukashi; Fujisawa, Masato; Kawabata, Masato; Shirakawa, Toshiro

    2012-06-27

    The objective of this study is to investigate and compare the sensitivity of conventional PCR, quantitative real-time PCR, nested PCR, and western blots for the detection of prostate cancer tumor markers using prostate cancer (PCa) cells. We performed conventional PCR, quantitative real-time PCR, nested PCR, and western blots using 5 kinds of PCa cells. Prostate specific antigen (PSA), prostate specific membrane antigen (PSMA), and androgen receptor (AR) were compared for their detection sensitivity by real-time PCR and nested PCR. In real-time PCR, there was a significant correlation between cell number and the RNA concentration obtained (R(2)=0.9944) for PSA, PSMA, and AR. We found it possible to detect these markers from a single LNCaP cell with both real-time and nested PCR. By comparison, nested PCR reached a linear curve in fewer PCR cycles than real-time PCR, suggesting that nested PCR may offer PCR results more quickly than real-time PCR. In conclusion, nested PCR may offer tumor marker detection in PCa cells more quickly (with fewer PCR cycles) with the same high sensitivity as real-time PCR. Further study is necessary to establish and evaluate the best tool for PCa tumor marker detection.
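
    The R² value quoted above comes from a linear fit of the qPCR standard curve. A generic sketch of the fit with hypothetical dilution-series data (not the study's measurements):

```python
# Hypothetical qPCR standard curve: log10(cell number) vs. threshold cycle (Ct)
log_cells = [0, 1, 2, 3, 4]                 # 1 to 10^4 cells
ct        = [35.1, 31.8, 28.4, 25.0, 21.7]

n = len(log_cells)
mx = sum(log_cells) / n
my = sum(ct) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(log_cells, ct))
sxx = sum((x - mx) ** 2 for x in log_cells)
syy = sum((y - my) ** 2 for y in ct)

slope = sxy / sxx                 # cycles per log10(cell number)
r2 = sxy * sxy / (sxx * syy)      # coefficient of determination of the fit
print(f"slope = {slope:.2f} cycles/log10(cells), R^2 = {r2:.4f}")
```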

  13. Quantitative proteomic profiling of paired cancerous and normal colon epithelial cells isolated freshly from colorectal cancer patients.

    PubMed

    Tu, Chengjian; Mojica, Wilfrido; Straubinger, Robert M; Li, Jun; Shen, Shichen; Qu, Miao; Nie, Lei; Roberts, Rick; An, Bo; Qu, Jun

    2017-05-01

    The heterogeneous structure of tumor tissues from colorectal cancer (CRC) patients precludes an informative comparison between tumors and adjacent normal tissues. Here, we develop and apply a strategy to compare paired cancerous (CEC) versus normal (NEC) epithelial cells enriched from patients and discover potential biomarkers and therapeutic targets for CRC. CEC and NEC cells are respectively isolated from five different tumor and normal locations in the resected colon tissue from each patient (N = 12 patients) using an optimized epithelial cell adhesion molecule (EpCAM)-based enrichment approach. An ion current-based quantitative method is employed to perform comparative proteomic analysis for each patient. A total of 458 altered proteins that are common among >75% of patients are observed and selected for further investigation. Besides known findings such as deregulation of mitochondrial function, the tricarboxylic acid cycle, and RNA post-transcriptional modification, functional analysis further revealed that the RAN signaling pathway, small nucleolar ribonucleoproteins (snoRNPs), and infection by RNA viruses are altered in CEC cells. A selection of the altered proteins of interest is validated by immunohistochemistry analyses. The informative comparison between matched CEC and NEC enhances our understanding of the molecular mechanisms of CRC development and provides biomarker candidates and new pathways for therapeutic intervention. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
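
    The "altered in >75% of patients" selection criterion can be sketched as a per-protein filter over patient-wise fold changes. Everything below (protein names, ratios, the 1.5-fold cutoff) is hypothetical, invented only to show the shape of the filter:

```python
# Hypothetical log2(CEC/NEC) ratios per protein across 12 patients
ratios = {
    "PROT1": [1.2, 1.5, 0.9, 1.8, 1.1, 1.4, 2.0, 1.3, 1.6, 1.0, 1.7, 1.2],
    "PROT2": [0.1, -0.2, 0.3, 0.0, 0.2, -0.1, 0.1, 0.0, -0.3, 0.2, 0.1, 0.0],
    "PROT3": [-1.1, -1.4, -0.9, -1.6, 0.2, -1.2, -1.8, -1.0, -1.3, -1.5, -1.1, -0.2],
}

def altered_in_most(log2fc, threshold=0.585, frac=0.75):
    """Count a protein as altered if |log2 FC| > threshold (~1.5-fold)
    in more than `frac` of the patients."""
    hits = sum(1 for v in log2fc if abs(v) > threshold)
    return hits / len(log2fc) > frac

selected = [p for p, v in ratios.items() if altered_in_most(v)]
print("consistently altered:", selected)
```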

  14. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics

    NASA Astrophysics Data System (ADS)

    El Koussaifi, R.; Tikan, A.; Toffoli, A.; Randoux, S.; Suret, P.; Onorato, M.

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.
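
    The "heavy tails" statement is relative to the Rayleigh statistics expected for a linear random sea. The sketch below (hypothetical, not the experimental data) draws Rayleigh-distributed amplitudes and checks the empirical exceedance probability against the closed-form Rayleigh prediction, which is the baseline against which rogue-wave excess is judged:

```python
import math
import random

random.seed(42)
sigma = 1.0
n = 200_000

# Linear random sea: wave amplitudes follow a Rayleigh distribution
rayleigh = [sigma * math.sqrt(-2.0 * math.log(1.0 - random.random())) for _ in range(n)]

threshold = 3.0 * sigma
p_emp = sum(1 for a in rayleigh if a > threshold) / n
p_theory = math.exp(-threshold ** 2 / (2.0 * sigma ** 2))  # Rayleigh exceedance probability
print(f"empirical P(A > 3σ) = {p_emp:.4f}, Rayleigh prediction = {p_theory:.4f}")
```

    In a nonlinear (rogue-wave-prone) sea state, the measured exceedance probability at large amplitudes would sit well above this Rayleigh baseline.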

  15. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics.

    PubMed

    El Koussaifi, R; Tikan, A; Toffoli, A; Randoux, S; Suret, P; Onorato, M

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.

  16. A quantitative comparison of leading-edge vortices in incompressible and supersonic flows

    DOT National Transportation Integrated Search

    2002-01-14

    When requiring quantitative data on delta-wing vortices for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of database owing to difficulties that pl...

  17. Quantitative image analysis for evaluating the abrasion resistance of nanoporous silica films on glass

    PubMed Central

    Nielsen, Karsten H.; Karlsson, Stefan; Limbach, Rene; Wondraczek, Lothar

    2015-01-01

    The abrasion resistance of coated glass surfaces is an important parameter for judging lifetime performance, but practical testing procedures remain overly simplistic and often do not allow for direct conclusions on real-world degradation. Here, we combine quantitative two-dimensional image analysis and mechanical abrasion into a facile tool for probing the abrasion resistance of anti-reflective (AR) coatings. We determine variations in the average coated area, during and after controlled abrasion. Through comparison with other experimental techniques, we show that this method provides a practical, rapid and versatile tool for the evaluation of the abrasion resistance of sol-gel-derived thin films on glass. The method yields informative data, which correlates with measurements of diffuse reflectance and is further supported by qualitative investigations through scanning electron microscopy. In particular, the method directly addresses degradation of coating performance, i.e., the gradual areal loss of antireflective functionality. As an exemplary subject, we studied the abrasion resistance of state-of-the-art nanoporous SiO2 thin films which were derived from 5–6 wt% aqueous solutions of potassium silicates, or from colloidal suspensions of SiO2 nanoparticles. It is shown how abrasion resistance is governed by coating density and film adhesion, defining the trade-off between optimal AR performance and acceptable mechanical performance. PMID:26656260
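
    The core quantity tracked by the image analysis, the coated-area fraction, reduces to counting coated pixels in a thresholded image. A toy sketch with a hypothetical 4×5 binary image standing in for a thresholded micrograph:

```python
# Hypothetical thresholded micrograph: 1 = coated pixel, 0 = abraded/bare glass
image = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
    [1, 1, 0, 1, 1],
    [0, 1, 1, 1, 1],
]

coated = sum(sum(row) for row in image)       # number of coated pixels
total = sum(len(row) for row in image)        # total pixels in the field of view
coated_fraction = coated / total
print(f"coated area: {coated_fraction:.0%}")
```

    Tracking this fraction across abrasion cycles yields the areal-loss curves the record describes.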

  18. Qualitative and quantitative analysis of hyaluronan oligosaccharides with high performance thin layer chromatography using reagent-free derivatization on amino-modified silica and electrospray ionization-quadrupole time-of-flight mass spectrometry coupling on normal phase.

    PubMed

    Rothenhöfer, Martin; Scherübl, Rosmarie; Bernhardt, Günther; Heilmann, Jörg; Buschauer, Armin

    2012-07-27

    Purified oligomers of hyalobiuronic acid are indispensable tools to elucidate the physiological and pathophysiological role of hyaluronan degradation by various hyaluronidase isoenzymes. Therefore, we established and validated a novel sensitive, convenient, rapid, and cost-effective high performance thin layer chromatography (HPTLC) method for the qualitative and quantitative analysis of small saturated hyaluronan oligosaccharides consisting of 2-4 hyalobiuronic acid moieties. The use of amino-modified silica as the stationary phase allows simple reagent-free in situ derivatization by heating, resulting in a very low limit of detection (7-19 pmol per band, depending on the analysed saturated oligosaccharide). This derivatization procedure enabled, for the first time, densitometric quantification of the analytes by HPTLC. The validated method showed a quantification limit of 37-71 pmol per band and proved superior to conventional detection of hyaluronan oligosaccharides. The analytes were identified by hyphenation of normal phase planar chromatography to mass spectrometry (TLC-MS) using electrospray ionization. As an alternative to sequential techniques such as high performance liquid chromatography (HPLC) and capillary electrophoresis (CE), the validated HPTLC quantification method can easily be automated and is applicable to the analysis of multiple samples in parallel. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. No childhood development of viewpoint-invariant face recognition: evidence from 8-year-olds and adults.

    PubMed

    Crookes, Kate; Robbins, Rachel A

    2014-10-01

    Performance on laboratory face tasks improves across childhood, not reaching adult levels until adolescence. Debate surrounds the source of this development, with recent reviews suggesting that underlying face processing mechanisms are mature early in childhood and that the improvement seen on experimental tasks instead results from general cognitive/perceptual development. One face processing mechanism that has been argued to develop slowly is the ability to encode faces in a view-invariant manner (i.e., allowing recognition across changes in viewpoint). However, many previous studies have not controlled for general cognitive factors. In the current study, 8-year-olds and adults performed a recognition memory task with two study-test viewpoint conditions: same view (study front view, test front view) and change view (study front view, test three-quarter view). To allow quantitative comparison between children and adults, performance in the same view condition was matched across the groups by increasing the learning set size for adults. Results showed poorer memory in the change view condition than in the same view condition for both adults and children. Importantly, there was no quantitative difference between children and adults in the size of decrement in memory performance resulting from a change in viewpoint. This finding adds to growing evidence that face processing mechanisms are mature early in childhood. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Social Comparison and Body Image in Adolescence: A Grounded Theory Approach

    ERIC Educational Resources Information Center

    Krayer, A.; Ingledew, D. K.; Iphofen, R.

    2008-01-01

    This study explored the use of social comparison appraisals in adolescents' lives with particular reference to enhancement appraisals which can be used to counter threats to the self. Social comparison theory has been increasingly used in quantitative research to understand the processes through which societal messages about appearance influence…

  1. Comparison of nine different real-time PCR chemistries for qualitative and quantitative applications in GMO detection.

    PubMed

    Buh Gasparic, Meti; Tengs, Torstein; La Paz, Jose Luis; Holst-Jensen, Arne; Pla, Maria; Esteve, Teresa; Zel, Jana; Gruden, Kristina

    2010-03-01

    Several techniques have been developed for detection and quantification of genetically modified organisms, but quantitative real-time PCR is by far the most popular approach. Among the most commonly used real-time PCR chemistries are TaqMan probes and SYBR green, but many other detection chemistries have also been developed. Because their performance has never been compared systematically, here we present an extensive evaluation of some promising chemistries: sequence-unspecific DNA labeling dyes (SYBR green), primer-based technologies (AmpliFluor, Plexor, Lux primers), and techniques involving double-labeled probes, comprising hybridization (molecular beacon) and hydrolysis (TaqMan, CPT, LNA, and MGB) probes, based on recently published experimental data. For each detection chemistry, assays targeting selected loci were included. The real-time PCR chemistries were then compared for their PCR amplification efficiency and limits of detection and quantification. The overall applicability of the chemistries was evaluated, adding practicability and cost issues to the performance characteristics. None of the chemistries seemed to be significantly better than any other, but certain features favor LNA and MGB technology as good alternatives to TaqMan in quantification assays. SYBR green and molecular beacon assays can perform equally well but may need more optimization prior to use.
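    The amplification efficiency used to compare real-time PCR chemistries is conventionally estimated from a dilution-series standard curve (Cq versus log10 template amount). A minimal sketch of that calculation with made-up Cq values, not data from this study:

```python
import numpy as np

# Hypothetical 10-fold dilution series: log10 template copies vs. measured Cq.
log10_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])
cq = np.array([15.1, 18.5, 21.8, 25.2, 28.6])

# Standard curve: Cq = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(log10_copies, cq, 1)

# PCR efficiency from the slope; slope = -3.32 corresponds to E = 100%
# (perfect doubling of product each cycle).
efficiency = 10.0 ** (-1.0 / slope) - 1.0

print(f"slope = {slope:.2f}, efficiency = {efficiency * 100:.1f}%")
```

    An efficiency near 1 (slope ≈ -3.32) indicates near-perfect doubling per cycle, which is the usual basis for comparing the amplification performance of different chemistries.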

  2. Hybrid, experimental and computational, investigation of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1996-07-01

    Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.

  3. Quantitative assessment of 12-lead ECG synthesis using CAVIAR.

    PubMed

    Scherer, J A; Rubel, P; Fayn, J; Willems, J L

    1992-01-01

    The objective of this study is to assess the performance of patient-specific segment-specific (PSSS) synthesis of QRST complexes using CAVIAR, a new method for serial comparison of electrocardiograms and vectorcardiograms. A collection of 250 multi-lead recordings from the Common Standards for Quantitative Electrocardiography (CSE) diagnostic pilot study is employed. QRS and ST-T segments are independently synthesized using the PSSS algorithm so that the mean-squared error between the original and estimated waveforms is minimized. CAVIAR compares the recorded and synthesized QRS and ST-T segments and calculates the mean-quadratic deviation as a measure of error. The results of this study indicate that the estimated QRS complexes are good representatives of their recorded counterparts, and the integrity of the spatial information is maintained by the PSSS synthesis process. Analysis of the ST-T segments suggests that the deviations between recorded and synthesized waveforms are considerably greater than those associated with the QRS complexes. The poorer performance on the ST-T segments is attributed to magnitude normalization of the spatial loops, low-voltage passages, and noise interference. Using the mean-quadratic deviation and CAVIAR as methods of performance assessment, this study indicates that the PSSS-synthesis algorithm accurately maintains the signal information within the 12-lead electrocardiogram.

  4. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high-resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as in patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and with the commercial software, and the results of each were used to obtain corrected images. Corrected images based on the ML approach and on the software were then compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality, and assurance of performance during clinical use of endoscopic technologies.
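    The local magnification (ML) idea can be illustrated with a one-dimensional sketch: image equally spaced grid lines, take the image-space spacing between adjacent lines as the local magnification, and express distortion as its deviation from the central value. The numbers below are hypothetical, not measurements from the study:

```python
import numpy as np

# Hypothetical radial positions (pixels) of equally spaced grid lines imaged
# through a fisheye-type lens, measured outward from the distortion center.
# Object-space spacing between grid lines is constant (1 unit).
r_image = np.array([0.0, 100.0, 196.0, 284.0, 360.0])

# Local magnification ML: image-space spacing per unit object-space spacing,
# evaluated between adjacent grid lines.
ml = np.diff(r_image)            # pixels per grid period

# Relative (barrel) distortion of each interval versus the central interval.
distortion_pct = 100.0 * (ml - ml[0]) / ml[0]

print("local magnification:", ml)
print("distortion (%):", distortion_pct)
```

    The monotonic drop in ML with radius is the barrel-distortion signature typical of wide-angle endoscope optics; a full two-dimensional treatment evaluates ML over the whole grid rather than along one line.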

  5. Modeling ready biodegradability of fragrance materials.

    PubMed

    Ceriani, Lidia; Papa, Ester; Kovarich, Simona; Boethling, Robert; Gramatica, Paola

    2015-06-01

    In the present study, quantitative structure-activity relationships were developed for predicting the ready biodegradability of approximately 200 heterogeneous fragrance materials. Two classification methods, classification and regression tree (CART) and k-nearest neighbors (kNN), were applied to perform the modeling. The models were validated with multiple external prediction sets, and the structural applicability domain was verified by the leverage approach. The best models had good sensitivity (internal ≥80%; external ≥68%), specificity (internal ≥80%; external 73%), and overall accuracy (≥75%). Results from the comparison with the BIOWIN global models, which are based on a group contribution method, show that the specific models developed in the present study perform better in prediction than BIOWIN6, in particular for the correct classification of not readily biodegradable fragrance materials. © 2015 SETAC.

  6. Improving fast-ion confinement in high-performance discharges by suppressing Alfvén eigenmodes

    DOE PAGES

    Kramer, Gerrit J.; Podestà, Mario; Holcomb, Christopher; ...

    2017-03-28

    Here, we show that the degradation of fast-ion confinement in steady-state DIII-D discharges is quantitatively consistent with predictions based on the effects of multiple unstable Alfvén eigenmodes on beam-ion transport. Simulation and experiment show that increasing the radius where the magnetic safety factor has its minimum is effective in minimizing beam-ion transport. This is favorable for achieving high-performance steady-state operation in DIII-D and future reactors. A comparison between the experiments and a critical gradient model, in which only equilibrium profiles were used to predict the most unstable modes, shows that in a number of cases this model reproduces the measured neutron rate well.

  7. Analyzing fragment production in mass-asymmetric reactions as a function of density dependent part of symmetry energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaur, Amandeep; Deepshikha; Vinayak, Karan Singh

    2016-07-15

    We performed a theoretical investigation of different mass-asymmetric reactions to assess the direct impact of the density-dependent part of symmetry energy on multifragmentation. The simulations are performed for a specific set of reactions having the same system mass and N/Z content, using the isospin-dependent quantum molecular dynamics model to estimate the quantitative dependence of fragment production on the mass-asymmetry factor (τ) for various symmetry energy forms. The dynamics associated with different mass-asymmetric reactions is explored and the direct role of symmetry energy is checked. A comparison with the experimental data (asymmetric reaction) is also presented for different equations of state (symmetry energy forms).

  8. Target Scattering Metrics: Model-Model and Model-Data Comparisons

    DTIC Science & Technology

    2017-12-13

    measured synthetic aperture sonar (SAS) data or from numerical models is investigated. Metrics are needed for quantitative comparisons for signals...candidate metrics for model-model comparisons are examined here with a goal to consider raw data prior to its reduction to data products, which may...be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons. INTRODUCTION Metrics for

  10. Performance evaluation methodology for historical document image binarization.

    PubMed

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme.
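    The recall and precision underlying such pixel-based evaluation reduce to counts of true and false positives against a ground-truth image. A minimal unweighted sketch on toy images (the paper's contribution is a weighting scheme layered on top of these base measures, which is not reproduced here):

```python
import numpy as np

# Hypothetical ground-truth and binarization-result images
# (1 = text pixel, 0 = background).
gt = np.array([[0, 1, 1, 0],
               [0, 1, 1, 0],
               [0, 0, 1, 0]])
result = np.array([[0, 1, 0, 0],
                   [0, 1, 1, 1],
                   [0, 0, 1, 0]])

tp = np.sum((gt == 1) & (result == 1))   # text correctly kept
fp = np.sum((gt == 0) & (result == 1))   # background marked as text
fn = np.sum((gt == 1) & (result == 0))   # text missed

recall = tp / (tp + fn)
precision = tp / (tp + fp)
f_measure = 2 * recall * precision / (recall + precision)

print(f"recall={recall:.3f} precision={precision:.3f} F={f_measure:.3f}")
```

    Rates such as broken/missed text or background noise are similarly derived from where the false negatives and false positives fall relative to the ground-truth strokes.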

  11. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  12. External quality assessment on detection of hepatitis C virus RNA in clinical laboratories of China.

    PubMed

    Wang, Lu-nan; Zhang, Rui; Shen, Zi-yu; Chen, Wen-xiang; Li, Jin-ming

    2008-06-05

    As with many studies carried out in European countries, a quality assurance program has been established by the National Center for Clinical Laboratories in China (NCCL). The results showed that external quality assessment significantly improves laboratory performance in the quantitative evaluation of hepatitis C virus (HCV) RNA. Serum panels were delivered twice annually to the clinical laboratories performing HCV RNA detection in China, each panel comprising 5 coded samples. All laboratories were requested to carry out the detection within the required time period and to report their testing results, which contained qualitative and/or quantitative findings, the reagents used and relevant information about the apparatus. All positive samples were calibrated against the first International Standard for HCV RNA in a collaborative study, and the comparison range was designated as the target value (TG) ± 0.5 log. The numbers of laboratories reporting qualitative testing results for the first and second external quality assessments were 168 and 167 in 2003 and increased to 209 and 233 in 2007; the numbers of laboratories reporting quantitative testing results were 134 and 147 in 2003 and rose to 340 and 339 in 2007. The deviation between the mean of the domestic quantitative results in 2003 and the target value was above 0.5 log, which was comparatively high. By 2007, the target value was close to the national average except for low-concentration specimens (10(3) IU/ml). The percentage of results within the range of GM +/- 0.5 log(10) varied from 8.2% to 93.5%. Some laboratories had difficulties in the exact quantification of the lowest (3.00 log IU/ml) and highest (6.37 log IU/ml) viral levels, values very near the limits of the dynamic range of the assays. The comparison of these results with the previous study confirms that regular participation in external quality assessment (EQA) assures the achievement of a high proficiency level in the diagnosis of HCV infection. During the 5-year external quality assessment, the sensitivity and accuracy of detection in most of the clinical laboratories, as well as the quality of the kits, have been substantially improved.

  13. Performance comparison of token ring protocols for hard-real-time communication

    NASA Technical Reports Server (NTRS)

    Kamat, Sanjay; Zhao, Wei

    1992-01-01

    The ability to guarantee the deadlines of synchronous messages while maintaining a good aggregate throughput is an important consideration in the design of distributed real-time systems. In this paper, we study two token ring protocols, the priority driven protocol and the timed token protocol, for their suitability for hard real-time systems. Both these protocols use a token to control access to the transmission medium. In a priority driven protocol, messages are assigned priorities and the protocol ensures that messages are transmitted in the order of their priorities. Timed token protocols do not provide for priority arbitration but ensure that the maximum access delay for a station is bounded. For both protocols, we first derive the schedulability conditions under which the transmission deadlines of a given set of synchronous messages can be guaranteed. Subsequently, we use these schedulability conditions to quantitatively compare the average case behavior of the protocols. This comparison demonstrates that each of the protocols has its domain of superior performance and neither dominates the other for the entire range of operating conditions.

  14. Comparison of torsional and microburst longitudinal phacoemulsification: a prospective, randomized, masked clinical trial.

    PubMed

    Vasavada, Abhay R; Raj, Shetal M; Patel, Udayan; Vasavada, Vaishali; Vasavada, Viraj

    2010-01-01

    To compare the intraoperative performance and postoperative outcomes of three phacoemulsification technologies in patients undergoing microcoaxial phacoemulsification through 2.2-mm corneal incisions. The prospective, randomized, single-masked study included 360 eyes randomly assigned to torsional ultrasound (Infiniti Vision System; Alcon Laboratories, Fort Worth, TX), microburst with longitudinal ultrasound (Infiniti), or microburst with longitudinal ultrasound (Legacy Everest, Alcon Laboratories). Assessments included surgical clock time, fluid volume, and intraoperative complications; central corneal thickness on day 1 and at months 1 and 3 postoperatively; and endothelial cell density at 3 months postoperatively. Comparisons among the groups were conducted. Torsional ultrasound required significantly less surgical clock time and fluid volume than the other groups. There were no intraoperative complications. Changes in central corneal thickness and endothelial cell loss were significantly lower in the torsional ultrasound group at all postoperative visits (P < .001, Kruskal-Wallis test) compared to the microburst longitudinal ultrasound modalities. Torsional ultrasound demonstrated quantitatively superior intraoperative performance and showed less increase in corneal thickness and less endothelial cell loss compared to microburst longitudinal ultrasound. Copyright 2010, SLACK Incorporated.

  15. Quantitative comparison of DNA methylation assays for biomarker development and clinical applications.

    PubMed

    2016-07-01

    DNA methylation patterns are altered in numerous diseases and often correlate with clinically relevant information such as disease subtypes, prognosis and drug response. With suitable assays and after validation in large cohorts, such associations can be exploited for clinical diagnostics and personalized treatment decisions. Here we describe the results of a community-wide benchmarking study comparing the performance of all widely used methods for DNA methylation analysis that are compatible with routine clinical use. We shipped 32 reference samples to 18 laboratories in seven different countries. Researchers in those laboratories collectively contributed 21 locus-specific assays for an average of 27 predefined genomic regions, as well as six global assays. We evaluated assay sensitivity on low-input samples and assessed the assays' ability to discriminate between cell types. Good agreement was observed across all tested methods, with amplicon bisulfite sequencing and bisulfite pyrosequencing showing the best all-round performance. Our technology comparison can inform the selection, optimization and use of DNA methylation assays in large-scale validation studies, biomarker development and clinical diagnostics.

  16. Quantitative comparison of alternative methods for coarse-graining biological networks

    PubMed Central

    Bowman, Gregory R.; Meng, Luming; Huang, Xuhui

    2013-01-01

    Markov models and master equations are a powerful means of modeling dynamic processes like protein conformational changes. However, these models are often difficult to understand because of the enormous number of components and connections between them. Therefore, a variety of methods have been developed to facilitate understanding by coarse-graining these complex models. Here, we employ Bayesian model comparison to determine which of these coarse-graining methods provides the models that are most faithful to the original set of states. We find that the Bayesian agglomerative clustering engine and the hierarchical Nyström expansion graph (HNEG) typically provide the best performance. Surprisingly, the original Perron cluster cluster analysis (PCCA) method often provides the next best results, outperforming the newer PCCA+ method and the most probable paths algorithm. We also show that the differences between the models are qualitatively significant, rather than being minor shifts in the boundaries between states. The performance of the methods correlates well with the entropy of the resulting coarse-grainings, suggesting that finding states with more similar populations (i.e., avoiding low population states that may just be noise) gives better results. PMID:24089717
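    The reported correlation between coarse-graining quality and entropy can be made concrete: sum microstate populations into macrostates and take the Shannon entropy of the macrostate populations. A hedged sketch with hypothetical populations and assignments (not the models from the study):

```python
import numpy as np

# Hypothetical equilibrium populations of 8 microstates.
micro_pop = np.array([0.30, 0.25, 0.15, 0.12, 0.10, 0.05, 0.02, 0.01])

# Two candidate coarse-grainings: microstate index -> macrostate index.
balanced = np.array([0, 0, 1, 1, 2, 2, 3, 3])   # groups of similar population
lumpy    = np.array([0, 0, 0, 0, 0, 0, 0, 1])   # isolates a tiny (noise?) state

def macro_entropy(assignment, pop, n_macro=4):
    """Shannon entropy of the macrostate populations under a coarse-graining."""
    macro = np.zeros(n_macro)
    for i, m in enumerate(assignment):
        macro[m] += pop[i]
    macro = macro[macro > 0]          # drop empty macrostates (0*log0 -> 0)
    return -np.sum(macro * np.log(macro))

e_balanced = macro_entropy(balanced, micro_pop)
e_lumpy = macro_entropy(lumpy, micro_pop)
print(f"balanced: {e_balanced:.3f}, lumpy: {e_lumpy:.3f}")
```

    The balanced grouping yields the higher entropy, matching the abstract's observation that coarse-grainings with more even state populations (avoiding tiny, possibly noisy states) tend to perform better.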

  17. Helicopter Blade-Vortex Interaction Noise with Comparisons to CFD Calculations

    NASA Technical Reports Server (NTRS)

    McCluer, Megan S.

    1996-01-01

    A comparison of experimental acoustics data and computational predictions was performed for a helicopter rotor blade interacting with a parallel vortex. The experiment was designed to examine the aerodynamics and acoustics of parallel Blade-Vortex Interaction (BVI) and was performed in the Ames Research Center (ARC) 80- by 120-Foot Subsonic Wind Tunnel. An independently generated vortex interacted with a small-scale, nonlifting helicopter rotor at the 180 deg azimuth angle to create the interaction in a controlled environment. Computational Fluid Dynamics (CFD) was used to calculate near-field pressure time histories. The CFD code, called Transonic Unsteady Rotor Navier-Stokes (TURNS), was used to make comparisons with the acoustic pressure measurement at two microphone locations and several test conditions. The test conditions examined included hover tip Mach numbers of 0.6 and 0.7, advance ratio of 0.2, positive and negative vortex rotation, and the vortex passing above and below the rotor blade by 0.25 rotor chords. The results show that the CFD qualitatively predicts the acoustic characteristics very well, but quantitatively overpredicts the peak-to-peak sound pressure level by 15 percent in most cases. There also exists a discrepancy in the phasing (about 4 deg) of the BVI event in some cases. Additional calculations were performed to examine the effects of vortex strength, thickness, time accuracy, and directionality. This study validates the TURNS code for prediction of near-field acoustic pressures of controlled parallel BVI.

  18. Comparison of LEWICE and GlennICE in the SLD Regime

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Potapczuk, Mark G.; Levinson, Laurie H.

    2008-01-01

    A research project is underway at the NASA Glenn Research Center (GRC) to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. This report presents results from two different computer programs. The first program, LEWICE version 3.2.2, has been reported on previously. The second program is GlennICE version 0.1. An extensive quantitative comparison of the results against the database of ice shapes generated in the GRC Icing Research Tunnel (IRT) has also been performed, including additional data taken to extend the database into the Super-cooled Large Drop (SLD) regime. This paper shows the differences in ice shape between LEWICE 3.2.2, GlennICE, and the experimental data, and also provides a description of both programs. Comparisons are then made to recent additions to the SLD database and to selected previous cases. Quantitative comparisons are shown for horn height, horn angle, icing limit, area, and leading edge thickness. The results show that the predictions of both programs are within the accuracy limits of the experimental data for the majority of cases.

  19. Analytical method for the accurate determination of trichothecenes in grains using LC-MS/MS: a comparison between MRM transition and MS3 quantitation.

    PubMed

    Lim, Chee Wei; Tai, Siew Hoon; Lee, Lin Min; Chan, Sheot Harn

    2012-07-01

    The current food crisis demands unambiguous determination of mycotoxin contamination in staple foods to achieve safer food for consumption. This paper describes the first accurate LC-MS/MS method developed to analyze trichothecenes in grains by applying multiple reaction monitoring (MRM) transition and MS(3) quantitation strategies in tandem. The trichothecenes are nivalenol, deoxynivalenol, deoxynivalenol-3-glucoside, fusarenon X, 3-acetyl-deoxynivalenol, 15-acetyldeoxynivalenol, diacetoxyscirpenol, and HT-2 and T-2 toxins. Acetic acid and ammonium acetate were used to convert the analytes into their respective acetate adducts and ammonium adducts under negative and positive MS polarity conditions, respectively. The mycotoxins were separated by reversed-phase LC in a 13.5-min run, ionized using electrospray ionization, and detected by tandem mass spectrometry. Analyte-specific mass-to-charge (m/z) ratios were used to perform quantitation under MRM transition and MS(3) (linear ion trap) modes. Three experiments were performed for each quantitation mode and matrix in batches over 6 days for recovery studies. The matrix effect was investigated at concentration levels of 20, 40, 80, 120, 160, and 200 μg kg(-1) (n = 3) in 5 g corn flour and rice flour. Extraction with acetonitrile provided a good overall recovery range of 90-108% (n = 3) at three spiking concentrations of 40, 80, and 120 μg kg(-1). A quantitation limit of 2-6 μg kg(-1) was achieved by applying the MRM transition quantitation strategy. Under MS(3) mode, a quantitation limit of 4-10 μg kg(-1) was achieved. Relative standard deviations of 2-10% and 2-11% were reported for MRM transition and MS(3) quantitation, respectively. 
The successful utilization of MS(3) enabled accurate analyte fragmentation pattern matching and its quantitation, leading to the development of analytical methods in fields that demand both analyte specificity and fragmentation fingerprint-matching capabilities that are unavailable under MRM transition.

  20. Quantitation of TGF-beta1 mRNA in porcine mesangial cells by comparative kinetic RT/PCR: comparison with ribonuclease protection assay and in situ hybridization.

    PubMed

    Ceol, M; Forino, M; Gambaro, G; Sauer, U; Schleicher, E D; D'Angelo, A; Anglani, F

    2001-01-01

    Gene expression can be examined with different techniques including the ribonuclease protection assay (RPA), in situ hybridisation (ISH), and quantitative reverse transcription-polymerase chain reaction (RT/PCR). These methods differ considerably in their sensitivity and precision in detecting and quantifying low-abundance mRNA. Although there is evidence that RT/PCR can be performed in a quantitative manner, the quantitative capacity of this method is generally underestimated. To demonstrate that the comparative kinetic RT/PCR strategy, which uses a housekeeping gene as internal standard, is a quantitative method for detecting significant differences in mRNA levels between different samples, the inhibitory effect of heparin on phorbol 12-myristate 13-acetate (PMA)-induced TGF-beta1 mRNA expression was evaluated by RT/PCR and by RPA, the standard method of mRNA quantification, and the results were compared. The reproducibility of RT/PCR amplification was calculated by comparing the quantities of G3PDH and TGF-beta1 PCR products, generated during the exponential phases, estimated from two different RT/PCR runs (G3PDH, r = 0.968, P = 0.0000; TGF-beta1, r = 0.966, P = 0.0000). The quantitative capacity of comparative kinetic RT/PCR was demonstrated by comparing the results obtained from RPA and RT/PCR using linear regression analysis. Starting from the same RNA extraction, but using only 1% of the RNA for RT/PCR compared with RPA, a significant correlation was observed (r = 0.984, P = 0.0004). Moreover, morphometric analysis of the ISH signal was applied for the semi-quantitative evaluation of the expression and localisation of TGF-beta1 mRNA in the entire cell population. Our results demonstrate the close similarity of the RT/PCR and RPA methods in giving quantitative information on mRNA expression and indicate that comparative kinetic RT/PCR can be adopted as a reliable quantitative method of mRNA analysis. Copyright 2001 Wiley-Liss, Inc.

  1. Dynamic comparisons of piezoelectric ejecta diagnostics

    NASA Astrophysics Data System (ADS)

    Buttler, W. T.; Zellner, M. B.; Olson, R. T.; Rigg, P. A.; Hixson, R. S.; Hammerberg, J. E.; Obst, A. W.; Payton, J. R.; Iverson, A.; Young, J.

    2007-03-01

    We investigate the quantitative reliability and precision of three different piezoelectric technologies for measuring ejected areal mass from shocked surfaces. Specifically, we performed ejecta measurements on Sn shocked at two pressures, P ≈ 215 and 235 kbar. The shock in the Sn was created by launching an impactor with a powder gun. We self-compare and cross-compare these measurements to assess the ability of these probes to precisely determine the areal mass ejected from a shocked surface. We demonstrate that the precision of each technology is good, with variabilities on the order of ±10%. We also discuss their relative accuracy.

  2. Steady-state and transient operation of a heat-pipe radiator system

    NASA Technical Reports Server (NTRS)

    Sellers, J. P.

    1974-01-01

    Data obtained on a VCHP heat-pipe radiator system tested in a vacuum environment were studied. Analyses and interpretation of the steady-state results are presented along with an initial analysis of some of the transient data. Particular emphasis was placed on quantitative comparisons of the experimental data with computer model simulations. The results of the study provide a better understanding of the system but do not provide a complete explanation for the observed low VCHP performance and the relatively flat radiator panel temperature distribution. The results of the study also suggest hardware, software, and testing improvements.

  3. X-ray fluorescence holography studies for a Cu3Au crystal

    NASA Astrophysics Data System (ADS)

    Dąbrowski, K. M.; Dul, D. T.; Jaworska-Gołąb, T.; Rysz, J.; Korecki, P.

    2015-12-01

    In this work we show that performing a numerical correction for beam attenuation and indirect excitation allows one to fully restore element sensitivity in the three-dimensional reconstruction of the atomic structure. This is exemplified by a comparison of atomic images reconstructed from holograms measured for ordered and disordered phases of a Cu3Au crystal that clearly show sensitivity to changes in occupancy of the atomic sites. Moreover, the numerical correction, which is based on quantitative methods of X-ray fluorescence spectroscopy, was extended to take into account the influence of a disturbed overlayer in the sample.

  4. A quantitative assessment of the Hadoop framework for analyzing massively parallel DNA sequencing data.

    PubMed

    Siretskiy, Alexey; Sundqvist, Tore; Voznesenskiy, Mikhail; Spjuth, Ola

    2015-01-01

New high-throughput technologies, such as massively parallel sequencing, have transformed the life sciences into a data-intensive field. The most common e-infrastructure for analyzing these data consists of batch systems based on high-performance computing resources; however, the bioinformatics software built on this platform does not scale well in the general case. Recently, the Hadoop platform has emerged as an interesting option for addressing the challenges of increasingly large datasets, with distributed storage, distributed processing, built-in data locality, fault tolerance, and an appealing programming methodology. In this work we introduce metrics and report on a quantitative comparison between Hadoop and a single node of conventional high-performance computing resources for the tasks of short-read mapping and variant calling. We calculate efficiency as a function of data size and observe that the Hadoop platform is more efficient for biologically relevant data sizes in terms of computing hours for both split and un-split data files. We also quantify the advantages of the data locality provided by Hadoop for NGS problems, and show that a classical architecture with network-attached storage will not scale when computing resources increase in number. Measurements were performed using ten datasets of different sizes, up to 100 gigabases, using the pipeline implemented in Crossbow. To make a fair comparison, we implemented an improved preprocessor for Hadoop with better performance for splittable data files. For improved usability, we implemented a graphical user interface for Crossbow in a private cloud environment using the CloudGene platform. All of the code and data in this study are freely available as open source in public repositories. From our experiments we conclude that the improved Hadoop pipeline scales better than the same pipeline on high-performance computing resources; we also conclude that Hadoop is an economically viable option for the common data sizes currently used in massively parallel sequencing. Given that datasets are expected to increase over time, Hadoop is a framework that we envision will have an increasingly important role in future biological data analysis.

  5. From big data to rich data: The key features of athlete wheelchair mobility performance.

    PubMed

    van der Slikke, R M A; Berger, M A M; Bregman, D J J; Veeger, H E J

    2016-10-03

Quantitative assessment of an athlete's individual wheelchair mobility performance is one prerequisite for evaluating game performance, improving wheelchair settings and optimizing training routines. Inertial Measurement Unit (IMU) based methods can be used to perform such quantitative assessment, providing a large amount of kinematic data. The goal of this research was to reduce that large amount of data to a set of key features that best describe wheelchair mobility performance in match play, and to present them in a meaningful way for both scientists and athletes. To test the discriminative power, wheelchair mobility characteristics of athletes with different performance levels were compared. The wheelchair kinematics of 29 (inter-)national level athletes were measured during a match using three inertial sensors mounted on the wheelchair. Principal component analysis was used to reduce 22 kinematic outcomes to a set of six outcomes covering linear and rotational movement; speed and acceleration; and average and best performance. In addition, it was explored whether groups of athletes with known performance differences based on their impairment classification also differed with respect to these key outcomes, using univariate general linear models. For all six key outcomes, classification proved to be a significant factor (p < 0.05). We composed a set of six key kinematic outcomes that accurately describe wheelchair mobility performance in match play. The key kinematic outcomes were displayed in an easy-to-interpret way, usable for athletes, coaches and scientists. This standardized representation enables comparison of different wheelchair sports regarding wheelchair mobility, but also evaluation at the level of an individual athlete. By this means, the tool could enhance further development of wheelchair sports in general. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Improved diagnosis of pulmonary emphysema using in vivo dark-field radiography.

    PubMed

    Meinel, Felix G; Yaroshenko, Andre; Hellbach, Katharina; Bech, Martin; Müller, Mark; Velroyen, Astrid; Bamberg, Fabian; Eickelberg, Oliver; Nikolaou, Konstantin; Reiser, Maximilian F; Pfeiffer, Franz; Yildirim, Ali Ö

    2014-10-01

    The purpose of this study was to assess whether the recently developed method of grating-based x-ray dark-field radiography can improve the diagnosis of pulmonary emphysema in vivo. Pulmonary emphysema was induced in female C57BL/6N mice using endotracheal instillation of porcine pancreatic elastase and confirmed by in vivo pulmonary function tests, histopathology, and quantitative morphometry. The mice were anesthetized but breathing freely during imaging. Experiments were performed using a prototype small-animal x-ray dark-field scanner that was operated at 35 kilovolt (peak) with an exposure time of 5 seconds for each of the 10 grating steps. Images were compared visually. For quantitative comparison of signal characteristics, regions of interest were placed in the upper, middle, and lower zones of each lung. Receiver-operating-characteristic statistics were performed to compare the effectiveness of transmission and dark-field signal intensities and the combined parameter "normalized scatter" to differentiate between healthy and emphysematous lungs. A clear visual difference between healthy and emphysematous mice was found for the dark-field images. Quantitative measurements of x-ray dark-field signal and normalized scatter were significantly different between the mice with pulmonary emphysema and the control mice and showed good agreement with pulmonary function tests and quantitative histology. The normalized scatter showed a significantly higher discriminatory power (area under the receiver-operating-characteristic curve [AUC], 0.99) than dark-field (AUC, 0.90; P = 0.01) or transmission signal (AUC, 0.69; P < 0.001) alone did, allowing for an excellent discrimination of healthy and emphysematous lung regions. In a murine model, x-ray dark-field radiography is technically feasible in vivo and represents a substantial improvement over conventional transmission-based x-ray imaging for the diagnosis of pulmonary emphysema.
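The ROC comparison above ranks signal measures by their area under the curve (AUC). As an illustrative sketch (not the study's code), the AUC of a candidate discriminator such as the normalized scatter can be computed directly from per-region scores via the Mann-Whitney statistic; all values below are hypothetical:

```python
# Illustrative sketch: AUC as the probability that a randomly chosen
# diseased region scores higher than a randomly chosen healthy one,
# with ties counted as one half (Mann-Whitney U / (n_pos * n_neg)).

def auc(pos, neg):
    """AUC = P(score_pos > score_neg) + 0.5 * P(score_pos == score_neg)."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical "normalized scatter" values for emphysematous vs. healthy ROIs
emphysema = [0.9, 0.8, 0.85, 0.7]
healthy = [0.2, 0.3, 0.25, 0.7]
print(auc(emphysema, healthy))
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, which is the scale on which the dark-field, transmission and normalized-scatter signals are compared above.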

  7. Qualitative and quantitative analysis of the diuretic component ergone in Polyporus umbellatus by HPLC with fluorescence detection and HPLC-APCI-MS/MS.

    PubMed

    Zhao, Ying-Yong; Zhao, Ye; Zhang, Yong-Min; Lin, Rui-Chao; Sun, Wen-Ji

    2009-06-01

Polyporus umbellatus is a widely used anti-aldosteronic diuretic in Traditional Chinese medicine (TCM). A new, sensitive and selective high-performance liquid chromatography-fluorescence detection (HPLC-FLD) and high-performance liquid chromatography-atmospheric pressure chemical ionization-mass spectrometry (HPLC-APCI-MS/MS) method for quantitative and qualitative determination of ergosta-4,6,8(14),22-tetraen-3-one (ergone), the main diuretic component, was developed for quality control of P. umbellatus crude drug. The ergone in the ethanolic extract of P. umbellatus was unambiguously characterized by HPLC-APCI-MS/MS, and further confirmed by comparison with a standard compound. Trace ergone was detected by the sensitive and selective HPLC-FLD. Linearity (r² > 0.9998) and recoveries at low, medium and high concentrations (100.5%, 100.2% and 100.4%) met the experimental criteria. The limit of detection (LOD) of ergone was around 0.2 µg/mL. Our results indicated that the content of ergone in P. umbellatus varied significantly from habitat to habitat, with contents ranging from 2.13 ± 0.02 to 59.17 ± 0.05 µg/g. Comparison among the HPLC-FLD, HPLC-UV and HPLC-APCI-MS/MS methods demonstrated that HPLC-FLD and HPLC-APCI-MS/MS gave similar quantitative results for the selected herb samples, whereas HPLC-UV gave lower results. The new HPLC-FLD method has the advantages of being rapid, simple, selective and sensitive, and could be used for routine analysis of P. umbellatus crude drug.

  8. Quantitative performance evaluation of 124I PET/MRI lesion dosimetry in differentiated thyroid cancer

    NASA Astrophysics Data System (ADS)

    Wierts, R.; Jentzen, W.; Quick, H. H.; Wisselink, H. J.; Pooters, I. N. A.; Wildberger, J. E.; Herrmann, K.; Kemerink, G. J.; Backes, W. H.; Mottaghy, F. M.

    2018-01-01

The aim was to investigate the quantitative performance of 124I PET/MRI for pre-therapy lesion dosimetry in differentiated thyroid cancer (DTC). Phantom measurements were performed on a PET/MRI system (Biograph mMR, Siemens Healthcare) using 124I and 18F. The PET calibration factor and the influence of radiofrequency coil attenuation were determined using a cylindrical phantom homogeneously filled with radioactivity. The calibration factor was 1.00 ± 0.02 for 18F and 0.88 ± 0.02 for 124I. Near the radiofrequency surface coil, an underestimation of less than 5% in radioactivity concentration was observed. Soft-tissue sphere recovery coefficients were determined using the NEMA IEC body phantom. Recovery coefficients were systematically higher for 18F than for 124I. In addition, the six spheres of the phantom were segmented using a PET-based iterative segmentation algorithm. For all 124I measurements, the deviations in segmented lesion volume and mean radioactivity concentration relative to the actual values were smaller than 15% and 25%, respectively. The effect of MR-based attenuation correction (three- and four-segment µ-maps) on bone lesion quantification was assessed using radioactive spheres filled with a K2HPO4 solution mimicking bone lesions. The four-segment µ-map resulted in an underestimation of the imaged radioactivity concentration of up to 15%, whereas the three-segment µ-map resulted in an overestimation of up to 10%. For twenty lesions identified in six patients, a comparison of 124I PET/MRI to PET/CT was performed with respect to segmented lesion volume and radioactivity concentration. The intraclass correlation coefficients showed excellent agreement in segmented lesion volume and radioactivity concentration (0.999 and 0.95, respectively). In conclusion, accurate quantitative 124I PET/MRI is feasible and could be used to perform radioiodine pre-therapy lesion dosimetry in DTC.
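The calibration factor and the quoted over- and underestimations are ratios of measured to true radioactivity concentration. A minimal sketch of these two bookkeeping quantities, with hypothetical numbers rather than the study's data:

```python
# Sketch of the basic quantities behind PET phantom quantification:
# a recovery coefficient and a signed percent deviation. Values are
# hypothetical and for illustration only.

def recovery_coefficient(measured_conc, true_conc):
    """Ratio of imaged to actual radioactivity concentration."""
    return measured_conc / true_conc

def percent_deviation(measured, actual):
    """Signed deviation relative to the actual value, in percent."""
    return (measured - actual) / actual * 100.0

# Hypothetical example: a sphere filled to 10.0 kBq/mL imaged at 8.8 kBq/mL
print(recovery_coefficient(8.8, 10.0))
print(percent_deviation(8.8, 10.0))
```

On this scale a recovery coefficient of 1.0 means perfect quantification; the abstract's acceptance thresholds (deviations below 15% and 25%) are bounds on the percent deviation.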

  9. Quantitative analysis of background parenchymal enhancement in whole breast on MRI: Influence of menstrual cycle and comparison with a qualitative analysis.

    PubMed

    Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee

    2018-06-01

We quantitatively analyzed background parenchymal enhancement (BPE) in the whole breast according to menstrual cycle and compared it with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) scans from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software written in MATLAB. For each voxel of the whole breast, the software calculated BPE using the following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group, showing a significant difference (p = .009 for minimal vs. mild, p < .001 for all other comparisons). Spearman's correlation test showed a strong, significant correlation between qualitative and quantitative BPE (r = 0.63, p < .001). The mean BPE value was 48.7% for patients in the first week of the menstrual cycle, 43.5% in the second week, 49% in the third week, and 49.4% in the fourth week. The difference between the second and fourth weeks was significant (p = .005). Median, 90th percentile, and 10th percentile values were also significantly different between the second and fourth weeks but not in the other comparisons (first vs. second, first vs. third, first vs. fourth, second vs. third, or third vs. fourth). Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
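The per-voxel equation quoted in the abstract is a simple relative-enhancement calculation. A minimal sketch with hypothetical signal intensities (this is not the authors' MATLAB software):

```python
# Sketch of the quoted BPE equation:
# BPE(%) = (SI at 1 min 30 s post-contrast - baseline SI) / baseline SI * 100

def bpe_percent(si_baseline, si_post):
    """Relative enhancement of one voxel, in percent."""
    return (si_post - si_baseline) / si_baseline * 100.0

def mean_bpe(baseline, post):
    """Average the per-voxel BPE over a whole-breast voxel list."""
    return sum(bpe_percent(b, p) for b, p in zip(baseline, post)) / len(baseline)

print(bpe_percent(200.0, 290.0))  # one voxel enhancing by 45.0%
```

The whole-breast summary statistics reported above (mean, median, 10th/90th percentile) are then taken over this per-voxel map.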

  10. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.

  11. Chronic Obstructive Pulmonary Disease: Lobe-based Visual Assessment of Volumetric CT by Using Standard Images—Comparison with Quantitative CT and Pulmonary Function Test in the COPDGene Study

    PubMed Central

    Kim, Song Soo; Lee, Ho Yun; Nevrekar, Dipti V.; Forssen, Anna V.; Crapo, James D.; Schroeder, Joyce D.; Lynch, David A.

    2013-01-01

    Purpose: To provide a new detailed visual assessment scheme of computed tomography (CT) for chronic obstructive pulmonary disease (COPD) by using standard reference images and to compare this visual assessment method with quantitative CT and several physiologic parameters. Materials and Methods: This research was approved by the institutional review board of each institution. CT images of 200 participants in the COPDGene study were evaluated. Four thoracic radiologists performed independent, lobar analysis of volumetric CT images for type (centrilobular, panlobular, and mixed) and extent (on a six-point scale) of emphysema, the presence of bronchiectasis, airway wall thickening, and tracheal abnormalities. Standard images for each finding, generated by two radiologists, were used for reference. The extent of emphysema, airway wall thickening, and luminal area were quantified at the lobar level by using commercial software. Spearman rank test and simple and multiple regression analyses were performed to compare the results of visual assessment with physiologic and quantitative parameters. Results: The type of emphysema, determined by four readers, showed good agreement (κ = 0.63). The extent of the emphysema in each lobe showed good agreement (mean weighted κ = 0.70) and correlated with findings at quantitative CT (r = 0.75), forced expiratory volume in 1 second (FEV1) (r = −0.68), FEV1/forced vital capacity (FVC) ratio (r = −0.74) (P < .001). Agreement for airway wall thickening was fair (mean κ = 0.41), and the number of lobes with thickened bronchial walls correlated with FEV1 (r = −0.60) and FEV1/FVC ratio (r = −0.60) (P < .001). Conclusion: Visual assessment of emphysema and airways disease in individuals with COPD can provide reproducible, physiologically substantial information that may complement that provided by quantitative CT assessment. 
© RSNA, 2012. Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12120385/-/DC1 (PMID: 23220894)
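Inter-reader agreement on an ordinal scale such as the six-point emphysema-extent score above is typically summarized with a weighted κ. A self-contained sketch of linearly weighted κ for two readers, using toy ratings rather than the study data:

```python
# Sketch of weighted kappa for two raters on an ordinal scale with
# categories 0 .. n_cats-1. Toy data; not the COPDGene reader scores.

def weighted_kappa(rater1, rater2, n_cats, weight="linear"):
    n = len(rater1)
    obs = [[0] * n_cats for _ in range(n_cats)]
    for a, b in zip(rater1, rater2):
        obs[a][b] += 1
    marg1 = [sum(row) / n for row in obs]
    marg2 = [sum(obs[i][j] for i in range(n_cats)) / n for j in range(n_cats)]

    def w(i, j):  # disagreement weight, 0 on the diagonal
        d = abs(i - j) / (n_cats - 1)
        return d if weight == "linear" else d * d

    p_obs = sum(w(i, j) * obs[i][j] / n
                for i in range(n_cats) for j in range(n_cats))
    p_exp = sum(w(i, j) * marg1[i] * marg2[j]
                for i in range(n_cats) for j in range(n_cats))
    return 1.0 - p_obs / p_exp

# Toy six-point emphysema-extent scores for ten lobes from two readers
r1 = [0, 1, 2, 3, 4, 5, 1, 2, 3, 4]
r2 = [0, 1, 2, 3, 5, 5, 2, 2, 3, 3]
print(round(weighted_kappa(r1, r2, 6), 2))
```

Weighting credits near-misses on the ordinal scale, so a one-step disagreement penalizes κ less than a five-step one; κ = 1 is perfect agreement and 0 is chance-level.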

  12. Initial experience with 3D isotropic high-resolution 3 T MR arthrography of the wrist.

    PubMed

    Sutherland, John K; Nozaki, Taiki; Kaneko, Yasuhito; J Yu, Hon; Rafijah, Gregory; Hitt, David; Yoshioka, Hiroshi

    2016-01-16

Our study was performed to evaluate the image quality of 3 T MR wrist arthrograms, with attention to ulnar wrist structures, comparing an isotropic 3D proton density fat-suppressed turbo spin echo (PDFS TSE) sequence with standard 2D 3 T sequences as well as with 1.5 T MR arthrograms. Eleven consecutive 3 T MR wrist arthrograms were performed, and the following sequences were evaluated: 3D isotropic PDFS, repetition time/echo time (TR/TE) 1400/28.3 ms, voxel size 0.35 × 0.35 × 0.35 mm, acquisition time 5 min; 2D coronal sequences with slice thickness 2 mm: T1 fat-suppressed turbo spin echo (T1FS TSE) (TR/TE 600/20 ms) and proton density (PD) TSE (TR/TE 3499/27 ms). A 1.5 T group of 18 studies with standard sequences was evaluated for comparison. All MR imaging followed fluoroscopically guided intra-articular injection of dilute gadolinium contrast. Qualitative assessment of the delineation of anatomic structures between 1.5 T and 3 T MR arthrograms was carried out using the Mann-Whitney test, and differences in delineation of anatomic structures among the sequences in the 3 T group were analyzed with the Wilcoxon signed-rank test. Quantitative assessment of mean relative signal intensity (SI) and relative contrast measurements was performed using the Wilcoxon signed-rank test. Mean qualitative scores for the 3 T sequences were significantly higher than for 1.5 T (p < 0.01), with the isotropic 3D PDFS sequence having the highest mean qualitative scores (p < 0.05). Quantitative analysis demonstrated no significant difference in relative signal intensity among the 3 T sequences. Significant differences were found in relative contrast between fluid-bone and fluid-fat comparing 3D and 2D PDFS (p < 0.01). The 3D isotropic PDFS sequence showed promise in both qualitative and quantitative assessment, suggesting it may be useful for MR wrist arthrograms at 3 T. Primary reasons for its diagnostic potential include the ability to make reformations in any obliquity to follow the components of ulnar-side wrist structures, including the triangular fibrocartilage complex. Additionally, isotropic imaging provides thinner slices with less partial volume averaging, allowing identification of subtle injuries.

  13. Physics-Based Image Segmentation Using First Order Statistical Properties and Genetic Algorithm for Inductive Thermography Imaging.

    PubMed

    Gao, Bin; Li, Xiaoqing; Woo, Wai Lok; Tian, Gui Yun

    2018-05-01

Thermographic inspection has been widely applied to non-destructive testing and evaluation, offering rapid, contactless, large-surface-area detection. Image segmentation is considered essential for identifying and sizing defects. To attain high-level performance, specific physics-based models that describe defect generation and enable precise extraction of the target region are of crucial importance. In this paper, an effective genetic first-order statistical image segmentation algorithm is proposed for quantitative crack detection. The proposed method automatically extracts valuable spatial-temporal patterns via an unsupervised feature-extraction algorithm and avoids a range of issues associated with human intervention in the laborious manual selection of specific thermal video frames for processing. An internal genetic functionality is built into the proposed algorithm to automatically control the segmentation threshold and render enhanced accuracy in sizing the cracks. Eddy current pulsed thermography is used as a platform to demonstrate surface crack detection. Experimental tests and comparisons have been conducted to verify the efficacy of the proposed method. In addition, a global quantitative assessment index, the F-score, has been adopted to objectively evaluate the performance of different segmentation algorithms.
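The F-score used above to rank segmentation algorithms is the harmonic mean of pixel-level precision and recall against a ground-truth mask. A minimal sketch over binary masks (hypothetical data, not the paper's implementation):

```python
# Sketch of a pixel-level F-measure for evaluating a binary segmentation
# against ground truth. Masks are flat lists of 0/1 pixel labels.

def f_score(pred_mask, truth_mask, beta=1.0):
    """F-beta score; beta > 1 weights recall more heavily than precision."""
    tp = sum(1 for p, t in zip(pred_mask, truth_mask) if p and t)
    fp = sum(1 for p, t in zip(pred_mask, truth_mask) if p and not t)
    fn = sum(1 for p, t in zip(pred_mask, truth_mask) if not p and t)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Toy example: predicted crack mask vs. ground truth over 8 pixels
pred = [1, 1, 1, 0, 0, 0, 1, 0]
truth = [1, 1, 0, 0, 1, 0, 1, 0]
print(f_score(pred, truth))
```

Because it penalizes both over-segmentation (false positives) and under-segmentation (false negatives), a single F-score allows different threshold-selection strategies to be compared objectively.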

  14. The Focinator v2-0 - Graphical Interface, Four Channels, Colocalization Analysis and Cell Phase Identification.

    PubMed

    Oeck, Sebastian; Malewicz, Nathalie M; Hurst, Sebastian; Al-Refae, Klaudia; Krysztofiak, Adam; Jendrossek, Verena

    2017-07-01

The quantitative analysis of foci plays an important role in various cell biological methods. In the fields of radiation biology and experimental oncology, the effect of ionizing radiation, chemotherapy or molecularly targeted drugs on DNA damage induction and repair is frequently assessed by the analysis of protein clusters or phosphorylated proteins recruited to so-called repair foci at DNA damage sites, involving, for example, γ-H2A.X, 53BP1 or RAD51. We recently developed "The Focinator" as a reliable and fast tool for automated quantitative and qualitative analysis of nuclei and DNA damage foci. The refined software is now even more user-friendly due to a graphical interface and further features. We included an R-script-based mode for automated image opening, file naming, progress monitoring and error reporting. Consequently, the evaluation no longer requires the attendance of the operator after initial parameter definition. Moreover, the Focinator v2-0 is now able to perform multi-channel analysis of four channels and to evaluate protein-protein colocalization by comparison of up to three foci channels. This enables, for example, the quantification of foci in cells in a specific cell-cycle phase.

  15. Developing a more useful surface quality metric for laser optics

    NASA Astrophysics Data System (ADS)

    Turchette, Quentin; Turner, Trey

    2011-02-01

Light scatter due to surface defects on laser resonator optics produces losses which lower system efficiency and output power. The traditional methodology for surface quality inspection involves visual comparison of a component to scratch and dig (SAD) standards under controlled lighting and viewing conditions. Unfortunately, this process is subjective and operator dependent. Also, there is no clear correlation between inspection results and the actual performance impact of the optic in a laser resonator. As a result, laser manufacturers often overspecify surface quality in order to ensure that optics will not degrade laser performance due to scatter. This can drive up component costs and lengthen lead times. Alternatively, an objective test system for measuring optical scatter from defects can be constructed with a microscope, calibrated lighting, a CCD detector and image processing software. This approach is quantitative, highly repeatable and totally operator independent. Furthermore, it is flexible, allowing the user to set threshold levels as to what will or will not constitute a defect. This paper details how this automated, quantitative type of surface quality measurement can be constructed, and shows how its results correlate with conventional loss measurement techniques such as cavity ringdown times.
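A sketch of the threshold-based idea described above: flag connected groups of defect pixels in the camera image and keep only those exceeding a user-set area, so that "what counts as a defect" becomes an explicit, repeatable parameter. This is pure-Python toy code with a hypothetical image, not the paper's system:

```python
# Sketch of objective defect detection: label 4-connected pixel regions
# darker than `thresh` (defects deviate from the bright background) and
# keep those with at least `min_area` pixels. `img` is a list of rows
# of gray levels. Hypothetical data for illustration.

def defect_regions(img, thresh, min_area):
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for y in range(h):
        for x in range(w):
            if img[y][x] < thresh and not seen[y][x]:
                stack, area = [(y, x)], 0          # flood-fill one region
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    area += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx] and img[ny][nx] < thresh):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    return [a for a in areas if a >= min_area]

# Hypothetical 4x4 frame: one two-pixel defect and one isolated dark pixel
frame = [[255, 255, 255, 255],
         [255,  10,  20, 255],
         [255, 255, 255, 255],
         [ 30, 255, 255, 255]]
print(defect_regions(frame, thresh=128, min_area=2))  # [2]
```

Raising `min_area` or `thresh` tightens or loosens the defect criterion, which is exactly the kind of operator-independent tunability the paragraph describes.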

  16. Determination of mean rainfall from the Special Sensor Microwave/Imager (SSM/I) using a mixed lognormal distribution

    NASA Technical Reports Server (NTRS)

    Berg, Wesley; Chase, Robert

    1992-01-01

Global estimates of monthly, seasonal, and annual oceanic rainfall are computed for a period of one year using data from the Special Sensor Microwave/Imager (SSM/I). Instantaneous rainfall estimates are derived from brightness temperature values obtained from the satellite data using the Hughes D-matrix algorithm. The instantaneous rainfall estimates are stored in 1 deg square bins over the global oceans for each month. A mixed probability distribution, combining a lognormal distribution describing the positive rainfall values and a spike at zero describing the observations indicating no rainfall, is used to compute mean values. The resulting data for the period of interest are fitted to a lognormal distribution using a maximum-likelihood method. Mean values are computed for the mixed distribution, and qualitative comparisons with published historical results as well as quantitative comparisons with corresponding in situ raingage data are performed.
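The mixture described above combines a point mass at zero with a lognormal for positive rain rates, so its mean is the rain probability times the lognormal mean exp(μ + σ²/2). A minimal sketch with hypothetical bin data (not the SSM/I processing code):

```python
import math

# Sketch of the mean of a delta-at-zero + lognormal mixture, with the
# lognormal parameters fitted by maximum likelihood (mu and sigma^2 are
# the mean and variance of log rain rate). Hypothetical inputs.

def mixed_lognormal_mean(rain_values, n_zero):
    """Mean rain rate for one bin.

    rain_values: positive rainfall observations (hypothetical units);
    n_zero: number of observations reporting no rain.
    """
    n_pos = len(rain_values)
    p_rain = n_pos / (n_pos + n_zero)
    logs = [math.log(r) for r in rain_values]
    mu = sum(logs) / n_pos
    sigma2 = sum((l - mu) ** 2 for l in logs) / n_pos   # ML estimate
    return p_rain * math.exp(mu + sigma2 / 2)

# Two rainy and two dry observations in a hypothetical bin
print(mixed_lognormal_mean([2.0, 2.0], n_zero=2))
```

Handling the zero-rain spike separately keeps the lognormal fit restricted to the positive observations, which is what makes the mixture appropriate for intermittent rainfall.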

  17. Comparison of Aircraft Icing Growth Assessment Software

    NASA Technical Reports Server (NTRS)

    Wright, William; Potapczuk, Mark G.; Levinson, Laurie H.

    2011-01-01

A research project is underway to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. An extensive, quantitative comparison of the results against the database of ice shapes generated in the NASA Glenn Icing Research Tunnel (IRT) has been performed, including additional data taken to extend the database into the Super-cooled Large Drop (SLD) regime. The project shows the differences in ice shape between LEWICE 3.2.2, GlennICE, and experimental data, and addresses the validation of the software against a recent set of ice-shape data in the SLD regime. This validation effort mirrors a similar effort undertaken for previous validations of LEWICE; those reports quantified the ice accretion prediction capabilities of the LEWICE software. Several ice geometry features were proposed for comparing ice shapes in a quantitative manner. The resulting analysis showed that LEWICE compared well to the available experimental data.

  18. Modeling of Convective-Stratiform Precipitation Processes: Sensitivity to Partitioning Methods

    NASA Technical Reports Server (NTRS)

    Lang, S. E.; Tao, W.-K.; Simpson, J.; Ferrier, B.; Starr, David OC. (Technical Monitor)

    2001-01-01

    Six different convective-stratiform separation techniques, including a new technique that utilizes the ratio of vertical and terminal velocities, are compared and evaluated using two-dimensional numerical simulations of a tropical [Tropical Ocean Global Atmosphere Coupled Ocean Atmosphere Response Experiment (TOGA COARE)] and midlatitude continental [Preliminary Regional Experiment for STORM-Central (PRESTORM)] squall line. Comparisons are made in terms of rainfall, cloud coverage, mass fluxes, apparent heating and moistening, mean hydrometeor profiles, CFADs (Contoured Frequency with Altitude Diagrams), microphysics, and latent heating retrieval. Overall, it was found that the different separation techniques produced results that qualitatively agreed. However, the quantitative differences were significant. Observational comparisons were unable to conclusively evaluate the performance of the techniques. Latent heating retrieval was shown to be sensitive to the use of separation technique mainly due to the stratiform region for methods that found very little stratiform rain.

  19. Comparison of the signal-to-noise characteristics of quantum versus thermal ghost imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Sullivan, Malcolm N.; Chan, Kam Wai Clifford; Boyd, Robert W.

    2010-11-15

We present a theoretical comparison of the signal-to-noise characteristics of quantum versus thermal ghost imaging. We first calculate the signal-to-noise ratio of each process in terms of its controllable experimental conditions. We show that a key distinction is that a thermal ghost image always resides on top of a large background; the fluctuations in this background constitute an intrinsic noise source for thermal ghost imaging. In contrast, there is a negligible intrinsic background to a quantum ghost image. However, for practical reasons involving achievable illumination levels, acquisition times for thermal ghost images are often much shorter than those for quantum ghost images. We provide quantitative predictions for the conditions under which each process provides superior performance. Our conclusion is that each process can provide useful functionality, although under complementary conditions.

  20. DOT/NASA comparative assessment of Brayton engines for guideway vehicle and buses. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1975-01-01

The Department of Transportation requested that the NASA Office of Aeronautics and Space Technology evaluate and assess the potential of several types of gas turbine engines and fuels for the on-board power and propulsion of a future heavy-duty ground transportation system. The purpose of the investigation was threefold: (1) to provide a definition of the potential for turbine engines to minimize pollution, energy consumption, and noise; (2) to provide a useful means of comparison of the engine types based on consistent assumptions and a common analytical approach; and (3) to provide a compendium of comparative performance data that would serve as the technical basis for future planning. Emphasis was on establishing comparative trends rather than on absolute values or a definitive engine selection. The primary value of this study is intended to be the usefulness of its results in providing a quantitative basis for future judgement.

  1. Proposal of an innovative benchmark for comparison of the performance of contactless digitizers

    NASA Astrophysics Data System (ADS)

    Iuliano, Luca; Minetola, Paolo; Salmi, Alessandro

    2010-10-01

Thanks to the improving performance of 3D optical scanners in terms of accuracy and repeatability, reverse engineering applications have extended from CAD model design or reconstruction to quality control. Today, contactless digitizing devices constitute a good alternative to coordinate measuring machines (CMMs) for the inspection of certain parts. The German guideline VDI/VDE 2634 is the only reference for evaluating whether 3D optical measuring systems comply with declared or required performance specifications. Nevertheless, it is difficult to compare the performance of different scanners using such a guideline. An adequate novel benchmark is proposed in this paper: focusing on the inspection of production tools (moulds), the innovative test piece was designed using common geometries and free-form surfaces. The reference part is intended to be employed for the evaluation of the performance of several contactless digitizing devices in computer-aided inspection, considering dimensional and geometrical tolerances as well as other quantitative and qualitative criteria.

  2. Comparison of MPEG-1 digital videotape with digitized sVHS videotape for quantitative echocardiographic measurements

    NASA Technical Reports Server (NTRS)

Garcia, M. J.; Thomas, J. D.; Greenberg, N.; Sandelski, J.; Herrera, C.; Mudd, C.; Wicks, J.; Spencer, K.; Neumann, A.; Sankpal, B.

    2001-01-01

Digital format is rapidly emerging as a preferred method for displaying and retrieving echocardiographic studies. The qualitative diagnostic accuracy of Moving Pictures Experts Group (MPEG-1) compressed digital echocardiographic studies has been previously reported. The goals of the present study were to compare quantitative measurements derived from MPEG-1 recordings with the super-VHS (sVHS) videotape clinical standard. Six reviewers performed blinded measurements from still-frame images selected from 20 echocardiographic studies that were simultaneously acquired in sVHS and MPEG-1 formats. Measurements were obtainable in 1401 (95%) of 1486 MPEG-1 variables compared with 1356 (91%) of 1486 sVHS variables (P < .001). Excellent agreement existed between MPEG-1 and sVHS 2-dimensional linear measurements (r = 0.97; MPEG-1 = 0.95[sVHS] + 1.1 mm; P < .001; Δ = 9% ± 10%), 2-dimensional area measurements (r = 0.89), color jet areas (r = 0.87, P < .001), and Doppler velocities (r = 0.92, P < .001). Interobserver variability was similar for both sVHS and MPEG-1 readings. Our results indicate that quantitative off-line measurements from MPEG-1 digitized echocardiographic studies are feasible and comparable to those obtained from sVHS.

  3. A method for normalizing pathology images to improve feature extraction for quantitative pathology.

    PubMed

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-01

With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The proposed method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and on one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases also showed improvement compared with previous methods for correcting batch effects. ICHE may be a useful preprocessing step in a digital pathology image-processing pipeline.

  4. Comparison of quantitative and qualitative tests for glucose-6-phosphate dehydrogenase deficiency in the neonatal period.

    PubMed

    Keihanian, F; Basirjafari, S; Darbandi, B; Saeidinia, A; Jafroodi, M; Sharafi, R; Shakiba, M

    2017-06-01

    Considering the high prevalence of glucose-6-phosphate dehydrogenase (G6PD) deficiency among newborns, different screening methods have been established in various countries. In this study, we aimed to assess the prevalence of G6PD deficiency among newborns in Rasht, Iran, and to compare G6PD activity in cord blood samples using quantitative and qualitative tests. This cross-sectional, prospective study was performed at the five largest hospitals in Rasht, Guilan Province, Iran. Screening tests were performed for all newborns referred to these hospitals. G6PD activity in the specimens was characterized using the quantitative kinetic method and the qualitative fluorescent spot test (FST), read under ultraviolet light. We also determined the sensitivity, specificity, negative predictive value, and positive predictive value of the qualitative assay. Blood samples were collected from 1474 newborns. Overall, 757 (51.4%) subjects were male. As the findings revealed, 1376 (93.4%) newborns showed normal G6PD activity, while 98 (6.6%) had G6PD deficiency. There was a significant difference in the mean G6PD level between males and females (P = 0.0001). Also, a significant relationship was detected between FST results and the mean values obtained in the quantitative test (P < 0.0001). According to the present study, FST showed acceptable sensitivity and specificity for G6PD activity, although it appeared inefficient for diagnostic purposes in some cases. © 2017 John Wiley & Sons Ltd.

  5. Performance characteristics of the ARCHITECT Active-B12 (Holotranscobalamin) assay.

    PubMed

    Merrigan, Stephen D; Owen, William E; Straseski, Joely A

    2015-01-01

    Vitamin B12 (cobalamin) is a necessary cofactor in methionine and succinyl-CoA metabolism. Studies estimate the deficiency prevalence as high as 30% in the elderly population. Ten to thirty percent of circulating cobalamin is bound to transcobalamin (holotranscobalamin, holoTC) which can readily enter cells and is therefore considered the bioactive form. The objective of our study was to evaluate the analytical performance of a high-throughput, automated holoTC assay (ARCHITECT i2000(SR) Active-B12 (Holotranscobalamin)) and compare it to other available methods. Manufacturer-specified limits of blank (LoB), detection (LoD), and quantitation (LoQ), imprecision, interference, and linearity were evaluated for the ARCHITECT HoloTC assay. Residual de-identified serum samples were used to compare the ARCHITECT HoloTC assay with the automated AxSYM Active-B12 (Holotranscobalamin) assay (Abbott Diagnostics) and the manual Active-B12 (Holotranscobalamin) Enzyme Immunoassay (EIA) (Axis-Shield Diagnostics, Dundee, Scotland, UK). Manufacturer's claims of LoB, LoD, LoQ, imprecision, interference, and linearity to the highest point tested (113.4 pmol/L) were verified for the ARCHITECT HoloTC assay. Method comparison of the ARCHITECT HoloTC to the AxSYM HoloTC produced the following Deming regression statistics: (ARCHITECT(HoloTc)) = 0.941 (AxSYM(HoloTC)) + 1.2 pmol/L, S(y/x) = 6.4, r = 0.947 (n = 98). Comparison to the Active-B12 EIA produced: (ARCHITECT(HoloTC)) = 1.105 (EIA(Active-B12)) - 6.8 pmol/L, S(y/x) = 11.0, r = 0.950 (n = 221). This assay performed acceptably for LoB, LoD, LoQ, imprecision, interference, linearity and method comparison to the predicate device (AxSYM). An additional comparison to a manual Active-B12 EIA method performed similarly, with minor exceptions. This study determined that the ARCHITECT HoloTC assay is suitable for routine clinical use, which provides a high-throughput alternative for automated testing of this emerging marker of cobalamin deficiency.
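Deming regression, used for the method comparisons above, allows measurement error in both assays rather than only in y; a minimal sketch (assuming an error-variance ratio lam = 1; no study data are reproduced here):

```python
import numpy as np

def deming(x, y, lam=1.0):
    """Deming regression y = b0 + b1 * x, with measurement error in both
    x and y and error-variance ratio lam = var(err_y) / var(err_x)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    sxx = ((x - mx) ** 2).sum() / (len(x) - 1)         # variance of x
    syy = ((y - my) ** 2).sum() / (len(y) - 1)         # variance of y
    sxy = ((x - mx) * (y - my)).sum() / (len(x) - 1)   # covariance
    b1 = ((syy - lam * sxx) +
          np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    b0 = my - b1 * mx
    return b1, b0
```

Unlike ordinary least squares, this fit is appropriate when neither method is an error-free reference, which is the situation in assay-versus-assay comparisons.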

  6. 3D analysis of bone formation around titanium implants using micro-computed tomography (μCT)

    NASA Astrophysics Data System (ADS)

    Bernhardt, Ricardo; Scharnweber, Dieter; Müller, Bert; Beckmann, Felix; Goebbels, Jürgen; Jansen, John; Schliephake, Henning; Worch, Hartmut

    2006-08-01

    The quantitative analysis of bone formation around biofunctionalised metallic implants is an important tool for the further development of implants with higher success rates. This is nowadays especially important in cases of additional diseases such as diabetes or osteoporosis. Micro-computed tomography (μCT), as a non-destructive technique, offers the possibility of quantitative three-dimensional recording of bone close to the implant's surface with micrometer resolution, which is the range of the relevant bony structures. Within different animal models using cylindrical and screw-shaped Ti6Al4V implants, we have compared visualization and quantitative analysis of newly formed bone by the use of synchrotron-radiation-based CT systems in comparison with histological findings. The SRμCT experiments were performed at the beamline BW 5 (HASYLAB at DESY, Hamburg, Germany) and at the BAMline (BESSY, Berlin, Germany). For the experiments, PMMA-embedded samples were prepared with diameters of about 8 mm, which contain the implant in the center surrounded by bony tissue. To (locally) quantify the bone formation, models were developed and optimized. The comparison of the results obtained by SRμCT and histology demonstrates the advantages and disadvantages of both approaches, although the bone formation values for the different biofunctionalized implants are identical within the error bars. SRμCT allows the clear identification of fully mineralized bone around the different titanium implants. As hundreds of virtual slices were easily generated for the individual samples, the quantification and interactive bone detection led to conclusions of high precision and statistical relevance. In this way, SRμCT in combination with interactive data analysis proves more informative than classical histology.

  7. The role of PET quantification in cardiovascular imaging.

    PubMed

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

    Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. Myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (18F-FDG) and rest perfusion imaging. Myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to the enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in an automated and rapid fashion have been developed for 13N-ammonia, 15O-water and 82Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of 82Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as 18F-flurpiridaz, may allow further improvements in disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of 18F-FDG and 18F-sodium fluoride tracers in the carotids, aorta and coronary arteries has been demonstrated.

  8. Evaluating reporter genes of different luciferases for optimized in vivo bioluminescence imaging of transplanted neural stem cells in the brain.

    PubMed

    Mezzanotte, Laura; Aswendt, Markus; Tennstaedt, Annette; Hoeben, Rob; Hoehn, Mathias; Löwik, Clemens

    2013-01-01

    Bioluminescence imaging (BLI) has become the method of choice for optical tracking of cells in small laboratory animals. However, the use of luciferases from different species, depending on different substrates and emitting at distinct wavelengths, has not been optimized for sensitive neuroimaging. In order to identify the most suitable luciferase, this quantitative study compared the luciferases Luc2, CBG99, PpyRE9 and hRluc. Human embryonic kidney (HEK-293) cells and mouse neural stem cells were transduced by lentiviral vector-mediated transfer to express one of the four luciferases, together with copGFP. A T2A peptide linker promoted stoichiometric expression between both imaging reporters and the comparison of cell populations upon flow cytometry. Cell dilution series were used to determine highest BLI sensitivity in vitro for Luc2. However, Coelenterazine h-dependent hRluc signals clearly exceeded d-luciferin-dependent BLI in vitro. For the quantitative in vivo analysis, cells were transplanted into mouse brain and BLI was performed including the recording of emission kinetics and spectral characteristics. Differences in light kinetics were observed for d-luciferin vs Coelenterazine h. The emission spectra of Luc2 and PpyRE9 remained almost unchanged, while the emission spectrum of CBG99 became biphasic. Most importantly, photon emission decreased in the order of Luc2, CBG99, PpyRE9 to hRluc. The feasibility of combining different luciferases for dual color and dual substrate neuroimaging was tested and discussed. This investigation provides the first complete quantitative comparison of different luciferases expressed by neural stem cells. It results in a clear recommendation of Luc2 as the best luciferase selection for in vivo neuroimaging. Copyright © 2013 John Wiley & Sons, Ltd.

  9. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and unconfined vapour cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the available basic data for the gas pipelines and the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  10. Comparative study of the dynamics of lipid membrane phase decomposition in experiment and simulation.

    PubMed

    Burger, Stefan; Fraunholz, Thomas; Leirer, Christian; Hoppe, Ronald H W; Wixforth, Achim; Peter, Malte A; Franke, Thomas

    2013-06-25

    Phase decomposition in lipid membranes has been the subject of numerous investigations by both experiment and theoretical simulation, yet quantitative comparisons of the simulated data to the experimental results are rare. In this work, we present a novel way of comparing the temporal development of liquid-ordered domains obtained from numerically solving the Cahn-Hilliard equation and by inducing a phase transition in giant unilamellar vesicles (GUVs). Quantitative comparison is done by calculating the structure factor of the domain pattern. It turns out that the decomposition takes place in three distinct regimes in both experiment and simulation. These regimes are characterized by different rates of growth of the mean domain diameter, and there is quantitative agreement between experiment and simulation as to the duration of each regime and the absolute rate of growth in each regime.
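A structure factor of the kind used for the comparison can be computed generically as the radially averaged power spectrum of the mean-subtracted domain pattern; a sketch (not the authors' code, and independent of the Cahn-Hilliard solver):

```python
import numpy as np

def structure_factor(field, nbins=32):
    """Radially averaged structure factor S(k) of a square 2-D field.

    The field's mean is removed, its 2-D power spectrum is computed,
    and the power is averaged over annular wavenumber bins."""
    f = field - field.mean()
    power = np.abs(np.fft.fftshift(np.fft.fft2(f))) ** 2
    n = field.shape[0]
    ky, kx = np.indices(power.shape)
    k = np.hypot(kx - n // 2, ky - n // 2)         # radial wavenumber
    bins = np.linspace(0.0, k.max(), nbins + 1)
    which = np.clip(np.digitize(k.ravel(), bins) - 1, 0, nbins - 1)
    s = np.bincount(which, weights=power.ravel(), minlength=nbins)
    counts = np.bincount(which, minlength=nbins)
    return np.where(counts > 0, s / np.maximum(counts, 1), 0.0)
```

The wavenumber at which S(k) peaks tracks the inverse of the mean domain diameter, so following the peak over time yields the growth rate in each regime.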

  11. Computational comparison of aortic root stresses in presence of stentless and stented aortic valve bio-prostheses.

    PubMed

    Nestola, M G C; Faggiano, E; Vergara, C; Lancellotti, R M; Ippolito, S; Antona, C; Filippi, S; Quarteroni, A; Scrofani, R

    2017-02-01

    We provide a computational comparison of the performance of stentless and stented aortic prostheses in terms of aortic root displacements and internal stresses. To this aim, we consider three real patients; for each of them, we construct the two prosthesis configurations, which are characterized by different mechanical properties, and we also consider the native configuration. For each of these scenarios, we solve the fluid-structure interaction problem arising between blood and the aortic root using finite elements. In particular, the Arbitrary Lagrangian-Eulerian formulation is used for the numerical solution of the fluid-dynamic equations, and a hyperelastic material model is adopted to predict the mechanical response of the aortic wall and the two prostheses. The computational results are analyzed in terms of aortic flow, internal wall stresses and aortic wall/prosthesis displacements; a quantitative comparison of the mechanical behavior of the three scenarios is reported. The numerical results highlight good agreement between stentless and native displacements and internal wall stresses, whereas higher, non-physiological stresses are found for the stented case.

  12. Validation metrics for turbulent plasma transport

    DOE PAGES

    Holland, C.

    2016-06-22

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.

  13. Comparison of manual and homogenizer methods for preparation of tick-derived stabilates of Theileria parva: equivalence testing using an in vitro titration model.

    PubMed

    Mbao, V; Speybroeck, N; Berkvens, D; Dolan, T; Dorny, P; Madder, M; Mulumba, M; Duchateau, L; Brandt, J; Marcotty, T

    2005-07-01

    Theileria parva sporozoite stabilates are used in the infection and treatment method of immunization, a widely accepted control option for East Coast fever in cattle. T. parva sporozoites are extracted from infected adult Rhipicephalus appendiculatus ticks either manually, using a pestle and a mortar, or by use of an electric homogenizer. A comparison of the two methods as a function of stabilate infectivity has never been documented. This study was designed to provide a quantitative comparison of stabilates produced by the two methods. The approach was to prepare batches of stabilate by both methods and then subject them to in vitro titration. Equivalence testing was then performed on the average effective doses (ED). The ratio of infective sporozoites yielded by the two methods was found to be 1.14 in favour of the manually ground stabilate with an upper limit of the 95% confidence interval equal to 1.3. We conclude that the choice of method rests more on costs, available infrastructure and standardization than on which method produces a richer sporozoite stabilate.

  14. Comparison of pre/post-operative CT image volumes to preoperative digitization of partial hepatectomies: a feasibility study in surgical validation

    NASA Astrophysics Data System (ADS)

    Dumpuri, Prashanth; Clements, Logan W.; Li, Rui; Waite, Jonathan M.; Stefansic, James D.; Geller, David A.; Miga, Michael I.; Dawant, Benoit M.

    2009-02-01

    Preoperative planning combined with image-guidance has shown promise towards increasing the accuracy of liver resection procedures. The purpose of this study was to validate one such preoperative planning tool for four patients undergoing hepatic resection. Preoperative computed tomography (CT) images acquired before surgery were used to identify tumor margins and to plan the surgical approach for resection of these tumors. Surgery was then performed with intraoperative digitization data acquire by an FDA approved image-guided liver surgery system (Pathfinder Therapeutics, Inc., Nashville, TN). Within 5-7 days after surgery, post-operative CT image volumes were acquired. Registration of data within a common coordinate reference was achieved and preoperative plans were compared to the postoperative volumes. Semi-quantitative comparisons are presented in this work and preliminary results indicate that significant liver regeneration/hypertrophy in the postoperative CT images may be present post-operatively. This could challenge pre/post operative CT volume change comparisons as a means to evaluate the accuracy of preoperative surgical plans.

  15. Validation metrics for turbulent plasma transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, C.

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.

  16. Pleiotropy Analysis of Quantitative Traits at Gene Level by Multivariate Functional Linear Models

    PubMed Central

    Wang, Yifan; Liu, Aiyi; Mills, James L.; Boehnke, Michael; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao; Wu, Colin O.; Fan, Ruzong

    2015-01-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai–Bartlett trace, Hotelling–Lawley trace, and Wilks’s Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. PMID:25809955
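The three multivariate statistics named above are standard functions of the hypothesis (H) and error (E) sums-of-squares-and-cross-products matrices; a generic NumPy sketch (the approximate F transformations developed in the paper are omitted):

```python
import numpy as np

def manova_stats(H, E):
    """Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda
    from hypothesis (H) and error (E) SSCP matrices.

    Writing lam_i for the eigenvalues of H @ inv(E):
      Pillai    = sum of lam_i / (1 + lam_i)
      Hotelling = sum of lam_i
      Wilks     = product of 1 / (1 + lam_i)
    """
    HE = H + E
    pillai = np.trace(H @ np.linalg.inv(HE))
    hotelling = np.trace(H @ np.linalg.inv(E))
    wilks = np.linalg.det(E) / np.linalg.det(HE)
    return pillai, hotelling, wilks
```

Each statistic is then referred to its approximate F distribution to obtain a p-value for the joint association of the traits with the genetic region.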

  17. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    PubMed

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.

  18. SU-E-I-51: Quantitative Assessment of X-Ray Imaging Detector Performance in a Clinical Setting - a Simple Approach Using a Commercial Instrument

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjoeberg, J; Bujila, R; Omar, A

    2015-06-15

    Purpose: To measure and compare the performance of X-ray imaging detectors in a clinical setting using a dedicated instrument for the quantitative determination of detector performance. Methods: The DQEPro (DQE Instruments Inc., London, Ontario, Canada) was used to determine the MTF, NPS and DQE using an IEC compliant methodology for three different imaging modalities: conventional radiography (CsI-based detector), general-purpose radioscopy (CsI-based detector), and mammography (a-Se based detector). The radiation qualities (IEC) RQA-5 and RQA-M-2 were used for the CsI-based and a-Se-based detectors, respectively. The DQEPro alleviates some of the difficulties associated with DQE measurements by automatically positioning test devices over the detector, guiding the user through the image acquisition process and providing software for calculations. Results: A comparison of the NPS showed that the image noise of the a-Se detector was less correlated than the CsI detectors. A consistently higher performance was observed for the a-Se detector at all spatial frequencies (MTF: 0.97@0.25 cy/mm, DQE: 0.72@0.25 cy/mm) and the DQE drops off slower than for the CsI detectors. The CsI detector used for conventional radiography displayed a higher performance at low spatial frequencies compared to the CsI detector used for radioscopy (DQE: 0.65 vs 0.60@0.25 cy/mm). However, at spatial frequencies above 1.3 cy/mm, the radioscopy detector displayed better performance than the conventional radiography detector (DQE: 0.35 vs 0.24@2.00 cy/mm). Conclusion: The difference in the MTF, NPS and DQE that was observed for the two different CsI detectors and the a-Se detector reflect the imaging tasks that the different detector types are intended for. The DQEPro has made the determination and calculation of quantitative metrics of X-ray imaging detector performance substantially more convenient and accessible to undertake in a clinical setting.
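For context, the IEC-style formulation ties the three measured quantities together as DQE(f) = MTF(f)^2 / (q · NNPS(f)), where q is the incident photon fluence and NNPS the normalized noise power spectrum; a sketch with illustrative numbers only (not values from this study):

```python
import numpy as np

def dqe(mtf, nnps, q):
    """Detective quantum efficiency per spatial-frequency bin:
    DQE(f) = MTF(f)^2 / (q * NNPS(f)),
    with q the incident photon fluence (photons/mm^2) and NNPS the
    normalized noise power spectrum (mm^2)."""
    mtf = np.asarray(mtf, dtype=float)
    nnps = np.asarray(nnps, dtype=float)
    return mtf ** 2 / (q * nnps)
```

The instrument's value lies in measuring MTF and NNPS under controlled IEC beam qualities; the combination into DQE is then this one-line computation.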

  19. Single Laboratory Comparison of Quantitative Real-Time PCR Assays for the Detection of Human Fecal Pollution

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) methods available to detect and enumerate human fecal pollution in ambient waters. Each assay employs distinct primers and/or probes and many target different genes and microorganisms leading to potential variations in method ...

  20. Employment from Solar Energy: A Bright but Partly Cloudy Future.

    ERIC Educational Resources Information Center

    Smeltzer, K. K.; Santini, D. J.

    A comparison of quantitative and qualitative employment effects of solar and conventional systems can prove the increased employment postulated as one of the significant secondary benefits of a shift from conventional to solar energy use. Current quantitative employment estimates show solar technology-induced employment to be generally greater…

  1. Using Facebook as a LMS?

    ERIC Educational Resources Information Center

    Arabacioglu, Taner; Akar-Vural, Ruken

    2014-01-01

    The main purpose of this research was to compare the communication media according to effective teaching. For this purpose, in the research, the mixed method, including quantitative and qualitative data collecting techniques, was applied. For the quantitative part of the research, the static group comparison design was implemented as one of the…

  2. Single Laboratory Comparison of Quantitative Real-time PCR Assays for the Detection of Fecal Pollution

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) assays available to detect and enumerate fecal pollution in ambient waters. Each assay employs distinct primers and probes that target different rRNA genes and microorganisms leading to potential variations in concentration es...

  3. Comparison of genetic diversity and population structure of Pacific Coast whitebark pine across multiple markers

    Treesearch

    Andrew D. Bower; Bryce A. Richardson; Valerie Hipkins; Regina Rochefort; Carol Aubry

    2011-01-01

    Analysis of "neutral" molecular markers and "adaptive" quantitative traits are common methods of assessing genetic diversity and population structure. Molecular markers typically reflect the effects of demographic and stochastic processes but are generally assumed to not reflect natural selection. Conversely, quantitative (or "adaptive")...

  4. Testing process predictions of models of risky choice: a quantitative model comparison approach

    PubMed Central

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472

  5. [The development of a computer model in the quantitative assessment of thallium-201 myocardial scintigraphy].

    PubMed

    Raineri, M; Traina, M; Rotolo, A; Candela, B; Lombardo, R M; Raineri, A A

    1993-05-01

    Thallium-201 scintigraphy is a widely used noninvasive procedure for the detection and prognostic assessment of patients with suspected or proven coronary artery disease. Thallium uptake can be evaluated by visual analysis or by quantitative interpretation. Quantitative scintigraphy enhances disease detection in individual coronary arteries and provides a more precise estimate of the amount of ischemic myocardium, distinguishing scar from hypoperfused tissue. Owing to the large amount of data, the analysis, interpretation and comparison of thallium uptake can be very complex. We designed a computer-based system for the interpretation of quantitative thallium-201 scintigraphy uptake data. We used a database (DataEase 4.2-DataEase Italia). Our software has the following functions: data storage; calculation; conversion of numerical data into different definitions classifying myocardial perfusion; uptake data comparison; automatic conclusions; and comparison of different scintigrams for the same patient. The software is made up of 4 sections: numeric analysis, descriptive analysis, automatic conclusion, and clinical remarks. We introduced appropriate information into the computer system as "logical paths" that use "IF ... THEN" rules. The software executes these rules in order to analyze the myocardial regions in the 3 phases of scintigraphic analysis (stress, redistribution, re-injection) and in the 3 projections (LAO 45 degrees, LAT, ANT), considering our uptake cutoff and finally producing the automatic conclusions. For these reasons, our computer-based system can be considered a real "expert system".
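A single rule of the kind the authors describe might be sketched as follows (the cutoff value and the labels are illustrative placeholders, not the system's actual rules):

```python
def classify_uptake(stress, redistribution, cutoff=50):
    """Toy 'IF ... THEN' rule for one myocardial region, classifying it as
    normal uptake, reversible defect (ischemia), or fixed defect (scar).
    The cutoff of 50 (% of peak uptake) is a hypothetical placeholder."""
    if stress >= cutoff:
        return "normal"
    if redistribution >= cutoff:
        return "ischemia"   # stress defect that fills in on redistribution
    return "scar"           # fixed defect
```

The reported system chains many such rules across regions, phases, and projections before emitting its automatic conclusion.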

  6. Interpretation of protein quantitation using the Bradford assay: comparison with two calculation models.

    PubMed

    Ku, Hyung-Keun; Lim, Hyuk-Min; Oh, Kyong-Hwa; Yang, Hyo-Jin; Jeong, Ji-Seon; Kim, Sook-Kyung

    2013-03-01

    The Bradford assay is a simple method for protein quantitation, but variation in the results between proteins is a matter of concern. In this study, we compared and normalized quantitative values from two models for protein quantitation, where the residues in the protein that bind to anionic Coomassie Brilliant Blue G-250 comprise either Arg and Lys (Method 1, M1) or Arg, Lys, and His (Method 2, M2). Use of the M2 model yielded much more consistent quantitation values compared with use of the M1 model, which exhibited marked overestimations against protein standards. Copyright © 2012 Elsevier Inc. All rights reserved.
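The difference between the two calculation models can be sketched by counting putative Coomassie-binding residues in a protein sequence (Arg/Lys for M1; Arg/Lys/His for M2). The example sequence below is hypothetical:

```python
def bradford_binding_sites(seq, model="M2"):
    """Count residues assumed to bind anionic Coomassie Brilliant Blue
    G-250: M1 counts Arg (R) and Lys (K); M2 additionally counts His (H)."""
    residues = "RK" if model == "M1" else "RKH"
    return sum(seq.count(r) for r in residues)
```

Because M2 credits His as well, it yields a larger per-protein site count, which is the basis of the more consistent normalization the authors report.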

  7. Clinically Relevant Subregions of Articular Cartilage of the Hip for Analysis and Reporting Quantitative Magnetic Resonance Imaging: A Technical Note.

    PubMed

    Surowiec, Rachel K; Lucas, Erin P; Wilson, Katharine J; Saroki, Adriana J; Ho, Charles P

    2014-01-01

Before quantitative imaging techniques can become clinically valuable, the method, and more specifically the regions used for locating and reporting these values, should be standardized to enable reproducible comparisons across centers and longitudinal follow-up of individual patients. The purpose of this technical note is to describe a rigorous and reproducible method of locating, analyzing, and reporting quantitative MRI values in hip articular cartilage with an approach that is consistent with current orthopedic literature. To demonstrate this localization and documentation, 3 patients (age, 23 ± 5.1 years; 2 males, 1 female) who presented with symptomatic mixed-type femoroacetabular impingement (α angle, 63.3° ± 2.1°; center edge angle, 39° ± 4.2°) were evaluated with T2-mapping at 3 T MRI prior to hip arthroscopy. Manual segmentation was performed and cartilage of the acetabulum and femur was divided into 12 subregions adapted from the geographic zone method. Bone landmarks in the acetabulum and femur, identifiable both in arthroscopy and MR images, were manually selected and the coordinates exported for division of cartilage. Mean T2 values in each zone are presented. The current work outlines a standardized system to locate and describe quantitative mapping values that could aid in surgical decision making, planning, and the noninvasive longitudinal follow-up of implemented cartilage preservation and restoration techniques.

  8. Comparison of PCR and quantitative real-time PCR methods for the characterization of ruminant and cattle fecal pollution sources.

    PubMed

    Raith, Meredith R; Kelty, Catherine A; Griffith, John F; Schriewer, Alexander; Wuertz, Stefan; Mieszkin, Sophie; Gourmelon, Michele; Reischer, Georg H; Farnleitner, Andreas H; Ervin, Jared S; Holden, Patricia A; Ebentier, Darcy L; Jay, Jennifer A; Wang, Dan; Boehm, Alexandria B; Aw, Tiong Gim; Rose, Joan B; Balleste, E; Meijer, W G; Sivaganesan, Mano; Shanks, Orin C

    2013-11-15

The State of California has mandated the preparation of a guidance document on the application of fecal source identification methods for recreational water quality management. California contains the fifth highest population of cattle in the United States, making the inclusion of cow-associated methods a logical choice. Because the performance of these methods has been shown to change based on geography and/or local animal feeding practices, laboratory comparisons are needed to determine which assays are best suited for implementation. We describe the performance characterization of two end-point PCR assays (CF128 and CF193) and five real-time quantitative PCR (qPCR) assays (Rum2Bac, BacR, BacCow, CowM2, and CowM3) reported to be associated with either ruminant or cattle feces. Each assay was tested against a blinded set of 38 reference challenge filters (19 duplicate samples) containing fecal pollution from 12 different sources suspected to impact water quality. The abundance of each host-associated genetic marker was measured for qPCR-based assays in both target and non-target animals and compared to quantities of total DNA mass, wet mass of fecal material, as well as Bacteroidales, and enterococci determined by 16S rRNA qPCR and culture-based approaches (enterococci only). Ruminant- and cow-associated genetic markers were detected in all filters containing a cattle fecal source. However, some assays cross-reacted with non-target pollution sources. A large amount of variability was evident across laboratories when protocols were not fixed, suggesting that protocol standardization will be necessary for widespread implementation. Finally, performance metrics indicate that the cattle-associated CowM2 qPCR method combined with either the BacR or Rum2Bac ruminant-associated methods are most suitable for implementation. Published by Elsevier Ltd.
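Marker abundance in qPCR assays such as these is conventionally estimated from a standard curve relating the quantification cycle (Cq) to log10 copy number. The slope and intercept below are illustrative assumptions, not values from this study (a slope of -3.32 corresponds to ~100% amplification efficiency):

```python
def copies_from_cq(cq, slope=-3.32, intercept=38.0):
    """Invert a hypothetical standard curve Cq = slope*log10(copies) + intercept
    to estimate the number of marker copies in a reaction."""
    return 10.0 ** ((cq - intercept) / slope)

def amplification_efficiency(slope=-3.32):
    """E = 10^(-1/slope) - 1; E == 1.0 means the target doubles every cycle."""
    return 10.0 ** (-1.0 / slope) - 1.0
```

Differences in such curve parameters across laboratories are one source of the inter-laboratory variability the authors describe.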

  9. An iterative method for hydrodynamic interactions in Brownian dynamics simulations of polymer dynamics

    NASA Astrophysics Data System (ADS)

    Miao, Linling; Young, Charles D.; Sing, Charles E.

    2017-07-01

Brownian Dynamics (BD) simulations are a standard tool for understanding the dynamics of polymers in and out of equilibrium. Quantitative comparison can be made to rheological measurements of dilute polymer solutions, as well as direct visual observations of fluorescently labeled DNA. The primary computational challenge with BD is the expensive calculation of hydrodynamic interactions (HI), which are necessary to capture physically realistic dynamics. The full HI calculation, performed via a Cholesky decomposition every time step, scales with the length of the polymer as O(N^3). This limits the calculation to a few hundred simulated particles. A number of approximations in the literature can lower this scaling to O(N^2)-O(N^2.25), and explicit solvent methods scale as O(N); however, both incur a significant constant per-time step computational cost. Despite this progress, there remains a need for new or alternative methods of calculating hydrodynamic interactions; large polymer chains or semidilute polymer solutions remain computationally expensive. In this paper, we introduce an alternative method for calculating approximate hydrodynamic interactions. Our method relies on an iterative scheme to establish self-consistency between a hydrodynamic matrix that is averaged over simulation and the hydrodynamic matrix used to run the simulation. Comparison to standard BD simulation and polymer theory results demonstrates that this method quantitatively captures both equilibrium and steady-state dynamics after only a few iterations. The use of an averaged hydrodynamic matrix allows the computationally expensive Brownian noise calculation to be performed infrequently, so that it is no longer the bottleneck of the simulation calculations. We also investigate limitations of this conformational averaging approach in ring polymers.
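The cost contrast can be sketched with a toy diffusion matrix: the full method refactors D by Cholesky decomposition every time step (O(N^3)), while the conformational-averaging idea amortizes one factorization over many steps. The 2x2 matrix and the refresh policy here are illustrative, not the paper's implementation:

```python
import numpy as np

def brownian_step(L, dt, rng):
    """One correlated Brownian displacement from a precomputed Cholesky
    factor L of the diffusion matrix D (so that L @ L.T == D)."""
    return np.sqrt(2.0 * dt) * (L @ rng.standard_normal(L.shape[0]))

rng = np.random.default_rng(0)
D = np.array([[1.0, 0.3],
              [0.3, 1.0]])        # toy 2x2 diffusion (HI) matrix
L = np.linalg.cholesky(D)         # full method redoes this every step, O(N^3)
# averaged-HI idea (sketch): reuse L for many steps, refresh only occasionally
steps = [brownian_step(L, dt=0.01, rng=rng) for _ in range(1000)]
```

Reusing the factor keeps the noise statistics consistent with the averaged matrix while removing the per-step decomposition bottleneck.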

  10. Chemical Shift MR Imaging Methods for the Quantification of Transcatheter Lipiodol Delivery to the Liver: Preclinical Feasibility Studies in a Rodent Model

    PubMed Central

    Yin, Xiaoming; Guo, Yang; Li, Weiguo; Huo, Eugene; Zhang, Zhuoli; Nicolai, Jodi; Kleps, Robert A.; Hernando, Diego; Katsaggelos, Aggelos K.; Omary, Reed A.

    2012-01-01

Purpose: To demonstrate the feasibility of using chemical shift magnetic resonance (MR) imaging fat-water separation methods for quantitative estimation of transcatheter lipiodol delivery to liver tissues. Materials and Methods: Studies were performed in accordance with institutional Animal Care and Use Committee guidelines. Proton nuclear MR spectroscopy was first performed to identify lipiodol spectral peaks and relative amplitudes. Next, phantoms were constructed with increasing lipiodol-water volume fractions. A multiecho chemical shift–based fat-water separation method was used to quantify lipiodol concentration within each phantom. Six rats served as controls; 18 rats underwent catheterization with digital subtraction angiography guidance for intraportal infusion of a 15%, 30%, or 50% by volume lipiodol-saline mixture. MR imaging measurements were used to quantify lipiodol delivery to each rat liver. Lipiodol concentration maps were reconstructed by using both single-peak and multipeak chemical shift models. Intraclass and Spearman correlation coefficients were calculated for statistical comparison of MR imaging–based lipiodol concentration and volume measurements to reference standards (known lipiodol phantom compositions and the infused lipiodol dose during rat studies). Results: Both single-peak and multipeak measurements were well correlated to phantom lipiodol concentrations (r^2 > 0.99). Lipiodol volume measurements were progressively and significantly higher when comparing between animals receiving different doses (P < .05 for each comparison). MR imaging–based lipiodol volume measurements strongly correlated with infused dose (intraclass correlation coefficients > 0.93, P < .001) with both single- and multipeak approaches. Conclusion: Chemical shift MR imaging fat-water separation methods can be used for quantitative measurements of lipiodol delivery to liver tissues. © RSNA, 2012 PMID:22623693

  11. Evaluating motion processing algorithms for use with functional near-infrared spectroscopy data from young children.

    PubMed

    Delgado Reyes, Lourdes M; Bohache, Kevin; Wijeakumar, Sobanawartiny; Spencer, John P

    2018-04-01

Motion artifacts are often a significant component of the measured signal in functional near-infrared spectroscopy (fNIRS) experiments. A variety of methods have been proposed to address this issue, including principal components analysis (PCA), correlation-based signal improvement (CBSI), wavelet filtering, and spline interpolation. The efficacy of these techniques has been compared using simulated data; however, our understanding of how these techniques fare when dealing with task-based cognitive data is limited. Brigadoi et al. compared motion correction techniques in a sample of adult data measured during a simple cognitive task. Wavelet filtering showed the most promise as an optimal technique for motion correction. Given that fNIRS is often used with infants and young children, it is critical to evaluate the effectiveness of motion correction techniques directly with data from these age groups. This study addresses that problem by evaluating motion correction algorithms implemented in HomER2. The efficacy of each technique was compared quantitatively using objective metrics related to the physiological properties of the hemodynamic response. Results showed that targeted PCA (tPCA), spline, and CBSI retained a higher number of trials. These techniques also performed well in direct head-to-head comparisons with the other approaches using quantitative metrics. The CBSI method corrected many of the artifacts present in our data; however, this approach sometimes produced unstable HRFs. The targeted PCA and spline methods proved to be the most robust, performing well across all comparison metrics. When compared head to head, tPCA consistently outperformed spline. We conclude, therefore, that tPCA is an effective technique for correcting motion artifacts in fNIRS data from young children.

  12. Qualitative and quantitative outcomes of audience response systems as an educational tool in a plastic surgery residency program.

    PubMed

    Arneja, Jugpal S; Narasimhan, Kailash; Bouwman, David; Bridge, Patrick D

    2009-12-01

In-training evaluations in graduate medical education have typically been challenging. Although the majority of standardized examination delivery methods have become computer-based, in-training examinations generally remain pencil-paper-based, if they are performed at all. Audience response systems present a novel way to stimulate and evaluate the resident-learner. The purpose of this study was to assess the outcomes of audience response systems testing as compared with traditional testing in a plastic surgery residency program. A prospective 1-year pilot study of 10 plastic surgery residents was performed using audience response systems-delivered testing for the first half of the academic year and traditional pencil-paper testing for the second half. Examination content was based on monthly "Core Quest" curriculum conferences. Quantitative outcome measures included comparison of pretest, posttest, and cumulative test scores for both formats. Qualitative outcomes from the individual participants were obtained by questionnaire. When using the audience response systems format, pretest and posttest mean scores were 67.5 and 82.5 percent, respectively; using traditional pencil-paper format, scores were 56.5 percent and 79.5 percent. A comparison of the cumulative mean audience response systems score (85.0 percent) and traditional pencil-paper score (75.0 percent) revealed statistically significantly higher scores with audience response systems (p = 0.01). Qualitative outcomes revealed increased conference enthusiasm, greater enjoyment of testing, and no user difficulties with the audience response systems technology. The audience response systems modality of in-training evaluation captures participant interest and reinforces material more effectively than traditional pencil-paper testing does. The advantages include a more interactive learning environment, stimulation of class participation, immediate feedback to residents, and immediate tabulation of results for the educator. Disadvantages include start-up costs and lead-time preparation.

  13. Quantitative comparison between PGNAA measurements and MCNP calculations in view of the characterization of radioactive wastes in Germany and France

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mauerhofer, E.; Havenith, A.; Kettler, J.

The Forschungszentrum Juelich GmbH (FZJ), together with the Aachen University Rheinisch-Westfaelische Technische Hochschule (RWTH) and the French Alternative Energies and Atomic Energy Commission (CEA Cadarache) are involved in a cooperation aiming at characterizing toxic and reactive elements in radioactive waste packages by means of Prompt Gamma Neutron Activation Analysis (PGNAA). The French and German waste management agencies have indeed defined acceptability limits concerning these elements in view of their projected geological repositories. A first measurement campaign was performed in the new PGNAA facility called MEDINA, at FZJ, to assess the capture gamma-ray signatures of some elements of interest in large samples, up to waste drums with a volume of 200 liters. MEDINA is the acronym for Multi Element Detection based on Instrumental Neutron Activation. This paper presents MCNP calculations of the MEDINA facility and quantitative comparison between measurement and simulation. Passive gamma-ray spectra acquired with a high purity germanium detector and calibration sources are used to qualify the numerical model of the crystal. Active PGNAA spectra of a sodium chloride sample measured with MEDINA then allow for qualifying the global numerical model of the measurement cell. Chlorine indeed constitutes a usual reference with reliable capture gamma-ray production data. The goal is to characterize the entire simulation protocol (geometrical model, nuclear data, and postprocessing tools) which will be used for current measurement interpretation, extrapolation of the performances to other types of waste packages or other applications, as well as for the study of future PGNAA facilities.

  14. Studies on the quantitative autoradiography. III. Quantitative comparison of a novel tissue-mold measurement technique "paste-mold method," to the semiquantitative whole body autoradiography (WBA), using the same animals.

    PubMed

    Motoji, N; Hamai, Y; Niikura, Y; Shigematsu, A

    1995-01-01

A novel preparation technique, the so-called "Paste Mold," was devised for organ and tissue distribution studies. It is most powerful when combined with autoradioluminography (ARLG), which was established and validated recently by the working group of Forum '93 of the Japanese Society for the Study of Xenobiotics. A small piece (10-50 mg) of each organ or tissue was sufficient for measuring its radioactive concentration, and it was sampled from the remains of the frozen carcass used for macroautoradiography (MARG). The frozen pieces were solubilized by mixing with a suitable volume of gelatine and strong alkaline solution prior to mild heating at 40 degrees C for a few hours. The tissue paste was then molded in a template pattern to form small plates. The molded plates were contacted with an imaging plate (IP) to record their radioactive concentration. The recorded IP was processed by BAS2000. The molded plate was formed at a thickness of 200 microns, so-called infinite thickness for soft beta rays, and therefore the resulting relative intensities, represented by (PSL-BG)/S values, gave practically reliable ratios of the radioactive concentrations in organs and tissues, without any calibration for the beta self-absorption coefficient. On the other hand, the left half of the frozen carcass was used for whole body autoradiography (WBA) before the Paste-Mold preparation. A comparison was performed for differences in (PSL-BG)/S values of organs and tissues between frozen and dried sections. The relative intensities, (PSL-BG)/S, obtained by the Paste-Mold preparation agreed well with those from the frozen sections rather than the dried sections. (ABSTRACT TRUNCATED AT 250 WORDS)

  15. [Emotional climate and internal communication in a clinical management unit compared with two traditional hospital services].

    PubMed

    Alonso, E; Rubio, A; March, J C; Danet, A

    2011-01-01

The aim of this study is to compare the emotional climate, quality of communication and performance indicators in a clinical management unit and two traditional hospital services. Design: quantitative study with a questionnaire of 94 questions. Participants: 83 health professionals (63 responders) from the clinical management unit of breast pathology and the hospital services of medical oncology and radiation oncology. Analysis: descriptive statistics, comparison of means, correlation and linear regression models. The clinical management unit reached higher values than the hospital services on performance indicators, emotional climate, internal communication and evaluation of the leadership. An important gap between existing and desired sources, channels, media and subjects of communication appeared in both the clinical management unit and the traditional services. The clinical management organization promotes better internal communication and interpersonal relations, leading to improved performance indicators. Copyright © 2011 SECA. Published by Elsevier Espana. All rights reserved.

  16. Performance evaluation of cryogenic counter-flow heat exchangers with longitudinal conduction, heat in-leak and property variations

    NASA Astrophysics Data System (ADS)

Jiang, Q. F.; Zhuang, M.; Zhu, Z. G.; Zhang, Q. Y.; Sheng, L. H.

    2017-12-01

    Counter-flow plate-fin heat exchangers are commonly utilized in cryogenic applications due to their high effectiveness and compact size. For cryogenic heat exchangers in helium liquefaction/refrigeration systems, conventional design theory is no longer applicable and they are usually sensitive to longitudinal heat conduction, heat in-leak from surroundings and variable fluid properties. Governing equations based on distributed parameter method are developed to evaluate performance deterioration caused by these effects. The numerical model could also be applied in many other recuperators with different structures and, hence, available experimental data are used to validate it. For a specific case of the multi-stream heat exchanger in the EAST helium refrigerator, quantitative effects of these heat losses are further discussed, in comparison with design results obtained by the common commercial software. The numerical model could be useful to evaluate and rate the heat exchanger performance under the actual cryogenic environment.
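For reference, the conventional design theory that the authors note is no longer applicable for cryogenic exchangers is the classic effectiveness-NTU relation for an ideal counter-flow exchanger (no longitudinal conduction, heat in-leak, or property variation); a minimal sketch:

```python
import math

def counterflow_effectiveness(ntu, cr):
    """Ideal counter-flow epsilon-NTU relation, cr = Cmin/Cmax.
    Real cryogenic exchangers with heat losses fall below this ideal."""
    if abs(cr - 1.0) < 1e-12:          # balanced-flow limit
        return ntu / (1.0 + ntu)
    e = math.exp(-ntu * (1.0 - cr))
    return (1.0 - e) / (1.0 - cr * e)
```

The distributed-parameter model in the paper quantifies how far actual performance deteriorates from this ideal under the listed effects.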

  17. Some aspects of robotics calibration, design and control

    NASA Technical Reports Server (NTRS)

    Tawfik, Hazem

    1990-01-01

    The main objective is to introduce techniques in the areas of testing and calibration, design, and control of robotic systems. A statistical technique is described that analyzes a robot's performance and provides quantitative three-dimensional evaluation of its repeatability, accuracy, and linearity. Based on this analysis, a corrective action should be taken to compensate for any existing errors and enhance the robot's overall accuracy and performance. A comparison between robotics simulation software packages that were commercially available (SILMA, IGRIP) and that of Kennedy Space Center (ROBSIM) is also included. These computer codes simulate the kinematics and dynamics patterns of various robot arm geometries to help the design engineer in sizing and building the robot manipulator and control system. A brief discussion on an adaptive control algorithm is provided.

  18. Comparison of in silico models for prediction of mutagenicity.

    PubMed

    Bakhtyari, Nazanin G; Raitano, Giuseppa; Benfenati, Emilio; Martin, Todd; Young, Douglas

    2013-01-01

Using a dataset with more than 6000 compounds, the performance of eight quantitative structure activity relationship (QSAR) models was evaluated: ACD/Tox Suite, Absorption, Distribution, Metabolism, Elimination, and Toxicity of chemical substances (ADMET) predictor, Derek, Toxicity Estimation Software Tool (T.E.S.T.), TOxicity Prediction by Komputer Assisted Technology (TOPKAT), Toxtree, CAESAR, and SARpy (SAR in python). In general, the results showed a high level of performance. To have a realistic estimate of the predictive ability, the results for chemicals inside and outside the training set for each model were considered. The effect of applicability domain tools (when available) on the prediction accuracy was also evaluated. The predictive tools included QSAR models, knowledge-based systems, and a combination of both methods. Models based on statistical QSAR methods gave better results.

  19. 3D-nanostructured Au electrodes for the event-specific detection of MON810 transgenic maize.

    PubMed

    Fátima Barroso, M; Freitas, Maria; Oliveira, M Beatriz P P; de-Los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; Delerue-Matos, Cristina

    2015-03-01

In the present work, the development of a genosensor for the event-specific detection of MON810 transgenic maize is proposed. Taking advantage of nanostructuration, a cost-effective three-dimensional electrode was fabricated and a ternary monolayer containing a dithiol, a monothiol and the thiolated capture probe was optimized to minimize nonspecific signals. A sandwich format assay was selected as a way of precluding inefficient hybridization associated with stable secondary target structures. A comparison between the analytical performance of the Au nanostructured electrodes and commercially available screen-printed electrodes highlighted the superior performance of the nanostructured ones. Finally, the genosensor was effectively applied to detect the transgenic sequence in real samples, showing its potential for future quantitative analysis. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Photo ion spectrometer

    DOEpatents

    Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.

    1989-01-01

    A charged particle spectrometer for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode.

  1. Interlaboratory Comparison of Methods Determining the Botanical Composition of Animal Feed.

    PubMed

    Braglia, Luca; Morello, Laura; Gavazzi, Floriana; Gianì, Silvia; Mastromauro, Francesco; Breviario, Diego; Cardoso, Hélia Guerra; Valadas, Vera; Campos, Maria Doroteia

    2018-01-01

A consortium of European enterprises and research institutions has been engaged in the Feed-Code Project with the aim of addressing the requirements stated in European Union Regulation No. 767/2009, concerning market placement and use of feed of known and ascertained botanical composition. Accordingly, an interlaboratory trial was set up to compare the performance of different assays, based either on optical microscopy or on DNA analysis, for the qualitative and quantitative identification of the composition of compound animal feeds. A tubulin-based polymorphism method, on which the Feed-Code platform was developed, provided the most accurate results. The present study highlights the need for ring trials for the determination of the botanical composition of animal feeds and raises concern about the current level of analytical inaccuracy.

  2. Modeling phytoplankton community in reservoirs. A comparison between taxonomic and functional groups-based models.

    PubMed

    Di Maggio, Jimena; Fernández, Carolina; Parodi, Elisa R; Diaz, M Soledad; Estrada, Vanina

    2016-01-01

In this paper we address the formulation of two mechanistic water quality models that differ in the way the phytoplankton community is described. We carry out parameter estimation subject to differential-algebraic constraints, validation of each model, and comparison of the models' performance. The first approach aggregates phytoplankton species based on their phylogenetic characteristics (Taxonomic group model) and the second one, on their morpho-functional properties following Reynolds' classification (Functional group model). The latter approach takes into account tolerance and sensitivity to environmental conditions. The constrained parameter estimation problems are formulated within an equation oriented framework, with a maximum likelihood objective function. The study site is Paso de las Piedras Reservoir (Argentina), which supplies drinking water for a population of 450,000. Numerical results show that phytoplankton morpho-functional groups more closely represent each species growth requirements within the group. Each model performance is quantitatively assessed by three diagnostic measures. Parameter estimation results for seasonal dynamics of the phytoplankton community and main biogeochemical variables for a one-year time horizon are presented and compared for both models, showing the functional group model enhanced performance. Finally, we explore increasing nutrient loading scenarios and predict their effect on phytoplankton dynamics throughout a one-year time horizon. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. A human visual model-based approach of the visual attention and performance evaluation

    NASA Astrophysics Data System (ADS)

    Le Meur, Olivier; Barba, Dominique; Le Callet, Patrick; Thoreau, Dominique

    2005-03-01

In this paper, a coherent computational model of visual selective attention for color pictures is described and its performance is precisely evaluated. The model, based on some important behaviours of the human visual system, is composed of four parts: visibility, perception, perceptual grouping and saliency map construction. This paper focuses mainly on its performance assessment by achieving extended subjective and objective comparisons with real fixation points captured by an eye-tracking system used by the observers in a task-free viewing mode. From the knowledge of the ground truth, qualitative and quantitative comparisons have been made in terms of the linear correlation coefficient (CC) and the Kullback-Leibler divergence (KL). On a set of 10 natural color images, the results show that the linear correlation coefficient and the Kullback-Leibler divergence are of about 0.71 and 0.46, respectively. CC and KL measures with this model are respectively improved by about 4% and 7% compared to the best model proposed by L. Itti. Moreover, by comparing the ability of our model to predict eye movements produced by an average observer, we can conclude that our model succeeds quite well in predicting the spatial locations of the most important areas of the image content.
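The two comparison metrics can be sketched directly; here a predicted saliency map and a fixation-density map are treated as arrays, flattened, and (for KL) normalized to probability distributions. The direction of the divergence is a common convention, not necessarily the paper's:

```python
import numpy as np

def cc(saliency, fixations):
    """Linear correlation coefficient between two maps."""
    return float(np.corrcoef(np.ravel(saliency), np.ravel(fixations))[0, 1])

def kl(saliency, fixations, eps=1e-12):
    """Kullback-Leibler divergence KL(fixations || saliency), with both
    maps normalized to probability distributions over pixels."""
    p = np.ravel(fixations).astype(float); p /= p.sum()
    q = np.ravel(saliency).astype(float); q /= q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))
```

A higher CC (max 1.0) and a lower KL (min 0.0) both indicate better agreement between prediction and measured fixations.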

  4. A Comparison Study for DNA Motif Modeling on Protein Binding Microarray.

    PubMed

    Wong, Ka-Chun; Li, Yue; Peng, Chengbin; Wong, Hau-San

    2016-01-01

    Transcription factor binding sites (TFBSs) are relatively short (5-15 bp) and degenerate. Identifying them is a computationally challenging task. In particular, protein binding microarray (PBM) is a high-throughput platform that can measure the DNA binding preference of a protein in a comprehensive and unbiased manner; for instance, a typical PBM experiment can measure binding signal intensities of a protein to all possible DNA k-mers (k = 8∼10). Since proteins can often bind to DNA with different binding intensities, one of the major challenges is to build TFBS (also known as DNA motif) models which can fully capture the quantitative binding affinity data. To learn DNA motif models from the non-convex objective function landscape, several optimization methods are compared and applied to the PBM motif model building problem. In particular, representative methods from different optimization paradigms have been chosen for modeling performance comparison on hundreds of PBM datasets. The results suggest that the multimodal optimization methods are very effective for capturing the binding preference information from PBM data. In particular, we observe a general performance improvement if choosing di-nucleotide modeling over mono-nucleotide modeling. In addition, the models learned by the best-performing method are applied to two independent applications: PBM probe rotation testing and ChIP-Seq peak sequence prediction, demonstrating its biological applicability.
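The mono- vs di-nucleotide modeling distinction can be sketched with toy position weight tables; the weights below are hypothetical, whereas real PBM models are learned from probe binding intensities:

```python
# Toy weight tables (hypothetical; real models are fit to PBM intensities)
MONO = [{"A": 2.0, "C": 0.0}, {"A": 0.0, "C": 1.0}, {"A": 0.5, "C": 0.0}]
DI = [{"AC": 1.0, "CA": 0.0}, {"CA": 2.0, "AC": 0.0}]

def score_mono(kmer, pwm=MONO):
    """Mono-nucleotide model: each position contributes independently."""
    return sum(pwm[i][b] for i, b in enumerate(kmer))

def score_di(kmer, dpwm=DI):
    """Di-nucleotide model: weights on adjacent base pairs, capturing
    neighbor dependencies that mono-nucleotide PWMs miss."""
    return sum(dpwm[i][kmer[i:i + 2]] for i in range(len(kmer) - 1))
```

The extra parameters of the di-nucleotide table are what allow the performance improvement over mono-nucleotide modeling reported above.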

  5. Are Neurodynamic Organizations A Fundamental Property of Teamwork?

    PubMed Central

    Stevens, Ronald H.; Galloway, Trysha L.

    2017-01-01

When performing a task it is important for teams to optimize their strategies and actions to maximize value and avoid the cost of surprise. The decisions teams make sometimes have unintended consequences and they must then reorganize their thinking, roles and/or configuration into corrective structures more appropriate for the situation. In this study we ask: What are the neurodynamic properties of these reorganizations and how do they relate to the moment-by-moment, and longer, performance outcomes of teams? We describe an information-organization approach for detecting and quantitating the fluctuating neurodynamic organizations in teams. Neurodynamic organization is the propensity of team members to enter into prolonged (minutes) metastable neurodynamic relationships as they encounter and resolve disturbances to their normal rhythms. Team neurodynamic organizations were detected and modeled by transforming the physical units of each team member's EEG power levels into Shannon entropy-derived information units about the team's organization and synchronization. Entropy is a measure of the variability or uncertainty of information in a data stream. This physical unit to information unit transformation bridges micro level social coordination events with macro level expert observations of team behavior allowing multimodal comparisons across the neural, cognitive and behavioral time scales of teamwork. The measures included the entropy of each team member's data stream, the overall team entropy and the mutual information between dyad pairs of the team. Mutual information can be thought of as periods related to team member synchrony. 
Comparisons between individual entropy and mutual information levels for the dyad combinations of three-person teams provided quantitative estimates of the proportion of a person's neurodynamic organizations that represented periods of synchrony with other team members, which in aggregate provided measures of the overall degree of neurodynamic interactions of the team. We propose that increased neurodynamic organization occurs when a team's operating rhythm can no longer support the complexity of the task and the team needs to expend energy to re-organize into structures that better minimize the “surprise” in the environment. Consistent with this hypothesis, the frequency and magnitude of neurodynamic organizations were less in experienced military and healthcare teams than they were in more junior teams. Similar dynamical properties of neurodynamic organization were observed in models of the EEG data streams of military, healthcare and high school science teams suggesting that neurodynamic organization may be a common property of teamwork. The innovation of this study is the potential it raises for developing globally applicable quantitative models of team dynamics that will allow comparisons to be made across teams, tasks and training protocols. PMID:28512438
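The entropy and mutual-information measures this record builds on can be sketched in a few lines. The snippet below is a generic illustration, not the authors' pipeline: it assumes each team member's EEG power levels have already been discretized into a stream of symbols, and the streams `a` and `b` are invented.

```python
import math
from collections import Counter

def entropy(stream):
    """Shannon entropy (bits) of a discrete symbol stream."""
    n = len(stream)
    return -sum((c / n) * math.log2(c / n) for c in Counter(stream).values())

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for two aligned symbol streams."""
    joint = list(zip(x, y))
    return entropy(x) + entropy(y) - entropy(joint)

# Two hypothetical team members whose discretized EEG-power streams are
# perfectly synchronized share maximal information (MI equals the entropy)...
a = [0, 1, 0, 1, 0, 1, 0, 1]
print(mutual_information(a, a))

# ...while streams with unrelated patterns here share none.
b = [0, 0, 1, 1, 0, 0, 1, 1]
print(mutual_information(a, b))
```

In the record's terms, elevated mutual information between a dyad's streams marks a period of neurodynamic synchrony between those two team members.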

  6. Imaging Cerebral Microhemorrhages in Military Service Members with Chronic Traumatic Brain Injury

    PubMed Central

    Liu, Wei; Soderlund, Karl; Senseney, Justin S.; Joy, David; Yeh, Ping-Hong; Ollinger, John; Sham, Elyssa B.; Liu, Tian; Wang, Yi; Oakes, Terrence R.; Riedy, Gerard

    2017-01-01

    Purpose: To detect cerebral microhemorrhages in military service members with chronic traumatic brain injury by using susceptibility-weighted magnetic resonance (MR) imaging. The longitudinal evolution of microhemorrhages was monitored in a subset of patients by using quantitative susceptibility mapping. Materials and Methods: The study was approved by the Walter Reed National Military Medical Center institutional review board and is compliant with HIPAA guidelines. All participants underwent two-dimensional conventional gradient-recalled-echo MR imaging and three-dimensional flow-compensated multi-echo gradient-recalled-echo MR imaging (processed to generate susceptibility-weighted images and quantitative susceptibility maps), and a subset of patients underwent follow-up imaging. Microhemorrhages were identified by two radiologists independently. Comparisons of microhemorrhage number, size, and magnetic susceptibility derived from quantitative susceptibility maps between baseline and follow-up imaging examinations were performed by using the paired t test. Results: Among the 603 patients, cerebral microhemorrhages were identified in 43 patients, with six excluded from further analysis owing to artifacts. Seventy-seven percent (451 of 585) of the microhemorrhages on susceptibility-weighted images had a more conspicuous appearance than on gradient-recalled-echo images. Thirteen of the 37 patients underwent follow-up imaging examinations. In these patients, a smaller number of microhemorrhages were identified at follow-up imaging compared with baseline on quantitative susceptibility maps (mean ± standard deviation, 9.8 microhemorrhages ± 12.8 vs 13.7 microhemorrhages ± 16.6; P = .019). Quantitative susceptibility mapping–derived quantitative measures of microhemorrhages also decreased over time: −0.85 mm3 per day ± 1.59 for total volume (P = .039) and −0.10 parts per billion per day ± 0.14 for mean magnetic susceptibility (P = .016). 
Conclusion: The number of microhemorrhages and quantitative susceptibility mapping–derived quantitative measures of microhemorrhages all decreased over time, suggesting that hemosiderin products undergo continued, subtle evolution in the chronic stage. PMID:26371749
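The baseline-versus-follow-up comparison in this record uses a paired t test. A minimal sketch of the test statistic is below; the per-patient counts are hypothetical, not the study's data, and obtaining a p-value additionally requires a t-distribution CDF (e.g. from scipy), so only the statistic and degrees of freedom are computed here.

```python
import math
from statistics import mean, stdev

def paired_t_statistic(baseline, follow_up):
    """t = mean(d) / (sd(d) / sqrt(n)) for paired differences d; df = n - 1."""
    d = [b - f for b, f in zip(baseline, follow_up)]
    n = len(d)
    return mean(d) / (stdev(d) / math.sqrt(n)), n - 1

# Hypothetical per-patient microhemorrhage counts at baseline and follow-up.
baseline = [5, 7, 9, 11]
follow_up = [4, 6, 7, 10]
t, df = paired_t_statistic(baseline, follow_up)
print(t, df)  # compare t against a t distribution with df degrees of freedom
```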

  7. Imaging Cerebral Microhemorrhages in Military Service Members with Chronic Traumatic Brain Injury.

    PubMed

    Liu, Wei; Soderlund, Karl; Senseney, Justin S; Joy, David; Yeh, Ping-Hong; Ollinger, John; Sham, Elyssa B; Liu, Tian; Wang, Yi; Oakes, Terrence R; Riedy, Gerard

    2016-02-01

    To detect cerebral microhemorrhages in military service members with chronic traumatic brain injury by using susceptibility-weighted magnetic resonance (MR) imaging. The longitudinal evolution of microhemorrhages was monitored in a subset of patients by using quantitative susceptibility mapping. The study was approved by the Walter Reed National Military Medical Center institutional review board and is compliant with HIPAA guidelines. All participants underwent two-dimensional conventional gradient-recalled-echo MR imaging and three-dimensional flow-compensated multiecho gradient-recalled-echo MR imaging (processed to generate susceptibility-weighted images and quantitative susceptibility maps), and a subset of patients underwent follow-up imaging. Microhemorrhages were identified by two radiologists independently. Comparisons of microhemorrhage number, size, and magnetic susceptibility derived from quantitative susceptibility maps between baseline and follow-up imaging examinations were performed by using the paired t test. Among the 603 patients, cerebral microhemorrhages were identified in 43 patients, with six excluded for further analysis owing to artifacts. Seventy-seven percent (451 of 585) of the microhemorrhages on susceptibility-weighted images had a more conspicuous appearance than on gradient-recalled-echo images. Thirteen of the 37 patients underwent follow-up imaging examinations. In these patients, a smaller number of microhemorrhages were identified at follow-up imaging compared with baseline on quantitative susceptibility maps (mean ± standard deviation, 9.8 microhemorrhages ± 12.8 vs 13.7 microhemorrhages ± 16.6; P = .019). Quantitative susceptibility mapping-derived quantitative measures of microhemorrhages also decreased over time: -0.85 mm(3) per day ± 1.59 for total volume (P = .039) and -0.10 parts per billion per day ± 0.14 for mean magnetic susceptibility (P = .016). 
The number of microhemorrhages and quantitative susceptibility mapping-derived quantitative measures of microhemorrhages all decreased over time, suggesting that hemosiderin products undergo continued, subtle evolution in the chronic stage. © RSNA, 2015.

  8. Ammonium chloride salting out extraction/cleanup for trace-level quantitative analysis in food and biological matrices by flow injection tandem mass spectrometry.

    PubMed

    Nanita, Sergio C; Padivitage, Nilusha L T

    2013-03-20

    A sample extraction and purification procedure that uses ammonium-salt-induced acetonitrile/water phase separation was developed and demonstrated to be compatible with the recently reported method for pesticide residue analysis based on fast extraction and dilution flow injection mass spectrometry (FED-FI-MS). The ammonium salts evaluated were chloride, acetate, formate, carbonate, and sulfate. A mixture of NaCl and MgSO4, salts used in the well-known QuEChERS method, was also tested for comparison. With thermal decomposition/evaporation temperature of <350°C, ammonium salts resulted in negligible ion source residual under typical electrospray conditions, leading to consistent method performance and less instrument cleaning. Although all ammonium salts tested induced acetonitrile/water phase separation, NH4Cl yielded the best performance, thus it was the preferred salting out agent. The NH4Cl salting out method was successfully coupled with FI/MS/MS and tested for fourteen pesticide active ingredients: chlorantraniliprole, cyantraniliprole, chlorimuron ethyl, oxamyl, methomyl, sulfometuron methyl, chlorsulfuron, triflusulfuron methyl, azimsulfuron, flupyrsulfuron methyl, aminocyclopyrachlor, aminocyclopyrachlor methyl, diuron and hexazinone. A validation study was conducted with nine complex matrices: sorghum, rice, grapefruit, canola, milk, eggs, beef, urine and blood plasma. The method is applicable to all analytes, except aminocyclopyrachlor. The method was deemed appropriate for quantitative analysis in 114 out of 126 analyte/matrix cases tested (applicability rate=0.90). The NH4Cl salting out extraction/cleanup allowed expansion of FI/MS/MS for analysis in food of plant and animal origin, and body fluids with increased ruggedness and sensitivity, while maintaining high-throughput (run time=30s/sample). 
Limits of quantitation (LOQs) of 0.01 mg kg(-1) (ppm), the 'well-accepted standard' in pesticide residue analysis, were achieved in >80% of cases tested, while limits of detection (LODs) were typically in the range of 0.001-0.01 mg kg(-1) (ppm). A comparison to a well-established HPLC/MS/MS method was also conducted, yielding comparable results, thus confirming the suitability of NH4Cl salting out FI/MS/MS for pesticide residue analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Comparative analysis of hospital energy use: pacific northwest and scandinavia.

    PubMed

    Burpee, Heather; McDade, Erin

    2014-01-01

    This study aimed to establish the potential for significant energy reduction in hospitals in the United States by providing evidence of Scandinavian operational precedents with high Interior Environmental Quality (IEQ) and substantially lower energy profiles than comparable U.S. facilities. These facilities set important precedents for design teams seeking operational examples for achieving aggressive energy and interior environmental quality goals. This examination of operational hospitals is intended to offer hospital owners, designers, and building managers a strong case and concrete framework for strategies to achieve exceptionally high performing buildings. Energy efficient hospitals have the potential to significantly impact the U.S.'s overall energy profile, and key stakeholders in the hospital industry need specific, operationally grounded precedents in order to successfully implement informed energy reduction strategies. This study is an outgrowth of previous research evaluating high quality, low energy hospitals that serve as examples for new high performance hospital design, construction, and operation. Through extensive interviews, numerous site visits, the development of case studies, and data collection, this team has established thorough qualitative and quantitative analyses of several contemporary hospitals in Scandinavia and the Pacific Northwest. Many Scandinavian hospitals demonstrate a low energy profile, and when analyzed in comparison with U.S. hospitals, such Scandinavian precedents help define the framework required to make significant changes in the U.S. hospital building industry. Eight hospitals, four Scandinavian and four Pacific Northwest, were quantitatively compared using the Environmental Protection Agency's Portfolio Manager, allowing researchers to answer specific questions about the impact of energy source and architectural and mechanical strategies on energy efficiency in operational hospitals. 
Specific architectural, mechanical, and plant systems make these Scandinavian hospitals more energy efficient than their Pacific Northwest counterparts. More importantly, synergistic systems integration allows for their significant reductions in energy consumption. This quantitative comparison of operational Scandinavian and Pacific Northwest hospitals resulted in compelling evidence of the potential for deep energy savings in the U.S., and allowed researchers to outline specific strategies for achieving such reductions. © 2014 Vendome Group, LLC.

  10. Performance of a RT-PCR Assay in Comparison to FISH and Immunohistochemistry for the Detection of ALK in Non-Small Cell Lung Cancer.

    PubMed

    Hout, David R; Schweitzer, Brock L; Lawrence, Kasey; Morris, Stephan W; Tucker, Tracy; Mazzola, Rosetta; Skelton, Rachel; McMahon, Frank; Handshoe, John; Lesperance, Mary; Karsan, Aly; Saltman, David L

    2017-08-01

    Patients with lung cancers harboring an activating anaplastic lymphoma kinase (ALK) rearrangement respond favorably to ALK inhibitor therapy. Fluorescence in situ hybridization (FISH) and immunohistochemistry (IHC) are validated and widely used screening tests for ALK rearrangements, but both methods have limitations. The ALK RGQ RT-PCR Kit (RT-PCR) is a single tube quantitative real-time PCR assay for high throughput and automated interpretation of ALK expression. In this study, we performed a direct comparison of formalin-fixed paraffin-embedded (FFPE) lung cancer specimens using all three ALK detection methods. The RT-PCR test (diagnostic cut-off ΔCt of ≤8) was shown to be highly sensitive (100%) when compared to FISH and IHC. Sequencing of RNA detected full-length ALK transcripts or EML4-ALK and KIF5B-ALK fusion variants in discordant cases in which ALK expression was detected by the ALK RT-PCR test but negative by FISH and IHC. The overall specificity of the RT-PCR test for the detection of ALK in cases without full-length ALK expression was 94% in comparison to FISH and sequencing. These data support the ALK RT-PCR test as a highly efficient and reliable diagnostic screening approach to identify patients with non-small cell lung cancer whose tumors are driven by oncogenic ALK.

  11. Tribology of monolayer films: comparison between n-alkanethiols on gold and n-alkyl trichlorosilanes on silicon.

    PubMed

    Booth, Brandon D; Vilt, Steven G; McCabe, Clare; Jennings, G Kane

    2009-09-01

    This Article presents a quantitative comparison of the frictional performance for monolayers derived from n-alkanethiolates on gold and n-alkyl trichlorosilanes on silicon. Monolayers were characterized by pin-on-disk tribometry, contact angle analysis, ellipsometry, and electrochemical impedance spectroscopy (EIS). Pin-on-disk microtribometry provided frictional analysis at applied normal loads from 10 to 1000 mN at a speed of 0.1 mm/s. At low loads (10 mN), methyl-terminated n-alkanethiolate self-assembled monolayers (SAMs) exhibited a 3-fold improvement in coefficient of friction over SAMs with hydroxyl- or carboxylic-acid-terminated surfaces. For monolayers prepared from both n-alkanethiols on gold and n-alkyl trichlorosilanes on silicon, a critical chain length of at least eight carbons is required for beneficial tribological performance at an applied load of 9.8 mN. Evidence for disruption of chemisorbed alkanethiolate SAMs with chain lengths n

  12. Performance of a RT-PCR Assay in Comparison to FISH and Immunohistochemistry for the Detection of ALK in Non-Small Cell Lung Cancer

    PubMed Central

    Hout, David R.; Lawrence, Kasey; Morris, Stephan W.; Tucker, Tracy; Mazzola, Rosetta; Skelton, Rachel; McMahon, Frank; Handshoe, John; Lesperance, Mary; Karsan, Aly

    2017-01-01

    Patients with lung cancers harboring an activating anaplastic lymphoma kinase (ALK) rearrangement respond favorably to ALK inhibitor therapy. Fluorescence in situ hybridization (FISH) and immunohistochemistry (IHC) are validated and widely used screening tests for ALK rearrangements but both methods have limitations. The ALK RGQ RT-PCR Kit (RT-PCR) is a single tube quantitative real-time PCR assay for high throughput and automated interpretation of ALK expression. In this study, we performed a direct comparison of formalin-fixed paraffin-embedded (FFPE) lung cancer specimens using all three ALK detection methods. The RT-PCR test (diagnostic cut-off ΔCt of ≤8) was shown to be highly sensitive (100%) when compared to FISH and IHC. Sequencing of RNA detected full-length ALK transcripts or EML4-ALK and KIF5B-ALK fusion variants in discordant cases in which ALK expression was detected by the ALK RT-PCR test but negative by FISH and IHC. The overall specificity of the RT-PCR test for the detection of ALK in cases without full-length ALK expression was 94% in comparison to FISH and sequencing. These data support the ALK RT-PCR test as a highly efficient and reliable diagnostic screening approach to identify patients with non-small cell lung cancer whose tumors are driven by oncogenic ALK. PMID:28763012

  13. Visually estimated ejection fraction by two dimensional and triplane echocardiography is closely correlated with quantitative ejection fraction by real-time three dimensional echocardiography.

    PubMed

    Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Ake; Winter, Reidar

    2009-08-25

    Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, the feasibility of visual assessment (eyeballing) is superior. There are to date only sparse data comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE) using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and TP mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. There was an excellent correlation between eyeballing EF by 2D and TP vs 3DE (r = 0.91 and 0.95, respectively) without any significant bias (-0.5 +/- 3.7% and -0.2 +/- 2.9%, respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TP and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2D and 8.4% for eyeballing TP. Visual estimation of LVEF both using 2D and TP by an experienced reader correlates well with quantitative EF determined by RT3DE. There is an apparent trend towards smaller variability using TP in comparison to 2D; this was, however, not statistically significant.
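The agreement statistics this record reports, a correlation coefficient and a bias (mean difference, as in a Bland-Altman analysis), can be sketched directly. The paired ejection-fraction values below are invented for illustration.

```python
import math
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def bias(x, y):
    """Mean difference (Bland-Altman bias) between paired measurements."""
    return mean(a - b for a, b in zip(x, y))

# Hypothetical paired ejection fractions (%): eyeballed vs quantitative 3D.
eyeball = [55, 40, 62, 30, 48]
rt3de = [56, 41, 60, 32, 49]
print(pearson_r(eyeball, rt3de), bias(eyeball, rt3de))
```

A high r with a bias near zero is the pattern the record describes: visual estimates track the quantitative reference without systematically over- or under-reading it.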

  14. Standardized evaluation framework for evaluating coronary artery stenosis detection, stenosis quantification and lumen segmentation algorithms in computed tomography angiography.

    PubMed

    Kirişli, H A; Schaap, M; Metz, C T; Dharampal, A S; Meijboom, W B; Papadopoulou, S L; Dedic, A; Nieman, K; de Graaf, M A; Meijs, M F L; Cramer, M J; Broersen, A; Cetin, S; Eslami, A; Flórez-Valencia, L; Lor, K L; Matuszewski, B; Melki, I; Mohr, B; Oksüz, I; Shahzad, R; Wang, C; Kitslaar, P H; Unal, G; Katouzian, A; Örkisz, M; Chen, C M; Precioso, F; Najman, L; Masood, S; Ünay, D; van Vliet, L; Moreno, R; Goldenberg, R; Vuçini, E; Krestin, G P; Niessen, W J; van Walsum, T

    2013-12-01

    Though conventional coronary angiography (CCA) has been the standard of reference for diagnosing coronary artery disease in the past decades, computed tomography angiography (CTA) has rapidly emerged, and is nowadays widely used in clinical practice. Here, we introduce a standardized evaluation framework to reliably evaluate and compare the performance of the algorithms devised to detect and quantify the coronary artery stenoses, and to segment the coronary artery lumen in CTA data. The objective of this evaluation framework is to demonstrate the feasibility of dedicated algorithms to: (1) (semi-)automatically detect and quantify stenosis on CTA, in comparison with quantitative coronary angiography (QCA) and CTA consensus reading, and (2) (semi-)automatically segment the coronary lumen on CTA, in comparison with expert's manual annotation. A database consisting of 48 multicenter multivendor cardiac CTA datasets with corresponding reference standards are described and made available. The algorithms from 11 research groups were quantitatively evaluated and compared. The results show that (1) some of the current stenosis detection/quantification algorithms may be used for triage or as a second-reader in clinical practice, and that (2) automatic lumen segmentation is possible with a precision similar to that obtained by experts. The framework is open for new submissions through the website, at http://coronary.bigr.nl/stenoses/. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. The quantitative and qualitative recovery of Campylobacter from raw poultry using USDA and Health Canada methods.

    PubMed

    Sproston, E L; Carrillo, C D; Boulter-Bitzer, J

    2014-12-01

    Harmonisation of methods between Canadian government agencies is essential to accurately assess and compare the prevalence and concentrations of Campylobacter present on retail poultry intended for human consumption. The standard qualitative procedure used by Health Canada differs from that used by the USDA for both quantitative and qualitative methods. A comparison of three methods was performed on raw poultry samples obtained from an abattoir to determine if one method is superior to the others in isolating Campylobacter from chicken carcass rinses. The average percent of positive samples was 34.72% (95% CI, 29.2-40.2), 39.24% (95% CI, 33.6-44.9), and 39.93% (95% CI, 34.3-45.6) for the direct plating US method, the US enrichment method, and the Health Canada enrichment method, respectively. Overall, there were significant differences when comparing either of the enrichment methods to the direct plating method using McNemar's chi-squared test. On comparison of weekly data (Fisher's exact test), direct plating was inferior to the enrichment methods on only a single occasion. Direct plating is important for enumeration and establishing the concentration of Campylobacter present on raw poultry. However, enrichment methods are also vital to identify positive samples where concentrations are below the detection limit for direct plating. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
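McNemar's chi-squared test, used in this record to compare paired detection methods on the same samples, depends only on the discordant pairs. A minimal sketch with invented counts (not the study's data):

```python
def mcnemar_chi2(b, c, correction=True):
    """McNemar's chi-squared statistic on discordant pair counts.

    b = samples positive by method 1 only, c = positive by method 2 only.
    With continuity correction: chi2 = (|b - c| - 1)^2 / (b + c); df = 1.
    """
    num = (abs(b - c) - 1) ** 2 if correction else (b - c) ** 2
    return num / (b + c)

# Hypothetical discordant counts: enrichment-only positives vs
# direct-plating-only positives across the same carcass rinses.
chi2 = mcnemar_chi2(b=15, c=3)
print(chi2)  # compare against the chi-squared critical value 3.84 (df=1, alpha=0.05)
```

Concordant samples (positive or negative by both methods) drop out of the statistic, which is why the test suits paired method comparisons like this one.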

  16. Comparison and quantitative verification of mapping algorithms for whole genome bisulfite sequencing

    USDA-ARS?s Scientific Manuscript database

    Coupling bisulfite conversion with next-generation sequencing (Bisulfite-seq) enables genome-wide measurement of DNA methylation, but poses unique challenges for mapping. However, despite a proliferation of Bisulfite-seq mapping tools, no systematic comparison of their genomic coverage and quantitat...

  17. Feedback on students' clinical reasoning skills during fieldwork education

    PubMed Central

    de Beer, Marianne; Mårtensson, Lena

    2015-01-01

    Background/aim: Feedback on clinical reasoning skills during fieldwork education is regarded as vital in occupational therapy students' professional development. The nature of supervisors' feedback, however, could be confirmative and/or corrective, and corrective feedback could be with or without suggestions on how to improve. The aim of the study was to evaluate the impact of supervisors' feedback on final-year occupational therapy students' clinical reasoning skills through comparing the nature of feedback with the students' subsequent clinical reasoning ability. Method: A mixed-method approach with a convergent parallel design was used, combining the collection and analysis of qualitative and quantitative data. From focus groups and interviews with students, data were collected and analysed qualitatively to determine how the students experienced the feedback they received from their supervisors. By quantitatively comparing the final practical exam grades with the nature of the feedback, their fieldwork End-of-Term grades and average academic performance, it became possible to merge the results for comparison and interpretation. Results: Students' clinical reasoning skills seem to be improved through corrective feedback if accompanied by suggestions on how to improve, irrespective of their average academic performance. Supervisors were inclined to underrate high-performing students and overrate lower-performing students. Conclusions: Students who obtained higher grades in the final practical examinations received more corrective feedback with suggestions on how to improve from their supervisors. Confirmative feedback alone may not be sufficient for improving the clinical reasoning skills of students. PMID:26256854

  18. Performance comparison of deep learning and segmentation-based radiomic methods in the task of distinguishing benign and malignant breast lesions on DCE-MRI

    NASA Astrophysics Data System (ADS)

    Antropova, Natasha; Huynh, Benjamin; Giger, Maryellen

    2017-03-01

    Intuitive segmentation-based CADx/radiomic features, calculated from the lesion segmentations of dynamic contrast-enhanced magnetic resonance images (DCE-MRIs), have been utilized in the task of distinguishing between malignant and benign lesions. Additionally, transfer learning with pre-trained deep convolutional neural networks (CNNs) allows for an alternative method of radiomics extraction, where the features are derived directly from the image data. However, the comparison of computer-extracted segmentation-based and CNN features in MRI breast lesion characterization has not yet been conducted. In our study, we used a DCE-MRI database of 640 breast cases: 191 benign and 449 malignant. Thirty-eight segmentation-based features were extracted automatically using our quantitative radiomics workstation. Also, 2D ROIs were selected around each lesion on the DCE-MRIs and directly input into a pre-trained CNN, AlexNet, yielding CNN features. Each method was investigated separately and in combination in terms of performance in the task of distinguishing between benign and malignant lesions. Area under the ROC curve (AUC) served as the figure of merit. Both methods yielded promising classification performance with round-robin cross-validated AUC values of 0.88 (se = 0.01) and 0.76 (se = 0.02) for the segmentation-based and deep learning methods, respectively. Combination of the two methods enhanced the performance in malignancy assessment, resulting in an AUC value of 0.91 (se = 0.01), a statistically significant improvement over the performance of the CNN method alone.
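The AUC figure of merit in this record has a simple probabilistic reading: the chance that a randomly chosen malignant case scores higher than a randomly chosen benign one. The sketch below computes AUC directly from that rank-based (Mann-Whitney) formulation, with invented classifier scores.

```python
def auc(scores_pos, scores_neg):
    """AUC as the probability a positive outscores a negative (ties count 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical classifier outputs for malignant (positive) and benign lesions.
malignant = [0.9, 0.8, 0.7, 0.4]
benign = [0.6, 0.3, 0.2]
print(auc(malignant, benign))
```

The O(n*m) pairwise loop is fine for illustration; production code would sort once and use ranks.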

  19. Cold Season QPF: Sensitivities to Snow Parameterizations and Comparisons to NASA CloudSat Observations

    NASA Technical Reports Server (NTRS)

    Molthan, A. L.; Haynes, J. A.; Jedlovec, G. L.; Lapenta, W. M.

    2009-01-01

    As operational numerical weather prediction is performed at increasingly finer spatial resolution, precipitation traditionally represented by sub-grid scale parameterization schemes is now being calculated explicitly through the use of single- or multi-moment, bulk water microphysics schemes. As computational resources grow, the real-time application of these schemes is becoming available to a broader audience, ranging from national meteorological centers to their component forecast offices. A need for improved quantitative precipitation forecasts has been highlighted by the United States Weather Research Program, which advised that gains in forecasting skill will draw upon improved simulations of clouds and cloud microphysical processes. Investments in space-borne remote sensing have produced the NASA A-Train of polar orbiting satellites, specially equipped to observe and catalog cloud properties. The NASA CloudSat instrument, a recent addition to the A-Train and the first 94 GHz radar system operated in space, provides a unique opportunity to compare observed cloud profiles to their modeled counterparts. Comparisons are available through the use of a radiative transfer model (QuickBeam), which simulates 94 GHz radar returns based on the microphysics of cloudy model profiles and the prescribed characteristics of their constituent hydrometeor classes. CloudSat observations of snowfall are presented for a case in the central United States, with comparisons made to precipitating clouds as simulated by the Weather Research and Forecasting Model and the Goddard single-moment microphysics scheme. An additional forecast cycle is performed with a temperature-based parameterization of the snow distribution slope parameter, with comparisons to CloudSat observations provided through the QuickBeam simulator.

  20. NTP comparison process

    NASA Technical Reports Server (NTRS)

    Corban, Robert

    1993-01-01

    The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications, and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams developing alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum possible amount of quantitative data will be developed and/or validated for use in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.

  1. Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.

    PubMed

    Sugino, T; Kawahira, H; Nakamura, R

    2014-09-01

    Advanced surgical procedures, which have become complex and difficult, increase the burden on surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. Division of the surgical procedure, task progress analysis, and task efficiency analysis were performed. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of the surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate the task efficiency during each stage. These analysis methods were evaluated in an experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters included incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of the groups revealed that skilled surgeons tended to perform the procedure in less time and within smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis developed for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.
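One of the efficiency metrics this record names, mean instrument velocity, falls straight out of a navigation log of tip positions. The sketch below is a generic illustration under the assumption of 3-D positions sampled at a fixed interval; the log values are invented, and the record's actual system also computes acceleration and a distribution ellipse, which are omitted here.

```python
import math

def mean_velocity(positions, dt):
    """Mean tip speed (mm/s) from 3-D positions logged every dt seconds."""
    path_length = sum(math.dist(p, q) for p, q in zip(positions, positions[1:]))
    return path_length / (dt * (len(positions) - 1))

# Hypothetical navigation log: (x, y, z) in mm, sampled every 0.1 s.
log = [(0, 0, 0), (3, 4, 0), (3, 4, 5)]
print(mean_velocity(log, dt=0.1))
```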

  2. A 100-Year Review: Methods and impact of genetic selection in dairy cattle-From daughter-dam comparisons to deep learning algorithms.

    PubMed

    Weigel, K A; VanRaden, P M; Norman, H D; Grosu, H

    2017-12-01

    In the early 1900s, breed society herdbooks had been established and milk-recording programs were in their infancy. Farmers wanted to improve the productivity of their cattle, but the foundations of population genetics, quantitative genetics, and animal breeding had not been laid. Early animal breeders struggled to identify genetically superior families using performance records that were influenced by local environmental conditions and herd-specific management practices. Daughter-dam comparisons were used for more than 30 yr and, although genetic progress was minimal, the attention given to performance recording, genetic theory, and statistical methods paid off in future years. Contemporary (herdmate) comparison methods allowed more accurate accounting for environmental factors and genetic progress began to accelerate when these methods were coupled with artificial insemination and progeny testing. Advances in computing facilitated the implementation of mixed linear models that used pedigree and performance data optimally and enabled accurate selection decisions. Sequencing of the bovine genome led to a revolution in dairy cattle breeding, and the pace of scientific discovery and genetic progress accelerated rapidly. Pedigree-based models have given way to whole-genome prediction, and Bayesian regression models and machine learning algorithms have joined mixed linear models in the toolbox of modern animal breeders. Future developments will likely include elucidation of the mechanisms of genetic inheritance and epigenetic modification in key biological pathways, and genomic data will be used with data from on-farm sensors to facilitate precision management on modern dairy farms. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  3. A comparison of the analytical performance of five commercially available assays for neutrophil gelatinase-associated lipocalin using urine.

    PubMed

    Kift, Rebecca L; Messenger, Michael P; Wind, Tobias C; Hepburn, Sophie; Wilson, Michelle; Thompson, Douglas; Smith, Matthew Welberry; Sturgeon, Catharine; Lewington, Andrew J; Selby, Peter J; Banks, Rosamonde E

    2013-05-01

    Neutrophil gelatinase-associated lipocalin (NGAL) is a promising biomarker for acute kidney injury that is beginning to be used in clinical practice in addition to research studies. The current study describes an independent validation and comparison of five commercially available NGAL assays, focusing on urine samples. This is an essential step in the translation of this marker to clinical use in terms of allowing valid inter-study comparison and generation of robust results. Two CE (Conformité Européenne)-marked assays, the NGAL Test (BioPorto) on Siemens ADVIA(®) 1800 and the ARCHITECT Urine NGAL assay on i2000SR (Abbott Laboratories), and three research-use-only (RUO) ELISAs (R&D Systems, Hycult and BioPorto) were evaluated. Imprecision, parallelism, recovery, selectivity, limit of quantitation (LOQ), vulnerability to interference and hook effect were assessed, and inter-assay agreement was determined using 68 urine samples from patients with various renal diseases and healthy controls. The Abbott and R&D Systems assays demonstrated satisfactory performance for all parameters tested. However, for the other three assays evaluated, problems were identified with LOQ (BioPorto/ADVIA(®)), parallelism (BioPorto ELISA) or several parameters (Hycult). Between-method agreement varied, with the Hycult assay in particular being markedly different, highlighting issues with standardization and the form of NGAL measured. Variability exists between the five NGAL assays in terms of their performance, and this should be taken into account when interpreting results from the various clinical or research studies measuring urinary NGAL.

  4. Reassessment of the Access Testosterone chemiluminescence assay and comparison with LC-MS method.

    PubMed

    Dittadi, Ruggero; Matteucci, Mara; Meneghetti, Elisa; Ndreu, Rudina

    2018-03-01

    The aims were to reassess the imprecision and limit of quantitation, to evaluate the cross-reaction with dehydroepiandrosterone-sulfate (DHEAS), and to assess the accuracy against liquid chromatography-mass spectrometry (LC-MS) and the reference interval of the Access Testosterone method, performed on the DxI immunoassay platform (Beckman Coulter). Imprecision was evaluated by testing six pool samples assayed in 20 different runs using two reagent lots. The cross-reaction with DHEAS was studied both by a displacement curve and by spiking DHEAS standard into two serum samples with known amounts of testosterone. The comparison with LC-MS was evaluated by Passing-Bablok analysis in 21 routine serum samples and 19 control samples from an External Quality Assurance (EQA) scheme. The reference interval was verified by indirect estimation on 2445 male and 2838 female outpatients. The imprecision study showed a coefficient of variation (CV) between 2.7% and 34.7% for serum pools ranging from 16.3 to 0.27 nmol/L. The limit of quantitation at 20% CV was 0.53 nmol/L. DHEAS showed a cross-reaction of 0.0074%. The comparison with LC-MS showed a trend toward a slight underestimation by the immunoassay (Passing-Bablok equations: DxI = -0.24 + 0.906 LCMS in serum samples and DxI = -0.299 + 0.981 LCMS in EQA samples). Verification of the reference interval showed a 2.5th-97.5th percentile distribution of 6.6-24.3 nmol/L for males over 14 years and <0.5-2.78 nmol/L for females, in accordance with the reference intervals reported by the manufacturer. The Access Testosterone method can be considered an adequately reliable tool for testosterone measurement. © 2017 Wiley Periodicals, Inc.
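
    The Passing-Bablok comparison above is a nonparametric method-comparison regression. As a minimal sketch, the closely related median-of-pairwise-slopes (Theil-Sen) estimator is shown below on invented paired measurements; a full Passing-Bablok implementation additionally applies a slope-offset correction and confidence bounds.

```python
import itertools
import statistics

def pairwise_slope_fit(x, y):
    """Median-of-pairwise-slopes regression (Theil-Sen style), a simplified
    stand-in for Passing-Bablok method comparison. Returns (intercept, slope)."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in itertools.combinations(range(len(x)), 2)
              if x[j] != x[i]]
    slope = statistics.median(slopes)
    intercept = statistics.median(yi - slope * xi for xi, yi in zip(x, y))
    return intercept, slope

# Hypothetical paired testosterone results (nmol/L): LC-MS vs immunoassay,
# mimicking the slight immunoassay underestimation reported above.
lcms = [2.0, 5.0, 8.0, 12.0, 16.0, 20.0]
dxi = [1.6, 4.3, 7.0, 10.6, 14.3, 17.9]
intercept, slope = pairwise_slope_fit(lcms, dxi)
print(round(intercept, 2), round(slope, 2))  # small negative intercept, slope just above 0.9
```

    A slope below 1 with a small negative intercept is the pattern consistent with the reported DxI = -0.24 + 0.906 LCMS relation.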

  5. Evaluation of dental enamel caries assessment using Quantitative Light Induced Fluorescence and Optical Coherence Tomography.

    PubMed

    Maia, Ana Marly Araújo; de Freitas, Anderson Zanardi; de L Campello, Sergio; Gomes, Anderson Stevens Leônidas; Karlsson, Lena

    2016-06-01

    An in vitro study of morphological alterations between sound dental structure and artificially induced white spot lesions in human teeth was performed, assessing the loss of fluorescence by Quantitative Light-Induced Fluorescence (QLF) and changes in the light attenuation coefficient by Optical Coherence Tomography (OCT). To analyze the OCT images from a commercially available system, a special algorithm was applied, whereas the QLF images were analyzed using the software supplied with the commercial system employed. When comparing the sound region against the white spot lesion region, QLF showed a reduction in fluorescence intensity, whilst the OCT system showed an increase in light attenuation. Comparison of the percentage of alteration between the optical properties of sound and artificial enamel caries regions showed that OCT images, processed through the attenuation of light, enhanced the tooth optical alterations more than the fluorescence detected by the QLF system. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Automated Quantitative Characterization of Retinal Vascular Leakage and Microaneurysms in Ultra-widefield Fluorescein Angiography

    PubMed Central

    Ehlers, Justis P.; Wang, Kevin; Vasanji, Amit; Hu, Ming; Srivastava, Sunil K.

    2017-01-01

    Ultra-widefield fluorescein angiography (UWFA) is an emerging imaging modality used to characterize pathology in the retinal vasculature such as microaneurysms (MAs) and vascular leakage. Despite its potential value for diagnosis and disease surveillance, objective quantitative assessment of retinal pathology by UWFA is currently limited because it requires laborious manual segmentation by trained human graders. In this report, we describe a novel fully automated software platform, which segments MAs and leakage areas in native and dewarped UWFA images with retinal vascular disease. Comparison of the algorithm to human-grader-generated gold standards demonstrated significant strong correlations for MA and leakage areas (ICC = 0.78-0.87 and ICC = 0.70-0.86, respectively; p = 2.1×10⁻⁷ to 3.5×10⁻¹⁰ and p = 7.8×10⁻⁶ to 1.3×10⁻⁹, respectively). These results suggest the algorithm performs similarly to human graders in MA and leakage segmentation and may be of significant utility in clinical and research settings. PMID:28432113

  7. Foot and Ankle Kinematics and Dynamic Electromyography: Quantitative Analysis of Recovery From Peroneal Neuropathy in a Professional Football Player.

    PubMed

    Prasad, Nikhil K; Coleman Wood, Krista A; Spinner, Robert J; Kaufman, Kenton R

    The assessment of neuromuscular recovery after peripheral nerve surgery has typically relied on subjective physical examination. The purpose of this report was to assess the value of gait analysis in documenting recovery quantitatively. A professional football player underwent gait analysis before and after surgery for a peroneal intraneural ganglion cyst causing a left-sided foot drop. Surface electromyography (SEMG) recording and motion parameter acquisition from a computerized motion capture system consisting of 10 infrared cameras were performed simultaneously. A comparison of SEMG recordings before and after surgery showed a progression from disorganized activation in the left tibialis anterior and peroneus longus muscles to temporally appropriate activation for the phase of the gait cycle. Kinematic analysis of ankle motion planes showed resolution from a complete foot drop preoperatively to phase-appropriate dorsiflexion postoperatively. Gait analysis with dynamic SEMG and motion capture complements physical examination when assessing postoperative recovery in athletes.

  8. AESOP: A Python Library for Investigating Electrostatics in Protein Interactions.

    PubMed

    Harrison, Reed E S; Mohan, Rohith R; Gorham, Ronald D; Kieslich, Chris A; Morikis, Dimitrios

    2017-05-09

    Electric fields often play a role in guiding the association of protein complexes. Such interactions can be further engineered to accelerate complex association, resulting in protein systems with increased productivity. This is especially true for enzymes where reaction rates are typically diffusion limited. To facilitate quantitative comparisons of electrostatics in protein families and to describe electrostatic contributions of individual amino acids, we previously developed a computational framework called AESOP. We now implement this computational tool in Python with increased usability and the capability of performing calculations in parallel. AESOP utilizes PDB2PQR and Adaptive Poisson-Boltzmann Solver to generate grid-based electrostatic potential files for protein structures provided by the end user. There are methods within AESOP for quantitatively comparing sets of grid-based electrostatic potentials in terms of similarity or generating ensembles of electrostatic potential files for a library of mutants to quantify the effects of perturbations in protein structure and protein-protein association. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
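
    AESOP's actual API is not reproduced here; as an illustrative sketch of what quantitatively comparing grid-based electrostatic potentials can mean, the snippet below computes the Hodgkin similarity index on synthetic NumPy arrays standing in for APBS grid output. All array names and parameters are hypothetical.

```python
import numpy as np

def hodgkin_similarity(phi_a, phi_b):
    """Hodgkin similarity index between two electrostatic potential grids:
    1.0 = identical potentials, 0 = uncorrelated, -1.0 = opposite sign.
    phi_a, phi_b: same-shape arrays of potential values (illustrative
    stand-ins for grids produced via PDB2PQR/APBS)."""
    num = 2.0 * np.sum(phi_a * phi_b)
    den = np.sum(phi_a**2) + np.sum(phi_b**2)
    return num / den

rng = np.random.default_rng(0)
wild_type = rng.normal(size=(8, 8, 8))          # hypothetical potential grid
mutant = wild_type.copy()
mutant[4, 4, 4] += 2.0                          # local perturbation, e.g. a point mutation
print(hodgkin_similarity(wild_type, wild_type)) # 1.0
print(hodgkin_similarity(wild_type, mutant))    # slightly below 1.0
```

    Computing such an index over an ensemble of mutant grids is one way to rank the electrostatic contribution of individual residues.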

  9. In vivo quantitative bioluminescence tomography using heterogeneous and homogeneous mouse models.

    PubMed

    Liu, Junting; Wang, Yabin; Qu, Xiaochao; Li, Xiangsi; Ma, Xiaopeng; Han, Runqiang; Hu, Zhenhua; Chen, Xueli; Sun, Dongdong; Zhang, Rongqing; Chen, Duofang; Chen, Dan; Chen, Xiaoyuan; Liang, Jimin; Cao, Feng; Tian, Jie

    2010-06-07

    Bioluminescence tomography (BLT) is a new optical molecular imaging modality, which can monitor both physiological and pathological processes by using bioluminescent light-emitting probes in small living animals. In particular, this technology possesses great potential in drug development, early detection, and therapy monitoring in preclinical settings. In the present study, we developed a dual-modality BLT prototype system with a micro-computed tomography (MicroCT) registration approach, and improved the quantitative reconstruction algorithm based on an adaptive hp finite element method (hp-FEM). Detailed comparisons of source reconstruction between the heterogeneous and homogeneous mouse models were performed. The models included mice with an implanted luminescence source and tumor-bearing mice with a firefly luciferase reporter gene. Our data suggest that reconstruction based on the heterogeneous mouse model is more accurate in localization and quantification than the homogeneous mouse model with appropriate optical parameters, and that BLT allows super-early tumor detection in vivo based on tomographic reconstruction of the heterogeneous mouse model signal.

  10. Proposal for a study of computer mapping of terrain using multispectral data from ERTS-A for the Yellowstone National Park test site

    NASA Technical Reports Server (NTRS)

    Smedes, H. W. (Principal Investigator); Root, R. R.; Roller, N. E. G.; Despain, D.

    1978-01-01

    The author has identified the following significant results. A terrain map of Yellowstone National Park showed plant community types and other classes of ground cover in what is basically a wild land. The map comprised 12 classes, six of which were mapped with accuracies of 70 to 95%. The remaining six classes had spectral reflectances that overlapped appreciably, and hence, those were mapped less accurately. Techniques were devised for quantitatively comparing the recognition map of the park with control data acquired from ground inspection and from analysis of sidelooking radar images, a thermal IR mosaic, and IR aerial photos of several scales. Quantitative analyses were made in ten 40 sq km test areas. Comparison mechanics were performed by computer with the final results displayed on line printer output. Forested areas were mapped by computer using ERTS data for less than 1/4 the cost of the conventional forest mapping technique for topographic base maps.

  11. A priori Prediction of Neoadjuvant Chemotherapy Response and Survival in Breast Cancer Patients using Quantitative Ultrasound

    PubMed Central

    Tadayyon, Hadi; Sannachi, Lakshmanan; Gangeh, Mehrdad J.; Kim, Christina; Ghandi, Sonal; Trudeau, Maureen; Pritchard, Kathleen; Tran, William T.; Slodkowska, Elzbieta; Sadeghi-Naini, Ali; Czarnota, Gregory J.

    2017-01-01

    Quantitative ultrasound (QUS) can probe tissue structure and analyze tumour characteristics. Using a 6-MHz ultrasound system, radiofrequency data were acquired from 56 locally advanced breast cancer patients prior to their neoadjuvant chemotherapy (NAC) and QUS texture features were computed from regions of interest in tumour cores and their margins as potential predictive and prognostic indicators. Breast tumour molecular features were also collected and used for analysis. A multiparametric QUS model was constructed, which demonstrated a response prediction accuracy of 88% and ability to predict patient 5-year survival rates (p = 0.01). QUS features demonstrated superior performance in comparison to molecular markers and the combination of QUS and molecular markers did not improve response prediction. This study demonstrates, for the first time, that non-invasive QUS features in the core and margin of breast tumours can indicate breast cancer response to neoadjuvant chemotherapy (NAC) and predict five-year recurrence-free survival. PMID:28401902

  12. A priori Prediction of Neoadjuvant Chemotherapy Response and Survival in Breast Cancer Patients using Quantitative Ultrasound.

    PubMed

    Tadayyon, Hadi; Sannachi, Lakshmanan; Gangeh, Mehrdad J; Kim, Christina; Ghandi, Sonal; Trudeau, Maureen; Pritchard, Kathleen; Tran, William T; Slodkowska, Elzbieta; Sadeghi-Naini, Ali; Czarnota, Gregory J

    2017-04-12

    Quantitative ultrasound (QUS) can probe tissue structure and analyze tumour characteristics. Using a 6-MHz ultrasound system, radiofrequency data were acquired from 56 locally advanced breast cancer patients prior to their neoadjuvant chemotherapy (NAC) and QUS texture features were computed from regions of interest in tumour cores and their margins as potential predictive and prognostic indicators. Breast tumour molecular features were also collected and used for analysis. A multiparametric QUS model was constructed, which demonstrated a response prediction accuracy of 88% and ability to predict patient 5-year survival rates (p = 0.01). QUS features demonstrated superior performance in comparison to molecular markers and the combination of QUS and molecular markers did not improve response prediction. This study demonstrates, for the first time, that non-invasive QUS features in the core and margin of breast tumours can indicate breast cancer response to neoadjuvant chemotherapy (NAC) and predict five-year recurrence-free survival.

  13. In vitro quantitative analysis of Salmonella typhimurium preference for amino acids secreted by human breast tumor

    NASA Astrophysics Data System (ADS)

    Choi, Eunpyo; Maeng, Bohee; Lee, Jae-hun; Chang, Hyung-kwan; Park, Jungyul

    2016-12-01

    Bacterial therapies have attracted significant attention owing to bacteria's ability to penetrate deep into solid tumor tissue and their propensity to naturally accumulate in tumors of living animals. The actual mechanism by which bacteria target tumors is therapeutically crucial but poorly understood. We hypothesized that amino acids released from specific tumors attract bacteria to those tumors, and experiments on the chemotactic response of bacteria toward cancer-secreted amino acids were performed using a diffusion-based multiple chemical gradient generator constructed by in situ self-assembly of microspheres. The quantitative analysis was carried out by comparison of fluorescence intensity using green fluorescent protein (GFP)-tagged Salmonella typhimurium (S. typhimurium) in the gradient generator, which showed a clear preference for the released amino acids, especially those from breast cancer patients. Understanding chemotaxis toward cancer-secreted amino acids is essential for controlling S. typhimurium targeting in tumors and will allow the development of bacterial therapies.

  14. Study on Quality Standard of Processed Curcuma Longa Radix

    PubMed Central

    Zhao, Yongfeng; Quan, Liang; Zhou, Haiting; Cao, Dong; Li, Wenbing; Yang, Zhuo

    2017-01-01

    To control the quality of Curcuma Longa Radix by establishing quality standards, this paper added determinations of extract and volatile oil content. Curcumin was selected as the internal marker, and the relative correction factors (RCFs) of demethoxycurcumin and bisdemethoxycurcumin were established by high-performance liquid chromatography (HPLC). The contents of the multiple components were calculated based on their RCFs. The rationality and feasibility of the methods were evaluated by comparison of the quantitative results between the external standard method (ESM) and quantitative analysis of multicomponents by single-marker (QAMS). Ethanol extracts ranged from 9.749 to 15.644% with a mean value of 13.473%. The volatile oil ranged from 0.45 to 0.90 mL/100 g with a mean value of 0.66 mL/100 g. The method was accurate and feasible and could provide a reference for further comprehensive and effective control of the quality standard of Curcuma Longa Radix and its processed products. PMID:29375640
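
    The QAMS calculation summarized above can be sketched as follows. The relation shown is the standard single-marker formula; the function names and all peak areas and concentrations are hypothetical and only illustrate the arithmetic.

```python
def relative_correction_factor(area_marker, conc_marker, area_k, conc_k):
    """RCF of analyte k relative to the internal marker (curcumin here),
    determined from reference standards: f_k = (A_s * C_k) / (A_k * C_s)."""
    return (area_marker * conc_k) / (area_k * conc_marker)

def qams_content(area_k, rcf_k, area_marker_sample, conc_marker_sample):
    """Content of analyte k in a sample from its peak area, its RCF, and the
    marker's measured area and externally calibrated concentration."""
    return area_k * rcf_k * conc_marker_sample / area_marker_sample

# Hypothetical calibration with curcumin and demethoxycurcumin standards:
f_dmc = relative_correction_factor(area_marker=1000, conc_marker=10.0,
                                   area_k=800, conc_k=10.0)   # -> 1.25
# Hypothetical sample where curcumin was quantified by ESM at 4.0 ug/mL:
c_dmc = qams_content(area_k=240, rcf_k=f_dmc,
                     area_marker_sample=500, conc_marker_sample=4.0)
print(f_dmc, c_dmc)  # 1.25 2.4
```

    Comparing `c_dmc` against the same sample quantified directly by the external standard method is the consistency check the study performs.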

  15. Single Laboratory Comparison of Quantitative Real-Time PCR Assays for the Detection of Human Fecal Pollution - Poster

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) methods available to detect and enumerate human fecal pollution in ambient waters. Each assay employs distinct primers and/or probes and many target different genes and microorganisms leading to potential variations in method p...

  16. Exciting New Images | Lunar Reconnaissance Orbiter Camera

    Science.gov Websites

    … slowly and relentlessly reshapes the Moon's topography. Comparative study of the shapes of lunar craters: can a quantitative comparison be derived? And how can we quantify and compare the topography of a large number of craters? A method for quantitative characterization of impact crater topography is given in Mahanti, P. et al., 2014, Icarus v. 241.

  17. A Comparison of Learning Cultures in Different Sizes and Types

    ERIC Educational Resources Information Center

    Brown, Paula D.; Finch, Kim S.; MacGregor, Cynthia

    2012-01-01

    This study compared relevant data and information about leadership and learning cultures in different sizes and types of high schools. Research was conducted using a quantitative design with a qualitative element. Quantitative data were gathered using a researcher-created survey. Independent sample t-tests were conducted to analyze the means of…

  18. Does Pre-Service Preparation Matter? Examining an Old Question in New Ways

    ERIC Educational Resources Information Center

    Ronfeldt, Matthew

    2014-01-01

    Background: Over the past decade, most of the quantitative studies on teacher preparation have focused on comparisons between alternative and traditional routes. There has been relatively little quantitative research on specific features of teacher education that might cause certain pathways into teaching to be more effective than others. The vast…

  19. Detection limits and cost comparisons of human- and gull-associated conventional and quantitative PCR assays in artificial and environmental waters

    EPA Science Inventory

    Modern techniques for tracking fecal pollution in environmental waters require investing in DNA-based methods to determine the presence of specific fecal sources. To help water quality managers decide whether to employ routine polymerase chain reaction (PCR) or quantitative PC...

  20. Comparison of quantitative PCR assays for Escherichia coli targeting ribosomal RNA and single copy genes

    EPA Science Inventory

    Aims: Compare specificity and sensitivity of quantitative PCR (qPCR) assays targeting single and multi-copy gene regions of Escherichia coli. Methods and Results: A previously reported assay targeting the uidA gene (uidA405) was used as the basis for comparing the taxono...

  1. Comprehensive Comparison of Self-Administered Questionnaires for Measuring Quantitative Autistic Traits in Adults

    ERIC Educational Resources Information Center

    Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M.

    2014-01-01

    We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype…

  2. Comparison of the quantitative dry culture methods with both conventional media and most probable number method for the enumeration of coliforms and Escherichia coli/coliforms in food.

    PubMed

    Teramura, H; Sota, K; Iwasaki, M; Ogihara, H

    2017-07-01

    Sanita-kun™ CC (coliform count) and EC (Escherichia coli/coliform count), sheet quantitative culture systems which can avoid chromogenic interference by lactase in food, were evaluated in comparison with conventional methods for these bacteria. Based on the results of inclusivity and exclusivity studies using 77 micro-organisms, the sensitivity and specificity of both Sanita-kun™ systems met the criteria of ISO 16140. Both media were compared with deoxycholate agar, violet red bile agar, Merck Chromocult™ coliform agar (CCA), 3M Petrifilm™ CC and EC (PEC) and 3-tube MPN, as reference methods, in 100 naturally contaminated food samples. The correlation coefficients of both Sanita-kun™ media for coliform detection were more than 0·95 for all comparisons. For E. coli detection, Sanita-kun™ EC was compared with CCA, PEC and MPN in 100 artificially contaminated food samples. The correlation coefficients for E. coli detection of Sanita-kun™ EC were more than 0·95 for all comparisons. There were no significant differences in any comparison when conducting a one-way analysis of variance (ANOVA). Both Sanita-kun™ media significantly inhibited colour interference by lactase when inhibition of enzymatic staining was assessed using 40 natural cheese samples spiked with coliforms. Our results demonstrated that Sanita-kun™ CC and EC are suitable alternatives for the enumeration of coliforms and E. coli/coliforms, respectively, in a variety of foods, and specifically in fermented foods. Current chromogenic media for coliforms and Escherichia coli/coliforms suffer enzymatic coloration due to breakdown of chromogenic substrates by food lactase. The novel sheet culture media, which have a film layer to avoid coloration by food lactase, have been developed for enumeration of coliforms and E. coli/coliforms, respectively. In this study, we demonstrated that these media had comparable performance with reference methods and less interference by food lactase. These media may serve not only as useful alternatives but may also contribute to accurate enumeration of these bacteria in a variety of foods, and specifically in fermented foods. © 2017 The Society for Applied Microbiology.

  3. Cerebral NIRS performance testing with molded and 3D-printed phantoms (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wang, Jianting; Huang, Stanley; Chen, Yu; Welle, Cristin G.; Pfefer, T. Joshua

    2017-03-01

    Near-infrared spectroscopy (NIRS) has emerged as a low-cost, portable approach for rapid, point-of-care detection of hematomas caused by traumatic brain injury. As a new technology, there is a need to develop standardized test methods for objective, quantitative performance evaluation of these devices. Towards this goal, we have developed and studied two types of phantom-based testing approaches. The first involves 3D-printed phantoms incorporating hemoglobin-filled inclusions. Phantom layers representing specific cerebral tissues were printed using photopolymers doped with varying levels of titanium oxide and black resin. The accuracy, precision and spectral dependence of printed phantom optical properties were validated using spectrophotometry. The phantom also includes a hematoma inclusion insert filled with a hemoglobin solution. Oxygen saturation levels were modified by adding sodium dithionite at calibrated concentrations. The second phantom approach involves molded silicone layers with a superficial region - simulating the scalp and skull - composed of removable layers to vary hematoma size and depth, and a bottom layer representing brain matter. These phantoms were tested with both a commercial hematoma detector and a custom NIRS system to optimize their designs and validate their utility in performing inter-device comparisons. The effects of hematoma depth, diameter, and height, as well as tissue optical properties and biological variables including hemoglobin saturation level and scalp/skull thickness, were studied. Results demonstrate the ability to quantitatively compare NIRS device performance and indicate the promise of using 3D printing to achieve phantoms with realistic variations in tissue optical properties for evaluating biophotonic device performance.

  4. LCSH and PRECIS in Music: A Comparison.

    ERIC Educational Resources Information Center

    Gabbard, Paula Beversdorf

    1985-01-01

    By studying examples of their applications by two major English language bibliographic agencies, this article compares strengths and weaknesses of PRECIS and Library of Congress Subject Headings for books about music. Highlights include quantitative and qualitative analysis, comparison of number of subject statements, and terminology problems in…

  5. An Optimized Method for the Measurement of Acetaldehyde by High-Performance Liquid Chromatography

    PubMed Central

    Guan, Xiangying; Rubin, Emanuel; Anni, Helen

    2011-01-01

    Background Acetaldehyde is produced during ethanol metabolism predominantly in the liver by alcohol dehydrogenase, and rapidly eliminated by oxidation to acetate via aldehyde dehydrogenase. Assessment of circulating acetaldehyde levels in biological matrices is performed by headspace gas chromatography and reverse phase high-performance liquid chromatography (RP-HPLC). Methods We have developed an optimized method for the measurement of acetaldehyde by RP-HPLC in hepatoma cell culture medium, blood and plasma. After sample deproteinization, acetaldehyde was derivatized with 2,4-dinitrophenylhydrazine (DNPH). The reaction was optimized for pH, amount of derivatization reagent, time, and temperature. Extraction methods for the stable acetaldehyde-hydrazone derivative (AcH-DNP) and product stability studies were carried out. Acetaldehyde was identified by its retention time in comparison with the AcH-DNP standard, using a new chromatography gradient program, and quantitated based on external reference standards and standard addition calibration curves in the presence and absence of ethanol. Results Derivatization of acetaldehyde was performed at pH 4.0 with an 80-fold molar excess of DNPH. The reaction was completed in 40 min at ambient temperature, and the product was stable for 2 days. A clear separation of AcH-DNP from DNPH was obtained with a new 11-min chromatography program. Acetaldehyde detection was linear up to 80 μM. The recovery of acetaldehyde was >88% in culture media, and >78% in plasma. We quantitatively determined the ethanol-derived acetaldehyde in hepatoma cells, rat blood and plasma with a detection limit around 3 μM. The accuracy of the method was <9% for intraday and <15% for interday measurements, in small-volume (70 μl) plasma sampling. Conclusions An optimized method for the quantitative determination of acetaldehyde in biological systems was developed using derivatization with DNPH, followed by a short RP-HPLC separation of AcH-DNP. 
The method has an extended linear range, is reproducible and applicable to small-volume sampling of culture media and biological fluids. PMID:21895715

  6. An optimized method for the measurement of acetaldehyde by high-performance liquid chromatography.

    PubMed

    Guan, Xiangying; Rubin, Emanuel; Anni, Helen

    2012-03-01

    Acetaldehyde is produced during ethanol metabolism predominantly in the liver by alcohol dehydrogenase and rapidly eliminated by oxidation to acetate via aldehyde dehydrogenase. Assessment of circulating acetaldehyde levels in biological matrices is performed by headspace gas chromatography and reverse phase high-performance liquid chromatography (RP-HPLC). We have developed an optimized method for the measurement of acetaldehyde by RP-HPLC in hepatoma cell culture medium, blood, and plasma. After sample deproteinization, acetaldehyde was derivatized with 2,4-dinitrophenylhydrazine (DNPH). The reaction was optimized for pH, amount of derivatization reagent, time, and temperature. Extraction methods of the acetaldehyde-hydrazone (AcH-DNP) stable derivative and product stability studies were carried out. Acetaldehyde was identified by its retention time in comparison with AcH-DNP standard, using a new chromatography gradient program, and quantitated based on external reference standards and standard addition calibration curves in the presence and absence of ethanol. Derivatization of acetaldehyde was performed at pH 4.0 with an 80-fold molar excess of DNPH. The reaction was completed in 40 minutes at ambient temperature, and the product was stable for 2 days. A clear separation of AcH-DNP from DNPH was obtained with a new 11-minute chromatography program. Acetaldehyde detection was linear up to 80 μM. The recovery of acetaldehyde was >88% in culture media and >78% in plasma. We quantitatively determined the ethanol-derived acetaldehyde in hepatoma cells, rat blood and plasma with a detection limit around 3 μM. The accuracy of the method was <9% for intraday and <15% for interday measurements, in small volume (70 μl) plasma sampling. An optimized method for the quantitative determination of acetaldehyde in biological systems was developed using derivatization with DNPH, followed by a short RP-HPLC separation of AcH-DNP. 
The method has an extended linear range, is reproducible and applicable to small-volume sampling of culture media and biological fluids. Copyright © 2011 by the Research Society on Alcoholism.

  7. Comparison of Quantitative Antifungal Testing Methods for Textile Fabrics.

    PubMed

    Imoto, Yasuo; Seino, Satoshi; Nakagawa, Takashi; Yamamoto, Takao A

    2017-01-01

    Quantitative antifungal testing methods for textile fabrics under growth-supportive conditions were studied. Fungal growth activities on unfinished textile fabrics and textile fabrics modified with Ag nanoparticles were investigated using the colony counting method and the luminescence method. Morphological changes of the fungi during incubation were investigated by microscopic observation. Comparison of the results indicated that the fungal growth activity values obtained with the colony counting method depended on the morphological state of the fungi on textile fabrics, whereas those obtained with the luminescence method did not. Our findings indicated that unique characteristics of each testing method must be taken into account for the proper evaluation of antifungal activity.

  8. Empirical expression for DC magnetization curve of immobilized magnetic nanoparticles for use in biomedical applications

    NASA Astrophysics Data System (ADS)

    Elrefai, Ahmed L.; Sasayama, Teruyoshi; Yoshida, Takashi; Enpuku, Keiji

    2018-05-01

    We studied the magnetization (M-H) curve of immobilized magnetic nanoparticles (MNPs) used for biomedical applications. First, we performed numerical simulations of the DC M-H curve over a wide range of MNP parameters. Based on the simulation results, we obtained an empirical expression for the DC M-H curve. The empirical expression was compared with the measured M-H curves of various MNP samples, and quantitative agreement was obtained between them. The basic parameters of an MNP sample can also be estimated from the comparison. Therefore, the empirical expression is useful for analyzing the M-H curves of immobilized MNPs in specific biomedical applications.
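
The authors' empirical expression is not reproduced in the abstract; as a rough illustration of the kind of model fitted to such data, the sketch below evaluates the classical Langevin M-H curve for immobilized single-core MNPs. The Langevin form, the parameter names `Ms` (saturation magnetization) and `m_core` (core magnetic moment), and the monodisperse-core assumption are all illustrative choices here, not the paper's exact expression.

```python
import numpy as np

MU0 = 4e-7 * np.pi      # vacuum permeability (T·m/A)
KB = 1.380649e-23       # Boltzmann constant (J/K)

def langevin(x):
    """Langevin function L(x) = coth(x) - 1/x, with a small-x series branch."""
    x = np.asarray(x, dtype=float)
    safe = np.where(np.abs(x) < 1e-6, 1.0, x)           # avoid division by zero
    return np.where(np.abs(x) < 1e-6, x / 3.0, 1.0 / np.tanh(safe) - 1.0 / safe)

def magnetization(H, Ms, m_core, T=300.0):
    """M(H) = Ms * L(mu0 * m * H / kB T) for non-interacting, monodisperse cores."""
    return Ms * langevin(MU0 * m_core * np.asarray(H, float) / (KB * T))
```

Fitting a measured M-H curve against such a model by least squares is the sense in which basic MNP parameters (moment, saturation magnetization) can be read off the comparison.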

  9. Error function attack of chaos synchronization based encryption schemes.

    PubMed

    Wang, Xingang; Zhan, Meng; Lai, C-H; Gang, Hu

    2004-03-01

    Different chaos synchronization based encryption schemes are reviewed and compared from the practical point of view. As an efficient cryptanalysis tool for chaos encryption, a proposal based on the error function attack is presented systematically and used to evaluate system security. We define a quantitative measure (quality factor) of the effective applicability of a chaos encryption scheme, which takes into account the security, the encryption speed, and the robustness against channel noise. A comparison is made of several encryption schemes, and it is found that a scheme based on one-way coupled chaotic map lattices performs outstandingly well, as judged by the quality factor. Copyright 2004 American Institute of Physics.

  10. 0.4 Microns Spatial Resolution with 1 GHz (lambda = 30 cm) Evanescent Microwave Probe

    NASA Technical Reports Server (NTRS)

    Tabib-Azar, M.; Su, D.-P.; Pohar, A.; LeClair, S. R.; Ponchak, George E.

    1999-01-01

    In this article we describe evanescent field imaging of material nonuniformities with a record resolution of 0.4 microns at 1 GHz (lambda(sub g)/750000), using a resonant stripline scanning microwave probe. A chemically etched tip is used as a point-like evanescent field emitter, and probe-sample distance modulation is employed to improve the signal-to-noise ratio. Images obtained by the evanescent microwave probe, by optical microscope, and by scanning tunneling microscope are presented for comparison. The probe was calibrated to perform quantitative conductivity measurements. The principal factors affecting the ultimate resolution of the evanescent microwave probe are also discussed.

  11. System architectures for telerobotic research

    NASA Technical Reports Server (NTRS)

    Harrison, F. Wallace

    1989-01-01

    Several activities are performed related to the definition and creation of telerobotic systems. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.

  12. Tropospheric ozone in the Nineteenth Century: The Moncalieri series

    NASA Astrophysics Data System (ADS)

    Anfossi, D.; Sandroni, S.; Viarengo, S.

    1991-09-01

    A 26-year (1868-1893) data series of daily ozone readings performed at Moncalieri, northern Italy, by the Schönbein test paper technique has been analyzed. The availability of a series of simultaneous readings by the Schönbein and a quantitative technique (Levy, 1877) and the conversion chart for humidity by Linvill et al. (1980) allowed us to develop a procedure to convert the Moncalieri data into parts per billion by volume values. The results seem to indicate that, in comparison with one century ago, the ozone level in Europe has more than doubled, not only at the surface but also in the free troposphere.

  13. Foot agility and toe gnosis/graphaesthesia as potential indicators of integrity of the medial cerebral surface: normative data and comparison with clinical populations.

    PubMed

    Persinger, M A; Richards, P M

    1995-06-01

    A protocol was designed to identify quantitative indicators of the function of the medial surfaces of the cerebral hemispheres. Normative data were collected from 40 volunteers for foot agility, toe gnosis, and toe graphaesthesia. A total of 100 patients (most of whom had been referred for possible closed-head injuries) completed thorough neuropsychological and cognitive assessments. Deficits for toe graphaesthesia were most consistently correlated with general brain impairment and with scores for tasks whose normal performance requires the integrity of structures within the dorsal half of the medial cerebral hemispheres.

  14. Cost analysis of advanced turbine blade manufacturing processes

    NASA Technical Reports Server (NTRS)

    Barth, C. F.; Blake, D. E.; Stelson, T. S.

    1977-01-01

    A rigorous analysis was conducted to estimate relative manufacturing costs for high technology gas turbine blades prepared by three candidate materials process systems. The manufacturing costs for the same turbine blade configuration of directionally solidified eutectic alloy, an oxide dispersion strengthened superalloy, and a fiber reinforced superalloy were compared on a relative basis to the costs of the same blade currently in production utilizing the directional solidification process. An analytical process cost model was developed to quantitatively perform the cost comparisons. The impact of individual process yield factors on costs was also assessed as well as effects of process parameters, raw materials, labor rates and consumable items.

  15. Flight test investigation of certification issues pertaining to general-aviation-type aircraft with natural laminar flow

    NASA Technical Reports Server (NTRS)

    Doty, Wayne A.

    1990-01-01

    Development of Natural Laminar Flow (NLF) technology for application to general-aviation-type aircraft has raised some question as to the adequacy of FAR Part 23 for certification of aircraft with significant NLF. A series of flight tests was conducted with a modified Cessna T210R to allow quantitative comparison of the aircraft's ability to meet certification requirements with significant NLF and with boundary layer transition fixed near the leading edge. There were no significant differences between the two conditions except an increase in drag, which resulted in longer takeoff distances and reduced climb performance.

  16. Turbulence production due to secondary vortex cutting in a turbine rotor

    NASA Astrophysics Data System (ADS)

    Binder, A.

    1985-10-01

    Measurements of the unsteady flow field near and within a turbine rotor were made by means of a Laser-2-Focus velocimeter. The testing was performed in a single-stage cold-air turbine at part-load and near-design conditions. Random unsteadiness and flow angle results indicate that the secondary vortices of the stator break down after being cut and deformed by the rotor blades. A quantitative comparison shows that some of the energy contained in these secondary vortices is thereby converted into turbulence energy in the front part of the rotor. An attempt is made to explain this turbulence energy production as caused by the vortex breakdown.

  17. Computer-oriented synthesis of wide-band non-uniform negative resistance amplifiers

    NASA Technical Reports Server (NTRS)

    Branner, G. R.; Chan, S.-P.

    1975-01-01

    This paper presents a synthesis procedure which provides design values for broad-band amplifiers using non-uniform negative resistance devices. Employing a weighted least squares optimization scheme, the technique, based on an extension of procedures for uniform negative resistance devices, is capable of providing designs for a variety of matching network topologies. It also provides, for the first time, quantitative results for predicting the effects of parameter element variations on overall amplifier performance. The technique is also unique in that it employs exact partial derivatives for optimization and sensitivity computation. In comparison with conventional procedures, significantly improved broad-band designs are shown to result.

  18. Comparison of qualitative and quantitative evaluation of diffusion-weighted MRI and chemical-shift imaging in the differentiation of benign and malignant vertebral body fractures.

    PubMed

    Geith, Tobias; Schmidt, Gerwin; Biffar, Andreas; Dietrich, Olaf; Dürr, Hans Roland; Reiser, Maximilian; Baur-Melnyk, Andrea

    2012-11-01

    The objective of our study was to compare the diagnostic value of qualitative diffusion-weighted imaging (DWI), quantitative DWI, and chemical-shift imaging in a single prospective cohort of patients with acute osteoporotic and malignant vertebral fractures. The study group was composed of patients with 26 osteoporotic vertebral fractures (18 women, eight men; mean age, 69 years; age range, 31 years 6 months to 86 years 2 months) and 20 malignant vertebral fractures (nine women, 11 men; mean age, 63.4 years; age range, 24 years 8 months to 86 years 4 months). T1-weighted, STIR, and T2-weighted sequences were acquired at 1.5 T. A DW reverse fast imaging with steady-state free precession (PSIF) sequence at different delta values was evaluated qualitatively. A DW echo-planar imaging (EPI) sequence and a DW single-shot turbo spin-echo (TSE) sequence at different b values were evaluated qualitatively and quantitatively using the apparent diffusion coefficient. Opposed-phase sequences were used to assess signal intensity qualitatively. The signal loss between in- and opposed-phase images was determined quantitatively. Two-tailed Fisher exact test, Mann-Whitney test, and receiver operating characteristic analysis were performed. Sensitivities, specificities, and accuracies were determined. Qualitative DW-PSIF imaging (delta = 3 ms) showed the best performance for distinguishing between benign and malignant fractures (sensitivity, 100%; specificity, 88.5%; accuracy, 93.5%). Qualitative DW-EPI (b = 50 s/mm(2) [p = 1.00]; b = 250 s/mm(2) [p = 0.50]) and DW single-shot TSE imaging (b = 100 s/mm(2) [p = 1.00]; b = 250 s/mm(2) [p = 0.18]; b = 400 s/mm(2) [p = 0.18]; b = 600 s/mm(2) [p = 0.39]) did not indicate significant differences between benign and malignant fractures. DW-EPI using a b value of 500 s/mm(2) (p = 0.01) indicated significant differences between benign and malignant vertebral fractures. 
Quantitative DW-EPI (p = 0.09) and qualitative opposed-phase imaging (p = 0.06) did not exhibit significant differences, whereas quantitative DW single-shot TSE imaging (p = 0.002) and quantitative chemical-shift imaging (p = 0.01) showed significant differences between benign and malignant fractures. The DW-PSIF sequence (delta = 3 ms) had the highest accuracy in differentiating benign from malignant vertebral fractures. Quantitative chemical-shift imaging and quantitative DW single-shot TSE imaging had a lower accuracy than DW-PSIF imaging because of a large overlap. Qualitative assessment of opposed-phase, DW-EPI, and DW single-shot TSE sequences and quantitative assessment of the DW-EPI sequence were not suitable for distinguishing between benign and malignant vertebral fractures.

  19. Ionization Electron Signal Processing in Single Phase LArTPCs II. Data/Simulation Comparison and Performance in MicroBooNE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, C.; et al.

    The single-phase liquid argon time projection chamber (LArTPC) provides a large amount of detailed information in the form of fine-grained drifted ionization charge from particle traces. To fully utilize this information, the deposited charge must be accurately extracted from the raw digitized waveforms via a robust signal processing chain. Enabled by the ultra-low noise levels associated with cryogenic electronics in the MicroBooNE detector, the precise extraction of ionization charge from the induction wire planes in a single-phase LArTPC is qualitatively demonstrated on MicroBooNE data with event display images, and quantitatively demonstrated via waveform-level and track-level metrics. Improved performance of induction plane calorimetry is demonstrated through the agreement of extracted ionization charge measurements across different wire planes for various event topologies. In addition to the comprehensive waveform-level comparison of data and simulation, a calibration of the cryogenic electronics response is presented and solutions to various MicroBooNE-specific TPC issues are discussed. This work presents an important improvement in LArTPC signal processing, the foundation of reconstruction and therefore physics analyses in MicroBooNE.

  20. How Well Does LCA Model Land Use Impacts on Biodiversity?--A Comparison with Approaches from Ecology and Conservation.

    PubMed

    Curran, Michael; de Souza, Danielle Maia; Antón, Assumpció; Teixeira, Ricardo F M; Michelsen, Ottar; Vidal-Legaz, Beatriz; Sala, Serenella; Milà i Canals, Llorenç

    2016-03-15

    The modeling of land use impacts on biodiversity is considered a priority in life cycle assessment (LCA). Many diverging approaches have been proposed in an expanding literature on the topic. The UNEP/SETAC Life Cycle Initiative is engaged in building consensus on a shared modeling framework to highlight best-practice and guide model application by practitioners. In this paper, we evaluated the performance of 31 models from both the LCA and the ecology/conservation literature (20 from LCA, 11 from non-LCA fields) according to a set of criteria reflecting (i) model completeness, (ii) biodiversity representation, (iii) impact pathway coverage, (iv) scientific quality, and (v) stakeholder acceptance. We show that LCA models tend to perform worse than those from ecology and conservation (although not significantly), implying room for improvement. We identify seven best-practice recommendations that can be implemented immediately to improve LCA models based on existing approaches in the literature. We further propose building a "consensus model" through weighted averaging of existing information, to complement future development. While our research focuses on conceptual model design, further quantitative comparison of promising models in shared case studies is an essential prerequisite for future informed model choice.

  1. Comparison of amplitude-decorrelation, speckle-variance and phase-variance OCT angiography methods for imaging the human retina and choroid

    PubMed Central

    Gorczynska, Iwona; Migacz, Justin V.; Zawadzki, Robert J.; Capps, Arlie G.; Werner, John S.

    2016-01-01

    We compared the performance of three OCT angiography (OCTA) methods: speckle variance, amplitude decorrelation and phase variance for imaging of the human retina and choroid. Two averaging methods, split spectrum and volume averaging, were compared to assess the quality of the OCTA vascular images. All data were acquired using a swept-source OCT system at 1040 nm central wavelength, operating at 100,000 A-scans/s. We performed a quantitative comparison using a contrast-to-noise ratio (CNR) metric to assess the capability of the three methods to visualize the choriocapillaris layer. For evaluation of static-tissue noise suppression in OCTA images, we proposed calculating the CNR between the photoreceptor/RPE complex and the choriocapillaris layer. Finally, we demonstrated that implementation of intensity-based OCT imaging and OCT angiography methods allows for visualization of retinal and choroidal vascular layers known from anatomic studies in retinal preparations. OCT projection imaging of data flattened to selected retinal layers was implemented to visualize retinal and choroidal vasculature. User-guided vessel tracing was applied to segment the retinal vasculature. The results were visualized in the form of a skeletonized 3D model. PMID:27231598
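
A minimal sketch of the speckle-variance contrast and a CNR comparison of the kind described above. The per-pixel variance over repeated B-scans is the standard speckle-variance definition; the CNR formula and the choice of regions are assumptions here, since the abstract does not spell out the authors' exact normalization.

```python
import numpy as np

def speckle_variance(bscans):
    """Per-pixel intensity variance across N repeated B-scans.

    bscans has shape (N, depth, width); decorrelating speckle in flow regions
    yields high variance, while static tissue yields low variance.
    """
    return np.var(np.asarray(bscans, dtype=float), axis=0)

def cnr(region_a, region_b):
    """One common contrast-to-noise ratio between two pixel populations:
    difference of means over the pooled standard deviation."""
    a = np.asarray(region_a, dtype=float)
    b = np.asarray(region_b, dtype=float)
    return (a.mean() - b.mean()) / np.sqrt(a.var() + b.var())
```

In an OCTA comparison, the two regions would be drawn from a vascular layer (e.g. choriocapillaris) and a static reference layer (e.g. photoreceptor/RPE complex) of the angiography image.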

  2. Computerized resources in language therapy with children of the autistic spectrum.

    PubMed

    Fernandes, Fernanda Dreux Miranda; Santos, Thaís Helena Ferreira; Amato, Cibelle Albuquerque de la Higuera; Molini-Avejonas, Daniela Regina

    2010-01-01

    This study examined the use of computerized technology in language therapy with children of the autistic spectrum, aiming to assess the interference of using computers and specific programs during language therapy on the functional communicative profile and socio-cognitive performance of these children. Twenty-three children with ages ranging between 3 and 12 years were individually video recorded prior to and after a set of 10 regular language therapy sessions (i.e., a total of two video samples per subject) using computerized games according to the child's choice. The following expressions were used by the therapists to describe the children's performance during the use of computers: more attentive, more communicative initiatives, more eye contact, more interactive, more verbalizations, more attention and more action requests. Qualitative and quantitative progress was identified, although without statistical significance. This progress was observed after a time period shorter than that usually applied to this kind of comparison, which seems to be a promising result. More controlled associations and comparisons were not possible due to the groups' heterogeneity, and therefore more consistent conclusions cannot be drawn. It was clear, however, that the subjects presented different reactions to the use of computerized resources during language therapy.

  3. A comparative study of first-derivative spectrophotometry and column high-performance liquid chromatography applied to the determination of repaglinide in tablets and for dissolution testing.

    PubMed

    AlKhalidi, Bashar A; Shtaiwi, Majed; AlKhatib, Hatim S; Mohammad, Mohammad; Bustanji, Yasser

    2008-01-01

    A fast and reliable method for the determination of repaglinide is highly desirable to support formulation screening and quality control. A first-derivative UV spectroscopic method was developed for the determination of repaglinide in tablet dosage form and for dissolution testing. First-derivative UV absorbance was measured at 253 nm. The developed method was validated for linearity, accuracy, precision, limit of detection (LOD), and limit of quantitation (LOQ) in comparison to the U.S. Pharmacopeia (USP) column high-performance liquid chromatographic (HPLC) method. The first-derivative UV spectrophotometric method showed excellent linearity [correlation coefficient (r) = 0.9999] in the concentration range of 1-35 microg/mL and precision (relative standard deviation < 1.5%). The LOD and LOQ were 0.23 and 0.72 microg/mL, respectively, and good recoveries were achieved (98-101.8%). Statistical comparison of results of the first-derivative UV spectrophotometric and the USP HPLC methods using the t-test showed that there was no significant difference between the 2 methods. Additionally, the method was successfully used for the dissolution test of repaglinide and was found to be reliable, simple, fast, and inexpensive.
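
The LOD and LOQ figures quoted above are conventionally derived from the calibration line; the sketch below uses the common ICH formulas LOD = 3.3·s/S and LOQ = 10·s/S, where S is the slope and s the residual standard deviation of the fit. That this is how the authors computed their values is an assumption, since the abstract does not state the derivation.

```python
import numpy as np

def calibration_lod_loq(conc, response):
    """LOD/LOQ from a linear calibration (ICH-style):
    fit response = slope*conc + intercept, then
    LOD = 3.3 * s / slope and LOQ = 10 * s / slope,
    with s the residual standard deviation (n - 2 degrees of freedom)."""
    conc = np.asarray(conc, dtype=float)
    response = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    s = residuals.std(ddof=2)           # two fitted parameters consumed
    return 3.3 * s / slope, 10.0 * s / slope
```

With calibration standards spanning the 1-35 µg/mL working range mentioned in the abstract, this returns LOD/LOQ in the same concentration units as the standards.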

  4. Performance comparison of two resolution modeling PET reconstruction algorithms in terms of physical figures of merit used in quantitative imaging.

    PubMed

    Matheoud, R; Ferrando, O; Valzano, S; Lizio, D; Sacchetti, G; Ciarmiello, A; Foppiano, F; Brambilla, M

    2015-07-01

    Resolution modeling (RM) of PET systems has been introduced in iterative reconstruction algorithms for oncologic PET. The RM recovers the loss of resolution and reduces the associated partial volume effect. While these methods improved observer performance, particularly in the detection of small and faint lesions, their impact on quantification accuracy still requires thorough investigation. The aim of this study was to characterize the performance of the RM algorithms under controlled conditions simulating a typical (18)F-FDG oncologic study, using an anthropomorphic phantom and selected physical figures of merit used for image quantification. Measurements were performed on Biograph HiREZ (B_HiREZ) and Discovery 710 (D_710) PET/CT scanners, and reconstructions were performed using the standard iterative reconstructions and the RM algorithms associated with each scanner: TrueX and SharpIR, respectively. RM determined a significant improvement in contrast recovery for small targets (≤17 mm diameter) only for the D_710 scanner. The maximum standardized uptake value (SUVmax) increased when RM was applied using both scanners. The SUVmax of small targets was on average lower with the B_HiREZ than with the D_710. SharpIR improved the accuracy of SUVmax determination, whereas TrueX showed an overestimation of SUVmax for sphere dimensions greater than 22 mm. The goodness of fit of adaptive threshold algorithms worsened significantly when RM algorithms were employed for both scanners. Differences in general quantitative performance were observed for the PET scanners analyzed. Segmentation of PET images using adaptive threshold algorithms should not be undertaken in conjunction with RM reconstructions. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  5. Clinical performance of the novel DiaSorin LIAISON(®) XL murex: HBsAg Quant, HCV-Ab, HIV-Ab/Ag assays.

    PubMed

    Krawczyk, Adalbert; Hintze, Christian; Ackermann, Jessica; Goitowski, Birgit; Trippler, Martin; Grüner, Nico; Neumann-Fraune, Maria; Verheyen, Jens; Fiedler, Melanie

    2014-01-01

    The fully automated and closed LIAISON(®)XL platform was developed for reliable detection of infection markers like hepatitis B virus (HBV) surface antigen (HBsAg), hepatitis C virus (HCV) antibodies (Ab) or human immunodeficiency virus (HIV)-Ag/Ab. To date, little is known about the diagnostic performance of this system in direct comparison to the common Abbott ARCHITECT(®) platform. We compared the diagnostic performance and usability of the DiaSorin LIAISON(®)XL with the commonly used Abbott ARCHITECT(®) system. The qualitative performance of the above mentioned assays was compared in about 500 sera. Quantitative tests were performed for HBsAg-positive samples from patients under therapy (n=289) and in vitro expressed mutants (n=37). For HCV-Ab, a total number of 155 selected samples from patients chronically infected with different HCV genotypes were tested. The concordance between both systems was 99.4% for HBsAg, 98.81% for HCV-Ab, and 99.6% for HIV-Ab/Ag. The quantitative LIAISON(®)XL murex HBsAg assay detected all mutants in comparable amounts to the HBsAg wild type and yielded highly reliable HBsAg kinetics in patients treated with antiviral drugs. Dilution experiments using the 2nd International Standard for HBsAg (WHO) showed a high accuracy of this test. HCV-Ab from patients infected with genotypes 1-3 were equally detected in both systems. Interestingly, S/CO levels of HCV-Ab from patients infected with genotype 3 seem to be relatively low using both systems. The LIAISON(®)XL platform proved to be an excellent system for diagnostics of HBV, HCV, and HIV with equal performance compared to the ARCHITECT(®) system. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Robustness analysis of superpixel algorithms to image blur, additive Gaussian noise, and impulse noise

    NASA Astrophysics Data System (ADS)

    Brekhna, Brekhna; Mahmood, Arif; Zhou, Yuanfeng; Zhang, Caiming

    2017-11-01

    Superpixels have gradually become popular in computer vision and image processing applications. However, no comprehensive study has been performed to evaluate the robustness of superpixel algorithms in regard to common forms of noise in natural images. We evaluated the robustness of 11 recently proposed algorithms to different types of noise. The images were corrupted with various degrees of Gaussian blur, additive white Gaussian noise, and impulse noise that either made the object boundaries weak or added extra information to it. We performed a robustness analysis of simple linear iterative clustering (SLIC), Voronoi Cells (VCells), flooding-based superpixel generation (FCCS), bilateral geodesic distance (Bilateral-G), superpixel via geodesic distance (SSS-G), manifold SLIC (M-SLIC), Turbopixels, superpixels extracted via energy-driven sampling (SEEDS), lazy random walk (LRW), real-time superpixel segmentation by DBSCAN clustering, and video supervoxels using partially absorbing random walks (PARW) algorithms. The evaluation process was carried out both qualitatively and quantitatively. For quantitative performance comparison, we used achievable segmentation accuracy (ASA), compactness, under-segmentation error (USE), and boundary recall (BR) on the Berkeley image database. The results demonstrated that all algorithms suffered performance degradation due to noise. For Gaussian blur, Bilateral-G exhibited optimal results for ASA and USE measures, SLIC yielded optimal compactness, whereas FCCS and DBSCAN remained optimal for BR. For the case of additive Gaussian and impulse noises, FCCS exhibited optimal results for ASA, USE, and BR, whereas Bilateral-G remained a close competitor in ASA and USE for Gaussian noise only. Additionally, Turbopixel demonstrated optimal performance for compactness for both types of noise. Thus, no single algorithm was able to yield optimal results for all three types of noise across all performance measures. 
In conclusion, to solve real-world problems effectively, more robust superpixel algorithms must be developed.
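
Of the quantitative measures listed, achievable segmentation accuracy (ASA) is simple enough to state in code. The sketch below implements the textbook definition: each superpixel is credited with its largest overlap with any ground-truth segment, so ASA is the best segmentation accuracy achievable by labeling whole superpixels. Details of the Berkeley benchmark protocol used in the study are not reproduced here.

```python
import numpy as np

def achievable_segmentation_accuracy(sp, gt):
    """ASA: fraction of pixels correctly labeled when each superpixel is
    assigned its best-overlapping ground-truth segment.

    sp and gt are integer label maps of identical shape.
    """
    sp = np.asarray(sp)
    gt = np.asarray(gt)
    correct = 0
    for label in np.unique(sp):
        # count ground-truth labels inside this superpixel; credit the majority
        _, counts = np.unique(gt[sp == label], return_counts=True)
        correct += counts.max()
    return correct / sp.size
```

Superpixels that respect object boundaries score close to 1.0; superpixels that straddle boundaries lose exactly the minority-overlap pixels, which is why boundary-weakening noise degrades ASA.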

  7. Visually estimated ejection fraction by two dimensional and triplane echocardiography is closely correlated with quantitative ejection fraction by real-time three dimensional echocardiography

    PubMed Central

    Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Åke; Winter, Reidar

    2009-01-01

    Background Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, the feasibility of visual assessment (eyeballing) is superior. To date, only sparse data are available comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE) using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Methods Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and TP mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. Results There was an excellent correlation between eyeballing EF by 2D and TP vs 3DE (r = 0.91 and 0.95, respectively) without any significant bias (-0.5 ± 3.7% and -0.2 ± 2.9%, respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TP and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2D and 8.4% for eyeballing TP. Conclusion Visual estimation of LVEF both using 2D and TP by an experienced reader correlates well with quantitative EF determined by RT3DE. There is an apparent trend towards smaller variability using TP in comparison to 2D; this was, however, not statistically significant. PMID:19706183
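
For reference, the quantitative EF that eyeballing is compared against is defined from end-diastolic and end-systolic volumes. The sketch below shows that definition together with a generic method-of-discs volume in the spirit of Simpson's rule; the disc-summation form given here is a standard simplification, not the vendors' exact RT3DE or BPS implementation.

```python
import numpy as np

def discs_volume(diam_a, diam_b, length):
    """Method-of-discs chamber volume: the ventricle is sliced into n discs
    of elliptical cross-section, V = (pi/4) * sum(a_i * b_i) * (L / n),
    where a_i, b_i are orthogonal disc diameters and L the long-axis length."""
    a = np.asarray(diam_a, dtype=float)
    b = np.asarray(diam_b, dtype=float)
    return np.pi / 4.0 * np.sum(a * b) * (length / a.size)

def ejection_fraction(edv, esv):
    """LVEF (%) from end-diastolic and end-systolic volumes:
    EF = (EDV - ESV) / EDV * 100."""
    return (edv - esv) / edv * 100.0
```

With, say, EDV = 120 mL and ESV = 50 mL the quantitative EF is about 58%, which is the kind of value a trained reader estimates visually.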

  8. Verification of a VRF Heat Pump Computer Model in EnergyPlus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nigusse, Bereket; Raustad, Richard

    2013-06-15

    This paper provides verification results of the EnergyPlus variable refrigerant flow (VRF) heat pump computer model using manufacturer's performance data. The paper provides an overview of the VRF model, presents the verification methodology, and discusses the results. The verification provides quantitative comparison of full- and part-load performance to manufacturer's data in cooling-only and heating-only modes of operation. The VRF heat pump computer model uses dual-range bi-quadratic performance curves to represent capacity and Energy Input Ratio (EIR) as a function of indoor and outdoor air temperatures, and dual-range quadratic performance curves as a function of part-load ratio for modeling part-load performance. These performance curves are generated directly from manufacturer's published performance data. The verification compared the simulation output directly to manufacturer's performance data and found that the dual-range equation-fit VRF heat pump computer model predicts the manufacturer's performance data very well over a wide range of indoor and outdoor temperatures and part-load conditions. The predicted capacity and electric power deviations are comparable to equation-fit HVAC computer models commonly used for packaged and split unitary HVAC equipment.
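
A minimal sketch of the dual-range bi-quadratic curve evaluation such a model relies on. The polynomial is the standard EnergyPlus biquadratic curve form; the boundary-selection logic (switching coefficient sets on an outdoor-temperature boundary) is an assumption about how the two ranges are combined.

```python
def biquadratic(coeffs, t_in, t_out):
    """EnergyPlus-style biquadratic performance curve:
    f = c1 + c2*Ti + c3*Ti^2 + c4*To + c5*To^2 + c6*Ti*To,
    with Ti/To the indoor/outdoor air temperatures."""
    c1, c2, c3, c4, c5, c6 = coeffs
    return (c1 + c2 * t_in + c3 * t_in ** 2
            + c4 * t_out + c5 * t_out ** 2 + c6 * t_in * t_out)

def dual_range(coeffs_low, coeffs_high, boundary, t_in, t_out):
    """Dual-range curve: pick the coefficient set by outdoor temperature,
    so each range is fit only to the manufacturer's data in that regime."""
    coeffs = coeffs_low if t_out <= boundary else coeffs_high
    return biquadratic(coeffs, t_in, t_out)
```

Capacity and EIR ratios computed this way are multiplied against rated values in the simulation, which is why close agreement of the fitted curves with the published tables translates directly into close agreement of the model output.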

  9. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  10. Comparison of Quantitative and Qualitative Research Traditions: Epistemological, Theoretical, and Methodological Differences

    ERIC Educational Resources Information Center

    Yilmaz, Kaya

    2013-01-01

    There has been much discussion about quantitative and qualitative approaches to research in different disciplines. In the behavioural and social sciences, these two paradigms are compared to reveal their relative strengths and weaknesses. But the debate about both traditions has commonly taken place in academic books. It is hard to find an article…

  11. Multi-laboratory comparison of quantitative PCR assays for detection and quantification of Fusarium virguliforme from soybean roots and soil

    USDA-ARS?s Scientific Manuscript database

    Accurate identification and quantification of Fusarium virguliforme, the cause of sudden death syndrome (SDS) in soybean, within root tissue and soil are important tasks. Several quantitative PCR (qPCR) assays have been developed but there are no reports comparing their use in sensitive and specific...

  12. COMPARISON OF POPULATIONS OF MOULD SPECIES IN HOMES IN THE UK AND US USING MOLD-SPECIFIC QUANTITATIVE PCR (MSQPCR)

    EPA Science Inventory

    The goal of this research was to compare the populations of 81 mold species in homes in the USA and the UK using mould-specific quantitative polymerase chain reaction (MSQPCR) technology. Dust samples were obtained from randomly selected homes in Great Britain (n=11). The mould populat...

  13. Identities and Transformational Experiences for Quantitative Problem Solving: Gender Comparisons of First-Year University Science Students

    ERIC Educational Resources Information Center

    Hudson, Peter; Matthews, Kelly

    2012-01-01

    Women are underrepresented in science, technology, engineering and mathematics (STEM) areas in university settings; however this may be the result of attitude rather than aptitude. There is widespread agreement that quantitative problem-solving is essential for graduate competence and preparedness in science and other STEM subjects. The research…

  14. COMPARISON OF ENTEROCOCCUS MEASUREMENTS IN FRESHWATER AT TWO RECREATIONAL BEACHES BY QUANTITATIVE POLYMERASE CHAIN REACTION AND MEMBRANE FILTER CULTURE ANALYSIS

    EPA Science Inventory

    Cell densities of the fecal pollution indicator genus, Enterococcus, were determined by a rapid (2-3 hr) quantitative PCR (QPCR) analysis based method in 100 ml water samples collected from recreational beaches on Lake Michigan and Lake Erie during the summer of 2003. Enumeration...

  15. Examining the Inclusion of Quantitative Research in a Meta-Ethnographic Review

    ERIC Educational Resources Information Center

    Booker, Rhae-Ann Richardson

    2010-01-01

    This study explored how one might extend meta-ethnography to quantitative research for the advancement of interpretive review methods. Using the same population of 139 studies on racial-ethnic matching as data, my investigation entailed an extended meta-ethnography (EME) and comparison of its results to a published meta-analysis (PMA). Adhering to…

  16. A Backscatter-Lidar Forward-Operator

    NASA Astrophysics Data System (ADS)

    Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Vogel, Bernhard; Mattis, Ina; Flentje, Harald; Förstner, Jochen; Potthast, Roland

    2015-04-01

    We have developed a forward-operator which is capable of calculating virtual lidar profiles from atmospheric state simulations. The operator allows us to compare lidar measurements and model simulations based on the same measurement parameter: the lidar backscatter profile. This method simplifies qualitative comparisons and also makes quantitative comparisons possible, including statistical error quantification. Implemented into an aerosol-capable model system, the operator will act as a component to assimilate backscatter-lidar measurements. As many weather services already maintain networks of backscatter lidars, such data are already acquired operationally. To estimate and quantify errors due to missing or uncertain aerosol information, we performed sensitivity studies on several scattering parameters such as the aerosol size and both the real and imaginary parts of the complex index of refraction. Furthermore, quantitative and statistical comparisons between measurements and virtual measurements are shown in this study, i.e. applying the backscatter-lidar forward-operator to model output.
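The core of such a forward operator maps model profiles of backscatter and extinction to the attenuated backscatter a ground-based lidar would observe. The minimal sketch below shows only that mapping; the actual operator described in the abstract additionally handles aerosol optical properties, overlap, and instrument effects not modeled here:

```python
import math

def attenuated_backscatter(beta, alpha, dz):
    """Attenuated backscatter profile from model output:
        beta_att(z) = beta(z) * exp(-2 * integral_0^z alpha dz')
    beta: backscatter coefficients [1/(m sr)] on a uniform grid,
    alpha: extinction coefficients [1/m] on the same grid,
    dz: grid spacing [m]. The two-way transmission is accumulated
    with a simple rectangle-rule sum from the ground upward."""
    tau = 0.0  # one-way optical depth from the ground to current level
    profile = []
    for b, a in zip(beta, alpha):
        tau += a * dz
        profile.append(b * math.exp(-2.0 * tau))
    return profile
```

With zero extinction the virtual profile equals the model backscatter; with nonzero extinction the profile decays with range, which is exactly the quantity a real backscatter lidar measures and hence what a comparison or assimilation scheme can use.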

  17. Comparison of Dynamic Contrast Enhanced MRI and Quantitative SPECT in a Rat Glioma Model

    PubMed Central

    Skinner, Jack T.; Yankeelov, Thomas E.; Peterson, Todd E.; Does, Mark D.

    2012-01-01

    Pharmacokinetic modeling of dynamic contrast enhanced (DCE)-MRI data provides measures of the extracellular volume fraction (ve) and the volume transfer constant (Ktrans) in a given tissue. These parameter estimates may be biased, however, by confounding issues such as contrast agent and tissue water dynamics, or assumptions of vascularization and perfusion made by the commonly used model. In contrast to MRI, radiotracer imaging with SPECT is insensitive to water dynamics. A quantitative dual-isotope SPECT technique was developed to obtain an estimate of ve in a rat glioma model for comparison to the corresponding estimates obtained using DCE-MRI with a vascular input function (VIF) and reference region model (RR). Both DCE-MRI methods produced consistently larger estimates of ve in comparison to the SPECT estimates, and several experimental sources were postulated to contribute to these differences. PMID:22991315
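The "commonly used model" the abstract refers to can be illustrated with the standard Tofts model, a typical choice for estimating Ktrans and ve from DCE-MRI; treating it as the model in question is an assumption, and the discretization below is a deliberately simple rectangle-rule sketch:

```python
import math

def tofts_concentration(cp, ktrans, ve, dt):
    """Standard Tofts model tissue concentration:
        Ct(t) = Ktrans * integral_0^t Cp(tau) * exp(-Ktrans*(t-tau)/ve) dtau
    cp: sampled vascular input function (VIF), dt: sampling interval,
    ktrans: volume transfer constant, ve: extracellular volume fraction.
    Evaluated with a rectangle-rule convolution."""
    kep = ktrans / ve  # efflux rate constant Ktrans/ve
    ct = []
    for i in range(len(cp)):
        s = sum(cp[j] * math.exp(-kep * (i - j) * dt) for j in range(i + 1))
        ct.append(ktrans * s * dt)
    return ct
```

Fitting measured Ct(t) curves with this expression yields the (Ktrans, ve) estimates that the study compares against the SPECT-derived ve, which does not depend on water-exchange assumptions.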

  18. Acoustic radiation force impulse (ARFI) elastography for detection of renal damage in children.

    PubMed

    Göya, Cemil; Hamidi, Cihad; Ece, Aydın; Okur, Mehmet Hanifi; Taşdemir, Bekir; Çetinçakmak, Mehmet Güli; Hattapoğlu, Salih; Teke, Memik; Şahin, Cahit

    2015-01-01

    Acoustic radiation force impulse (ARFI) imaging is a promising method for noninvasive evaluation of the renal parenchyma. To investigate the contribution of ARFI quantitative US elastography for the detection of renal damage in kidneys with and without vesicoureteral reflux (VUR). One hundred seventy-six kidneys of 88 children (46 male, 42 female) who had been referred for voiding cystourethrography and 20 healthy controls were prospectively investigated. Patients were assessed according to severity of renal damage on dimercaptosuccinic acid (DMSA) scintigraphy. Ninety-eight age- and gender-matched healthy children constituted the control group. Quantitative shear wave velocity (SWV) measurements were performed in the upper and lower poles and in the interpolar region of each kidney. DMSA scintigraphy was performed in 62 children (124 kidneys). Comparisons of SWV values of kidneys with and without renal damage and/or VUR were done. Significantly higher SWV values were found in non-damaged kidneys. Severely damaged kidneys had the lowest SWV values (P < 0.001). High-grade (grade V-IV) refluxing kidneys had the lowest SWV values, while non-refluxing kidneys had the highest values (P < 0.05). Significant negative correlations were found between the mean quantitative US elastography values and DMSA scarring score (r = -0.788, P < 0.001) and VUR grade (r = -0.634, P < 0.001). SWV values of the control kidneys were significantly higher than those of damaged kidneys (P < 0.05). Our findings suggest decreasing SWV of renal units with increasing grades of vesicoureteric reflux, increasing DMSA-assessed renal damage and decreasing DMSA-assessed differential function.

  19. Qualitative and quantitative two-dimensional thin-layer chromatography/high performance liquid chromatography/diode-array/electrospray-ionization-time-of-flight mass spectrometry of cholinesterase inhibitors.

    PubMed

    Mroczek, Tomasz

    2016-09-10

    The recently launched thin-layer chromatography-mass spectrometry (TLC-MS) interface, which enables extraction of compounds directly from TLC plates into the MS ion source, was extended into a two-dimensional thin-layer chromatography/high performance liquid chromatography (2D TLC/HPLC) system by direct connection to a rapid-resolution 50×2.1 mm I.D. C18 column compartment, followed by detection with diode array (DAD) and electrospray ionisation time-of-flight mass spectrometry (ESI-TOF-MS). In this way, even unseparated bands of complicated mixtures of natural compounds could be analysed structurally within 1-2 min after development of the TLC plates. In comparison to the typically applied TLC-MS interface, no ion suppression was observed for acidic mobile phases. Substantial increases in ESI-TOF-MS sensitivity and spectral quality were also noticed. The system was utilised in combination with TLC-based bioautographic approaches for acetylcholinesterase (AChE) inhibitors, but it can also be applied in other bioactivity-related procedures (e.g. the 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical screen test). The system was also used to determine the half-maximal inhibitory concentration (IC50) of the active inhibitor galanthamine, as an example. Moreover, the AChE inhibitory potencies of some purified plant extracts, never studied before, were quantitatively measured. This is the first report of the use of such a 2D TLC/HPLC/MS system for both qualitative and quantitative evaluation of cholinesterase inhibitors in biological matrices. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Quantitative analysis of thyroid tumors vascularity: A comparison between 3-D contrast-enhanced ultrasound and 3-D Power Doppler on benign and malignant thyroid nodules.

    PubMed

    Caresio, Cristina; Caballo, Marco; Deandrea, Maurilio; Garberoglio, Roberto; Mormile, Alberto; Rossetto, Ruth; Limone, Paolo; Molinari, Filippo

    2018-05-15

    To perform a comparative quantitative analysis of Power Doppler ultrasound (PDUS) and contrast-enhanced ultrasound (CEUS) for the quantification of thyroid nodule vascularity patterns, with the goal of identifying biomarkers correlated with the malignancy of the nodule with both imaging techniques. We propose a novel method to reconstruct the vascular architecture from 3-D PDUS and CEUS images of thyroid nodules, and to automatically extract seven quantitative features related to the morphology and distribution of the vascular network. Features include three tortuosity metrics, the number of vascular trees and branches, the vascular volume density, and the main spatial vascularity pattern. Feature extraction was performed on 20 thyroid lesions (ten benign and ten malignant), of which we acquired both PDUS and CEUS. MANOVA (multivariate analysis of variance) was used to differentiate benign and malignant lesions based on the most significant features. The analysis of the extracted features showed a significant difference between the benign and malignant nodules for both PDUS and CEUS techniques for all the features. Furthermore, by using a linear classifier on the significant features identified by the MANOVA, benign nodules could be entirely separated from the malignant ones. Our early results confirm the correlation between the morphology and distribution of blood vessels and the malignancy of the lesion, and also show (at least for the dataset used in this study) a considerable similarity in terms of findings of PDUS and CEUS imaging for thyroid nodules diagnosis and classification. © 2018 American Association of Physicists in Medicine.
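The abstract lists three tortuosity metrics without naming them. One common choice for vessel centerlines is the distance metric (path length over chord length); using it here is an assumption for illustration, not a statement of the authors' actual metrics:

```python
import math

def distance_metric_tortuosity(points):
    """Distance-metric tortuosity of a vessel centerline, given as a
    list of 3-D points: total path length along the centerline divided
    by the straight-line (chord) distance between its endpoints.
    A perfectly straight vessel has tortuosity 1.0; curved or kinked
    vessels score higher."""
    path = sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))
    chord = math.dist(points[0], points[-1])
    return path / chord
```

Applied to every branch of a reconstructed vascular tree, per-branch values can be averaged into a nodule-level feature of the kind fed to the MANOVA in this study.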
