Sample records for problem quantitative comparison

  1. Identities and Transformational Experiences for Quantitative Problem Solving: Gender Comparisons of First-Year University Science Students

    ERIC Educational Resources Information Center

    Hudson, Peter; Matthews, Kelly

    2012-01-01

    Women are underrepresented in science, technology, engineering and mathematics (STEM) areas in university settings; however this may be the result of attitude rather than aptitude. There is widespread agreement that quantitative problem-solving is essential for graduate competence and preparedness in science and other STEM subjects. The research…

  2. LCSH and PRECIS in Music: A Comparison.

    ERIC Educational Resources Information Center

    Gabbard, Paula Beversdorf

    1985-01-01

    By studying examples of their applications by two major English language bibliographic agencies, this article compares strengths and weaknesses of PRECIS and Library of Congress Subject Headings for books about music. Highlights include quantitative and qualitative analysis, comparison of number of subject statements, and terminology problems in…

  3. Menstrual Problems Experienced by Women with Learning Disabilities

    ERIC Educational Resources Information Center

    Rodgers, Jackie; Lipscombe, Jo; Santer, Miriam

    2006-01-01

    Background: Menstruation appears to be problematic for women with learning disabilities, yet there has been little quantitative research on their experiences, or comparisons with other groups of women. This paper considers the nature and extent of menstrual problems experienced by women with learning disabilities. Methods: The data reported here…

  4. Inclusion and Student Learning: A Quantitative Comparison of Special and General Education Student Performance Using Team and Solo-Teaching

    ERIC Educational Resources Information Center

    Jamison, Joseph A.

    2013-01-01

    This quantitative study sought to determine whether there were significant statistical differences between the performance scores of special education and general education students' scores when in team or solo-teaching environments as may occur in inclusively taught classrooms. The investigated problem occurs because despite education's stated…

  5. A Comparison of Behavioral and Emotional Characteristics in Children with Autism, Prader-Willi Syndrome, and Williams Syndrome

    ERIC Educational Resources Information Center

    Dimitropoulos, Anastasia; Ho, Alan Y.; Klaiman, Cheryl; Koenig, Kathy; Schultz, Robert T.

    2009-01-01

    In order to investigate unique and shared characteristics and to determine factors predictive of group classification, quantitative comparisons of behavioral and emotional problems were assessed using the Developmental Behavior Checklist (DBC-P) and the Vineland Adaptive Behavior Scales in autistic disorder, Williams syndrome (WS), and…

  6. Testing process predictions of models of risky choice: a quantitative model comparison approach

    PubMed Central

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
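    For readers unfamiliar with the heuristic named above, the following is a minimal sketch of the priority heuristic's stopping rules for simple two-outcome gain gambles, based on the published description in Brandstätter et al. (2006); the function name and example gambles are invented for illustration.

```python
# Minimal sketch of the priority heuristic for simple two-outcome gain
# gambles (after Brandstaetter et al., 2006). Illustrative only, not the
# authors' code. Each gamble is ((low_outcome, p_low), (high_outcome, p_high)).

def priority_heuristic(a, b):
    (a_lo, a_plo), (a_hi, _) = a
    (b_lo, b_plo), (b_hi, _) = b
    max_gain = max(a_hi, b_hi)

    # Reason 1: stop if the minimum gains differ by >= 1/10 of the maximum gain.
    if abs(a_lo - b_lo) >= 0.1 * max_gain:
        return a if a_lo > b_lo else b
    # Reason 2: stop if the probabilities of the minimum gains differ by >= 0.1.
    if abs(a_plo - b_plo) >= 0.1:
        return a if a_plo < b_plo else b   # prefer the lower chance of the low outcome
    # Reason 3: otherwise choose the gamble with the higher maximum gain.
    return a if a_hi > b_hi else b

sure = ((500, 1.0), (500, 0.0))          # a sure 500
risky = ((0, 0.5), (1000, 0.5))          # a coin flip for 1000
print(priority_heuristic(sure, risky))   # reason 1 already decides: the sure gamble
```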

  7. A Quantitative Comparison of the Relative Performance of VHF and UHF Broadcast Systems. Technical Monograph Number 1.

    ERIC Educational Resources Information Center

    Rubin, Philip A.; And Others

A study was undertaken to: (1) assess problems with UHF television systems; and (2) identify problem-solving activities on which different broadcast institutions could cooperate. The model for comparing UHF with VHF broadcast/reception services assigned performance disparity figures to each of the following elements: (1) transmitter and…

  8. Obesity prevention: Comparison of techniques and potential solution

    NASA Astrophysics Data System (ADS)

    Zulkepli, Jafri; Abidin, Norhaslinda Zainal; Zaibidi, Nerda Zura

    2014-12-01

Over the years, obesity prevention has been a broadly studied subject among both academicians and practitioners. It is one of the most serious public health issues, as it can cause numerous chronic health and psychosocial problems. Research is needed to suggest a population-based strategy for obesity prevention. In the academic environment, the importance of obesity prevention has triggered various problem-solving approaches. A good obesity prevention model should comprehend and cater for all complex and dynamic issues. Hence, the main purpose of this paper is to discuss the qualitative and quantitative approaches to obesity prevention and to provide an extensive literature review of various recent modelling techniques for obesity prevention. Based on this literature, the quantitative and qualitative approaches are compared, and the justification for using the system dynamics technique to model obesity in a population is discussed. Lastly, a potential framework solution based on system dynamics modelling is proposed.
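    As a rough illustration of the system dynamics technique the authors advocate, here is a toy stock-flow model of obesity prevalence; the three stocks and every rate constant below are invented and would have to be calibrated against data in any real study.

```python
# Toy stock-flow sketch of a system dynamics view of obesity prevalence.
# All stocks and rate constants are invented for illustration.

def simulate(years=20.0, dt=0.1):
    normal, over, obese = 0.60, 0.30, 0.10   # population fractions (stocks)
    up1, up2 = 0.05, 0.04                    # yearly upward transition rates
    down1, down2 = 0.02, 0.01                # yearly downward transition rates
    for _ in range(int(years / dt)):
        f_no = up1 * normal - down1 * over   # net flow: normal -> overweight
        f_ob = up2 * over - down2 * obese    # net flow: overweight -> obese
        normal -= f_no * dt
        over   += (f_no - f_ob) * dt
        obese  += f_ob * dt
    return normal, over, obese

print(simulate())   # fractions after 20 simulated years; the total stays 1.0
```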

  9. Numerical formulation for the prediction of solid/liquid change of a binary alloy

    NASA Technical Reports Server (NTRS)

    Schneider, G. E.; Tiwari, S. N.

    1990-01-01

    A computational model is presented for the prediction of solid/liquid phase change energy transport including the influence of free convection fluid flow in the liquid phase region. The computational model considers the velocity components of all non-liquid phase change material control volumes to be zero but fully solves the coupled mass-momentum problem within the liquid region. The thermal energy model includes the entire domain and uses an enthalpy like model and a recently developed method for handling the phase change interface nonlinearity. Convergence studies are performed and comparisons made with experimental data for two different problem specifications. The convergence studies indicate that grid independence was achieved and the comparison with experimental data indicates excellent quantitative prediction of the melt fraction evolution. Qualitative data is also provided in the form of velocity vector diagrams and isotherm plots for selected times in the evolution of both problems. The computational costs incurred are quite low by comparison with previous efforts on solving these problems.
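    The enthalpy-like treatment of the phase-change nonlinearity mentioned above can be sketched in one dimension. The sketch below uses invented material constants, omits the coupled mass-momentum (free convection) solve, and is not the authors' scheme.

```python
import numpy as np

# 1-D explicit enthalpy-method sketch for a melting slab (illustrative
# constants; convection in the liquid region is omitted).

nx, dx, dt = 50, 0.02, 1e-4
k, c, L, Tm = 1.0, 1.0, 100.0, 0.0      # conductivity, heat capacity, latent heat, melt temp

H = np.full(nx, -10.0 * c)              # enthalpy per unit volume; start solid at T = -10
T = np.full(nx, -10.0)
T[0] = 50.0                             # hot boundary drives melting

for _ in range(20000):
    # Explicit diffusion update of enthalpy from the current temperature field.
    H[1:-1] += dt * k * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    # Recover temperature from enthalpy; this absorbs the interface nonlinearity.
    T[1:-1] = np.where(H[1:-1] < 0.0, H[1:-1] / c,              # solid
              np.where(H[1:-1] > L, Tm + (H[1:-1] - L) / c,     # liquid
                       Tm))                                     # melting: T pinned at Tm

melt_fraction = np.clip(H / L, 0.0, 1.0).mean()
print(f"mean melt fraction: {melt_fraction:.3f}")
```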

  10. A Taylor weak-statement algorithm for hyperbolic conservation laws

    NASA Technical Reports Server (NTRS)

    Baker, A. J.; Kim, J. W.

    1987-01-01

    Finite element analysis, applied to computational fluid dynamics (CFD) problem classes, presents a formal procedure for establishing the ingredients of a discrete approximation numerical solution algorithm. A classical Galerkin weak-statement formulation, formed on a Taylor series extension of the conservation law system, is developed herein that embeds a set of parameters eligible for constraint according to specification of suitable norms. The derived family of Taylor weak statements is shown to contain, as special cases, over one dozen independently derived CFD algorithms published over the past several decades for the high speed flow problem class. A theoretical analysis is completed that facilitates direct qualitative comparisons. Numerical results for definitive linear and nonlinear test problems permit direct quantitative performance comparisons.
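    Schematically, and in generic notation rather than the paper's exact parameterization, the construction starts from a Taylor series in time and substitutes the conservation law for the time derivatives, leaving free parameters in the resulting weak statement:

```latex
% Generic sketch (not the paper's exact form). For q_t + f(q)_x = 0,
% a Taylor series in time,
%   q^{n+1} = q^n + \Delta t\, q_t + \tfrac{1}{2}\Delta t^2\, q_{tt} + \cdots,
% with q_t = -f_x and hence q_{tt} = (a f_x)_x where a = df/dq, gives the
% weak statement
\int_\Omega w \left[\, q^{n+1} - q^n + \Delta t\, f_x
   - \beta\, \tfrac{1}{2}\Delta t^2 \big( a\, f_x \big)_x \,\right] dx = 0,
% with w ranging over the finite element space and beta one of the free
% parameters eligible for constraint by a suitable norm.
```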

  11. A Comparison of Numerical Problem Solving under Three Types of Calculation Conditions.

    ERIC Educational Resources Information Center

    Roberts, Dennis M.; Glynn, Shawn M.

    1978-01-01

    The study reported is the first in a series of investigations designed to empirically test the hypothesis that calculators reduce quantitative working time and increase computational accuracy, and to examine the relative magnitude of benefit that accompanies utilizing calculators compared to manual work. (MN)

  12. Performance evaluation of canine-associated Bacteroidales assays in a multi-laboratory comparison study

    EPA Science Inventory

    The contribution of fecal pollution from dogs in urbanized areas can be significant and is an often underestimated problem. Microbial source tracking methods (MST) utilizing quantitative PCR of dog-associated gene sequences encoding 16S rRNA of Bacteroidales are a useful tool to ...

  13. Problems and Prospects of Implementing Continuous Assessment at Adigrat University

    ERIC Educational Resources Information Center

    Berhe, Teklebrhan; Embiza, Samuel

    2015-01-01

The purpose of the study is to assess the problems and prospects of implementing continuous assessment (CA) in higher education. Data were collected through a structured questionnaire from instructors and students of Adigrat University, as well as Mekelle and Aksum Universities for comparison purposes. Both quantitative and qualitative analyses were carried out.…

  14. Differences in College Greek Members' Binge Drinking Behaviors: A Dry/Wet House Comparison

    ERIC Educational Resources Information Center

    Brown-Rice, Kathleen; Furr, Susan

    2015-01-01

    College Greek life students self-report high rates of binge drinking and experience more alcohol-related problems than students who are not members of the Greek system. But little research has been conducted to measure differences in alcohol-free housing (dry) and alcohol-allowed housing (wet). The purpose of this quantitative study was to…

  15. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and highlights various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis will lead to confident results that will ensure quantitative proteomics delivers.

  16. Supporting the Spectrum Hypothesis: Self-Reported Temperament in Children and Adolescents with High Functioning Autism.

    PubMed

    Burrows, Catherine A; Usher, Lauren V; Schwartz, Caley B; Mundy, Peter C; Henderson, Heather A

    2016-04-01

    This study tested the spectrum hypothesis, which posits that children and adolescents with high functioning autism (HFA) differ quantitatively but not qualitatively from typically developing peers on self-reported temperament. Temperament refers to early-appearing, relatively stable behavioral and emotional tendencies, which relate to maladaptive behaviors across clinical populations. Quantitatively, participants with HFA (N = 104, aged 10-16) self-reported less surgency and more negative affect but did not differ from comparison participants (N = 94, aged 10-16) on effortful control or affiliation. Qualitatively, groups demonstrated comparable reliability of self-reported temperament and associations between temperament and parent-reported behavior problems. These findings support the spectrum hypothesis, highlighting the utility of self-report temperament measures for understanding individual differences in comorbid behavior problems among children and adolescents with HFA.

  17. Supporting the Spectrum Hypothesis: Self-Reported Temperament in Children and Adolescents with High Functioning Autism

    PubMed Central

    Burrows, Catherine A.; Usher, Lauren V.; Schwartz, Caley B.; Mundy, Peter C.; Henderson, Heather A.

    2015-01-01

    This study tested the spectrum hypothesis, which posits that children and adolescents with high functioning autism (HFA) differ quantitatively but not qualitatively from typically developing peers on self-reported temperament. Temperament refers to early-appearing, relatively stable behavioral and emotional tendencies, which relate to maladaptive behaviors across clinical populations. Quantitatively, participants with HFA (N=104, aged 10–16) self-reported less Surgency and more Negative Affect but did not differ from comparison participants (N=94, aged 10–16) on Effortful Control or Affiliation. Qualitatively, groups demonstrated comparable reliability of self-reported temperament and associations between temperament and parent-reported behavior problems. These findings support the spectrum hypothesis, highlighting the utility of self-report temperament measures for understanding individual differences in comorbid behavior problems among children and adolescents with HFA. PMID:26589536

  18. [Morphological verification problems of Chernobyl factor influence on the testis of coal miners of Donbas-liquidators of Chernobyl accident].

    PubMed

    Danylov, Iu V; Motkov, K V; Shevchenko, T I

    2013-01-01

The problem of diagnosing the influence of the Chernobyl factor on different organs and systems of Chernobyl accident liquidators remains relevant to this day. However, the morbid background that develops under the unfavorable working conditions of underground coal mining hinders objective identification of the features of Chernobyl factor influence. The qualitative and quantitative histological and immunohistochemical patterns of morphogenetic change in the testes of Donbas coal miners who were not Chernobyl accident liquidators, in comparison with the group of Donbas coal miners who were liquidators, remained an open problem. This motivates the development and practical use of a mathematical model of morphogenetic changes in the testis.

  19. [Morphological verification problems of Chernobyl factor influence on the prostate of coalminers of Donbas--liquidators of Chernobyl accident].

    PubMed

    Danylov, Iu V; Motkov, K V; Shevchenko, T I

    2013-12-01

The problem of diagnosing the influence of the Chernobyl factor on different organs and systems of Chernobyl accident liquidators remains relevant to this day. However, the morbid background that develops under the unfavorable working conditions of underground coal mining hinders objective identification of the features of Chernobyl factor influence. The qualitative and quantitative histological and immunohistochemical patterns of morphogenetic change in the prostate of Donbas coal miners who were not Chernobyl accident liquidators, in comparison with the group of Donbas coal miners who were liquidators, remained an open problem. This motivates the development and practical use of a mathematical model of morphogenetic changes in the prostate gland.

  20. Effects of Multimedia-Based Instructional Technology on African American Ninth Grade Students' Mastery of Algebra Concepts

    ERIC Educational Resources Information Center

    Malik, Ishan Z.

    2011-01-01

    Urban African American students lack an abstract understanding of algebra and are below their academic level in comparison to other ethnic groups, and this is a pervasive problem (McKinney, Chappell, Berry, & Hickman, 2009). The purpose of this quantitative study using a quasi-experimental design was to determine whether the use of…

  1. Photon-counting-based diffraction phase microscopy combined with single-pixel imaging

    NASA Astrophysics Data System (ADS)

    Shibuya, Kyuki; Araki, Hiroyuki; Iwata, Tetsuo

    2018-04-01

    We propose a photon-counting (PC)-based quantitative-phase imaging (QPI) method for use in diffraction phase microscopy (DPM) that is combined with a single-pixel imaging (SPI) scheme (PC-SPI-DPM). This combination of DPM with the SPI scheme overcomes a low optical throughput problem that has occasionally prevented us from obtaining quantitative-phase images in DPM through use of a high-sensitivity single-channel photodetector such as a photomultiplier tube (PMT). The introduction of a PMT allowed us to perform PC with ease and thus solved a dynamic range problem that was inherent to SPI. As a proof-of-principle experiment, we performed a comparison study of analogue-based SPI-DPM and PC-SPI-DPM for a 125-nm-thick indium tin oxide (ITO) layer coated on a silica glass substrate. We discuss the basic performance of the method and potential future modifications of the proposed system.
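    The SPI scheme referred to above can be illustrated with an idealized, noise-free sketch: a single-channel detector records one inner product per structured illumination pattern, and the image is recovered from those bucket values. This is illustrative only; the actual system adds photon counting and the DPM interferometer.

```python
import numpy as np
from scipy.linalg import hadamard

# Idealized single-pixel imaging (SPI) sketch with Hadamard patterns.

n = 64                          # number of pixels (flattened scene)
x = np.random.rand(n)           # unknown scene

H = hadamard(n)                 # +/-1 patterns, one per row (idealized modulation)
m = H @ x                       # one single-pixel (bucket) measurement per pattern
x_rec = (H.T @ m) / n           # exact recovery, since H @ H.T = n * I

print(np.allclose(x, x_rec))    # -> True
```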

  2. Integrating quantitative thinking into an introductory biology course improves students' mathematical reasoning in biological contexts.

    PubMed

    Hester, Susan; Buxner, Sanlyn; Elfring, Lisa; Nagy, Lisa

    2014-01-01

    Recent calls for improving undergraduate biology education have emphasized the importance of students learning to apply quantitative skills to biological problems. Motivated by students' apparent inability to transfer their existing quantitative skills to biological contexts, we designed and taught an introductory molecular and cell biology course in which we integrated application of prerequisite mathematical skills with biology content and reasoning throughout all aspects of the course. In this paper, we describe the principles of our course design and present illustrative examples of course materials integrating mathematics and biology. We also designed an outcome assessment made up of items testing students' understanding of biology concepts and their ability to apply mathematical skills in biological contexts and administered it as a pre/postcourse test to students in the experimental section and other sections of the same course. Precourse results confirmed students' inability to spontaneously transfer their prerequisite mathematics skills to biological problems. Pre/postcourse outcome assessment comparisons showed that, compared with students in other sections, students in the experimental section made greater gains on integrated math/biology items. They also made comparable gains on biology items, indicating that integrating quantitative skills into an introductory biology course does not have a deleterious effect on students' biology learning.

  3. Integrating Quantitative Thinking into an Introductory Biology Course Improves Students’ Mathematical Reasoning in Biological Contexts

    PubMed Central

    Hester, Susan; Buxner, Sanlyn; Elfring, Lisa; Nagy, Lisa

    2014-01-01

    Recent calls for improving undergraduate biology education have emphasized the importance of students learning to apply quantitative skills to biological problems. Motivated by students’ apparent inability to transfer their existing quantitative skills to biological contexts, we designed and taught an introductory molecular and cell biology course in which we integrated application of prerequisite mathematical skills with biology content and reasoning throughout all aspects of the course. In this paper, we describe the principles of our course design and present illustrative examples of course materials integrating mathematics and biology. We also designed an outcome assessment made up of items testing students’ understanding of biology concepts and their ability to apply mathematical skills in biological contexts and administered it as a pre/postcourse test to students in the experimental section and other sections of the same course. Precourse results confirmed students’ inability to spontaneously transfer their prerequisite mathematics skills to biological problems. Pre/postcourse outcome assessment comparisons showed that, compared with students in other sections, students in the experimental section made greater gains on integrated math/biology items. They also made comparable gains on biology items, indicating that integrating quantitative skills into an introductory biology course does not have a deleterious effect on students’ biology learning. PMID:24591504

  4. Sequential Inverse Problems Bayesian Principles and the Logistic Map Example

    NASA Astrophysics Data System (ADS)

    Duan, Lian; Farmer, Chris L.; Moroz, Irene M.

    2010-09-01

    Bayesian statistics provides a general framework for solving inverse problems, but is not without interpretation and implementation problems. This paper discusses difficulties arising from the fact that forward models are always in error to some extent. Using a simple example based on the one-dimensional logistic map, we argue that, when implementation problems are minimal, the Bayesian framework is quite adequate. In this paper the Bayesian Filter is shown to be able to recover excellent state estimates in the perfect model scenario (PMS) and to distinguish the PMS from the imperfect model scenario (IMS). Through a quantitative comparison of the way in which the observations are assimilated in both the PMS and the IMS scenarios, we suggest that one can, sometimes, measure the degree of imperfection.
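    A minimal bootstrap particle filter on the noisily observed logistic map illustrates the kind of Bayesian filtering the abstract discusses, here in the perfect model scenario; every parameter and noise level below is invented.

```python
import numpy as np

# Bootstrap particle filter tracking the chaotic logistic map
# x_{k+1} = r x_k (1 - x_k) from noisy observations (illustrative only).

rng = np.random.default_rng(0)
r, sigma_obs, n_particles, n_steps = 4.0, 0.05, 500, 50

x_true = 0.3
particles = rng.uniform(0.0, 1.0, n_particles)
errors = []

for _ in range(n_steps):
    x_true = r * x_true * (1.0 - x_true)                     # perfect-model dynamics
    y = x_true + rng.normal(0.0, sigma_obs)                  # noisy observation
    particles = r * particles * (1.0 - particles)            # propagate the ensemble
    particles = np.clip(particles + rng.normal(0.0, 1e-3, n_particles), 0.0, 1.0)
    w = np.exp(-0.5 * ((y - particles) / sigma_obs) ** 2)    # Gaussian likelihood
    w /= w.sum()
    particles = particles[rng.choice(n_particles, n_particles, p=w)]  # resample
    errors.append(abs(particles.mean() - x_true))

print(f"mean |estimate - truth|: {np.mean(errors):.4f}")     # typically well below sigma_obs
```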

  5. Real medical benefit assessed by indirect comparison.

    PubMed

    Falissard, Bruno; Zylberman, Myriam; Cucherat, Michel; Izard, Valérie; Meyer, François

    2009-01-01

Frequently, in data packages submitted for Marketing Approval to the CHMP, there is a lack of relevant head-to-head comparisons of medicinal products that could enable national authorities responsible for the approval of reimbursement to assess the Added Therapeutic Value (ASMR) of new clinical entities or line extensions of existing therapies. Indirect or mixed treatment comparisons (MTC) are methods stemming from the field of meta-analysis that have been designed to tackle this problem. Adjusted indirect comparisons, meta-regressions, mixed models, and Bayesian network analyses pool results of randomised controlled trials (RCTs), enabling a quantitative synthesis. The REAL procedure, recently developed by the HAS (French National Authority for Health), is a mixture of an MTC and an effect model based on expert opinions. It is intended to translate the efficacy observed in the trials into the effectiveness expected in day-to-day clinical practice in France.
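    The adjusted indirect comparison mentioned above reduces, in its simplest (Bucher-style) form, to differencing the two trial effects against the common comparator; the effect sizes below are invented, and this is not the REAL procedure itself.

```python
import math

# Bucher-style adjusted indirect comparison: A and B were each compared
# with a common comparator C, never head-to-head (invented numbers).

def indirect_comparison(d_ac, se_ac, d_bc, se_bc):
    """Indirect A-vs-B estimate (e.g., on the log odds ratio scale)."""
    d_ab = d_ac - d_bc
    se_ab = math.sqrt(se_ac**2 + se_bc**2)   # variances add: indirect CIs are wider
    return d_ab, se_ab, (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)

print(indirect_comparison(d_ac=-0.40, se_ac=0.15, d_bc=-0.10, se_bc=0.20))
```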

  6. A two-factor error model for quantitative steganalysis

    NASA Astrophysics Data System (ADS)

    Böhme, Rainer; Ker, Andrew D.

    2006-02-01

    Quantitative steganalysis refers to the exercise not only of detecting the presence of hidden stego messages in carrier objects, but also of estimating the secret message length. This problem is well studied, with many detectors proposed but only a sparse analysis of errors in the estimators. A deep understanding of the error model, however, is a fundamental requirement for the assessment and comparison of different detection methods. This paper presents a rationale for a two-factor model for sources of error in quantitative steganalysis, and shows evidence from a dedicated large-scale nested experimental set-up with a total of more than 200 million attacks. Apart from general findings about the distribution functions found in both classes of errors, their respective weight is determined, and implications for statistical hypothesis tests in benchmarking scenarios or regression analyses are demonstrated. The results are based on a rigorous comparison of five different detection methods under many different external conditions, such as size of the carrier, previous JPEG compression, and colour channel selection. We include analyses demonstrating the effects of local variance and cover saturation on the different sources of error, as well as presenting the case for a relative bias model for between-image error.

  7. Fat water decomposition using globally optimal surface estimation (GOOSE) algorithm.

    PubMed

    Cui, Chen; Wu, Xiaodong; Newell, John D; Jacob, Mathews

    2015-03-01

    This article focuses on developing a novel noniterative fat water decomposition algorithm more robust to fat water swaps and related ambiguities. Field map estimation is reformulated as a constrained surface estimation problem to exploit the spatial smoothness of the field, thus minimizing the ambiguities in the recovery. Specifically, the differences in the field map-induced frequency shift between adjacent voxels are constrained to be in a finite range. The discretization of the above problem yields a graph optimization scheme, where each node of the graph is only connected with few other nodes. Thanks to the low graph connectivity, the problem is solved efficiently using a noniterative graph cut algorithm. The global minimum of the constrained optimization problem is guaranteed. The performance of the algorithm is compared with that of state-of-the-art schemes. Quantitative comparisons are also made against reference data. The proposed algorithm is observed to yield more robust fat water estimates with fewer fat water swaps and better quantitative results than other state-of-the-art algorithms in a range of challenging applications. The proposed algorithm is capable of considerably reducing the swaps in challenging fat water decomposition problems. The experiments demonstrate the benefit of using explicit smoothness constraints in field map estimation and solving the problem using a globally convergent graph-cut optimization algorithm. © 2014 Wiley Periodicals, Inc.

  8. Geometrical comparison of two protein structures using Wigner-D functions.

    PubMed

    Saberi Fathi, S M; White, Diana T; Tuszynski, Jack A

    2014-10-01

    In this article, we develop a quantitative comparison method for two arbitrary protein structures. This method uses a root-mean-square deviation characterization and employs a series expansion of the protein's shape function in terms of the Wigner-D functions to define a new criterion, which is called a "similarity value." We further demonstrate that the expansion coefficients for the shape function obtained with the help of the Wigner-D functions correspond to structure factors. Our method addresses the common problem of comparing two proteins with different numbers of atoms. We illustrate it with a worked example. © 2014 Wiley Periodicals, Inc.
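    Schematically, and in generic notation that may differ from the paper's exact normalization and "similarity value" definition, the comparison rests on expanding each shape function in the Wigner-D basis and measuring the deviation between the coefficient vectors:

```latex
% Generic sketch. The shape function on SO(3) is expanded as
f(\alpha,\beta,\gamma) \;=\; \sum_{l=0}^{l_{\max}} \sum_{m,n=-l}^{l}
   c^{\,l}_{mn}\, D^{\,l}_{mn}(\alpha,\beta,\gamma),
% and two structures (1) and (2) are compared through the root-mean-square
% deviation of their N expansion coefficients,
\mathrm{RMSD} \;=\; \sqrt{\frac{1}{N} \sum_{l,m,n}
   \big|\, c^{\,l}_{mn}(1) - c^{\,l}_{mn}(2) \,\big|^{2}},
% from which a bounded similarity value can be derived.
```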

  9. The Kelvin-Helmholtz instability of boundary-layer plasmas in the kinetic regime

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Steinbusch, Benedikt; Gibbon, Paul

    2016-05-15

The dynamics of the Kelvin-Helmholtz instability are investigated in the kinetic, high-frequency regime with a novel, two-dimensional, mesh-free tree code. In contrast to earlier studies which focused on specially prepared equilibrium configurations in order to compare with fluid theory, a more naturally occurring plasma-vacuum boundary layer is considered here with relevance to both space plasma and linear plasma devices. Quantitative comparisons of the linear phase are made between the fluid and kinetic models. After establishing the validity of this technique via comparison to linear theory and conventional particle-in-cell simulation for classical benchmark problems, a quantitative analysis of the more complex magnetized plasma-vacuum layer is presented and discussed. It is found that in this scenario, the finite Larmor orbits of the ions result in significant departures from the effective shear velocity and width underlying the instability growth, leading to generally slower development and stronger nonlinear coupling between fast growing short-wavelength modes and longer wavelengths.

  10. A Comparison of Social Cognitive Profiles in Children with Autism Spectrum Disorders and Attention-Deficit/Hyperactivity Disorder: A Matter of Quantitative but Not Qualitative Difference?

    ERIC Educational Resources Information Center

    Demopoulos, Carly; Hopkins, Joyce; Davis, Amy

    2013-01-01

    The aim of this study was to compare social cognitive profiles of children and adolescents with Autism Spectrum Disorders (ASD) and ADHD. Participants diagnosed with an ASD (n = 137) were compared to participants with ADHD (n = 436) on tests of facial and vocal affect recognition, social judgment and problem-solving, and parent- and teacher-report…

  11. Morphology enabled dipole inversion (MEDI) from a single-angle acquisition: comparison with COSMOS in human brain imaging.

    PubMed

    Liu, Tian; Liu, Jing; de Rochefort, Ludovic; Spincemaille, Pascal; Khalidov, Ildar; Ledoux, James Robert; Wang, Yi

    2011-09-01

    Magnetic susceptibility varies among brain structures and provides insights into the chemical and molecular composition of brain tissues. However, the determination of an arbitrary susceptibility distribution from the measured MR signal phase is a challenging, ill-conditioned inverse problem. Although a previous method named calculation of susceptibility through multiple orientation sampling (COSMOS) has solved this inverse problem both theoretically and experimentally using multiple angle acquisitions, it is often impractical to carry out on human subjects. Recently, the feasibility of calculating the brain susceptibility distribution from a single-angle acquisition was demonstrated using morphology enabled dipole inversion (MEDI). In this study, we further improved the original MEDI method by sparsifying the edges in the quantitative susceptibility map that do not have a corresponding edge in the magnitude image. Quantitative susceptibility maps generated by the improved MEDI were compared qualitatively and quantitatively with those generated by calculation of susceptibility through multiple orientation sampling. The results show a high degree of agreement between MEDI and calculation of susceptibility through multiple orientation sampling, and the practicality of MEDI allows many potential clinical applications. Copyright © 2011 Wiley-Liss, Inc.
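    As background, a commonly cited generic form of a morphology-enabled dipole inversion objective is sketched below; the paper's exact weights, and the edge-sparsifying refinement it introduces, differ in detail.

```latex
% Schematic MEDI-type objective (generic form). chi: susceptibility map,
% d: unit dipole kernel, phi: measured local field, W: noise weighting,
% M_G: binary mask that is zero on magnitude-image edges.
\hat{\chi} \;=\; \arg\min_{\chi} \big\| M_G\, \nabla \chi \big\|_{1}
   \quad \text{subject to} \quad
   \big\| W \left( d \ast \chi - \phi \right) \big\|_{2} \;\le\; \varepsilon.
```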

  12. A note on the estimation of the Pareto efficient set for multiobjective matrix permutation problems.

    PubMed

    Brusco, Michael J; Steinley, Douglas

    2012-02-01

    There are a number of important problems in quantitative psychology that require the identification of a permutation of the n rows and columns of an n × n proximity matrix. These problems encompass applications such as unidimensional scaling, paired-comparison ranking, and anti-Robinson forms. The importance of simultaneously incorporating multiple objective criteria in matrix permutation applications is well recognized in the literature; however, to date, there has been a reliance on weighted-sum approaches that transform the multiobjective problem into a single-objective optimization problem. Although exact solutions to these single-objective problems produce supported Pareto efficient solutions to the multiobjective problem, many interesting unsupported Pareto efficient solutions may be missed. We illustrate the limitation of the weighted-sum approach with an example from the psychological literature and devise an effective heuristic algorithm for estimating both the supported and unsupported solutions of the Pareto efficient set. © 2011 The British Psychological Society.
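    The weighted-sum approach the note critiques can be demonstrated by brute force on a tiny instance, as in the sketch below; the two objectives and the data are invented, and the point is that each weight vector recovers one supported Pareto-efficient permutation while unsupported ones are never the minimizer of any weighted sum.

```python
from itertools import permutations
import numpy as np

# Brute-force weighted-sum scalarization for a tiny biobjective matrix
# permutation problem (invented data and objectives; n! enumeration).

rng = np.random.default_rng(1)
n = 5
A = rng.random((n, n))      # proximity matrix for objective 1
B = rng.random((n, n))      # proximity matrix for objective 2

def criterion(M, perm):
    """Illustrative anti-Robinson-style objective: sum of the
    above-diagonal entries of the permuted matrix."""
    P = M[np.ix_(perm, perm)]
    return P[np.triu_indices(n, k=1)].sum()

def weighted_sum_best(w):
    return min(permutations(range(n)),
               key=lambda p: w * criterion(A, p) + (1.0 - w) * criterion(B, p))

for w in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(w, weighted_sum_best(w))   # each w yields one supported solution
```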

  13. A comparative proteomics method for multiple samples based on a 18O-reference strategy and a quantitation and identification-decoupled strategy.

    PubMed

    Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin

    2017-08-15

Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used. For greater numbers of samples, the label-free method has been used, but this method was criticized for low reproducibility and accuracy. An ingenious strategy has been introduced, comparing each sample against an 18O-labeled reference sample that was created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem for comparative proteomics of multiple samples is the poor coincidence and reproducibility in protein identification results across samples. In the present study, a method combining an 18O-reference strategy and a quantitation and identification-decoupled strategy was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy had greater accuracy and reliability than other previously used comparison methods based on transferring comparison or label-free strategies. By the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated to identify differentially expressed proteins, according to retention time and accurate mass. This strategy made protein identification possible for all samples using a single pooled sample, and therefore gave good reproducibility in protein identification across multiple samples, and allowed peptide identification to be optimized separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.
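    The 18O-reference idea amounts to quantifying every sample against one pooled, labeled reference so that the reference cancels in any pairwise comparison; a toy sketch with invented intensities follows.

```python
import numpy as np

# Sketch of the 18O-reference strategy (invented intensities).

intensities = {                          # per-sample peptide intensities
    "s1": np.array([1.0e6, 4.0e5]),
    "s2": np.array([2.0e6, 1.0e5]),
    "s3": np.array([1.5e6, 4.2e5]),
}
reference = np.mean(list(intensities.values()), axis=0)   # pooled, 18O-labeled

ratios = {k: v / reference for k, v in intensities.items()}   # sample vs reference
fold_s2_vs_s1 = ratios["s2"] / ratios["s1"]                   # the reference cancels
print(fold_s2_vs_s1)
```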

  14. Forecast errors in dust vertical distributions over Rome (Italy): Multiple particle size representation and cloud contributions

    NASA Astrophysics Data System (ADS)

    Kishcha, P.; Alpert, P.; Shtivelman, A.; Krichak, S. O.; Joseph, J. H.; Kallos, G.; Katsafados, P.; Spyrou, C.; Gobbi, G. P.; Barnaba, F.; Nickovic, S.; PéRez, C.; Baldasano, J. M.

    2007-08-01

    In this study, forecast errors in dust vertical distributions were analyzed. This was carried out by using quantitative comparisons between dust vertical profiles retrieved from lidar measurements over Rome, Italy, performed from 2001 to 2003, and those predicted by models. Three models were used: the four-particle-size Dust Regional Atmospheric Model (DREAM), the older one-particle-size version of the SKIRON model from the University of Athens (UOA), and the pre-2006 one-particle-size Tel Aviv University (TAU) model. SKIRON and DREAM are initialized on a daily basis using the dust concentration from the previous forecast cycle, while the TAU model initialization is based on the Total Ozone Mapping Spectrometer aerosol index (TOMS AI). The quantitative comparison shows that (1) the use of four-particle-size bins in the dust modeling instead of only one-particle-size bins improves dust forecasts; (2) cloud presence could contribute to noticeable dust forecast errors in SKIRON and DREAM; and (3) as far as the TAU model is concerned, its forecast errors were mainly caused by technical problems with TOMS measurements from the Earth Probe satellite. As a result, dust forecast errors in the TAU model could be significant even under cloudless conditions. The DREAM versus lidar quantitative comparisons at different altitudes show that the model predictions are more accurate in the middle part of dust layers than in the top and bottom parts of dust layers.

  15. Quantitative validation of an air-coupled ultrasonic probe model by Interferometric laser tomography

    NASA Astrophysics Data System (ADS)

    Revel, G. M.; Pandarese, G.; Cavuto, A.

    2012-06-01

The present paper describes the quantitative validation of a finite element (FE) model of the ultrasound beam generated by an air-coupled non-contact ultrasound transducer. The model boundary conditions are given by vibration velocities measured by laser vibrometry on the probe membrane. The proposed validation method is based on the comparison between the simulated 3D pressure field and the pressure data measured with the interferometric laser tomography technique. The model details and the experimental techniques are described in the paper. The analysis of results shows the effectiveness of the proposed approach and the possibility to quantitatively assess and predict the generated acoustic pressure field, with maximum discrepancies on the order of 20% due to uncertainty effects. This step is important for determining the real applicability of air-coupled probes in complex problems and for the simulation of the whole inspection procedure, also when the component is designed, so as to virtually verify its inspectability.

  16. Traveling Salesman Problem for Surveillance Mission Using Particle Swarm Optimization

    DTIC Science & Technology

    2001-03-20

design of experiments, results of the experiments, and qualitative and quantitative analysis. Conclusions and recommendations based on the qualitative and...characterize the algorithm. Such analysis and comparison between LK and a non-deterministic algorithm produces claims such as "Lin-Kernighan algorithm takes... based on experiments 5 and 6. All other parameters are the same as the baseline (see 4.2.1.2). 4.2.2.6 Experiment 10 - Fine Tuning PSO AS: 85,95% Global

  17. Evaluation of body-wise and organ-wise registrations for abdominal organs

    NASA Astrophysics Data System (ADS)

    Xu, Zhoubing; Panjwani, Sahil A.; Lee, Christopher P.; Burke, Ryan P.; Baucom, Rebeccah B.; Poulose, Benjamin K.; Abramson, Richard G.; Landman, Bennett A.

    2016-03-01

Identifying cross-sectional and longitudinal correspondence in the abdomen on computed tomography (CT) scans is necessary for quantitatively tracking change and understanding population characteristics, yet abdominal image registration is a challenging problem. The key difficulty in solving this problem is the huge variation in organ dimensions and shapes across subjects. The current standard registration method uses the global or body-wise registration technique, which is based on the global topology for alignment. This method (although producing decent results) is substantially influenced by outliers, thus leaving room for significant improvement. Here, we study a new image registration approach using local (organ-wise) registration, first creating organ-specific bounding boxes and then using these regions of interest (ROIs) to align references to the target. Based on the Dice Similarity Coefficient (DSC), Mean Surface Distance (MSD), and Hausdorff Distance (HD), the organ-wise approach is demonstrated to produce significantly better results by minimizing the distorting effects of organ variations. This paper compares the two registration methods exclusively, providing novel quantitative and qualitative comparison data, and is a subset of the more comprehensive problem of improving multi-atlas segmentation by using organ normalization.
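    Of the three metrics named above, the Dice Similarity Coefficient is the simplest to state; an illustrative implementation on binary masks follows.

```python
import numpy as np

# Dice Similarity Coefficient for binary segmentation masks:
# DSC = 2|A ∩ B| / (|A| + |B|).

def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

ref = np.zeros((64, 64), bool); ref[20:40, 20:40] = True   # reference organ mask
seg = np.zeros((64, 64), bool); seg[22:42, 22:42] = True   # registered atlas mask
print(f"DSC = {dice(ref, seg):.3f}")                       # -> 0.810
```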

  18. Application of NIR spectroscopy in the assessment of diabetic foot disorders

    NASA Astrophysics Data System (ADS)

    Schleicher, Eckhard; Hampel, Uwe; Freyer, Richard

    2001-10-01

Diabetic foot syndrome (DFS) is a common sequel of long-term diabetes mellitus. There is an urgent need for noninvasive, objective, and quantitative diagnostic tools to assess tissue viability and perfusion for a successful therapy. NIR spectroscopy seems well qualified to measure local capillary hemoglobin saturation of the outer extremities in patients with progressive diabetic disorders. We investigate how NIR spectroscopy can be applied to the assessment of diabetic foot problems such as neuropathy and angiopathy. Thereby we use spatially resolved spectroscopy in conjunction with a specially developed continuous-wave laser spectrometer. Comparison of intra- and interindividual measurements is expected to yield quantitative measures of local tissue viability, which is a prerequisite for a successful therapy.

  19. Dissociative conceptual and quantitative problem solving outcomes across interactive engagement and traditional format introductory physics

    NASA Astrophysics Data System (ADS)

    McDaniel, Mark A.; Stoen, Siera M.; Frey, Regina F.; Markow, Zachary E.; Hynes, K. Mairin; Zhao, Jiuqing; Cahill, Michael J.

    2016-12-01

    The existing literature indicates that interactive-engagement (IE) based general physics classes improve conceptual learning relative to more traditional lecture-oriented classrooms. Very little research, however, has examined quantitative problem-solving outcomes from IE based relative to traditional lecture-based physics classes. The present study included both pre- and post-course conceptual-learning assessments and a new quantitative physics problem-solving assessment that included three representative conservation of energy problems from a first-semester calculus-based college physics course. Scores for problem translation, plan coherence, solution execution, and evaluation of solution plausibility were extracted for each problem. Over 450 students in three IE-based sections and two traditional lecture sections taught at the same university during the same semester participated. As expected, the IE-based course produced more robust gains on a Force Concept Inventory than did the lecture course. By contrast, when the full sample was considered, gains in quantitative problem solving were significantly greater for lecture than IE-based physics; when students were matched on pre-test scores, there was still no advantage for IE-based physics on gains in quantitative problem solving. Further, the association between performance on the concept inventory and quantitative problem solving was minimal. These results highlight that improved conceptual understanding does not necessarily support improved quantitative physics problem solving, and that the instructional method appears to have less bearing on gains in quantitative problem solving than does the kinds of problems emphasized in the courses and homework and the overlap of these problems to those on the assessment.

  20. Development of a Quantitative Decision Metric for Selecting the Most Suitable Discretization Method for SN Transport Problems

    NASA Astrophysics Data System (ADS)

    Schunert, Sebastian

In this work we develop a quantitative decision metric for spatial discretization methods of the SN equations. The quantitative decision metric utilizes performance data from selected test problems for computing a fitness score that is used for the selection of the most suitable discretization method for a particular SN transport application. The fitness score is aggregated as a weighted geometric mean of single performance indicators representing various performance aspects relevant to the user. Thus, the fitness function can be adjusted to the particular needs of the code practitioner by adding/removing single performance indicators or changing their importance via the supplied weights. Within this work a special, broad class of methods is considered, referred to as nodal methods. This class is naturally comprised of the DGFEM methods of all function space families. Within this work it is also shown that the Higher Order Diamond Difference (HODD) method is a nodal method. Building on earlier findings that the Arbitrarily High Order Method of the Nodal type (AHOTN) is also a nodal method, a generalized finite-element framework is created to yield as special cases various methods that were developed independently using profoundly different formalisms. A selection of test problems related to a certain performance aspect is considered: a Method of Manufactured Solutions (MMS) test suite for assessing accuracy and execution time, Lathrop's test problem for assessing resilience against the occurrence of negative fluxes, and a simple, homogeneous cube test problem to verify if a method possesses the thick diffusive limit. The contending methods are implemented as efficiently as possible under a common SN transport code framework to level the playing field for a fair comparison of their computational load. Numerical results are presented for all three test problems and a qualitative rating of each method's performance is provided for each aspect: accuracy/efficiency, resilience against negative fluxes, and possession of the thick diffusion limit, separately. The choice of the most efficient method depends on the utilized error norm: in Lp error norms higher order methods such as the AHOTN method of order three perform best, while for computing integral quantities the linear nodal (LN) method is most efficient. The most resilient method against occurrence of negative fluxes is the simple corner balance (SCB) method. A validation of the quantitative decision metric is performed based on the NEA box-in-box suite of test problems. The validation exercise comprises two stages: first prediction of the contending methods' performance via the decision metric and second computing the actual scores based on data obtained from the NEA benchmark problem. The comparison of predicted and actual scores via a penalty function (ratio of predicted best performer's score to actual best score) completes the validation exercise. It is found that the decision metric is capable of very accurate predictions (penalty < 10%) in more than 83% of the considered cases and features penalties up to 20% for the remaining cases. An exception to this rule is the third test case NEA-III intentionally set up to incorporate a poor match of the benchmark with the "data" problems. However, even under these worst case conditions the decision metric's suggestions are never detrimental.
Suggestions for improving the decision metric's accuracy are to increase the pool of employed data, to refine the mapping of a given configuration to a case in the database, and to better characterize the desired target quantities.
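    The weighted geometric mean aggregation described at the start of the abstract can be sketched directly; the indicator names, scores, and weights below are invented.

```python
import math

# Fitness score as a weighted geometric mean of single performance
# indicators (invented indicators and weights).

def fitness(scores, weights):
    """(prod s_i^{w_i})^(1/sum w_i); a zero indicator zeroes the score,
    which penalizes methods that fail outright on one aspect."""
    if any(scores[k] == 0.0 for k in weights):
        return 0.0
    total_w = sum(weights.values())
    log_sum = sum(w * math.log(scores[k]) for k, w in weights.items())
    return math.exp(log_sum / total_w)

scores  = {"accuracy": 0.9, "speed": 0.6, "positivity": 1.0, "thick_limit": 1.0}
weights = {"accuracy": 2.0, "speed": 1.0, "positivity": 1.0, "thick_limit": 1.0}
print(f"fitness = {fitness(scores, weights):.3f}")
```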

  1. Implications of the Java language on computer-based patient records.

    PubMed

    Pollard, D; Kucharz, E; Hammond, W E

    1996-01-01

    The growth of the utilization of the World Wide Web (WWW) as a medium for the delivery of computer-based patient records (CBPR) has created a new paradigm in which clinical information may be delivered. Until recently the authoring tools and environment for application development on the WWW have been limited to Hyper Text Markup Language (HTML) utilizing common gateway interface scripts. While, at times, this provides an effective medium for the delivery of CBPR, it is a less than optimal solution. The server-centric dynamics and low levels of interactivity do not provide for a robust application which is required in a clinical environment. The emergence of Sun Microsystems' Java language is a solution to the problem. In this paper we examine the Java language and its implications to the CBPR. A quantitative and qualitative assessment was performed. The Java environment is compared to HTML and Telnet CBPR environments. Qualitative comparisons include level of interactivity, server load, client load, ease of use, and application capabilities. Quantitative comparisons include data transfer time delays. The Java language has demonstrated promise for delivering CBPRs.

  2. Telerobotics - Display, control, and communication problems

    NASA Technical Reports Server (NTRS)

    Stark, Lawrence; Kim, Won-Soo; Tendick, Frank; Hannaford, Blake; Ellis, Stephen

    1987-01-01

An experimental telerobotics simulation is described suitable for studying human operator (HO) performance. Simple manipulator pick-and-place and tracking tasks allowed quantitative comparison of a number of calligraphic display viewing conditions. An enhanced perspective display was effective with a reference line from target to base, with or without a complex three-dimensional grid framing the view. This was true especially if geometrical display parameters such as azimuth and elevation were arranged to be near optimal. Quantitative comparisons were made possible, utilizing control performance measures such as root mean square error. There was a distinct preference for controlling the manipulator in end-effector Cartesian space for the primitive pick-and-place task, rather than controlling joint angles and then, via direct kinematics, the end-effector position. An introduced communication delay was found to produce a decrease in performance. In considerable part, this difficulty could be compensated for by preview control information. The fact that neurological control of normal human movement contains a sampled data period of 0.2 s may relate to this robustness of HO control to delay.

  3. Psychosocial Treatment Efficacy for Disruptive Behavior Problems in Very Young Children: A Meta-Analytic Examination

    PubMed Central

    Comer, Jonathan S.; Chow, Candice; Chan, Priscilla T.; Cooper-Vince, Christine; Wilson, Lianna A.S.

    2012-01-01

Objective Service use trends showing increased off-label prescribing in very young children and reduced psychotherapy use raise concerns about quality of care for early disruptive behavior problems. Meta-analysis can empirically clarify best practices and guide clinical decision making by providing a quantitative synthesis of a body of literature, identifying the magnitude of overall effects across studies, and determining systematic factors associated with effect variations. Method We used random-effects meta-analytic procedures to empirically evaluate the overall effect of psychosocial treatments on early disruptive behavior problems, as well as potential moderators of treatment response. Thirty-six controlled trials, evaluating 3,042 children, met selection criteria (mean sample age, 4.7 years; 72.0% male; 33.1% minority youth). Results Psychosocial treatments collectively demonstrated a large and sustained effect on early disruptive behavior problems (Hedges’ g = 0.82), with the largest effects associated with behavioral treatments (Hedges’ g = 0.88), samples with higher proportions of older and male youth, and comparisons against treatment as usual (Hedges’ g = 1.17). Across trials, effects were largest for general externalizing problems (Hedges’ g = 0.90) and problems of oppositionality and noncompliance (Hedges’ g = 0.76), and were weakest, relatively speaking, for problems of impulsivity and hyperactivity (Hedges’ g = 0.61). Conclusions In the absence of controlled trials evaluating psychotropic interventions, findings provide robust quantitative support that psychosocial treatments should constitute first-line treatment for early disruptive behavior problems. Against a backdrop of concerning trends in the availability and use of supported interventions, findings underscore the urgency of improving dissemination efforts for supported psychosocial treatment options, and removing systematic barriers to psychosocial care for affected youth. PMID:23265631
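    Hedges' g, the effect size aggregated throughout this meta-analysis, is the standardized mean difference corrected for small-sample bias; a sketch with invented group statistics follows.

```python
import math

# Hedges' g: Cohen's d with the small-sample correction factor J.

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    df = n1 + n2 - 2
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    d = (m1 - m2) / s_pooled
    j = 1.0 - 3.0 / (4.0 * df - 1.0)     # bias correction
    return j * d

# Invented example: control vs. treated scores on a problem-behavior scale,
# ordered so that a positive g favors the treatment.
print(f"g = {hedges_g(m1=16.5, sd1=5.5, n1=42, m2=12.0, sd2=5.0, n2=40):.2f}")
```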

  4. On the Coplanar Integrable Case of the Twice-Averaged Hill Problem with Central Body Oblateness

    NASA Astrophysics Data System (ADS)

    Vashkov'yak, M. A.

    2018-01-01

    The twice-averaged Hill problem with the oblateness of the central planet is considered in the case where its equatorial plane coincides with the plane of its orbital motion relative to the perturbing body. A qualitative study of this so-called coplanar integrable case was begun by Y. Kozai in 1963 and continued by M.L. Lidov and M.V. Yarskaya in 1974. However, no rigorous analytical solution of the problem can be obtained due to the complexity of the integrals. In this paper we obtain some quantitative evolution characteristics and propose an approximate constructive-analytical solution of the evolution system in the form of explicit time dependences of satellite orbit elements. The methodical accuracy has been estimated for several orbits of artificial lunar satellites by comparison with the numerical solution of the evolution system.

  5. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems which involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations with computationally expensive analyses including multiple disciplines. We propose a new testing procedure that can provide a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework, OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how using open source software development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  6. The photospheric magnetic flux budget

    NASA Technical Reports Server (NTRS)

    Schrijver, C. J.; Harvey, K. L.

    1994-01-01

    The ensemble of bipolar regions and the magnetic network both contain a substantial and strongly variable part of the photospheric magnetic flux at any phase in the solar cycle. The time-dependent distribution of the magnetic flux over and within these components reflects the action of the dynamo operating in the solar interior. We perform a quantitative comparison of the flux emerging in the ensemble of magnetic bipoles with the observed flux content of the solar photosphere. We discuss the photospheric flux budget in terms of flux appearance and disappearance, and argue that a nonlinear dependence exists between the flux present in the photosphere and the rate of flux appearance and disappearance. In this context, we discuss the problem of making quantitative statements about dynamos in cool stars other than the Sun.

  7. Objectivity and reliability in qualitative analysis: realist, contextualist and radical constructionist epistemologies.

    PubMed

    Madill, A; Jordan, A; Shirley, C

    2000-02-01

    The effect of the individual analyst on research findings can create a credibility problem for qualitative approaches from the perspective of evaluative criteria utilized in quantitative psychology. This paper explicates the ways in which objectivity and reliability are understood in qualitative analysis conducted from within three distinct epistemological frameworks: realism, contextual constructionism, and radical constructionism. It is argued that quality criteria utilized in quantitative psychology are appropriate to the evaluation of qualitative analysis only to the extent that it is conducted within a naive or scientific realist framework. The discussion is illustrated with reference to the comparison of two independent grounded theory analyses of identical material. An implication of this illustration is to identify the potential to develop a radical constructionist strand of grounded theory.

  8. A quantitative comparison of corrective and perfective maintenance

    NASA Technical Reports Server (NTRS)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  9. Summary of Work for Joint Research Interchanges with DARWIN Integrated Product Team 1998

    NASA Technical Reports Server (NTRS)

    Hesselink, Lambertus

    1999-01-01

    The intent of Stanford University's SciVis group is to develop technologies that enable comparative analysis and visualization techniques for simulated and experimental flow fields. These techniques would then be made available under the Joint Research Interchange for potential injection into the DARWIN Workspace Environment (DWE). In the past, we have focused on techniques that exploit feature-based comparisons such as shock and vortex extractions. Our current research effort focuses on a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison, which is often a problem with many data comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be searched quickly. This report will briefly (1) describe current technologies in the area of comparison techniques, (2) describe the theory of our new method, and (3) summarize a few of the results.
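
    A minimal sketch of the idea behind topology-based comparison: locate and classify the critical points of each sampled 2D vector field, then compare the fields through those feature sets rather than point-by-point values. The detection and classification scheme below is a generic textbook approach, not necessarily the report's algorithm.

    ```python
    import numpy as np

    def critical_points(u, v, dx=1.0):
        """Locate cells of a sampled 2D field (u, v) that contain a zero of
        the field and classify each by the eigenvalues of the local Jacobian."""
        found = []
        for i in range(u.shape[0] - 1):        # i indexes y, j indexes x
            for j in range(u.shape[1] - 1):
                cu, cv = u[i:i+2, j:j+2], v[i:i+2, j:j+2]
                if cu.min() < 0 < cu.max() and cv.min() < 0 < cv.max():
                    J = np.array([
                        [(u[i, j+1] - u[i, j]) / dx, (u[i+1, j] - u[i, j]) / dx],
                        [(v[i, j+1] - v[i, j]) / dx, (v[i+1, j] - v[i, j]) / dx],
                    ])                          # rows: [d/dx, d/dy] of u and v
                    ev = np.linalg.eigvals(J)
                    if np.all(ev.real > 0):
                        kind = 'source'
                    elif np.all(ev.real < 0):
                        kind = 'sink'
                    elif ev.real.min() < 0 < ev.real.max():
                        kind = 'saddle'
                    else:
                        kind = 'center'
                    found.append((i, j, kind))
        return found

    # Demo: the saddle field u = x, v = -y on a grid avoiding exact zeros.
    y, x = np.mgrid[-1:1:20j, -1:1:20j]
    print(critical_points(x, -y))   # one saddle near the grid centre
    ```

    Two fields are then compared through these feature sets (type, position, connectivity), which is why no grid matching or vector alignment is required.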

  10. Summary of Work for Joint Research Interchanges with DARWIN Integrated Product Team

    NASA Technical Reports Server (NTRS)

    Hesselink, Lambertus

    1999-01-01

    The intent of Stanford University's SciVis group is to develop technologies that enable comparative analysis and visualization techniques for simulated and experimental flow fields. These techniques would then be made available under the Joint Research Interchange for potential injection into the DARWIN Workspace Environment (DWE). In the past, we have focused on techniques that exploit feature-based comparisons such as shock and vortex extractions. Our current research effort focuses on a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison, which is often a problem with many data comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be searched quickly. This report will briefly (1) describe current technologies in the area of comparison techniques, (2) describe the theory of our new method, and (3) summarize a few of the results.

  11. The importance of the boundary condition in the transport of intensity equation based phase measurement

    NASA Astrophysics Data System (ADS)

    Zhang, Jialin; Chen, Qian; Li, Jiaji; Zuo, Chao

    2017-02-01

    The transport of intensity equation (TIE) is a powerful tool for direct quantitative phase retrieval in microscopy imaging, but handling its boundary conditions can be problematic. Previous work introduced a hard-edged aperture at the camera port of a conventional bright-field microscope to generate the boundary signal for the TIE solver. Under this Neumann boundary condition, the quantitative phase can be obtained without any assumption or prior knowledge about the test object or the setup. In this paper, we demonstrate the effectiveness of this method through practical experiments. A microlens array is used to compare the results of the two TIE solvers, with and without the introduced aperture, and this accurate quantitative phase imaging technique allows measurement of cell dry mass, which is used in biology to follow the cell cycle, investigate cell metabolism, or assess the effects of drugs.
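
    For reference, the TIE relates the axial intensity derivative to the transverse phase gradient. In the standard form used throughout the quantitative phase imaging literature,

        $$ -k \frac{\partial I(\mathbf{r})}{\partial z} = \nabla_{\perp} \cdot \left[ I(\mathbf{r}) \, \nabla_{\perp} \phi(\mathbf{r}) \right] $$

    where $k$ is the wavenumber, $I$ the in-focus intensity and $\phi$ the phase to be recovered. Solving this elliptic equation for $\phi$ requires boundary data, which is exactly where the hard-edged aperture enters.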

  12. The Application of SILAC Mouse in Human Body Fluid Proteomics Analysis Reveals Protein Patterns Associated with IgA Nephropathy.

    PubMed

    Zhao, Shilin; Li, Rongxia; Cai, Xiaofan; Chen, Wanjia; Li, Qingrun; Xing, Tao; Zhu, Wenjie; Chen, Y Eugene; Zeng, Rong; Deng, Yueyi

    2013-01-01

    The body fluid proteome is the most informative proteome from a medical viewpoint, but the lack of an accurate quantitation method for complicated body fluids has limited its application in disease research and biomarker discovery. To address this problem, we introduced a novel strategy in which SILAC-labeled mouse serum was used as an internal standard for human serum and urine proteome analysis. The SILAC-labeled mouse serum was mixed with human serum and urine, and multidimensional separation coupled with tandem mass spectrometry (IEF-LC-MS/MS) analysis was performed. Peptides shared between the two species were quantified by their SILAC pairs, and human-only peptides were quantified by coeluting mouse peptides. Comparison of the results from two replicate experiments indicated the high repeatability of our strategy. Urine from treated and untreated immunoglobulin A nephropathy (IgAN) patients was then compared using this quantitation strategy. Fifty-three peptides were found to be significantly changed between the two groups, including both known diagnostic markers for IgAN and novel candidates such as complement C3, albumin, VDBP, ApoA1 and IGFBP7. In conclusion, we have developed a practical and accurate quantitation strategy for comparison of complicated human body fluid proteomes. The results from such a strategy could provide potential disease-related biomarkers for evaluation of treatment.

  13. Application of real-time PCR for total airborne bacterial assessment: Comparison with epifluorescence microscopy and culture-dependent methods

    NASA Astrophysics Data System (ADS)

    Rinsoz, Thomas; Duquenne, Philippe; Greff-Mirguet, Guylaine; Oppliger, Anne

    Traditional culture-dependent methods to quantify and identify airborne microorganisms are limited by factors such as short-duration sampling times and the inability to count non-culturable or non-viable bacteria. Consequently, the quantitative assessment of bioaerosols is often underestimated. Use of the real-time quantitative polymerase chain reaction (Q-PCR) to quantify bacteria in environmental samples presents an alternative method that should overcome this problem. The aim of this study was to evaluate the performance of a real-time Q-PCR assay as a simple and reliable way to quantify the airborne bacterial load within poultry houses and sewage treatment plants, in comparison with epifluorescence microscopy and culture-dependent methods. The estimates of bacterial load obtained from real-time PCR and epifluorescence methods are comparable; however, our analysis of sewage treatment plants indicates that these methods give values 270-290-fold greater than those obtained by the "impaction on nutrient agar" method. The culture-dependent method of air impaction on nutrient agar was also inadequate in poultry houses, as was the impinger-culture method, which gave a bacterial load estimate 32-fold lower than that obtained by Q-PCR. Real-time quantitative PCR thus proves to be a reliable, discerning, and simple method that could be used to estimate airborne bacterial load in a broad variety of other environments expected to carry high numbers of airborne bacteria.

  14. A benchmark for comparison of dental radiography analysis algorithms.

    PubMed

    Wang, Ching-Wei; Huang, Cheng-Ta; Lee, Jia-Hong; Li, Chung-Hsing; Chang, Sheng-Wei; Siao, Ming-Jhih; Lai, Tat-Ming; Ibragimov, Bulat; Vrtovec, Tomaž; Ronneberger, Olaf; Fischer, Philipp; Cootes, Tim F; Lindner, Claudia

    2016-07-01

    Dental radiography plays an important role in clinical diagnosis, treatment and surgery. In recent years, efforts have been made in developing computerized dental X-ray image analysis systems for clinical use. A novel framework for objective evaluation of automatic dental radiography analysis algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2015 Bitewing Radiography Caries Detection Challenge and Cephalometric X-ray Image Analysis Challenge. In this article, we present the datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark. The main contributions of the challenge include the creation of the dental anatomy data repository of bitewing radiographs, the creation of the anatomical abnormality classification data repository of cephalometric radiographs, and the definition of objective quantitative evaluation for comparison and ranking of the algorithms. With this benchmark, seven automatic methods for analysing cephalometric X-ray images and two automatic methods for detecting caries in bitewing radiographs have been compared, and detailed quantitative evaluation results are presented in this paper. Based on the quantitative evaluation results, we believe automatic dental radiography analysis is still a challenging and unsolved problem. The datasets and the evaluation software will be made available to the research community, further encouraging future developments in this field. (http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/). Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  15. Abortion and mental health: quantitative synthesis and analysis of research published 1995-2009.

    PubMed

    Coleman, Priscilla K

    2011-09-01

    Given the methodological limitations of recently published qualitative reviews of abortion and mental health, a quantitative synthesis was deemed necessary to represent the published literature more accurately and to provide clarity to clinicians. The aim was to measure the association between abortion and indicators of adverse mental health, with subgroup effects calculated based on comparison groups (no abortion, unintended pregnancy delivered, pregnancy delivered) and particular outcomes. A secondary objective was to calculate population-attributable risk (PAR) statistics for each outcome. After the application of methodologically based selection criteria and extraction rules to minimise bias, the sample comprised 22 studies, 36 measures of effect and 877 181 participants (163 831 of whom had experienced an abortion). Random-effects pooled odds ratios were computed using adjusted odds ratios from the original studies, and PAR statistics were derived from the pooled odds ratios. Women who had undergone an abortion experienced an 81% increased risk of mental health problems, and nearly 10% of the incidence of mental health problems was shown to be attributable to abortion. The strongest subgroup estimates of increased risk occurred when abortion was compared with term pregnancy and when the outcomes pertained to substance use and suicidal behaviour. This review offers the largest quantitative estimate of mental health risks associated with abortion available in the world literature. Calling into question the conclusions from traditional reviews, the results revealed a moderate to highly increased risk of mental health problems after abortion. Consistent with the tenets of evidence-based medicine, this information should inform the delivery of abortion services.
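
    For readers unfamiliar with the PAR step, a standard way to derive a population-attributable risk from a pooled odds ratio is the Levin-type formula (a generic epidemiological construction, with the odds ratio standing in for the relative risk; the review's exact computation may differ):

        $$ \mathrm{PAR} = \frac{p\,(\mathrm{OR} - 1)}{p\,(\mathrm{OR} - 1) + 1} $$

    where $p$ is the prevalence of exposure in the population. With $p \approx 0.19$ (the sample's abortion fraction) and $\mathrm{OR} = 1.81$, this gives $\mathrm{PAR} \approx 0.13$, the same order as the reported figure of nearly 10%.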

  16. Computation of Nonlinear Backscattering Using a High-Order Numerical Method

    NASA Technical Reports Server (NTRS)

    Fibich, G.; Ilan, B.; Tsynkov, S.

    2001-01-01

    The nonlinear Schrödinger equation (NLS) is the standard model for propagation of intense laser beams in Kerr media. The NLS is derived from the nonlinear Helmholtz equation (NLH) by employing the paraxial approximation and neglecting the backscattered waves. In this study we use a fourth-order finite-difference method supplemented by special two-way artificial boundary conditions (ABCs) to solve the NLH as a boundary value problem. Our numerical methodology allows for a direct comparison of the NLH and NLS models and for an accurate quantitative assessment of the backscattered signal.
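
    To make the relation between the two models explicit (standard notation; the scaling is illustrative rather than the paper's exact nondimensionalization): the NLH for a scalar field $E$,

        $$ \Delta E + k_0^2 \left( 1 + \epsilon |E|^2 \right) E = 0, $$

    reduces, after substituting $E = \psi(\mathbf{x}_{\perp}, z)\, e^{i k_0 z}$ and neglecting the $\psi_{zz}$ term (the paraxial approximation, which is also what discards backscattering), to the NLS

        $$ 2 i k_0 \, \psi_z + \Delta_{\perp} \psi + k_0^2\, \epsilon\, |\psi|^2 \psi = 0. $$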

  17. Absolute Helmholtz free energy of highly anharmonic crystals: theory vs Monte Carlo.

    PubMed

    Yakub, Lydia; Yakub, Eugene

    2012-04-14

    We discuss the problem of the quantitative theoretical prediction of the absolute free energy of classical, highly anharmonic solids. The Helmholtz free energy of the Lennard-Jones (LJ) crystal is calculated accurately while accounting for both the anharmonicity of atomic vibrations and the pair and triple correlations in displacements of the atoms from their lattice sites. Comparison with the most precise computer simulation data on the sublimation and melting lines revealed that the theoretical predictions are in excellent agreement with Monte Carlo simulation data over the whole range of temperatures and densities studied.

  18. Multivariate Qst–Fst Comparisons: A Neutrality Test for the Evolution of the G Matrix in Structured Populations

    PubMed Central

    Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme

    2008-01-01

    Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
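
    A minimal numerical sketch of the neutral expectation: given MANOVA estimates of the among-population matrix D, the within-population matrix G and a marker-based Fst, check whether D ≈ [2Fst/(1 − Fst)]G. This is a toy least-squares check, not Flury's CPC framework or the Bartlett-adjusted statistics used in the article.

    ```python
    import numpy as np

    def neutral_proportionality(D, G, fst):
        """Compare the empirical proportionality coefficient between D and G
        with its neutral expectation 2*Fst/(1 - Fst)."""
        expected = 2.0 * fst / (1.0 - fst)
        # Least-squares scalar c minimizing ||D - c*G||_F.
        c_hat = np.trace(D @ G.T) / np.trace(G @ G.T)
        # Relative residual after removing the proportional part, as a rough
        # measure of departure from proportionality itself.
        resid = np.linalg.norm(D - c_hat * G) / np.linalg.norm(D)
        return c_hat, expected, resid

    # Toy example: three traits, neutral-like data with Fst = 0.2.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(3, 3))
    G = A @ A.T        # a positive-definite within-population matrix
    D = 0.5 * G        # exactly 2*0.2/(1 - 0.2) = 0.5 times G
    print(neutral_proportionality(D, G, 0.2))   # c_hat ~ 0.5, expected 0.5, resid ~ 0
    ```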

  19. Developing Seventh Grade Students' Understanding of Complex Environmental Problems with Systems Tools and Representations: a Quasi-experimental Study

    NASA Astrophysics Data System (ADS)

    Doganca Kucuk, Zerrin; Saysel, Ali Kerem

    2017-03-01

    A systems-based classroom intervention on environmental education was designed for seventh grade students; the results were evaluated to assess its impact on the development of systems thinking skills and standard science achievement, and to determine whether the systems approach is a more effective way to teach environmental issues that are dynamic and complex. A quasi-experimental methodology was used to compare performances of the participants in various dimensions, including systems thinking skills, competence in dynamic environmental problem solving and success in science achievement tests. The same pre-, post- and delayed tests were used with both the comparison and experimental groups in the same public middle school in Istanbul. Classroom activities designed for the comparison group (N = 20) followed the directives of the Science and Technology Curriculum, while the experimental group (N = 22) covered the same subject matter through activities benefiting from systems tools and representations such as behaviour over time graphs, causal loop diagrams, stock-flow structures and hands-on dynamic modelling. After a one-month systems-based instruction, the experimental group demonstrated significantly better systems thinking and dynamic environmental problem solving skills. Achievement in dynamic problem solving was found to be relatively stable over time. However, standard science achievement did not improve at all. This paper focuses on the quantitative analysis of the results, the weaknesses of the curriculum and educational implications.

  20. Arithmetic learning with the use of graphic organiser

    NASA Astrophysics Data System (ADS)

    Sai, F. L.; Shahrill, M.; Tan, A.; Han, S. H.

    2018-01-01

    For this study, Zollman’s four corners-and-a-diamond mathematics graphic organiser, embedded with Polya’s Problem Solving Model, was used to investigate secondary school students’ performance in arithmetic word problems. This instructional learning tool was used to help students break down the given information into smaller units for better strategic planning. The participants were Year 7 students, comprising 21 male and 20 female students aged between 11 and 13 years old, from a co-ed secondary school in Brunei Darussalam. This study mainly adopted a quantitative approach to investigate the types of differences found in the arithmetic word problem pre- and post-test results from the use of the learning tool. Although the findings revealed slight improvements in the overall comparisons of the students’ test results, the in-depth analysis of the students’ responses in their activity worksheets shows a different outcome. Some students were able to make good attempts at breaking down the key points into smaller pieces of information in order to solve the word problems.

  1. On Quantitative Comparative Research in Communication and Language Evolution

    PubMed Central

    Oller, D. Kimbrough; Griebel, Ulrike

    2014-01-01

    Quantitative comparison of human language and natural animal communication requires improved conceptualizations. We argue that an infrastructural approach to development and evolution incorporating an extended interpretation of the distinctions among illocution, perlocution, and meaning (Austin 1962; Oller and Griebel 2008) can help place the issues relevant to quantitative comparison in perspective. The approach can illuminate the controversy revolving around the notion of functional referentiality as applied to alarm calls, for example in the vervet monkey. We argue that referentiality offers a poor point of quantitative comparison across language and animal communication in the wild. Evidence shows that even newborn human cry could be deemed to show functional referentiality according to the criteria typically invoked by advocates of referentiality in animal communication. Exploring the essence of the idea of illocution, we illustrate an important realm of commonality among animal communication systems and human language, a commonality that opens the door to more productive, quantifiable comparisons. Finally, we delineate two examples of infrastructural communicative capabilities that should be particularly amenable to direct quantitative comparison across humans and our closest relatives. PMID:25285057

  2. On Quantitative Comparative Research in Communication and Language Evolution.

    PubMed

    Oller, D Kimbrough; Griebel, Ulrike

    2014-09-01

    Quantitative comparison of human language and natural animal communication requires improved conceptualizations. We argue that an infrastructural approach to development and evolution incorporating an extended interpretation of the distinctions among illocution, perlocution, and meaning (Austin 1962; Oller and Griebel 2008) can help place the issues relevant to quantitative comparison in perspective. The approach can illuminate the controversy revolving around the notion of functional referentiality as applied to alarm calls, for example in the vervet monkey. We argue that referentiality offers a poor point of quantitative comparison across language and animal communication in the wild. Evidence shows that even newborn human cry could be deemed to show functional referentiality according to the criteria typically invoked by advocates of referentiality in animal communication. Exploring the essence of the idea of illocution, we illustrate an important realm of commonality among animal communication systems and human language, a commonality that opens the door to more productive, quantifiable comparisons. Finally, we delineate two examples of infrastructural communicative capabilities that should be particularly amenable to direct quantitative comparison across humans and our closest relatives.

  3. Leaf epidermis images for robust identification of plants

    PubMed Central

    da Silva, Núbia Rosa; Oliveira, Marcos William da Silva; Filho, Humberto Antunes de Almeida; Pinheiro, Luiz Felipe Souza; Rossatto, Davi Rodrigo; Kolb, Rosana Marta; Bruno, Odemir Martinez

    2016-01-01

    This paper proposes a methodology for plant analysis and identification based on extracting texture features from microscopic images of leaf epidermis. All the experiments were carried out using 32 plant species, with 309 epidermal samples captured by an optical microscope coupled to a digital camera. The results of the computational methods using texture features were compared to the conventional approach, in which quantitative measurements of stomatal traits (density, length and width) were obtained manually. Epidermis image classification using texture achieved a success rate of over 96%, while the success rate was around 60% for the manual quantitative measurements. Furthermore, we verified the robustness of our method with respect to the natural phenotypic plasticity of stomata, analysing samples from the same species grown in different environments. Texture methods remained robust even under phenotypic plasticity of stomatal traits, with a decrease of 20% in the success rate, whereas the manual quantitative measurements proved highly sensitive, with a decrease of 77%. Results from the comparison between the computational approach and the conventional quantitative measurements show how computational systems are advantageous and promising for solving problems in botany, such as species identification. PMID:27217018
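
    A minimal sketch of texture-based classification in the spirit described: compute grey-level co-occurrence (GLCM) statistics per epidermis image and classify by nearest neighbour. The feature set and classifier here are generic choices, not the paper's exact pipeline.

    ```python
    import numpy as np

    def glcm_features(img, levels=16):
        """Haralick-style features from a horizontal co-occurrence matrix."""
        q = (img.astype(float) / img.max() * (levels - 1)).astype(int)
        glcm = np.zeros((levels, levels))
        for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):   # offset (0, 1)
            glcm[a, b] += 1
        glcm /= glcm.sum()
        i, j = np.indices(glcm.shape)
        contrast = np.sum(glcm * (i - j) ** 2)
        energy = np.sum(glcm ** 2)
        homogeneity = np.sum(glcm / (1.0 + np.abs(i - j)))
        return np.array([contrast, energy, homogeneity])

    def nearest_neighbour(train_feats, train_labels, feat):
        d = np.linalg.norm(train_feats - feat, axis=1)
        return train_labels[int(np.argmin(d))]

    # Demo on a synthetic 8-bit "image".
    img = np.random.default_rng(0).integers(0, 256, size=(64, 64))
    print(glcm_features(img))
    ```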

  4. Effect of Scaffolding on Helping Introductory Physics Students Solve Quantitative Problems Involving Strong Alternative Conceptions

    ERIC Educational Resources Information Center

    Lin, Shih-Yin; Singh, Chandralekha

    2015-01-01

    It is well known that introductory physics students often have alternative conceptions that are inconsistent with established physical principles and concepts. Invoking alternative conceptions in the quantitative problem-solving process can derail the entire process. In order to help students solve quantitative problems involving strong…

  5. A Comparison of Two Types of Social Support for Mothers of Mentally Ill Children

    PubMed Central

    Scharer, Kathleen; Colon, Eileen; Moneyham, Linda; Hussey, Jim; Tavakoli, Abbas; Shugart, Margaret

    2009-01-01

    PROBLEM The purpose of this analysis was to compare social support offered by two telehealth nursing interventions for mothers of children with serious mental illnesses. METHODS A randomized, controlled, quantitative investigation is underway to test two support interventions, using the telephone (TSS) or Internet (WEB). Qualitative description was used to analyze data generated during telehealth interventions. FINDINGS The behaviors and attitudes of children were challenging for the mothers to manage. Mothers’ emotional reactions included fear, frustration, concern, and guilt. They sought to be advocates for their children. The nurses provided emotional, informational, and appraisal support. TSS mothers were passive recipients, while WEB mothers had to choose to participate. CONCLUSIONS Mothers in both interventions shared similar concerns and sought support related to their child’s problems. PMID:19490279

  6. Comparison of three-way and four-way calibration for the real-time quantitative analysis of drug hydrolysis in complex dynamic samples by excitation-emission matrix fluorescence.

    PubMed

    Yin, Xiao-Li; Gu, Hui-Wen; Liu, Xiao-Lu; Zhang, Shan-Hui; Wu, Hai-Long

    2018-03-05

    Multiway calibration in combination with spectroscopic techniques is an attractive tool for online or real-time monitoring of target analyte(s) in complex samples. However, how to choose a suitable multiway calibration method for the resolution of spectroscopic-kinetic data is a vexing problem in practical applications. In this work, for the first time, three-way and four-way fluorescence-kinetic data arrays were generated during the real-time monitoring of the hydrolysis of irinotecan (CPT-11) in human plasma by excitation-emission matrix fluorescence. Alternating normalization-weighted error (ANWE) and alternating penalty trilinear decomposition (APTLD) were used as three-way calibration methods for the decomposition of the three-way kinetic data array, whereas alternating weighted residual constraint quadrilinear decomposition (AWRCQLD) and alternating penalty quadrilinear decomposition (APQLD) were applied as four-way calibration methods to the four-way kinetic data array. The quantitative results of the two kinds of calibration models were fully compared in terms of predicted real-time concentrations, spiked recoveries of initial concentration, and analytical figures of merit. The comparison demonstrated that both three-way and four-way calibration models can achieve real-time quantitative analysis of the hydrolysis of CPT-11 in human plasma under certain conditions, but each possesses distinct advantages and shortcomings in dynamic analysis. The conclusions obtained in this paper provide helpful guidance for the reasonable selection of multiway calibration models for real-time quantitative analysis of target analyte(s) in complex dynamic systems. Copyright © 2017 Elsevier B.V. All rights reserved.
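
    The decompositions named above (APTLD, AWRCQLD, etc.) are constrained or penalized variants of alternating least-squares multilinear decomposition; a bare-bones unconstrained trilinear ALS (PARAFAC) sketch conveys the shared core. This is the generic algorithm, not the specific penalized methods used in the paper.

    ```python
    import numpy as np

    def parafac_als(X, rank, n_iter=200, seed=0):
        """Unconstrained trilinear (PARAFAC) decomposition of a 3-way array X
        (e.g. samples x excitation x emission) by alternating least squares."""
        rng = np.random.default_rng(seed)
        I, J, K = X.shape
        A = rng.normal(size=(I, rank))
        B = rng.normal(size=(J, rank))
        C = rng.normal(size=(K, rank))
        for _ in range(n_iter):
            # Each update solves a linear least-squares problem for one factor.
            A = np.einsum('ijk,jr,kr->ir', X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
            B = np.einsum('ijk,ir,kr->jr', X, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
            C = np.einsum('ijk,ir,jr->kr', X, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
        return A, B, C

    # In second-order calibration, the columns of one recovered factor play the
    # role of relative analyte concentration profiles tracked over reaction time.
    ```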

  7. Haemodynamics of giant cerebral aneurysm: A comparison between the rigid-wall, one-way and two-way FSI models

    NASA Astrophysics Data System (ADS)

    Khe, A. K.; Cherevko, A. A.; Chupakhin, A. P.; Bobkova, M. S.; Krivoshapkin, A. L.; Orlov, K. Yu

    2016-06-01

    In this paper a computer simulation of blood flow in cerebral vessels with a giant saccular aneurysm at the bifurcation of the basilar artery is performed. The modelling is based on patient-specific clinical data (both flow domain geometry and boundary conditions for the inlets and outlets). The hydrodynamic and mechanical parameters are calculated in the frameworks of three models: the rigid-wall assumption, a one-way FSI approach, and the full (two-way) hydroelastic model. A comparison of the numerical solutions shows that mutual fluid-solid interaction can result in qualitative changes in the structure of the fluid flow. Other characteristics of the flow (pressure, stress, strain and displacement) qualitatively agree with each other across the different approaches. However, the quantitative comparison shows that accounting for the flow-vessel interaction, in general, decreases the absolute values of these parameters. Solving the full hydroelasticity problem gives a more detailed solution at the cost of greatly increased computational time.

  8. The early development of stereotypy and self-injury: a review of research methods.

    PubMed

    Symons, F J; Sperry, L A; Dropik, P L; Bodfish, J W

    2005-02-01

    The origin and developmental course of stereotypic and self-injurious behaviour among individuals with developmental disabilities such as intellectual disability (ID) or pervasive developmental disorders such as autism is not well understood. Twelve studies designed to document the prevalence, nature, or development of stereotypic and/or self-injurious behaviour in children under 5 years of age and identified as at risk for developmental delay or disability were reviewed. Comparisons were made with similar studies of typically developing children. It appears that the onset of naturally occurring rhythmic motor stereotypies is delayed in young at-risk children, but that the sequencing may be similar. A very small database and differences in samples, measures, and designs limited the degree to which comparisons could be made across studies. Future work based on appropriately designed prospective comparison studies and uniform quantitative measures is needed to provide an empirical basis for new knowledge about the early development of one of the most serious behaviour disorders afflicting children with ID and related problems of development.

  9. A quantitative comparison of resolution, scanning speed and lifetime behavior of CVD grown Single Wall Carbon Nanotubes and silicon SPM probes using spectral methods

    NASA Astrophysics Data System (ADS)

    Krause, O.; Bouchiat, V.; Bonnot, A. M.

    2007-03-01

    Due to their extreme aspect ratios and exceptional mechanical properties, carbon-nanotube-terminated silicon probes have proven to be the "ideal" probe for atomic force microscopy. However, the manufacture and use of single-walled carbon nanotube probes in particular still pose serious unsolved problems. Here, single- and double-wall carbon nanotube probes, batch processed and used as deposited by chemical vapor deposition without any postprocessing, are compared to standard and high-resolution silicon probes with respect to resolution, scanning speed and lifetime behavior.

  10. a Proposed Benchmark Problem for Scatter Calculations in Radiographic Modelling

    NASA Astrophysics Data System (ADS)

    Jaenisch, G.-R.; Bellon, C.; Schumm, A.; Tabary, J.; Duvauchelle, Ph.

    2009-03-01

    Code validation is a permanent concern in computer modelling, and has been addressed repeatedly in eddy current and ultrasonic modelling. A good benchmark problem is sufficiently simple to be taken into account by various codes without strong requirements on geometry representation capabilities, focuses on one or a few aspects of the problem at hand to facilitate interpretation and to prevent compounded errors from compensating each other, yields a quantitative result, and is experimentally accessible. In this paper we attempt to address code validation for one aspect of radiographic modelling: the prediction of scattered radiation. Many NDT applications cannot neglect scattered radiation, and the scatter calculation is thus important for faithfully simulating the inspection situation. Our benchmark problem covers the wall thickness range of 10 to 50 mm for single wall inspections, with energies ranging from 100 to 500 keV in the first stage, and up to 1 MeV with wall thicknesses up to 70 mm in the extended stage. A simple plate geometry is sufficient for this purpose, and the scatter data are compared on a photon level, without a film model, which allows for comparisons with reference codes like MCNP. We compare results from three Monte Carlo codes (McRay, Sindbad and Moderato) as well as an analytical first-order scattering code (VXI), and confront them with results obtained with MCNP. The comparison with an analytical scatter model provides insight into the application domain where this kind of approach can successfully replace Monte Carlo calculations.
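
    To fix ideas about what "comparing scatter data on a photon level" involves, here is a toy pencil-beam Monte Carlo through a homogeneous slab that separates primary (unscattered) from scattered transmission. It uses a single energy and isotropic scattering, far cruder than MCNP-class transport, and is purely illustrative.

    ```python
    import numpy as np

    def slab_mc(mu_total, mu_scatter, thickness, n=50_000, seed=1):
        """Count photons leaving the far side of a slab unscattered (primary)
        versus after one or more scatter events (scattered)."""
        rng = np.random.default_rng(seed)
        primary = scattered = 0
        for _ in range(n):
            x, mu_dir, nscat = 0.0, 1.0, 0   # depth, direction cosine, scatters
            while True:
                step = -np.log(rng.random()) / mu_total   # free path length
                x += step * mu_dir
                if x >= thickness:
                    primary += (nscat == 0)
                    scattered += (nscat > 0)
                    break
                if x < 0:
                    break                     # backscattered out of the slab
                if rng.random() < mu_scatter / mu_total:
                    mu_dir = rng.uniform(-1.0, 1.0)   # isotropic in cosine (toy)
                    nscat += 1
                else:
                    break                     # absorbed
        return primary / n, scattered / n

    p, s = slab_mc(mu_total=0.5, mu_scatter=0.3, thickness=5.0)
    print(p, np.exp(-0.5 * 5.0), s)   # primary fraction ~ exp(-mu*t); rest is scatter
    ```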

  11. If F(ST) does not measure neutral genetic differentiation, then comparing it with Q(ST) is misleading. Or is it?

    PubMed

    Edelaar, Pim; Björklund, Mats

    2011-05-01

    The comparison between neutral genetic differentiation (F(ST) ) and quantitative genetic differentiation (Q(ST) ) is commonly used to test for signatures of selection in population divergence. However, there is an ongoing discussion about what F(ST) actually measures, even resulting in some alternative metrics to express neutral genetic differentiation. If there is a problem with F(ST) , this could have repercussions for its comparison with Q(ST) as well. We show that as the mutation rate of the neutral marker increases, F(ST) decreases: a higher within-population heterozygosity (He) yields a lower F(ST) value. However, the same is true for Q(ST) : a higher mutation rate for the underlying QTL also results in a lower Q(ST) estimate. The effect of mutation rate is equivalent in Q(ST) and F(ST) . Hence, the comparison between Q(ST) and F(ST) remains valid, if one uses neutral markers whose mutation rates are not too high compared to those of quantitative traits. Usage of highly variable neutral markers such as hypervariable microsatellites can lead to serious biases and the incorrect inference that divergent selection has acted on populations. Much of the discussion on F(ST) seems to stem from the misunderstanding that it measures the differentiation of populations, whereas it actually measures the fixation of alleles. In their capacity as measures of population differentiation, Hedrick's G'(ST) and Jost's D reach their maximum value of 1 when populations do not share alleles even when there remains variation within populations, which invalidates them for comparisons with Q(ST) . © 2011 Blackwell Publishing Ltd.
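
    The dependence on marker heterozygosity can be read directly off the standard textbook definitions:

        $$ F_{ST} = \frac{H_T - H_S}{H_T}, \qquad Q_{ST} = \frac{\sigma^2_B}{\sigma^2_B + 2\sigma^2_W}, $$

    where $H_T$ and $H_S$ are the total and within-population expected heterozygosities and $\sigma^2_B$, $\sigma^2_W$ are the between- and within-population additive genetic variances. Raising the mutation rate inflates $H_S$ (or $\sigma^2_W$) and thereby deflates $F_{ST}$ (or $Q_{ST}$) in the same way, which is why the comparison survives as long as marker and trait mutation rates are not wildly mismatched.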

  12. Coupling Conceptual and Quantitative Problems to Develop Expertise in Introductory Physics Students

    NASA Astrophysics Data System (ADS)

    Singh, Chandralekha

    2008-10-01

    We discuss the effect of administering conceptual and quantitative isomorphic problem pairs (CQIPP) back to back vs. asking students to solve only one of the problems in the CQIPP in introductory physics courses. Students who answered both questions in a CQIPP often performed better on the conceptual questions than those who answered the corresponding conceptual questions only. Although students often took advantage of the quantitative counterpart to answer a conceptual question of a CQIPP correctly, when only given the conceptual question, students seldom tried to convert it into a quantitative question, solve it and then reason about the solution conceptually. Even in individual interviews, when students who were only given conceptual questions had difficulty and the interviewer explicitly encouraged them to convert the conceptual question into the corresponding quantitative problem by choosing appropriate variables, a majority of students were reluctant and preferred to guess the answer to the conceptual question based upon their gut feeling.

  13. Quantitative Reasoning in Problem Solving

    ERIC Educational Resources Information Center

    Ramful, Ajay; Ho, Siew Yin

    2015-01-01

    In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to represent relationships among quantities and provide examples of problems and their solutions.

  14. Quantitative Relationships Involving Additive Differences: Numerical Resilience

    ERIC Educational Resources Information Center

    Ramful, Ajay; Ho, Siew Yin

    2014-01-01

    This case study describes the ways in which problems involving additive differences with unknown starting quantities constrain the problem solver in articulating the inherent quantitative relationship. It gives empirical evidence to show how numerical reasoning takes over as a Grade 6 student instantiates the quantitative relation by resorting to…

  15. Do students benefit from drawing productive diagrams themselves while solving introductory physics problems? The case of two electrostatics problems

    NASA Astrophysics Data System (ADS)

    Maries, Alexandru; Singh, Chandralekha

    2018-01-01

    An appropriate diagram is a required element of a solution building process in physics problem solving and it can transform a given problem into a representation that is easier to exploit for solving the problem. A major focus while helping introductory physics students learn problem solving is to help them appreciate that drawing diagrams facilitates problem solving. We conducted an investigation in which two different interventions were implemented during recitation quizzes throughout the semester in a large enrolment, algebra-based introductory physics course. Students were either (1) asked to solve problems in which the diagrams were drawn for them or (2) explicitly told to draw a diagram. A comparison group was not given any instruction regarding diagrams. We developed a rubric to score the problem solving performance of students in different intervention groups. We investigated two problems involving electric field and electric force and found that students who drew productive diagrams were more successful problem solvers and that a higher level of relevant detail in a student’s diagram corresponded to a better score. We also conducted think-aloud interviews with nine students who were at the time taking an equivalent introductory algebra-based physics course in order to gain insight into how drawing diagrams affects the problem solving process. These interviews supported some of the interpretations of the quantitative results. We end by discussing instructional implications of the findings.

  16. A framework for organizing and selecting quantitative approaches for benefit-harm assessment.

    PubMed

    Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M

    2012-11-19

    Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.
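
    As a concrete illustration of a "benefit and harm comparison metric" that puts outcomes on a single scale, one simple and widely used form is a weighted net-benefit sum (a generic construction, not one of the 16 reviewed approaches specifically):

        $$ \mathrm{NB} = \sum_i w_i \left( R_i^{\mathrm{control}} - R_i^{\mathrm{treatment}} \right), $$

    where the $R_i$ are absolute event risks for each key benefit or harm outcome and the $w_i$ are preference weights. $\mathrm{NB} > 0$ favours treatment, and an uncertainty interval around $\mathrm{NB}$ is the kind of measure the review found only four approaches to provide.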

  17. A framework for organizing and selecting quantitative approaches for benefit-harm assessment

    PubMed Central

    2012-01-01

    Background Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. Conclusion The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches. PMID:23163976

  18. Quantitative Literacy across the Curriculum: Integrating Skills from English Composition, Mathematics, and the Substantive Disciplines

    ERIC Educational Resources Information Center

    Miller, Jane E.

    2010-01-01

    Quantitative literacy is an important proficiency that pertains to "word problems" from science, history, and other fields. Unfortunately, teaching how to solve such problems often is relegated to math courses alone. This article examines how quantitative literacy also involves concepts and skills from English composition and the substantive…

  19. Propagating Qualitative Values Through Quantitative Equations

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak

    1992-01-01

    In most practical problems where traditional numeric simulation is not adequate, one needs to reason about a system with both qualitative and quantitative equations. In this paper, we address the problem of propagating qualitative values, represented as interval values, through quantitative equations. Previous research has produced exponential-time algorithms for approximate solution of the problem, which may not meet the stringent requirements of many real-time applications. This paper advances the state of the art by producing a linear-time algorithm that can propagate a qualitative value through a class of complex quantitative equations exactly, and through arbitrary algebraic expressions approximately. The algorithm was found applicable to a Space Shuttle Reaction Control System model.
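
    A minimal sketch of the core operation: propagate interval-valued quantities through arithmetic expressions using interval arithmetic, which costs one bounded-work step per operator in the expression. This is a generic interval-arithmetic illustration; the paper's exact algorithm and operator coverage may differ.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Interval:
        lo: float
        hi: float
        def __add__(self, o):
            return Interval(self.lo + o.lo, self.hi + o.hi)
        def __sub__(self, o):
            return Interval(self.lo - o.hi, self.hi - o.lo)
        def __mul__(self, o):
            # Products of all endpoint pairs bound the result.
            p = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
            return Interval(min(p), max(p))

    # Qualitative signs map to intervals, e.g. "x positive" -> [0, xmax].
    # Propagate "x in [0, 10], y in [-5, 0]" through q = x*y + x:
    x, y = Interval(0.0, 10.0), Interval(-5.0, 0.0)
    print(x * y + x)   # Interval(lo=-50.0, hi=10.0): the sign of q is ambiguous
    ```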

  20. Single image super-resolution via an iterative reproducing kernel Hilbert space method.

    PubMed

    Deng, Liang-Jian; Guo, Weihong; Huang, Ting-Zhu

    2016-11-01

    Image super-resolution, a process to enhance image resolution, has important applications in satellite imaging, high-definition television, medical imaging, etc. Many existing approaches use multiple low-resolution images to recover one high-resolution image. In this paper, we present an iterative scheme to solve single image super-resolution problems. It recovers a high-quality high-resolution image from a single low-resolution image without using a training data set. We solve the problem from an image intensity function estimation perspective and assume the image contains smooth and edge components. We model the smooth components of an image using a thin-plate reproducing kernel Hilbert space (RKHS) and the edges using approximated Heaviside functions. The proposed method is applied to image patches, aiming to reduce computation and storage. Visual and quantitative comparisons with some competitive approaches show the effectiveness of the proposed method.
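
    The two-component model can be written compactly (a schematic form consistent with the abstract; the paper's exact expansion and regularization terms may differ): the intensity function is approximated as

        $$ f(\mathbf{x}) \approx \sum_i c_i\, \phi(\|\mathbf{x} - \mathbf{x}_i\|) + \sum_j d_j\, H_{\varepsilon}(\mathbf{a}_j^{\top}\mathbf{x} + b_j), $$

    where the first sum is the smooth part in a thin-plate RKHS with kernel $\phi$, and the second captures edges via smoothed (approximated) Heaviside functions $H_{\varepsilon}$. The coefficients are fitted patch-by-patch to the low-resolution data and the fitted function is then evaluated on the finer grid.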

  1. Computational comparison of aortic root stresses in presence of stentless and stented aortic valve bio-prostheses.

    PubMed

    Nestola, M G C; Faggiano, E; Vergara, C; Lancellotti, R M; Ippolito, S; Antona, C; Filippi, S; Quarteroni, A; Scrofani, R

    2017-02-01

    We provide a computational comparison of the performance of stentless and stented aortic prostheses in terms of aortic root displacements and internal stresses. To this aim, we consider three real patients; for each of them, we construct the two prosthesis configurations, which are characterized by different mechanical properties, and we also consider the native configuration. For each of these scenarios, we solve the fluid-structure interaction problem arising between blood and the aortic root using finite elements. In particular, the Arbitrary Lagrangian-Eulerian formulation is used for the numerical solution of the fluid-dynamic equations, and a hyperelastic material model is adopted to predict the mechanical response of the aortic wall and the two prostheses. The computational results are analyzed in terms of aortic flow, internal wall stresses and aortic wall/prosthesis displacements; a quantitative comparison of the mechanical behavior of the three scenarios is reported. The numerical results highlight a good agreement between stentless and native displacements and internal wall stresses, whereas higher, non-physiological stresses are found in the stented case.

  2. How Students Process Equations in Solving Quantitative Synthesis Problems? Role of Mathematical Complexity in Students' Mathematical Performance

    ERIC Educational Resources Information Center

    Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan

    2017-01-01

    We examine students' mathematical performance on quantitative "synthesis problems" with varying mathematical complexity. Synthesis problems are tasks comprising multiple concepts typically taught in different chapters. Mathematical performance refers to the formulation, combination, and simplification of equations. Generally speaking,…

  3. Infrared thermography quantitative image processing

    NASA Astrophysics Data System (ADS)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that provides a map of the temperature distribution of an object's surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body's surface can contribute to the diagnosis and follow-up of certain disorders. Although a thermographic image can give a fairly good visual estimation of distribution homogeneity and of temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a region of interest (ROI); a number of indices have been developed by researchers to that end. In this study, a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first-order statistical parameters describing the temperature distribution. Analysis and comparison of these indices allow the temperature distribution pattern of the back trunk to be evaluated against that expected in subjects who are healthy with respect to spinal problems.
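
    A minimal sketch of such first-order indices for a pair of symmetric ROIs (the specific index definitions here are illustrative; the study defines its own set):

    ```python
    import numpy as np

    def roi_indices(temps):
        """First-order statistics of the temperature values in one ROI."""
        t = np.asarray(temps, dtype=float).ravel()
        mean = t.mean()
        std = t.std()
        skew = ((t - mean) ** 3).mean() / std ** 3 if std > 0 else 0.0
        return mean, std, skew

    def asymmetry(left_roi, right_roi):
        """Absolute difference of mean temperature between symmetric ROIs,
        a common thermographic asymmetry measure."""
        return abs(roi_indices(left_roi)[0] - roi_indices(right_roi)[0])

    # Synthetic ROIs in degrees Celsius.
    left = np.random.default_rng(0).normal(33.1, 0.3, (40, 40))
    right = np.random.default_rng(1).normal(33.6, 0.3, (40, 40))
    print(asymmetry(left, right))   # ~0.5 C between sides
    ```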

  4. Quantitative analysis of CMV DNA in children the first year after liver transplantation.

    PubMed

    Kullberg-Lindh, Carola; Ascher, Henry; Krantz, Marie; Lindh, Magnus

    2003-08-01

    CMV infection is a major problem after solid organ transplantation, especially in children, in whom primary infection is more common than in adults. Early diagnosis is critical and might be facilitated by quantitative analysis of CMV DNA in blood. In this retrospective study of 18 children who underwent liver transplantation in 1995-2000, serum samples were analysed by Cobas Amplicor Monitor (Roche). Four patients developed symptomatic CMV infection at a mean time of 4 wk after transplantation. They showed maximum CMV DNA levels in serum of 26 400, 1900, 1300 and 970 copies/mL, respectively. In comparison, the Cobas Amplicor Monitor assay was positive, at a low level (415 copies/mL), in one of 11 patients with asymptomatic (4) or latent (7) infection. CMV IgM was detected at significant levels (> or =1/80) in all four patients with symptomatic infection and in one with asymptomatic CMV infection. Eight patients were given one or several courses of ganciclovir; five of these lacked symptoms of CMV disease and had low (415 copies/mL) or undetectable CMV DNA in serum. The data suggest that quantitative analysis of CMV DNA may be of value in the early identification of CMV disease and for avoiding unnecessary antiviral treatment.

  5. Detailed Quantitative Classifications of Galaxy Morphology

    NASA Astrophysics Data System (ADS)

    Nair, Preethi

    2018-01-01

    Understanding the physical processes responsible for the growth of galaxies is one of the key challenges in extragalactic astronomy. The assembly history of a galaxy is imprinted in its detailed morphology. The bulge-to-total ratio of a galaxy and the presence or absence of bars, rings, spiral arms, tidal tails, etc., all have implications for its past merger, star formation, and feedback history. However, current quantitative galaxy classification schemes are only useful for broad binning; they cannot classify or exploit the wide variety of galaxy structures seen in nature. Therefore, comparisons of observations with theoretical predictions of secular structure formation have only been conducted on small samples of visually classified galaxies, whereas large samples are needed to disentangle the complex physical processes of galaxy formation. With the advent of large surveys like the Sloan Digital Sky Survey (SDSS) and the upcoming Large Synoptic Survey Telescope (LSST) and WFIRST, the problem of statistics will be resolved, but the need for a robust quantitative classification scheme will remain. Here I will present early results on promising machine learning algorithms that are providing detailed classifications, identifying bars, rings, multi-armed spiral galaxies, and Hubble type.

  6. Quantitative and qualitative research across cultures and languages: cultural metrics and their application.

    PubMed

    Wagner, Wolfgang; Hansen, Karolina; Kronberger, Nicole

    2014-12-01

    Growing globalisation of the world draws attention to cultural differences between people from different countries or from different cultures within countries. Notwithstanding the diversity of people's worldviews, current cross-cultural research still faces the challenge of how to avoid ethnocentrism; comparing Western-driven phenomena with like variables across countries without checking their conceptual equivalence is clearly highly problematic. In the present article we argue that simple comparison of measurements (in the quantitative domain) or of semantic interpretations (in the qualitative domain) across cultures easily leads to inadequate results. Questionnaire items, or text produced in interviews or via open-ended questions, have culturally laden meanings and cannot be mapped onto the same semantic metric. We call the culture-specific space and relationship between variables or meanings a 'cultural metric', that is, a set of notions that are inter-related and that mutually specify each other's meaning. We illustrate the problems and their possible solutions with examples from quantitative and qualitative research. The suggested methods make it possible to respect the semantic space of notions in cultures and language groups, so that the resulting similarities or differences between cultures can be better understood and interpreted.

  7. Quantitative analyses of the 3D nuclear landscape recorded with super-resolved fluorescence microscopy.

    PubMed

    Schmid, Volker J; Cremer, Marion; Cremer, Thomas

    2017-07-01

    Recent advancements in super-resolved fluorescence microscopy have revolutionized microscopic studies of cells, including the exceedingly complex structural organization of cell nuclei in space and time. In this paper we describe and discuss tools for (semi-)automated, quantitative 3D analyses of the spatial nuclear organization. These tools allow the quantitative assessment of different, highly resolved chromatin compaction levels in individual cell nuclei, which reflect functionally different regions or sub-compartments of the 3D nuclear landscape, and measurements of absolute distances between sites of different chromatin compaction. In addition, these tools allow 3D mapping of specific DNA/RNA sequences and nuclear proteins relative to the 3D chromatin compaction maps, and comparisons of multiple cell nuclei. The tools are available in the free and open source R packages nucim and bioimagetools. We discuss the use of masks for the segmentation of nuclei and the use of DNA stains, such as DAPI, as a proxy for local differences in chromatin compaction. We further discuss the limitations of 3D maps of the nuclear landscape as well as problems in the biological interpretation of such data. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. A quantitative comparison of numerical methods for the compressible Euler equations: fifth-order WENO and piecewise-linear Godunov

    NASA Astrophysics Data System (ADS)

    Greenough, J. A.; Rider, W. J.

    2004-05-01

    A numerical study is undertaken comparing a fifth-order version of the weighted essentially non-oscillatory (WENO5) method to a modern piecewise-linear, second-order version of Godunov's method (PLMDE) for the compressible Euler equations. A series of one-dimensional test problems are examined, beginning with classical linear problems and ending with complex shock interactions. The problems considered are: (1) linear advection of a Gaussian pulse in density, (2) Sod's shock tube problem, (3) the "peak" shock tube problem, (4) a version of the Shu and Osher shock entropy wave interaction, and (5) the Woodward and Colella interacting shock wave problem. For each problem and method, run times, density error norms and convergence rates are reported, as produced from a common code test-bed. The linear problem exhibits the advertised convergence rate for both methods as well as the expected large disparity in overall error levels; WENO5 has the smaller errors and an enormous advantage in overall efficiency (in accuracy per unit CPU time). For the nonlinear problems with discontinuities, however, we generally see first-order self-convergence of error, measured against an exact solution or, when an analytic solution is not available, against a converged solution generated on an extremely fine grid. The overall comparison of error levels shows some variation from problem to problem. For Sod's shock tube, PLMDE has nearly half the error, while on the peak problem the errors are nearly the same. For the interacting blast wave problem the two methods again produce a similar level of error, with a slight edge for PLMDE. On the other hand, for the Shu-Osher problem, the errors are similar on the coarser grids but favor WENO5 by a factor of nearly 1.5 on the finer grids used. In all cases, holding mesh resolution constant, PLMDE is less costly in terms of CPU time by approximately a factor of 6. If the CPU cost is taken as fixed, that is, if run times are equal for both numerical methods, then PLMDE uniformly produces lower errors than WENO5 for the fixed computational cost on the test problems considered here.
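
    A small sketch of the self-convergence measurement used in such comparisons: compute an error norm at two resolutions and extract the observed order (a generic procedure, not the paper's code):

    ```python
    import numpy as np

    def l1_error(coarse, reference):
        """L1 norm of the difference after averaging the reference solution
        down to the coarse resolution (len(reference) % len(coarse) == 0)."""
        r = np.asarray(reference).reshape(len(coarse), -1).mean(axis=1)
        return np.abs(np.asarray(coarse) - r).mean()

    def observed_order(err_h, err_h2):
        """Convergence rate p from errors on grids of spacing h and h/2."""
        return np.log2(err_h / err_h2)

    # e.g. errors 4.0e-3 on N cells and 2.1e-3 on 2N cells give p ~ 0.93,
    # the near-first-order behaviour expected at shocks for both schemes:
    print(observed_order(4.0e-3, 2.1e-3))
    ```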

  9. Old wine in new bottles: decanting systemic family process research in the era of evidence-based practice.

    PubMed

    Rohrbaugh, Michael J

    2014-09-01

    Social cybernetic (systemic) ideas from the early Family Process era, though emanating from qualitative clinical observation, have underappreciated heuristic potential for guiding quantitative empirical research on problem maintenance and change. The old conceptual wines we have attempted to repackage in new, science-friendly bottles include ironic processes (when "solutions" maintain problems), symptom-system fit (when problems stabilize relationships), and communal coping (when we-ness helps people change). Both self-report and observational quantitative methods have been useful in tracking these phenomena, and together the three constructs inform a team-based family consultation approach to working with difficult health and behavior problems. In addition, a large-scale, quantitatively focused effectiveness trial of family therapy for adolescent drug abuse highlights the importance of treatment fidelity and qualitative approaches to examining it. In this sense, echoing the history of family therapy research, our experience with juxtaposing quantitative and qualitative methods has gone full circle: from qualitative to quantitative observation and back again. © 2014 FPI, Inc.

  10. Old Wine in New Bottles: Decanting Systemic Family Process Research in the Era of Evidence-Based Practice†

    PubMed Central

    Rohrbaugh, Michael J.

    2015-01-01

    Social cybernetic (systemic) ideas from the early Family Process era, though emanating from qualitative clinical observation, have underappreciated heuristic potential for guiding quantitative empirical research on problem maintenance and change. The old conceptual wines we have attempted to repackage in new, science-friendly bottles include ironic processes (when “solutions” maintain problems), symptom-system fit (when problems stabilize relationships), and communal coping (when we-ness helps people change). Both self-report and observational quantitative methods have been useful in tracking these phenomena, and together the three constructs inform a team-based family consultation (FAMCON) approach to working with difficult health and behavior problems. In addition, a large-scale, quantitatively focused effectiveness trial of family therapy for adolescent drug abuse highlights the importance of treatment fidelity and qualitative approaches to examining it. In this sense, echoing the history of family therapy research, our experience with juxtaposing quantitative and qualitative methods has gone full circle – from qualitative to quantitative observation and back again. PMID:24905101

  11. Drift mobility of photo-electrons in organic molecular crystals: Quantitative comparison between theory and experiment

    NASA Astrophysics Data System (ADS)

    Reineker, P.; Kenkre, V. M.; Kühne, R.

    1981-08-01

    A quantitative comparison is given of a simple theoretical prediction for the drift mobility of photo-electrons in organic molecular crystals, calculated within the model of coupled band-like and hopping motion, with experiments in naphthalene by Schein et al. and Karl et al.

  12. Option generation in the treatment of unstable patients: An experienced-novice comparison study.

    PubMed

    Whyte, James; Pickett-Hauber, Roxanne; Whyte, Maria D

    2016-09-01

    There is a dearth of studies that quantitatively measure nurses' appreciation of stimuli and the subsequent generation of options in practice environments. The purpose of this paper was to examine nurses' ability to solve problems while quantifying the stimuli upon which they focus during patient care activities. The study used a quantitative descriptive method that gathered performance data from a simulated task environment using multi-angle video and audio. These videos were coded, and transcripts of all the actions that occurred in the scenario, together with the participants' verbal reports, were compiled. The results revealed a pattern of superiority for the experienced exemplar group. Novice actions were characterized by difficulty in following common protocols, inconsistencies in their evaluative approaches, and a pattern of omissions of key actions. The study provides support for deliberate-practice-based programs designed to facilitate higher-level performance in novices. © 2016 John Wiley & Sons Australia, Ltd.

  13. An evaluation of collision models in the Method of Moments for rarefied gas problems

    NASA Astrophysics Data System (ADS)

    Emerson, David; Gu, Xiao-Jun

    2014-11-01

    The Method of Moments offers an attractive approach for solving gaseous transport problems that are beyond the limit of validity of the Navier-Stokes-Fourier equations. Recent work has demonstrated the capability of the regularized 13 and 26 moment equations for solving problems when the Knudsen number, Kn (the ratio of the mean free path of a gas to a typical length scale of interest), is in the range 0.1 to 1.0, the so-called transition regime. In comparison to numerical solutions of the Boltzmann equation, the Method of Moments has captured, both qualitatively and quantitatively, the results of classical test problems in kinetic theory, e.g. velocity slip in Kramers' problem, temperature jump in Knudsen layers, the Knudsen minimum, etc. However, most of these results have been obtained for Maxwell molecules, where molecules repel each other according to an inverse fifth-power rule. Recent work has incorporated more traditional collision models such as BGK, the S-model, and ES-BGK, the latter being important for thermal problems where the Prandtl number can vary. We are currently investigating the impact of these collision models on fundamental low-speed problems of particular interest to micro-scale flows, which will be discussed and evaluated in the presentation. Work supported by the Engineering and Physical Sciences Research Council under Grant EP/I011927/1 and CCP12.
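
    For reference, the BGK model replaces the Boltzmann collision integral with relaxation toward a local Maxwellian f^{eq} on a single time scale tau,

        \frac{\partial f}{\partial t} + \mathbf{v} \cdot \nabla_{\mathbf{x}} f = \frac{f^{\mathrm{eq}} - f}{\tau}, \qquad \mathrm{Kn} = \frac{\lambda}{L},

    which fixes the Prandtl number at unity; the ES-BGK variant relaxes toward an anisotropic Gaussian instead, which is what allows the Prandtl number to be adjusted for thermal problems.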

  14. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…

  15. Comparison of 13C Nuclear Magnetic Resonance and Fourier Transform Infrared spectroscopy for estimating humification and aromatization of soil organic matter

    NASA Astrophysics Data System (ADS)

    Rogers, K.; Cooper, W. T.; Hodgkins, S. B.; Verbeke, B. A.; Chanton, J.

    2017-12-01

    Solid-state direct polarization 13C NMR spectroscopy (DP-NMR) is generally considered the most quantitatively reliable method for soil organic matter (SOM) characterization, including determination of the relative abundances of carbon functional groups. These functional abundances can then be used to calculate important soil parameters, such as the degree of humification and extent of aromaticity, that reveal differences in reactivity or compositional changes along gradients (e.g. a thaw chronosequence in permafrost). Unfortunately, the DP-NMR experiment is time-consuming, with a single sample often requiring over 24 hours of instrument time. Alternatively, solid-state cross polarization 13C NMR (CP-NMR) can circumvent this problem, reducing analysis times to 4-6 hours, but with some loss of quantitative reliability. Attenuated Total Reflectance Fourier Transform Infrared spectroscopy (ATR-FTIR) is a quick and relatively inexpensive method for characterizing solid materials, and has been suggested as an alternative to NMR for the analysis of soil organic matter and determination of humification (HI) and aromatization (AI) indices. However, the quantitative reliability of ATR-FTIR for SOM analyses has never been verified, nor have any ATR-FTIR data been compared to similar measurements by NMR. In this work we focused on FTIR vibrational bands that correspond to the three functional groups used to calculate HI and AI values: carbohydrates (1030 cm-1), aromatics (1510, 1630 cm-1), and aliphatics (2850, 2920 cm-1). Data from ATR-FTIR measurements were compared to analogous quantitation by DP- and CP-NMR using peat samples from Sweden, Minnesota, and North Carolina. DP- and CP-NMR correlate very strongly, although the correlations are not always 1:1. Direct comparison of the relative abundances of the three functional groups determined by NMR and ATR-FTIR yielded satisfactory results for carbohydrates (r2 = 0.78) and aliphatics (r2 = 0.58), but less so for aromatics (r2 = 0.395). ATR-FTIR has to this point been used primarily for relative abundance analyses (e.g. calculating HI and AI values), but these results suggest FTIR can provide quantitative reliability approaching that of NMR.
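
    A minimal sketch of how such band-ratio indices can be formed from an ATR-FTIR spectrum (illustrative Python; the exact HI/AI definitions and the +/- 10 cm-1 integration windows are assumptions, not the authors' procedure):

        import numpy as np

        # Band positions (cm^-1) named in the abstract.
        BANDS = {"carb": [1030], "arom": [1510, 1630], "aliph": [2850, 2920]}

        def band_intensity(wavenumbers, absorbance, centers, half_width=10):
            """Sum of mean absorbances within +/- half_width cm^-1 of each band."""
            vals = [absorbance[np.abs(wavenumbers - c) <= half_width].mean()
                    for c in centers]
            return np.sum(vals)

        def humification_indices(wavenumbers, absorbance):
            carb = band_intensity(wavenumbers, absorbance, BANDS["carb"])
            arom = band_intensity(wavenumbers, absorbance, BANDS["arom"])
            aliph = band_intensity(wavenumbers, absorbance, BANDS["aliph"])
            # Assumed definitions: indices as ratios to the carbohydrate band.
            return {"HI": aliph / carb, "AI": arom / carb}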

  16. Seismic waveform inversion best practices: regional, global and exploration test cases

    NASA Astrophysics Data System (ADS)

    Modrak, Ryan; Tromp, Jeroen

    2016-09-01

    Reaching the global minimum of a waveform misfit function requires careful choices about the nonlinear optimization, preconditioning and regularization methods underlying an inversion. Because waveform inversion problems are susceptible to erratic convergence associated with strong nonlinearity, one or two test cases are not enough to reliably inform such decisions. We identify best practices, instead, using four seismic near-surface problems, one regional problem and two global problems. To make meaningful quantitative comparisons between methods, we carry out hundreds of inversions, varying one aspect of the implementation at a time. Comparing nonlinear optimization algorithms, we find that limited-memory BFGS provides computational savings over nonlinear conjugate gradient methods in a wide range of test cases. Comparing preconditioners, we show that a new diagonal scaling derived from the adjoint of the forward operator provides better performance than two conventional preconditioning schemes. Comparing regularization strategies, we find that projection, convolution, Tikhonov regularization and total variation regularization are effective in different contexts. Besides questions of one strategy or another, reliability and efficiency in waveform inversion depend on close numerical attention and care. Implementation details involving the line search and restart conditions have a strong effect on computational cost, regardless of the chosen nonlinear optimization algorithm.
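
    As a toy illustration of the optimizer comparison (scipy's L-BFGS-B and nonlinear CG standing in for the full waveform-inversion machinery; the quadratic misfit here is an assumption for demonstration only):

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        G = rng.normal(size=(100, 20))      # toy linear forward operator
        d_obs = G @ rng.normal(size=20)     # synthetic "observed" data

        def misfit(m):
            r = G @ m - d_obs
            return 0.5 * r @ r

        def gradient(m):
            return G.T @ (G @ m - d_obs)

        m0 = np.zeros(20)
        for method in ("L-BFGS-B", "CG"):   # limited-memory BFGS vs. nonlinear CG
            res = minimize(misfit, m0, jac=gradient, method=method)
            print(method, "iterations:", res.nit, "final misfit:", res.fun)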

  17. Artistic image analysis using graph-based learning approaches.

    PubMed

    Carneiro, Gustavo

    2013-08-01

    We introduce a new methodology for the problem of artistic image analysis, which among other tasks involves the automatic identification of visual classes present in an art work. In this paper, we advocate the idea that artistic image analysis must explore a graph that captures the network of artistic influences by computing similarities in terms of appearance and manual annotation. One of the novelties of our methodology is the proposed formulation, which is a principled way of combining these two similarities in a single graph. Using this graph, we show that an efficient random walk algorithm based on an inverted label propagation formulation produces more accurate annotation and retrieval results than the following baseline algorithms: bag of visual words, label propagation, matrix completion, and structural learning. We also show that the proposed approach leads to more efficient inference and training procedures. The experiments are run on a database containing 988 artistic images (with 49 visual classification problems divided into a multiclass problem with 27 classes and 48 binary problems), where we report the inference and training running times and quantitative comparisons with respect to several retrieval and annotation performance measures.
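
    For orientation, the plain label propagation baseline the paper compares against can be sketched as follows (Zhou et al.-style label spreading; not the authors' inverted formulation):

        import numpy as np

        def label_propagation(W, Y, alpha=0.99, n_iter=100):
            """Label spreading on a similarity graph.

            W: (n, n) nonnegative similarity matrix (every node assumed to
            have at least one edge); Y: (n, k) one-hot seed labels, with
            zero rows for unlabeled nodes."""
            d = W.sum(axis=1)
            S = W / np.sqrt(np.outer(d, d))   # D^{-1/2} W D^{-1/2}
            F = Y.astype(float).copy()
            for _ in range(n_iter):
                F = alpha * (S @ F) + (1 - alpha) * Y
            return F.argmax(axis=1)           # predicted class per node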

  18. Experimental validation of a Monte-Carlo-based inversion scheme for 3D quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Buchmann, Jens; Kaplan, Bernhard A.; Prohaska, Steffen; Laufer, Jan

    2017-03-01

    Quantitative photoacoustic tomography (qPAT) aims to extract physiological parameters, such as blood oxygen saturation (sO2), from measured multi-wavelength image data sets. The challenge of this approach lies in the inherently nonlinear fluence distribution in the tissue, which has to be accounted for by using an appropriate model, and the large scale of the inverse problem. In addition, the accuracy of experimental and scanner-specific parameters, such as the wavelength dependence of the incident fluence, the acoustic detector response, the beam profile and divergence, needs to be considered. This study aims at quantitative imaging of blood sO2, as it has been shown to be a more robust parameter compared to absolute concentrations. We propose a Monte-Carlo-based inversion scheme in conjunction with a reduction in the number of variables achieved using image segmentation. The inversion scheme is experimentally validated in tissue-mimicking phantoms consisting of polymer tubes suspended in a scattering liquid. The tubes were filled with chromophore solutions at different concentration ratios. 3-D multi-spectral image data sets were acquired using a Fabry-Perot based PA scanner. A quantitative comparison of the measured data with the output of the forward model is presented. Parameter estimates of chromophore concentration ratios were found to be within 5 % of the true values.
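
    The target quantity is the usual saturation ratio formed from the recovered oxy- and deoxyhaemoglobin concentrations,

        s\mathrm{O}_2 = \frac{C_{\mathrm{HbO}_2}}{C_{\mathrm{HbO}_2} + C_{\mathrm{HHb}}},

    which is why recovering chromophore concentration ratios to within 5% is the relevant accuracy figure.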

  19. "All is well": professionals' documentation of social determinants of health in Swedish Child Health Services health records concerning maltreated children - a mixed method approach.

    PubMed

    Köhler, Marie; Rosvall, Maria; Emmelin, Maria

    2016-08-15

    Knowledge about social determinants of health has influenced global health strategies, including early childhood interventions. Some psychosocial circumstances - such as poverty, parental mental health problems, abuse and partner violence - increase the risk of child maltreatment and neglect. Healthcare professionals' awareness of psychosocial issues is of special interest, since they have both the possibility and the obligation to identify vulnerable children. Child Health Services health records of 100 children in Malmö, Sweden, who had been placed in, or were to be placed in, family foster care were compared with the health records of a matched comparison group of 100 children who were not placed in care. A mixed-method approach integrating quantitative and qualitative analysis was applied. The documentation on the foster care group was more voluminous than that for the comparison group. The content was problem-oriented and dominated by severe parental health and social problems, while the child's own experiences were neglected. The professionals documented interaction with healthcare and social functions, but very few reports to the Social Services were noted. For both groups, notes about social structures were almost absent. Child Health Service professionals facing vulnerable children document parental health issues and interaction with healthcare, but they fail to document living conditions, thereby making social structures invisible in the health records. The child perspective is insufficiently integrated in the documentation, and serious child protection needs remain unmet if professionals avoid reporting to Social Services.

  20. Simulation of 2D rarefied gas flows based on the numerical solution of the Boltzmann equation

    NASA Astrophysics Data System (ADS)

    Poleshkin, Sergey O.; Malkov, Ewgenij A.; Kudryavtsev, Alexey N.; Shershnev, Anton A.; Bondar, Yevgeniy A.; Kohanchik, A. A.

    2017-10-01

    There are various methods for calculating rarefied gas flows, in particular statistical methods and deterministic methods based on finite-difference solutions of the nonlinear Boltzmann kinetic equation and on solutions of model kinetic equations. There is no universal method; each has its disadvantages in terms of efficiency or accuracy. The choice of method depends on the problem to be solved and on the parameters of the calculated flows. Qualitative theoretical arguments help to determine the range of parameters for which each method solves problems effectively; however, it is advisable to perform comparative tests in which classical problems are calculated by different methods and with different parameters, to obtain quantitative confirmation of this reasoning. The paper provides the results of calculations performed by the authors with the Direct Simulation Monte Carlo method and with finite-difference methods for solving the Boltzmann equation and model kinetic equations. Based on this comparison, conclusions are drawn on selecting a particular method for flow simulations in various ranges of flow parameters.

  1. Extracting morphologies from third harmonic generation images of structurally normal human brain tissue.

    PubMed

    Zhang, Zhiqing; Kuzmin, Nikolay V; Groot, Marie Louise; de Munck, Jan C

    2017-06-01

    The morphologies contained in 3D third harmonic generation (THG) images of human brain tissue can report on the pathological state of the tissue. However, the complexity of THG brain images makes it challenging to use modern image processing tools, especially those for image filtering, segmentation and validation, to extract this information. We developed a salient edge-enhancing model of anisotropic diffusion for image filtering, based on higher-order statistics. We split the intrinsic 3-phase segmentation problem into two 2-phase segmentation problems, each of which we solved with a dedicated model, an active contour weighted by prior extreme. We applied the proposed algorithms to THG images of structurally normal ex-vivo human brain tissue, revealing key tissue components (brain cells, microvessels and neuropil) and enabling statistical characterization of these components. Comprehensive comparison to manually delineated ground truth validated the proposed algorithms. Quantitative comparison to second harmonic generation/auto-fluorescence images, acquired simultaneously from the same tissue area, confirmed the correctness of the main THG features detected. The software and test datasets are available from the authors. z.zhang@vu.nl. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
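
    The classic Perona-Malik scheme conveys the idea of edge-preserving anisotropic diffusion that the paper's higher-order-statistics model builds on (a simple 2D sketch, not the proposed salient-edge model; periodic boundaries for brevity):

        import numpy as np

        def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
            """Classic 2D Perona-Malik diffusion: smooth within regions while
            suppressing diffusion across strong gradients (edges)."""
            u = img.astype(float).copy()
            g = lambda d: np.exp(-(d / kappa) ** 2)   # edge-stopping function
            for _ in range(n_iter):
                dn = np.roll(u, -1, axis=0) - u       # differences to the
                ds = np.roll(u, 1, axis=0) - u        # four neighbours
                de = np.roll(u, -1, axis=1) - u
                dw = np.roll(u, 1, axis=1) - u
                u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
            return u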

  2. Towards discrete wavelet transform-based human activity recognition

    NASA Astrophysics Data System (ADS)

    Khare, Manish; Jeon, Moongu

    2017-06-01

    Providing accurate recognition of human activities is a challenging problem for visual surveillance applications. In this paper, we present a simple and efficient algorithm for human activity recognition based on the wavelet transform. We adopt discrete wavelet transform (DWT) coefficients as features of human objects to obtain the advantages of its multiresolution approach. The proposed method is tested on multiple levels of the DWT. Experiments are carried out on different standard action datasets, including KTH and i3DPost. The proposed method is compared with other state-of-the-art methods in terms of different quantitative performance measures and is found to have better recognition accuracy than the state-of-the-art methods.
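
    A minimal sketch of DWT-based feature extraction (using the PyWavelets package; the wavelet and decomposition level are illustrative choices, not the paper's settings):

        import numpy as np
        import pywt

        def dwt_features(frame, wavelet="haar", level=2):
            """Multilevel 2D DWT coefficients of a grayscale frame flattened
            into a single feature vector."""
            coeffs = pywt.wavedec2(frame, wavelet, level=level)
            parts = [coeffs[0].ravel()]               # coarsest approximation
            for cH, cV, cD in coeffs[1:]:             # detail sub-bands
                parts.extend([cH.ravel(), cV.ravel(), cD.ravel()])
            return np.concatenate(parts)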

  3. Identification of common coexpression modules based on quantitative network comparison.

    PubMed

    Jo, Yousang; Kim, Sanghyeon; Lee, Doheon

    2018-06-13

    Finding common molecular interactions from different samples is essential work for understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes. Therefore, identification of common coexpression networks or modules may reveal the molecular mechanism of complex disease or the relationship between biological processes. However, there has been no quantitative comparison method for coexpression networks, and the methods we examined for other network types cannot be applied to coexpression networks. We therefore aimed to propose quantitative comparison methods for coexpression networks and to find common biological mechanisms between Huntington's disease and brain aging with the new method. We proposed two similarity measures for the quantitative comparison of coexpression networks. Then, we performed experiments using known coexpression networks. We showed the validity of the two measures and evaluated threshold values for similar coexpression network pairs from the experiments. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and found similar Huntington's disease-aging coexpression module pairs. These modules are related to brain development, cell death, and immune response. This suggests that up-regulated cell-signalling-related cell death and immune/inflammation responses may be common molecular mechanisms in the pathophysiology of Huntington's disease and normal brain aging in the frontal cortex.
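
    To make the idea concrete, a coexpression network can be built by thresholding a gene-gene correlation matrix, and two such networks compared by edge overlap (a simple illustration; the Jaccard index below is not one of the paper's two proposed measures):

        import numpy as np

        def coexpression_edges(expr, threshold=0.8):
            """Edge set of a coexpression network: gene pairs whose Pearson
            correlation across samples exceeds `threshold` in magnitude.
            expr: (genes, samples) array."""
            corr = np.corrcoef(expr)
            n = corr.shape[0]
            return {(i, j) for i in range(n) for j in range(i + 1, n)
                    if abs(corr[i, j]) >= threshold}

        def edge_jaccard(edges_a, edges_b):
            """Simple edge-overlap similarity between two networks."""
            union = edges_a | edges_b
            return len(edges_a & edges_b) / len(union) if union else 1.0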

  4. Topology Optimization using the Level Set and eXtended Finite Element Methods: Theory and Applications

    NASA Astrophysics Data System (ADS)

    Villanueva Perez, Carlos Hernan

    Computational design optimization provides designers with automated techniques to develop novel and non-intuitive optimal designs. Topology optimization is a design optimization technique that allows for the evolution of a broad variety of geometries in the optimization process. Traditional density-based topology optimization methods often lack a sufficient resolution of the geometry and physical response, which prevents direct use of the optimized design in manufacturing and accurate modeling of the physical response at boundary conditions. The goal of this thesis is to introduce a unified topology optimization framework that uses the Level Set Method (LSM) to describe the design geometry and the eXtended Finite Element Method (XFEM) to solve the governing equations and measure the performance of the design. The methodology is presented as an alternative to density-based optimization approaches and is able to accommodate a broad range of engineering design problems. The framework presents state-of-the-art methods for immersed boundary techniques to stabilize the systems of equations and enforce the boundary conditions, and it is studied with applications in 2D and 3D linear elastic structures, incompressible flow, and energy and species transport problems to test the robustness and characteristics of the method. A comparison of the framework against density-based topology optimization approaches is presented with regard to convergence, performance, and the capability to manufacture the designs. Furthermore, the ability to control the shape of the design to operate within manufacturing constraints is developed and studied. The analysis capability of the framework is validated quantitatively through comparison against previous benchmark studies, and qualitatively through its application to topology optimization problems. The design optimization problems converge to intuitive designs that agree well with the results of previous 2D and density-based studies.

  5. Investigating Children's Abilities to Count and Make Quantitative Comparisons

    ERIC Educational Resources Information Center

    Lee, Joohi; Md-Yunus, Sham'ah

    2016-01-01

    This study was designed to investigate children's abilities to count and make quantitative comparisons. In addition, this study utilized reasoning questions (i.e., how did you know?). Thirty-four preschoolers, mean age 4.5 years old, participated in the study. According to the results, 89% of the children (n = 30) were able to do rote counting and…

  6. Optimization in the utility maximization framework for conservation planning: a comparison of solution procedures in a study of multifunctional agriculture

    PubMed Central

    Stoms, David M.; Davis, Frank W.

    2014-01-01

    Quantitative methods of spatial conservation prioritization have traditionally been applied to issues in conservation biology and reserve design, though their use in other types of natural resource management is growing. The utility maximization problem is one form of a covering problem where multiple criteria can represent the expected social benefits of conservation action. This approach allows flexibility with a problem formulation that is more general than typical reserve design problems, though the solution methods are very similar. However, few studies have addressed optimization in utility maximization problems for conservation planning, and the effect of solution procedure is largely unquantified. Therefore, this study mapped five criteria describing elements of multifunctional agriculture to determine a hypothetical conservation resource allocation plan for agricultural land conservation in the Central Valley of CA, USA. We compared solution procedures within the utility maximization framework to determine the difference between an open source integer programming approach and a greedy heuristic, and find gains from optimization of up to 12%. We also model land availability for conservation action as a stochastic process and determine the decline in total utility compared to the globally optimal set using both solution algorithms. Our results are comparable to other studies illustrating the benefits of optimization for different conservation planning problems, and highlight the importance of maximizing the effectiveness of limited funding for conservation and natural resource management. PMID:25538868

  7. Optimization in the utility maximization framework for conservation planning: a comparison of solution procedures in a study of multifunctional agriculture

    USGS Publications Warehouse

    Kreitler, Jason R.; Stoms, David M.; Davis, Frank W.

    2014-01-01

    Quantitative methods of spatial conservation prioritization have traditionally been applied to issues in conservation biology and reserve design, though their use in other types of natural resource management is growing. The utility maximization problem is one form of a covering problem where multiple criteria can represent the expected social benefits of conservation action. This approach allows flexibility with a problem formulation that is more general than typical reserve design problems, though the solution methods are very similar. However, few studies have addressed optimization in utility maximization problems for conservation planning, and the effect of solution procedure is largely unquantified. Therefore, this study mapped five criteria describing elements of multifunctional agriculture to determine a hypothetical conservation resource allocation plan for agricultural land conservation in the Central Valley of CA, USA. We compared solution procedures within the utility maximization framework to determine the difference between an open source integer programming approach and a greedy heuristic, and find gains from optimization of up to 12%. We also model land availability for conservation action as a stochastic process and determine the decline in total utility compared to the globally optimal set using both solution algorithms. Our results are comparable to other studies illustrating the benefits of optimization for different conservation planning problems, and highlight the importance of maximizing the effectiveness of limited funding for conservation and natural resource management.
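
    The greedy heuristic referred to in both records above can be sketched in a few lines (illustrative only; the parcel data format and the benefit/cost ranking rule are assumptions):

        def greedy_allocation(parcels, budget):
            """Benefit/cost greedy heuristic for a budget-constrained
            covering problem; `parcels` is a list of (name, utility, cost)
            tuples. An exact integer program can gain up to ~12% utility
            over this heuristic, per the study above."""
            chosen, spent = [], 0.0
            ranked = sorted(parcels, key=lambda p: p[1] / p[2], reverse=True)
            for name, utility, cost in ranked:
                if spent + cost <= budget:
                    chosen.append(name)
                    spent += cost
            return chosen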

  8. Methods Used by Pre-Service Nigeria Certificate in Education Teachers in Solving Quantitative Problems in Chemistry

    ERIC Educational Resources Information Center

    Danjuma, Ibrahim Mohammed

    2011-01-01

    This paper reports part of the results of research on the chemical problem-solving behavior of pre-service teachers in Plateau and the Northeastern states of Nigeria. Specifically, it examines and describes the methods used by 204 pre-service teachers in solving quantitative problems from four topics in chemistry, namely gas laws; electrolysis;…

  9. Errors Made by Elementary Fourth Grade Students When Modelling Word Problems and the Elimination of Those Errors through Scaffolding

    ERIC Educational Resources Information Center

    Ulu, Mustafa

    2017-01-01

    This study aims to identify errors made by primary school students when modelling word problems and to eliminate those errors through scaffolding. A 10-question problem-solving achievement test was used in the research. The qualitative and quantitative designs were utilized together. The study group of the quantitative design comprises 248…

  10. Quantitative comparison of 3D third harmonic generation and fluorescence microscopy images.

    PubMed

    Zhang, Zhiqing; Kuzmin, Nikolay V; Groot, Marie Louise; de Munck, Jan C

    2018-01-01

    Third harmonic generation (THG) microscopy is a label-free imaging technique that shows great potential for rapid pathology of brain tissue during brain tumor surgery. However, the interpretation of THG brain images should be quantitatively linked to images from more standard imaging techniques, which so far has been done only qualitatively. We establish here such a quantitative link between THG images of mouse brain tissue and all-nuclei-highlighted fluorescence images, acquired simultaneously from the same tissue area. For quantitative comparison of a substantial pair of images, we present a segmentation workflow that is applicable to both THG and fluorescence images, with precisions of 91.3% and 95.8% achieved, respectively. We find that the correspondence between the main features of the two imaging modalities amounts to 88.9%, providing quantitative evidence for the interpretation of dark holes as brain cells. Moreover, 80% of the bright objects in THG images overlap with nuclei highlighted in the fluorescence images, and they are two times smaller than the dark holes, showing that cells of different morphologies can be recognized in THG images. We expect that the described quantitative comparison is applicable to other types of brain tissue and, with more specific staining experiments, to cell type identification. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
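
    The correspondence percentages quoted above are, in essence, overlap fractions between the two segmentations; a minimal sketch (illustrative, not the published workflow):

        import numpy as np

        def correspondence(mask_thg, mask_fluo):
            """Fraction of THG-segmented foreground confirmed by the
            simultaneously acquired fluorescence segmentation."""
            mask_thg = mask_thg.astype(bool)
            mask_fluo = mask_fluo.astype(bool)
            return np.logical_and(mask_thg, mask_fluo).sum() / mask_thg.sum()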

  11. Field-effect sensors - from pH sensing to biosensing: sensitivity enhancement using streptavidin-biotin as a model system.

    PubMed

    Lowe, Benjamin M; Sun, Kai; Zeimpekis, Ioannis; Skylaris, Chris-Kriton; Green, Nicolas G

    2017-11-06

    Field-Effect Transistor sensors (FET-sensors) have been receiving increasing attention for biomolecular sensing over the last two decades due to their potential for ultra-high-sensitivity sensing, label-free operation, cost reduction and miniaturisation. Whilst the commercial application of FET-sensors in pH sensing has been realised, their commercial application in biomolecular sensing (termed BioFETs) is hindered by poor understanding of how to optimise device design for highly reproducible operation and high sensitivity. In part, these problems stem from the highly interdisciplinary nature of this field, in which knowledge of biomolecular-binding kinetics, surface chemistry, electrical double layer physics and electrical engineering is required. In this work, a quantitative analysis and critical review have been performed comparing literature FET-sensor data for pH sensing with data for sensing of biomolecular streptavidin binding to surface-bound biotin systems. The aim is to provide the first systematic, quantitative comparison of BioFET results for a single biomolecular analyte, specifically streptavidin, which is the most commonly used model protein in biosensing experiments and often used as an initial proof of concept for new biosensor designs. This quantitative and comparative analysis of the surface potential behaviour of a range of devices demonstrated a strong contrast between the trends observed in pH sensing and those in biomolecule sensing. Potential explanations are discussed in detail, and surface-chemistry optimisation is shown to be a vital component of sensitivity enhancement. Factors which can influence the response, yet which have not always been fully appreciated, are explored, and practical suggestions are provided on how to improve experimental design.
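
    For orientation, pH-sensing responses in this literature are judged against the ideal Nernstian limit of the surface potential change,

        \Delta\Psi_{\max} = 2.303\,\frac{kT}{q} \ \text{per pH unit} \approx 59.2\ \mathrm{mV/pH} \ \text{at} \ 298\ \mathrm{K},

    whereas biomolecule-sensing responses are typically far smaller, in part because counter-ion screening attenuates the protein charge; this is one reason the pH and streptavidin-biotin trends can contrast so strongly.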

  12. The Local Geometry of Multiattribute Tradeoff Preferences

    PubMed Central

    McGeachie, Michael; Doyle, Jon

    2011-01-01

    Existing representations for multiattribute ceteris paribus preference statements have provided useful treatments and clear semantics for qualitative comparisons, but have not provided similarly clear representations or semantics for comparisons involving quantitative tradeoffs. We use directional derivatives and other concepts from elementary differential geometry to interpret conditional multiattribute ceteris paribus preference comparisons that state bounds on quantitative tradeoff ratios. This semantics extends the familiar economic notion of marginal rate of substitution to multiple continuous or discrete attributes. The same geometric concepts also provide means for interpreting statements about the relative importance of different attributes. PMID:21528018
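
    In the familiar two-attribute case with utility function u(x_1, x_2), the marginal rate of substitution is

        \mathrm{MRS}_{12}(x) = \frac{\partial u / \partial x_1}{\partial u / \partial x_2},

    and a tradeoff statement such as "one unit of attribute 1 is worth at least r units of attribute 2" bounds this ratio from below, \mathrm{MRS}_{12}(x) \ge r; the directional-derivative machinery in the paper generalizes such bounds to many continuous or discrete attributes.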

  13. GraphCrunch 2: Software tool for network modeling, alignment and clustering.

    PubMed

    Kuchaiev, Oleksii; Stevanović, Aleksandar; Hayes, Wayne; Pržulj, Nataša

    2011-01-19

    Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks for the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch; it implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarity, far larger than any other existing tool. Finally, GraphCrunch 2 implements an algorithm for clustering nodes within a network based solely on their topological similarities. Using GraphCrunch 2, we demonstrate that eukaryotic and viral PPI networks may belong to different graph model families and show that topology-based clustering can reveal important functional similarities between proteins within yeast and human PPI networks. GraphCrunch 2 is a software tool that implements the latest research on biological network analysis. It parallelizes computationally intensive tasks to fully utilize the potential of modern multi-core CPUs. It is open source and freely available for research use. It runs under the Windows and Linux platforms.
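
    In the same spirit as GraphCrunch's model-versus-data comparisons, two of the easily computable network properties can be checked against a random-graph model in a few lines (illustrative Python with networkx, not the GraphCrunch 2 code):

        import networkx as nx

        def clustering_vs_model(data_graph, n_samples=30):
            """Compare a data network's average clustering coefficient with
            that of Erdos-Renyi graphs of matching size and density."""
            n = data_graph.number_of_nodes()
            m = data_graph.number_of_edges()
            p = 2.0 * m / (n * (n - 1))
            cc_data = nx.average_clustering(data_graph)
            cc_model = sum(nx.average_clustering(nx.gnp_random_graph(n, p))
                           for _ in range(n_samples)) / n_samples
            return cc_data, cc_model   # large gap suggests a poor model fit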

  14. Testing for nonrandom shape similarity between sister cells using automated shape comparison

    NASA Astrophysics Data System (ADS)

    Guo, Monica; Marshall, Wallace F.

    2009-02-01

    Several reports in the biological literature have indicated that when a living cell divides, the two daughter cells have a tendency to be mirror images of each other in terms of their overall cell shape. This phenomenon would be consistent with inheritance of spatial organization from mother cell to daughters. However, the published data rely on a small number of examples that were visually chosen, raising potential concerns about inadvertent selection bias. We propose to revisit this issue using automated quantitative shape comparison methods, which have no contribution from the observer and which allow statistical testing of similarity in large numbers of cells. In this report we describe a first-order approach to the problem using rigid curve matching. Using test images, we compare a pointwise-correspondence-based distance metric with a chamfer matching strategy and find that the latter provides better correspondence and smaller distances between aligned curves, especially when we allow nonrigid deformation of the outlines in addition to rotation.
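
    A minimal sketch of the chamfer matching score (symmetric chamfer distance between two binary outlines; illustrative, not the authors' implementation):

        import numpy as np
        from scipy.ndimage import distance_transform_edt

        def chamfer_distance(mask_a, mask_b):
            """Average distance from each boundary pixel of one shape to the
            nearest boundary pixel of the other, symmetrized.
            mask_a, mask_b: boolean images marking the outline pixels."""
            dt_a = distance_transform_edt(~mask_a)  # distance to nearest A pixel
            dt_b = distance_transform_edt(~mask_b)  # distance to nearest B pixel
            return 0.5 * (dt_b[mask_a].mean() + dt_a[mask_b].mean())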

  15. Systematic evaluation of NASA precipitation radar estimates using NOAA/NSSL National Mosaic QPE products

    NASA Astrophysics Data System (ADS)

    Kirstetter, P.; Hong, Y.; Gourley, J. J.; Chen, S.; Flamig, Z.; Zhang, J.; Howard, K.; Petersen, W. A.

    2011-12-01

    Proper characterization of the error structure of TRMM Precipitation Radar (PR) quantitative precipitation estimation (QPE) is needed for their use in TRMM combined products, water budget studies and hydrological modeling applications. Due to the variety of sources of error in spaceborne radar QPE (attenuation of the radar signal, influence of land surface, impact of off-nadir viewing angle, etc.) and the impact of correction algorithms, the problem is addressed by comparison of PR QPEs with reference values derived from ground-based measurements (GV) using NOAA/NSSL's National Mosaic QPE (NMQ) system. An investigation of this subject has been carried out at the PR estimation scale (instantaneous and 5 km) on the basis of a 3-month-long data sample. A significant effort has been carried out to derive a bias-corrected, robust reference rainfall source from NMQ. The GV processing details will be presented along with preliminary results of PR's error characteristics using contingency table statistics, probability distribution comparisons, scatter plots, semi-variograms, and systematic biases and random errors.

  16. Single underwater image enhancement based on color cast removal and visibility restoration

    NASA Astrophysics Data System (ADS)

    Li, Chongyi; Guo, Jichang; Wang, Bo; Cong, Runmin; Zhang, Yan; Wang, Jian

    2016-05-01

    Images taken under underwater conditions usually suffer from color cast and serious loss of contrast and visibility. Degraded underwater images are inconvenient for observation and analysis. In order to address these problems, an underwater image-enhancement method is proposed. A simple yet effective underwater image color cast removal algorithm is first presented, based on optimization theory. Then, based on the minimum information loss principle and the inherent relationship between the medium transmission maps of the three color channels in an underwater image, an effective visibility restoration algorithm is proposed to recover the visibility, contrast, and natural appearance of degraded underwater images. To evaluate the performance of the proposed method, a qualitative comparison, a quantitative comparison, and a color accuracy test are conducted. Experimental results demonstrate that the proposed method can effectively remove color cast, improve contrast and visibility, and recover the natural appearance of degraded underwater images. Additionally, the proposed method is comparable to and even better than several state-of-the-art methods.
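
    For comparison, the simplest classical baseline for the color cast step is the gray-world correction (a far simpler alternative to the optimization-based removal proposed here):

        import numpy as np

        def gray_world(img):
            """Scale each color channel so its mean matches the global mean,
            neutralizing a uniform color cast. img: float array (H, W, 3)
            with values in [0, 1]."""
            channel_means = img.reshape(-1, 3).mean(axis=0)
            gain = channel_means.mean() / channel_means
            return np.clip(img * gain, 0.0, 1.0)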

  17. Quantitative Comparisons of a Coarse-Grid LES with Experimental Data for Backward-Facing Step Flow

    NASA Astrophysics Data System (ADS)

    McDonough, J. M.

    1999-11-01

    A novel approach to LES employing an additive decomposition of both solutions and governing equations (similar to the "multi-level" approaches of Dubois et al., Dynamic Multilevel Methods and the Simulation of Turbulence, Cambridge University Press, 1999) is presented; its main structural features are lack of filtering of the governing equations (instead, solutions are filtered to remove aliasing due to under-resolution) and direct modeling of subgrid-scale primitive variables (rather than modeling their correlations) in the manner proposed by Hylin and McDonough (Int. J. Fluid Mech. Res. 26, 228-256, 1999). A 2-D implementation of this formalism is applied to the backward-facing step flow studied experimentally by Driver and Seegmiller (AIAA J. 23, 163-171, 1985) and Driver et al. (AIAA J. 25, 914-919, 1987), and run on grids sufficiently coarse to permit easy extension to 3-D, industrially realistic problems. Comparisons of computed and experimental mean quantities (velocity profiles, turbulence kinetic energy, reattachment lengths, etc.) and effects of grid refinement will be presented.

  18. Deep Learning for Magnetic Resonance Fingerprinting: A New Approach for Predicting Quantitative Parameter Values from Time Series.

    PubMed

    Hoppe, Elisabeth; Körzdörfer, Gregor; Würfl, Tobias; Wetzl, Jens; Lugauer, Felix; Pfeuffer, Josef; Maier, Andreas

    2017-01-01

    The purpose of this work is to evaluate methods from deep learning for application to Magnetic Resonance Fingerprinting (MRF). MRF is a recently proposed measurement technique for generating quantitative parameter maps. In MRF, a non-steady-state signal is generated by a pseudo-random excitation pattern. A comparison of the measured signal in each voxel with the physical model yields quantitative parameter maps. Currently, the comparison is done by matching a dictionary of simulated signals to the acquired signals. To accelerate the computation of quantitative maps, we train a Convolutional Neural Network (CNN) on simulated dictionary data. As a proof of principle we show that the neural network implicitly encodes the dictionary and can replace the matching process.
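
    The dictionary matching step that the CNN is trained to replace can be sketched as a normalized inner-product search (illustrative shapes and names; not the authors' code):

        import numpy as np

        def mrf_match(signals, dictionary, params):
            """Assign each measured voxel signal the parameters of the
            dictionary entry with the highest normalized inner product.
            signals: (nvox, T); dictionary: (natoms, T); params: (natoms, k)."""
            s = signals / np.linalg.norm(signals, axis=1, keepdims=True)
            d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
            best = np.abs(s @ d.T).argmax(axis=1)
            return params[best]   # (nvox, k) quantitative parameter estimates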

  19. Underwater image enhancement based on the dark channel prior and attenuation compensation

    NASA Astrophysics Data System (ADS)

    Guo, Qingwen; Xue, Lulu; Tang, Ruichun; Guo, Lingrui

    2017-10-01

    Aimed at two problems of underwater imaging, fog effects and color cast, an Improved Segmentation Dark Channel Prior (ISDCP) defogging method is proposed to remove the fog effects caused by the physical properties of water. Fog effects lead to image blurring due to the mass refraction of light during underwater imaging, while color cast is closely related to the different degrees of attenuation of light of different wavelengths traveling in water. The proposed method integrates ISDCP and quantitative histogram stretching techniques into the image-enhancement procedure. Firstly, a threshold value is set during the refinement of the transmission maps to identify the original mismatching and to conduct a differentiated defogging process. Secondly, a method of judging the propagation distance of light is adopted to obtain the degree of energy attenuation during underwater propagation. Finally, the image histogram is stretched quantitatively in the red, green, and blue channels according to the degree of attenuation in each color channel. The proposed ISDCP method reduces computational complexity and improves defogging efficiency to meet real-time requirements. Qualitative and quantitative comparisons for several different underwater scenes reveal that the proposed method significantly improves visibility compared with previous methods.
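
    The underlying dark channel prior of He et al., on which the ISDCP method builds, can be sketched as follows (illustrative; the segmentation-based improvements are not shown):

        import numpy as np
        from scipy.ndimage import minimum_filter

        def dark_channel(img, patch=15):
            """Dark channel: per-pixel minimum over the three color channels,
            followed by a local minimum filter over a patch.
            img: (H, W, 3) float array in [0, 1]."""
            per_pixel_min = img.min(axis=2)
            return minimum_filter(per_pixel_min, size=patch)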

  20. Hierarchical mutual information for the comparison of hierarchical community structures in complex networks

    NASA Astrophysics Data System (ADS)

    Perotti, Juan Ignacio; Tessone, Claudio Juan; Caldarelli, Guido

    2015-12-01

    The quest for a quantitative characterization of the community and modular structure of complex networks has produced a variety of methods and algorithms to classify different networks. However, it is not clear whether such methods provide consistent, robust, and meaningful results when considering hierarchies as a whole. Part of the problem is the lack of a similarity measure for the comparison of hierarchical community structures. In this work we contribute by introducing the hierarchical mutual information, a generalization of the traditional mutual information that makes it possible to compare hierarchical partitions and hierarchical community structures. The normalized version of the hierarchical mutual information should behave analogously to the traditional normalized mutual information. Here, the correct behavior of the hierarchical mutual information is corroborated in an extensive battery of numerical experiments. The experiments are performed on artificial hierarchies and on the hierarchical community structure of artificial and empirical networks. Furthermore, the experiments illustrate some of the practical applications of the hierarchical mutual information, namely the comparison of different community detection methods and the study of the consistency, robustness, and temporal evolution of the hierarchical modular structure of networks.
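
    For contrast, the traditional (flat) normalized mutual information that the hierarchical measure generalizes is a one-liner with scikit-learn:

        from sklearn.metrics import normalized_mutual_info_score

        # Two flat partitions given as per-node community labels.
        part_a = [0, 0, 1, 1, 2, 2]
        part_b = [0, 0, 1, 1, 1, 2]
        print(normalized_mutual_info_score(part_a, part_b))  # 1.0 iff identical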

  1. A resilience-oriented approach for quantitatively assessing recurrent spatial-temporal congestion on urban roads.

    PubMed

    Tang, Junqing; Heinimann, Hans Rudolf

    2018-01-01

    Traffic congestion brings not only delay and inconvenience but also other associated national concerns, such as greenhouse gases, air pollutants, road safety issues and risks. Identification, measurement, tracking, and control of urban recurrent congestion are vital for building a livable and smart community. A considerable body of work has contributed to tackling this problem. Several methods, such as time-based approaches and level of service, can be effective for characterizing congestion on urban streets. However, systemic perspectives have been rare in congestion quantification. Resilience, on the other hand, is an emerging concept that focuses on comprehensive systemic performance and characterizes the ability of a system to cope with disturbance and to recover its functionality. In this paper, we treated recurrent congestion as an internal disturbance and proposed a modified metric inspired by the well-applied "R4" resilience-triangle framework. We constructed the metric with generic dimensions from both resilience engineering and transport science to quantify recurrent congestion based on spatial-temporal traffic patterns, and we compared it with two other approaches in freeway and signal-controlled arterial cases. Results showed that the metric can effectively capture congestion patterns in the study area and provides a quantitative benchmark for comparison. They also suggested not only good comparative measurement performance for the proposed metric but also its capability to account for the discharging process of congestion. Sensitivity tests showed that the proposed metric is robust against parameter perturbation within the Robustness Range (RR), although the number of identified congestion patterns can be influenced by the existence of ϵ. In addition, the Elasticity Threshold (ET) and the spatial dimension of the cell-based platform significantly affect the congestion results, in both the detected number and the intensity of patterns. By tackling this conventional problem with an emerging concept, our metric provides a systemic alternative and enriches the toolbox for congestion assessment. Future work will be conducted on a larger scale with multiplex scenarios in various traffic conditions.
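
    A generic resilience-triangle quantity of the kind the metric builds on can be sketched as an integrated performance deficit (illustrative only; not the paper's exact formula, which adds dimensions such as the discharging process):

        import numpy as np

        def resilience_loss(t, performance, baseline=1.0):
            """Area of the 'resilience triangle': performance deficit relative
            to the free-flow baseline, integrated over a congestion episode.
            t: time stamps; performance: normalized traffic performance."""
            deficit = np.clip(baseline - np.asarray(performance), 0.0, None)
            return np.trapz(deficit, t)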

  2. Variable pixel size ionospheric tomography

    NASA Astrophysics Data System (ADS)

    Zheng, Dunyong; Zheng, Hongwei; Wang, Yanjun; Nie, Wenfeng; Li, Chaokui; Ao, Minsi; Hu, Wusheng; Zhou, Wei

    2017-06-01

    A novel ionospheric tomography technique based on variable pixel size was developed for the tomographic reconstruction of the ionospheric electron density (IED) distribution. In the variable pixel size computerized ionospheric tomography (VPSCIT) model, the IED distribution is parameterized by a decomposition of the lower and upper ionosphere with different pixel sizes. Thus, the lower and upper IED distributions may be determined very differently by the available data. In most other respects, variable pixel size ionospheric tomography and constant pixel size tomography are similar. Two differences between the constant and variable pixel size models stand out: first, the segments of the GPS signal path must be assigned to the different kinds of pixels in the inversion; second, the smoothness constraint factor needs to be modified appropriately where the pixels change in size. For a real dataset, the variable pixel size method distinguishes different electron density distribution zones better than the constant pixel size method, and it rewards the effort spent identifying the regions of a model with the best data coverage. The variable pixel size method can not only greatly improve the efficiency of the inversion but also produce IED images whose fidelity matches that of a uniform pixel size method. In addition, variable pixel size tomography can reduce the underdetermination of the ill-posed inverse problem when data coverage is irregular or sparse, by adjusting the proportions of pixels with different sizes. In comparison with constant pixel size tomography models, the variable pixel size technique achieved relatively good results in a numerical simulation, and a careful validation of its reliability and superiority was performed. Finally, according to the statistical analysis and quantitative comparison, the proposed method offers an improvement of 8% over conventional constant pixel size tomography models in the forward modeling.

  3. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed the isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays, which could assist assay development and validation activities.
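
    One plausible reading of a doubling-time metric is the time per doubling extracted from the exponential phase of an amplification curve (an assumption for illustration; the authors' exact IDT definition is not reproduced here):

        import numpy as np

        def doubling_time(t, signal):
            """Fit log2(signal) vs. time over the exponential phase and invert
            the slope; returns time units per doubling (e.g. minutes)."""
            slope, _ = np.polyfit(t, np.log2(signal), 1)
            return 1.0 / slope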

  4. Research on a Unique Instructional Framework for Elevating Students’ Quantitative Problem Solving Abilities

    NASA Astrophysics Data System (ADS)

    Prather, Edward E.; Wallace, Colin Scott

    2018-06-01

    We present an instructional framework that allowed a first-time physics instructor to improve students' quantitative problem-solving abilities by more than a letter grade over what was achieved by students in an experienced instructor's course. This instructional framework uses a Think-Pair-Share approach to foster collaborative quantitative problem solving during the lecture portion of a large-enrollment introductory calculus-based mechanics course. Through the development of carefully crafted and sequenced TPS questions, we engage students in rich discussions of key problem-solving issues that we typically only hear about when a student comes for help during office hours. Current work in the sophomore E&M course illustrates that this framework is generalizable to classes beyond the introductory level and to topics beyond mechanics.

  5. Ice nucleation by particles immersed in supercooled cloud droplets.

    PubMed

    Murray, B J; O'Sullivan, D; Atkinson, J D; Webb, M E

    2012-10-07

    The formation of ice particles in the Earth's atmosphere strongly affects the properties of clouds and their impact on climate. Despite the importance of ice formation in determining the properties of clouds, the Intergovernmental Panel on Climate Change (IPCC, 2007) was unable to assess the impact of atmospheric ice formation in their most recent report because our basic knowledge is insufficient. Part of the problem is the paucity of quantitative information on the ability of various atmospheric aerosol species to initiate ice formation. Here we review and assess the existing quantitative knowledge of ice nucleation by particles immersed within supercooled water droplets. We introduce aerosol species which have been identified in the past as potentially important ice nuclei and address their ice-nucleating ability when immersed in a supercooled droplet. We focus on mineral dusts, biological species (pollen, bacteria, fungal spores and plankton), carbonaceous combustion products and volcanic ash. In order to make a quantitative comparison we first introduce several ways of describing ice nucleation and then summarise the existing information according to the time-independent (singular) approximation. Using this approximation in combination with typical atmospheric loadings, we estimate the importance of ice nucleation by different aerosol types. According to these estimates we find that ice nucleation below about -15 °C is dominated by soot and mineral dusts. Above this temperature the only materials known to nucleate ice are biological, with quantitative data for other materials absent from the literature. We conclude with a summary of the challenges our community faces.
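
    Under the time-independent (singular) approximation used for this summary, the frozen fraction of a droplet population follows from the cumulative ice-active site density n_s(T) and the immersed particle surface area A per droplet:

        f_{\mathrm{ice}}(T) = 1 - \exp\left(-n_s(T)\,A\right),

    which, combined with typical atmospheric surface-area loadings, is what allows the relative importance of the different aerosol types to be estimated.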

  6. Gaussian variational ansatz in the problem of anomalous sea waves: Comparison with direct numerical simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruban, V. P., E-mail: ruban@itp.ac.ru

    2015-05-15

    The nonlinear dynamics of an obliquely oriented wave packet on a sea surface is analyzed analytically and numerically for various initial parameters of the packet in relation to the problem of the so-called rogue waves. Within the Gaussian variational ansatz applied to the corresponding (1+2)-dimensional hyperbolic nonlinear Schrödinger equation (NLSE), a simplified Lagrangian system of differential equations is derived that describes the evolution of the coefficients of the real and imaginary quadratic forms appearing in the Gaussian. This model provides a semi-quantitative description of the process of nonlinear spatiotemporal focusing, which is one of the most probable mechanisms of rogue wave formation in random wave fields. The system of equations is integrated in quadratures, which allows one to better understand the qualitative differences between linear and nonlinear focusing regimes of a wave packet. Predictions of the Gaussian model are compared with the results of direct numerical simulation of fully nonlinear long-crested waves.
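
    Schematically, the hyperbolic NLSE has the generic form (with coefficients \alpha, \beta, \gamma > 0; the paper's exact deep-water coefficients are not reproduced here)

        i\,\frac{\partial \psi}{\partial t} + \alpha\,\frac{\partial^2 \psi}{\partial x^2} - \beta\,\frac{\partial^2 \psi}{\partial y^2} + \gamma\,|\psi|^2 \psi = 0,

    where the opposite signs of the two dispersive terms are what make the equation hyperbolic; the Gaussian ansatz reduces this PDE to ordinary differential equations for the packet's widths and phase curvatures.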

  7. Evaluation of the clinical sensitivity for the quantification of human immunodeficiency virus type 1 RNA in plasma: Comparison of the new COBAS TaqMan HIV-1 with three current HIV-RNA assays--LCx HIV RNA quantitative, VERSANT HIV-1 RNA 3.0 (bDNA) and COBAS AMPLICOR HIV-1 Monitor v1.5.

    PubMed

    Katsoulidou, Antigoni; Petrodaskalaki, Maria; Sypsa, Vana; Papachristou, Eleni; Anastassopoulou, Cleo G; Gargalianos, Panagiotis; Karafoulidou, Anastasia; Lazanas, Marios; Kordossis, Theodoros; Andoniadou, Anastasia; Hatzakis, Angelos

    2006-02-01

    The COBAS TaqMan HIV-1 test (Roche Diagnostics) was compared with the LCx HIV RNA quantitative assay (Abbott Laboratories), the Versant HIV-1 RNA 3.0 (bDNA) assay (Bayer) and the COBAS Amplicor HIV-1 Monitor v1.5 test (Roche Diagnostics), using plasma samples of various viral load levels from HIV-1-infected individuals. In the comparison of TaqMan with LCx, TaqMan identified as positive 77.5% of the 240 samples versus 72.1% identified by LCx assay, while their overall agreement was 94.6% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.91). Similarly, in the comparison of TaqMan with bDNA 3.0, both methods identified 76.3% of the 177 samples as positive, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.95). Finally, in the comparison of TaqMan with Monitor v1.5, TaqMan identified 79.5% of the 156 samples as positive versus 80.1% identified by Monitor v1.5, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.96). In conclusion, the new COBAS TaqMan HIV-1 test showed excellent agreement with other widely used commercially available tests for the quantitation of HIV-1 viral load.
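    The two summary statistics used throughout this comparison (overall detect/non-detect agreement, and correlation of results positive by both methods) are straightforward to compute; a minimal sketch with hypothetical paired values:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired viral loads (copies/mL); 0 = not detected.
taqman = np.array([1.2e3, 0, 5.4e4, 8.0e2, 0, 2.1e5])
other  = np.array([9.8e2, 0, 6.1e4, 0,     0, 1.8e5])

# Overall agreement: fraction of samples with the same detect/non-detect call.
agreement = np.mean((taqman > 0) == (other > 0))

# Correlation of log10 viral loads among samples positive by both assays.
both = (taqman > 0) & (other > 0)
r, _ = pearsonr(np.log10(taqman[both]), np.log10(other[both]))
print(f"agreement = {agreement:.1%}, r = {r:.2f}")
```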

  8. COMPARISON OF GENETIC METHODS TO OPTICAL METHODS IN THE IDENTIFICATION AND ASSESSMENT OF MOLD IN THE BUILT ENVIRONMENT -- COMPARISON OF TAQMAN AND MICROSCOPIC ANALYSIS OF CLADOSPORIUM SPORES RETRIEVED FROM ZEFON AIR-O-CELL TRACES

    EPA Science Inventory

    Recent advances in the sequencing of relevant water intrusion fungi by the EPA, combined with the development of probes and primers, have allowed for the unequivocal quantitative and qualitative identification of fungi in selected matrices.

    In this pilot study, quantitative...

  9. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  10. Dissociative Conceptual and Quantitative Problem Solving Outcomes across Interactive Engagement and Traditional Format Introductory Physics

    ERIC Educational Resources Information Center

    McDaniel, Mark A.; Stoen, Siera M.; Frey, Regina F.; Markow, Zachary E.; Hynes, K. Mairin; Zhao, Jiuqing; Cahill, Michael J.

    2016-01-01

    The existing literature indicates that interactive-engagement (IE) based general physics classes improve conceptual learning relative to more traditional lecture-oriented classrooms. Very little research, however, has examined quantitative problem-solving outcomes from IE based relative to traditional lecture-based physics classes. The present…

  11. Solving Quantitative Problems: Guidelines for Teaching Derived from Research.

    ERIC Educational Resources Information Center

    Kramers-Pals, H.; Pilot, A.

    1988-01-01

    Presents four guidelines for teaching quantitative problem-solving based on research results: analyze difficulties of students, develop a system of heuristics, select and map key relations, and design instruction with proper orientation, exercise, and feedback. Discusses the four guidelines and uses flow charts and diagrams to show how the…

  12. Combining Experiments and Simulations Using the Maximum Entropy Principle

    PubMed Central

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-01-01

    A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights to the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges. PMID:24586124
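    As a concrete illustration of the reweighting flavor of this idea (a minimal sketch, not the specific schemes of the three papers discussed): the simulation ensemble is minimally perturbed, in the maximum-entropy sense, so that the reweighted average of an observable matches the measured value.

```python
import numpy as np
from scipy.optimize import brentq

def maxent_reweight(f_sim, f_exp):
    """Maximum-entropy reweighting sketch: find per-frame weights
    w_i proportional to exp(-lam * f_i) (relative to a uniform prior)
    such that the reweighted ensemble average of the observable f
    matches the experimental value f_exp."""
    def mismatch(lam):
        w = np.exp(-lam * (f_sim - f_sim.mean()))  # shift for numerical stability
        w /= w.sum()
        return np.dot(w, f_sim) - f_exp
    lam = brentq(mismatch, -50, 50)                # scalar root find for lam
    w = np.exp(-lam * (f_sim - f_sim.mean()))
    return w / w.sum()

# Hypothetical per-frame observable values and a measured target average:
f_sim = np.random.default_rng(0).normal(5.0, 1.0, 1000)
w = maxent_reweight(f_sim, f_exp=5.4)
print(np.dot(w, f_sim))  # ~5.4 after reweighting
```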

  13. Quantitative analysis of a reconstruction method for fully three-dimensional PET.

    PubMed

    Suckling, J; Ott, R J; Deehan, B J

    1992-03-01

    The major advantage of positron emission tomography (PET) using large area planar detectors over scintillator-based commercial ring systems is the potentially larger (by a factor of two or three) axial field-of-view (FOV). However, to achieve the space invariance of the point spread function necessary for Fourier filtering, a polar angle rejection criterion is applied to the data during backprojection, resulting in a trade-off between FOV size and sensitivity. A new algorithm, due to Defrise and co-workers and developed for list-mode data, overcomes this problem with a solution involving the division of the image into several subregions. A comparison between the existing backprojection-then-filter algorithm and the new method (with three subregions) has been made using both simulated and real data collected from the MUP-PET positron camera. Signal-to-noise analysis reveals that improvements of up to a factor of 1.4 are possible, resulting from an increased data usage of up to a factor of 2.5 depending on the axial extent of the imaged object. Quantitation is also improved.

  14. A comparison of two types of social support for mothers of mentally ill children.

    PubMed

    Scharer, Kathleen; Colon, Eileen; Moneyham, Linda; Hussey, Jim; Tavakoli, Abbas; Shugart, Margaret

    2009-05-01

    The purpose of this analysis was to compare social support offered by two telehealth nursing interventions for mothers of children with serious mental illnesses. A randomized, controlled, quantitative investigation is underway to test two support interventions, using the telephone (TSS) or Internet (WEB). Qualitative description was used to analyze data generated during telehealth interventions. The behaviors and attitudes of children were challenging for the mothers to manage. Mothers' emotional reactions included fear, frustration, concern, and guilt. They sought to be advocates for their children. The nurses provided emotional, informational, and appraisal support. TSS mothers were passive recipients, while WEB mothers had to choose to participate. Mothers in both interventions shared similar concerns and sought support related to their child's problems.

  15. Preprocessing film-copied MRI for studying morphological brain changes.

    PubMed

    Pham, Tuan D; Eisenblätter, Uwe; Baune, Bernhard T; Berger, Klaus

    2009-06-15

    Magnetic resonance imaging (MRI) of the brain is an important data source for studying memory and morbidity in the elderly, as these images can provide useful information through quantitative measures of various regions of interest of the brain. As an effort to fully automate biomedical analysis of the brain, which can then be combined with genetic data from the same human population in cases where the original MRI records are missing, this paper presents two effective methods for addressing this imaging problem. The first method handles the restoration of the film-copied MRI. The second method involves the segmentation of the image data. Experimental results and comparisons with other methods suggest the usefulness of the proposed image analysis methodology.

  16. Nuclear physics: Macroscopic aspects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiatecki, W.J.

    1993-12-01

    A systematic macroscopic, leptodermous approach to nuclear statics and dynamics is described, based formally on the assumptions ħ → 0 and b/R << 1, where b is the surface diffuseness and R the nuclear radius. The resulting static model of shell-corrected nuclear binding energies and deformabilities is accurate to better than 1 part in a thousand and yields a firm determination of the principal properties of the nuclear fluid. As regards dynamics, the above approach suggests that nuclear shape evolutions will often be dominated by dissipation, but quantitative comparisons with experimental data are more difficult than in the case of statics. In its simplest liquid drop version the model exhibits interesting formal connections to the classic astronomical problem of rotating gravitating masses.

  17. The Kinetics of Selective Biological Transport

    PubMed Central

    Miller, D. M.

    1968-01-01

    The simplest biological transport system so far extensively investigated is that of monosaccharides in human erythrocytes. Despite its simplicity there is still considerable doubt and divergence of opinion concerning its mechanism. Some confusion may arise as a result of the comparison of diverse data obtained by different workers using a variety of experimental techniques. To minimize this problem, an attempt is made here to repeat, under standard conditions and with as much care as possible, five of the more definitive types of experiments previously performed on this system. It is hoped that the result of this effort is an internally consistent set of data with which the quantitative predictions of various proposed mechanisms may be compared as a primary criterion for their acceptability. PMID:5696215

  18. Large Area Silicon Sheet by EFG

    NASA Technical Reports Server (NTRS)

    Wald, F. V.

    1979-01-01

    Displaced die concepts were explored along with some initial work on buckle characterization. Convective impurity redistribution was further studied. Growth from single cartridges was continued to create a quality baseline, to allow comparison of the results with those of the upcoming multiple run, and to choose the most appropriate die design. Fabrication and assembly work on the actual five-ribbon furnace continued. Progress was made toward the development of the video optical system for edge position and meniscus height control. In preparation for a detailed program designed to explore the buckling problem, ribbon guidance in the machine was improved. Buckle-free, full-width ribbon was grown under stable conditions without a cold shoe, an achievement essential to finally arriving at quantitative correlations between growth conditions and buckle formation.

  19. Comparative recreational assessment of Karaganda city public green spaces

    NASA Astrophysics Data System (ADS)

    Akylbekova, I. S.; Zengina, T. Yu

    2018-01-01

    This article presents an evaluation of the recreational environment of Karaganda, a large industrial city located in the dry steppe zone of Central Kazakhstan. A comparison of quantitative and qualitative indicators, of the level of recreational attractiveness, and of the provision of public green spaces to citizens allowed a more complete characterization of the city's recreation places and identified the districts that most urgently require funding, both for the development of existing parks and public gardens and for the creation of new recreational territories. Based on the results of an expert assessment and a sociological survey of visitors, the main problems of the urban green areas were identified, and the most in-demand directions and practical recommendations for their improvement and further use were proposed.

  20. Congruent climate-related genecological responses from molecular markers and quantitative traits for western white pine (Pinus monticola)

    Treesearch

    Bryce A. Richardson; Gerald E. Rehfeldt; Mee-Sook Kim

    2009-01-01

    Analyses of molecular and quantitative genetic data demonstrate the existence of congruent climate-related patterns in western white pine (Pinus monticola). Two independent studies allowed comparisons of amplified fragment length polymorphism (AFLP) markers with quantitative variation in adaptive traits. Principal component analyses...

  1. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.

  2. Determining absolute protein numbers by quantitative fluorescence microscopy.

    PubMed

    Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry

    2014-01-01

    Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers: fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition. © 2014 Elsevier Inc. All rights reserved.
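    Of the three methods, the ratiometric comparison reduces to a one-line calculation; a minimal sketch (the intensities and standard size below are hypothetical):

```python
def ratiometric_count(i_unknown, i_standard, n_standard):
    """Ratiometric comparison: the copy number of a fluorescently tagged
    protein is estimated from the ratio of its (background-corrected) spot
    intensity to that of a standard carrying a known number of copies of
    the same fluorophore."""
    return n_standard * (i_unknown / i_standard)

# Hypothetical values: a standard structure with 120 GFP copies vs. a spot of interest.
print(ratiometric_count(i_unknown=1.8e5, i_standard=2.4e5, n_standard=120))  # -> 90.0
```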

  3. Quantitative and qualitative approaches in educational research — problems and examples of controlled understanding through interpretive methods

    NASA Astrophysics Data System (ADS)

    Neumann, Karl

    1987-06-01

    In the methodological discussion of recent years it has become apparent that many research problems, including problems relating to the theory of educational science, cannot be solved by using quantitative methods. The multifaceted aspects of human behaviour and all its environment-bound subtle nuances, especially the process of education or the development of identity, cannot fully be taken into account within a rigid neopositivist approach. In employing the paradigm of symbolic interactionism as a suitable model for the analysis of processes of education and formation, the research generally has to start out from complex reciprocal social interactions rather than from unambiguous causal connections. In analysing several particular methodological problems, the article demonstrates some weaknesses of quantitative approaches and then shows the advantages in and the necessity for using qualitative research tools.

  4. Lifetime influences for cannabis cessation in male incarcerated indigenous australians.

    PubMed

    Jacups, Susan; Rogerson, Bernadette

    2015-01-01

    Urban non-indigenous populations report life events (marriages, employment) as influences for self-initiated cannabis cessation. However, this has not been investigated in remote indigenous populations with different social paradigms. We investigate cannabis use, harms, and poly-substance misuse in 101 consenting male incarcerated indigenous Australians. Interviews applied quantitative and qualitative questions assessing demographic characteristics, criminal history, drug use, the Marijuana Problems Inventory (MPI), and cannabis-cessation influences. Comparisons used chi-square tests, analysis of variance, and NVivo software. Cannabis use groups (current users, ex-users, and never users) were demographically similar, except that current users reported more juvenile legal problems, younger school departure, and lower school achievement (p < 0.05). Mean cannabis consumption was 12.3 cones/day. Incarceration and family responsibilities were the strongest cessation influences. Employment responsibilities and negative self-image were rarely cited as influences. High cannabis use, with its associated problems, is concerning. These identified influences indicate that incarceration should be used for substance reduction programs, plus post-release follow-up. Community-based programs focusing on positive influences, such as family responsibilities and social cohesion, may be successful within indigenous populations with strong kinship responsibilities, rather than programs that focus solely on substance harms.

  5. Link-Based Similarity Measures Using Reachability Vectors

    PubMed Central

    Yoon, Seok-Ho; Kim, Ji-Soo; Ryu, Minsoo; Choi, Ho-Jin

    2014-01-01

    We present a novel approach for computing link-based similarities among objects accurately by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link-based similarities that does not suffer from these problems. In the proposed approach each target object is represented by a vector. Each element of the vector corresponds to all the objects in the given data, and the value of each element denotes the weight for the corresponding object. As for this weight value, we propose to utilize the probability of reaching from the target object to the specific object, computed using the “Random Walk with Restart” strategy. Then, we define the similarity between two objects as the cosine similarity of the two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets, scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures. PMID:24701188
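    A minimal sketch of the approach as described (a power-iteration Random Walk with Restart to build each reachability vector, then cosine similarity; the toy graph and parameters are illustrative):

```python
import numpy as np

def rwr(P, target, restart=0.15, iters=100):
    """Random Walk with Restart: steady-state reachability vector of
    `target` over a column-stochastic transition matrix P."""
    n = P.shape[0]
    e = np.zeros(n); e[target] = 1.0   # restart distribution
    r = e.copy()
    for _ in range(iters):
        r = (1 - restart) * P @ r + restart * e
    return r

def link_similarity(P, a, b):
    """Similarity of two objects = cosine of their reachability vectors."""
    ra, rb = rwr(P, a), rwr(P, b)
    return ra @ rb / (np.linalg.norm(ra) * np.linalg.norm(rb))

# Toy 4-node link graph; columns normalized to be stochastic.
A = np.array([[0, 1, 1, 0], [1, 0, 0, 1], [1, 0, 0, 1], [0, 1, 1, 0]], float)
P = A / A.sum(axis=0)
print(link_similarity(P, 0, 3))
```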

  6. 75 FR 68468 - List of Fisheries for 2011

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ...-existent; therefore, quantitative data on the frequency of incidental mortality and serious injury is... currently available for most of these marine mammals on the high seas, and quantitative comparison of...

  7. Characteristics of students in comparative problem solving

    NASA Astrophysics Data System (ADS)

    Irfan, M.; Sudirman; Rahardi, R.

    2018-01-01

    Teachers often give students examples and exercises on comparison problems involving a single quantity. In this study, the researchers posed a comparison problem in which two quantities were mixed, so that a good understanding of proportion was needed to solve it. The study aimed to determine whether students understand comparison in depth and are able to solve non-routine comparison problems. It used a qualitative explorative method, with the researchers conducting in-depth interviews to explore the subjects' thinking processes while solving comparative problems. The subjects were three students selected by purposive sampling from 120 students. The researchers found three subjects with different characteristics: subject 1 solved the first and second questions by elimination and substitution (non-comparison); subject 2 attempted the first question with the concept of comparison, although the answer was wrong, and solved the second question by elimination and substitution (non-comparison); and subject 3 worked both questions with the concept of comparison, answering the first incorrectly because he was unable to understand the problem, and the second correctly. From the characteristics of the answers, the researchers identified three groups based on thinking process: blind-proportion, partial-proportion, and proportion thinking.

  8. Design and implementation of optical imaging and sensor systems for characterization of deep-sea biological camouflage

    NASA Astrophysics Data System (ADS)

    Haag, Justin Mathew

    The visual ecology of deep-sea animals has long been of scientific interest. In the open ocean, where there is no physical structure to hide within or behind, diverse strategies have evolved to solve the problem of camouflage from a potential predator. Simulations of specific predator-prey scenarios have yielded estimates of the range of possible appearances that an animal may exhibit. However, there is a limited amount of quantitative information available related to both animal appearance and the light field at mesopelagic depths (200 m to 1000 m). To mitigate this problem, novel optical instrumentation, taking advantage of recent technological advances, was developed and is described in this dissertation. In the first half of this dissertation, the appearance of mirrored marine animals is quantitatively evaluated. A portable optical imaging scatterometer was developed to measure angular reflectance, described by the bidirectional reflectance distribution function (BRDF), of biological specimens. The instrument allows for BRDF capture from samples of arbitrary size, over a significant fraction of the reflectance hemisphere. Multiple specimens representing two species of marine animals, collected at mesopelagic depths, were characterized using the scatterometer. Low-dimensional parametric models were developed to simplify use of the data sets, and to validate the BRDF method. Results from principal component analysis confirm that BRDF measurements can be used to study intra- and interspecific variability of mirrored marine animal appearance. Collaborative efforts utilizing the BRDF data sets to develop physically-based scattering models are underway. In the second half of this dissertation, another key part of the deep-sea biological camouflage problem is examined. Two underwater radiometers, capable of low-light measurements, were developed to address the lack of available information related to the deep-sea light field. Quantitative comparison of spectral downward irradiance profiles at blue (~470 nm) and green (~560 nm) wavelengths, collected at Pacific and Atlantic field stations, provides insight into the presence of Raman (inelastic) scattering effects at mesopelagic depths. The radiometers were also used to collect in situ flashes of bioluminescence. Collaborations utilizing both the downward irradiance and bioluminescence data sets are planned.

  9. Detection and quantitation of HPV in genital and oral tissues and fluids by real time PCR

    PubMed Central

    2010-01-01

    Background Human papillomaviruses (HPVs) remain a serious world health problem due to their association with anogenital/oral cancers and warts. While over 100 HPV types have been identified, a subset is associated with malignancy. HPV16 and 18 are the most prevalent oncogenic types, while HPV6 and 11 are most commonly responsible for anogenital warts. While other quantitative PCR (qPCR) assays detect oncogenic HPV, there is no single tube assay distinguishing the most frequent oncogenic types and the most common types found in warts. Results A SYBR Green-based qPCR assay was developed utilizing degenerate primers to the highly conserved HPV E1, theoretically detecting any HPV type. A single tube multiplex qPCR assay was also developed using type-specific primer pairs and TaqMan probes that allowed for detection and quantitation of HPV6, 11, 16, and 18. Each HPV type was detected over a range from 2 × 10^1 to 2 × 10^6 copies/reaction, providing a reliable method of quantitating type-specific HPV in 140 anogenital/cutaneous/oral benign and malignant specimens. 35 oncogenic and low risk alpha genus HPV types were detected. Concordance was detected in previously typed specimens. Comparisons to the gold standard yielded an overall sensitivity of 89% (95% CI: 77% - 96%) and specificity of 90% (95% CI: 52% - 98%). Conclusion There was good agreement between the ability of the qPCR assays described here to identify HPV types in malignancies previously typed using standard methods. These novel qPCR assays will allow rapid detection and quantitation of HPVs to assess their role in viral pathogenesis. PMID:20723234
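    A quoted copies-per-reaction range like this is established from a dilution-series standard curve; a minimal sketch of that calculation (the Ct values are hypothetical, not data from the study):

```python
import numpy as np

# Hypothetical Ct values for a 10-fold dilution series of a quantitation
# standard spanning 2e1 .. 2e6 copies/reaction:
copies = np.array([2e1, 2e2, 2e3, 2e4, 2e5, 2e6])
ct     = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4])

# Standard curve: Ct = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1 / slope) - 1   # ~100% means perfect per-cycle doubling

def quantify(ct_sample):
    """Copies/reaction of an unknown, read off the standard curve."""
    return 10 ** ((ct_sample - intercept) / slope)

print(f"efficiency = {efficiency:.0%}, unknown ~ {quantify(25.0):.0f} copies/reaction")
```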

  10. What works with worked examples: Extending self-explanation and analogical comparison to synthesis problems

    NASA Astrophysics Data System (ADS)

    Badeau, Ryan; White, Daniel R.; Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.

    2017-12-01

    The ability to solve physics problems that require multiple concepts from across the physics curriculum—"synthesis" problems—is often a goal of physics instruction. Three experiments were designed to evaluate the effectiveness of two instructional methods employing worked examples on student performance with synthesis problems; these instructional techniques, analogical comparison and self-explanation, have previously been studied primarily in the context of single-concept problems. Across three experiments with students from introductory calculus-based physics courses, both self-explanation and certain kinds of analogical comparison of worked examples significantly improved student performance on a target synthesis problem, with distinct improvements in recognition of the relevant concepts. More specifically, analogical comparison significantly improved student performance when the comparisons were invoked between worked synthesis examples. In contrast, similar comparisons between corresponding pairs of worked single-concept examples did not significantly improve performance. On a more complicated synthesis problem, self-explanation was significantly more effective than analogical comparison, potentially due to differences in how successfully students encoded the full structure of the worked examples. Finally, we find that the two techniques can be combined for additional benefit, with the trade-off of slightly more time on task.

  11. Quantitative Courses in a Liberal Education Program: A Case Study

    ERIC Educational Resources Information Center

    Wismath, Shelly L.; Mackay, D. Bruce

    2012-01-01

    This essay argues for the importance of quantitative reasoning skills as part of a liberal education and describes the successful introduction of a mathematics-based quantitative skills course at a small Canadian university. Today's students need quantitative problem-solving skills, to function as adults, professionals, consumers, and citizens in…

  12. A Dimensionally Reduced Clustering Methodology for Heterogeneous Occupational Medicine Data Mining.

    PubMed

    Saâdaoui, Foued; Bertrand, Pierre R; Boudet, Gil; Rouffiac, Karine; Dutheil, Frédéric; Chamoux, Alain

    2015-10-01

    Clustering is a set of statistical learning techniques aimed at finding, within heterogeneous data, partitions that group homogeneous observations, called clusters. There are several fields in which clustering has been successfully applied, such as medicine, biology, finance, and economics. In this paper, we introduce the notion of clustering in multifactorial data analysis problems. A case study is conducted for an occupational medicine problem with the purpose of analyzing patterns in a population of 813 individuals. To reduce the data set dimensionality, we base our approach on Principal Component Analysis (PCA), the statistical tool most commonly used in factorial analysis. However, problems in nature, especially in medicine, are often based on heterogeneous qualitative-quantitative measurements, whereas PCA only processes quantitative ones. Besides, qualitative data are originally unobservable quantitative responses that are usually binary-coded. Hence, we propose a new set of strategies that handle quantitative and qualitative data simultaneously. The principle of this approach is to perform a projection of the qualitative variables on the subspaces spanned by the quantitative ones. Subsequently, an optimal model is allocated to the resulting PCA-regressed subspaces.
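    A minimal sketch of that projection step (synthetic data; the paper's exact allocation model is not reproduced here):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X_quant = rng.normal(size=(813, 6))           # quantitative measurements
X_qual = (rng.random((813, 3)) > 0.5) * 1.0   # binary-coded qualitative items

# Step 1: PCA on the quantitative block.
pca = PCA(n_components=3)
scores = pca.fit_transform(X_quant)

# Step 2: regress each qualitative variable on the PCA scores, i.e. project
# the qualitative block onto the subspace spanned by the quantitative
# components; the combined representation can then be clustered.
proj = LinearRegression().fit(scores, X_qual).predict(scores)
combined = np.hstack([scores, proj])
print(combined.shape)  # (813, 6) mixed-type representation for clustering
```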

  13. Lack of quantitative training among early-career ecologists: a survey of the problem and potential solutions

    PubMed Central

    Ezard, Thomas H.G.; Jørgensen, Peter S.; Zimmerman, Naupaka; Chamberlain, Scott; Salguero-Gómez, Roberto; Curran, Timothy J.; Poisot, Timothée

    2014-01-01

    Proficiency in mathematics and statistics is essential to modern ecological science, yet few studies have assessed the level of quantitative training received by ecologists. To do so, we conducted an online survey. The 937 respondents were mostly early-career scientists who studied biology as undergraduates. We found a clear self-perceived lack of quantitative training: 75% were not satisfied with their understanding of mathematical models; 75% felt that the level of mathematics was “too low” in their ecology classes; 90% wanted more mathematics classes for ecologists; and 95% more statistics classes. Respondents thought that 30% of classes in ecology-related degrees should be focused on quantitative disciplines, which is likely higher than for most existing programs. The main suggestion to improve quantitative training was to relate theoretical and statistical modeling to applied ecological problems. Improving quantitative training will require dedicated, quantitative classes for ecology-related degrees that contain good mathematical and statistical practice. PMID:24688862

  14. Integrated Computational System for Aerodynamic Steering and Visualization

    NASA Technical Reports Server (NTRS)

    Hesselink, Lambertus

    1999-01-01

    In February of 1994, an effort from the Fluid Dynamics and Information Sciences Divisions at NASA Ames Research Center, with McDonnell Douglas Aerospace Company and Stanford University, was initiated to develop, demonstrate, validate and disseminate automated software for numerical aerodynamic simulation. The goal of the initiative was to develop a tri-discipline approach encompassing CFD, Intelligent Systems, and Automated Flow Feature Recognition to improve the utility of CFD in the design cycle. This approach would then be represented through an intelligent computational system which could accept an engineer's definition of a problem and construct an optimal and reliable CFD solution. Stanford University's role focused on developing technologies that advance visualization capabilities for analysis of CFD data, extract specific flow features useful for the design process, and compare CFD data with experimental data. During the years 1995-1997, Stanford University focused on developing techniques in the area of tensor visualization and flow feature extraction. Software libraries were created enabling feature extraction and exploration of tensor fields. As a proof of concept, a prototype system called the Integrated Computational System (ICS) was developed to demonstrate the CFD design cycle. The current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison. This is often a problem with many data comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched. This report will (1) briefly review the technologies developed during 1995-1997, (2) describe current technologies in the area of comparison techniques, (3) describe the theory of our new method researched during the grant year, (4) summarize a few of the results, and finally (5) discuss work within the last 6 months that is a direct extension of the grant.

  15. Visualization of Problem Solving Related to the Quantitative Composition of Solutions in the Dynamic "GeoGebra" Environment

    ERIC Educational Resources Information Center

    Kostic, V. Dj.; Jovanovic, V. P. Stankov; Sekulic, T. M.; Takaci, Dj. B.

    2016-01-01

    Problem solving in the field of quantitative composition of solutions (QCS), expressed as mass share and molar concentration, is essential for chemistry students. Since successful chemistry education is based on different mathematical contents, it is important to be proficient in both mathematical and chemistry concepts as well as interconnections…

  16. Behavior problems in school-aged physically abused and neglected children in Spain.

    PubMed

    de Paúl, J; Arruabarrena, M I

    1995-04-01

    The present study investigated behavior problems in school-aged physically abused, neglected, and comparison children in the Basque Country (Spain). Data from the Teacher's Report Form of the Child Behavior Checklist was obtained on 66 children consisting of three groups (17 physically abused children, 24 physically neglected children, and 25 low-risk comparison children). The three groups were matched on seven sociodemographic variables. Overall, the abused and neglected children were higher than the comparison group on Total Behavior Problems scores. However, only neglected children obtained higher scores than the comparison group on the total score of the Externalized Scale, and only abused children scored higher than the comparison group on the total score of the Internalized Scale. Follow-up analysis indicated that both abused and neglected children had higher scores on the Social Problems, Delinquent Behavior, and Attention Problems subscales. Moreover, neglected children had higher scores on the Aggressive Behavior subscale than the comparison children, and abused children had higher scores on the Withdrawn subscale than the comparison children. The abused and neglected children also showed a lower school adjustment than the comparison group. Possible explanations of these findings are discussed and their implications for research and treatment are considered.

  17. A unified material decomposition framework for quantitative dual- and triple-energy CT imaging.

    PubMed

    Zhao, Wei; Vernekohl, Don; Han, Fei; Han, Bin; Peng, Hao; Yang, Yong; Xing, Lei; Min, James K

    2018-04-21

    Many clinical applications depend critically on the accurate differentiation and classification of different types of materials in patient anatomy. This work introduces a unified framework for accurate nonlinear material decomposition and applies it, for the first time, to the concept of triple-energy CT (TECT) for enhanced material differentiation and classification, as well as to dual-energy CT (DECT). We express the polychromatic projection in terms of a linear combination of line integrals of material-selective images. The material decomposition is then turned into a problem of minimizing the least-squares difference between measured and estimated CT projections. The optimization problem is solved iteratively by updating the line integrals. The proposed technique is evaluated by using several numerical phantom measurements under different scanning protocols. The triple-energy data acquisition is implemented at the scales of micro-CT and clinical CT imaging with a commercial "TwinBeam" dual-source DECT configuration and a fast kV-switching DECT configuration. Material decomposition and quantitative comparison with a photon counting detector and with the presence of a bow-tie filter are also performed. The proposed method provides quantitative material- and energy-selective images examining realistic configurations for both DECT and TECT measurements. Compared to the polychromatic kV CT images, virtual monochromatic images show superior image quality. For the mouse phantom, quantitative measurements show that the differences between gadodiamide and iodine concentrations obtained using TECT and idealized photon counting CT (PCCT) are smaller than 8 and 1 mg/mL, respectively. TECT outperforms DECT for multicontrast CT imaging and is robust with respect to spectrum estimation. For the thorax phantom, the differences between the concentrations of the contrast map and the corresponding true reference values are smaller than 7 mg/mL for all of the realistic configurations. A unified framework for both DECT and TECT imaging has been established for the accurate extraction of material compositions using currently available commercial DECT configurations. The novel technique is promising to provide an urgently needed solution for several CT-based diagnostic and therapy applications, especially for the diagnosis of cardiovascular and abdominal diseases where multicontrast imaging is involved. © 2018 American Association of Physicists in Medicine.
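    A minimal sketch of the least-squares decomposition idea (toy spectra and attenuation curves for illustration, not the paper's calibrated models):

```python
import numpy as np
from scipy.optimize import least_squares

# Toy spectra S[k, e] (k = kV setting, e = energy bin) and toy energy-dependent
# mass attenuation mu[i, e] for three basis materials i:
E = np.linspace(20, 120, 50)                                   # keV bins
S = np.stack([np.exp(-(E - kv / 2) ** 2 / 400) for kv in (80, 100, 140)])
S /= S.sum(axis=1, keepdims=True)                              # normalize spectra
mu = np.stack([
    6.0 * (E / 30.0) ** -3 + 0.18,    # material 1: strong photoelectric component
    0.8 * (E / 30.0) ** -1 + 0.15,    # material 2: intermediate energy dependence
    np.full_like(E, 0.20),            # material 3: Compton-dominated (flat)
])

def forward(a):
    """Polychromatic projections p_k = -ln sum_e S_k(e) exp(-sum_i a_i mu_i(e))."""
    return -np.log(S @ np.exp(-(a @ mu)))

a_true = np.array([1.0, 0.5, 0.2])    # line integrals of the basis materials
p_meas = forward(a_true)              # "measured" TECT projections

# Decomposition: iteratively update the line integrals to minimize the
# least-squares mismatch between measured and estimated projections.
fit = least_squares(lambda a: forward(a) - p_meas, x0=np.ones(3), bounds=(0, 10))
print(fit.x)  # ~ a_true
```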

  18. Standardizing Quality Assessment of Fused Remotely Sensed Images

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites has led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the chosen criteria. Depending on the criteria and indices, the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
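    For reference, the QNR protocol named above is commonly written as the product of one minus a spectral and one minus a spatial distortion index (standard formulation; the exponents are typically set to 1):

```latex
\mathrm{QNR} = (1 - D_\lambda)^{\alpha}\,(1 - D_s)^{\beta}
```

    Here D_λ is computed between the fused and original multispectral bands and D_s against the panchromatic image; QNR = 1 indicates no detected distortion.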

  19. High pressure rinsing system comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. Sertore; M. Fusetti; P. Michelato

    2007-06-01

    High pressure rinsing (HPR) is a key process for the surface preparation of high field superconducting cavities. A portable apparatus for water jet characterization, based on the momentum transferred between the water jet and a load cell, has been used in different laboratories. This apparatus allows the collection of quantitative parameters that characterize the HPR water jet. In this paper, we present a quantitative comparison of the different water jets produced by the various nozzles routinely used for the HPR process in different laboratories.

  20. Field Demonstration Report Applied Innovative Technologies for Characterization of Nitrocellulose- and Nitroglycerine Contaminated Buildings and Soils, Rev 1

    DTIC Science & Technology

    2007-01-05

    positive / false negatives. The quantitative on-site methods were evaluated using linear regression analysis and relative percent difference (RPD) comparison...
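    RPD here is the usual paired-duplicate statistic; a minimal sketch (illustrative values, not data from the report):

```python
def relative_percent_difference(a, b):
    """Relative percent difference between two paired results,
    e.g. an on-site measurement and its laboratory confirmation:
    |a - b| / mean(a, b) * 100."""
    return abs(a - b) / ((a + b) / 2) * 100

# Hypothetical field-kit vs. lab value for the same soil sample:
print(relative_percent_difference(12.0, 10.0))  # ~18.2 (%)
```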

  1. Modifying ``Six Ideas that Shaped Physics'' for a Life-Science major audience at Hope College

    NASA Astrophysics Data System (ADS)

    Mader, Catherine

    2005-04-01

    The ``Six Ideas That Shaped Physics'' textbook has been adapted for use in the algebra-based introductory physics course for non-physics science majors at Hope College. The results of its first use will be presented. FCI pre- and post-test scores will be compared with 8 years of results from both the algebra-based course and the calculus-based course (from when we first adopted ``Six Ideas That Shaped Physics'' for the calculus-based course). In addition, comparisons on quantitative tests and homework problems with prior student groups will also be made. Because a large fraction of the audience in the algebra-based course is life-science majors, a goal of this project is to make the material relevant for these students. Supplemental materials that emphasize the connection between the life sciences and the fundamental physics concepts are being developed to accompany the new textbook. Samples of these materials and how they were used (and received) during class testing will be presented.
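    FCI pre/post comparisons of this kind are conventionally summarized with Hake's normalized gain (a standard physics-education metric; whether this particular talk uses it is an assumption):

```python
def normalized_gain(pre, post, max_score=30):
    """Hake's normalized gain g = (post - pre) / (max - pre), the usual
    summary statistic for FCI pre/post comparisons (the FCI has 30 items)."""
    return (post - pre) / (max_score - pre)

# Hypothetical class means: pre 14/30, post 22/30 -> g = 0.5
print(normalized_gain(pre=14, post=22))
```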

  2. The Earthquake‐Source Inversion Validation (SIV) Project

    USGS Publications Warehouse

    Mai, P. Martin; Schorlemmer, Danijel; Page, Morgan T.; Ampuero, Jean-Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby N. T.; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran K. S.; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish C.; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf

    2016-01-01

    Finite‐fault earthquake source inversions infer the (time‐dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake‐source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward‐modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source‐model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake‐source imaging problem.

  3. Comparison of TRMM 2A25 Products Version 6 and Version 7 with NOAA/NSSL Ground Radar-Based National Mosaic QPE

    NASA Technical Reports Server (NTRS)

    Kirstetter, Pierre-Emmanuel; Hong, Y.; Gourley, J. J.; Schwaller, M.; Petersen, W; Zhang, J.

    2012-01-01

    Characterization of the error associated with satellite rainfall estimates is a necessary component of deterministic and probabilistic frameworks involving spaceborne passive and active microwave measurements for applications ranging from water budget studies to forecasting natural hazards related to extreme rainfall events. We focus here on the error structure of Tropical Rainfall Measurement Mission (TRMM) Precipitation Radar (PR) quantitative precipitation estimation (QPE) at the ground. The problem was addressed in a previous paper by comparing the 2A25 version 6 (V6) product with reference values derived from NOAA/NSSL's ground radar-based National Mosaic and QPE system (NMQ/Q2). The primary contribution of this study is to compare the new 2A25 version 7 (V7) products that were recently released as a replacement for V6. This new version is considered superior over land areas. Several aspects of the two versions are compared and quantified, including rainfall rate distributions, systematic biases, and random errors. All analyses indicate V7 is an improvement over V6.

  4. Joint association of sleep problems and psychosocial working conditions with registered long-term sickness absence. A Danish cohort study.

    PubMed

    Madsen, Ida Eh; Larsen, Ann D; Thorsen, Sannie V; Pejtersen, Jan H; Rugulies, Reiner; Sivertsen, Børge

    2016-07-01

    Sleep problems and adverse psychosocial working conditions are associated with increased risk of long-term sickness absence. Because sleep problems affect role functioning, they may also exacerbate any effects of psychosocial working conditions, and vice versa. We examined whether sleep problems and psychosocial working conditions interact in their associations with long-term sickness absence. We linked questionnaire data from participants in two surveys of random samples of the Danish working population (N=10 752) with registries on long-term sick leave during five years after questionnaire response. We defined sleep problems by self-reported symptoms and/or register data on purchases of hypnotics. Psychosocial working conditions included quantitative and emotional demands, influence, supervisor recognition and social support, leadership quality, and social support from colleagues. Using time-to-event models, we calculated hazard ratios (HR) and differences and examined interaction as departure from multiplicativity and additivity. During 40 165 person-years of follow-up, we identified 2313 episodes of long-term sickness absence. Sleep problems predicted risk of long-term sickness absence [HR 1.54, 95% confidence interval (95% CI) 1.38-1.73]. This association was statistically significantly stronger among participants with high quantitative demands and weaker among those with high supervisor recognition (P<0.0001). High quantitative demands exacerbated the association of sleep problems with risk of long-term sickness absence, whereas high supervisor recognition buffered this association. To prevent long-term sickness absence among employees with sleep problems, workplace modifications focusing on quantitative demands and supervisor recognition may be considered. Workplace interventions for these factors may more effectively prevent sickness absence when targeted at this group. The efficacy and effectiveness of such interventions needs to be established in future studies.
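    A minimal sketch of the multiplicative-interaction part of such a time-to-event analysis (synthetic data and effect sizes, using the lifelines package; the study's actual models and covariates are not reproduced):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
sleep = rng.integers(0, 2, n)      # sleep problems (0/1)
demands = rng.integers(0, 2, n)    # high quantitative demands (0/1)

# Hypothetical hazards: elevated for sleep problems, further elevated
# when combined with high demands (the exacerbation pattern reported).
hazard = 0.05 * 1.5 ** sleep * 1.2 ** demands * 1.3 ** (sleep * demands)
time = rng.exponential(1 / hazard)
df = pd.DataFrame({
    "time": np.minimum(time, 5.0),             # 5-year administrative censoring
    "event": (time < 5.0).astype(int),
    "sleep": sleep,
    "demands": demands,
    "sleep_x_demands": sleep * demands,        # product term for interaction
})

# Multiplicative interaction = departure of the product term's HR from 1.
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.summary["exp(coef)"])  # hazard ratios, including the interaction
```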

  5. A Quantitative and Combinatorial Approach to Non-Linear Meanings of Multiplication

    ERIC Educational Resources Information Center

    Tillema, Erik; Gatza, Andrew

    2016-01-01

    We provide a conceptual analysis of how combinatorics problems have the potential to support students to establish non-linear meanings of multiplication (NLMM). The problems we analyze we have used in a series of studies with 6th, 8th, and 10th grade students. We situate the analysis in prior work on students' quantitative and multiplicative…

  6. Coronal magnetic fields and the solar wind

    NASA Technical Reports Server (NTRS)

    Newkirk, G., Jr.

    1972-01-01

    Current information is presented on coronal magnetic fields as they bear on problems of the solar wind. Both steady state fields and coronal transient events are considered. A brief critique is given of the methods of calculating coronal magnetic fields including the potential (current free) models, exact solutions for the solar wind and field interaction, and source surface models. These solutions are compared with the meager quantitative observations which are available at this time. Qualitative comparisons between the shapes of calculated magnetic field lines and the forms visible in the solar corona at several recent eclipses are displayed. These suggest that: (1) coronal streamers develop above extended magnetic arcades which connect unipolar regions of opposite polarity; and (2) loops, arches, and rays in the corona correspond to preferentially filled magnetic tubes in the approximately potential field.

  7. A maximal chromatic expansion method of mapping multichannel imagery into color space. [North Dakota

    NASA Technical Reports Server (NTRS)

    Juday, R. D.; Abotteen, R. A. (Principal Investigator)

    1978-01-01

    The author has identified the following significant results. A color film generation method that maximally expands the chromaticity and aligns Kauth brightness with the gray axis was presented. In comparison with the current LACIE film product, the new color film product has more contrast and more colors and appears to be brighter. The field boundaries in the new product were more pronounced than in the current LACIE product. The speckle effect was one problem in the new product. The yellowness speckle can be treated using an equation. This equation can be used to eliminate any speckle introduced by the greenness. This product leads logically toward another that will employ quantitative colorimetry which will account for some of the eye's perception of color stimuli.

  8. Quality-assurance study of the special - purpose finite-element program - SPECTROM: I. Thermal, thermoelastic, and viscoelastic problems. [Comparison with MARC-CDC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, R.A.

    1980-12-01

    This comparison study involves a preliminary verification of finite element calculations. The methodology of the comparison study consists of solving four example problems with both the SPECTROM finite element program and the MARC-CDC general purpose finite element program. The results show close agreement for all example problems.

  9. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
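    Many of the agreement-study designs surveyed here report a bias and 95% limits of agreement (Bland-Altman analysis); a minimal sketch of that computation (hypothetical measurements):

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman analysis: bias and approximate 95% limits of agreement
    between two algorithms' measurements of the same quantity."""
    diff = np.asarray(a) - np.asarray(b)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# Hypothetical tumor-volume measurements (mL) from two QIB algorithms:
alg1 = [10.2, 15.1, 8.9, 22.4, 30.3]
alg2 = [9.8, 15.9, 8.5, 23.0, 29.1]
print(limits_of_agreement(alg1, alg2))
```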

  10. [Qualitative and quantitative comparisons of three individual deprivation scores for outpatients attending a free hospital care clinic in Paris].

    PubMed

    Fouchard, A; Bréchat, P-H; Castiel, D; Pascal, J; Sass, C; Lebas, J; Chauvin, P

    2014-08-01

    Inequality in health care is a growing problem, leading to the development of different tools for the assessment of individual deprivation. In France, three tools are mainly used: Epices (which stands for "score for the evaluation of social deprivation and health inequities among the centers for medical examination"), a score called "Handicap social", and a screening tool built for medical consultations by Pascal et al. at Nantes' hospital. The purpose of this study was to make a metrological assessment of these tools and a quantitative comparison by using them on a single deprived population. In order to assess the metrological properties of the three scores, we used the quality criteria published by Terwee et al., which are: content validity, internal consistency, criterion validity, construct validity, reproducibility (agreement and reliability), responsiveness, floor and ceiling effects, and interpretability. For the comparison, we used data from the patients who had attended a free hospital outpatient clinic dedicated to socially deprived people in Paris during one month in 2010. The "Handicap social" survey was first filled in by the 721 outpatients before being recoded to allow the comparison with the other scores. While the population of interest was quite well defined by all three scores, other quality criteria were less satisfactory. For this outpatient population, the "Handicap social" score classed 3.2% as non-deprived (class 1), 32.7% as socially deprived (class 2) and 64.7% as very deprived (class 3). With the Epices score, the rates of deprivation varied from 97.9% to 100% depending on the way the score was estimated. For the Pascal score, rates ranged from 83.4% to 88.1%. On a subgroup level, only the Pascal score showed statistically significant associations with gender, occupation, education and origin. These three scores have very different goals and meanings. They are not interchangeable. Users should be aware of their advantages and disadvantages in order to use them wisely. Much remains to be done to fully assess their metrological performances. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  11. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    EPA Science Inventory

    Abstract: Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative...

  12. Evaluation of methods to reduce background using the Python-based ELISA_QC program.

    PubMed

    Webster, Rose P; Cohen, Cinder F; Saeed, Fatima O; Wetzel, Hanna N; Ball, William J; Kirley, Terence L; Norman, Andrew B

    2018-05-01

    Almost all immunological approaches [immunohistochemistry, enzyme-linked immunosorbent assay (ELISA), Western blot] that are used to quantitate specific proteins have had to address high backgrounds due to non-specific reactivity. We report here for the first time a quantitative comparison of methods for reduction of the background of commercial biotinylated antibodies using the Python-based ELISA_QC program. This is demonstrated using a recombinant humanized anti-cocaine monoclonal antibody. Several approaches, such as adjustment of the incubation time and the concentration of blocking agent, as well as the dilution of secondary antibodies, have been explored to address this issue. In this report, systematic comparisons of two different methods, contrasted with other more traditional methods to address this problem, are provided. Addition of heparin (HP) at 1 μg/ml to the wash buffer prior to addition of the secondary biotinylated antibody reduced the elevated background absorbance values (from a mean of 0.313 ± 0.015 to 0.137 ± 0.002). A novel immunodepletion (ID) method also reduced the background (from a mean of 0.331 ± 0.010 to 0.146 ± 0.013). Overall, the ID method generated results at each concentration of the ELISA standard curve that were more similar to those obtained with the standard lot 1 than did the HP method, as analyzed by the Python-based ELISA_QC program. We conclude that the ID method, while more laborious, provides the best solution to resolve the high background seen with specific lots of biotinylated secondary antibody. Copyright © 2018. Published by Elsevier B.V.
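
    The ELISA_QC program itself is not reproduced here, but the kind of summary underlying the reported values (mean background absorbance with its dispersion across replicate wells) can be sketched as follows; all absorbance readings are invented for illustration:

```python
import numpy as np

def summarize(label, readings):
    """Mean +/- standard error of replicate background-well absorbances."""
    a = np.asarray(readings, float)
    sem = a.std(ddof=1) / np.sqrt(a.size)
    print(f"{label}: {a.mean():.3f} +/- {sem:.3f} (n = {a.size})")

# Invented blank-well absorbances before and after adding 1 ug/ml heparin
# to the wash buffer prior to the biotinylated secondary antibody.
summarize("standard wash", [0.330, 0.298, 0.311, 0.305])
summarize("heparin wash", [0.139, 0.135, 0.137, 0.141])
```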

  13. Comparative Performance of Reagents and Platforms for Quantitation of Cytomegalovirus DNA by Digital PCR

    PubMed Central

    Gu, Z.; Sam, S. S.; Sun, Y.; Tang, L.; Pounds, S.; Caliendo, A. M.

    2016-01-01

    A potential benefit of digital PCR is a reduction in result variability across assays and platforms. Three sets of PCR reagents were tested on two digital PCR systems (Bio-Rad and RainDance) for quantitation of cytomegalovirus (CMV). Both commercial quantitative viral standards and patient samples (n = 16) were tested. Quantitative accuracy (compared to nominal values) and variability were determined based on viral standard testing results. Quantitative correlation and variability were assessed with pairwise comparisons across all reagent-platform combinations for clinical plasma sample results. The three reagent sets, when used to assay quantitative standards on the Bio-Rad system, all showed a high degree of accuracy, low variability, and close agreement with one another. When used on the RainDance system, one of the three reagent sets appeared to have a much better correlation to nominal values than did the other two. Quantitative results for patient samples showed good correlation in most pairwise comparisons, with some pairs showing poorer correlations for samples with low viral loads. Digital PCR is a robust method for measuring CMV viral load. Some degree of result variation may be seen, depending on the platform and reagents used; this variation appears to be greater in samples with low viral load values. PMID:27535685

  14. A New Framework for Analysis of Coevolutionary Systems: Directed Graph Representation and Random Walks.

    PubMed

    Chong, Siang Yew; Tiňo, Peter; He, Jun; Yao, Xin

    2017-11-20

    Studying coevolutionary systems in the context of simplified models (i.e., games with pairwise interactions between coevolving solutions modeled as self plays) remains an open challenge since the rich underlying structures associated with pairwise-comparison-based fitness measures are often not taken fully into account. Although cyclic dynamics have been demonstrated in several contexts (such as intransitivity in coevolutionary problems), there is no complete characterization of cycle structures and their effects on coevolutionary search. We develop a new framework to address this issue. At the core of our approach is the directed graph (digraph) representation of coevolutionary problems that fully captures the structures in the relations between candidate solutions. Coevolutionary processes are modeled as a specific type of Markov chain: random walks on digraphs. Using this framework, we show that coevolutionary problems admit a qualitative characterization: a coevolutionary problem is either solvable (there is a subset of solutions that dominates the remaining candidate solutions) or not. This has implications for coevolutionary search. We further develop the framework to provide the means to construct quantitative tools for the analysis of coevolutionary processes and demonstrate their applications through case studies. We show that coevolution on solvable problems corresponds to an absorbing Markov chain, for which we can compute the expected hitting time of the absorbing class. Otherwise, coevolution will cycle indefinitely, and the quantity of interest becomes the limiting invariant distribution of the Markov chain. We also provide an index for characterizing complexity in coevolutionary problems and show how problems of controlled complexity can be generated.
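
    The expected hitting time mentioned above follows from standard absorbing-Markov-chain theory: with Q the transient-to-transient block of the transition matrix, the fundamental matrix is N = inv(I - Q) and the expected steps to absorption are t = N·1. A minimal sketch on a toy chain (the transition matrix is invented, not taken from the paper):

```python
import numpy as np

# Toy absorbing Markov chain: states 0 and 1 are transient, state 2 absorbing
# (the dominating subset of solutions). Values are invented for illustration.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.0, 0.0, 1.0]])
Q = P[:2, :2]                      # transient-to-transient block

# Fundamental matrix N = inv(I - Q); expected steps to absorption t = N @ 1.
N = np.linalg.inv(np.eye(2) - Q)
t = N @ np.ones(2)
print("expected hitting times from the transient states:", np.round(t, 2))
```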

  15. Comparison of propidium monoazide-quantitative PCR and reverse transcription quantitative PCR for viability detection of fresh Cryptosporidium oocysts following disinfection and after long-term storage in water samples

    EPA Science Inventory

    Purified oocysts of Cryptosporidium parvum were used to evaluate applicability of two quantitative PCR (qPCR) viability detection methods in raw surface water and disinfection treated water. Propidium monoazide-qPCR targeting hsp70 gene was compared to reverse transcription (RT)-...

  16. Ultra Wideband Indoor Positioning Technologies: Analysis and Recent Advances †

    PubMed Central

    Alarifi, Abdulrahman; Al-Salman, AbdulMalik; Alsaleh, Mansour; Alnafessah, Ahmad; Al-Hadhrami, Suheer; Al-Ammar, Mai A.; Al-Khalifa, Hend S.

    2016-01-01

    In recent years, indoor positioning has emerged as a critical function in many end-user applications, including military, civilian, disaster relief, and peacekeeping missions. In comparison with outdoor environments, sensing location information in indoor environments requires a higher precision and is a more challenging task, in part because various objects reflect and disperse signals. Ultra WideBand (UWB) is an emerging technology in the field of indoor positioning that has shown better performance compared to others. In order to set the stage for this work, we provide a survey of the state-of-the-art technologies in indoor positioning, followed by a detailed comparative analysis of UWB positioning technologies. We also provide an analysis of strengths, weaknesses, opportunities, and threats (SWOT) to analyze the present state of UWB positioning technologies. While SWOT is not a quantitative approach, it helps in assessing the real status and in revealing the potential of UWB positioning to effectively address the indoor positioning problem. Unlike previous studies, this paper presents new taxonomies, reviews some major recent advances, and argues for further exploration by the research community of this challenging problem space. PMID:27196906

  17. Cause and effect: the linkage between the health information seeking behavior and the online environment--a review.

    PubMed

    Bratucu, R; Gheorghe, I R; Purcarea, R M; Gheorghe, C M; Popa Velea, O; Purcarea, V L

    2014-09-15

    Today, health care consumers are taking more control over their health care problems, investing more time in finding information and looking for proper methods to investigate more closely the health care information received from their physicians. Unfortunately, in health care consumers' views, the trustworthiness of health authorities and institutions has declined in recent years. Consumers have therefore found a new solution to their health problems: the Internet. Recent studies have revealed that consumers seeking health information have more options for finding data than they did a few years ago; thanks to the available technology, they have more outlets to search for information. The Internet, for instance, has revolutionized the way consumers seek data through its customized methods of assessing both quantitative and qualitative information, achievable with minimal effort and low cost, while offering several advantages such as making the decision process more efficient.

  18. Cause and effect: the linkage between the health information seeking behavior and the online environment--a review

    PubMed Central

    Bratucu, R; Gheorghe, IR; Purcarea, RM; Gheorghe, CM; Popa Velea, O; Purcarea, VL

    2014-01-01

    Today, health care consumers are taking more control over their health care problems, investing more time in finding information and looking for proper methods to investigate more closely the health care information received from their physicians. Unfortunately, in health care consumers' views, the trustworthiness of health authorities and institutions has declined in recent years. Consumers have therefore found a new solution to their health problems: the Internet. Recent studies have revealed that consumers seeking health information have more options for finding data than they did a few years ago; thanks to the available technology, they have more outlets to search for information. The Internet, for instance, has revolutionized the way consumers seek data through its customized methods of assessing both quantitative and qualitative information, achievable with minimal effort and low cost, while offering several advantages such as making the decision process more efficient. PMID:25408746

  19. Metabolic Compartmentation – A System Level Property of Muscle Cells

    PubMed Central

    Saks, Valdur; Beraud, Nathalie; Wallimann, Theo

    2008-01-01

    Problems of quantitative investigation of intracellular diffusion and compartmentation of metabolites are analyzed. Principal controversies in recently published analyses of these problems for living cells are discussed. It is shown that formal theoretical analysis of metabolite diffusion based on Fick's equation, using fixed diffusion coefficients for dilute homogeneous aqueous solutions but applied to biological systems in vivo without any comparison with experimental results, may lead to misleading conclusions that contradict most biological observations. However, when the same theoretical methods are used to analyze actual experimental data, the apparent diffusion coefficients obtained are orders of magnitude lower than those in dilute aqueous solutions. Thus, it can be concluded that local restriction of metabolite diffusion in a cell is a system-level property caused by the complex structural organization of cells, macromolecular crowding, cytoskeletal networks, and the organization of metabolic pathways into multienzyme complexes and metabolons. This results in microcompartmentation of metabolites, their channeling between enzymes, and the modular organization of cellular metabolic networks. Perspectives for further studies of these complex intracellular interactions in the framework of Systems Biology are discussed. PMID:19325782
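
    As a minimal illustration of the kind of Fick's-law modeling discussed above, the sketch below solves dc/dt = D d2c/dx2 in one dimension with an explicit finite-difference scheme and compares the spread of a point source for a dilute-solution diffusion coefficient and for an apparent coefficient two orders of magnitude lower; both coefficient values and the geometry are illustrative only:

```python
import math
import numpy as np

def diffuse_1d(D, L=10e-6, n=101, t_end=1e-3):
    """Explicit finite-difference solution of dc/dt = D d2c/dx2 for a point
    source released at the centre of a 10-um domain; returns the RMS spread."""
    dx = L / (n - 1)
    # enough steps to keep D*dt/dx^2 well below the 0.5 stability limit
    steps = max(200, math.ceil(4 * D * t_end / dx ** 2))
    dt = t_end / steps
    c = np.zeros(n)
    c[n // 2] = 1.0 / dx
    for _ in range(steps):
        c[1:-1] += D * dt / dx ** 2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    x = np.linspace(-L / 2, L / 2, n)
    return math.sqrt(np.sum(c * x ** 2) / np.sum(c))

# A small metabolite in dilute water vs. an apparent in-cell coefficient two
# orders of magnitude lower (both values illustrative).
for label, D in (("dilute solution", 3e-10), ("apparent, in cell", 3e-12)):
    print(f"{label}: RMS spread after 1 ms = {diffuse_1d(D) * 1e6:.2f} um")
```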

  20. Ultra Wideband Indoor Positioning Technologies: Analysis and Recent Advances.

    PubMed

    Alarifi, Abdulrahman; Al-Salman, AbdulMalik; Alsaleh, Mansour; Alnafessah, Ahmad; Al-Hadhrami, Suheer; Al-Ammar, Mai A; Al-Khalifa, Hend S

    2016-05-16

    In recent years, indoor positioning has emerged as a critical function in many end-user applications, including military, civilian, disaster relief, and peacekeeping missions. In comparison with outdoor environments, sensing location information in indoor environments requires a higher precision and is a more challenging task, in part because various objects reflect and disperse signals. Ultra WideBand (UWB) is an emerging technology in the field of indoor positioning that has shown better performance compared to others. In order to set the stage for this work, we provide a survey of the state-of-the-art technologies in indoor positioning, followed by a detailed comparative analysis of UWB positioning technologies. We also provide an analysis of strengths, weaknesses, opportunities, and threats (SWOT) to analyze the present state of UWB positioning technologies. While SWOT is not a quantitative approach, it helps in assessing the real status and in revealing the potential of UWB positioning to effectively address the indoor positioning problem. Unlike previous studies, this paper presents new taxonomies, reviews some major recent advances, and argues for further exploration by the research community of this challenging problem space.

  1. Communicating Treatment Risk Reduction to People With Low Numeracy Skills: A Cross-Cultural Comparison

    PubMed Central

    2009-01-01

    Objectives. We sought to address denominator neglect (i.e. the focus on the number of treated and nontreated patients who died, without sufficiently considering the overall numbers of patients) in estimates of treatment risk reduction, and analyzed whether icon arrays aid comprehension. Methods. We performed a survey of probabilistic, national samples in the United States and Germany in July and August of 2008. Participants received scenarios involving equally effective treatments but differing in the overall number of treated and nontreated patients. In some conditions, the number who received a treatment equaled the number who did not; in others the number was smaller or larger. Some participants received icon arrays. Results. Participants—particularly those with low numeracy skills—showed denominator neglect in treatment risk reduction perceptions. Icon arrays were an effective method for eliminating denominator neglect. We found cross-cultural differences that are important in light of the countries' different medical systems. Conclusions. Problems understanding numerical information often reside not in the mind but in the problem's representation. These findings suggest suitable ways to communicate quantitative medical data. PMID:19833983
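
    The arithmetic behind denominator neglect is easy to make concrete: raw death counts can point one way while the risks, computed with their denominators, point the other. The numbers below are invented for illustration:

```python
# Invented scenario: more treated patients died in absolute terms, yet the
# treatment halves the mortality *risk* once denominators are considered.
died_treated, n_treated = 40, 1000     # 4% mortality with treatment
died_control, n_control = 20, 250      # 8% mortality without

risk_treated = died_treated / n_treated
risk_control = died_control / n_control

# Comparing raw counts (40 vs 20) is denominator neglect; the risks differ:
print(f"risk with treatment:     {risk_treated:.1%}")
print(f"risk without treatment:  {risk_control:.1%}")
print(f"absolute risk reduction: {risk_control - risk_treated:.1%}")
```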

  2. Application of boundary integral equations to elastoplastic problems

    NASA Technical Reports Server (NTRS)

    Mendelson, A.; Albers, L. U.

    1975-01-01

    The application of boundary integral equations to elastoplastic problems is reviewed. Details of the analysis as applied to torsion problems and to plane problems are discussed. Results are presented for the elastoplastic torsion of a square cross-section bar and for the plane problem of notched beams. A comparison of different formulations as well as comparisons with experimental results are presented.

  3. Teaching Integrative Physiology Using the Quantitative Circulatory Physiology Model and Case Discussion Method: Evaluation of the Learning Experience

    ERIC Educational Resources Information Center

    Rodriguez-Barbero, A.; Lopez-Novoa, J. M.

    2008-01-01

    One of the problems that we have found when teaching human physiology in a Spanish medical school is that the degree of understanding by the students of the integration between organs and systems is rather poor. We attempted to remedy this problem by using a case discussion method together with the Quantitative Circulatory Physiology (QCP)…

  4. A quantitative comparison of leading-edge vortices in incompressible and supersonic flows

    DOT National Transportation Integrated Search

    2002-01-14

    When requiring quantitative data on delta-wing vortices for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of database owing to difficulties that pl...

  5. Continuum simulation of the discharge of the granular silo: a validation test for the μ(I) visco-plastic flow law.

    PubMed

    Staron, L; Lagrée, P-Y; Popinet, S

    2014-01-01

    Using a continuum Navier-Stokes solver with the μ(I) flow law implemented to model the viscous behavior, and the discrete Contact Dynamics algorithm, the discharge of granular silos is simulated in two dimensions from the early stages of the discharge until complete release of the material. In both cases, the Beverloo scaling is recovered. At first we do not attempt a quantitative comparison but focus on the qualitative behavior of velocity and pressure at different locations in the flow. A good agreement for the velocity is obtained in the regions of rapid flow, while areas of slow creep are not entirely captured by the continuum model. The pressure field shows generally good agreement, and bulk deformations are found to be similar in both approaches. The influence of the parameters of the μ(I) flow law is systematically investigated, showing the importance of the dependence on the inertial number I for achieving quantitative agreement between continuum and discrete discharge. However, potential problems involving the system size, the configuration, and "non-local" effects are noted. Overall, the ability of the continuum model to qualitatively reproduce the granular behavior is found to be very encouraging.
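
    The μ(I) flow law referred to above is commonly written μ(I) = μ_s + (μ₂ - μ_s)/(1 + I₀/I), with inertial number I = γ̇d/√(P/ρ). A minimal sketch follows; the parameter values are typical literature values for glass beads, not those used in the paper:

```python
import math

def inertial_number(shear_rate, d, pressure, rho):
    """I = gamma_dot * d / sqrt(P / rho) for grain diameter d, density rho."""
    return shear_rate * d / math.sqrt(pressure / rho)

def mu_of_I(I, mu_s=0.38, mu_2=0.64, I0=0.279):
    """mu(I) = mu_s + (mu_2 - mu_s) / (1 + I0 / I); the constants are typical
    literature values for glass beads, not those of the paper."""
    return mu_s + (mu_2 - mu_s) / (1.0 + I0 / I)

I = inertial_number(shear_rate=10.0, d=1e-3, pressure=500.0, rho=2500.0)
print(f"I = {I:.4f}, mu(I) = {mu_of_I(I):.3f}")
```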

  6. Comparative spectral analysis of veterinary powder product by continuous wavelet and derivative transforms

    NASA Astrophysics Data System (ADS)

    Dinç, Erdal; Kanbur, Murat; Baleanu, Dumitru

    2007-10-01

    Comparative simultaneous determination of chlortetracycline and benzocaine in a commercial veterinary powder product was carried out by continuous wavelet transform (CWT) and classical derivative spectrophotometry (CDS). Neither of the two proposed analytical methods requires any chemical separation step. In the first step, several wavelet families were tested to find an optimal CWT for processing the overlapping signals of the analyzed compounds. We observed that the coiflets (COIF-CWT) method with dilation parameter a = 400 gives suitable results for this analytical application. For comparison, the CDS approach was also applied to the simultaneous quantitative resolution of the same analytical problem. Calibration functions were obtained by measuring the transform amplitudes corresponding to zero-crossing points for both the CWT and CDS methods. The utility of the two analytical approaches was verified by analyzing various synthetic mixtures of chlortetracycline and benzocaine, and they were applied to real samples of the veterinary powder formulation. The experimental results obtained with the COIF-CWT approach were statistically compared with those obtained by CDS, and successful results were reported.
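
    The zero-crossing calibration idea used by both methods can be sketched without the paper's data: at the wavelength where one component's derivative spectrum crosses zero, the derivative amplitude of the mixture responds only to the other component. The example below uses synthetic Gaussian bands and Savitzky-Golay differentiation in place of the authors' spectra and wavelet code:

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic overlapping absorption bands standing in for the two analytes.
wl = np.linspace(300.0, 500.0, 2001)                 # wavelength grid, nm
band = lambda c, mu, s: c * np.exp(-0.5 * ((wl - mu) / s) ** 2)
mixture = band(1.0, 380.0, 25.0) + band(0.6, 410.0, 25.0)

# First-derivative spectrum via Savitzky-Golay smoothing differentiation.
d1 = savgol_filter(mixture, window_length=51, polyorder=3,
                   deriv=1, delta=wl[1] - wl[0])

# Component 1's derivative crosses zero at its own band maximum (380 nm), so
# the derivative amplitude read there responds only to component 2: this is
# the zero-crossing calibration principle.
idx = np.argmin(np.abs(wl - 380.0))
print(f"first-derivative amplitude at 380 nm: {d1[idx]:.4f}")
```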

  7. Automated Detection of Electroencephalography Artifacts in Human, Rodent and Canine Subjects using Machine Learning.

    PubMed

    Levitt, Joshua; Nitenson, Adam; Koyama, Suguru; Heijmans, Lonne; Curry, James; Ross, Jason T; Kamerling, Steven; Saab, Carl Y

    2018-06-23

    Electroencephalography (EEG) invariably contains extra-cranial artifacts that are commonly dealt with based on qualitative and subjective criteria. Failure to account for EEG artifacts compromises data interpretation. We have developed a quantitative and automated support vector machine (SVM)-based algorithm to accurately classify artifactual EEG epochs in awake rodent, canine, and human subjects. An embodiment of this method also enables the determination of 'eyes open/closed' states in human subjects. The levels of SVM accuracy for artifact classification in humans, Sprague Dawley rats, and beagle dogs were 94.17%, 83.68%, and 85.37%, respectively, whereas 'eyes open/closed' states in humans were labeled with 88.60% accuracy. Each of these results was significantly higher than chance. Comparison with existing methods: other existing methods, like those dependent on Independent Component Analysis, have not been tested in non-human subjects and require full EEG montages, whereas this method requires only single channels. We conclude that our EEG artifact detection algorithm provides a valid and practical solution to a common problem in the quantitative analysis and assessment of EEG in pre-clinical research settings across evolutionary spectra. Copyright © 2018. Published by Elsevier B.V.
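
    The authors' feature set and data are not reproduced here; the sketch below only illustrates the general shape of such an SVM-based epoch classifier, using synthetic epochs and toy features:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def epoch_features(epoch):
    """Toy per-epoch features: variance, line length, max absolute amplitude."""
    return [epoch.var(), np.abs(np.diff(epoch)).sum(), np.abs(epoch).max()]

# Synthetic 1-s single-channel epochs at 256 Hz: clean EEG-like noise vs.
# epochs contaminated with large sparse artifacts.
clean = rng.normal(0.0, 1.0, size=(200, 256))
artifact = rng.normal(0.0, 1.0, size=(200, 256)) + \
           rng.normal(0.0, 8.0, size=(200, 1)) * (rng.random((200, 256)) < 0.05)
X = np.array([epoch_features(e) for e in np.vstack([clean, artifact])])
y = np.array([0] * 200 + [1] * 200)        # 0 = clean, 1 = artifact

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```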

  8. Quantitative evaluation of the matrix effect in bioanalytical methods based on LC-MS: A comparison of two approaches.

    PubMed

    Rudzki, Piotr J; Gniazdowska, Elżbieta; Buś-Kwaśnik, Katarzyna

    2018-06-05

    Liquid chromatography coupled to mass spectrometry (LC-MS) is a powerful tool for studying pharmacokinetics and toxicokinetics. Reliable bioanalysis requires characterization of the matrix effect, i.e. the influence of endogenous or exogenous compounds on the analyte signal intensity. We have compared two methods for quantitating the matrix effect. The CVs (%) of internal-standard-normalized matrix factors recommended by the European Medicines Agency were evaluated against internal-standard-normalized relative matrix effects derived from Matuszewski et al. (2003). Both methods use post-extraction spiked samples, but matrix factors also require neat solutions. We tested both approaches using analytes of diverse chemical structures. The study did not reveal relevant differences in the results obtained with the two calculation methods. After normalization with the internal standard, the CV (%) of the matrix factor was on average 0.5% higher than the corresponding relative matrix effect. The method adopted by the European Medicines Agency seems to be slightly more conservative for the analyzed datasets. Nine analytes of different structures enabled a general overview of the problem; still, further studies are encouraged to confirm our observations. Copyright © 2018 Elsevier B.V. All rights reserved.
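
    Both quantities being compared reduce to simple ratios of peak areas. A sketch of the EMA-style calculation with invented peak areas for six matrix lots follows (the commonly cited acceptance criterion is a CV of 15% or less):

```python
import numpy as np

# Invented peak areas for one analyte and its internal standard (IS) in six
# different matrix lots, plus the mean areas measured in neat solution.
analyte_matrix = np.array([9800, 10150, 9600, 10500, 9900, 10050], float)
is_matrix = np.array([50500, 51200, 49800, 52000, 50100, 50700], float)
analyte_neat, is_neat = 10000.0, 50000.0

mf_analyte = analyte_matrix / analyte_neat      # matrix factor per lot
mf_is = is_matrix / is_neat
is_normalized_mf = mf_analyte / mf_is           # IS-normalized matrix factor

cv = 100.0 * is_normalized_mf.std(ddof=1) / is_normalized_mf.mean()
print(f"IS-normalized matrix factors: {np.round(is_normalized_mf, 3)}")
print(f"CV% = {cv:.1f} (commonly cited criterion: <= 15%)")
```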

  9. Representational task formats and problem solving strategies in kinematics and work

    NASA Astrophysics Data System (ADS)

    Ibrahim, Bashirah; Rebello, N. Sanjay

    2012-06-01

    Previous studies have reported that students employed different problem solving approaches when presented with the same task structured with different representations. In this study, we explored and compared students’ strategies as they attempted tasks from two topical areas, kinematics and work. Our participants were 19 engineering students taking a calculus-based physics course. The tasks were presented in linguistic, graphical, and symbolic forms and requested either a qualitative solution or a value. The analysis was both qualitative and quantitative in nature focusing principally on the characteristics of the strategies employed as well as the underlying reasoning for their applications. A comparison was also made for the same student’s approach with the same kind of representation across the two topics. Additionally, the participants’ overall strategies across the different tasks, in each topic, were considered. On the whole, we found that the students prefer manipulating equations irrespective of the representational format of the task. They rarely recognized the applicability of a “qualitative” approach to solve the problem although they were aware of the concepts involved. Even when the students included visual representations in their solutions, they seldom used these representations in conjunction with the mathematical part of the problem. Additionally, the students were not consistent in their approach for interpreting and solving problems with the same kind of representation across the two topical areas. The representational format, level of prior knowledge, and familiarity with a topic appeared to influence their strategies, their written responses, and their ability to recognize qualitative ways to attempt a problem. The nature of the solution does not seem to impact the strategies employed to handle the problem.

  10. Social Comparison and Body Image in Adolescence: A Grounded Theory Approach

    ERIC Educational Resources Information Center

    Krayer, A.; Ingledew, D. K.; Iphofen, R.

    2008-01-01

    This study explored the use of social comparison appraisals in adolescents' lives with particular reference to enhancement appraisals which can be used to counter threats to the self. Social comparison theory has been increasingly used in quantitative research to understand the processes through which societal messages about appearance influence…

  11. Target Scattering Metrics: Model-Model and Model-Data Comparisons

    DTIC Science & Technology

    2017-12-13

    The use of metrics for quantitative comparisons of target scattering signals, whether from measured synthetic aperture sonar (SAS) data or from numerical models, is investigated. Candidate metrics for model-model comparisons are examined, with a goal of considering raw data prior to its reduction to data products, which may be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons.

  12. Target Scattering Metrics: Model-Model and Model-Data Comparisons

    DTIC Science & Technology

    2017-12-13

    The use of metrics for quantitative comparisons of target scattering signals, whether from measured synthetic aperture sonar (SAS) data or from numerical models, is investigated. Candidate metrics for model-model comparisons are examined, with a goal of considering raw data prior to its reduction to data products, which may be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons.

  13. Teaching Problem-Solving Skills to Nuclear Engineering Students

    ERIC Educational Resources Information Center

    Waller, E.; Kaye, M. H.

    2012-01-01

    Problem solving is an essential skill for nuclear engineering graduates entering the workforce. Training in qualitative and quantitative aspects of problem solving allows students to conceptualise and execute solutions to complex problems. Solutions to problems in high consequence fields of study such as nuclear engineering require rapid and…

  14. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  15. Integrating Quantitative Skills in Introductory Ecology: Investigations of Wild Bird Feeding Preferences

    ERIC Educational Resources Information Center

    Small, Christine J.; Newtoff, Kiersten N.

    2013-01-01

    Undergraduate biology education is undergoing dramatic changes, emphasizing student training in the "tools and practices" of science, particularly quantitative and problem-solving skills. We redesigned a freshman ecology lab to emphasize the importance of scientific inquiry and quantitative reasoning in biology. This multi-week investigation uses…

  16. Qualitative Research? Quantitative Research? What's the Problem? Resolving the Dilemma via a Postconstructivist Approach.

    ERIC Educational Resources Information Center

    Shank, Gary

    It is argued that the debate between qualitative and quantitative research for educational researchers is actually an argument between constructivism and positivism. Positivism has been the basis for most quantitative research in education. Two different things are actually meant when constructivism is discussed (constructivism and…

  17. Promoting Quantitative Literacy in an Online College Algebra Course

    ERIC Educational Resources Information Center

    Tunstall, Luke; Bossé, Michael J.

    2016-01-01

    College algebra (a university freshman level algebra course) fulfills the quantitative literacy requirement of many college's general education programs and is a terminal course for most who take it. An online problem-based learning environment provides a unique means of engaging students in quantitative discussions and research. This article…

  18. KEY COMPARISON: Key comparison CCQM-K60: Total selenium and selenomethionine in selenised wheat flour

    NASA Astrophysics Data System (ADS)

    Goenaga Infante, Heidi; Sargent, Mike

    2010-01-01

    Key comparison CCQM-K60 was performed to assess the analytical capabilities of national metrology institutes (NMIs) to accurately quantitate the mass fraction of selenomethionine (SeMet) and total selenium (at low mg kg⁻¹ levels) in selenised wheat flour. It was organized by the Inorganic Analysis Working Group (IAWG) of the Comité Consultatif pour la Quantité de Matière (CCQM) as a follow-up key comparison to the previous pilot study CCQM-P86 on selenised yeast tablets. LGC Limited (Teddington, UK) and the Institute for National Measurement Standards, National Research Council Canada (NRCC, Ottawa, Canada) acted as the coordinating laboratories. CCQM-K60 was organized in parallel with a pilot study (CCQM-P86.1) involving not only NMIs but also expert laboratories worldwide, thus enabling them to assess their capabilities, discover problems and learn how to modify analytical procedures accordingly. Nine results for total Se and four results for SeMet were reported by the participating NMIs. Methods used for sample preparation were microwave-assisted acid digestion for total Se, and multiple-step enzymatic hydrolysis and hydrolysis with methanesulfonic acid for SeMet. For total Se, detection techniques included inductively coupled plasma mass spectrometry (ICP-MS) with external calibration, standard additions or isotope dilution analysis (IDMS); instrumental neutron activation analysis (INAA); and graphite furnace atomic absorption spectrometry (GFAAS) with external calibration. For determination of SeMet in the wheat flour sample, the four NMIs relied upon species-specific IDMS (using ⁷⁶Se-enriched SeMet) with HPLC-ICP-MS. Eight of the nine participating NMIs reported results for total Se within 3.5% deviation from the key comparison reference value (KCRV), and the four participating NMIs reported SeMet results within 3.2% deviation from the KCRV. This shows that the performance of the majority of the CCQM-K60 participants was very good, illustrating their ability to obtain accurate results for these analytes in a complex food matrix containing approximately 17 mg kg⁻¹ Se.

  19. Constructivist, Problem-Based Learning Does Work: A Meta-Analysis of Curricular Comparisons Involving a Single Medical School

    ERIC Educational Resources Information Center

    Schmidt, Henk G.; van der Molen, Henk T.; te Winkel, Wilco W. R.; Wijnen, Wynand H. F. W.

    2009-01-01

    Effects of problem-based learning as reported in curricular comparison studies have been shown to be inconsistent over different medical schools. Therefore, we decided to summarize effects of a single well-established problem-based curriculum rather than to add up sometimes-conflicting findings from different problem-based curricula. Effect sizes…

  20. Family-Centered Care in Juvenile Justice Institutions: A Mixed Methods Study Protocol.

    PubMed

    Simons, Inge; Mulder, Eva; Rigter, Henk; Breuk, René; van der Vaart, Wander; Vermeiren, Robert

    2016-09-12

    Treatment and rehabilitation interventions in juvenile justice institutions aim to prevent criminal reoffending by adolescents and to enhance their prospects of successful social reintegration. There is evidence that these goals are best achieved when the institution adopts a family-centered approach, involving the parents of the adolescents. The Academic Workplace Forensic Care for Youth has developed two programs for family-centered care for youth detained in groups for short-term and long-term stay, respectively. The overall aim of our study is to evaluate the family-centered care program in the first two years after the first steps of its implementation in short-term stay groups of two juvenile justice institutions in the Netherlands. The current paper discusses our study design. Based on a quantitative pilot study, we opted for a study with an explanatory sequential mixed methods design. This pilot is considered the first stage of our study. The second stage of our study includes concurrent quantitative and qualitative approaches. The quantitative part of our study is a pre-post quasi-experimental comparison of family-centered care with usual care in short-term stay groups. The qualitative part of our study involves in-depth interviews with adolescents, parents, and group workers to elaborate on the preceding quantitative pilot study and to help interpret the outcomes of the quasi-experimental quantitative part of the study. We expect our study to yield the following findings. In the quantitative comparison of usual care with family-centered care, we assume that in the latter group, parents will be more involved with their child and with the institution, and that parents and adolescents will be more motivated to take part in therapy. In addition, we expect family-centered care to improve family interactions, to decrease parenting stress, and to reduce problem behavior among the adolescents. Finally, we assume that adolescents, parents, and the staff of the institutions will be more satisfied with family-centered care than with usual care. In the qualitative part of our study, we will identify the needs and expectations in family-centered care as well as factors influencing parental participation. Insight into these factors will help to further improve our program of family-centered care and its implementation in practice. Our study results will be published over the coming years. A juvenile justice institution is a difficult setting in which to evaluate care programs, and a combination of practice-based research methods is needed to address all major implementation issues. The study described here takes on this challenge by means of practice-based research. We expect the results of our study to contribute to the improvement of care for adolescents detained in juvenile justice institutions, and for their families.

  1. Quantitative Diagnosis of Continuous-Valued, Steady-State Systems

    NASA Technical Reports Server (NTRS)

    Rouquette, N.

    1995-01-01

    Quantitative diagnosis involves numerically estimating the values of unobservable parameters that best explain the observed parameter values. We consider quantitative diagnosis for continuous, lumped-parameter, steady-state physical systems because such models are easy to construct and the diagnosis problem is considerably simpler than that for the corresponding dynamic models. To further tackle the difficulty of numerically inverting a simulation model to compute a diagnosis, we propose to decompose a physical system model in terms of feedback loops. This decomposition reduces the dimension of the problem and consequently decreases the diagnosis search space. We illustrate this approach on a model of a thermal control system studied in earlier research.
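
    The paper's feedback-loop decomposition is not reproduced here, but the core idea of quantitative diagnosis as numerical inversion of a steady-state simulation model can be sketched with a toy two-component thermal model whose film coefficient h is the unobservable parameter; the model and all numbers are invented:

```python
import numpy as np
from scipy.optimize import least_squares

def steady_state_temps(h, power=(30.0, 20.0), t_amb=20.0, area=0.05):
    """Toy steady-state model: two components with temperatures
    T_i = T_amb + P_i / (h * A); the film coefficient h is unobservable."""
    return t_amb + np.asarray(power) / (h * area)

observed = np.array([50.2, 40.1])   # measured component temperatures, deg C

# Diagnosis as inversion: find the h that best explains the observations.
fit = least_squares(lambda p: steady_state_temps(p[0]) - observed,
                    x0=[10.0], bounds=(1e-3, np.inf))
print(f"estimated h = {fit.x[0]:.2f} W/(m^2 K)")
```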

  2. Quantitation of spatially-localized proteins in tissue samples using MALDI-MRM imaging.

    PubMed

    Clemis, Elizabeth J; Smith, Derek S; Camenzind, Alexander G; Danell, Ryan M; Parker, Carol E; Borchers, Christoph H

    2012-04-17

    MALDI imaging allows the creation of a "molecular image" of a tissue slice. This image is reconstructed from the ion abundances in spectra obtained while rastering the laser over the tissue. These images can then be correlated with tissue histology to detect potential biomarkers of, for example, aberrant cell types. MALDI, however, is known to have problems with ion suppression, making it difficult to correlate measured ion abundance with concentration. It would be advantageous to have a method which could provide more accurate protein concentration measurements, particularly for screening applications or for precise comparisons between samples. In this paper, we report the development of a novel MALDI imaging method for the localization and accurate quantitation of proteins in tissues. This method involves optimization of in situ tryptic digestion, followed by reproducible and uniform deposition of an isotopically labeled standard peptide from a target protein onto the tissue, using an aerosol-generating device. Data are acquired by MALDI multiple reaction monitoring (MRM) mass spectrometry (MS), and accurate peptide quantitation is determined from the ratio of MRM transitions for the endogenous unlabeled proteolytic peptides to the corresponding transitions from the applied isotopically labeled standard peptides. In a parallel experiment, the quantity of the labeled peptide applied to the tissue was determined using a standard curve generated from MALDI time-of-flight (TOF) MS data. This external calibration curve was then used to determine the quantity of endogenous peptide in a given area. All standard curves generated by this method had coefficients of determination greater than 0.97. These proof-of-concept experiments using MALDI MRM-based imaging demonstrate the feasibility of precise and accurate quantitation of tissue protein concentrations over 2 orders of magnitude, while maintaining the spatial localization information for the proteins.
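
    Numerically, the quantitation described reduces to two steps: read the amount of deposited labeled standard off the external calibration curve, then scale it by the endogenous-to-labeled MRM transition ratio. A sketch with invented responses and areas:

```python
import numpy as np

# External calibration: MALDI-TOF response vs. amount of labeled peptide
# spotted (fmol); all values invented for illustration.
amount = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
response = np.array([410.0, 820.0, 2100.0, 4150.0, 8300.0])
slope, intercept = np.polyfit(amount, response, 1)

# Amount of labeled standard deposited on this tissue region, read off the
# calibration curve from its measured TOF response.
labeled_amount = (3300.0 - intercept) / slope

# MRM transition areas for the endogenous and labeled peptide in one region.
ratio = 15400.0 / 12800.0                    # endogenous / labeled
print(f"labeled standard: {labeled_amount:.1f} fmol; "
      f"endogenous peptide: {ratio * labeled_amount:.1f} fmol")
```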

  3. Bayesian parameter estimation in spectral quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Pulkkinen, Aki; Cox, Ben T.; Arridge, Simon R.; Kaipio, Jari P.; Tarvainen, Tanja

    2016-03-01

    Photoacoustic tomography (PAT) is an imaging technique combining the strong contrast of optical imaging with the high spatial resolution of ultrasound imaging. These strengths are achieved via the photoacoustic effect, where the spatial absorption of a light pulse is converted into a measurable propagating ultrasound wave. The method is seen as a potential tool for small animal imaging, pre-clinical investigations, the study of blood vessels and vasculature, as well as for cancer imaging. The goal in PAT is to form an image of the absorbed optical energy density field from the measured ultrasound data via acoustic inverse problem approaches. Quantitative PAT (QPAT) proceeds from these images and forms quantitative estimates of the optical properties of the target. This optical inverse problem of QPAT is ill-posed. To alleviate the issue, spectral QPAT (SQPAT) utilizes PAT data formed at multiple optical wavelengths simultaneously with optical parameter models of tissue to form quantitative estimates of the parameters of interest. In this work, the inverse problem of SQPAT is investigated. Light propagation is modelled using the diffusion equation. Optical absorption is described as a chromophore-concentration-weighted sum of known chromophore absorption spectra. Scattering is described by Mie scattering theory with an exponential power law. In the inverse problem, the spatially varying unknown parameters of interest are the chromophore concentrations, the Mie scattering parameters (the power-law factor and exponent), and the Grüneisen parameter. The inverse problem is approached with a Bayesian method. It is numerically demonstrated that estimation of all parameters of interest is possible with this approach.
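
    The forward spectral model described above can be written compactly: μ_a(λ) = Σ_k c_k ε_k(λ) for absorption and a Mie-type power law μ_s'(λ) = a(λ/λ_ref)^(-b) for reduced scattering. A sketch with made-up spectra and parameter values follows; the paper estimates c, a, b, and the Grüneisen parameter from data, whereas here they are simply assumed:

```python
import numpy as np

wavelengths = np.array([700.0, 750.0, 800.0, 850.0])     # nm

# Made-up absorption spectra eps_k(lambda) (columns: two chromophores),
# standing in for tabulated values; units are arbitrary for illustration.
eps = np.array([[0.29, 1.10],
                [0.52, 0.80],
                [0.80, 0.76],
                [1.06, 0.69]])
c = np.array([0.8, 0.2])        # chromophore concentrations (QPAT unknowns)

mu_a = eps @ c                  # absorption: concentration-weighted sum
a_mie, b_mie = 1.2, 1.4         # power-law factor and exponent (assumed)
mu_s_prime = a_mie * (wavelengths / 500.0) ** (-b_mie)   # reduced scattering

for wl, ma, ms in zip(wavelengths, mu_a, mu_s_prime):
    print(f"{wl:5.0f} nm: mu_a = {ma:.3f}, mu_s' = {ms:.3f}")
```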

  4. A comparison of manual and quantitative elbow strength testing.

    PubMed

    Shahgholi, Leili; Bengtson, Keith A; Bishop, Allen T; Shin, Alexander Y; Spinner, Robert J; Basford, Jeffrey R; Kaufman, Kenton R

    2012-10-01

    The aim of this study was to compare the clinical ratings of elbow strength obtained by skilled clinicians with objective strength measurement obtained through quantitative testing. A retrospective comparison of subject clinical records with quantitative strength testing results in a motion analysis laboratory was conducted. A total of 110 individuals between the ages of 8 and 65 yrs with traumatic brachial plexus injuries were identified. Patients underwent manual muscle strength testing as assessed on the 5-point British Medical Research Council Scale (5/5, normal; 0/5, absent) and quantitative elbow flexion and extension strength measurements. A total of 92 subjects had elbow flexion testing. Half of the subjects clinically assessed as having normal (5/5) elbow flexion strength on manual muscle testing exhibited less than 42% of their age-expected strength on quantitative testing. Eighty-four subjects had elbow extension strength testing. Similarly, half of those displaying normal elbow extension strength on manual muscle testing were found to have less than 62% of their age-expected values on quantitative testing. Significant differences between manual muscle testing and quantitative findings were not detected for the lesser (0-4) strength grades. Manual muscle testing, even when performed by experienced clinicians, may be more misleading than expected for subjects graded as having normal (5/5) strength. Manual muscle testing estimates for the lesser strength grades (1-4/5) seem reasonably accurate.

  5. Quantitative imaging technique using the layer-stripping algorithm

    NASA Astrophysics Data System (ADS)

    Beilina, L.

    2017-07-01

    We present the layer-stripping algorithm for the solution of the hyperbolic coefficient inverse problem (CIP). Our numerical examples show quantitative reconstruction of small tumor-like inclusions in two dimensions.

  6. A method for normalizing pathology images to improve feature extraction for quantitative pathology.

    PubMed

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-01

    With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The proposed method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases also showed improvement over previous methods for correcting batch effects in the literature. ICHE may be a useful preprocessing step in a digital pathology image-processing pipeline.
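
    A rough sketch of the two ICHE stages follows; this is not the authors' implementation, and the target mean, clip limit, and synthetic tile are assumptions made only for illustration:

```python
import numpy as np
from skimage import exposure

def normalize_like_iche(img, target_mean=0.6, clip_limit=0.01):
    """Sketch of the two ICHE stages: shift the intensity histogram so its
    centroid sits at a common point, then apply contrast-limited adaptive
    histogram equalization. Parameter values are illustrative only."""
    img = img.astype(float)
    img = np.clip(img - img.mean() + target_mean, 0.0, 1.0)  # centering
    return exposure.equalize_adapthist(img, clip_limit=clip_limit)

# Synthetic stand-in for a digitized H&E tile with a batch-specific offset.
rng = np.random.default_rng(1)
tile = np.clip(rng.normal(0.45, 0.1, size=(256, 256)), 0, 1)
print("mean before/after:", tile.mean(), normalize_like_iche(tile).mean())
```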

  7. Qualitative and Quantitative Analysis for Facial Complexion in Traditional Chinese Medicine

    PubMed Central

    Zhao, Changbo; Li, Guo-zheng; Li, Fufeng; Wang, Zhi; Liu, Chang

    2014-01-01

    Facial diagnosis is an important and very intuitive diagnostic method in Traditional Chinese Medicine (TCM). However, due to its qualitative and experience-based subjective character, traditional facial diagnosis has certain limitations in clinical medicine. Computerized inspection methods provide classification models to recognize facial complexion (including color and gloss). However, previous works have only studied the classification problem of facial complexion, which we regard as qualitative analysis. The severity or degree of facial complexion, needed for quantitative analysis, has not yet been reported. This paper aims to provide both qualitative and quantitative analysis of facial complexion. We propose a novel feature representation of facial complexion computed over the patient's whole face. The features are established from four chromaticity bases, split up by the luminance distribution in CIELAB color space. The chromaticity bases are constructed from the facial dominant color using two-level clustering; the optimal luminance distribution is determined through simple experimental comparisons. The features are shown to be more distinctive than previous facial complexion feature representations. Complexion recognition proceeds by training an SVM classifier with the optimal model parameters. In addition, improved features are developed by the weighted fusion of five local regions. Extensive experimental results show that the proposed features achieve the highest facial color recognition performance, with a total accuracy of 86.89%. Furthermore, the proposed recognition framework can analyze both the color and gloss degrees of facial complexion by learning a ranking function. PMID:24967342
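
    As a simplified stand-in for the chromaticity-base construction, the sketch below runs single-level k-means on a*b* chromaticity instead of the paper's two-level clustering, using synthetic skin-tone pixels; the cluster centres play the role of the chromaticity bases:

```python
import numpy as np
from skimage.color import rgb2lab
from sklearn.cluster import KMeans

# Synthetic stand-in for facial skin pixels (RGB values in [0, 1]).
rng = np.random.default_rng(0)
pixels = np.clip(rng.normal([0.8, 0.6, 0.5], 0.05, size=(1000, 3)), 0, 1)

lab = rgb2lab(pixels.reshape(1, -1, 3)).reshape(-1, 3)
chroma = lab[:, 1:]                    # a*, b* chromaticity components

# One-level k-means stand-in for the paper's two-level clustering that
# yields four chromaticity bases.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(chroma)
print("chromaticity bases (a*, b*):\n", np.round(km.cluster_centers_, 2))
```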

  8. The earth's radiation budget and its relation to atmospheric hydrology. III - Comparison of observations over the oceans with a GCM

    NASA Technical Reports Server (NTRS)

    Stephens, Graeme L.; Randall, David A.; Wittmeyer, Ian L.; Dazlich, Donald A.; Tjemkes, Stephen

    1993-01-01

    The ability of the Colorado State University general circulation model (GCM) to simulate interactions between the hydrological cycle and radiative processes on earth was examined by comparing various sensitivity relationships established by the model with those observed on earth, as well as the observed and calculated seasonal cycles of the greenhouse effect and cloud radiative forcing. Results showed that, although the model was able to simulate some aspects of the observed sensitivities well, there were many serious quantitative differences, including problems in the simulation of the column vapor in the tropics and an excessively strong clear-sky greenhouse effect in the mid-latitudes. These differences led to an underestimation by the model of the sensitivity of the clear-sky greenhouse effect to changes in sea surface temperature.

  9. A coupled sharp-interface immersed boundary-finite-element method for flow-structure interaction with application to human phonation.

    PubMed

    Zheng, X; Xue, Q; Mittal, R; Beilamowicz, S

    2010-11-01

    A new flow-structure interaction method is presented, which couples a sharp-interface immersed boundary method flow solver with a finite-element method based solid dynamics solver. The coupled method provides robust and high-fidelity solutions for complex flow-structure interaction (FSI) problems such as those involving three-dimensional flow and viscoelastic solids. The FSI solver is used to simulate flow-induced vibrations of the vocal folds during phonation. Both two- and three-dimensional models have been examined, and qualitative as well as quantitative comparisons have been made with established results in order to validate the solver. The solver is used to study the onset of phonation in a two-dimensional laryngeal model and the dynamics of the glottal jet in a three-dimensional model, and results from these studies are also presented.

  10. Quantitative analysis of background parenchymal enhancement in whole breast on MRI: Influence of menstrual cycle and comparison with a qualitative analysis.

    PubMed

    Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee

    2018-06-01

    We quantitatively analyzed background parenchymal enhancement (BPE) in the whole breast according to the menstrual cycle and compared the results with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software with MATLAB. From each voxel of the whole breast, the software calculated BPE using the following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group, showing significant differences (p = .009 for minimal vs. mild, p < 0.001 for other comparisons). Spearman's correlation test showed that there was a strong significant correlation between qualitative and quantitative BPE (r = 0.63, p < 0.001). The mean BPE value was 48.7% for patients in the first week of the menstrual cycle, 43.5% in the second week, 49% in the third week, and 49.4% for those in the fourth week. The difference between the second and fourth weeks was significant (p = .005). Median, 90th percentile, and 10th percentile values were also significantly different between the second and fourth weeks but not different in other comparisons (first vs. second, first vs. third, first vs. fourth, second vs. third, or third vs. fourth). Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
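
    The per-voxel computation quoted above is straightforward to express; the arrays below are synthetic stand-ins for pre- and post-contrast signal intensities, not patient data:

```python
import numpy as np

def bpe_map(baseline, post_contrast):
    """Per-voxel BPE (%) = (SI_post - SI_baseline) / SI_baseline * 100."""
    baseline = baseline.astype(float)
    return (post_contrast - baseline) / baseline * 100.0

# Synthetic stand-ins for fibroglandular-tissue voxels of one breast MRI.
rng = np.random.default_rng(2)
pre = rng.uniform(200.0, 400.0, size=(64, 64, 32))
post = pre * rng.uniform(1.2, 1.8, size=pre.shape)  # SI at 1 min 30 s

bpe = bpe_map(pre, post)
print(f"mean BPE = {bpe.mean():.1f}%, 10th/90th percentiles = "
      f"{np.percentile(bpe, [10, 90]).round(1)}")
```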

  11. Noise Maps for Quantitative and Clinical Severity Towards Long-Term ECG Monitoring.

    PubMed

    Everss-Villalba, Estrella; Melgarejo-Meseguer, Francisco Manuel; Blanco-Velasco, Manuel; Gimeno-Blanes, Francisco Javier; Sala-Pla, Salvador; Rojo-Álvarez, José Luis; García-Alberola, Arcadi

    2017-10-25

    Noise and artifacts are inherent contaminating components and are particularly present in Holter electrocardiogram (ECG) monitoring. The presence of noise is even more significant in long-term monitoring (LTM) recordings, as these are collected for several days in patients following their daily activities; hence, strong artifact components can temporarily impair the clinical measurements from the LTM recordings. Traditionally, the presence of noise has been dealt with as a problem of removing undesirable components, assessed by quantitative signal metrics such as the signal-to-noise ratio (SNR), but current systems do not provide any information about the true impact of noise on the clinical evaluation of the ECG. As a first step towards an alternative to classical approaches, this work assesses ECG quality under the assumption that an ECG has good quality when it is clinically interpretable. Therefore, our hypotheses are that it is possible (a) to create a clinical severity score for the effect of the noise on the ECG, (b) to characterize its consistency in terms of its temporal and statistical distribution, and (c) to use it for signal quality evaluation in LTM scenarios. For this purpose, a database of external event recorder (EER) signals is assembled and labeled from a clinical point of view for its use as the gold standard of noise severity categorization. These devices are assumed to capture those signal segments more prone to be corrupted with noise during long-term periods. Then, the ECG noise is characterized through the comparison of these clinical severity criteria with conventional quantitative metrics taken from traditional noise-removal approaches, and noise maps are proposed as a novel representation tool to achieve this comparison. Our results showed that neither of the benchmarked quantitative noise measurement criteria represents an accurate enough estimation of the clinical severity of the noise. A case study of long-term ECG is reported, showing the statistical and temporal correspondences and properties with respect to the EER signals used to create the gold standard for clinical noise. The proposed noise maps, together with the statistical consistency of the characterization of the clinical severity of noise, pave the way towards forthcoming systems that provide noise maps of clinical noise severity, allowing the user to process different ECG segments with different techniques and in terms of different measured clinical parameters.
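
    One of the conventional quantitative metrics the clinical severity criteria were benchmarked against is the segment-wise SNR; a minimal definition follows, applied to a crude synthetic segment (the waveform and noise are stand-ins, not the study's data):

```python
import numpy as np

def snr_db(signal, noise):
    """Segment SNR in dB from a clean reference and a noise estimate."""
    return 10.0 * np.log10(np.mean(signal ** 2) / np.mean(noise ** 2))

# Crude synthetic 1-s segment: a peaked ECG-like pulse train plus baseline
# wander standing in for a motion artifact.
t = np.linspace(0.0, 1.0, 500)
ecg = np.exp(-((t % 0.8) - 0.4) ** 2 / 0.001)      # R-peak-like pulses
wander = 0.3 * np.sin(2 * np.pi * 0.4 * t)
print(f"segment SNR = {snr_db(ecg, wander):.1f} dB")
```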

  12. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  13. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.
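
    AFM and MLDS are not reproduced here; as a minimal illustration of recovering scale values from pair-comparison judgements, the sketch below fits a related Bradley-Terry/Thurstone-style model by maximum likelihood to an invented win matrix:

```python
import numpy as np
from scipy.optimize import minimize

# Invented graded pair-comparison data: wins[i, j] = number of times stimulus
# i was judged more intense than stimulus j (10 comparisons per pair).
wins = np.array([[0, 8, 9, 9],
                 [2, 0, 7, 8],
                 [1, 3, 0, 8],
                 [1, 2, 2, 0]])

def neg_log_lik(s):
    """P(i judged above j) = logistic(s_i - s_j); sum over ordered pairs."""
    p = 1.0 / (1.0 + np.exp(-(s[:, None] - s[None, :])))
    mask = ~np.eye(len(s), dtype=bool)
    return -np.sum((wins * np.log(p))[mask])

res = minimize(neg_log_lik, x0=np.zeros(4), method="BFGS")
scale = res.x - res.x.min()        # scale values are defined up to a shift
print("estimated scale values:", np.round(scale, 2))
```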

  14. The Influence of Self-Efficacy Beliefs and Metacognitive Prompting on Genetics Problem Solving Ability among High School Students in Kenya

    NASA Astrophysics Data System (ADS)

    Aurah, Catherine Muhonja

    Within the framework of social cognitive theory, the influence of self-efficacy beliefs and metacognitive prompting on genetics problem solving ability among high school students in Kenya was examined through a mixed methods research design. A quasi-experimental study, supplemented by focus group interviews, was conducted to investigate both the outcomes and the processes of students' genetics problem-solving ability. Focus group interviews substantiated and supported findings from the quantitative instruments. The study was conducted in 17 high schools in Western Province, Kenya. A total of 2,138 high school students were purposively sampled. A sub-sample of 48 students participated in focus group interviews to understand their perspectives and experiences during the study so as to corroborate the quantitative data. Quantitative data were analyzed through descriptive statistics, zero-order correlations, 2 x 2 factorial ANOVA, and sequential hierarchical multiple regressions. Qualitative data were transcribed, coded, and reported thematically. Results revealed metacognitive prompts had significant positive effects on student problem-solving ability independent of gender. Self-efficacy and metacognitive prompting significantly predicted genetics problem-solving ability. Gender differences were revealed, with girls outperforming boys on the genetics problem-solving test. Furthermore, self-efficacy moderated the relationship between metacognitive prompting and genetics problem-solving ability. This study established a foundation for instructional methods for biology teachers and recommendations are made for implementing metacognitive prompting in a problem-based learning environment in high schools and science teacher education programs in Kenya.

  15. Comparison between data obtained through real-time data capture by SMS and a retrospective telephone interview.

    PubMed

    Johansen, Bendt; Wedderkopp, Niels

    2010-05-26

    The aims of the current study were: a) to quantitatively compare data obtained by Short Message Service (SMS) with data from a telephone interview, b) to investigate whether the respondents had found it acceptable to answer the weekly two SMS questions, c) to explore whether an additional weekly third SMS question would have been acceptable, and d) to calculate the total cost of using the SMS technology. SMS technology was used each week for 53 weeks to monitor 260 patients with low back pain (LBP) in a clinical study. Each week, these patients were asked the same two questions: "How many days in the past week have you had problems due to LBP?" and "How many days in the past week have you been off work due to LBP problems?" The last 31 patients were also contacted by telephone 53 weeks after recruitment and asked to recall the number of days with LBP problems and days off work for the a) past week, b) past month, and c) past year. The two sets of answers to the same questions for these patients were compared. Patients were also asked whether a third SMS question would have been acceptable. The test-retest reliability was compared for 1-week, 1-month, and 1-year. Bland-Altman limits of agreement were calculated. The two quantitative questions were reported as percentages. Actual costs for the SMS-Track-Questionnaire (SMS-T-Q) were compared with estimated costs for paper version surveys. There was high agreement between telephone interview and SMS-T-Q responses for the 1-week and 1-month recall. In contrast, the 1-year recall showed very low agreement. A third SMS question would have been acceptable. The SMS system was considerably less costly than a paper-based survey, beyond a certain threshold number of questionnaires. SMS-T-Q appears to be a cheaper and better method to collect reliable LBP data than paper-based surveys.
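    The Bland-Altman limits of agreement used above reduce to a short computation; a minimal sketch (the numbers are hypothetical, not the study's data):

    ```python
    import numpy as np

    def limits_of_agreement(a, b):
        """Bland-Altman bias and 95% limits of agreement between two methods."""
        d = np.asarray(a, float) - np.asarray(b, float)
        bias = d.mean()
        sd = d.std(ddof=1)
        return bias, bias - 1.96 * sd, bias + 1.96 * sd

    # e.g., days with LBP recalled by telephone vs recorded weekly by SMS
    bias, lo, hi = limits_of_agreement([3, 0, 7, 2, 5], [2, 0, 7, 3, 4])
    print(f"bias = {bias:.2f}, LoA = ({lo:.2f}, {hi:.2f})")
    ```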

  16. Selecting Evaluation Comparison Groups: A Cluster Analytic Approach.

    ERIC Educational Resources Information Center

    Davis, Todd Mclin; McLean, James E.

    A persistent problem in the evaluation of field-based projects is the lack of no-treatment comparison groups. Frequently, potential comparison groups are confounded by socioeconomic, racial, or other factors. Among the possible methods for dealing with this problem are various matching procedures, but they are cumbersome to use with multiple…

  17. The Development of Mathematical Knowledge for Teaching for Quantitative Reasoning Using Video-Based Instruction

    ERIC Educational Resources Information Center

    Walters, Charles David

    2017-01-01

    Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008)…

  18. Embedding Quantitative Methods by Stealth in Political Science: Developing a Pedagogy for Psephology

    ERIC Educational Resources Information Center

    Gunn, Andrew

    2017-01-01

    Student evaluations of quantitative methods courses in political science often reveal they are characterised by aversion, alienation and anxiety. As a solution to this problem, this paper describes a pedagogic research project with the aim of embedding quantitative methods by stealth into the first-year undergraduate curriculum. This paper…

  19. Integrating Quantitative Thinking into an Introductory Biology Course Improves Students' Mathematical Reasoning in Biological Contexts

    ERIC Educational Resources Information Center

    Hester, Susan; Buxner, Sanlyn; Elfring, Lisa; Nagy, Lisa

    2014-01-01

    Recent calls for improving undergraduate biology education have emphasized the importance of students learning to apply quantitative skills to biological problems. Motivated by students' apparent inability to transfer their existing quantitative skills to biological contexts, we designed and taught an introductory molecular and cell biology course…

  20. The new AP Physics exams: Integrating qualitative and quantitative reasoning

    NASA Astrophysics Data System (ADS)

    Elby, Andrew

    2015-04-01

    When physics instructors and education researchers emphasize the importance of integrating qualitative and quantitative reasoning in problem solving, they usually mean using those types of reasoning serially and separately: first students should analyze the physical situation qualitatively/conceptually to figure out the relevant equations, then they should process those equations quantitatively to generate a solution, and finally they should use qualitative reasoning to check that answer for plausibility (Heller, Keith, & Anderson, 1992). The new AP Physics 1 and 2 exams will, of course, reward this approach to problem solving. But one kind of free response question will demand and reward a further integration of qualitative and quantitative reasoning, namely mathematical modeling and sense-making: inventing new equations to capture a physical situation and focusing on proportionalities, inverse proportionalities, and other functional relations to infer what the equation "says" about the physical world. In this talk, I discuss examples of these qualitative-quantitative translation questions, highlighting how they differ from both standard quantitative and standard qualitative questions. I then discuss the kinds of modeling activities that can help AP and college students develop these skills and habits of mind.
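    As one worked illustration of this kind of qualitative-quantitative translation (my example, not one of the exam questions):

    ```latex
    % Period of a simple pendulum: T = 2\pi\sqrt{L/g}.
    % Since T \propto \sqrt{L}, quadrupling the length doubles the period:
    T' = 2\pi\sqrt{\frac{4L}{g}} = 2\left(2\pi\sqrt{\frac{L}{g}}\right) = 2T
    ```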

  1. Information-theoretic model comparison unifies saliency metrics

    PubMed Central

    Kümmerer, Matthias; Wallis, Thomas S. A.; Bethge, Matthias

    2015-01-01

    Learning the properties of an image associated with human gaze placement is important both for understanding how biological systems explore the environment and for computer vision applications. There is a large literature on quantitative eye movement models that seeks to predict fixations from images (sometimes termed “saliency” prediction). A major problem known to the field is that existing model comparison metrics give inconsistent results, causing confusion. We argue that the primary reason for these inconsistencies is because different metrics and models use different definitions of what a “saliency map” entails. For example, some metrics expect a model to account for image-independent central fixation bias whereas others will penalize a model that does. Here we bring saliency evaluation into the domain of information by framing fixation prediction models probabilistically and calculating information gain. We jointly optimize the scale, the center bias, and spatial blurring of all models within this framework. Evaluating existing metrics on these rephrased models produces almost perfect agreement in model rankings across the metrics. Model performance is separated from center bias and spatial blurring, avoiding the confounding of these factors in model comparison. We additionally provide a method to show where and how models fail to capture information in the fixations on the pixel level. These methods are readily extended to spatiotemporal models of fixation scanpaths, and we provide a software package to facilitate their use. PMID:26655340
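    A minimal sketch of the core information-gain computation (not the authors' released package), assuming each model assigns a probability to every fixated pixel:

    ```python
    import numpy as np

    def information_gain(p_model, p_baseline):
        """Mean log-likelihood advantage of a probabilistic fixation model
        over a baseline, in bits per fixation, at the fixated pixels."""
        p_model = np.asarray(p_model, float)
        p_baseline = np.asarray(p_baseline, float)
        return np.mean(np.log2(p_model) - np.log2(p_baseline))

    # probabilities each model assigns to actually fixated pixels (hypothetical)
    ig = information_gain([1e-4, 5e-5, 2e-4], [5e-5, 5e-5, 5e-5])
    print(f"information gain: {ig:.2f} bits/fixation")
    ```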

  2. High-Speed Microscale Optical Tracking Using Digital Frequency-Domain Multiplexing.

    PubMed

    Maclachlan, Robert A; Riviere, Cameron N

    2009-06-01

    Position-sensitive detectors (PSDs), or lateral-effect photodiodes, are commonly used for high-speed, high-resolution optical position measurement. This paper describes the instrument design for multidimensional position and orientation measurement based on the simultaneous position measurement of multiple modulated sources using frequency-domain-multiplexed (FDM) PSDs. The important advantages of this optical configuration in comparison with laser/mirror combinations are that it has a large angular measurement range and allows the use of a probe that is small in comparison with the measurement volume. We review PSD characteristics and quantitative resolution limits, consider the lock-in amplifier measurement system as a communication link, discuss the application of FDM to PSDs, and make comparisons with time-domain techniques. We consider the phase-sensitive detector as a multirate DSP problem, explore parallels with Fourier spectral estimation and filter banks, discuss how to choose the modulation frequencies and sample rates that maximize channel isolation under design constraints, and describe efficient digital implementation. We also discuss hardware design considerations, sensor calibration, probe construction and calibration, and 3-D measurement by triangulation using two sensors. As an example, we characterize the resolution, speed, and accuracy of an instrument that measures the position and orientation of a 10 mm × 5 mm probe in 5 degrees of freedom (DOF) over a 30-mm cube with 4-μm peak-to-peak resolution at 1-kHz sampling.
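    For orientation, the standard one-axis position estimate for a duolateral PSD follows from the split of photocurrent between the two electrodes; a minimal sketch with hypothetical values:

    ```python
    def psd_position(i1, i2, length_mm):
        """Spot position along one axis of a lateral-effect photodiode,
        from the two electrode currents and the active length."""
        return 0.5 * length_mm * (i2 - i1) / (i1 + i2)

    # photocurrents in arbitrary units, 10 mm active length
    print(psd_position(0.4, 0.6, 10.0))  # 1.0 mm from center toward electrode 2
    ```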

  3. A Fiducial Approach to Extremes and Multiple Comparisons

    ERIC Educational Resources Information Center

    Wandler, Damian V.

    2010-01-01

    Generalized fiducial inference is a powerful tool for many difficult problems. Based on an extension of R. A. Fisher's work, we used generalized fiducial inference for two extreme value problems and a multiple comparison procedure. The first extreme value problem is dealing with the generalized Pareto distribution. The generalized Pareto…

  4. A New Problem-Posing Approach Based on Problem-Solving Strategy: Analyzing Pre-Service Primary School Teachers' Performance

    ERIC Educational Resources Information Center

    Kiliç, Çigdem

    2017-01-01

    This study examined pre-service primary school teachers' performance in posing problems that require knowledge of problem-solving strategies. Quantitative and qualitative methods were combined. The 120 participants were asked to pose a problem that could be solved by using a particular problem-solving strategy, find-a-pattern. After that,…

  5. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing

    PubMed Central

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-01-01

    Aims: A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods: We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results: Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions: The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays prior to implementation in the clinical molecular laboratory. PMID:28747393
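    A generic closed-form Deming regression (a hedged sketch, not the authors' code; lam is the assumed ratio of error variances var(y)/var(x), with lam = 1 giving orthogonal regression):

    ```python
    import numpy as np

    def deming(x, y, lam=1.0):
        x, y = np.asarray(x, float), np.asarray(y, float)
        sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
        sxy = np.cov(x, y)[0, 1]
        slope = ((syy - lam * sxx) + np.sqrt((syy - lam * sxx) ** 2
                 + 4 * lam * sxy ** 2)) / (2 * sxy)
        intercept = y.mean() - slope * x.mean()
        return slope, intercept

    # e.g., variant allele fractions from the assay in validation vs the reference
    print(deming([0.10, 0.25, 0.40, 0.55], [0.12, 0.26, 0.43, 0.58]))
    ```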

  6. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate the two methods. The results indicate that both methods can be applied in practice, and that the choice between them depends on the available base data for the gas pipelines and the required precision of the risk assessment.

  7. Direct Estimation of Optical Parameters From Photoacoustic Time Series in Quantitative Photoacoustic Tomography.

    PubMed

    Pulkkinen, Aki; Cox, Ben T; Arridge, Simon R; Goh, Hwan; Kaipio, Jari P; Tarvainen, Tanja

    2016-11-01

    Estimation of the optical absorption and scattering of a target is an inverse problem associated with quantitative photoacoustic tomography. Conventionally, the problem is approached in two stages. First, images of the initial pressure distribution created by absorption of a light pulse are formed based on acoustic boundary measurements. Then, the optical properties are determined based on these photoacoustic images. The optical stage of the inverse problem can thus suffer from, for example, artefacts caused by the acoustic stage. These could be caused by imperfections in the acoustic measurement setting, an example of which is a limited-view acoustic measurement geometry. In this work, the forward model of quantitative photoacoustic tomography is treated as a coupled acoustic and optical model and the inverse problem is solved using a Bayesian approach. Spatial distributions of the optical properties of the imaged target are estimated directly from the photoacoustic time series in varying acoustic detection and optical illumination configurations. It is numerically demonstrated that estimation of the optical properties of the imaged target is feasible in a limited-view acoustic detection setting.

  8. Comparative study of the dynamics of lipid membrane phase decomposition in experiment and simulation.

    PubMed

    Burger, Stefan; Fraunholz, Thomas; Leirer, Christian; Hoppe, Ronald H W; Wixforth, Achim; Peter, Malte A; Franke, Thomas

    2013-06-25

    Phase decomposition in lipid membranes has been the subject of numerous investigations by both experiment and theoretical simulation, yet quantitative comparisons of the simulated data to the experimental results are rare. In this work, we present a novel way of comparing the temporal development of liquid-ordered domains obtained from numerically solving the Cahn-Hilliard equation and by inducing a phase transition in giant unilamellar vesicles (GUVs). Quantitative comparison is done by calculating the structure factor of the domain pattern. It turns out that the decomposition takes place in three distinct regimes in both experiment and simulation. These regimes are characterized by different rates of growth of the mean domain diameter, and there is quantitative agreement between experiment and simulation as to the duration of each regime and the absolute rate of growth in each regime.
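    A minimal sketch of a radially averaged structure factor for a 2-D domain image (simple FFT power spectrum plus radial binning; not the authors' exact pipeline):

    ```python
    import numpy as np

    def radial_structure_factor(img, nbins=50):
        f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
        power = np.abs(f) ** 2
        ny, nx = img.shape
        ky, kx = np.indices((ny, nx))
        k = np.hypot(ky - ny // 2, kx - nx // 2)
        edges = np.linspace(0, k.max(), nbins + 1)
        idx = np.digitize(k.ravel(), edges)
        s_k = np.array([power.ravel()[idx == i].mean() if np.any(idx == i) else 0.0
                        for i in range(1, nbins + 1)])
        return 0.5 * (edges[1:] + edges[:-1]), s_k  # bin centers, S(k)
    ```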

  9. Inquiry-based problem solving in introductory physics

    NASA Astrophysics Data System (ADS)

    Koleci, Carolann

    What makes problem solving in physics difficult? How do students solve physics problems, and how does this compare to an expert physicist's strategy? Over the past twenty years, physics education research has revealed several differences between novice and expert problem solving. The work of Chi, Feltovich, and Glaser demonstrates that novices tend to categorize problems based on surface features, while experts categorize according to theory, principles, or concepts [1]. If there are differences between how problems are categorized, then are there differences between how physics problems are solved? Learning more about the problem solving process, including how students like to learn and what is most effective, requires both qualitative and quantitative analysis. In an effort to learn how novices and experts solve introductory electricity problems, a series of in-depth interviews were conducted, transcribed, and analyzed, using both qualitative and quantitative methods. One-way ANOVA tests were performed in order to learn if there are any significant problem solving differences between: (a) novices and experts, (b) genders, (c) students who like to answer questions in class and those who don't, (d) students who like to ask questions in class and those who don't, (e) students employing an interrogative approach to problem solving and those who don't, and (f) those who like physics and those who dislike it. The results of both the qualitative and quantitative methods reveal that inquiry-based problem solving is prevalent among novices and experts, and frequently leads to the correct physics. These findings serve as impetus for the third dimension of this work: the development of Choose Your Own Adventure Physics© (CYOAP), an innovative teaching tool in physics which encourages inquiry-based problem solving. [1] Chi, M., P. Feltovich, R. Glaser, "Categorization and Representation of Physics Problems by Experts and Novices", Cognitive Science, 5, 121-152 (1981).

  10. Simulation of realistic abnormal SPECT brain perfusion images: application in semi-quantitative analysis

    NASA Astrophysics Data System (ADS)

    Ward, T.; Fleming, J. S.; Hoffmann, S. M. A.; Kemp, P. M.

    2005-11-01

    Simulation is useful in the validation of functional image analysis methods, particularly given the number of analysis techniques currently available that lack thorough validation. Problems exist with current simulation methods due to long run times or unrealistic results, making it problematic to generate complete datasets. A method is presented for simulating known abnormalities within normal brain SPECT images using a measured point spread function (PSF) and incorporating a stereotactic atlas of the brain for anatomical positioning. This allows the simulation of realistic images through the use of prior information regarding disease progression. SPECT images of cerebral perfusion have been generated consisting of a control database and a group of simulated abnormal subjects, to be used in a UK audit of analysis methods. The abnormality is defined in the stereotactic space, transformed to the individual subject space, convolved with a measured PSF, and removed from the normal subject image. The dataset was analysed using SPM99 (Wellcome Department of Imaging Neuroscience, University College, London) and the MarsBaR volume of interest (VOI) analysis toolbox. The results were evaluated by comparison with the known ground truth. The analysis showed improvement when using a smoothing kernel equal to the system resolution over the slightly larger kernel used routinely. Significant correlation was found between the effective volume of a simulated abnormality and the detected size using SPM99. Improvements in VOI analysis sensitivity were found when using the region median over the region mean. The method and dataset provide an efficient methodology for the comparison and cross-validation of semi-quantitative analysis methods in brain SPECT, and allow the optimization of analysis parameters.

  11. Evaluating motion processing algorithms for use with functional near-infrared spectroscopy data from young children.

    PubMed

    Delgado Reyes, Lourdes M; Bohache, Kevin; Wijeakumar, Sobanawartiny; Spencer, John P

    2018-04-01

    Motion artifacts are often a significant component of the measured signal in functional near-infrared spectroscopy (fNIRS) experiments. A variety of methods have been proposed to address this issue, including principal components analysis (PCA), correlation-based signal improvement (CBSI), wavelet filtering, and spline interpolation. The efficacy of these techniques has been compared using simulated data; however, our understanding of how these techniques fare when dealing with task-based cognitive data is limited. Brigadoi et al. compared motion correction techniques in a sample of adult data measured during a simple cognitive task. Wavelet filtering showed the most promise as an optimal technique for motion correction. Given that fNIRS is often used with infants and young children, it is critical to evaluate the effectiveness of motion correction techniques directly with data from these age groups. This study addresses that problem by evaluating motion correction algorithms implemented in HomER2. The efficacy of each technique was compared quantitatively using objective metrics related to the physiological properties of the hemodynamic response. Results showed that targeted PCA (tPCA), spline, and CBSI retained a higher number of trials. These techniques also performed well in direct head-to-head comparisons with the other approaches using quantitative metrics. The CBSI method corrected many of the artifacts present in our data; however, this approach sometimes produced unstable hemodynamic response functions (HRFs). The targeted PCA and spline methods proved to be the most robust, performing well across all comparison metrics. When compared head to head, tPCA consistently outperformed spline. We conclude, therefore, that tPCA is an effective technique for correcting motion artifacts in fNIRS data from young children.
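    One common formulation of CBSI (a sketch of the general idea, not the HomER2 implementation) assumes the true HbO and HbR signals are anticorrelated with amplitude ratio alpha, while motion artifacts appear in both with the same sign:

    ```python
    import numpy as np

    def cbsi(hbo, hbr):
        hbo, hbr = np.asarray(hbo, float), np.asarray(hbr, float)
        alpha = np.std(hbo) / np.std(hbr)   # amplitude ratio
        hbo_c = 0.5 * (hbo - alpha * hbr)   # corrected HbO
        hbr_c = -hbo_c / alpha              # corrected HbR (forced anticorrelation)
        return hbo_c, hbr_c
    ```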

  12. Later abortions and mental health: psychological experiences of women having later abortions--a critical review of research.

    PubMed

    Steinberg, Julia R

    2011-01-01

    Some abortion policies in the U.S. are based on the notion that abortion harms women's mental health. The American Psychological Association (APA) Task Force on Abortion and Mental Health concluded that first-trimester abortions do not harm women's mental health. However, the APA task force does not make conclusions regarding later abortions (second trimester or beyond) and mental health. This paper critically evaluates studies on later abortion and mental health in order to inform both policy and practice. Using guidelines outlined by Steinberg and Russo (2009), post-1989 quantitative studies on later abortion and mental health were evaluated on the following qualities: 1) composition of comparison groups, 2) how prior mental health was assessed, and 3) whether common risk factors were controlled for in analyses if a significant relationship between abortion and mental health was found. Studies were evaluated with respect to the claim that later abortions harm women's mental health. Eleven quantitative studies that compared the mental health of women having later abortions (for reasons of fetal anomaly) with other groups were evaluated. Findings differed depending on the comparison group. No studies considered the role of prepregnancy mental health, and only one study considered whether factors common among women having later abortions and mental health problems drove the association between later abortion and mental health. Policies based on the notion that later abortions (because of fetal anomaly) harm women's mental health are unwarranted. Because research suggests that most women who have later abortions do so for reasons other than fetal anomaly, future investigations should examine women's psychological experiences around later abortions.

  13. Single Laboratory Comparison of Quantitative Real-Time PCR Assays for the Detection of Human Fecal Pollution

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) methods available to detect and enumerate human fecal pollution in ambient waters. Each assay employs distinct primers and/or probes and many target different genes and microorganisms leading to potential variations in method ...

  14. Employment from Solar Energy: A Bright but Partly Cloudy Future.

    ERIC Educational Resources Information Center

    Smeltzer, K. K.; Santini, D. J.

    A comparison of quantitative and qualitative employment effects of solar and conventional systems can prove the increased employment postulated as one of the significant secondary benefits of a shift from conventional to solar energy use. Current quantitative employment estimates show solar technology-induced employment to be generally greater…

  15. Using Facebook as a LMS?

    ERIC Educational Resources Information Center

    Arabacioglu, Taner; Akar-Vural, Ruken

    2014-01-01

    The main purpose of this research was to compare the communication media according to effective teaching. For this purpose, in the research, the mixed method, including quantitative and qualitative data collecting techniques, was applied. For the quantitative part of the research, the static group comparison design was implemented as one of the…

  16. Single Laboratory Comparison of Quantitative Real-time PCR Assays for the Detection of Fecal Pollution

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) assays available to detect and enumerate fecal pollution in ambient waters. Each assay employs distinct primers and probes that target different rRNA genes and microorganisms leading to potential variations in concentration es...

  17. Comparison of genetic diversity and population structure of Pacific Coast whitebark pine across multiple markers

    Treesearch

    Andrew D. Bower; Bryce A. Richardson; Valerie Hipkins; Regina Rochefort; Carol Aubry

    2011-01-01

    Analysis of "neutral" molecular markers and "adaptive" quantitative traits are common methods of assessing genetic diversity and population structure. Molecular markers typically reflect the effects of demographic and stochastic processes but are generally assumed to not reflect natural selection. Conversely, quantitative (or "adaptive")...

  18. [The development of a computer model in the quantitative assessment of thallium-201 myocardial scintigraphy].

    PubMed

    Raineri, M; Traina, M; Rotolo, A; Candela, B; Lombardo, R M; Raineri, A A

    1993-05-01

    Thallium-201 scintigraphy is a widely used noninvasive procedure for the detection and prognostic assessment of patients with suspected or proven coronary artery disease. Thallium uptake can be evaluated by visual analysis or by quantitative interpretation. Quantitative scintigraphy enhances disease detection in individual coronary arteries and provides a more precise estimate of the amount of ischemic myocardium, distinguishing scar from hypoperfused tissue. Because of the great amount of data, the analysis, interpretation and comparison of thallium uptake can be very complex. We designed a computer-based system for the interpretation of quantitative thallium-201 scintigraphy uptake data. We used a database (DataEase 4.2-DataEase Italia). Our software has the following functions: data storage; calculation; conversion of numerical data into different definitions classifying myocardial perfusion; uptake data comparison; automatic conclusions; comparison of different scintigrams for the same patient. Our software is made up of 4 sections: numeric analysis, descriptive analysis, automatic conclusion, and clinical remarks. We introduced into the computer system appropriate information, "logical paths", that use "IF ... THEN" rules. The software executes these rules in order to analyze the myocardial regions in the 3 phases of scintigraphic analysis (stress, redistribution, re-injection) and in the 3 projections (LAO 45 degrees, LAT, ANT), considering our uptake cutoff, and finally produces the automatic conclusions. For these reasons, our computer-based system could be considered a real "expert system".
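    A toy illustration of the "IF ... THEN" style of rule described above (the cutoff and labels are hypothetical, not the authors' published thresholds):

    ```python
    # Classify one myocardial segment from normalized uptake in the three phases.
    def classify_segment(stress, redistribution, reinjection, cutoff=0.7):
        if stress >= cutoff:
            return "normal perfusion"
        if redistribution >= cutoff or reinjection >= cutoff:
            return "reversible defect (viable, hypoperfused tissue)"
        return "fixed defect (probable scar)"

    print(classify_segment(stress=0.55, redistribution=0.75, reinjection=0.80))
    ```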

  19. Robust and transferable quantification of NMR spectral quality using IROC analysis

    NASA Astrophysics Data System (ADS)

    Zambrello, Matthew A.; Maciejewski, Mark W.; Schuyler, Adam D.; Weatherby, Gerard; Hoch, Jeffrey C.

    2017-12-01

    Non-Fourier methods are increasingly utilized in NMR spectroscopy because of their ability to handle nonuniformly-sampled data. However, non-Fourier methods present unique challenges due to their nonlinearity, which can produce nonrandom noise and render conventional metrics for spectral quality such as signal-to-noise ratio unreliable. The lack of robust and transferable metrics (i.e. applicable to methods exhibiting different nonlinearities) has hampered comparison of non-Fourier methods and nonuniform sampling schemes, preventing the identification of best practices. We describe a novel method, in situ receiver operating characteristic analysis (IROC), for characterizing spectral quality based on the Receiver Operating Characteristic curve. IROC utilizes synthetic signals added to empirical data as "ground truth", and provides several robust scalar-valued metrics for spectral quality. This approach avoids problems posed by nonlinear spectral estimates, and provides a versatile quantitative means of characterizing many aspects of spectral quality. We demonstrate applications to parameter optimization in Fourier and non-Fourier spectral estimation, critical comparison of different methods for spectrum analysis, and optimization of nonuniform sampling schemes. The approach will accelerate the discovery of optimal approaches to nonuniform sampling experiment design and non-Fourier spectrum analysis for multidimensional NMR.

  20. Interpretation of protein quantitation using the Bradford assay: comparison with two calculation models.

    PubMed

    Ku, Hyung-Keun; Lim, Hyuk-Min; Oh, Kyong-Hwa; Yang, Hyo-Jin; Jeong, Ji-Seon; Kim, Sook-Kyung

    2013-03-01

    The Bradford assay is a simple method for protein quantitation, but variation in the results between proteins is a matter of concern. In this study, we compared and normalized quantitative values from two models for protein quantitation, where the residues in the protein that bind to anionic Coomassie Brilliant Blue G-250 comprise either Arg and Lys (Method 1, M1) or Arg, Lys, and His (Method 2, M2). Use of the M2 model yielded much more consistent quantitation values compared with use of the M1 model, which exhibited marked overestimations against protein standards.
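    As a hedged illustration of the difference between the two models (the sequence is hypothetical), one can simply count the candidate dye-binding residues:

    ```python
    from collections import Counter

    def dye_binding_residues(seq):
        """Counts of Coomassie-binding residues: M1 counts Arg + Lys,
        M2 adds His. seq is a one-letter amino acid string."""
        c = Counter(seq.upper())
        m1 = c["R"] + c["K"]
        return m1, m1 + c["H"]

    print(dye_binding_residues("MKHRRAGLKHHK"))  # -> (5, 8)
    ```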

  1. Visually estimated ejection fraction by two dimensional and triplane echocardiography is closely correlated with quantitative ejection fraction by real-time three dimensional echocardiography.

    PubMed

    Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Ake; Winter, Reidar

    2009-08-25

    Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, the feasibility of visual assessment (eyeballing) is superior. To date there are only sparse data comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE), using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and the TP mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. There was an excellent correlation between eyeballing EF by 2D and TP vs 3DE (r = 0.91 and 0.95, respectively) without any significant bias (-0.5 +/- 3.7% and -0.2 +/- 2.9%, respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TP and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2D and 8.4% for eyeballing TP. Visual estimation of LVEF both using 2D and TP by an experienced reader correlates well with quantitative EF determined by RT3DE. There is an apparent trend towards smaller variability using TP in comparison to 2D; this was however not statistically significant.

  2. Comparison and quantitative verification of mapping algorithms for whole genome bisulfite sequencing

    USDA-ARS?s Scientific Manuscript database

    Coupling bisulfite conversion with next-generation sequencing (Bisulfite-seq) enables genome-wide measurement of DNA methylation, but poses unique challenges for mapping. However, despite a proliferation of Bisulfite-seq mapping tools, no systematic comparison of their genomic coverage and quantitat...

  3. Fetal growth and psychiatric and socioeconomic problems: population-based sibling comparison

    PubMed Central

    Class, Quetzal A.; Rickert, Martin E.; Larsson, Henrik; Lichtenstein, Paul; D’Onofrio, Brian M.

    2014-01-01

    Background: It is unclear whether associations between fetal growth and psychiatric and socioeconomic problems are consistent with causal mechanisms. Aims: To estimate the extent to which associations are a result of unmeasured confounding factors using a sibling-comparison approach. Method: We predicted outcomes from continuously measured birth weight in a Swedish population cohort (n = 3 291 773), while controlling for measured and unmeasured confounding. Results: In the population, lower birth weight (⩽2500 g) increased the risk of all outcomes. Sibling-comparison models indicated that lower birth weight independently predicted increased risk for autism spectrum disorder (hazard ratio for low birth weight = 2.44, 95% CI 1.99-2.97) and attention-deficit hyperactivity disorder. Although attenuated, associations remained for psychotic or bipolar disorder and educational problems. Associations with suicide attempt, substance use problems and social welfare receipt, however, were fully attenuated in sibling comparisons. Conclusions: Results suggest that fetal growth, and factors that influence it, contribute to psychiatric and socioeconomic problems. PMID:25257067

  5. Assessment of cleaning and disinfection in Salmonella-contaminated poultry layer houses using qualitative and semi-quantitative culture techniques.

    PubMed

    Wales, Andrew; Breslin, Mark; Davies, Robert

    2006-09-10

    Salmonella infection of laying flocks in the UK is predominantly a problem of the persistent contamination of layer houses and associated wildlife vectors by Salmonella Enteritidis. Methods for its control and elimination include effective cleaning and disinfection of layer houses between flocks, and it is important to be able to measure the success of such decontamination. A method for the environmental detection and semi-quantitative enumeration of salmonellae was used and compared with a standard qualitative method in 12 Salmonella-contaminated caged layer houses before and after cleaning and disinfection. The quantitative technique proved to have comparable sensitivity to the standard method, and additionally provided insights into the numerical Salmonella challenge that replacement flocks would encounter. Elimination of S. Enteritidis was not achieved in any of the premises examined; in some, substantial reductions in the prevalence and numbers of salmonellae were demonstrated, whilst in others an increase in contamination was observed after cleaning and disinfection. Particular problems with feeders and wildlife vectors were highlighted. The use of a quantitative method assisted the identification of problem areas, such as those with a high initial bacterial load or those experiencing only a modest reduction in bacterial count following decontamination.

  6. Inverse transport problems in quantitative PAT for molecular imaging

    NASA Astrophysics Data System (ADS)

    Ren, Kui; Zhang, Rongting; Zhong, Yimin

    2015-12-01

    Fluorescence photoacoustic tomography (fPAT) is a molecular imaging modality that combines photoacoustic tomography with fluorescence imaging to obtain high-resolution imaging of fluorescence distributions inside heterogeneous media. The objective of this work is to study inverse problems in the quantitative step of fPAT where we intend to reconstruct physical coefficients in a coupled system of radiative transport equations using internal data recovered from ultrasound measurements. We derive uniqueness and stability results on the inverse problems and develop some efficient algorithms for image reconstructions. Numerical simulations based on synthetic data are presented to validate the theoretical analysis. The results we present here complement those in Ren K and Zhao H (2013 SIAM J. Imaging Sci. 6 2024-49) on the same problem but in the diffusive regime.

  7. Chemometric comparison of polychlorinated biphenyl residues and toxicologically active polychlorinated biphenyl congeners in the eggs of Forster's Terns (Sterna fosteri)

    USGS Publications Warehouse

    Schwartz, Ted R.; Stalling, David L.

    1991-01-01

    The separation and characterization of complex mixtures of polychlorinated biphenyls (PCBs) is approached from the perspective of a problem in chemometrics. A technique for the quantitative determination of PCB congeners is described, as well as an enrichment technique designed to isolate only those congener residues which induce mixed aryl hydrocarbon hydroxylase enzyme activity. A congener-specific procedure is utilized for the determination of PCBs in which n-alkyl trichloroacetates are used as retention index marker compounds. Retention indices are reproducible in the range of ±0.05 to ±0.7 depending on the specific congener. A laboratory data base system developed to aid in the editing and quantitation of data generated from capillary gas chromatography was employed to quantitate chromatographic data. Data base management was provided by computer programs written in VAX-DSM (Digital Standard MUMPS) for the VAX-DEC (Digital Equipment Corp.) family of computers. In the chemometric evaluation of these complex chromatographic profiles, the data from a single analysis are viewed as a point in multi-dimensional space. Principal Components Analysis was used to obtain a representation of the data in a lower dimensional space. Two- and three-dimensional projections based on sample scores from the principal components models were used to visualize the behavior of Aroclor® mixtures. These models can be used to determine whether new sample profiles may be represented by Aroclor profiles. Concentrations of individual congeners of a given chlorine substitution may be summed to form homologue concentrations. However, the use of homologue concentrations in classification studies with environmental samples can lead to erroneous conclusions about sample similarity. Chemometric applications are discussed for the evaluation of Aroclor mixture analysis and the compositional description of environmental residues of PCBs in eggs of Forster's terns (Sterna fosteri) collected from colonies near Lake Poygan and Green Bay, Wisconsin. The application of chemometrics is extended to the comparison of: a) Aroclors and PCB-containing environmental samples; and b) fractions of Aroclors and of environmental samples that have been enriched in congeners which induce mixed aryl hydrocarbon hydroxylase enzyme activity.
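    A minimal sketch of the principal components scoring described (placeholder data and scikit-learn, rather than the VAX-era software used in the study):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # rows = samples (Aroclor standards plus egg extracts); columns = congener
    # concentrations; the values below are placeholders, not data from the study
    profiles = np.random.default_rng(1).random((12, 30))

    pca = PCA(n_components=3)
    scores = pca.fit_transform(profiles)   # low-dimensional sample scores
    print(pca.explained_variance_ratio_)   # variance captured per component
    # plotting scores[:, 0] against scores[:, 1] visualizes sample similarity
    ```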

  8. Accuracy and precision of pseudo-continuous arterial spin labeling perfusion during baseline and hypercapnia: a head-to-head comparison with ¹⁵O H₂O positron emission tomography.

    PubMed

    Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J

    2014-05-15

    Measurements of the cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure the CBF, whereas additional hypercapnic pCASL measurements are currently showing great promise to quantitatively assess the CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of its exact accuracy and precision compared to the gold standard. ¹⁵O-H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is also one of the more invasive methods. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with ¹⁵O-H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to ¹⁵O-H₂O PET, with comparable precision. These results pave the way for quantitative use of pCASL MRI in both clinical and research settings.
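    For background, a consensus-style single-compartment pCASL quantification model (shown for orientation only; the paper may use a different formulation):

    ```latex
    CBF = \frac{6000\,\lambda\,\Delta M\,e^{PLD/T_{1,\mathrm{blood}}}}
               {2\,\alpha\,T_{1,\mathrm{blood}}\,M_{0}\left(1 - e^{-\tau/T_{1,\mathrm{blood}}}\right)}
    \quad\left[\mathrm{ml/100\,g/min}\right]
    ```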

  9. Qualitative and quantitative reasoning about thermodynamics

    NASA Technical Reports Server (NTRS)

    Skorstad, Gordon; Forbus, Ken

    1989-01-01

    One goal of qualitative physics is to capture the tacit knowledge of engineers and scientists. It is shown how Qualitative Process theory can be used to express concepts of engineering thermodynamics. In particular, it is shown how to integrate qualitative and quantitative knowledge to solve textbook problems involving thermodynamic cycles, such as gas turbine plants and steam power plants. These ideas were implemented in a program called SCHISM. Its analysis of a sample textbook problem is described and plans for future work are discussed.

  10. Core-collapse supernovae as supercomputing science: A status report toward six-dimensional simulations with exact Boltzmann neutrino transport in full general relativity

    NASA Astrophysics Data System (ADS)

    Kotake, Kei; Sumiyoshi, Kohsuke; Yamada, Shoichi; Takiwaki, Tomoya; Kuroda, Takami; Suwa, Yudai; Nagakura, Hiroki

    2012-08-01

    This is a status report on our endeavor to reveal the mechanism of core-collapse supernovae (CCSNe) by large-scale numerical simulations. Multi-dimensionality of the supernova engine, general relativistic magnetohydrodynamics, energy and lepton number transport by neutrinos emitted from the forming neutron star, as well as nuclear interactions there, are all believed to play crucial roles in repelling infalling matter and producing energetic explosions. These ingredients are non-linearly coupled with one another in the dynamics of core collapse, bounce, and shock expansion. Serious quantitative studies of CCSNe hence make extensive numerical computations mandatory. Since neutrinos are neither in thermal nor in chemical equilibrium in general, their distributions in the phase space should be computed. This is a six-dimensional (6D) neutrino transport problem and quite a challenge, even for those with access to the most advanced numerical resources such as the "K computer". To tackle this problem, we have embarked on efforts on multiple fronts. In particular, we report in this paper our recent progress in the treatment of multidimensional (multi-D) radiation hydrodynamics. We are currently proceeding on two different paths to the ultimate goal. In one approach, we employ an approximate but highly efficient scheme for neutrino transport and treat 3D hydrodynamics and/or general relativity rigorously; some neutrino-driven explosions will be presented and quantitative comparisons will be made between 2D and 3D models. In the second approach, on the other hand, exact, but so far Newtonian, Boltzmann equations are solved in two and three spatial dimensions; we will show some example test simulations. We will also address the perspectives of exascale computations on next-generation supercomputers.

  11. Single Laboratory Comparison of Quantitative Real-Time PCR Assays for the Detection of Human Fecal Pollution - Poster

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) methods available to detect and enumerate human fecal pollution in ambient waters. Each assay employs distinct primers and/or probes and many target different genes and microorganisms leading to potential variations in method p...

  12. Exciting New Images | Lunar Reconnaissance Orbiter Camera

    Science.gov Websites

    Impact cratering slowly and relentlessly reshapes the Moon's topography. How can a quantitative comparison of the shapes of lunar craters be derived, and how can the topography of a large number of craters be quantified and compared? An approach for the quantitative characterization of impact crater topography is discussed (Mahanti, P. et al., 2014, Icarus v. 241).

  13. A Comparison of Learning Cultures in Different Sizes and Types

    ERIC Educational Resources Information Center

    Brown, Paula D.; Finch, Kim S.; MacGregor, Cynthia

    2012-01-01

    This study compared relevant data and information about leadership and learning cultures in different sizes and types of high schools. Research was conducted using a quantitative design with a qualitative element. Quantitative data were gathered using a researcher-created survey. Independent sample t-tests were conducted to analyze the means of…

  14. Does Pre-Service Preparation Matter? Examining an Old Question in New Ways

    ERIC Educational Resources Information Center

    Ronfeldt, Matthew

    2014-01-01

    Background: Over the past decade, most of the quantitative studies on teacher preparation have focused on comparisons between alternative and traditional routes. There has been relatively little quantitative research on specific features of teacher education that might cause certain pathways into teaching to be more effective than others. The vast…

  15. Detection limits and cost comparisons of human- and gull-associated conventional and quantitative PCR assays in artificial and environmental waters

    EPA Science Inventory

    Modern techniques for tracking fecal pollution in environmental waters require investing in DNA-based methods to determine the presence of specific fecal sources. To help water quality managers decide whether to employ routine polymerase chain reaction (PCR) or quantitative PC...

  16. Comparison of quantitative PCR assays for Escherichia coli targeting ribosomal RNA and single copy genes

    EPA Science Inventory

    Aims: Compare specificity and sensitivity of quantitative PCR (qPCR) assays targeting single and multi-copy gene regions of Escherichia coli. Methods and Results: A previously reported assay targeting the uidA gene (uidA405) was used as the basis for comparing the taxono...

  17. Comprehensive Comparison of Self-Administered Questionnaires for Measuring Quantitative Autistic Traits in Adults

    ERIC Educational Resources Information Center

    Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M.

    2014-01-01

    We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype…

  18. KEY COMPARISON: CCQM-K61: Quantitation of a linearised plasmid DNA, based on a matched standard in a matrix of non-target DNA

    NASA Astrophysics Data System (ADS)

    Woolford, Alison; Holden, Marcia; Salit, Marc; Burns, Malcolm; Ellison, Stephen L. R.

    2009-01-01

    Key comparison CCQM-K61 was performed to demonstrate and document the capability of interested national metrology institutes in the determination of the quantity of a specific DNA target in an aqueous solution. The study provides support for the following measurement claim: "Quantitation of a linearised plasmid DNA, based on a matched standard in a matrix of non-target DNA". The comparison was an activity of the Bioanalysis Working Group (BAWG) of the Comité Consultatif pour la Quantité de Matière and was coordinated by NIST (Gaithersburg, USA) and LGC (Teddington, UK). The following laboratories (in alphabetical order) participated in this key comparison: DMSC (Thailand); IRMM (European Union); KRISS (Republic of Korea); LGC (UK); NIM (China); NIST (USA); NMIA (Australia); NMIJ (Japan); VNIIM (Russian Federation). Good agreement was observed between the reported results of all nine participants. Uncertainty estimates did not account fully for the dispersion of results, even after allowance for possible inhomogeneity in the calibration materials. Preliminary studies suggest that the effects of fluorescence threshold setting might contribute to the excess dispersion, and further study of this topic is suggested.

  19. Quantitative comparison of tumor delivery for multiple targeted nanoparticles simultaneously by multiplex ICP-MS.

    PubMed

    Elias, Andrew; Crayton, Samuel H; Warden-Rothman, Robert; Tsourkas, Andrew

    2014-07-28

    Given the rapidly expanding library of disease biomarkers and targeting agents, the number of unique targeted nanoparticles is growing exponentially. The high variability and expense of animal testing often makes it unfeasible to examine this large number of nanoparticles in vivo. This often leads to the investigation of a single formulation that performed best in vitro. However, nanoparticle performance in vivo depends on many variables, many of which cannot be adequately assessed with cell-based assays. To address this issue, we developed a lanthanide-doped nanoparticle method that allows quantitative comparison of multiple targeted nanoparticles simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with different targeting ligands were created, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectrometry was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood and resected tumor samples.

  20. Health and Sleep Problems in Cornelia de Lange Syndrome: A Case Control Study

    ERIC Educational Resources Information Center

    Hall, S. S.; Arron, K.; Sloneem, J.; Oliver, C.

    2008-01-01

    Background: Self-injury, sleep problems and health problems are commonly reported in Cornelia de Lange Syndrome (CdLS) but there are no comparisons with appropriately matched participants. Examination of the relationship between these areas, with comparison to a control group, is warranted. Method: 54 individuals with CdLS were compared with 46 participants with…

  1. Predicting bone strength with ultrasonic guided waves

    PubMed Central

    Bochud, Nicolas; Vallet, Quentin; Minonzio, Jean-Gabriel; Laugier, Pascal

    2017-01-01

    Recent bone quantitative ultrasound approaches exploit the multimode waveguide response of long bones for assessing properties such as cortical thickness and stiffness. Clinical applications remain challenging, however, as the impact of soft tissue on guided-wave characteristics is not yet fully understood. In particular, it must be clarified whether soft tissue must be incorporated in the waveguide models needed to infer reliable cortical bone properties. We hypothesize that an inverse procedure using a free plate model can be applied to retrieve the thickness and stiffness of cortical bone from experimental data. This approach is first validated on a series of laboratory-controlled measurements performed on assemblies of bone- and soft-tissue-mimicking phantoms, and then on in vivo measurements. The accuracy of the estimates is evaluated by comparison with reference values. To further support our hypothesis, these estimates are subsequently inserted into a bilayer model to test its accuracy. Our results show that the free plate model allows reliable waveguide properties to be retrieved despite the presence of soft tissue. They also suggest that the more sophisticated bilayer model, although more precise at predicting experimental data in the forward problem, could prove hard to manage when solving the inverse problem. PMID:28256568

  2. Commercial vs professional UAVs for mapping

    NASA Astrophysics Data System (ADS)

    Nikolakopoulos, Konstantinos G.; Koukouvelas, Ioannis

    2017-09-01

    The continuous advancements in the technology behind Unmanned Aerial Vehicles (UAVs), together with the steady decrease in their cost and the availability of photogrammetric software, make UAVs an excellent tool for large-scale mapping. In addition, with the use of UAVs, the problems of increased cost, time consumption and terrain accessibility are significantly reduced. However, despite the growing number of UAV applications, there has been little quantitative assessment of UAV performance and of the quality of the derived products (orthophotos and Digital Surface Models). Here, we present results from field experiments designed to evaluate the accuracy of photogrammetrically derived digital surface models (DSMs) developed from imagery acquired with onboard digital cameras. We also compare high-resolution and moderate-resolution imagery for large-scale geomorphic mapping. The data analyzed in this study come from a small commercial UAV and a professional UAV. The test area was mapped over the same photogrammetric grid by the two UAVs. 3D models, DSMs and orthophotos were created using dedicated software. These products were compared with in situ survey measurements and the results are presented in this paper.
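    A minimal sketch (with hypothetical checkpoint values) of the vertical-accuracy comparison of a DSM against in situ survey measurements:

    ```python
    import numpy as np

    def dsm_errors(dsm_heights, survey_heights):
        """Mean error (bias) and RMSE of DSM elevations at surveyed checkpoints."""
        d = np.asarray(dsm_heights, float) - np.asarray(survey_heights, float)
        return d.mean(), np.sqrt(np.mean(d ** 2))

    bias, rmse = dsm_errors([101.32, 98.75, 105.10], [101.25, 98.80, 105.02])
    print(f"bias = {bias:+.3f} m, RMSE = {rmse:.3f} m")
    ```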

  3. [Method of fused sample preparation after nitrify-determination of primary and minor elements in manganese ore by X-ray fluorescence spectrometry].

    PubMed

    Song, Yi; Guo, Fen; Gu, Song-hai

    2007-02-01

    Eight components, i.e. Mn, SiO2, Fe, P, Al2O3, CaO, MgO and S, in manganese ore were determined by X-ray fluorescence spectrometry. Because manganese ore samples release a lot of air bubbles during fusion, which affects the accuracy and reproducibility of the determination, nitric acid was added to the sample to destroy organic matter before fusion with the mixed flux at 1000 degrees C. This method solved the problem of flux splashing during fusion, caused by the air bubbles released as organic matter volatilized; eliminated particle-size and mineralogical effects; and solved the problem of sulfur volatilization during fusion. Experiments for the selection of the sample preparation conditions, i.e. fusion flux, fusion time and volume of HNO3, were carried out. Matrix absorption and enhancement effects were corrected by variable theoretical alpha coefficients to expand the range of determination. Moreover, precision and accuracy experiments were performed. In comparison with the chemical analysis method, the quantitative analytical results for each component are satisfactory. The method has proven rapid, precise and simple.
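
    The "variable theoretical alpha coefficient" correction mentioned above is commonly written in a Lachance-Traill form; a generic statement of that model (not the paper's actual coefficients) is

      C_i = R_i \left( 1 + \sum_{j \neq i} \alpha_{ij} C_j \right)

    where C_i is the mass fraction of analyte i, R_i its measured relative intensity, and \alpha_{ij} the theoretically computed influence coefficient describing the matrix effect (absorption or enhancement) of element j on analyte i.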

  4. A Comparison of the Effects of LOGO Use and Teacher-Directed Problem-Solving Instruction on the Problem-Solving Skills, Achievement, and Attitudes of Low, Average, and High Achieving Junior High School Learners.

    ERIC Educational Resources Information Center

    Dalton, David W.

    This comparison of the effects of LOGO use with the use of teacher-directed problem-solving instruction, and with conventional mathematics instruction, focused on the problem-solving ability, basic skills achievement, and attitudes of junior high school learners. Students (N=97) in five seventh grade mathematics classes were systematically…

  5. Comparison of Quantitative Antifungal Testing Methods for Textile Fabrics.

    PubMed

    Imoto, Yasuo; Seino, Satoshi; Nakagawa, Takashi; Yamamoto, Takao A

    2017-01-01

    Quantitative antifungal testing methods for textile fabrics under growth-supportive conditions were studied. Fungal growth activities on unfinished textile fabrics and textile fabrics modified with Ag nanoparticles were investigated using the colony counting method and the luminescence method. Morphological changes of the fungi during incubation were investigated by microscopic observation. Comparison of the results indicated that the fungal growth activity values obtained with the colony counting method depended on the morphological state of the fungi on textile fabrics, whereas those obtained with the luminescence method did not. Our findings indicated that unique characteristics of each testing method must be taken into account for the proper evaluation of antifungal activity.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thrall, Brian D.; Minard, Kevin R.; Teeguarden, Justin G.

    A Cooperative Research and Development Agreement (CRADA) was sponsored by Battelle Memorial Institute (Battelle, Columbus) to initiate a collaborative research program across multiple Department of Energy (DOE) National Laboratories aimed at developing a suite of new capabilities for predictive toxicology. Predicting the potential toxicity of emerging classes of engineered nanomaterials was chosen as one of two focusing problems for this program. PNNL’s focus toward this broader goal was to refine and apply experimental and computational tools needed to provide quantitative understanding of nanoparticle dosimetry for in vitro cell culture systems, which is necessary for comparative risk estimates for different nanomaterials or biological systems. Research conducted using lung epithelial and macrophage cell models successfully adapted magnetic particle detection and fluorescent microscopy technologies to quantify uptake of various forms of engineered nanoparticles, and provided experimental constraints and test datasets for benchmark comparison against results obtained using an in vitro computational dosimetry model, termed the ISSD model. The experimental and computational approaches developed were used to demonstrate how cell dosimetry is applied to aid in interpretation of genomic studies of nanoparticle-mediated biological responses in model cell culture systems. The combined experimental and theoretical approach provides a highly quantitative framework for evaluating relationships between biocompatibility of nanoparticles and their physical form in a controlled manner.
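
    The dosimetry question at the heart of this work is driven by how fast particles settle and diffuse through culture media. The sketch below is a back-of-the-envelope version of that physics (Stokes settling plus Stokes-Einstein diffusion); rigorous tools such as the ISSD model referenced above solve the full sedimentation-diffusion problem instead, and all particle and media values here are illustrative assumptions.

      # Crude estimate of the fraction of suspended nanoparticles that can
      # reach the cell layer at the bottom of a media column by time t.
      import math

      KB = 1.380649e-23          # Boltzmann constant, J/K
      T = 310.0                  # K (37 C)
      MU = 0.00089               # media viscosity, Pa*s (approx. water)
      RHO_F = 1000.0             # media density, kg/m^3
      G = 9.81

      def delivered_fraction(radius_m, rho_particle, media_height_m, t_s):
          """Drift-plus-diffusion transport distance relative to media depth."""
          v_sed = 2.0 * radius_m**2 * (rho_particle - RHO_F) * G / (9.0 * MU)
          d_diff = KB * T / (6.0 * math.pi * MU * radius_m)
          reach = v_sed * t_s + math.sqrt(2.0 * d_diff * t_s)
          return min(1.0, reach / media_height_m)

      # 50 nm silica vs 1 um iron oxide, 24 h exposure, 3 mm media column
      print(delivered_fraction(25e-9, 2200.0, 3e-3, 86400))
      print(delivered_fraction(500e-9, 5200.0, 3e-3, 86400))

    Even this crude estimate reproduces the key point motivating in vitro dosimetry: delivered dose depends strongly on particle size and density, so nominal media concentration is a poor dose metric.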

  7. Hybrid, experimental and computational, investigation of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1996-07-01

    Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.

  8. Comparison of DIGE and post-stained gel electrophoresis with both traditional and SameSpots analysis for quantitative proteomics.

    PubMed

    Karp, Natasha A; Feret, Renata; Rubtsov, Denis V; Lilley, Kathryn S

    2008-03-01

    2-DE is an important tool in quantitative proteomics. Here, we compare the deep purple (DP) system with DIGE using both a traditional and the SameSpots approach to gel analysis. Missing values in the traditional approach were found to be a significant issue for both systems. SameSpots attempts to address the missing value problem. SameSpots was found to increase the proportion of low volume data for DP but not for DIGE. For all the analysis methods applied in this study, the assumptions of parametric tests were met. Analysis of the same images gave significantly lower noise with SameSpots (over traditional) for DP, but no difference for DIGE. We propose that SameSpots gave lower noise with DP due to the stabilisation of the spot area by the common spot outline, but this was not seen with DIGE due to the co-detection process which stabilises the area selected. For studies where measurement of small abundance changes is required, a cost-benefit analysis highlights that DIGE was significantly cheaper regardless of the analysis methods. For studies analysing large changes, DP with SameSpots could be an effective alternative to DIGE but this will be dependent on the biological noise of the system under investigation.

  9. Comparative evaluation of performance measures for shading correction in time-lapse fluorescence microscopy.

    PubMed

    Liu, L; Kan, A; Leckie, C; Hodgkin, P D

    2017-04-01

    Time-lapse fluorescence microscopy is a valuable technology in cell biology, but it suffers from the inherent problem of intensity inhomogeneity due to uneven illumination or camera nonlinearity, known as shading artefacts. This will lead to inaccurate estimates of single-cell features such as average and total intensity. Numerous shading correction methods have been proposed to remove this effect. In order to compare the performance of different methods, many quantitative performance measures have been developed. However, there is little discussion about which performance measure should be generally applied for evaluation on real data, where the ground truth is absent. In this paper, the state-of-the-art shading correction methods and performance evaluation methods are reviewed. We implement 10 popular shading correction methods on two artificial datasets and four real ones. In order to make an objective comparison between those methods, we employ a number of quantitative performance measures. Extensive validation demonstrates that the coefficient of joint variation (CJV) is the most applicable measure in time-lapse fluorescence images. Based on this measure, we have proposed a novel shading correction method that performs better compared to well-established methods for a range of real data tested. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
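
    For readers wanting to reproduce the headline measure, one commonly used form of the coefficient of joint variation compares the pooled spread of two intensity classes to their separation; lower values indicate better shading correction. A minimal sketch with placeholder pixel populations (the paper's exact formulation may differ in detail):

      # CJV between two intensity classes, before and after correction.
      import numpy as np

      def cjv(values_a, values_b):
          """Coefficient of joint variation between two intensity classes."""
          return ((np.std(values_a) + np.std(values_b))
                  / abs(np.mean(values_a) - np.mean(values_b)))

      rng = np.random.default_rng(1)
      fg_before = rng.normal(120, 25, 5000)   # foreground pixels, uncorrected
      bg_before = rng.normal(40, 18, 5000)    # background pixels, uncorrected
      fg_after = rng.normal(120, 12, 5000)    # after shading correction
      bg_after = rng.normal(40, 8, 5000)

      print("CJV before: %.3f  after: %.3f" % (cjv(fg_before, bg_before),
                                               cjv(fg_after, bg_after)))

    A practical attraction of this measure, consistent with the paper's conclusion, is that it needs no ground-truth illumination field, only a rough foreground/background labelling.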

  10. Interactive radiographic image retrieval system.

    PubMed

    Kundu, Malay Kumar; Chowdhury, Manish; Das, Sudeb

    2017-02-01

    Content-based medical image retrieval (CBMIR) systems enable fast diagnosis through quantitative assessment of the visual information and have been an active research topic over the past few decades. Most state-of-the-art CBMIR systems suffer from various problems: they are computationally expensive owing to high-dimensional feature vectors and complex classifier/clustering schemes, and they fail to properly handle the "semantic gap" and the high intra-class versus inter-class variability of medical image databases (such as radiographic image databases). This yields an exigent demand for developing a highly effective and computationally efficient retrieval system. We propose a novel interactive two-stage CBMIR system for a diverse collection of medical radiographic images. Initially, Pulse Coupled Neural Network based shape features are used to find the most probable (similar) image classes using a novel "similarity positional score" mechanism. This is followed by retrieval using Non-subsampled Contourlet Transform based texture features, considering only the images of the pre-identified classes. Maximal information compression index is used for unsupervised feature selection to achieve better results. To reduce the semantic gap problem, the proposed system uses a novel fuzzy index based relevance feedback mechanism that incorporates the subjectivity of human perception in an analytic manner. Extensive experiments were carried out to evaluate the effectiveness of the proposed CBMIR system on a subset of the Image Retrieval in Medical Applications (IRMA)-2009 database consisting of 10,902 labeled radiographic images of 57 different modalities. We obtained an overall average precision of around 98% after only 2-3 iterations of the relevance feedback mechanism. We assessed the results by comparison with some of the state-of-the-art CBMIR systems for radiographic images. Unlike most existing CBMIR systems, in the proposed two-stage hierarchical framework the main emphasis is on constructing an efficient and compact feature vector representation, search-space reduction and handling the "semantic gap" problem effectively, without compromising retrieval performance. Experimental results and comparisons show that the proposed system performs efficiently in the radiographic medical image retrieval field. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Assessment of myocardial viability: comparison of echocardiography versus cardiac magnetic resonance imaging in the current era.

    PubMed

    Tomlinson, David R; Becher, Harald; Selvanayagam, Joseph B

    2008-06-01

    Detecting viable myocardium, whether hibernating or stunned, is of clinical significance in patients with coronary artery disease and left ventricular dysfunction. Echocardiographic assessments of myocardial thickening and endocardial excursion during dobutamine infusion provide a highly specific marker for myocardial viability, but with relatively lower sensitivity. The additional modalities of myocardial contrast echocardiography and tissue Doppler have recently been proposed to provide further quantitative measures for myocardial viability assessment. Cardiac magnetic resonance (CMR) has become popular for the assessment of myocardial viability as it can assess cardiac function, volumes, myocardial scar, and perfusion with high spatial resolution. Both 'delayed enhancement' CMR and dobutamine stress CMR have important roles in the assessment of patients with ischaemic cardiomyopathy. This article reviews the recent advances in both echocardiography and CMR for the clinical assessment of myocardial viability. It attempts to provide a pragmatic approach toward the patient-specific assessment of this important clinical problem.

  12. Control design for robust stability in linear regulators: Application to aerospace flight control

    NASA Technical Reports Server (NTRS)

    Yedavalli, R. K.

    1986-01-01

    Time domain stability robustness analysis and design for linear multivariable uncertain systems with bounded uncertainties is the central theme of the research. After reviewing the recently developed upper bounds on the linear, elemental (structured), time-varying perturbation of an asymptotically stable linear time-invariant regulator, it is shown that it is possible to further improve these bounds by employing state transformations. Then, introducing a quantitative measure called the stability robustness index, a state feedback control design algorithm is presented for the general linear regulator problem and then specialized to the case of modal systems as well as matched systems. The extension of the algorithm to stochastic systems with a Kalman filter as the state estimator is presented. Finally, an algorithm for robust dynamic compensator design is presented using a Parameter Optimization (PO) procedure. Applications in aircraft control and flexible structure control are presented along with a comparison with other existing methods.
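
    The flavor of the perturbation bounds reviewed above can be illustrated with the classical unstructured Lyapunov bound (in the spirit of Patel-Toda-type results, not the refined structured index of this work): with A'P + PA = -2I, the perturbed system A + E remains stable whenever the largest singular value of E is below 1/sigma_max(P). A minimal sketch:

      # Unstructured stability robustness bound from a Lyapunov equation.
      import numpy as np
      from scipy.linalg import solve_continuous_lyapunov

      A = np.array([[-2.0, 1.0],
                    [0.0, -3.0]])      # asymptotically stable nominal matrix

      # Solve A^T P + P A = -2 I for the symmetric positive definite P.
      P = solve_continuous_lyapunov(A.T, -2.0 * np.eye(2))

      mu = 1.0 / np.linalg.norm(P, 2)  # 2-norm = largest singular value
      print("perturbations with ||E||_2 < %.4f preserve stability" % mu)

    The paper's observation that state transformations can improve such bounds corresponds to evaluating this quantity after a similarity transformation of A and keeping the best result.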

  13. Nodal portraits of quantum billiards: Domains, lines, and statistics

    NASA Astrophysics Data System (ADS)

    Jain, Sudhir Ranjan; Samajdar, Rhine

    2017-10-01

    This is a comprehensive review of the nodal domains and lines of quantum billiards, emphasizing a quantitative comparison of theoretical findings to experiments. The nodal statistics are shown to distinguish not only between regular and chaotic classical dynamics but also between different geometric shapes of the billiard system itself. How a random superposition of plane waves can model chaotic eigenfunctions is discussed and the connections of the complex morphology of the nodal lines thereof to percolation theory and Schramm-Loewner evolution are highlighted. Various approaches to counting the nodal domains—using trace formulas, graph theory, and difference equations—are also illustrated with examples. The nodal patterns addressed pertain to waves on vibrating plates and membranes, acoustic and electromagnetic modes, wave functions of a "particle in a box" as well as to percolating clusters, and domains in ferromagnets, thus underlining the diversity and far-reaching implications of the problem.
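
    The random-wave model mentioned above is easy to make concrete: superpose many plane waves of fixed wavenumber with random directions and phases, then count connected sign regions as nodal domains. A minimal sketch under those assumptions (grid size and wavenumber are arbitrary choices):

      # Berry-style random superposition of plane waves; nodal domains are
      # counted as connected regions of constant sign.
      import numpy as np
      from scipy.ndimage import label

      rng = np.random.default_rng(7)
      k, n_waves, n_grid = 40.0, 100, 512
      x = np.linspace(0.0, 1.0, n_grid)
      X, Y = np.meshgrid(x, x)

      u = np.zeros_like(X)
      for theta, phase in zip(rng.uniform(0, 2 * np.pi, n_waves),
                              rng.uniform(0, 2 * np.pi, n_waves)):
          u += np.cos(k * (X * np.cos(theta) + Y * np.sin(theta)) + phase)

      _, n_pos = label(u > 0)   # connected regions where u > 0
      _, n_neg = label(u < 0)   # connected regions where u < 0
      print("nodal domains in the unit square:", n_pos + n_neg)

    Counting positive and negative clusters of such fields is exactly the bridge to percolation theory that the review highlights.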

  14. A Novel Image Retrieval Based on Visual Words Integration of SIFT and SURF

    PubMed Central

    Ali, Nouman; Bajwa, Khalid Bashir; Sablatnig, Robert; Chatzichristofis, Savvas A.; Iqbal, Zeshan; Rashid, Muhammad; Habib, Hafiz Adnan

    2016-01-01

    With the recent evolution of technology, the number of image archives has increased exponentially. In Content-Based Image Retrieval (CBIR), high-level visual information is represented in the form of low-level features. The semantic gap between the low-level features and the high-level image concepts is an open research problem. In this paper, we present a novel visual words integration of the Scale Invariant Feature Transform (SIFT) and Speeded-Up Robust Features (SURF). The two local feature representations are selected for image retrieval because SIFT is more robust to changes in scale and rotation, while SURF is robust to changes in illumination. The visual words integration of SIFT and SURF adds the robustness of both features to image retrieval. The qualitative and quantitative comparisons conducted on the Corel-1000, Corel-1500, Corel-2000, Oliva and Torralba and Ground Truth image benchmarks demonstrate the effectiveness of the proposed visual words integration. PMID:27315101
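
    The visual-words representation underlying this integration follows the generic bag-of-visual-words pipeline: extract local descriptors, quantize them against a learned vocabulary, and describe each image by a word histogram. The sketch below uses SIFT only (SURF requires an OpenCV build with nonfree modules) and placeholder file names; it is not the paper's exact integration scheme.

      # Generic bag-of-visual-words signature from SIFT descriptors.
      import cv2
      import numpy as np
      from sklearn.cluster import KMeans

      paths = ["img001.jpg", "img002.jpg", "img003.jpg"]  # hypothetical images
      sift = cv2.SIFT_create()

      per_image = []
      for p in paths:
          gray = cv2.imread(p, cv2.IMREAD_GRAYSCALE)
          _, desc = sift.detectAndCompute(gray, None)
          per_image.append(desc)

      vocab = KMeans(n_clusters=50, n_init=10, random_state=0)
      vocab.fit(np.vstack(per_image))          # learn the visual vocabulary

      def bovw_histogram(desc, vocab):
          words = vocab.predict(desc)          # nearest visual word per descriptor
          hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
          return hist / hist.sum()             # normalized word histogram

      hists = [bovw_histogram(d, vocab) for d in per_image]
      print(hists[0].shape)                    # one 50-bin signature per image

    The paper's contribution sits on top of this pipeline: building the vocabulary from both SIFT and SURF descriptors so the combined histogram inherits the robustness of each.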

  15. What does it mean to be an exemplary science teacher?

    NASA Astrophysics Data System (ADS)

    Tobin, Kenneth; Fraser, Barry J.

    In order to provide a refreshing alternative to the majority of research reports, which malign science education and highlight its major problems and shortcomings, a series of case studies of exemplary practice was initiated to provide a focus on the successful and positive facets of schooling. The major data-collection approach was qualitative and involved 13 researchers in hundreds of hours of intensive classroom observation involving 20 exemplary teachers and a comparison group of nonexemplary teachers. A distinctive feature of the methodology was that the qualitative information was complemented by quantitative information obtained from the administration of questionnaires assessing student perceptions of classroom psychosocial environment. The major trends were that exemplary science teachers (1) used management strategies that facilitated sustained student engagement, (2) used strategies designed to increase student understanding of science, (3) utilized strategies that encouraged students to participate in learning activities, and (4) maintained a favorable classroom learning environment.

  16. Global spectral graph wavelet signature for surface analysis of carpal bones

    NASA Astrophysics Data System (ADS)

    Masoumi, Majid; Rezaei, Mahsa; Ben Hamza, A.

    2018-02-01

    Quantitative shape comparison is a fundamental problem in computer vision, geometry processing and medical imaging. In this paper, we present a spectral graph wavelet approach for shape analysis of the carpal bones of the human wrist. We employ spectral graph wavelets to represent the cortical surface of a carpal bone via the spectral geometric analysis of the Laplace-Beltrami operator in the discrete domain. We propose a global spectral graph wavelet (GSGW) descriptor that is isometry invariant, efficient to compute, and combines the advantages of both low-pass and band-pass filters. We perform experiments on shapes of the carpal bones of ten women and ten men from a publicly available database of wrist bones. Using one-way multivariate analysis of variance (MANOVA) and permutation testing, we show through extensive experiments that the proposed GSGW framework gives much better performance compared to the global point signature embedding approach for comparing shapes of the carpal bones across populations.
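
    The spectral machinery behind such descriptors can be sketched compactly: eigendecompose a graph Laplacian standing in for the discrete Laplace-Beltrami operator and evaluate a band-pass kernel on the spectrum. The kernel and the toy four-vertex graph below are simplified assumptions, not the paper's GSGW construction.

      # Spectral graph wavelets on a toy mesh graph.
      import numpy as np

      def graph_laplacian(adjacency):
          deg = np.diag(adjacency.sum(axis=1))
          return deg - adjacency

      def sgw_coefficients(adjacency, scale):
          lap = graph_laplacian(adjacency)
          lam, u = np.linalg.eigh(lap)            # spectrum of the Laplacian
          g = scale * lam * np.exp(-scale * lam)  # simple band-pass kernel
          return u @ np.diag(g) @ u.T             # column n = wavelet at vertex n

      # Toy 4-vertex adjacency matrix; a real carpal-bone surface mesh
      # would have thousands of vertices.
      adj = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 1],
                      [1, 1, 0, 1],
                      [0, 1, 1, 0]], dtype=float)

      psi = sgw_coefficients(adj, scale=0.5)
      signature = np.linalg.norm(psi, axis=0)     # crude per-vertex energy
      print(signature)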

  17. Depth.

    PubMed

    Koenderink, Jan J; van Doorn, Andrea J; Wagemans, Johan

    2011-01-01

    Depth is the feeling of remoteness, or separateness, that accompanies awareness in human modalities like vision and audition. In specific cases depths can be graded on an ordinal scale, or even measured quantitatively on an interval scale. In the case of pictorial vision this is complicated by the fact that human observers often appear to apply mental transformations that involve depths in distinct visual directions. This implies that a comparison of empirically determined depths between observers involves pictorial space as an integral entity, whereas comparing pictorial depths as such is meaningless. We describe the formal structure of pictorial space purely in the phenomenological domain, without taking recourse to the theories of optics, which properly apply to physical space, a distinct ontological domain. We introduce a number of general ways to design and implement methods of geodesy in pictorial space, and discuss some basic problems associated with such measurements. We deal mainly with conceptual issues.

  18. India’s Distorted Sex Ratio: Dire Consequences for Girls

    PubMed Central

    Roberts, Lisa R.; Montgomery, Susanne B.

    2017-01-01

    Female gender discrimination related to cultural preference for males is a common global problem, especially in Asian countries. Numerous laws intended to prevent discrimination on the basis of gender have been passed in India, yet the distorted female-to-male sex ratio seems to show worsening tendencies. Using detailed, two-year longitudinal chart abstraction data about delivery records of a private mission hospital in rural India, we explored if hospital birth ratio data differed in comparison to regional data, and what demographic and contextual variables may have influenced these outcomes. Using quantitative chart abstraction and qualitative contextual data, study results showed the female-to-male ratio was lower than the reported state ratio at birth. In the context of India’s patriarchal structure, with its strong son preference, women are under tremendous pressure or coerced to access community-based, sex-selective identification and female fetus abortion. Nurses may be key to turning the tide. PMID:28286369

  19. Two-, three-, and four-poster jets in cross flow

    NASA Technical Reports Server (NTRS)

    Vukits, Thomas J.; Sullivan, John P.; Murthy, S. N. B.

    1993-01-01

    In connection with the problems of ingestion of hot exhaust gases by the engines of V/STOL and STOVL aircraft in ground effect, a series of studies has been undertaken. Ground-impinging two- and three-poster jets operating in the presence of cross flow were studied. The current paper is divided into two parts. The first part is a comparison of the low speed two-, three-, and four-poster jet cases with respect to the flowfield in the region of interaction between the forward flow and the jet flows. These include cases with mass-balanced inlet suction. An analysis of the inlet entry plane for the low speed two- and three-poster jet cases is also given. In the second part, high speed results for a two-jet configuration without inlet suction are given. The results are based on quantitative marker concentration distributions obtained by digitizing video images.

  20. Global spectral graph wavelet signature for surface analysis of carpal bones.

    PubMed

    Masoumi, Majid; Rezaei, Mahsa; Ben Hamza, A

    2018-02-05

    Quantitative shape comparison is a fundamental problem in computer vision, geometry processing and medical imaging. In this paper, we present a spectral graph wavelet approach for shape analysis of the carpal bones of the human wrist. We employ spectral graph wavelets to represent the cortical surface of a carpal bone via the spectral geometric analysis of the Laplace-Beltrami operator in the discrete domain. We propose a global spectral graph wavelet (GSGW) descriptor that is isometry invariant, efficient to compute, and combines the advantages of both low-pass and band-pass filters. We perform experiments on shapes of the carpal bones of ten women and ten men from a publicly available database of wrist bones. Using one-way multivariate analysis of variance (MANOVA) and permutation testing, we show through extensive experiments that the proposed GSGW framework gives much better performance compared to the global point signature embedding approach for comparing shapes of the carpal bones across populations.

  1. Application of statistical classification methods for predicting the acceptability of well-water quality

    NASA Astrophysics Data System (ADS)

    Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.

    2018-06-01

    The application of statistical classification methods is investigated—in comparison also to spatial interpolation methods—for predicting the acceptability of well-water quality in a situation where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, in particular, the aquifer is locally affected by saline water and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict if the chloride concentration in a water well will exceed the allowable concentration so that the water is unfit for the intended use. A statistical classification algorithm achieved the best predictive performances and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems concerning hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
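
    The formulation described above reduces to binary classification: label each well by whether chloride exceeds the allowable limit and learn the label from available covariates. A minimal sketch on synthetic data (the features, the threshold and the classifier choice are all assumptions, not the paper's setup):

      # Binary classification of well-water acceptability on synthetic data.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(3)
      n = 300
      X = np.column_stack([
          rng.uniform(0, 10, n),     # easting, km
          rng.uniform(0, 10, n),     # northing, km
          rng.uniform(5, 120, n),    # well depth, m
      ])
      # Synthetic rule: saline influence grows toward one corner and at depth.
      chloride = (40 + 30 * X[:, 0] - 2 * X[:, 1] + 0.8 * X[:, 2]
                  + rng.normal(0, 25, n))
      y = (chloride > 250).astype(int)   # 1 = unfit for the intended use

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                random_state=0)
      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      clf.fit(X_tr, y_tr)
      print("held-out accuracy: %.2f" % accuracy_score(y_te, clf.predict(X_te)))

    The appeal, as the abstract argues, is that such a classifier needs no analytical description of the aquifer, only labelled wells and covariates.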

  2. Telling stories and adding scores: Measuring resilience in young children affected by maternal HIV and AIDS.

    PubMed

    Ebersöhn, Liesel; Eloff, Irma; Finestone, Michelle; Grobler, Adri; Moen, Melanie

    2015-01-01

    "Telling stories and adding scores: Measuring resilience in young children affected by maternal HIV and AIDS", demonstrates how a concurrent mixed method design assisted cross-cultural comparison and ecological descriptions of resilience in young South African children, as well as validated alternative ways to measure resilience in young children. In a longitudinal randomised control trial, which investigated psychological resilience in mothers and children affected by HIV/AIDS, we combined a qualitative projective story-telling technique (Düss Fable) with quantitative data (Child Behaviour Checklist). The children mostly displayed adaptive resilience-related behaviours, although maladaptive behaviours were present. Participating children use internal (resolve/agency, positive future expectations, emotional intelligence) and external protective resources (material resources, positive institutions) to mediate adaptation. Children's maladaptive behaviours were exacerbated by internal (limited problem-solving skills, negative emotions) and external risk factors (chronic and cumulative adversity).

  3. Electronic entanglement in late transition metal oxides.

    PubMed

    Thunström, Patrik; Di Marco, Igor; Eriksson, Olle

    2012-11-02

    We present a study of the entanglement in the electronic structure of the late transition metal monoxides (MnO, FeO, CoO, and NiO) obtained by means of density-functional theory in the local density approximation combined with dynamical mean-field theory. The impurity problem is solved through exact diagonalization, which grants full access to the thermally mixed many-body ground state density operator. The quality of the electronic structure is affirmed through a direct comparison between the calculated electronic excitation spectrum and photoemission experiments. Our treatment allows for a quantitative investigation of the entanglement in the electronic structure. Two main sources of entanglement are explicitly resolved through the use of a fidelity-based geometrical entanglement measure, and additional information is gained from a complementary entropic entanglement measure. We show that the interplay of crystal field effects and Coulomb interaction causes the entanglement in CoO to take a particularly intricate form.

  4. Future development of IR thermovision weather satellite equipment

    NASA Technical Reports Server (NTRS)

    Listratov, A. V.

    1974-01-01

    The self-radiation of the surface being viewed is used for image synthesis in IR thermovision equipment. The installation of such equipment aboard weather satellites makes it possible to obtain cloud cover pictures of the earth's surface over a complete orbit, regardless of illumination conditions, and also provides quantitative information on the underlying surface temperature and cloud top height. Such equipment is used successfully aboard the Soviet satellites of the Meteor system, and experimentally on the American satellites of the Nimbus series. With regard to surface resolution, present-day IR weather satellite equipment is inferior to television equipment, due primarily to the comparatively low detectivity of the IR detectors used. While IR equipment has several fundamental advantages in comparison with conventional television equipment, the problem arises of determining the possibilities for future development of weather satellite IR thermovision equipment. Criteria are examined for evaluating the quality of IR thermovision equipment.

  5. [Syphilis. Current physiobiological data. I. The bacteriological problem].

    PubMed

    Collart, P; Poitevin, M

    Because Treponema pallidum cannot be cultured, the only way to study the biology of this organism and the physiopathology of the infection lies in research on experimental syphilis. After describing the different aspects of Treponema pallidum under light and electron microscopy, the authors review the modes of reproduction suggested by syphilologists, the recent attempts to culture the treponema, and the processes of elimination. They then examine the biological properties and the antigenic structure of T. pallidum as established by comparison with cultivable spirochetes. Finally, the authors show that both the TPI test and the FTA test are highly specific reactions; these tests indicate only that the patient has been in contact with the antigens of Treponema pallidum, and the quantitative tests cannot be considered as expressing the infectious potential.

  6. An experimental comparison of online object-tracking algorithms

    NASA Astrophysics Data System (ADS)

    Wang, Qing; Chen, Feng; Xu, Wenli; Yang, Ming-Hsuan

    2011-09-01

    This paper reviews and evaluates several state-of-the-art online object tracking algorithms. Notwithstanding decades of effort, object tracking remains a challenging problem due to factors such as illumination, pose, scale, deformation, motion blur, noise, and occlusion. To account for appearance change, most recent tracking algorithms focus on robust object representations and effective state prediction. In this paper, we analyze the components of each tracking method and identify their key roles in dealing with specific challenges, thereby shedding light on how to choose and design algorithms for different situations. We compare state-of-the-art online tracking methods, including the IVT, VRT, FragT, BoostT, SemiT, BeSemiT, L1T, MILT, VTD and TLD algorithms, on numerous challenging sequences, and evaluate them with different performance metrics. The qualitative and quantitative comparative results demonstrate the strengths and weaknesses of these algorithms.

  7. India's Distorted Sex Ratio: Dire Consequences for Girls.

    PubMed

    Roberts, Lisa R; Montgomery, Susanne B

    2016-01-01

    Female gender discrimination related to cultural preference for males is a common global problem, especially in Asian countries. Numerous laws intended to prevent discrimination on the basis of gender have been passed in India, yet the distorted female-to-male sex ratio seems to show worsening tendencies. Using detailed, two-year longitudinal chart abstraction data about delivery records of a private mission hospital in rural India, we explored if hospital birth ratio data differed in comparison to regional data, and what demographic and contextual variables may have influenced these outcomes. Using quantitative chart abstraction and qualitative contextual data, study results showed the female-to-male ratio was lower than the reported state ratio at birth. In the context of India's patriarchal structure, with its strong son preference, women are under tremendous pressure or coerced to access community-based, sex-selective identification and female fetus abortion. Nurses may be key to turning the tide.

  8. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    PubMed

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R^2), using R^2 as the primary metric of assay agreement. However, the use of R^2 alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
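
    The Bland-Altman computation advocated here is short enough to state directly: the bias is the mean of the paired differences, and the 95% limits of agreement are the bias plus or minus 1.96 standard deviations of the differences. A minimal sketch with placeholder paired measurements:

      # Bland-Altman bias and limits of agreement for paired assay values.
      import numpy as np

      ref = np.array([5.2, 12.1, 25.4, 33.0, 41.8, 50.3])   # validated method
      new = np.array([5.9, 11.4, 26.8, 34.1, 43.0, 49.1])   # assay in validation

      diff = new - ref
      bias = diff.mean()                       # constant error estimate
      loa = 1.96 * diff.std(ddof=1)            # 95% limits of agreement
      print("bias = %+.2f, limits of agreement = [%.2f, %.2f]"
            % (bias, bias - loa, bias + loa))

    A trend of the differences with the measurement magnitude, which this summary does not capture on its own, is what the Deming and simple linear regression analyses in the paper are used to quantify.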

  9. Comparison of different approaches to quantitative adenovirus detection in stool specimens of hematopoietic stem cell transplant recipients.

    PubMed

    Kosulin, K; Dworzak, S; Lawitschka, A; Matthes-Leodolter, S; Lion, T

    2016-12-01

    Adenoviruses almost invariably proliferate in the gastrointestinal tract prior to dissemination, and critical threshold concentrations in stool correlate with the risk of viremia. Monitoring of adenovirus loads in stool may therefore be important for timely initiation of treatment in order to prevent invasive infection. Comparison of a manual DNA extraction kit in combination with a validated in-house PCR assay with automated extraction on the NucliSENS-EasyMAG device coupled with the Adenovirus R-gene kit (bioMérieux) for quantitative adenovirus analysis in stool samples. Stool specimens spiked with adenovirus concentrations in a range from 10^2 to 10^11 copies/g and 32 adenovirus-positive clinical stool specimens from pediatric stem cell transplant recipients were tested along with appropriate negative controls. Quantitative analysis of viral load in adenovirus-positive stool specimens revealed a median difference of 0.5 logs (range 0.1-2.2) between the detection systems tested and a difference of 0.3 logs (range 0.0-1.7) when the comparison was restricted to the PCR assays only. Spiking experiments showed a detection limit of 10^2-10^3 adenovirus copies/g stool, revealing a somewhat higher sensitivity offered by the automated extraction. The dynamic range of accurate quantitative analysis by both systems investigated was between 10^3 and 10^8 virus copies/g. The differences in quantitative analysis of adenovirus copy numbers between the systems tested were primarily attributable to the DNA extraction method used, while the qPCR assays revealed a high level of concordance. Both systems showed adequate performance for detection and monitoring of adenoviral load in stool specimens. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Comparison of the scanned pages of the contractual documents

    NASA Astrophysics Data System (ADS)

    Andreeva, Elena; Arlazarov, Vladimir V.; Manzhikov, Temudzhin; Slavin, Oleg

    2018-04-01

    This paper formulates the problem of comparing the digitized pages of official documents. Such a problem arises when comparing two copies of a contract signed at different times by two parties, with a view to finding possible modifications introduced by one side. The problem is of practical significance in the banking sector, where contracts are concluded in paper form. A comparison method based on recognition is suggested, which consists in comparing two bags of words that are the recognition results of the master and test pages. The described experiments were conducted using the Tesseract OCR engine and a Siamese neural network. The advantages of the suggested method are the steady operation of the comparison algorithm and its high precision; one disadvantage is the dependence on the chosen OCR.
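
    The bag-of-words comparison described above can be sketched with standard tools: OCR both pages and diff the two word multisets. The sketch below invokes Tesseract through the pytesseract wrapper as an assumption and omits the paper's Siamese-network matching; file names are placeholders.

      # Bag-of-words diff between a master page and a test page.
      from collections import Counter

      import pytesseract
      from PIL import Image

      def page_bag_of_words(path):
          text = pytesseract.image_to_string(Image.open(path), lang="eng")
          return Counter(text.split())

      master = page_bag_of_words("master_page.png")   # hypothetical scans
      test = page_bag_of_words("test_page.png")

      added = test - master      # words present only in the test copy
      removed = master - test    # words missing from the test copy
      print("suspicious insertions:", dict(added))
      print("suspicious deletions:", dict(removed))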

  11. Spatiotemporal Characterization of a Fibrin Clot Using Quantitative Phase Imaging

    PubMed Central

    Gannavarpu, Rajshekhar; Bhaduri, Basanta; Tangella, Krishnarao; Popescu, Gabriel

    2014-01-01

    Studying the dynamics of fibrin clot formation and its morphology is an important problem in biology and has significant impact for several scientific and clinical applications. We present a label-free technique based on quantitative phase imaging to address this problem. Using quantitative phase information, we characterized fibrin polymerization in real-time and present a mathematical model describing the transition from liquid to gel state. By exploiting the inherent optical sectioning capability of our instrument, we measured the three-dimensional structure of the fibrin clot. From this data, we evaluated the fractal nature of the fibrin network and extracted the fractal dimension. Our non-invasive and speckle-free approach analyzes the clotting process without the need for external contrast agents. PMID:25386701
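
    The fractal-dimension extraction mentioned above is typically a box-counting estimate on a segmented network image. A minimal sketch under that assumption (the mask below is a random stand-in, not fibrin data, and the paper may use a different estimator):

      # Box-counting estimate of the fractal dimension of a binary mask.
      import numpy as np

      def box_counting_dimension(mask):
          """Estimate fractal dimension of a 2-D binary mask by box counting."""
          sizes = [2, 4, 8, 16, 32, 64]
          counts = []
          for s in sizes:
              h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
              blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
              counts.append(blocks.any(axis=(1, 3)).sum())  # occupied boxes
          # Slope of log N(s) vs log(1/s) gives the dimension.
          coeffs = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
          return coeffs[0]

      rng = np.random.default_rng(0)
      mask = rng.random((256, 256)) < 0.05   # stand-in for a segmented clot
      print("estimated fractal dimension: %.2f" % box_counting_dimension(mask))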

  12. Classical problems in computational aero-acoustics

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1996-01-01

    In view of the problems expected in the development of computational aeroacoustics (CAA), preliminary applications addressed classical problems whose known analytical solutions could be used to validate the numerical results. Such comparisons were used to overcome the numerical problems inherent in these calculations. Comparisons were made between the various numerical approaches to the problems, such as direct simulations, acoustic analogies and acoustic/viscous splitting techniques. The aim was to demonstrate the applicability of CAA as a tool in the same class as computational fluid dynamics. The scattering problems that occur are considered and simple sources are discussed.

  13. Focal Point Theory Models for Dissecting Dynamic Duality Problems of Microbial Infections

    PubMed Central

    Huang, S.-H.; Zhou, W.; Jong, A.

    2008-01-01

    Extending along the dynamic continuum from conflict to cooperation, microbial infections always involve symbiosis (Sym) and pathogenesis (Pat). There exists a dynamic Sym-Pat duality (DSPD) in microbial infection that is the most fundamental problem in infectomics. DSPD is encoded by the genomes of both the microbes and their hosts. Three focal point (FP) theory-based game models (pure cooperative, dilemma, and pure conflict) are proposed for resolving those problems. Our health is associated with the dynamic interactions of three microbial communities (nonpathogenic microbiota (NP) (Cooperation), conditional pathogens (CP) (Dilemma), and unconditional pathogens (UP) (Conflict)) with the hosts at different health statuses. Sym and Pat can be quantitated by measuring the symbiotic index (SI), which quantifies fitness for the symbiotic partnership, and the pathogenic index (PI), which quantifies damage to the symbiotic partnership. The symbiotic point (SP), which bears analogy to the FP, is a function of SI and PI. SP-converting and specific pathogen-targeting strategies can be used for the rational control of microbial infections. PMID:18350122

  14. Combining qualitative and quantitative spatial and temporal information in a hierarchical structure: Approximate reasoning for plan execution monitoring

    NASA Technical Reports Server (NTRS)

    Hoebel, Louis J.

    1993-01-01

    The problem of plan generation (PG) and the problem of plan execution monitoring (PEM), including updating, queries, and resource-bounded replanning, have different reasoning and representation requirements. PEM requires the integration of qualitative and quantitative information. PEM involves receiving data about the world in which a plan or agent is executing; the problem is to quickly determine the relevance of the data, the consistency of the data with respect to the expected effects, and whether execution should continue. Only the spatial and temporal aspects of the plan are assessed for relevance in this work. Current temporal reasoning systems are deficient in either computational efficiency or expressiveness. This work presents a hybrid qualitative and quantitative system that is fully expressive in its assertion language while offering certain computational efficiencies. To proceed, methods incorporating approximate reasoning using hierarchies, notions of locality, constraint expansion, and absolute parameters need to be used; these are shown to be well suited to the anytime nature of PEM.

  15. Investigation of various factors influencing Raman spectra interpretation with the use of likelihood ratio approach.

    PubMed

    Michalska, Aleksandra; Martyna, Agnieszka; Zadora, Grzegorz

    2018-01-01

    The main aim of this study was to verify whether selected analytical parameters affect solving the comparison problem of Raman spectra with the use of the likelihood ratio (LR) approach. Firstly, the LR methodologies developed for Raman spectra of blue automotive paints obtained with a 785 nm laser source (results published by the authors previously) were implemented for good-quality spectra recorded for these paints with a 514.5 nm laser source. For LR model construction two types of variables were used, i.e. areas under selected pigment bands and coefficients derived from the discrete wavelet transform (DWT) procedure. A few experiments were designed for the 785 nm and 514.5 nm Raman spectra databases after constructing well-performing LR models (low rates of false positive and false negative answers and acceptable results of the empirical cross entropy approach). In order to verify whether objective magnification, described by its numerical aperture, affects spectra interpretation, three objective magnifications, 20× (N.A. = 0.4), 50× (N.A. = 0.75) and 100× (N.A. = 0.85), within each of the applied laser sources (514.5 nm and 785 nm) were tested for a group of blue solid and metallic automotive paints having the same sets of pigments depending on the applied laser source. The findings obtained by two types of LR models indicate the importance of this parameter for solving the comparison problem of both solid and metallic automotive paints, regardless of the laser source used for measuring the Raman signal. Hence, the same objective magnification, preferably 50× (established based on the analysis of within- and between-sample variability and the F-factor value), should be used when focusing the laser on samples during Raman measurements. The influence of the parameters (laser power and time of irradiation) of one of the recommended fluorescence suppression techniques, namely photobleaching, was then investigated. Analysis performed on a group of solid automotive paint samples showed that, at an established laser power, the time of irradiation does not affect solving the comparison problem with the use of the LR test. Likewise, at an established time of irradiation, 5% or 10% laser power could be used interchangeably without changing conclusions within this problem. However, at an established time of irradiation, changes in laser power between control and recovered samples from 5% or 10% to 50% may cause erroneous conclusions. It was additionally proved that prolonged irradiation of paint does not quantitatively affect the pigment band areas revealed after such a pre-treatment. Copyright © 2017 Elsevier B.V. All rights reserved.
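
    The LR machinery used throughout this study can be illustrated in its simplest univariate normal form: compare the difference between control and recovered measurements under a same-source hypothesis (measurement error only) and a different-source hypothesis (additional between-sample spread). The variances and values below are illustrative assumptions; the paper's models, built on pigment band areas and DWT coefficients, are multivariate and more elaborate.

      # Toy univariate likelihood ratio for a forensic comparison problem.
      import numpy as np
      from scipy.stats import norm

      sigma_w = 0.05   # assumed within-sample (measurement) s.d. of the feature
      sigma_b = 0.60   # assumed between-sample s.d. across the paint population

      control = np.array([1.02, 1.08, 0.97])     # e.g. a normalized band area
      recovered = np.array([1.05, 1.10, 1.01])

      d = control.mean() - recovered.mean()
      var_same = sigma_w**2 * (1 / len(control) + 1 / len(recovered))
      var_diff = var_same + 2 * sigma_b**2       # extra spread if sources differ

      lr = norm.pdf(d, 0, np.sqrt(var_same)) / norm.pdf(d, 0, np.sqrt(var_diff))
      print("LR = %.1f (LR > 1 supports the same-source hypothesis)" % lr)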

  16. Examining Interactions between Problem Posing and Problem Solving with Prospective Primary Teachers: A Case of Using Fractions

    ERIC Educational Resources Information Center

    Xie, Jinxia; Masingila, Joanna O.

    2017-01-01

    Existing studies have quantitatively evidenced the relatedness between problem posing and problem solving, as well as the magnitude of this relationship. However, the nature and features of this relationship need further qualitative exploration. This paper focuses on exploring the interactions, i.e., mutual effects and supports, between problem…

  17. Evaluating Writing Programs: Paradigms, Problems, Possibilities.

    ERIC Educational Resources Information Center

    McLeod, Susan H.

    1992-01-01

    Describes two methodological approaches (qualitative and quantitative) that grow out of two different research examples. Suggests the problems these methods present. Discusses the ways in which an awareness of these problems can help teachers to understand how to work with researchers in designing useful evaluations of writing programs. (PRA)

  18. Phenomenographic Study of Students' Problem Solving Approaches in Physics

    ERIC Educational Resources Information Center

    Walsh, Laura N.; Howard, Robert G.; Bowe, Brian

    2007-01-01

    This paper describes ongoing research investigating student approaches to quantitative and qualitative problem solving in physics. This empirical study was conducted using a phenomenographic approach to analyze data from individual semistructured problem solving interviews with 22 introductory college physics students. The main result of the study…

  19. Analytical progress in the theory of vesicles under linear flow

    NASA Astrophysics Data System (ADS)

    Farutin, Alexander; Biben, Thierry; Misbah, Chaouqi

    2010-06-01

    Vesicles have become a popular model for the study of red blood cells. This is a free boundary problem that is rather difficult to handle theoretically, and quantitative computational approaches also constitute a challenge; with numerical studies, it is not easy to scan the whole parameter space within a reasonable time. Quantitative analytical results are therefore an essential advance that provides deeper understanding of observed features and can accompany and possibly guide further numerical development. In this paper, shape evolution equations for a vesicle in a shear flow are derived analytically to cubic order in the deformation of the vesicle relative to a spherical shape (previous theories were quadratic). The phase diagram distinguishing the parameter regions where different types of motion (tank treading, tumbling, and vacillating breathing) are manifested is presented. This theory reveals unsuspected features: including higher-order terms and harmonics (even if they are not directly excited by the shear flow) is necessary, however close the shape is to a sphere. Not only does this theory cure a rather large quantitative discrepancy between previous theories and recent experiments and numerical studies, it also reveals a new phenomenon: the vacillating-breathing mode band in parameter space, which was believed to saturate after a moderate shear rate, exhibits a striking widening beyond a critical shear rate. The widening results from excitation of the fourth-order harmonic. The obtained phase diagram is in remarkably good agreement with recent three-dimensional numerical simulations based on the boundary integral formulation. Comparison of our results with experiments is made systematically.

  20. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. Their method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
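
    The two ICHE steps as described can be approximated with stock tools: shift each image's intensity-histogram centroid to a common target, then apply contrast-limited adaptive histogram equalization. The sketch below uses scikit-image's standard CLAHE rather than the authors' modified version, and the target centroid and clip limit are assumptions.

      # Approximate ICHE: intensity centering followed by stock CLAHE.
      import numpy as np
      from skimage import exposure, img_as_float

      def iche_normalize(image, target_centroid=0.5, clip_limit=0.01):
          img = img_as_float(image)
          hist, bin_centers = exposure.histogram(img)
          centroid = np.sum(hist * bin_centers) / np.sum(hist)
          shifted = np.clip(img + (target_centroid - centroid), 0.0, 1.0)
          return exposure.equalize_adapthist(shifted, clip_limit=clip_limit)

      rng = np.random.default_rng(0)
      # Two stand-in grayscale "slides" with different brightness batches.
      batch = [np.clip(rng.normal(m, 0.1, (128, 128)), 0, 1) for m in (0.3, 0.6)]
      normalized = [iche_normalize(im) for im in batch]
      print([im.mean().round(3) for im in normalized])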

  1. Determination of vitamins D2 and D3 in selected food matrices by online high-performance liquid chromatography-gas chromatography-mass spectrometry (HPLC-GC-MS).

    PubMed

    Nestola, Marco; Thellmann, Andrea

    2015-01-01

    An online normal-phase liquid chromatography-gas chromatography-mass spectrometry (HPLC-GC-MS) method was developed for the determination of vitamins D2 and D3 in selected food matrices. Transfer of the sample from HPLC to GC was realized by large-volume on-column injection; detection was performed with a time-of-flight mass spectrometer (TOF-MS). Typical GC problems in the determination of vitamin D, such as sample degradation or sensitivity issues previously reported in the literature, were not observed. Determination of the total vitamin D content was done by quantitation of its pyro isomer based on an isotopically labelled internal standard (ISTD). Extracted ion traces of the analyte and the ISTD showed cross-contribution, but selection of appropriate quantifier ions avoided non-linearity of the calibration curve inside the chosen calibration range. Absolute limits of detection (LOD) and quantitation (LOQ) for vitamins D2 and D3 were calculated as approximately 50 and 150 pg, respectively. Repeatability with internal standard correction was below 2%. Good agreement was found between the quantitative results of an established high-performance liquid chromatography with UV detection (HPLC-UV) method and HPLC-GC-MS. Sterol-enriched margarine was subjected to HPLC-GC-MS and HPLC-MS/MS for comparison, because HPLC-UV showed strong matrix interferences; HPLC-GC-MS produced comparable results with less manual sample cleanup. In summary, the online hyphenation of HPLC and GC allowed a minimization of manual sample preparation with an increase in sample throughput.

  2. A Reexamination of Active and Passive Tumor Targeting by Using Rod-Shaped Gold Nanocrystals and Covalently Conjugated Peptide Ligands

    PubMed Central

    Huang, Xiaohua; Peng, Xianghong; Wang, Yiqing; Wang, Yuxiang; Shin, Dong M.; El-Sayed, Mostafa A.; Nie, Shuming

    2010-01-01

    The targeted delivery of nanoparticles to solid tumors is one of the most important and challenging problems in cancer nanomedicine, but the detailed delivery mechanisms and design principles are still not well understood. Here we report quantitative tumor uptake studies for a class of elongated gold nanocrystals (called nanorods) that are covalently conjugated to tumor-targeting peptides. A major advantage in using gold as a “tracer” is that the accumulated gold in tumors and other organs can be quantitatively determined by elemental mass spectrometry (gold is not a natural element found in animals). Thus, colloidal gold nanorods are stabilized with a layer of polyethylene glycols (PEGs), and are conjugated to three different ligands: (i) a single-chain variable fragment (ScFv) peptide that recognizes the epidermal growth factor receptor (EGFR); (ii) an amino terminal fragment (ATF) peptide that recognizes the urokinase plasminogen activator receptor (uPAR); and (iii) a cyclic RGD peptide that recognizes the αvβ3 integrin receptor. Quantitative pharmacokinetic and biodistribution data show that these targeting ligands only marginally improve the total gold accumulation in xenograft tumor models in comparison with nontargeted controls, but their use could greatly alter the intracellular and extracellular nanoparticle distributions. When the gold nanorods are administered via intravenous injection, we also find that active molecular targeting of the tumor microenvironments (e.g., fibroblasts, macrophages, and vasculatures) does not significantly influence the tumor nanoparticle uptake. These results suggest that for photothermal cancer therapy, the preferred route of gold nanorod administration is intra-tumoral injection instead of intravenous injection. PMID:20863096

  3. Inverse methods for 3D quantitative optical coherence elasticity imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Dong, Li; Wijesinghe, Philip; Hugenberg, Nicholas; Sampson, David D.; Munro, Peter R. T.; Kennedy, Brendan F.; Oberai, Assad A.

    2017-02-01

    In elastography, quantitative elastograms are desirable as they are system and operator independent. Such quantification also facilitates more accurate diagnosis, longitudinal studies and studies performed across multiple sites. In optical elastography (compression, surface-wave or shear-wave), quantitative elastograms are typically obtained by assuming some form of homogeneity. This simplifies data processing at the expense of smearing sharp transitions in elastic properties, and/or introducing artifacts in these regions. Recently, we proposed an inverse problem-based approach to compression OCE that does not assume homogeneity, and overcomes the drawbacks described above. In this approach, the difference between the measured and predicted displacement field is minimized by seeking the optimal distribution of elastic parameters. The predicted displacements and recovered elastic parameters together satisfy the constraint of the equations of equilibrium. This approach, which has been applied in two spatial dimensions assuming plane strain, has yielded accurate material property distributions. Here, we describe the extension of the inverse problem approach to three dimensions. In addition to the advantage of visualizing elastic properties in three dimensions, this extension eliminates the plane strain assumption and is therefore closer to the true physical state. It does, however, incur greater computational costs. We address this challenge through a modified adjoint problem, spatially adaptive grid resolution, and three-dimensional decomposition techniques. Through these techniques the inverse problem is solved on a typical desktop machine within a wall clock time of 20 hours. We present the details of the method and quantitative elasticity images of phantoms and tissue samples.

  4. Devising tissue ingrowth metrics: a contribution to the computational characterization of engineered soft tissue healing.

    PubMed

    Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin

    2018-03-14

    The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently questions the value of histopathologic methods in the evaluation of biological changes. To date, the available tools of evaluation are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple and inexpensive quantitative digital approach that provides key metrics for structural and compositional characterization of the regenerated tissues. For example, the metrics provide the tissue ingrowth rate (TIR), which integrates two separate indicators: the cell ingrowth rate (CIR) and the total collagen content (TCC), as featured in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality) and the optical anisotropy property of the collagen (maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell specific) or Picrosirius Red F3BA (collagen specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that, quantitatively, the healing tissue would be significantly delayed and of poor quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats with the assumption that modulation of the ingrown tissue would be quantitatively detectable in comparison to the 3D scaffold-healthy group. After 21 days of implantation, both hypotheses were verified by use of this novel computerized approach. When the two methods were run in parallel, the quantitative results revealed fine details and differences not detected by the semi-quantitative assessment, demonstrating the importance of quantitative analysis in the performance evaluation of soft tissue healing. This automated and supervised method reduced operator dependency and proved to be simple, sensitive, cost-effective and time-effective. It supports objective therapeutic comparisons and helps to elucidate regeneration and the dynamics of a functional tissue.
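
    A minimal sketch of how the stated metrics could be computed from two segmented serial sections, assuming hypothetical binary masks rather than real stained images:

      import numpy as np

      # Stand-in binary masks for the two stained serial sections (hypothetical).
      rng = np.random.default_rng(1)
      roi = np.ones((512, 512), dtype=bool)            # implant region of interest
      cell_mask = rng.random((512, 512)) < 0.12        # Feulgen & Rossenbeck positives
      collagen_mask = rng.random((512, 512)) < 0.25    # Picrosirius Red positives
      collagen_mask &= ~cell_mask                      # keep the two classes disjoint

      cir = 100 * (cell_mask & roi).sum() / roi.sum()      # cell ingrowth rate [%]
      tcc = 100 * (collagen_mask & roi).sum() / roi.sum()  # total collagen content [%]
      tir = cir + tcc                                      # tissue ingrowth rate [%]
      print(f"CIR = {cir:.1f}%, TCC = {tcc:.1f}%, TIR = {tir:.1f}%")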

  5. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  6. Development of quantitative risk acceptance criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griesmeyer, J. M.; Okrent, D.

    Some of the major considerations for effective management of risk are discussed, with particular emphasis on risks due to nuclear power plant operations. Although there are impacts associated with the rest of the fuel cycle, they are not addressed here. Several previously published proposals for quantitative risk criteria are reviewed. They range from a simple acceptance criterion on individual risk of death to a quantitative risk management framework. The final section discusses some of the problems in the establishment of a framework for the quantitative management of risk.

  7. Congenital hypothyroidism: diagnostic scintigraphic evaluation of an organification defect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cone, L.; Oates, E.; Vazquez, R.

    1988-06-01

    Quantitative Tc-99m pertechnetate thyroid imaging was performed on a hypothyroid neonate. The image revealed markedly increased trapping in an enlarged, bilobed, eutopic gland. A perchlorate washout test using quantitative imaging with I-123 confirmed an organification problem.

  8. Comparison of selected analytical techniques for protein sizing, quantitation and molecular weight determination.

    PubMed

    Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C

    2004-09-30

    Protein analysis techniques are developing rapidly due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare the advantages and limitations of each technique with respect to different application areas, analysis time, and protein sizing and quantitation performance.

  9. Comparison of some evolutionary algorithms for optimization of the path synthesis problem

    NASA Astrophysics Data System (ADS)

    Grabski, Jakub Krzysztof; Walczak, Tomasz; Buśkiewicz, Jacek; Michałowska, Martyna

    2018-01-01

    The paper presents a comparison of the results obtained in mechanism synthesis by means of selected evolutionary algorithms. The optimization problem considered in the paper as an example is the dimensional synthesis of a path-generating four-bar mechanism. In order to solve this problem, three different artificial intelligence algorithms are employed in this study.
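
    As an illustration of such a synthesis problem, the sketch below poses the dimensional synthesis of a four-bar coupler path (with prescribed timing) as an optimization solved by differential evolution; the target path, bounds and assembly-branch choice are all assumptions, not the authors' setup:

      import numpy as np
      from scipy.optimize import differential_evolution

      # Target coupler path (assumed: an ellipse), paired with prescribed
      # crank angles, i.e. synthesis "with prescribed timing".
      thetas = np.linspace(0.0, 2 * np.pi, 24, endpoint=False)
      target = np.c_[2 + np.cos(thetas), 2 + 0.5 * np.sin(thetas)]

      def coupler_path(x):
          # x = crank a, coupler b, rocker c, ground d, point offset p, angle gamma.
          a, b, c, d, p, gamma = x
          A = np.c_[a * np.cos(thetas), a * np.sin(thetas)]   # crank pin positions
          r = np.array([d, 0.0]) - A                          # crank pin -> fixed pivot O4
          L = np.linalg.norm(r, axis=1)
          cos_arg = (b**2 + L**2 - c**2) / (2 * b * L)        # law of cosines at the pin
          if np.any(np.abs(cos_arg) > 1):                     # linkage fails to assemble
              return None
          beta = np.arctan2(r[:, 1], r[:, 0]) + np.arccos(cos_arg)  # coupler direction
          return A + p * np.c_[np.cos(beta + gamma), np.sin(beta + gamma)]

      def path_error(x):
          P = coupler_path(x)
          return 1e6 if P is None else np.sum((P - target) ** 2)

      bounds = [(0.1, 5.0)] * 5 + [(-np.pi, np.pi)]
      best = differential_evolution(path_error, bounds, seed=0, maxiter=200)
      print(best.fun, np.round(best.x, 2))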

  10. Visually estimated ejection fraction by two dimensional and triplane echocardiography is closely correlated with quantitative ejection fraction by real-time three dimensional echocardiography

    PubMed Central

    Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Åke; Winter, Reidar

    2009-01-01

    Background Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, visual assessment (eyeballing) remains more feasible. To date, only sparse data are available comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE) using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Methods Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and TP mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. Results There was an excellent correlation between eyeballing EF by 2D and TP vs 3DE (r = 0.91 and 0.95, respectively) without any significant bias (-0.5 ± 3.7% and -0.2 ± 2.9%, respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TP and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2D and 8.4% for eyeballing TP. Conclusion Visual estimation of LVEF both using 2D and TP by an experienced reader correlates well with quantitative EF determined by RT3DE. There was an apparent trend towards smaller variability using TP in comparison to 2D; this was, however, not statistically significant. PMID:19706183
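
    The agreement statistics reported above (correlation, bias ± SD) can be reproduced on paired readings with a few lines of Python; the sketch below uses simulated, hypothetical EF pairs, not the study data:

      import numpy as np

      # Hypothetical paired EF readings (%): visual estimates vs the RT3DE reference.
      rng = np.random.default_rng(2)
      ef_3d = rng.uniform(25, 65, 30)                  # quantitative 3D reference
      ef_eye = ef_3d + rng.normal(-0.5, 3.7, 30)       # eyeballed values (bias + scatter)

      r = np.corrcoef(ef_eye, ef_3d)[0, 1]             # Pearson correlation
      diff = ef_eye - ef_3d
      bias, sd = diff.mean(), diff.std(ddof=1)         # Bland-Altman-style bias and SD
      print(f"r = {r:.2f}, bias = {bias:.1f} +/- {sd:.1f} %")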

  11. Measuring Aircraft Capability for Military and Political Analysis

    DTIC Science & Technology

    1976-03-01

    challenged in 1932 when a panel of distinguished British scientists discussed the feasibility of quantitatively estimating sensory events... Quantitative Analysis of Social Problems, E.R. Tufte (ed.), p. 407, Addison-Wesley, 1970. ..."artificial" boundaries are imposed on the data. Less...of arms transfers in various parts of the world as well. Quantitative research (and hence measurement) contributes to theoretical development by

  12. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  13. Comparison of Quantitative and Qualitative Research Traditions: Epistemological, Theoretical, and Methodological Differences

    ERIC Educational Resources Information Center

    Yilmaz, Kaya

    2013-01-01

    There has been much discussion about quantitative and qualitative approaches to research in different disciplines. In the behavioural and social sciences, these two paradigms are compared to reveal their relative strengths and weaknesses. But the debate about both traditions has commonly taken place in academic books. It is hard to find an article…

  14. Multi-laboratory comparison of quantitative PCR assays for detection and quantification of Fusarium virguliforme from soybean roots and soil

    USDA-ARS?s Scientific Manuscript database

    Accurate identification and quantification of Fusarium virguliforme, the cause of sudden death syndrome (SDS) in soybean, within root tissue and soil are important tasks. Several quantitative PCR (qPCR) assays have been developed but there are no reports comparing their use in sensitive and specific...

  15. COMPARISON OF POPULATIONS OF MOULD SPECIES IN HOMES IN THE UK AND US USING MOLD-SPECIFIC QUANTITATIVE PCR (MSQPCR)

    EPA Science Inventory

    The goal of this research was to compare the populations of 81 mold species in homes in USA and UK using mould specific quantitative polymerase chain reaction (MSQPCR) technology. Dust samples were obtained from randomly selected homes in Great Britain (n=11). The mould populat...

  16. COMPARISON OF ENTEROCOCCUS MEASUREMENTS IN FRESHWATER AT TWO RECREATIONAL BEACHES BY QUANTITATIVE POLYMERASE CHAIN REACTION AND MEMBRANE FILTER CULTURE ANALYSIS

    EPA Science Inventory

    Cell densities of the fecal pollution indicator genus, Enterococcus, were determined by a rapid (2-3 hr) quantitative PCR (QPCR) analysis based method in 100 ml water samples collected from recreational beaches on Lake Michigan and Lake Erie during the summer of 2003. Enumeration...

  17. Examining the Inclusion of Quantitative Research in a Meta-Ethnographic Review

    ERIC Educational Resources Information Center

    Booker, Rhae-Ann Richardson

    2010-01-01

    This study explored how one might extend meta-ethnography to quantitative research for the advancement of interpretive review methods. Using the same population of 139 studies on racial-ethnic matching as data, my investigation entailed an extended meta-ethnography (EME) and comparison of its results to a published meta-analysis (PMA). Adhering to…

  18. Analyses of Disruption of Cerebral White Matter Integrity in Schizophrenia with MR Diffusion Tensor Fiber Tracking Method

    NASA Astrophysics Data System (ADS)

    Yamamoto, Utako; Kobayashi, Tetsuo; Kito, Shinsuke; Koga, Yoshihiko

    We have analyzed cerebral white matter using magnetic resonance diffusion tensor imaging (MR-DTI) to measure the diffusion anisotropy of water molecules. The goal of this study is the quantitative evaluation of schizophrenia. Diffusion tensor images are acquired for patients with schizophrenia and healthy comparison subjects, group-matched for age, sex, and handedness. Fiber tracking is performed on the superior longitudinal fasciculus for the comparison between the patient and comparison groups. We have analyzed and compared the cross-sectional area on the starting coronal plane and the mean and standard deviation of the fractional anisotropy and the apparent diffusion coefficient along fibers in the right and left hemispheres. In the right hemisphere, the cross-sectional areas in the patient group are significantly smaller than those in the comparison group. Furthermore, in the comparison group, the cross-sectional areas in the right hemisphere are significantly larger than those in the left hemisphere, whereas there is no significant difference in the patient group. These results suggest that we may quantitatively evaluate the disruption in white matter integrity in schizophrenic patients by comparing the cross-sectional area of the superior longitudinal fasciculus in the right and left hemispheres.
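
    For reference, the two diffusion metrics compared along the tracked fibers follow directly from the eigenvalues of the voxel-wise diffusion tensor; a minimal sketch (with a hypothetical tensor) is:

      import numpy as np

      def fa_and_adc(D):
          # D: 3x3 symmetric diffusion tensor for one voxel.
          lam = np.linalg.eigvalsh(D)
          adc = lam.mean()                             # apparent diffusion coefficient (MD)
          fa = np.sqrt(1.5 * np.sum((lam - adc) ** 2) / np.sum(lam ** 2))
          return fa, adc

      # Hypothetical tensor for a coherent white-matter voxel (units: mm^2/s).
      D = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
      print(fa_and_adc(D))                             # high FA, plausible ADC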

  19. A Backscatter-Lidar Forward-Operator

    NASA Astrophysics Data System (ADS)

    Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Vogel, Bernhard; Mattis, Ina; Flentje, Harald; Förstner, Jochen; Potthast, Roland

    2015-04-01

    We have developed a forward-operator which is capable of calculating virtual lidar profiles from atmospheric state simulations. The operator allows us to compare lidar measurements and model simulations based on the same measurement parameter: the lidar backscatter profile. This method simplifies qualitative comparisons and also makes quantitative comparisons possible, including statistical error quantification. Implemented into an aerosol-capable model system, the operator will act as a component to assimilate backscatter-lidar measurements. As many weather services already maintain networks of backscatter lidars, such data are already acquired in an operational manner. To estimate and quantify errors due to missing or uncertain aerosol information, we performed sensitivity studies on several scattering parameters, such as the aerosol size and the real and imaginary parts of the complex index of refraction. Furthermore, quantitative and statistical comparisons between measurements and virtual measurements are shown in this study, i.e., the backscatter-lidar forward-operator is applied to model output.
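
    The core of such a forward operator is mapping modeled extinction and backscatter profiles to the attenuated backscatter a lidar would record; a minimal sketch with assumed profiles and lidar ratios (not the operator's actual parameterization) is:

      import numpy as np

      # Assumed model profiles on a height grid z [m].
      z = np.linspace(0.0, 10e3, 500)
      beta_aer = 2e-6 * np.exp(-z / 2000.0)            # aerosol backscatter [1/(m sr)]
      alpha_aer = 50.0 * beta_aer                      # assumed aerosol lidar ratio: 50 sr
      beta_mol = 1.5e-6 * np.exp(-z / 8000.0)          # molecular backscatter
      alpha_mol = (8.0 * np.pi / 3.0) * beta_mol       # molecular lidar ratio: 8*pi/3 sr

      dz = z[1] - z[0]
      tau = np.cumsum(alpha_aer + alpha_mol) * dz      # one-way optical depth (approx.)
      beta_att = (beta_aer + beta_mol) * np.exp(-2.0 * tau)   # attenuated backscatter
      print(beta_att[[0, 100, 400]])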

  20. Comparison of Dynamic Contrast Enhanced MRI and Quantitative SPECT in a Rat Glioma Model

    PubMed Central

    Skinner, Jack T.; Yankeelov, Thomas E.; Peterson, Todd E.; Does, Mark D.

    2012-01-01

    Pharmacokinetic modeling of dynamic contrast enhanced (DCE)-MRI data provides measures of the extracellular volume fraction (ve) and the volume transfer constant (Ktrans) in a given tissue. These parameter estimates may be biased, however, by confounding issues such as contrast agent and tissue water dynamics, or assumptions of vascularization and perfusion made by the commonly used model. In contrast to MRI, radiotracer imaging with SPECT is insensitive to water dynamics. A quantitative dual-isotope SPECT technique was developed to obtain an estimate of ve in a rat glioma model for comparison to the corresponding estimates obtained using DCE-MRI with a vascular input function (VIF) and reference region model (RR). Both DCE-MRI methods produced consistently larger estimates of ve in comparison to the SPECT estimates, and several experimental sources were postulated to contribute to these differences. PMID:22991315
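
    For context, DCE-MRI estimates of Ktrans and ve typically come from fitting a compartment model such as the standard Tofts model; a minimal sketch of its forward evaluation (assumed arterial input function and parameter values, not the study's model or data) is:

      import numpy as np

      # Time axis [s] and an assumed gamma-variate arterial input function [mM].
      t = np.linspace(0.0, 300.0, 301)
      dt = t[1] - t[0]
      Cp = 5.0 * (t / 60.0) * np.exp(-t / 60.0)

      def tofts(Ktrans, ve):
          # Standard Tofts model: Ct(t) = Ktrans * (Cp conv exp(-(Ktrans/ve) t)).
          kep = Ktrans / ve                            # efflux rate constant [1/s]
          return Ktrans * dt * np.convolve(Cp, np.exp(-kep * t))[: t.size]

      Ct = tofts(Ktrans=0.005, ve=0.3)                 # hypothetical parameter values
      print(Ct.max())                                  # peak tissue concentration [mM]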

  1. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy; Kim, Youngkwang; Conway, Claire; Conway, Darrel J.

    2017-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.

  2. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy (Compiler); Kim, Youngkwang; Conway, Claire (Compiler); Conway, Darrel

    2017-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.

  3. A Meta-Analysis of the Taped Problems Intervention

    ERIC Educational Resources Information Center

    Kleinert, Whitney L.; Codding, Robin S.; Minami, Takuya; Gould, Kaitlin

    2018-01-01

    Taped problems is an intervention strategy for addressing mathematics fluency that has been evaluated in multiple single-case design studies. Although its efficacy has been supported in individual studies, no comprehensive quantitative synthesis has been conducted on taped problems. The purpose of this study was to synthesize the literature that…

  4. Use of a Computer Simulation To Develop Mental Simulations for Understanding Relative Motion Concepts.

    ERIC Educational Resources Information Center

    Monaghan, James M.; Clement, John

    1999-01-01

    Presents evidence for students' qualitative and quantitative difficulties with apparently simple one-dimensional relative-motion problems, students' spontaneous visualization of relative-motion problems, the visualizations facilitating solution of these problems, and students' memories of the online computer simulation used as a framework for…

  5. Problem Orientation in Physical Geography Teaching.

    ERIC Educational Resources Information Center

    Church, Michael

    1988-01-01

    States that the introduction of real, quantitative problems in classroom and field teaching improves scientific rigor and leads more directly to applied studies. Examines the use of problems in an introductory hydrology course, presenting teaching objectives and the full course structure to illustrate their integration with other teaching modes.…

  6. Maternal Ratings of Attention Problems in ADHD: Evidence for the Existence of a Continuum

    ERIC Educational Resources Information Center

    Lubke, Gitta H.; Hudziak, James J.; Derks, Eske M.; van Bijsterveldt, Toos C. E. M.; Boomsma, Dorret I.

    2009-01-01

    Objective: To investigate whether items assessing attention problems provide evidence of quantitative differences or categorically distinct subtypes of attention problems (APs) and to investigate the relation of empirically derived latent classes to "DSM-IV" diagnoses of subtypes of attention-deficit/hyperactivity disorder (ADHD), for…

  7. Support vector regression and artificial neural network models for stability indicating analysis of mebeverine hydrochloride and sulpiride mixtures in pharmaceutical preparation: A comparative study

    NASA Astrophysics Data System (ADS)

    Naguib, Ibrahim A.; Darwish, Hany W.

    2012-02-01

    A comparison between support vector regression (SVR) and artificial neural network (ANN) multivariate regression methods is established, showing the underlying algorithm for each and indicating their inherent advantages and limitations. In this paper we compare SVR to ANN with and without a variable selection procedure (genetic algorithm, GA). To project the comparison in a sensible way, the methods are used for the stability-indicating quantitative analysis of mixtures of mebeverine hydrochloride and sulpiride in binary mixtures as a case study, in the presence of their reported impurities and degradation products (summing up to 6 components) in raw materials and pharmaceutical dosage form, via handling the UV spectral data. For proper analysis, a 6-factor, 5-level experimental design was established, resulting in a training set of 25 mixtures containing different ratios of the interfering species. An independent test set consisting of 5 mixtures was used to validate the prediction ability of the suggested models. The proposed methods (linear SVR (without GA) and linear GA-ANN) were successfully applied to the analysis of pharmaceutical tablets containing mebeverine hydrochloride and sulpiride mixtures. The results manifest the problem of nonlinearity and how models like SVR and ANN can handle it. The methods indicate the ability of the mentioned multivariate calibration models to deconvolute the highly overlapped UV spectra of the 6-component mixtures, while using inexpensive and easy-to-handle instruments like the UV spectrophotometer.
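
    A minimal sketch of the SVR side of such a calibration, using simulated overlapped UV spectra and scikit-learn (the component bands, design sizes and hyperparameters are assumptions, not the paper's):

      import numpy as np
      from sklearn.multioutput import MultiOutputRegressor
      from sklearn.svm import SVR

      # Simulated analogue of the task: UV spectra of mixtures built as a linear
      # mix of six overlapping component bands plus noise (all shapes assumed).
      rng = np.random.default_rng(3)
      wl = np.linspace(200, 400, 120)                  # wavelength grid [nm]
      pure = np.array([np.exp(-((wl - c) / 25.0) ** 2)
                       for c in (230, 250, 270, 290, 310, 330)])

      C_train = rng.uniform(0, 1, (25, 6))             # 25 training mixtures
      X_train = C_train @ pure + rng.normal(0, 0.01, (25, wl.size))

      model = MultiOutputRegressor(SVR(kernel="linear", C=10.0, epsilon=0.01))
      model.fit(X_train, C_train)                      # one SVR per component

      C_test = rng.uniform(0, 1, (5, 6))               # independent 5-mixture test set
      X_test = C_test @ pure + rng.normal(0, 0.01, (5, wl.size))
      print(np.abs(model.predict(X_test) - C_test).mean())   # mean absolute error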

  8. International drug price comparisons: quality assessment.

    PubMed

    Machado, Márcio; O'Brodovich, Ryan; Krahn, Murray; Einarson, Thomas R

    2011-01-01

    To quantitatively summarize results (i.e., prices and affordability) reported from international drug price comparison studies and assess their methodological quality. A systematic search of the most relevant databases (Medline, Embase, International Pharmaceutical Abstracts (IPA), and Scopus) from their inception to May 2009 was conducted to identify original research comparing international drug prices. International drug price information was extracted and recorded from accepted papers. Affordability was reported as drug prices adjusted for income. Study quality was assessed using six criteria: use of similar countries, use of a representative sample of drugs, selection of specific types of prices, identification of drug packaging, different weights on price indices, and the type of currency conversion used. Of the 1,828 studies identified, 21 were included. Only one study adequately addressed all quality issues. A large variation in study quality was observed due to the many methods used to conduct the drug price comparisons, such as different indices, economic parameters, price types, baskets of drugs, and more. Thus, the quality of published studies was considered poor. Results varied across studies, but generally, higher income countries had higher drug prices. However, after adjusting drug prices for affordability, higher income countries had more affordable prices than lower income countries. Differences between drug prices and affordability in different countries were found. Low income countries reported less affordability of drugs, leaving room for potential problems with drug access, and consequently, a negative impact on health. The quality of the literature on this topic needs improvement.

  9. How to normalize Twitter counts? A first attempt based on journals in the Twitter Index.

    PubMed

    Bornmann, Lutz; Haunschild, Robin

    One possible way of measuring the broad impact of research (societal impact) quantitatively is the use of alternative metrics (altmetrics). An important source of altmetrics is Twitter, a popular microblogging service. In bibliometrics, it is standard to normalize citations for cross-field comparisons. This study deals with the normalization of Twitter counts (TC). The problem with Twitter data is that many papers receive zero tweets or only one tweet. In order to restrict the impact analysis to only those journals producing a considerable Twitter impact, we defined the Twitter Index (TI), containing journals in which at least 80% of the papers have at least 1 tweet each. For all papers in each TI journal, we calculated normalized Twitter percentiles (TP), which range from 0 (no impact) to 100 (highest impact). Thus, the highest impact value is assigned to the paper with the most tweets compared to the other papers in the journal. TP are proposed to be used for cross-field comparisons. We studied the field-independency of TP in comparison with TC. The results indicate that TP can validly be used particularly in biomedical and health sciences, life and earth sciences, mathematics and computer science, as well as physical sciences and engineering. In a first application of TP, we calculated percentiles for countries. The results show that Denmark, Finland, and Norway are the countries with the most tweeted papers (measured by TP).
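
    The journal-wise percentile normalization can be sketched in a few lines of pandas; the toy data and the use of mean ranks for ties are assumptions, not the authors' exact procedure:

      import pandas as pd

      # Toy data: tweet counts for papers in two journals of the Twitter Index.
      df = pd.DataFrame({
          "journal": ["A", "A", "A", "A", "B", "B", "B", "B"],
          "tweets":  [0, 1, 4, 12, 2, 2, 5, 40],
      })

      # Percentile of each paper within its own journal (ties get the mean rank),
      # scaled to 0-100 so values are comparable across journals and fields.
      df["TP"] = df.groupby("journal")["tweets"].rank(pct=True) * 100
      print(df)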

  10. Reconstruction of three-dimensional porous media using a single thin section

    NASA Astrophysics Data System (ADS)

    Tahmasebi, Pejman; Sahimi, Muhammad

    2012-06-01

    The purpose of any reconstruction method is to generate realizations of two- or multiphase disordered media that honor limited data for them, with the hope that the realizations provide accurate predictions for those properties of the media for which there are no data available, or their measurement is difficult. An important example of such stochastic systems is porous media for which the reconstruction technique must accurately represent their morphology—the connectivity and geometry—as well as their flow and transport properties. Many of the current reconstruction methods are based on low-order statistical descriptors that fail to provide accurate information on the properties of heterogeneous porous media. On the other hand, due to the availability of high resolution two-dimensional (2D) images of thin sections of a porous medium, and at the same time, the high cost, computational difficulties, and even unavailability of complete 3D images, the problem of reconstructing porous media from 2D thin sections remains an outstanding unsolved problem. We present a method based on multiple-point statistics in which a single 2D thin section of a porous medium, represented by a digitized image, is used to reconstruct the 3D porous medium to which the thin section belongs. The method utilizes a 1D raster path for inspecting the digitized image, and combines it with a cross-correlation function, a grid splitting technique for deciding the resolution of the computational grid used in the reconstruction, and the Shannon entropy as a measure of the heterogeneity of the porous sample, in order to reconstruct the 3D medium. It also utilizes an adaptive technique for identifying the locations and optimal number of hard (quantitative) data points that one can use in the reconstruction process. The method is tested on high resolution images for Berea sandstone and a carbonate rock sample, and the results are compared with the data. To make the comparison quantitative, two sets of statistical tests consisting of the autocorrelation function, histogram matching of the local coordination numbers, the pore and throat size distributions, multiple-points connectivity, and single- and two-phase flow permeabilities are used. The comparison indicates that the proposed method reproduces the long-range connectivity of the porous media, with the computed properties being in good agreement with the data for both porous samples. The computational efficiency of the method is also demonstrated.
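
    One of the statistical tests mentioned, the two-point autocorrelation function, can be computed efficiently via FFT; a minimal sketch on a random binary image (assuming periodic boundaries, not the authors' implementation) is:

      import numpy as np

      # Two-point autocorrelation of a binary (pore/solid) image via FFT,
      # computed with periodic boundaries on a random toy medium.
      img = (np.random.default_rng(7).random((256, 256)) < 0.3).astype(float)
      f = np.fft.fft2(img - img.mean())                # remove mean -> autocovariance
      corr = np.fft.ifft2(f * np.conj(f)).real / img.size
      corr /= corr[0, 0]                               # normalize so correlation(0) = 1
      print(np.round(corr[0, :5], 3))                  # decay along one axis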

  11. “We wouldn’t of made friends if we didn’t come to Football United”: the impacts of a football program on young people’s peer, prosocial and cross-cultural relationships

    PubMed Central

    2013-01-01

    Background Sport as a mechanism to build relationships across cultural boundaries and to build positive interactions among young people has often been promoted in the literature. However, robust evaluation of sport-for-development program impacts is limited. This study reports on an impact evaluation of a sport-for-development program in Australia, Football United®. Methods A quasi-experimental mixed methods design was employed using treatment partitioning (different groups compared had different levels of exposure to Football United). A survey was undertaken with 142 young people (average age of 14.7 years with 22.5% of the sample comprising girls) in four Australian schools. These schools included two Football United and two Comparison schools where Football United was not operating. The survey instrument was composed of previously validated measures, including emotional symptoms, peer problems and relationships, prosocial behaviour, other-group orientation, feelings of social inclusion and belonging and resilience. Face to face interviews were undertaken with a purposeful sample (n = 79) of those who completed the survey. The participants in the interviews were selected to provide a diversity of age, gender and cultural backgrounds. Results Young people who participated in Football United showed significantly higher levels of other-group orientation than a Comparison Group (who did not participate in the program). The Football United boys had significantly lower scores on the peer problem scale and significantly higher scores on the prosocial scale than boys in the Comparison Group. Treatment partitioning analyses showed positive, linear associations between other-group orientation and total participation in the Football United program. A lower score on peer problems and higher scores on prosocial behaviour in the survey were associated with regularity of attendance at Football United. These quantitative results are supported by qualitative data analysed from interviews. Conclusions The study provides evidence of the effects of Football United on key domains of peer and prosocial relationships for boys and other-group orientation for young people in the program sites studied. The effects on girls, and the impacts of the program on the broader school environment and at the community level, require further investigation. PMID:23621898

  12. Analysis of mathematical literacy ability based on self-efficacy in model eliciting activities using metaphorical thinking approach

    NASA Astrophysics Data System (ADS)

    Setiani, C.; Waluya, S. B.; Wardono

    2018-03-01

    The purposes of this research are: (1) to identify learning quality in Model Eliciting Activities (MEAs) using a Metaphorical Thinking (MT) approach, both qualitatively and quantitatively; (2) to analyze the mathematical literacy of students based on Self-Efficacy (SE). This research uses a mixed-methods concurrent embedded design with qualitative research as the primary method. The quantitative component used a quasi-experimental, non-equivalent control group design. The population is grade VIII students of SMP Negeri 3 Semarang, Indonesia. Quantitative data are examined by conducting a mean completeness test, a standard completeness test, a mean difference test and a proportional difference test. Qualitative data are analyzed descriptively. The result of this research shows that MEAs learning using the MT approach meets good criteria both quantitatively and qualitatively. Students with low self-efficacy can identify problems, but they lack the ability to devise problem-solving strategies for mathematical literacy questions. Students with medium self-efficacy can identify the information provided in problems, but they have difficulty using mathematical symbols to construct representations. Students with high self-efficacy are excellent at representing problems as mathematical models and figures using appropriate symbols and tools, so they can easily devise strategies to solve mathematical literacy questions.

  13. Problem-based learning on quantitative analytical chemistry course

    NASA Astrophysics Data System (ADS)

    Fitri, Noor

    2017-12-01

    This research applies the problem-based learning method to a quantitative analytical chemistry course, the so-called "Analytical Chemistry II" course, especially as related to essential oil analysis. The learning outcomes of this course include aspects of understanding of lectures, the skills of applying course materials, and the ability to identify, formulate and solve chemical analysis problems. The role of study groups is quite important in improving students' learning ability and in completing independent tasks and group tasks. Thus, students are not only aware of the basic concepts of Analytical Chemistry II, but are also able to understand and apply analytical concepts that have been studied to solve given analytical chemistry problems, and have the attitude and ability to work together to solve these problems. Based on the learning outcomes, it can be concluded that the problem-based learning method in the Analytical Chemistry II course has been proven to improve students' knowledge, skills, abilities and attitudes. Students are not only skilled at solving problems in analytical chemistry, especially in essential oil analysis in accordance with the local genius of the Chemistry Department, Universitas Islam Indonesia, but are also skilled at working with computer programs and able to understand materials and problems in English.

  14. [Evaluation on methodological problems in reports concerning quantitative analysis of syndrome differentiation of diabetes mellitus].

    PubMed

    Chen, Bi-Cang; Wu, Qiu-Ying; Xiang, Cheng-Bin; Zhou, Yi; Guo, Ling-Xiang; Zhao, Neng-Jiang; Yang, Shu-Yu

    2006-01-01

    To evaluate the quality of reports published in China in the past 10 years about quantitative analysis of syndrome differentiation for diabetes mellitus (DM), in order to explore the methodological problems in these reports and find possible solutions. The main medical literature databases in China were searched. Thirty-one articles were included and evaluated by the principles of clinical epidemiology. There were many mistakes and deficiencies in these articles, concerning clinical trial design, diagnostic criteria for DM, standards of syndrome differentiation of DM, case inclusion and exclusion criteria, sample size estimation, data comparability and statistical methods. It is necessary and important to improve the quality of reports concerning quantitative analysis of syndrome differentiation of DM in light of the principles of clinical epidemiology.

  15. Quantitative comparison between full-spectrum and filter-based imaging in hyperspectral fluorescence microscopy

    PubMed Central

    GAO, L.; HAGEN, N.; TKACZYK, T.S.

    2012-01-01

    Summary We implement a filterless illumination scheme on a hyperspectral fluorescence microscope to achieve full-range spectral imaging. The microscope employs polarisation filtering, spatial filtering and spectral unmixing filtering to replace the role of traditional filters. Quantitative comparisons between full-spectrum and filter-based microscopy are provided in the context of signal dynamic range and the accuracy of measured fluorophores’ emission spectra. To show potential applications, a five-colour cell immunofluorescence imaging experiment is theoretically simulated. Simulation results indicate that the use of the proposed full-spectrum imaging technique may yield a threefold improvement in signal dynamic range compared to what can be achieved in filter-based imaging. PMID:22356127
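
    The spectral-unmixing step that replaces emission filters can be sketched as a per-pixel non-negative least-squares fit against known emission spectra; the toy endmember spectra below are assumptions, not the paper's measured fluorophores:

      import numpy as np
      from scipy.optimize import nnls

      # Assumed emission spectra of five fluorophores on a wavelength grid [nm].
      wl = np.linspace(500, 700, 80)
      endmembers = np.array([np.exp(-((wl - c) / 15.0) ** 2)
                             for c in (530, 570, 610, 650, 690)]).T   # 80 x 5

      true_abund = np.array([0.5, 0.0, 0.3, 0.2, 0.0])
      pixel = endmembers @ true_abund + np.random.default_rng(4).normal(0, 0.005, wl.size)

      abund, resid = nnls(endmembers, pixel)           # per-pixel unmixing
      print(np.round(abund, 2))                        # recovered abundances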

  16. dCLIP: a computational approach for comparative CLIP-seq analyses

    PubMed Central

    2014-01-01

    Although comparison of RNA-protein interaction profiles across different conditions has become increasingly important to understanding the function of RNA-binding proteins (RBPs), few computational approaches have been developed for quantitative comparison of CLIP-seq datasets. Here, we present an easy-to-use command line tool, dCLIP, for quantitative CLIP-seq comparative analysis. The two-stage method implemented in dCLIP, including a modified MA normalization method and a hidden Markov model, is shown to be able to effectively identify differential binding regions of RBPs in four CLIP-seq datasets, generated by HITS-CLIP, iCLIP and PAR-CLIP protocols. dCLIP is freely available at http://qbrc.swmed.edu/software/. PMID:24398258
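
    A toy version of the MA-style normalization step (the first of dCLIP's two stages; the binning, cutoff and data below are hypothetical simplifications of the published method) is:

      import numpy as np

      # Two CLIP-seq samples summarized as read counts over common bins; sample 2
      # comes from a deeper library, so raw counts are not directly comparable.
      rng = np.random.default_rng(6)
      x = rng.poisson(50, 1000) + 1                    # condition 1 bin counts
      y = rng.poisson(80, 1000) + 1                    # condition 2 bin counts

      M = np.log2(x) - np.log2(y)                      # log ratio per bin
      A = 0.5 * (np.log2(x) + np.log2(y))              # log average per bin
      common = A > 4                                   # restrict to well-covered bins
      y_norm = y * 2.0 ** np.median(M[common])         # rescale condition 2
      print(np.median(np.log2(x) - np.log2(y_norm)))   # ~0 after normalization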

  17. Differences in Caregiver-Reported Health Problems and Health Care Use in Maltreated Adolescents and a Comparison Group from the Same Urban Environment

    PubMed Central

    Schneiderman, Janet U.; Kools, Susan; Negriff, Sonya; Smith, Sharon; Trickett, Penelope K.

    2014-01-01

    Maltreated youth have a high prevalence of acute and chronic mental and physical health problems, but it is not clear whether these problems are related to maltreatment or to a disadvantaged environment. To compare health status and health care use of maltreated youth receiving child protective services to comparison youth living in the same community, we conducted a secondary analysis of caregiver reports for 207 maltreated adolescents (mean age 11.9 years) and 142 comparison adolescents (mean age 12.3 years) living in urban Los Angeles, using questionnaire data from a larger longitudinal study framed in a socio-ecological model. Caregivers included biological parents, relatives, and unrelated caregivers. Analyses included t-test, MANOVA, chi-square, and multivariable logistic regression. Caregivers reported similar rates of physical health problems but more mental health problems and psychotropic medicine use in maltreated youth than in the comparison youth, suggesting that maltreated youths’ higher rates of mental health problems could not be attributed to the disadvantaged environment. Although there were no differences in health insurance coverage, maltreated youth received preventive medical care more often than comparison youth. For all youth, having Medicaid improved their odds of receiving preventive health and dental care. Attention to mental health issues in adolescents receiving child welfare services remains important. Acceptance of Medicaid by neighborhood-based and/or school-based services in low-income communities may reduce barriers to preventive care. PMID:25557881

  18. Comparing Observations, 1st Experimental Edition.

    ERIC Educational Resources Information Center

    Butts, David P.

    Objectives for this module include the ability to: (1) order objects by comparing a property which the objects have in common (such as length, area, volume or mass), (2) describe objects (length, area, volume, mass, etc.) by comparing them quantitatively using either arbitrary units of comparison or standard units of comparison, and (3) describe…

  19. Integrating Conceptual and Quantitative Knowledge

    ERIC Educational Resources Information Center

    Metzgar, Matthew

    2013-01-01

    There has been an emphasis in some science courses to focus more on teaching conceptual knowledge. Though certain innovations have been successful in increasing student conceptual knowledge, performance on quantitative problem-solving tasks often remains unaffected. Research also shows that students tend to maintain conceptual and quantitative…

  20. Generating Linear Equations Based on Quantitative Reasoning

    ERIC Educational Resources Information Center

    Lee, Mi Yeon

    2017-01-01

    The Common Core's Standards for Mathematical Practice encourage teachers to develop their students' ability to reason abstractly and quantitatively by helping students make sense of quantities and their relationships within problem situations. The seventh-grade content standards include objectives pertaining to developing linear equations in…

  1. Counseling Persons with Comorbid Disorders: A Quantitative Comparison of Counselor Active Rehabilitation Service and Standard Rehabilitation Counseling Approaches

    ERIC Educational Resources Information Center

    Ferdinandi, Andrew D.; Li, Ming Hui

    2007-01-01

    The purpose of this quantitative study was to investigate the effect of counselor active rehabilitation service compared with the effect of standard rehabilitation counseling in assisting individuals with coexisting psychiatric and substance abuse disorders in attaining desired life roles. This study was conducted during a 6-month period in a…

  2. Clinical applications of a quantitative analysis of regional left ventricular wall motion

    NASA Technical Reports Server (NTRS)

    Leighton, R. F.; Rich, J. M.; Pollack, M. E.; Altieri, P. I.

    1975-01-01

    Observations were summarized which may have clinical application. These were obtained from a quantitative analysis of wall motion that was used to detect both hypokinesis and tardokinesis in left ventricular cineangiograms. The method was based on statistical comparisons with normal values for regional wall motion derived from the cineangiograms of patients who were found not to have heart disease.

  3. An Exploration of a Quantitative Reasoning Instructional Approach to Linear Equations in Two Variables with Community College Students

    ERIC Educational Resources Information Center

    Belue, Paul T.; Cavey, Laurie Overman; Kinzel, Margaret T.

    2017-01-01

    In this exploratory study, we examined the effects of a quantitative reasoning instructional approach to linear equations in two variables on community college students' conceptual understanding, procedural fluency, and reasoning ability. This was done in comparison to the use of a traditional procedural approach for instruction on the same topic.…

  4. Changes in landscape patterns and associated forest succession on the western slope of the Rocky Mountains, Colorado

    Treesearch

    Daniel J. Manier; Richard D. Laven

    2001-01-01

    Using repeat photography, we conducted a qualitative and quantitative analysis of changes in forest cover on the western slope of the Rocky Mountains in Colorado. For the quantitative analysis, both images in a pair were classified using remote sensing and geographic information system (GIS) technologies. Comparisons were made using three landscape metrics: total...

  5. Best Bang for the Buck: Part 1 – The Size of Experiments Relative to Design Performance

    DOE PAGES

    Anderson-Cook, Christine Michaela; Lu, Lu

    2016-10-01

    There are many choices to make when designing an experiment for a study, such as: what design factors to consider, which levels of the factors to use and which model to focus on. One aspect of design, however, is often left unquestioned: the size of the experiment. When learning about design of experiments, problems are often posed as "select a design for a particular objective with N runs." It’s tempting to consider the design size as a given constraint in the design-selection process. If you think of learning through designed experiments as a sequential process, however, strategically planning for the use of resources at different stages of data collection can be beneficial: Saving experimental runs for later is advantageous if you can efficiently learn with less in the early stages. Alternatively, if you’re too frugal in the early stages, you might not learn enough to proceed confidently with the next stages. Therefore, choosing the right-sized experiment is important—not too large or too small, but with a thoughtful balance to maximize the knowledge gained given the available resources. It can be a great advantage to think about the design size as flexible and include it as an aspect for comparisons. Sometimes you’re asked to provide a small design that is too ambitious for the goals of the study. Finally, if you can show quantitatively how the suggested design size might be inadequate or lead to problems during analysis—and also offer a formal comparison to some alternatives of different (likely larger) sizes—you may have a better chance to ask for additional resources to deliver statistically sound and satisfying results.
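
    One common quantitative argument of this kind is a power calculation across candidate design sizes; a minimal sketch using statsmodels (the effect size and alpha are assumptions) is:

      from statsmodels.stats.power import TTestIndPower

      # Power of a two-sample t-test at several candidate design sizes, for an
      # assumed standardized effect of d = 0.8 at alpha = 0.05.
      analysis = TTestIndPower()
      for n in (4, 8, 16, 32):
          p = analysis.power(effect_size=0.8, nobs1=n, alpha=0.05)
          print(f"n per group = {n:2d}  power = {p:.2f}")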

  6. In Vitro Comparison of Adipokine Export Signals.

    PubMed

    Sharafi, Parisa; Kocaefe, Y Çetin

    2016-01-01

    Mammalian cells are widely used for recombinant protein production in research and biotechnology. Utilization of export signals significantly facilitates production and purification processes. Thirty-five years after the discovery of the mammalian export machinery, there are still obscurities regarding the efficiency of export signals. The aim of this study was the comparative evaluation of the efficiency of selected export signals using adipocytes as a cell model. Adipocytes have a large capacity for protein secretion, including several enzymes, adipokines, and other signaling molecules, providing a valid system for a quantitative evaluation. Constructs with N-terminal fusion export signals were generated to express Enhanced Green Fluorescent Protein (EGFP) as a reporter for quantitative and qualitative evaluation. Furthermore, fluorescence microscopy was used to trace the intracellular traffic of the reporter. The export efficiency of six selected proteins secreted from adipocytes was evaluated. Quantitative comparison of the intracellular and exported fractions of the recombinant constructs demonstrated a similar efficiency among the studied sequences, with minor variations. The export signal of Retinol Binding Protein (RBP4) exhibited the highest efficiency. This study presents the first quantitative data showing variations among export signals in adipocytes, which will help optimization of recombinant protein distribution.

  7. Developing a Multiplexed Quantitative Cross-Linking Mass Spectrometry Platform for Comparative Structural Analysis of Protein Complexes.

    PubMed

    Yu, Clinton; Huszagh, Alexander; Viner, Rosa; Novitsky, Eric J; Rychnovsky, Scott D; Huang, Lan

    2016-10-18

    Cross-linking mass spectrometry (XL-MS) represents a recently popularized hybrid methodology for defining protein-protein interactions (PPIs) and analyzing structures of large protein assemblies. In particular, XL-MS strategies have been demonstrated to be effective in elucidating molecular details of PPIs at the peptide resolution, providing a complementary set of structural data that can be utilized to refine existing complex structures or direct de novo modeling of unknown protein structures. To study structural and interaction dynamics of protein complexes, quantitative cross-linking mass spectrometry (QXL-MS) strategies based on isotope-labeled cross-linkers have been developed. Although successful, these approaches are mostly limited to pairwise comparisons. In order to establish a robust workflow enabling comparative analysis of multiple cross-linked samples simultaneously, we have developed a multiplexed QXL-MS strategy, namely, QMIX (Quantitation of Multiplexed, Isobaric-labeled cross (X)-linked peptides) by integrating MS-cleavable cross-linkers with isobaric labeling reagents. This study has established a new analytical platform for quantitative analysis of cross-linked peptides, which can be directly applied for multiplexed comparisons of the conformational dynamics of protein complexes and PPIs at the proteome scale in future studies.

  8. Playful Physics

    NASA Technical Reports Server (NTRS)

    Weaver, David

    2008-01-01

    Effectively communicate qualitative and quantitative information orally and in writing. Explain the application of fundamental physical principles to various physical phenomena. Apply appropriate problem-solving techniques to practical and meaningful problems using graphical, mathematical, and written modeling tools. Work effectively in collaborative groups.

  9. When Should Zero Be Included on a Scale Showing Magnitude?

    ERIC Educational Resources Information Center

    Kozak, Marcin

    2011-01-01

    This article addresses an important problem of graphing quantitative data: should one include zero on the scale showing magnitude? Based on a real time series example, the problem is discussed and some recommendations are proposed.

  10. The Effect of Communication Skills and Interpersonal Problem Solving Skills on Social Self-Efficacy

    ERIC Educational Resources Information Center

    Erozkan, Atilgan

    2013-01-01

    The purpose of this study was to examine communication skills, interpersonal problem solving skills, and social self-efficacy perception of adolescents and the predictive role of communication skills and interpersonal problem solving skills on social self-efficacy. This study is a quantitative and relational study aimed at examining the…

  11. An Investigation of Maternal Emotion Socialization Behaviors, Children's Self-Perceptions, and Social Problem-Solving Skills

    ERIC Educational Resources Information Center

    Ozkan, Hurside Kubra; Aksoy, Ayse Belgin

    2017-01-01

    Purpose: The present study aims to investigate maternal emotion socialization, children's self-perception, and social problem-solving skills. In addition, this study describes the association between the levels of children's self-perception and social problem-solving skills. Research Methods: This is a quantitative study adopting a relational…

  12. Assessing Leadership and Problem-Solving Skills and Their Impacts in the Community.

    ERIC Educational Resources Information Center

    Rohs, F. Richard; Langone, Christine A.

    1993-01-01

    A pretest-posttest control group design was used to assess the leadership and problem-solving skills of 281 participants and 110 controls in a statewide community leadership development program. Quantitative and qualitative data demonstrate that the program has been a catalyst to influence leadership and problem-solving skills for community…

  13. Nontargeted quantitation of lipid classes using hydrophilic interaction liquid chromatography-electrospray ionization mass spectrometry with single internal standard and response factor approach.

    PubMed

    Cífková, Eva; Holčapek, Michal; Lísa, Miroslav; Ovčačíková, Magdaléna; Lyčka, Antonín; Lynen, Frédéric; Sandra, Pat

    2012-11-20

    The identification and quantitation of a wide range of lipids in complex biological samples is an essential requirement for lipidomic studies. High-performance liquid chromatography-mass spectrometry (HPLC/MS) has the highest potential to obtain detailed information on the whole lipidome, but the reliable quantitation of multiple lipid classes is still a challenging task. In this work, we describe a new method for the nontargeted quantitation of polar lipid classes separated by hydrophilic interaction liquid chromatography (HILIC) followed by positive-ion electrospray ionization mass spectrometry (ESI-MS) using a single internal lipid standard to which all class-specific response factors (RFs) are related. The developed method enables the nontargeted quantitation of lipid classes and of molecules inside these classes, in contrast to conventional targeted quantitation, which is based on predefined selected reaction monitoring (SRM) transitions for selected lipids only. In the nontargeted quantitation method described here, concentrations of lipid classes are obtained from the peak integration in HILIC chromatograms multiplied by the RFs related to the single internal standard (i.e., sphingosyl PE, d17:1/12:0) used as a common reference for all polar lipid classes. The accuracy, reproducibility and robustness of the method have been checked by various means: (1) comparison with conventional lipidomic quantitation using SRM scans on a triple quadrupole (QqQ) mass analyzer, (2) 31P nuclear magnetic resonance (NMR) quantitation of the total lipid extract, (3) a method robustness test using subsequent measurements by three different persons, (4) method transfer to different HPLC/MS systems using different chromatographic conditions, and (5) comparison with previously published results for identical samples, especially human reference plasma from the National Institute of Standards and Technology (NIST human plasma). Results on human plasma, egg yolk and porcine liver extracts are presented and discussed.
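
    A minimal sketch of the single-internal-standard arithmetic (all areas, concentrations and response factors below are hypothetical illustrations, not measured values):

      # Internal standard spiked into the sample before analysis.
      is_conc = 10.0                                   # internal standard conc. [nmol/mL]
      is_area = 2.0e6                                  # its integrated peak area

      rf = {"PC": 1.20, "PE": 0.85, "SM": 1.05}        # class response factors vs. the IS
      areas = {"PC": 9.6e6, "PE": 3.1e6, "SM": 1.8e6}  # integrated class peak areas

      for cls, area in areas.items():
          conc = (area / is_area) * is_conc * rf[cls]  # area ratio scaled by RF
          print(f"{cls}: {conc:.1f} nmol/mL")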

  14. Quantitative Evaluation of Musical Scale Tunings

    ERIC Educational Resources Information Center

    Hall, Donald E.

    1974-01-01

    The acoustical and mathematical basis of the problem of tuning the twelve-tone chromatic scale is reviewed. A quantitative measurement showing how well any tuning succeeds in providing just intonation for any specific piece of music is explained and applied to musical examples using a simple computer program. (DT)

  15. IT: An Effective Pedagogic Tool in the Teaching of Quantitative Methods in Management.

    ERIC Educational Resources Information Center

    Nadkami, Sanjay M.

    1998-01-01

    Examines the possibility of supplementing conventional pedagogic methods with information technology-based teaching aids in the instruction of quantitative methods to undergraduate students. Considers the case for a problem-based learning approach, and discusses the role of information technology. (Author/LRW)

  16. Modeling phytoplankton community in reservoirs. A comparison between taxonomic and functional groups-based models.

    PubMed

    Di Maggio, Jimena; Fernández, Carolina; Parodi, Elisa R; Diaz, M Soledad; Estrada, Vanina

    2016-01-01

    In this paper we address the formulation of two mechanistic water quality models that differ in the way the phytoplankton community is described. We carry out parameter estimation subject to differential-algebraic constraints, validation for each model, and a comparison of model performance. The first approach aggregates phytoplankton species based on their phylogenetic characteristics (Taxonomic group model) and the second one on their morpho-functional properties following Reynolds' classification (Functional group model). The latter approach takes into account tolerance and sensitivity to environmental conditions. The constrained parameter estimation problems are formulated within an equation-oriented framework, with a maximum likelihood objective function. The study site is Paso de las Piedras Reservoir (Argentina), which supplies drinking water for a population of 450,000. Numerical results show that phytoplankton morpho-functional groups more closely represent each species' growth requirements within the group. Each model's performance is quantitatively assessed by three diagnostic measures. Parameter estimation results for the seasonal dynamics of the phytoplankton community and the main biogeochemical variables over a one-year time horizon are presented and compared for both models, showing the functional group model's enhanced performance. Finally, we explore increasing nutrient loading scenarios and predict their effect on phytoplankton dynamics throughout a one-year time horizon. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Traumatic Experience and Somatoform Dissociation Among Spirit Possession Practitioners in the Dominican Republic.

    PubMed

    Schaffler, Yvonne; Cardeña, Etzel; Reijman, Sophie; Haluza, Daniela

    2016-03-01

    Recent studies in African contexts have revealed a strong association between spirit possession and severe trauma, with inclusion into a possession cult at times serving a therapeutic function. Research on spirit possession in the Dominican Republic has so far not included quantitative studies of trauma and dissociation. This study evaluated demographic variables, somatoform dissociative symptoms, and potentially traumatizing events in the Dominican Republic in a group of Vodou practitioners who either do or do not experience spirit possession. Inter-group comparisons revealed that in contrast to non-possessed participants (n = 38), those experiencing spirit possession (n = 47) reported greater somatoform dissociation, more problems with sleep, and previous exposure to mortal danger such as assaults, accidents, or diseases. The two groups did not differ significantly in other types of trauma. The best predictor variable for group classification was somatoform dissociation, although those items could also reflect the experience of followers during a possession episode. A factor analysis across variables resulted in three factors: having to take responsibility early in life and taking on a professional spiritual role; traumatic events and pain; and distress/dissociation. In comparison with the non-possessed individuals, the possessed ones did not, overall, seem to have a markedly more severe history of trauma, and seemed to derive economic gains from possession practice.

  19. Comparison of different numerical treatments for x-ray phase tomography of soft tissue from differential phase projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelliccia, Daniele; Vaz, Raquel; Svalbe, Imants

    X-ray imaging of soft tissue is made difficult by its low absorbance. The use of x-ray phase imaging and tomography can significantly enhance the detection of these tissues, and several approaches have been proposed to this end. Methods such as analyzer-based imaging or grating interferometry produce differential phase projections that can be used to reconstruct the 3D distribution of the sample refractive index. We report on the quantitative comparison of three different methods to obtain x-ray phase tomography with filtered back-projection from differential phase projections in the presence of noise. The three procedures represent different numerical approaches to solve the same mathematical problem, namely phase retrieval and filtered back-projection. It is found that obtaining individual phase projections and subsequently applying a conventional filtered back-projection algorithm produces the best results for noisy experimental data, when compared with other procedures based on the Hilbert transform. The algorithms are tested on simulated phantom data with added noise and the predictions are confirmed by experimental data acquired using a grating interferometer. The experiment is performed on unstained adult zebrafish, an important model organism for biomedical studies. The method optimization described here allows resolution of weak soft tissue features, such as muscle fibers.
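
    The best-performing pipeline identified here can be sketched compactly: integrate each differential phase projection along the detector axis, then apply a conventional filtered back-projection. Below is a minimal Python illustration using scikit-image; the phantom, angular sampling and ramp filter are stand-in assumptions, not the authors' experimental setup, and the Hilbert-transform alternatives are not shown.

      import numpy as np
      from skimage.data import shepp_logan_phantom   # stand-in for soft-tissue data
      from skimage.transform import radon, iradon, rescale

      # Simulate differential phase projections: a sinogram of the phantom,
      # differentiated along the detector axis (as grating interferometry yields).
      phantom = rescale(shepp_logan_phantom(), 0.5)
      angles = np.linspace(0.0, 180.0, 180, endpoint=False)
      sino = radon(phantom, theta=angles)
      dpc = np.gradient(sino, axis=0)                # differential phase sinogram

      # "Integrate-then-FBP": a cumulative sum approximately restores each phase
      # projection; a conventional filtered back-projection then reconstructs.
      phase_proj = np.cumsum(dpc, axis=0)
      recon = iradon(phase_proj, theta=angles, filter_name="ramp")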

  20. Specificity control for read alignments using an artificial reference genome-guided false discovery rate.

    PubMed

    Giese, Sven H; Zickmann, Franziska; Renard, Bernhard Y

    2014-01-01

    Accurate estimation, comparison and evaluation of read mapping error rates are a crucial step in the processing of next-generation sequencing data, as further analysis steps and interpretation assume the correctness of the mapping results. Current approaches are either focused on sensitivity estimation and thereby disregard specificity, or are based on read simulations. Although continuously improving, read simulations are still prone to introducing a bias into the mapping error quantitation and cannot capture all characteristics of an individual dataset. We introduce ARDEN (artificial reference driven estimation of false positives in next-generation sequencing data), a novel benchmark method that estimates error rates of read mappers based on real experimental reads, using an additionally generated artificial reference genome. It allows a dataset-specific computation of error rates and the construction of a receiver operating characteristic curve. Thereby, it can be used for optimization of parameters for read mappers, selection of read mappers for a specific problem, or for filtering alignments based on quality estimation. The use of ARDEN is demonstrated in a general read mapper comparison, a parameter optimization for one read mapper and an application example in single-nucleotide polymorphism discovery, with a significant reduction in the number of false positive identifications. The ARDEN source code is freely available at http://sourceforge.net/projects/arden/.
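
    The receiver operating characteristic curve that ARDEN enables is simple to construct once each alignment carries a quality score and a correctness label; the sketch below builds one with plain NumPy. The scores and labels are invented for illustration, and in ARDEN the labels derive from the artificial reference rather than being known a priori.

      import numpy as np

      def roc_points(scores, is_correct):
          """Return (FPR, TPR) arrays, sweeping a threshold over the scores."""
          order = np.argsort(-np.asarray(scores))      # best-scoring alignments first
          labels = np.asarray(is_correct)[order]
          tpr = np.cumsum(labels) / max(labels.sum(), 1)
          fpr = np.cumsum(~labels) / max((~labels).sum(), 1)
          return fpr, tpr

      scores = np.array([60, 55, 52, 40, 37, 30, 12, 5])   # e.g. mapping qualities
      labels = np.array([True, True, True, False, True, False, False, False])
      fpr, tpr = roc_points(scores, labels)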

  1. Comparison of Adsorption/Desorption of Volatile Organic Compounds (VOCs) on Electrospun Nanofibers with Tenax TA for Potential Application in Sampling

    PubMed Central

    Chu, Lanling; Deng, Siwei; Zhao, Renshan; Deng, Jianjun; Kang, Xuejun

    2016-01-01

    The objective of this study was to compare the adsorption/desorption of target compounds on homemade electrospun nanofibers, polystyrene (PS) nanofibers, acrylic resin (AR) nanofibers and PS-AR composite nanofibers, with Tenax TA. Ten volatile organic compounds (VOCs) were analyzed by preconcentration onto the different sorbents followed by desorption (thermal, then solvent) and analysis by capillary gas chromatography. In comparison to Tenax TA, the electrospun nanofibers displayed a significant advantage in desorption efficiency and adsorption selectivity. Stability studies were conducted as a comparative experiment between PS-AR nanofibers and Tenax TA using toluene as the model compound. No stability problems were observed upon storage of toluene on either PS-AR nanofibers or Tenax TA over a 60-hour period when maintained in an ultra-freezer (−80°C). The nanofibers provided slightly better stability for the adsorbed analytes than Tenax TA under other storage conditions. In addition, the nanofibers also provided slightly better precision than Tenax TA. The quantitative adsorption of PS-AR nanofibers exhibited good linearity, as evidenced by the 0.988–0.999 range of regression coefficients (R). These results suggest that electrospun nanofibers can be a potentially ideal adsorbent for VOC sampling. PMID:27776140

  2. Quantitative ENT endoscopy: the future in the new millennium

    NASA Astrophysics Data System (ADS)

    Mueller, Andreas; Schubert, Mario

    1999-06-01

    In Otorhinolaryngology, the endoscopic appraisal of luminal dimensions of the nose, the throat, the larynx and the trachea is a daily problem. Those concerned with endoscopy know that endoscopes distort the dimensions of examined anatomical structures. To draw conclusions on luminal dimensions from endoscopic pictures, additional measuring devices are required. We developed a new method of measuring luminal dimensions in rigid or flexible endoscopy. For this, a radially directed laser beam marks the anatomical lumen of interest in the videoendoscopic view. The laser ring becomes deformed according to the form of the cavity explored. By keeping a defined distance between the laser ring and the tip of the endoscope, the endoscopic video image can be measured. Software we developed calculates from the images the cross-sectional area as well as the extent of benign or malignant stenoses of the cavity explored. The result of the endoscopic measuring procedure can be visualized in 3D on a PC monitor. We demonstrate the results of our clinical experience with the new endoscopic measuring kit, in comparison to standard endoscopy, in different otorhinolaryngological diseases. A further perspective is the endoscopically assisted manufacturing (EAM) of anatomically adapted stents, tubes and cannulas.

  3. A Comparison Study for DNA Motif Modeling on Protein Binding Microarray.

    PubMed

    Wong, Ka-Chun; Li, Yue; Peng, Chengbin; Wong, Hau-San

    2016-01-01

    Transcription factor binding sites (TFBSs) are relatively short (5-15 bp) and degenerate. Identifying them is a computationally challenging task. In particular, protein binding microarray (PBM) is a high-throughput platform that can measure the DNA binding preference of a protein in a comprehensive and unbiased manner; for instance, a typical PBM experiment can measure binding signal intensities of a protein to all possible DNA k-mers (k = 8∼10). Since proteins can often bind to DNA with different binding intensities, one of the major challenges is to build TFBS (also known as DNA motif) models which can fully capture the quantitative binding affinity data. To learn DNA motif models from the non-convex objective function landscape, several optimization methods are compared and applied to the PBM motif model building problem. In particular, representative methods from different optimization paradigms have been chosen for modeling performance comparison on hundreds of PBM datasets. The results suggest that the multimodal optimization methods are very effective for capturing the binding preference information from PBM data. In particular, we observe a general performance improvement when choosing di-nucleotide modeling over mono-nucleotide modeling. In addition, the models learned by the best-performing method are applied to two independent applications: PBM probe rotation testing and ChIP-Seq peak sequence prediction, demonstrating their biological applicability.

  4. Methodological triangulation in a study of social support for siblings of children with cancer.

    PubMed

    Murray, J S

    1999-10-01

    Triangulation is an approach to research that is becoming increasingly popular among nurse researchers. Five types of triangulation are used in nursing research: data, methodological, theoretical, researcher, and analytical triangulation. Methodological triangulation is an attempt to improve validity by combining various techniques in one study. In this article, an example of quantitative and qualitative triangulation is discussed to illustrate the procedures used and the results achieved. The secondary data used as an example are from a previous study that was conducted by the researcher and investigated nursing interventions used by pediatric oncology nurses to provide social support to siblings of children with cancer. Results show that methodological triangulation was beneficial in this study for three reasons. First, the careful comparison of quantitative and qualitative data added support for the social support variables under investigation. Second, the comparison showed more in-depth dimensions about pediatric oncology nurses providing social support to siblings of children with cancer. Finally, the use of methodological triangulation provided insight into revisions for the quantitative instrument.

  5. Quantitative Comparison of Tumor Delivery for Multiple Targeted Nanoparticles Simultaneously by Multiplex ICP-MS

    PubMed Central

    Elias, Andrew; Crayton, Samuel H.; Warden-Rothman, Robert; Tsourkas, Andrew

    2014-01-01

    Given the rapidly expanding library of disease biomarkers and targeting agents, the number of unique targeted nanoparticles is growing exponentially. The high variability and expense of animal testing often makes it unfeasible to examine this large number of nanoparticles in vivo. This often leads to the investigation of a single formulation that performed best in vitro. However, nanoparticle performance in vivo depends on many variables, many of which cannot be adequately assessed with cell-based assays. To address this issue, we developed a lanthanide-doped nanoparticle method that allows quantitative comparison of multiple targeted nanoparticles simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with different targeting ligands were created, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectrometry was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood and resected tumor samples. PMID:25068300

  6. Comparison of quantitative and qualitative tests for glucose-6-phosphate dehydrogenase deficiency.

    PubMed

    LaRue, Nicole; Kahn, Maria; Murray, Marjorie; Leader, Brandon T; Bansil, Pooja; McGray, Sarah; Kalnoky, Michael; Zhang, Hao; Huang, Huiqiang; Jiang, Hui; Domingo, Gonzalo J

    2014-10-01

    A barrier to eliminating Plasmodium vivax malaria is inadequate treatment of infected patients. 8-Aminoquinoline-based drugs clear the parasite; however, people with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk for hemolysis from these drugs. Understanding the performance of G6PD deficiency tests is critical for patient safety. Two quantitative assays and two qualitative tests were evaluated. The comparison of quantitative assays gave a Pearson correlation coefficient of 0.7585, with a significant difference in mean G6PD activity, highlighting the need to adhere to a single reference assay. Both qualitative tests had high sensitivity and negative predictive value at a cutoff G6PD value of 40% of normal activity if interpreted conservatively and performed under laboratory conditions. The performance of both tests dropped at a cutoff level of 45%. Cytochemical staining of specimens confirmed that heterozygous females with > 50% G6PD-deficient cells can seem normal by phenotypic tests. © The American Society of Tropical Medicine and Hygiene.
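
    For reference, the reported accuracy metrics are computed as in the short sketch below, which scores a qualitative deficient/normal call against the quantitative assay at the 40%-of-normal-activity cutoff; the specimen values are invented for illustration.

      import numpy as np

      # Hypothetical specimens: quantitative G6PD activity (% of normal) and the
      # qualitative test call (True = "deficient") for the same specimens.
      activity = np.array([12, 25, 38, 35, 44, 55, 70, 85, 95, 110])
      qual_deficient = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 0], dtype=bool)

      truly_deficient = activity < 40     # reference: below 40% of normal activity

      tp = np.sum(qual_deficient & truly_deficient)    # called and truly deficient
      fn = np.sum(~qual_deficient & truly_deficient)   # missed deficient specimen
      tn = np.sum(~qual_deficient & ~truly_deficient)
      sensitivity = tp / (tp + fn)                     # 3/4 = 0.75 here
      npv = tn / (tn + fn)                             # 6/7 ~ 0.86 here
      print(f"sensitivity={sensitivity:.2f}  NPV={npv:.2f}")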

  7. A synthesis of sedimentary records of Australian environmental change during the last 2000 years

    NASA Astrophysics Data System (ADS)

    Tyler, J. J.; Karoly, D. J.; Gell, P.; Goodwin, I. D.

    2013-12-01

    Our understanding of Southern Hemisphere climate variability on multidecadal to multicentennial timescales is limited by a scarcity of quantitative, highly resolved climate records, a problem which is particularly manifest in Australia. To date there are no quantitative, annually resolved records from within continental Australia which extend further back in time than the most recent c. 300 years [Neukom and Gergis, 2012; PAGES 2k Consortium, 2013]. By contrast, a number of marine, lake, peat and speleothem sedimentary records exist, some of which span multiple millennia at sub-decadal resolution. Here we report a database of existing sedimentary records of environmental change in Australia [Freeman et al., 2011], of which 25 have sample resolutions < 100 years/sample and span > 500 years in duration. The majority of these records are located in southeastern Australia, providing an invaluable resource with which to examine regional-scale climate and environmental change. Although most of the records cannot be quantitatively related to climate variability, Empirical Orthogonal Functions coupled with Monte Carlo iterative age modelling demonstrate coherent patterns of environmental and ecological change. This coherency, as well as comparisons with a limited number of quantitative records, suggests that regional hydroclimatic changes were responsible for the observed patterns. Here, we discuss the implications of these findings with respect to Southern Hemisphere climate during the last 2000 years. In addition, we review the progress and potential of ongoing research in the region. References: Freeman, R., I. D. Goodwin, and T. Donovan (2011), Paleoclimate data synthesis and data base for the reconstruction of climate variability and impacts in NSW over the past 2000 years, Climate Futures Technical Report, 1/2011, 50 pages. Neukom, R., and J. Gergis (2012), Southern Hemisphere high-resolution palaeoclimate records of the last 2000 years, Holocene, 22(5), 501-524, doi:10.1177/0959683611427335. PAGES 2k Consortium (2013), Continental-scale temperature variability during the past two millennia, Nature Geoscience, 6, 339-346.

  8. Methodological problems in the use of indirect comparisons for evaluating healthcare interventions: survey of published systematic reviews.

    PubMed

    Song, Fujian; Loke, Yoon K; Walsh, Tanya; Glenny, Anne-Marie; Eastwood, Alison J; Altman, Douglas G

    2009-04-03

    To investigate basic assumptions and other methodological problems in the application of indirect comparison in systematic reviews of competing healthcare interventions. Survey of published systematic reviews. Inclusion criteria: systematic reviews published between 2000 and 2007 in which an indirect approach had been explicitly used. Identified reviews were assessed for comprehensiveness of the literature search, method for indirect comparison, and whether assumptions about similarity and consistency were explicitly mentioned. The survey included 88 review reports. In 13 reviews, indirect comparison was informal. Results from different trials were naively compared without using a common control in six reviews. Adjusted indirect comparison was usually done using classic frequentist methods (n=49) or more complex methods (n=18). The key assumption of trial similarity was explicitly mentioned in only 40 of the 88 reviews. The consistency assumption was not explicit in most cases where direct and indirect evidence were compared or combined (18/30). Evidence from head-to-head comparison trials was not systematically searched for, or not included, in nine cases. Identified methodological problems were an unclear understanding of underlying assumptions, inappropriate search and selection of relevant trials, use of inappropriate or flawed methods, lack of objective and validated methods to assess or improve trial similarity, and inadequate comparison or inappropriate combination of direct and indirect evidence. Adequate understanding of the basic assumptions underlying indirect and mixed treatment comparison is crucial to resolve these methodological problems.
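
    The adjusted indirect comparison used in most of these reviews (the classic frequentist, or Bucher, method) preserves randomization by differencing the two trial effects against the common comparator and adding their variances. A minimal sketch with invented numbers:

      import math

      def bucher_indirect(d_ac, se_ac, d_bc, se_bc):
          """Adjusted indirect comparison of A vs B via common comparator C.
          Effects are on an additive scale, e.g. log odds ratios."""
          d_ab = d_ac - d_bc
          se_ab = math.sqrt(se_ac**2 + se_bc**2)
          return d_ab, se_ab

      # Hypothetical trial summaries: log odds ratios of A vs C and of B vs C.
      d_ab, se_ab = bucher_indirect(d_ac=-0.40, se_ac=0.15, d_bc=-0.10, se_bc=0.20)
      lo, hi = d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab
      print(f"indirect log-OR A vs B = {d_ab:.2f} (95% CI {lo:.2f} to {hi:.2f})")

    Note that the method's validity rests on exactly the similarity assumption the survey found was rarely made explicit.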

  9. Associating quantitative behavioral traits with gene expression in the brain: searching for diamonds in the hay.

    PubMed

    Reiner-Benaim, Anat; Yekutieli, Daniel; Letwin, Noah E; Elmer, Gregory I; Lee, Norman H; Kafkafi, Neri; Benjamini, Yoav

    2007-09-01

    Gene expression and phenotypic functionality can best be associated when they are measured quantitatively within the same experiment. The analysis of such a complex experiment is presented, searching for associations between measures of exploratory behavior in mice and gene expression in brain regions. The analysis of such experiments raises several methodological problems. First and foremost, the size of the pool of potential discoveries being screened is enormous, yet only a few biologically relevant findings are expected, making the problem of multiple testing especially severe. We present solutions based on screening by testing related hypotheses, then testing the hypotheses of interest. In one variant the subset is selected directly; in the other, a tree of hypotheses is tested hierarchically; both variants control the False Discovery Rate (FDR). Further problems in such experiments are that the level of data aggregation may differ between the quantitative traits (one per animal) and the gene expression measurements (pooled across animals); that the association may not be linear; and that only a few replications exist at the resolution of interest. We offer solutions to these problems as well. The hierarchical FDR testing strategies presented here can serve, beyond the structure of our motivating example study, in any complex microarray study. Supplementary data are available at Bioinformatics online.
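
    The hierarchical strategies described here build on the standard Benjamini-Hochberg step-up procedure; the sketch below implements that building block only (the tree-structured variants are not reproduced).

      import numpy as np

      def benjamini_hochberg(pvals, q=0.05):
          """Return a boolean mask of hypotheses rejected at FDR level q."""
          p = np.asarray(pvals, dtype=float)
          m = len(p)
          order = np.argsort(p)
          thresholds = q * np.arange(1, m + 1) / m       # BH step-up boundary
          below = p[order] <= thresholds
          k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
          rejected = np.zeros(m, dtype=bool)
          rejected[order[:k]] = True                     # reject the k smallest p-values
          return rejected

      pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.60]
      print(benjamini_hochberg(pvals, q=0.05))           # first two are rejected here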

  10. Are qualitative and quantitative sleep problems associated with delinquency when controlling for psychopathic features and parental supervision?

    PubMed

    Backman, Heidi; Laajasalo, Taina; Saukkonen, Suvi; Salmi, Venla; Kivivuori, Janne; Aronen, Eeva T

    2015-10-01

    The aim of this study was to explore the relationship between sleep, including both qualitative and quantitative aspects, and delinquent behaviour while controlling for psychopathic features of adolescents and parental supervision at bedtime. We analysed data from a nationally representative sample of 4855 Finnish adolescents (mean age 15.3 years, 51% females). Sleep problems, hours of sleep and delinquency were evaluated via self-report. Psychopathic features were measured with the Antisocial Process Screening Device - Self-Report. In negative binomial regressions, gender and sleep-related variables acted as predictors for both property and violent crime after controlling for psychopathic features and parental supervision at bedtime. The results suggest that both sleep problems (at least three times per week, for at least a year) and an insufficient amount of sleep (less than 7 h) are associated with property crime and violent behaviour, and the relationship is not explained by gender, degree of parental supervision at bedtime or co-occurring psychopathic features. These results suggest that sleep difficulties and an insufficient amount of sleep are associated with delinquent behaviour in adolescents. The significance of addressing sleep-related problems, both qualitative and quantitative, among adolescents is thus highlighted. Implications for the prevention of delinquent behaviour are discussed. © 2015 European Sleep Research Society.
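
    The modelling approach translates into a few lines with statsmodels; the sketch below fits a negative binomial regression of simulated offence counts on placeholder sleep and psychopathy variables. The data, variable names and effect sizes are invented, not the study's.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 500
      sleep_problems = rng.integers(0, 2, n)    # 1 = sleep problems >= 3 times/week
      short_sleep = rng.integers(0, 2, n)       # 1 = under 7 h of sleep
      psychopathy = rng.normal(0, 1, n)         # standardized APSD-style score

      # Overdispersed offence counts generated as a Poisson-gamma mixture.
      mu = np.exp(-1.0 + 0.5 * sleep_problems + 0.4 * short_sleep + 0.3 * psychopathy)
      offences = rng.poisson(mu * rng.gamma(2.0, 0.5, n))

      X = sm.add_constant(np.column_stack([sleep_problems, short_sleep, psychopathy]))
      fit = sm.GLM(offences, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
      print(fit.summary())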

  11. Online Interactive Teaching Modules Enhance Quantitative Proficiency of Introductory Biology Students

    PubMed Central

    Nelson, Kären C.; Marbach-Ad, Gili; Keller, Michael; Fagan, William F.

    2010-01-01

    There is widespread agreement within the scientific and education communities that undergraduate biology curricula fall short in providing students with the quantitative and interdisciplinary problem-solving skills they need to obtain a deep understanding of biological phenomena and be prepared fully to contribute to future scientific inquiry. MathBench Biology Modules were designed to address these needs through a series of interactive, Web-based modules that can be used to supplement existing course content across the biological sciences curriculum. The effect of the modules was assessed in an introductory biology course at the University of Maryland. Over the course of the semester, students showed significant increases in quantitative skills that were independent of previous math course work. Students also showed increased comfort with solving quantitative problems, whether or not they ultimately arrived at the correct answer. A survey of spring 2009 graduates indicated that those who had experienced MathBench in their course work had a greater appreciation for the role of mathematics in modern biology than those who had not used MathBench. MathBench modules allow students from diverse educational backgrounds to hone their quantitative skills, preparing them for more complex mathematical approaches in upper-division courses. PMID:20810959

  12. Online interactive teaching modules enhance quantitative proficiency of introductory biology students.

    PubMed

    Thompson, Katerina V; Nelson, Kären C; Marbach-Ad, Gili; Keller, Michael; Fagan, William F

    2010-01-01

    There is widespread agreement within the scientific and education communities that undergraduate biology curricula fall short in providing students with the quantitative and interdisciplinary problem-solving skills they need to obtain a deep understanding of biological phenomena and be prepared fully to contribute to future scientific inquiry. MathBench Biology Modules were designed to address these needs through a series of interactive, Web-based modules that can be used to supplement existing course content across the biological sciences curriculum. The effect of the modules was assessed in an introductory biology course at the University of Maryland. Over the course of the semester, students showed significant increases in quantitative skills that were independent of previous math course work. Students also showed increased comfort with solving quantitative problems, whether or not they ultimately arrived at the correct answer. A survey of spring 2009 graduates indicated that those who had experienced MathBench in their course work had a greater appreciation for the role of mathematics in modern biology than those who had not used MathBench. MathBench modules allow students from diverse educational backgrounds to hone their quantitative skills, preparing them for more complex mathematical approaches in upper-division courses.

  13. Experiences and expectations of women with urogenital prolapse: a quantitative and qualitative exploration.

    PubMed

    Srikrishna, S; Robinson, D; Cardozo, L; Cartwright, R

    2008-10-01

    To explore the expectations and goals of women undergoing surgery for urogenital prolapse, using both a quantitative quality-of-life approach exploring symptom bother and a qualitative interview-based approach exploring patient goals and expectations. Prospective observational study. Tertiary referral centre for urogynaecology. Forty-three women with symptomatic pelvic organ prolapse were recruited from the waiting list for pelvic floor reconstructive surgery. All women were assessed with a structured clinical interview on an individual basis. The data obtained were transcribed verbatim and then analysed thematically based on grounded theory. Individual codes and subcodes were identified to develop a coding framework. The prolapse quality-of-life (pQoL) questionnaire was used to determine the impact of pelvic organ prolapse on the woman's daily life. We arbitrarily classified 'bother' as minimal, mild, moderate and marked if scores ranged from 0 to 25, 25-50, 50-75 and 75-100, respectively. The degree of prolapse was objectively quantified using the pelvic organ prolapse quantification (POP-Q) system. Quantitative data were analysed using SPSS. Ethical approval was obtained from the Kings College Hospital Ethics Committee. Outcome measures were quantitative data from the POP-Q, subjective data from the pQoL, and qualitative data based on the structured clinical interview. Forty-three women were recruited over the first year of the study. Their mean age was 56 years (range 36-78) and mean parity was 2 (range 0-6). The mean ordinal stage of the prolapse was 2 (range stages 1-4). Quantitative analysis of the pQoL data suggested that the main domains affected were prolapse impact on life (mean score 74.71) and personal relationships (mean score 46.66). Qualitative analysis based on the clinical interview suggested that these women were most affected by the actual physical symptoms of prolapse (bulge, pain and bowel problems) as well as by the impact prolapse has on their sexual function. While disease-specific QoL questionnaires allow broad comparisons to be made assessing patient bother, they may lack the sensitivity to assess individual symptoms. A qualitative approach may individualize patient care and ultimately improve patient satisfaction and overall outcome when treating women complaining of urogenital prolapse.
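
    The bother classification maps directly onto code; in the sketch below the behaviour exactly at the shared boundaries (25, 50, 75) is an assumption, since the quoted ranges overlap at their endpoints.

      def bother_category(score):
          """Map a 0-100 pQoL domain score to the study's bother categories."""
          if not 0 <= score <= 100:
              raise ValueError("pQoL domain scores lie on a 0-100 scale")
          if score < 25:
              return "minimal"
          if score < 50:
              return "mild"
          if score < 75:
              return "moderate"
          return "marked"

      print(bother_category(74.71))   # prolapse impact on life  -> moderate
      print(bother_category(46.66))   # personal relationships   -> mild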

  14. Quantitative molecular analysis in mantle cell lymphoma.

    PubMed

    Brízová, H; Hilská, I; Mrhalová, M; Kodet, R

    2011-07-01

    A molecular analysis has three major roles in modern oncopathology: as an aid in the differential diagnosis, in molecular monitoring of disease, and in estimation of the potential prognosis. In this report we review the application of molecular analysis in a group of patients with mantle cell lymphoma (MCL). We demonstrate that detection of the cyclin D1 mRNA level is a molecular marker in 98% of patients with MCL. Cyclin D1 quantitative monitoring is specific and sensitive for the differential diagnosis and for the molecular monitoring of the disease in the bone marrow. Moreover, the dynamics of cyclin D1 in bone marrow reflects the disease development and predicts the clinical course. We employed molecular analysis for a precise quantitative detection of proliferation markers, Ki-67, topoisomerase IIalpha, and TPX2, that are described as effective prognostic factors. Using the molecular approach it is possible to measure the proliferation rate in a reproducible, standard way, which is an essential prerequisite for using proliferation activity as a routine clinical tool. Compared with immunophenotyping, we may conclude that the quantitative PCR-based analysis is a useful, reliable, rapid, reproducible, sensitive and specific method broadening our diagnostic tools in hematopathology. In comparison to interphase FISH in paraffin sections, quantitative PCR is less technically demanding and less time-consuming, and furthermore it is more sensitive in detecting small changes in the mRNA level. Moreover, quantitative PCR is the only technology which provides precise and reproducible quantitative information about the expression level. Therefore it may be used to demonstrate the decrease or increase of a tumor-specific marker in bone marrow in comparison with a previously aspirated specimen. Thus, it has a powerful potential to monitor the course of the disease in correlation with clinical data.

  15. Comparison of symptomatology and performance degradation for motion and radiation sickness. Technical report, 6 January 1984-31 March 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClellan, G.E.; Wiker, S.F.

    1985-05-31

    This report quantifies for the first time the relationship between the signs and symptoms of acute radiation sickness and those of motion sickness. With this relationship, a quantitative comparison is made between data on human performance degradation during motion sickness and estimates of performance degradation during radiation sickness. The comparison validates estimates made by the Intermediate Dose Program on the performance degradation from acute radiation sickness.

  16. The Quantitative Preparation of Future Geoscience Graduate Students

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways. Calculus, calculus-based physics, chemistry, statistics, programming and linear algebra were viewed as important course preparation for a successful graduate experience. A set of recommendations for departments and for new community resources includes ideas for infusing quantitative reasoning throughout the undergraduate experience and mechanisms for learning from successful experiments in both geoscience and mathematics. A full list of participants, summaries of the meeting discussion and recommendations are available at http://serc.carleton.edu/quantskills/winter06/index.html. These documents, crafted by a small but diverse group, can serve as a starting point for broader community discussion of the quantitative preparation of future geoscience graduate students.

  17. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    ERIC Educational Resources Information Center

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  18. A Transformative Model for Undergraduate Quantitative Biology Education

    ERIC Educational Resources Information Center

    Usher, David C.; Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The "BIO2010" report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3)…

  19. Engaging Business Students in Quantitative Skills Development

    ERIC Educational Resources Information Center

    Cronin, Anthony; Carroll, Paula

    2015-01-01

    In this paper the complex problems of developing quantitative and analytical skills in undergraduate first year, first semester business students are addressed. An action research project, detailing how first year business students perceive the relevance of data analysis and inferential statistics in light of the economic downturn and the…

  20. Enhancing Students' Scientific and Quantitative Literacies through an Inquiry-Based Learning Project on Climate Change

    ERIC Educational Resources Information Center

    McCright, Aaron M.

    2012-01-01

    Promoting sustainability and dealing with complex environmental problems like climate change demand a citizenry with considerable scientific and quantitative literacy. In particular, students in the STEM disciplines of (biophysical) science, technology, engineering, and mathematics need to develop interdisciplinary skills that help them understand…

  1. The calibration of video cameras for quantitative measurements

    NASA Technical Reports Server (NTRS)

    Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.

    1993-01-01

    Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.

  2. University Students' Understanding of the Concepts Empirical, Theoretical, Qualitative and Quantitative Research

    ERIC Educational Resources Information Center

    Murtonen, Mari

    2015-01-01

    University research education in many disciplines is frequently confronted by problems with students' weak level of understanding of research concepts. A mind map technique was used to investigate how students understand central methodological concepts of empirical, theoretical, qualitative and quantitative. The main hypothesis was that some…

  3. Quantitative analysis of arm movement smoothness

    NASA Astrophysics Data System (ADS)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the problem of quantitative smoothness analysis of motion data. We investigated values of movement unit, fluidity and jerk for the healthy and paralyzed arms of patients with hemiparesis after stroke. Patients performed a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.
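
    Jerk-based smoothness measures of this kind are straightforward to compute from sampled trajectories. The sketch below implements one common dimensionless squared-jerk cost (formulations vary, and the paper's exact metrics may differ); the minimum-jerk reach used as input is synthetic.

      import numpy as np

      def dimensionless_jerk(position, dt):
          """Dimensionless squared-jerk cost of a 1-D trace; lower = smoother."""
          vel = np.gradient(position, dt)
          jerk = np.gradient(np.gradient(vel, dt), dt)
          duration = dt * (len(position) - 1)
          v_peak = np.max(np.abs(vel))
          return np.sum(jerk**2) * dt * duration**3 / v_peak**2

      dt = 0.01                                      # 100 Hz motion-capture sampling
      t = np.arange(0, 1 + dt, dt)
      reach = 0.3 * (10*t**3 - 15*t**4 + 6*t**5)     # 0.3 m minimum-jerk reach
      print(dimensionless_jerk(reach, dt))           # small value: a smooth movement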

  4. Using HEGIS Data in Institutional Comparisons. AIR 1984 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Christal, Melodie E.; And Others

    Problems associated with the use of the Higher Education General Information Survey (HEGIS) data to make institutional comparisons are discussed. It is noted that information collected by HEGIS includes data on enrollment, degrees, finances, employees, libraries, and physical facilities. Attention is directed to the following problems with the…

  5. Quantitative Comparison of Three Standardization Methods Using a One-Way ANOVA for Multiple Mean Comparisons

    ERIC Educational Resources Information Center

    Barrows, Russell D.

    2007-01-01

    A one-way ANOVA experiment is performed to determine whether or not the three standardization methods are statistically different in determining the concentration of the three paraffin analytes. The laboratory exercise asks students to combine the three methods in a single analytical procedure of their own design to determine the concentration of…
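
    A minimal version of the comparison the exercise calls for runs in a few lines of SciPy; the three calibration methods named below and the replicate concentrations are assumptions for illustration, not the exercise's data.

      from scipy import stats

      # Hypothetical concentrations (ppm) of one paraffin analyte as determined by
      # external standard, internal standard, and standard addition calibration.
      external = [10.2, 10.5, 9.9, 10.3, 10.1]
      internal = [10.4, 10.6, 10.2, 10.5, 10.3]
      addition = [10.1, 10.0, 10.4, 10.2, 10.2]

      f_stat, p_value = stats.f_oneway(external, internal, addition)
      print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
      # A p-value above the chosen alpha (e.g. 0.05) gives no evidence that the
      # three standardization methods differ in the concentration they report.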

  6. Quantification of HCV RNA in Liver Tissue by bDNA Assay.

    PubMed

    Dailey, P J; Collins, M L; Urdea, M S; Wilber, J C

    1999-01-01

    With this statement, Sherlock and Dooley have described two of the three major challenges involved in quantitatively measuring any analyte in tissue samples: the distribution of the analyte in the tissue, and the standard of reference, or denominator, with which to make comparisons between tissue samples. The third challenge for quantitative measurement of an analyte in tissue is to ensure reproducible and quantitative recovery of the analyte on extraction from tissue samples. This chapter describes a method that can be used to measure HCV RNA quantitatively in liver biopsy and tissue samples using the bDNA assay. All three of these challenges (distribution, denominator, and recovery) apply to the measurement of HCV RNA in liver biopsies.

  7. Big fish in a big pond: a study of academic self concept in first year medical students.

    PubMed

    Jackman, Kirsty; Wilson, Ian G; Seaton, Marjorie; Craven, Rhonda G

    2011-07-27

    Big-fish-little-pond effect (BFLPE) research has demonstrated that students in high-ability environments have lower academic self-concepts than equally able students in low-ability settings. Research has shown low academic self-concepts to be associated with negative educational outcomes. Social comparison processes have been implicated as fundamental to the BFLPE. Twenty first-year students in an Australian medical school completed a survey that included academic self-concept and social comparison measures, before and after their first written assessments. Focus groups were also conducted with a separate group of students to explore students' perceptions of competence, the medical school environment, and social comparison processes. The quantitative study did not reveal any changes in academic self-concept or self-evaluation. The qualitative study suggested that the attributions that students used when discussing performance were those that have been demonstrated to negatively affect self-concept. Students reported that the environment was slightly competitive and they used social comparison to evaluate their performance. Although the BFLPE was not evident in the quantitative study, results from the qualitative study suggest that the BFLPE might be operating, in that students were using attributions that are associated with lower self-concepts, the environment was slightly competitive, and social comparisons were used for evaluation.

  8. Students' conceptual performance on synthesis physics problems with varying mathematical complexity

    NASA Astrophysics Data System (ADS)

    Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan

    2017-06-01

    A body of research on physics problem solving has focused on single-concept problems. In this study we use "synthesis problems" that involve multiple concepts typically taught in different chapters. We use two types of synthesis problems, sequential and simultaneous synthesis tasks. Sequential problems require a consecutive application of fundamental principles, and simultaneous problems require a concurrent application of pertinent concepts. We explore students' conceptual performance when they solve quantitative synthesis problems with varying mathematical complexity. Conceptual performance refers to the identification, follow-up, and correct application of the pertinent concepts. Mathematical complexity is determined by the type and the number of equations to be manipulated concurrently due to the number of unknowns in each equation. Data were collected from written tasks and individual interviews administered to physics major students (N = 179) enrolled in a second-year mechanics course. The results indicate that mathematical complexity does not impact students' conceptual performance on the sequential tasks. In contrast, for the simultaneous problems, mathematical complexity negatively influences the students' conceptual performance. This difference may be explained by the students' familiarity with and confidence in particular concepts coupled with cognitive load associated with manipulating complex quantitative equations. Another explanation pertains to the type of synthesis problem, either sequential or simultaneous task. The students split the situation presented in the sequential synthesis tasks into segments but treated the situation in the simultaneous synthesis tasks as a single event.

  9. The Impact of a Virtual Public Charter School Program on the Learning Outcomes of Students with Disabilities: A Quantitative Study

    ERIC Educational Resources Information Center

    Epps, Sucari

    2017-01-01

    This quantitative study investigated the learning outcomes of students with disabilities in comparison to their non-disabled peers in a TK-12th grade school that offers a sixth-twelfth grade virtual public charter school program that currently serves students in the state of California. No differences were found between groups indicating…

  10. Comparison of Maximum Likelihood Estimation Approach and Regression Approach in Detecting Quantitative Trait Loci Using RAPD Markers

    Treesearch

    Changren Weng; Thomas L. Kubisiak; C. Dana Nelson; James P. Geaghan; Michael Stine

    1999-01-01

    Single marker regression and single marker maximum likelihood estimation were used to detect quantitative trait loci (QTLs) controlling the early height growth of longleaf pine and slash pine using a ((longleaf pine x slash pine) x slash pine) BC1 population consisting of 83 progeny. Maximum likelihood estimation was found to be more powerful than regression and could...

  11. Regulating Availability: How Access to Alcohol Affects Drinking and Problems in Youth and Adults

    PubMed Central

    Gruenewald, Paul J.

    2011-01-01

    Regulations on the availability of alcohol have been used to moderate alcohol problems in communities throughout the world for thousands of years. In the latter half of the 20th century, quantitative studies of the effects of these regulations on drinking and related problems began in earnest as public health practitioners began to recognize the full extent of the harmful consequences related to drinking. This article briefly outlines the history of this work over four areas, focusing on the minimum legal drinking age, the privatization of alcohol control systems, outlet densities, and hours and days of sale. Some historical background is provided to emphasize the theoretical and empirical roots of this work and to highlight the substantial progress that has been made in each area. In general, this assessment suggests that higher minimum legal drinking ages, greater monopoly controls over alcohol sales, lower outlet numbers and reduced outlet densities, and limited hours and days of sale can effectively reduce alcohol sales, use, and problems. There are, however, substantial gaps in the research literature and a near absence of the quantitative theoretical work needed to direct alcohol-control efforts. Local community responses to alcohol policies are complex and heterogeneous, sometimes reinforcing and sometimes mitigating the effects of availability regulations. Quantitative models of policy effects are essential to accelerate progress toward the formulation and testing of optimal control strategies for the reduction of alcohol problems. PMID:22330225

  12. Self-directed learning readiness of Asian students: students perspective on a hybrid problem based learning curriculum.

    PubMed

    Leatemia, Lukas D; Susilo, Astrid P; van Berkel, Henk

    2016-12-03

    To identify students' readiness to perform self-directed learning and the underlying factors influencing it in the hybrid problem based learning curriculum. A combination of quantitative and qualitative studies was conducted in five medical schools in Indonesia. In the quantitative study, the Self Directed Learning Readiness Scale was distributed to all students in all batches who had experience with the hybrid problem based curriculum. They were categorized into low and high level based on the score of the questionnaire. Three focus group discussions (low-, high-, and mixed-level) were conducted in the qualitative study, with six to twelve students chosen randomly from each group, to find the factors influencing their self-directed learning readiness. Two researchers analysed the qualitative data as a measure of triangulation. The quantitative study showed only half of the students had a high level of self-directed learning readiness, and a similar trend also occurred in each batch. The proportion of students with a high level of self-directed learning readiness was lower in the senior students compared to more junior students. The qualitative study showed that problem based learning processes, assessments, learning environment, students' lifestyles, students' perceptions of the topics, and mood were factors influencing their self-directed learning. A hybrid problem based curriculum may not fully affect students' self-directed learning. The curriculum system, teachers' experiences, students' backgrounds and cultural factors might contribute to the students' difficulties in conducting self-directed learning.

  13. A Research Methodology for Studying What Makes Some Problems Difficult to Solve

    ERIC Educational Resources Information Center

    Gulacar, Ozcan; Fynewever, Herb

    2010-01-01

    We present a quantitative model for predicting the level of difficulty subjects will experience with specific problems. The model explicitly accounts for the number of subproblems a problem can be broken into and the difficulty of each subproblem. Although the model builds on previously published models, it is uniquely suited for blending with…
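
    The abstract does not give the model's functional form, but its two ingredients (how many subproblems, and how hard each is) can be illustrated with a deliberately simple assumed form: treat subproblems as independent hurdles, so predicted success is the product of subproblem success rates. The sketch below is that assumption, not the authors' model.

      def predicted_success(subproblem_success_rates):
          """Probability of solving a problem whose independent subproblems
          have the given individual success rates (toy illustration)."""
          p = 1.0
          for rate in subproblem_success_rates:
              p *= rate
          return p

      print(predicted_success([0.9, 0.9]))         # two easy subproblems -> 0.81
      print(predicted_success([0.9, 0.7, 0.6]))    # three, two harder    -> ~0.38

    Either adding subproblems or lowering any single rate drives the prediction down, matching the two factors the model is said to account for.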

  14. Science Curriculum from the Perspectives of Turkish Teachers: Problems Encountered and Suggestions for Solutions

    ERIC Educational Resources Information Center

    Bekmezci, Sinan Muhammet; Ates, Özlem

    2017-01-01

    The purpose of this research is to identify the problems that teachers have experienced during the implementation of the science curriculum and their suggestions for solutions to these problems. In the research, a survey model was used among the descriptive research methods, in which quantitative and qualitative data were used together. The…

  15. Effect of Cooperative Problem-Based Lab Instruction on Metacognition and Problem-Solving Skills

    ERIC Educational Resources Information Center

    Sandi-Urena, Santiago; Cooper, Melanie; Stevens, Ron

    2012-01-01

    While most scientists agree that laboratory work is an important part of introductory science courses, there is scant evidence for the relationship between laboratory work and student learning, particularly at the college level. This work reports the quantitative component of a mixed-methods study of the effect of cooperative problem-based…

  16. Statistical differences between relative quantitative molecular fingerprints from microbial communities.

    PubMed

    Portillo, M C; Gonzalez, J M

    2008-08-01

    Molecular fingerprints of microbial communities are a common method for the analysis and comparison of environmental samples. The significance of differences between microbial community fingerprints was analyzed considering the presence of different phylotypes and their relative abundance. A method is proposed that simulates coverage of the analyzed communities as a function of sampling size, applying a Cramér-von Mises statistic. Comparisons were performed by a Monte Carlo testing procedure. As an example, this procedure was used to compare several sediment samples from freshwater ponds using a relative quantitative PCR-DGGE profiling technique. The method was able to discriminate among different samples based on their molecular fingerprints, and confirmed the lack of differences between aliquots from a single sample.
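
    The core of such a procedure (a Cramér-von Mises-type statistic evaluated against a Monte Carlo permutation distribution) can be sketched as below; the simplified ECDF-based statistic and the invented band-intensity profiles stand in for the paper's coverage-based construction.

      import numpy as np

      def cvm_statistic(x, y):
          """Cramér-von Mises-type distance between two empirical CDFs."""
          grid = np.sort(np.concatenate([x, y]))
          Fx = np.searchsorted(np.sort(x), grid, side="right") / len(x)
          Fy = np.searchsorted(np.sort(y), grid, side="right") / len(y)
          return np.sum((Fx - Fy) ** 2)

      def monte_carlo_pvalue(x, y, n_perm=999, seed=1):
          """Permutation test: how often a random relabelling beats the observed."""
          rng = np.random.default_rng(seed)
          observed = cvm_statistic(x, y)
          pooled = np.concatenate([x, y])
          hits = 0
          for _ in range(n_perm):
              rng.shuffle(pooled)
              if cvm_statistic(pooled[:len(x)], pooled[len(x):]) >= observed:
                  hits += 1
          return (hits + 1) / (n_perm + 1)

      # Hypothetical relative band intensities from two DGGE fingerprints.
      lane_a = np.array([0.30, 0.25, 0.20, 0.15, 0.10])
      lane_b = np.array([0.45, 0.30, 0.15, 0.05, 0.05])
      print(monte_carlo_pvalue(lane_a, lane_b))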

  17. Mapping of thermal injury in biologic tissues using quantitative pathologic techniques

    NASA Astrophysics Data System (ADS)

    Thomsen, Sharon L.

    1999-05-01

    Qualitative and quantitative pathologic techniques can be used for (1) mapping of thermal injury, (2) comparison of lesion sizes and configurations for different instruments or heating sources, and (3) comparison of treatment effects. Concentric zones of thermal damage form around a single volume heat source. The boundaries between some of these zones are distinct and measurable. Depending on the energy deposition, heating times and tissue type, the zones can include the following, beginning at the hotter center and progressing to the cooler periphery: (1) tissue ablation, (2) carbonization, (3) tissue water vaporization, (4) structural protein denaturation (thermal coagulation), (5) vital enzyme protein denaturation, (6) cell membrane disruption, (7) hemorrhage, hemostasis and hyperemia, (8) tissue necrosis and (9) wound organization and healing.

  18. 2D problems of surface growth theory with applications to additive manufacturing

    NASA Astrophysics Data System (ADS)

    Manzhirov, A. V.; Mikhin, M. N.

    2018-04-01

    We study 2D problems of surface growth theory of deformable solids and their applications to the analysis of the stress-strain state of AM fabricated products and structures. Statements of the problems are given, and a solution method based on the approaches of the theory of functions of a complex variable is suggested. Computations are carried out for model problems. Qualitative and quantitative results are discussed.

  19. Safe uses of Hill's model: an exact comparison with the Adair-Klotz model

    PubMed Central

    2011-01-01

    Background: The Hill function and the related Hill model are used frequently to study processes in the living cell. There are very few studies investigating the situations in which the model can be safely used. For example, it has been shown, at the mean field level, that the dose response curve obtained from a Hill model agrees well with the dose response curves obtained from a more complicated Adair-Klotz model, provided that the parameters of the Adair-Klotz model describe strongly cooperative binding. However, it has not been established whether such findings can be extended to other properties and non-mean field (stochastic) versions of the same, or other, models. Results: In this work a rather generic quantitative framework for approaching such a problem is suggested. The main idea is to focus on comparing the particle number distribution functions for Hill's and Adair-Klotz's models instead of investigating a particular property (e.g. the dose response curve). The approach is valid for any model that can be mathematically related to the Hill model. The Adair-Klotz model is used to illustrate the technique. One main and two auxiliary similarity measures were introduced to compare the distributions in a quantitative way. Both time dependent and equilibrium properties of the similarity measures were studied. Conclusions: A strongly cooperative Adair-Klotz model can be replaced by a suitable Hill model in such a way that any property computed from the two models, even one describing stochastic features, is approximately the same. The quantitative analysis showed that boundaries of the regions in the parameter space where the models behave in the same way exhibit a rather rich structure. PMID:21521501
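
    At the mean field level the agreement is easy to reproduce: for strongly cooperative two-site Adair-Klotz parameters, a Hill curve with n near the number of sites and a matched midpoint nearly coincides with the Adair-Klotz saturation curve. The parameter values below are illustrative assumptions.

      import numpy as np

      def hill(x, K, n):
          """Hill fractional saturation."""
          return x**n / (K**n + x**n)

      def adair_klotz_two_site(x, K1, K2):
          """Adair-Klotz fractional saturation for two sequential binding steps."""
          num = K1 * x + 2 * K1 * K2 * x**2
          den = 2 * (1 + K1 * x + K1 * K2 * x**2)
          return num / den

      x = np.linspace(0.0, 5.0, 200)
      # Strong cooperativity: the second binding step is much tighter (K2 >> K1).
      y_ak = adair_klotz_two_site(x, K1=0.1, K2=100.0)
      y_h = hill(x, K=(0.1 * 100.0) ** -0.5, n=2)   # matched midpoint, n = 2 sites
      print(np.max(np.abs(y_ak - y_h)))             # stays small under strong cooperativity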

  20. Using Clinical Data, Hypothesis Generation Tools and PubMed Trends to Discover the Association between Diabetic Retinopathy and Antihypertensive Drugs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Senter, Katherine G; Sukumar, Sreenivas R; Patton, Robert M

    Diabetic retinopathy (DR) is a leading cause of blindness and a common complication of diabetes. Many diabetic patients take antihypertensive drugs to prevent cardiovascular problems, but these drugs may have unintended consequences on eyesight. Six common classes of antihypertensive drug are angiotensin converting enzyme (ACE) inhibitors, alpha blockers, angiotensin receptor blockers (ARBs), β-blockers, calcium channel blockers, and diuretics. Analysis of medical history data might indicate which of these drugs provide safe blood pressure control, and a literature review is often used to guide such analyses. Beyond manual reading of relevant publications, we sought to identify quantitative trends in literature from the biomedical database PubMed to compare with quantitative trends in the clinical data. By recording and analyzing PubMed search results, we found wide variation in the prevalence of each antihypertensive drug in DR literature. Drug classes developed more recently, such as ACE inhibitors and ARBs, were most prevalent. We also identified instances of change over time in publication patterns. We then compared these literature trends to a dataset of 500 diabetic patients from the UT Hamilton Eye Institute. Data for each patient included class of antihypertensive drug and presence and severity of DR. Graphical comparison revealed that older drug classes such as diuretics, calcium channel blockers, and β-blockers were much more prevalent in the clinical data than in the DR and antihypertensive literature. Finally, quantitative analysis of the dataset revealed that patients taking β-blockers were statistically more likely to have DR than patients taking other medications, controlling for presence of hypertension and year of diabetes onset. This finding was concerning given the prevalence of β-blockers in the clinical data. We determined that clinical use of β-blockers should be minimized in diabetic patients to prevent retinal damage.

  1. A quantitative analysis of municipal solid waste disposal charges in China.

    PubMed

    Wu, Jian; Zhang, Weiqian; Xu, Jiaxuan; Che, Yue

    2015-03-01

    Rapid industrialization and economic development have caused a tremendous increase in municipal solid waste (MSW) generation in China. China began implementing a policy of MSW disposal fees for household waste management at the end of the last century. Three charging methods were implemented throughout the country: a fixed disposal fee, a potable water-based disposal fee, and a plastic bag-based disposal fee. To date, there has been little qualitative or quantitative analysis of the effectiveness of this relatively new policy. This paper provides a general overview of MSW fee policy in China, attempts to verify whether the policy is successful in reducing general waste collected, and proposes an improved charging system to address current problems. The paper presents an empirical statistical analysis of policy effectiveness derived from an environmental Kuznets curve (EKC) test on panel data for China. EKC tests on different kinds of MSW charge systems were then examined for individual provinces or cities. A comparison of existing charging systems was conducted using environmental and economic criteria. The results indicate the following: (1) the MSW policies implemented over the study period were effective in the reduction of waste generation, (2) the household waste discharge fee policy did not act as a strong driver in terms of waste prevention and reduction, and (3) the plastic bag-based disposal fee appeared to be performing well according to qualitative and quantitative analysis. Based on the current situation of waste discharge management in China, a three-stage transitional charging scheme is proposed and both its advantages and drawbacks are discussed. Evidence suggests that a transition from a fixed disposal fee to a plastic bag-based disposal fee involving various stakeholders should be the next objective of waste reduction efforts.
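
    The EKC test referred to here amounts to regressing waste generation on income and its square and checking the signs of the coefficients: a positive linear and a negative quadratic term trace the inverted-U relationship. A sketch with invented panel-style numbers follows.

      import numpy as np
      import statsmodels.api as sm

      # Hypothetical per-capita income (thousand CNY) and MSW generated (kg/capita).
      income = np.array([8, 12, 18, 25, 33, 42, 55, 70, 90, 115], dtype=float)
      waste = np.array([180, 230, 290, 330, 355, 365, 360, 340, 310, 280], dtype=float)

      X = sm.add_constant(np.column_stack([np.log(income), np.log(income) ** 2]))
      fit = sm.OLS(np.log(waste), X).fit()
      print(fit.params)   # an EKC shows up as coef[1] > 0 together with coef[2] < 0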

  2. Statistical significance of trace evidence matches using independent physicochemical measurements

    NASA Astrophysics Data System (ADS)

    Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George

    1997-02-01

    A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to,' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when an object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated against these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals: Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons also is discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements and the potential for confusion arising from statistical analysis currently hamper this approach, and ways of overcoming these obstacles are presented.
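
    A generic chemometrics sketch of the clustering idea follows, using scikit-learn on invented fragment data for four of the measured variables (refractive index plus Mg, Al, Ca concentrations). It is not the authors' procedure, only a minimal example of grouping fragments after scaling the variables.

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import AgglomerativeClustering

    # Two hypothetical glass sources, measured on refractive index plus
    # three elemental concentrations (all values invented).
    rng = np.random.default_rng(1)
    src_a = rng.normal([1.5185, 3.6, 0.8, 8.5], [2e-4, 0.05, 0.02, 0.1], (10, 4))
    src_b = rng.normal([1.5168, 3.9, 0.5, 8.1], [2e-4, 0.05, 0.02, 0.1], (10, 4))
    fragments = np.vstack([src_a, src_b])

    # Scale each variable so the refractive index and the concentration
    # measurements contribute comparably, then cluster the fragments.
    scaled = StandardScaler().fit_transform(fragments)
    labels = AgglomerativeClustering(n_clusters=2).fit_predict(scaled)
    print(labels)   # fragments should group by source
    ```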

  3. Migration experiences, employment status and psychological distress among Somali immigrants: a mixed-method international study.

    PubMed

    Warfa, Nasir; Curtis, Sarah; Watters, Charles; Carswell, Ken; Ingleby, David; Bhui, Kamaldeep

    2012-09-07

    The discourse about mental health problems among migrants and refugees tends to focus on adverse pre-migration experiences; there is less investigation of the environmental conditions in which refugee migrants live, and the contrasts between these situations in different countries. This cross-national study of two samples of Somali refugees living in London (UK) and Minneapolis, Minnesota, (USA) helps to fill a gap in the literature, and is unusual in being able to compare information collected in the same way in two cities in different countries. There were two parts to the study, focus groups to gather in-depth qualitative data and a survey of health status and quantifiable demographic and material factors. Three of the focus groups involved nineteen Somali professionals and five groups included twenty-eight lay Somalis who were living in London and Minneapolis. The quantitative survey was done with 189 Somali respondents, also living in London and Minneapolis. We used the MINI International Neuropsychiatric Interview (MINI) to assess ICD-10 and DSM-IV mental disorders. The overall qualitative and quantitative results suggested that challenges to masculinity, thwarted aspirations, devalued refugee identity, unemployment, legal uncertainties and longer duration of stay in the host country account for poor psychological well-being and psychiatric disorders among this group. The use of a mixed-methods approach in this international study was essential since the quantitative and qualitative data provide different layers and depth of meaning and complement each other to provide a fuller picture of complex and multi-faceted life situations of refugees and asylum seekers. The comparison between the UK and US suggests that greater flexibility of access to labour markets for this refugee group might help to promote opportunities for better integration and mental well-being.

  4. MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giger, M; Petrick, N; Obuchowski, N

    The first two symposia in the Quantitative Imaging Track focused on 1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and 2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will focus on the introduction to metrology and why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that will allow statistically valid meta-analyses, which is critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: Understand the importance of metrology in the QI efforts. Understand appropriate methods for technical performance assessment. Understand methods for comparing algorithms with or without reference data (i.e., “ground truth”). Understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.

  5. Migration experiences, employment status and psychological distress among Somali immigrants: a mixed-method international study

    PubMed Central

    2012-01-01

    Background The discourse about mental health problems among migrants and refugees tends to focus on adverse pre-migration experiences; there is less investigation of the environmental conditions in which refugee migrants live, and the contrasts between these situations in different countries. This cross-national study of two samples of Somali refugees living in London (UK) and Minneapolis, Minnesota, (USA) helps to fill a gap in the literature, and is unusual in being able to compare information collected in the same way in two cities in different countries. Methods There were two parts to the study, focus groups to gather in-depth qualitative data and a survey of health status and quantifiable demographic and material factors. Three of the focus groups involved nineteen Somali professionals and five groups included twenty-eight lay Somalis who were living in London and Minneapolis. The quantitative survey was done with 189 Somali respondents, also living in London and Minneapolis. We used the MINI International Neuropsychiatric Interview (MINI) to assess ICD-10 and DSM-IV mental disorders. Results The overall qualitative and quantitative results suggested that challenges to masculinity, thwarted aspirations, devalued refugee identity, unemployment, legal uncertainties and longer duration of stay in the host country account for poor psychological well-being and psychiatric disorders among this group. Conclusion The use of a mixed-methods approach in this international study was essential since the quantitative and qualitative data provide different layers and depth of meaning and complement each other to provide a fuller picture of complex and multi-faceted life situations of refugees and asylum seekers. The comparison between the UK and US suggests that greater flexibility of access to labour markets for this refugee group might help to promote opportunities for better integration and mental well-being. PMID:22954304

  6. A comparison study of image features between FFDM and film mammogram images

    PubMed Central

    Jing, Hao; Yang, Yongyi; Wernick, Miles N.; Yarusso, Laura M.; Nishikawa, Robert M.

    2012-01-01

    Purpose: This work provides a direct, quantitative comparison of image features measured by film and full-field digital mammography (FFDM). The purpose is to investigate whether there is any systematic difference between film and FFDM in terms of quantitative image features and their influence on the performance of a computer-aided diagnosis (CAD) system. Methods: The authors make use of a set of matched film-FFDM image pairs acquired from cadaver breast specimens with simulated microcalcifications consisting of bone and teeth fragments using both a GE digital mammography system and a screen-film system. To quantify the image features, the authors consider a set of 12 textural features of lesion regions and six image features of individual microcalcifications (MCs). The authors first conduct a direct comparison of these quantitative features extracted from film and FFDM images. The authors then study the performance of a CAD classifier for discriminating between MCs and false positives (FPs) when the classifier is trained on images of different types (film, FFDM, or both). Results: For all the features considered, the quantitative results show a high degree of correlation between features extracted from film and FFDM, with the correlation coefficients ranging from 0.7326 to 0.9602 for the different features. Based on a Fisher sign rank test, there was no significant difference observed between the features extracted from film and those from FFDM. For both MC detection and discrimination of FPs from MCs, FFDM had a slight but statistically significant advantage in performance; however, when the classifiers were trained on different types of images (acquired with FFDM or screen-film mammography, SFM) for discriminating MCs from FPs, there was little difference. Conclusions: The results indicate good agreement between film and FFDM in quantitative image features. While FFDM images provide better detection performance for MCs, FFDM and film images may be interchangeable for the purposes of training CAD algorithms, and a single CAD algorithm may be applied to either type of image.

  7. Double-exponential decay of orientational correlations in semiflexible polyelectrolytes.

    PubMed

    Bačová, P; Košovan, P; Uhlík, F; Kuldová, J; Limpouchová, Z; Procházka, K

    2012-06-01

    In this paper we revisited the problem of the persistence length of polyelectrolytes. We performed a series of Molecular Dynamics simulations using the Debye-Hückel approximation for electrostatics to test several equations which go beyond the classical description of Odijk, Skolnick and Fixman (OSF). The data confirm earlier observations that in the limit of large contour separations the decay of orientational correlations can be described by a single-exponential function and the decay length can be described by the OSF relation. However, at short contour separations the behaviour is more complex. Recent equations which introduce more complicated expressions and an additional length scale could describe the results very well on both the short and the long length scale. The equation of Manghi and Netz when used without adjustable parameters could capture the qualitative trend but deviated in a quantitative comparison. Better quantitative agreement within the estimated error could be obtained using three equations with one adjustable parameter: 1) the equation of Manghi and Netz; 2) the equation proposed by us in this paper; 3) the equation proposed by Cannavacciuolo and Pedersen. Two characteristic length scales can be identified in the data: the intrinsic or bare persistence length and the electrostatic persistence length. All three equations use a single parameter to describe a smooth crossover from the short-range behaviour dominated by the intrinsic stiffness of the chain to the long-range OSF-like behaviour.

  8. Heat-Treatment-Responsive Proteins in Different Developmental Stages of Tomato Pollen Detected by Targeted Mass Accuracy Precursor Alignment (tMAPA).

    PubMed

    Chaturvedi, Palak; Doerfler, Hannes; Jegadeesan, Sridharan; Ghatak, Arindam; Pressman, Etan; Castillejo, Maria Angeles; Wienkoop, Stefanie; Egelhofer, Volker; Firon, Nurit; Weckwerth, Wolfram

    2015-11-06

    Recently, we have developed a quantitative shotgun proteomics strategy called mass accuracy precursor alignment (MAPA). The MAPA algorithm uses high mass accuracy to bin mass-to-charge (m/z) ratios of precursor ions from LC-MS analyses, determines their intensities, and extracts a quantitative sample versus m/z ratio data alignment matrix from a multitude of samples. Here, we introduce a novel feature of this algorithm that allows the extraction and alignment of proteotypic peptide precursor ions or any other target peptide from complex shotgun proteomics data for accurate quantification of unique proteins. This strategy circumvents the problem of confusing the quantification of proteins due to indistinguishable protein isoforms by a typical shotgun proteomics approach. We applied this strategy to a comparison of control and heat-treated tomato pollen grains at two developmental stages, post-meiotic and mature. Pollen is a temperature-sensitive tissue involved in the reproductive cycle of plants and plays a major role in fruit setting and yield. By LC-MS-based shotgun proteomics, we identified more than 2000 proteins in total for all different tissues. By applying the targeted MAPA data-processing strategy, 51 unique proteins were identified as heat-treatment-responsive protein candidates. The potential function of the identified candidates in a specific developmental stage is discussed.
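
    A much-simplified sketch of the MAPA-style alignment idea follows: precursor m/z values that agree within a ppm tolerance are collapsed into the same bin, and intensities are arranged into a sample-versus-m/z matrix. The inputs and the binning scheme (constant-ppm bins on a log scale) are assumptions for illustration, not the published algorithm.

    ```python
    import numpy as np
    import pandas as pd

    # Toy input: each run is a list of (precursor m/z, intensity) pairs.
    runs = {
        "control_1": [(445.2301, 1.2e6), (512.7719, 8.0e5)],
        "heat_1":    [(445.2303, 3.1e6), (512.7721, 7.6e5)],
    }

    def mz_bin(mz, tol_ppm=5.0):
        # Constant-ppm bins: equal width in log(m/z), so values that agree
        # within the instrument's mass accuracy fall in the same bin.
        return int(np.floor(np.log(mz) / np.log1p(tol_ppm * 1e-6)))

    rows = [(sample, mz_bin(mz), inten)
            for sample, peaks in runs.items() for mz, inten in peaks]
    matrix = (pd.DataFrame(rows, columns=["sample", "mz_bin", "intensity"])
                .pivot_table(index="sample", columns="mz_bin",
                             values="intensity", aggfunc="sum"))
    print(matrix)   # sample-vs-m/z alignment matrix
    ```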

  9. How we hear what is not there: A neural mechanism for the missing fundamental illusion

    NASA Astrophysics Data System (ADS)

    Chialvo, Dante R.

    2003-12-01

    How the brain estimates the pitch of a complex sound remains unsolved. Complex sounds are composed of more than one tone. When two tones occur together, a third lower pitched tone is often heard. This is referred to as the "missing fundamental illusion" because the perceived pitch is a frequency (fundamental) for which there is no actual source vibration. This phenomenon exemplifies a larger variety of problems related to how pitch is extracted from complex tones, music and speech, and thus has been extensively used to test theories of pitch perception. A noisy nonlinear process is presented here as a candidate neural mechanism to explain the majority of reported phenomenology and provide specific quantitative predictions. The two basic premises of this model are as follows: (I) the individual tones composing the complex tone add linearly, producing peaks of constructive interference whose amplitude is always insufficient to fire the neuron; (II) the spike threshold is reached only with noise, which naturally selects the maximum constructive interferences. The spacing of these maxima, and consequently the spikes, occurs at a rate identical to the perceived pitch for the complex tone. Comparison with psychophysical and physiological data reveals a remarkable quantitative agreement not dependent on adjustable parameters. In addition, results from numerical simulations across different models are consistent, suggesting relevance to other sensory modalities.
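
    A minimal numerical sketch of the two premises follows, assuming illustrative tone frequencies (800 and 1000 Hz, the 4th and 5th harmonics of a 200 Hz fundamental) and arbitrary noise and threshold settings. The deterministic sum never reaches the threshold; noise triggers "spikes" near the constructive-interference maxima, whose spacing matches the absent fundamental.

    ```python
    import numpy as np

    fs = 50_000
    t = np.arange(0, 1.0, 1 / fs)
    f1, f2 = 800, 1000    # harmonics of a 200 Hz missing fundamental
    signal = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

    rng = np.random.default_rng(1)
    noisy = signal + rng.normal(0.0, 0.3, t.size)

    # Threshold sits above the deterministic peak amplitude (2.0), so only
    # noise at the strongest constructive maxima produces crossings.
    threshold = 2.2
    crossings = t[1:][(noisy[1:] >= threshold) & (noisy[:-1] < threshold)]

    spikes = [crossings[0]]
    for c in crossings[1:]:
        if c - spikes[-1] > 0.001:    # crude 1 ms refractory period
            spikes.append(c)

    # Intervals cluster at multiples of 5 ms, the period of the absent
    # 200 Hz fundamental, i.e., the perceived pitch.
    isi = np.diff(spikes) * 1000
    print(f"median inter-spike interval: {np.median(isi):.1f} ms")
    ```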

  10. A New Conflict Resolution Method for Multiple Mobile Robots in Cluttered Environments With Motion-Liveness.

    PubMed

    Shahriari, Mohammadali; Biglarbegian, Mohammad

    2018-01-01

    This paper presents a new conflict resolution methodology for multiple mobile robots that ensures their motion-liveness, especially in cluttered and dynamic environments. Our method constructs a mathematical formulation in the form of an optimization problem, minimizing the overall travel times of the robots subject to resolving all the conflicts in their motion. This optimization problem can be solved easily by coordinating only the robots' speeds. To overcome the computational cost of executing the algorithm in very cluttered environments, we develop an innovative method that clusters the environment into independent subproblems that can be solved using parallel programming techniques. We demonstrate the scalability of our approach through extensive simulations. Simulation results showed that our proposed method is capable of resolving the conflicts of 100 robots in less than 1.23 s in a cluttered environment that has 4357 intersections in the paths of the robots. We also developed an experimental testbed and demonstrated that our approach can be implemented in real time. We finally compared our approach with other existing methods in the literature, both quantitatively and qualitatively. This comparison shows that our approach is not only mathematically sound but also more computationally efficient, scalable to a very large number of robots, and able to guarantee the live and smooth motion of the robots.
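
    The paper's formulation spans many robots and conflicts; the sketch below is a toy single-intersection version with a fixed passing order (robot 1 first), solved as a linear program over arrival times with scipy.optimize.linprog. All distances, speed limits, and the safety gap are invented for illustration.

    ```python
    from scipy.optimize import linprog

    # Toy speed-only conflict resolution at one shared intersection.
    # Variables: arrival times t1, t2 of the two robots at the conflict
    # point; minimizing t1 + t2 stands in for minimizing travel times.
    d1, d2 = 3.0, 4.0          # distance to the conflict point (m)
    v_max, gap = 1.5, 1.0      # speed limit (m/s), safety gap (s)

    res = linprog(c=[1, 1],                      # minimize total arrival time
                  A_ub=[[1, -1]], b_ub=[-gap],   # t1 - t2 <= -gap, i.e. t2 >= t1 + gap
                  bounds=[(d1 / v_max, None), (d2 / v_max, None)])
    t1, t2 = res.x
    print(f"t1={t1:.2f}s t2={t2:.2f}s  speeds: {d1/t1:.2f}, {d2/t2:.2f} m/s")
    ```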

  11. The problem with coal-waste dumps inventory in Upper Silesian Coal Basin

    NASA Astrophysics Data System (ADS)

    Abramowicz, Anna; Chybiorz, Ryszard

    2017-04-01

    Coal-waste dumps are a side effect of coal mining, which has been carried out in Poland for 250 years. They have a negative influence on the landscape and the environment, polluting soil, vegetation and groundwater. Their number, size and shape change over time as new wastes are produced and deposited, enlarging existing dumps and altering their shape. Moreover, deposited wastes, especially overburnt material, are excavated for uses such as road construction, which also changes dump shape and size, sometimes to the point of complete disappearance. Many databases and inventory systems were created in order to control these hazards, but their shortcomings prevent reliable statistics. Three representative databases were analyzed with respect to their structure and their description, classification and visualization of waste dumps. The main problem is the correct classification of dumps in terms of name and type. An additional difficulty is accurate quantitative description (area and capacity). A comprehensive database was created by comparing and verifying the information contained in the existing databases and supplementing it from separate documentation. A variability analysis of coal-waste dumps over time is also included. The project has been financed from the funds of the Leading National Research Centre (KNOW) received by the Centre for Polar Studies for the period 2014-2018.

  12. Fetus dose estimation in thyroid cancer post-surgical radioiodine therapy.

    PubMed

    Mianji, Fereidoun A; Diba, Jila Karimi; Babakhani, Asad

    2015-01-01

    Unrecognised pregnancy during radioisotope therapy of thyroid cancer results in embryo/fetus exposures that are hard to quantify, particularly when the thyroid gland has already been removed. Sources of such difficulty include uncertainty in data such as the time the pregnancy commenced, the amount and distribution of metastasized thyroid cells in the body, and the effect of the thyroidectomy on the fetus dose coefficient. Despite all these uncertainties, estimating the order of magnitude of the fetus dose is in most cases enough for medical and legal decision-making purposes. A model for adapting the dose coefficients recommended by the well-known methods to the problem of fetus dose assessment in athyrotic patients is proposed. The model defines a correction factor for the problem and ensures that the fetus dose in athyrotic pregnant patients is less than in normal patients. A case of a pregnant patient who underwent post-surgical I-131 therapy is then studied for quantitative comparison of the methods. The results draw a range for the fetus dose in athyrotic patients using the derived factor. This reduces concerns about under- or over-estimation of the embryo/fetus dose and is helpful for personal and/or legal decision-making on abortion. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  13. An experimental evaluation of the effect of homogenization quality as a preconditioning on oil-water two-phase volume fraction measurement accuracy using gamma-ray attenuation technique

    NASA Astrophysics Data System (ADS)

    Sharifzadeh, M.; Hashemabadi, S. H.; Afarideh, H.; Khalafi, H.

    2018-02-01

    The problem of how to accurately measure multiphase flow in the oil and gas industry has remained important since the early 1980s, and oil-water two-phase flow rate measurement in particular has been regarded as a key issue. Gamma-ray attenuation is one of the most commonly used methods for phase fraction measurement, but it is strongly dependent on flow regime variations. One strategy for removing the regime-dependency problem is to use a homogenization system as a preconditioning tool, as this work demonstrates. First, a two-phase flow homogenizer loop (TPFHL) is introduced and verified by quantitative assessment. Next, a static-equivalent multiphase flow (SEMPF) system with an additional capability for preparing a uniform mixture is described. The idea behind this system was verified by Monte Carlo simulations. Finally, different water-gas oil two-phase volume fractions were fed to the homogenizer loop and injected into the static-equivalent system. A comparison of the performance of these two systems using the gamma-ray attenuation technique showed not only an additional ability to prepare a homogenized mixture but also a remarkably increased measurement accuracy for the static-equivalent system.
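
    For context, the sketch below shows the standard Beer-Lambert relation behind single-energy gamma-ray phase fraction measurement on a homogenized mixture: the log of the transmitted count is linear in the water volume fraction, given full-pipe calibration counts for each pure phase. The counts are hypothetical.

    ```python
    import numpy as np

    # Hypothetical calibration counts for the pipe filled entirely with
    # oil or water, and a measured count for the homogenized mixture.
    I_oil, I_water, I_mix = 12000.0, 9000.0, 10400.0

    # Beer-Lambert: I = I0 * exp(-D * (f*mu_w + (1-f)*mu_o)), so
    # f = ln(I_oil / I_mix) / ln(I_oil / I_water).
    water_fraction = np.log(I_oil / I_mix) / np.log(I_oil / I_water)
    print(f"water volume fraction: {water_fraction:.2f}")
    ```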

  14. Perceived Child Behavior Problems, Parenting Stress, and Maternal Depressive Symptoms Among Prenatal Methamphetamine Users

    PubMed Central

    Newman, Elana; LaGasse, Linda L.; Derauf, Chris; Shah, Rizwan; Smith, Lynne M.; Arria, Amelia M.; Huestis, Marilyn A.; Haning, William; Strauss, Arthur; DellaGrotta, Sheri; Dansereau, Lynne M.; Neal, Charles; Lester, Barry M.

    2013-01-01

    The present study was designed to examine parenting stress, maternal depressive symptoms, and perceived child behavior problems among mothers who used methamphetamine (MA) during pregnancy. Participants were a subsample (n = 212; 75 exposed, 137 comparison) of biological mothers who had continuous custody of their child from birth to 36 months. The subsample was drawn from a larger, ongoing longitudinal study on the effects of prenatal methamphetamine exposure (n = 412; 204 exposed, 208 comparison) (Arria et al in Matern Child Health J 10:293–302 2006). Mothers who used MA during pregnancy reported more parenting stress and more depressive symptoms than a matched comparison group. There were no differences between groups on perceived child behavior problems. In a hierarchical linear model, depressive symptoms, and perceived child behavior problems, but not MA exposure, were statistically significant predictors of parenting stress. Screening for potential parenting problems among mothers with a history of substance abuse is warranted. Parenting interventions targeting depressive symptoms, parenting stress, and child behavior problems are needed for this population. PMID:22552952

  15. Attention problems and academic achievement: Do persistent and earlier-emerging problems have more adverse long-term effects?

    PubMed Central

    Rabiner, David L.; Carrig, Madeline; Dodge, Kenneth A.

    2013-01-01

    This study examined whether the negative association between children’s attention difficulties and their academic functioning is largely confined to children whose attention problems persist across early grades and whether it depends on when attention problems emerge in children’s schooling. Children from the normative sample of the Fast Track study were classified into four attention problem groups based on the presence vs. absence of attention problems in first and second grade. Those with attention problems in both grades showed a decline in reading and math achievement during the K-5 interval relative to children with attention problems in first grade only. Both groups of inattentive first graders also performed worse than comparison children. In contrast, children whose attention problems emerged in second grade did not differ from comparison children on any achievement outcome and performed significantly better than inattentive first graders. The implications of these findings are discussed. PMID:24141101

  16. Diagnostic accuracy of stress perfusion CMR in comparison with quantitative coronary angiography: fully quantitative, semiquantitative, and qualitative assessment.

    PubMed

    Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E

    2014-01-01

    This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). Fully quantitative stress perfusion CMR has high diagnostic accuracy for detecting obstructive coronary artery disease. QP outperforms semiquantitative measures of perfusion and qualitative methods that incorporate a combination of cine, perfusion, and late gadolinium enhancement imaging. These findings suggest a potential clinical role for quantitative stress perfusion CMR. Copyright © 2014 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
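
    A minimal sketch of the reported QP decision rule follows: a sector is flagged as ischemic when endocardial flow falls below 50% of the mean epicardial flow. The per-sector flow values are invented for illustration; only the threshold comes from the abstract.

    ```python
    import numpy as np

    # Hypothetical per-sector stress flow estimates from a quantitative
    # perfusion (QP) map (arbitrary units; values invented).
    endo = np.array([1.1, 0.9, 0.4, 1.2, 1.0, 0.5])   # endocardial sectors
    epi  = np.array([1.2, 1.1, 1.0, 1.3, 1.1, 1.1])   # epicardial sectors

    # Study threshold: ischemic when endocardial flow < 50% of the
    # mean epicardial flow.
    ischemic = endo < 0.5 * epi.mean()
    print(ischemic)
    ```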

  17. Towards Extending Forward Kinematic Models on Hyper-Redundant Manipulator to Cooperative Bionic Arms

    NASA Astrophysics Data System (ADS)

    Singh, Inderjeet; Lakhal, Othman; Merzouki, Rochdi

    2017-01-01

    Forward kinematics is a stepping stone towards finding an inverse solution and subsequently a dynamic model of a robot. Hence a study and comparison of various Forward Kinematic Models (FKMs) is necessary for robot design. This paper compares three FKMs on the same hyper-redundant Compact Bionic Handling Assistant (CBHA) manipulator under the same conditions. The aim of this study is to inform the modeling of cooperative bionic manipulators. Two of the methods are quantitative, the Arc Geometry HTM (Homogeneous Transformation Matrix) method and the Dual Quaternion method, while the third is a Hybrid method that combines quantitative and qualitative approaches. The methods are compared theoretically, and experimental results are discussed to add further insight to the comparison. HTM, the most widely used and accepted technique, is taken as the reference, and the trajectory deviations of the other techniques are compared against it. The comparison identifies which method yields an accurate kinematic behavior of the CBHA under real-time control.
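
    For readers unfamiliar with the HTM formalism, the sketch below shows the generic chaining of 4x4 homogeneous transformation matrices that all such forward kinematic models build on; it is not the CBHA-specific arc-geometry model, and all rotations and translations are invented.

    ```python
    import numpy as np

    def htm(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
        """Build a 4x4 homogeneous transformation matrix."""
        T = np.eye(4)
        T[:3, :3] = rotation
        T[:3, 3] = translation
        return T

    def rot_z(theta: float) -> np.ndarray:
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

    # Forward kinematics chains per-section transforms by matrix product:
    # base -> tip = T1 @ T2 (illustrative values for two sections).
    T1 = htm(rot_z(0.3), np.array([0.0, 0.0, 0.1]))
    T2 = htm(rot_z(-0.2), np.array([0.0, 0.05, 0.1]))
    tip = T1 @ T2
    print(tip[:3, 3])   # tip position expressed in the base frame
    ```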

  18. A comparison of two IPv4/IPv6 transition mechanisms - OpenVPN and IVI

    NASA Astrophysics Data System (ADS)

    Vu, Cong Tuan; Tran, Quang Anh; Jiang, Frank

    2012-09-01

    This document presents a comparison of two IPv4/IPv6 transition mechanisms, OpenVPN and IVI. While OpenVPN is based on tunneling technology, IVI is a stateless IPv4/IPv6 translation technique developed by the China Education and Research Network (CERNET). This research focuses on the quantitative and qualitative comparison of these two mechanisms: how they are applied in practical situations by Internet Service Providers, as well as their advantages and drawbacks.

  19. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  20. QR-STEM: Energy and Environment as a Context for Improving QR and STEM Understandings of 6-12 Grade Teachers II. The Quantitative Reasoning

    NASA Astrophysics Data System (ADS)

    Mayes, R.; Lyford, M. E.; Myers, J. D.

    2009-12-01

    The Quantitative Reasoning in STEM (QR STEM) project is a state level Mathematics and Science Partnership Project (MSP) with a focus on the mathematics and statistics that underlies the understanding of complex global scientific issues. This session is a companion session to the QR STEM: The Science presentation. The focus of this session is the quantitative reasoning aspects of the project. As students move from local to global perspectives on issues of energy and environment, there is a significant increase in the need for mathematical and statistical conceptual understanding. These understandings must be accessible to the students within the scientific context, requiring the special understandings that are endemic within quantitative reasoning. The QR STEM project brings together interdisciplinary teams of higher education faculty and middle/high school teachers to explore complex problems in energy and environment. The disciplines include life sciences, physics, chemistry, earth science, statistics, and mathematics. These interdisciplinary teams develop open-ended performance tasks to implement in the classroom, based on scientific concepts that underpin energy and environment. Quantitative reasoning is broken down into three components: Quantitative Literacy, Quantitative Interpretation, and Quantitative Modeling. Quantitative Literacy is composed of arithmetic concepts such as proportional reasoning, numeracy, and descriptive statistics. Quantitative Interpretation includes algebraic and geometric concepts that underlie the ability to interpret a model of natural phenomena which is provided for the student. This model may be a table, graph, or equation from which the student is to make predictions or identify trends, or from which they would use statistics to explore correlations or patterns in data. Quantitative Modeling is the ability to develop the model from data, including the ability to test hypotheses using statistical procedures. We use the term model very broadly, so it includes visual models such as box models, as well as best-fit equation models and hypothesis testing. One of the powerful outcomes of the project is the conversation which takes place between science teachers and mathematics teachers. First, they realize that, though they are teaching concepts that cross their disciplines, the barrier of scientific language within their subjects restricts students from applying the concepts across subjects. Second, the mathematics teachers discover the context of science as a means of providing real world situations that engage students in the utility of mathematics as a tool for solving problems. Third, the science teachers discover the barrier to understanding science that is presented by poor quantitative reasoning ability. Finally, the students are engaged in exploring energy and environment in a manner which exposes the importance of seeing a problem from multiple interdisciplinary perspectives. The outcome is a democratic citizen capable of making informed decisions, and perhaps a future scientist.

  1. Learning biology through connecting mathematics to scientific mechanisms: Student outcomes and teacher supports

    NASA Astrophysics Data System (ADS)

    Schuchardt, Anita

    Integrating mathematics into science classrooms has been part of the conversation in science education for a long time. However, studies on student learning after incorporating mathematics into the science classroom have shown mixed results. Understanding the mixed effects of including mathematics in science has been hindered by a historical focus on characteristics of integration tangential to student learning (e.g., shared elements, extent of integration). A new framework is presented emphasizing the epistemic role of mathematics in science. An epistemic role of mathematics missing from the current literature is identified: use of mathematics to represent scientific mechanisms, Mechanism Connected Mathematics (MCM). Building on prior theoretical work, it is proposed that having students develop mathematical equations that represent scientific mechanisms could elevate their conceptual understanding and quantitative problem solving. Following design and implementation of an MCM unit in inheritance, a large-scale quantitative analysis of pre- and post-implementation test results showed that MCM students (compared to traditionally instructed students) had significantly greater gains in conceptual understanding of mathematically modeled scientific mechanisms and in their ability to solve complex quantitative problems. To gain insight into the mechanism behind the gain in quantitative problem solving, a small-scale qualitative study was conducted of two contrasting groups: 1) within-MCM instruction: competent versus struggling problem solvers, and 2) within-competent problem solvers: MCM instructed versus traditionally instructed. Competent MCM students tended to connect their mathematical inscriptions to the scientific phenomenon and to switch between mathematical and scientifically productive approaches during problem solving in potentially productive ways. The other two groups did not. To address concerns about teacher capacity presenting barriers to scalability of MCM approaches, the types and amount of teacher support needed to achieve these types of student learning gains were investigated. In the context of providing teachers with access to educative materials, students achieved learning gains in both areas in the absence of face-to-face teacher professional development. However, maximal student learning gains required the investment of face-to-face professional development. This finding can govern distribution of scarce resources, but does not preclude implementation of MCM instruction even where resource availability does not allow for face-to-face professional development.

  2. Simplified neutrosophic sets and their applications in multi-criteria group decision-making problems

    NASA Astrophysics Data System (ADS)

    Peng, Juan-juan; Wang, Jian-qiang; Wang, Jing; Zhang, Hong-yu; Chen, Xiao-hong

    2016-07-01

    As a variation of fuzzy sets and intuitionistic fuzzy sets, neutrosophic sets have been developed to represent uncertain, imprecise, incomplete and inconsistent information that exists in the real world. Simplified neutrosophic sets (SNSs) have been proposed for the main purpose of addressing issues with a set of specific numbers. However, there are certain problems regarding the existing operations of SNSs, as well as their aggregation operators and the comparison methods. Therefore, this paper defines the novel operations of simplified neutrosophic numbers (SNNs) and develops a comparison method based on the related research of intuitionistic fuzzy numbers. On the basis of these operations and the comparison method, some SNN aggregation operators are proposed. Additionally, an approach for multi-criteria group decision-making (MCGDM) problems is explored by applying these aggregation operators. Finally, an example to illustrate the applicability of the proposed method is provided and a comparison with some other methods is made.
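
    As a rough illustration of working with simplified neutrosophic numbers (SNNs), the sketch below represents an SNN as a (truth, indeterminacy, falsity) triple and ranks two of them with a score function. The score formula shown is one commonly cited form in the SNS literature and is not necessarily the exact comparison method proposed in this paper.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class SNN:
        t: float  # truth membership
        i: float  # indeterminacy membership
        f: float  # falsity membership

        def score(self) -> float:
            # A commonly cited score function for ranking SNNs; larger is
            # better. Assumed here for illustration only.
            return (2.0 + self.t - self.i - self.f) / 3.0

    a = SNN(0.7, 0.2, 0.1)
    b = SNN(0.6, 0.1, 0.2)
    print(max((a, b), key=SNN.score))   # the preferred alternative
    ```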

  3. Amygdala hypoactivity to fearful faces in boys with conduct problems and callous-unemotional traits.

    PubMed

    Jones, Alice P; Laurens, Kristin R; Herba, Catherine M; Barker, Gareth J; Viding, Essi

    2009-01-01

    Although early-onset conduct problems predict both psychiatric and health problems in adult life, little research has been done to index neural correlates of conduct problems. Emerging research suggests that a subgroup of children with conduct problems and elevated levels of callous-unemotional traits may be genetically vulnerable to manifesting disturbances in neural reactivity to emotional stimuli indexing distress. Using functional MRI, the authors evaluated differences in neural response to emotional stimuli between boys with conduct problems and elevated levels of callous-unemotional traits and comparison boys. Seventeen boys with conduct problems and elevated levels of callous-unemotional traits and 13 comparison boys of equivalent age (mean=11 years) and IQ (mean=100) viewed blocked presentations of fearful and neutral faces. For each face, participants distinguished the sex of the face via manual response. Relative to the comparison group, boys with conduct problems and elevated levels of callous-unemotional traits manifested lesser right amygdala activity to fearful faces. This finding is in line with data from studies of adults with antisocial behavior and callous-unemotional traits (i.e., psychopaths), as well as from a recent study of adolescents with callous-unemotional traits, and suggests that the neural substrates of emotional impairment associated with callous-unemotional antisocial behavior are already present in childhood.

  4. Epigenetics as a First Exit Problem

    NASA Astrophysics Data System (ADS)

    Aurell, E.; Sneppen, K.

    2002-01-01

    We develop a framework to discuss the stability of epigenetic states as first exit problems in dynamical systems with noise. We consider in particular the stability of the lysogenic state of the λ prophage. The formalism defines a quantitative measure of robustness of inherited states.
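
    The sketch below simulates a generic first exit problem of the kind the abstract invokes: overdamped dynamics in a double-well potential with noise, where the mean first exit time grows rapidly as the noise shrinks (Kramers-type scaling). It is a generic illustration under assumed dynamics, not the paper's λ-prophage model.

    ```python
    import numpy as np

    # Euler-Maruyama for dx = -V'(x) dt + sqrt(2 D) dW with the double-well
    # V(x) = x^4/4 - x^2/2, so -V'(x) = x - x^3. The left minimum (x = -1)
    # plays the role of a stable epigenetic state; exit occurs at the
    # barrier top (x = 0).
    rng = np.random.default_rng(2)

    def first_exit_time(D, dt=1e-3, x0=-1.0, barrier=0.0):
        x, t = x0, 0.0
        while x < barrier:
            x += (x - x**3) * dt + np.sqrt(2 * D * dt) * rng.normal()
            t += dt
        return t

    for D in (0.15, 0.10, 0.07):
        times = [first_exit_time(D) for _ in range(20)]
        print(f"D={D}: mean first exit time {np.mean(times):.1f}")
    ```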

  5. Qualitative Differences in Real-Time Solution of Standardized Figural Analogies.

    ERIC Educational Resources Information Center

    Schiano, Diane J.; And Others

    Performance on standardized figural analogy tests is considered highly predictive of academic success. While information-processing models of analogy solution attribute performance differences to quantitative differences in processing parameters, the problem-solving literature suggests that qualitative differences in problem representation and…

  6. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  7. Quantitation of aflatoxins from corn and other food related materials by direct analysis in real time - mass spectrometry (DART-MS)

    USDA-ARS?s Scientific Manuscript database

    Ambient ionization coupled to mass spectrometry continues to be applied to new analytical problems, facilitating the rapid and convenient analysis of a variety of analytes. Recently, demonstrations of ambient ionization mass spectrometry applied to quantitative analysis of mycotoxins have been shown...

  8. A Developmental Perspective of Divergent Movement Ability in Early Young Children

    ERIC Educational Resources Information Center

    Zachopoulou, Evridiki; Makri, Anastasia

    2005-01-01

    Movement responses to a stimulus could be either quantitative or qualitative, or could also be the answer to a pre-established problem. This process activates both divergent thinking and critical thinking. Divergent movement ability generates both quantitative and qualitative movement responses to a stimulus. The aim of this study was to examine…

  9. A Quantitative Study of the Effectiveness of Teacher Recruitment Strategies in a Rural Midwestern State

    ERIC Educational Resources Information Center

    Kane, Rose Etta

    2010-01-01

    A problem in American education is that rural schools have difficulty recruiting licensed teachers. Teacher shortages in mathematics, science, foreign language, and special education are more acute in rural areas. The purpose of this quantitative descriptive survey study was to examine specific recruiting strategies and newly hired licensed…

  10. A Quantitative Comparative Study Measuring Consumer Satisfaction Based on Health Record Format

    ERIC Educational Resources Information Center

    Moore, Vivianne E.

    2013-01-01

    This research study used a quantitative comparative method to investigate the relationship between consumer satisfaction and communication based on the format of health record. The central problem investigated in this research study related to the format of health record used and consumer satisfaction with care provided and effect on communication…

  11. Faculty Grading of Quantitative Problems: A Mismatch between Values and Practice

    ERIC Educational Resources Information Center

    Petcovic, Heather L.; Fynewever, Herb; Henderson, Charles; Mutambuki, Jacinta M.; Barney, Jeffrey A.

    2013-01-01

    Grading practices can send a powerful message to students about course expectations. A study by Henderson et al. ("American Journal of Physics" 72:164-169, 2004) in physics education has identified a misalignment between what college instructors say they value and their actual scoring of quantitative student solutions. This work identified three…

  12. Integration of Social Sciences in Terrorism Modelling: Issues, Problems and Recommendations

    DTIC Science & Technology

    2007-02-01

    Qualitative social research: empirical data, patterns, regularities and case studies. Terrorism emergence: causes... quantitative and qualitative methods in studies of terrorism, mass violence and conflicts; suggested models of human behaviour response to the threat of... epistemology of social research, demographics, quantitative sociological research, qualitative social research, cultural studies, etc.) can contribute

  13. A Functional Model for the Integration of Gains and Losses under Risk: Implications for the Measurement of Subjective Value

    ERIC Educational Resources Information Center

    Viegas, Ricardo G.; Oliveira, Armando M.; Garriga-Trillo, Ana; Grieco, Alba

    2012-01-01

    In order to be treated quantitatively, subjective gains and losses (utilities/disutilities) must be psychologically measured. If legitimate comparisons are sought between them, measurement must be at least interval level, with a common unit. If comparisons of absolute magnitudes across gains and losses are further sought, as in standard…

  14. The Effects of Handwriting Instruction on Reading for Students in Grades 1 and 2

    ERIC Educational Resources Information Center

    Stroik, Linda R.

    2016-01-01

    The purpose of this quantitative quasi-experimental group comparison study using a repeated measures comparison group design with random assignment of subjects to groups was to investigate the effects of handwriting instruction on reading progress for learners in grade 1 and grade 2. At three points in time, the number of words each student read…

  15. Quantitative Analysis of the Shape of the Corpus Callosum in Patients with Autism and Comparison Individuals

    ERIC Educational Resources Information Center

    Casanova, Manuel F.; El-Baz, Ayman; Elnakib, Ahmed; Switala, Andrew E.; Williams, Emily L.; Williams, Diane L.; Minshew, Nancy J.; Conturo, Thomas E.

    2011-01-01

    Multiple studies suggest that the corpus callosum in patients with autism is reduced in size. This study attempts to elucidate the nature of this morphometric abnormality by analyzing the shape of this structure in 17 high-functioning patients with autism and an equal number of comparison participants matched for age, sex, IQ, and handedness. The…

  16. Systematic review of statistically-derived models of immunological response in HIV-infected adults on antiretroviral therapy in Sub-Saharan Africa.

    PubMed

    Sempa, Joseph B; Ujeneza, Eva L; Nieuwoudt, Martin

    2017-01-01

    In Sub-Saharan African (SSA) resource-limited settings, Cluster of Differentiation 4 (CD4) counts continue to be used for clinical decision making in antiretroviral therapy (ART). Here, HIV-infected people often remain with CD4 counts <350 cells/μL even after 5 years of viral load suppression, so ongoing immunological monitoring is necessary. Due to varying statistical modeling methods, comparing immune response to ART across different cohorts is difficult. We systematically review such models and detail the similarities, differences and problems. 'Preferred Reporting Items for Systematic Review and Meta-Analyses' guidelines were used. Only studies of immune response after ART initiation in adults from SSA were included. Data were extracted from each study and tabulated. Outcomes were categorized into 3 groups: 'slope', 'survival', and 'asymptote' models. Wordclouds were drawn in which the frequency of variables occurring in the reviewed models is indicated by their size and color. 69 covariates were identified in the final models of the 35 studies. Effect sizes of covariates were not directly quantitatively comparable, given the differing combinations of variables and scale transformation methods across models. Wordclouds enabled the identification of qualitative and semi-quantitative covariate sets for each outcome category. Comparison across categories identified sex, baseline age, baseline log viral load, baseline CD4, ART initiation regimen and ART duration as a minimal consensus set. Most models differed with respect to covariates included, variable transformations and scales, model assumptions, modelling strategies and reporting methods, even for the same outcomes. To enable comparison across cohorts, statistical models would benefit from more uniform modelling techniques. Historic efforts have produced results that are anecdotal to individual cohorts only. This study was able to define 'prior' knowledge in the Bayesian sense; such information has value for prospective modelling efforts.
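
    A minimal sketch of the wordcloud technique follows, using the third-party wordcloud package with frequencies invented for illustration (the covariate names echo the consensus set above, but the counts are not the study's data):

    ```python
    from wordcloud import WordCloud  # pip install wordcloud

    # Invented covariate frequencies across reviewed models; in the study,
    # size and color encode how often a covariate appears.
    freq = {"baseline CD4": 18, "baseline age": 15, "sex": 14,
            "baseline log viral load": 12, "ART regimen": 9,
            "ART duration": 8}

    wc = WordCloud(width=800, height=400, background_color="white")
    wc.generate_from_frequencies(freq).to_file("covariates.png")
    ```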

  17. An alternative method for analysis of food taints using stir bar sorptive extraction.

    PubMed

    Ridgway, Kathy; Lalljie, Sam P D; Smith, Roger M

    2010-09-10

    The determination of taints in food products currently can involve the use of several sample extraction techniques, including direct headspace (DHS), steam distillation extraction (SDE) and more recently solid phase microextraction (SPME). Each of these techniques has disadvantages, such as the use of large volumes of solvents (SDE), or limitations in sensitivity (DHS), or have only been applied to date for determination of individual or specific groups of tainting compounds (SPME). The use of stir bar sorptive extraction (SBSE) has been evaluated as a quantitative screening method for unknown tainting compounds in foods. A range of commonly investigated problem compounds, with a range of physical and chemical properties, were examined. The method was optimised to give the best response for the majority of compounds and the performance was evaluated by examining the accuracy, precision, linearity, limits of detection and quantitation and uncertainties for each analyte. For most compounds SBSE gave the lowest limits of detection compared to steam distillation extraction or direct headspace analysis and in general was better than these established techniques. However, for methyl methacrylate and hexanal no response was observed following stir bar extraction under the optimised conditions. The assays were carried out using a single quadrupole GC-MS in scan mode. A comparison of acquisition modes and instrumentation was performed using standards to illustrate the increase in sensitivity possible using more targeted ion monitoring or a more sensitive high resolution mass spectrometer. This comparison illustrated the usefulness of this approach as an alternative to specialised glassware or expensive instrumentation. SBSE in particular offers a 'greener' extraction method by a large reduction in the use of organic solvents and also minimises the potential for contamination from external laboratory sources, which is of particular concern for taint analysis. Copyright © 2010 Elsevier B.V. All rights reserved.
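
    The validation above reports limits of detection and quantitation per analyte; the sketch below shows one standard (ICH-style) way such figures are estimated from calibration-curve residuals. The calibration data are invented for illustration.

    ```python
    import numpy as np

    # Hypothetical calibration for one taint compound: concentration
    # (ug/L) vs. instrument response (peak area).
    conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0])
    area = np.array([210, 980, 2010, 9900, 19850])

    slope, intercept = np.polyfit(conc, area, 1)
    residual_sd = np.std(area - (slope * conc + intercept), ddof=2)

    # Common ICH-style estimates from the residual standard deviation.
    lod = 3.3 * residual_sd / slope
    loq = 10.0 * residual_sd / slope
    print(f"LOD = {lod:.2f} ug/L, LOQ = {loq:.2f} ug/L")
    ```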

  18. Quantitative habitability.

    PubMed

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.
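
    A trivial numeric illustration of the watts-per-organism framing follows; the power values are invented, and the comparison simply applies the abstract's criterion that an environment is habitable for a metabolism when power supply exceeds demand.

    ```python
    # Power-based habitability screen (watts per organism).
    metabolisms = {"sulfate reduction": 3.0e-13,   # hypothetical supply (W)
                   "methanogenesis": 8.0e-14}
    demand_w = 1.0e-13   # hypothetical maintenance demand per cell (W)

    for name, supply_w in metabolisms.items():
        verdict = "habitable" if supply_w > demand_w else "uninhabitable"
        print(f"{name}: {verdict} (supply/demand = {supply_w / demand_w:.1f})")
    ```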

  19. Interlaboratory comparison of extraction efficiency of pesticides from surface and laboratory water using solid-phase extraction disks.

    PubMed

    Senseman, Scott A; Mueller, Thomas C; Riley, Melissa B; Wauchope, R Don; Clegg, Chris; Young, Roddy W; Southwick, Lloyd M; Moye, H Anson; Dumas, Jose A; Mersie, Wondi; Mattice, John D; Leidy, Ross B

    2003-06-18

    A continuation of an earlier interlaboratory comparison was conducted (1) to assess solid-phase extraction (SPE) using Empore disks to extract atrazine, bromacil, metolachlor, and chlorpyrifos from various water sources accompanied by different sample shipping and quantitative techniques and (2) to compare quantitative results of individual laboratories with results of one common laboratory. Three replicates of a composite surface water (SW) sample were fortified with the analytes along with three replicates of deionized water (DW). A nonfortified DW sample and a nonfortified SW sample were also extracted. All samples were extracted using Empore C18 disks. After extraction, part of the samples were eluted and analyzed in-house. Duplicate samples were evaporated in a 2-mL vial, shipped dry to a central laboratory (SDC), redissolved, and analyzed. Overall, samples analyzed in-house had higher recoveries than SDC samples. Laboratory × analysis type and laboratory × water source interactions were significant for all four compounds. Seven laboratories participated in this interlaboratory comparison program. No differences in atrazine recoveries were observed from in-house samples analyzed by laboratories A, B, D, and G compared with the recovery of SDC samples. In-house atrazine recoveries from laboratories C and F were higher when compared with recovery from SDC samples. However, laboratory E had lower recoveries from in-house samples compared with SDC samples. For each laboratory, lower recoveries were observed for chlorpyrifos from the SDC samples compared with samples analyzed in-house. Bromacil recovery was <65% at two of the seven laboratories in the study. Bromacil recoveries for the remaining laboratories were >75%. Three laboratories showed no differences in metolachlor recovery; two laboratories had higher recoveries for samples analyzed in-house, and two other laboratories showed higher metolachlor recovery for SDC samples. Laboratory G had a higher recovery in SW for all four compounds compared with DW. Other laboratories that had significant differences in pesticide recovery between the two water sources showed higher recovery in DW than in the SW regardless of the compound. In comparison to earlier work, recovery of these compounds using SPE disks as a temporary storage matrix may be more effective than shipping dried samples in a vial. Problems with analytes such as chlorpyrifos are unavoidable, and it should not be assumed that an extraction procedure using SPE disks will be adequate for all compounds and transferable across all chromatographic conditions.

  20. Simulating the Heterogeneity in Braided Channel Belt Deposits: 2. Examples of Results and Comparison to Natural Deposits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guin, Arijit; Ramanathan, Ramya; Ritzi, Robert W.

    In Part 1 of this series we presented a methodology and a code for modeling the hierarchical sedimentary architecture in braided channel belt deposits. Here, in Part 2, the code was used to create a digital model of this architecture, and the corresponding spatial distribution of permeability. The simulated architecture was compared to the real stratal architecture observed in an abandoned channel belt of the Sagavanirktok River, Alaska by Lunt et al. (2004). The comparisons included assessments of similarity which were both qualitative and quantitative. From the qualitative comparisons we conclude that a synthetic deposit created by the code has unit types, at each level, with a geometry which is generally consistent with the geometry of unit types observed in the field. The digital unit types would generally be recognized as representing their counterparts in nature, including cross stratasets, lobate and scroll bar deposits, channel fills, etc. Furthermore, the synthetic deposit has a hierarchical spatial relationship among these units which represents how the unit types are observed in field exposures and in geophysical images. In quantitative comparisons the proportions and the length, width, and height of unit types at different scales, across all levels of the stratal hierarchy compare well between the digital and the natural deposits. A number of important attributes of the channel belt model were shown to be influenced by more than one level within the hierarchy of stratal architecture. First, the high-permeability open-framework gravels percolated at all levels and thus formed preferential flow pathways. Open framework gravels are indeed known to form preferential flow pathways in natural channel belt deposits. The nature of a percolating cluster changed across different levels of the hierarchy of stratal architecture. As a result of this geologic structure, the percolation occurs at proportions of open-framework gravels below the theoretical percolation threshold for random infinite media. Second, when the channel belt model was populated with permeability distributions by lowest-level unit type, the composite permeability semivariogram contained structures that were identifiable at more than one scale, and each of these structures could be directly linked to unit types of different scales existing at different levels within the hierarchy of strata. These collective results are encouraging with respect to our goal that this model be relevant as a base case in future studies for testing ideas in research addressing the upscaling problem in aquifers and reservoirs with multi-scale heterogeneity.
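
    A toy sketch of the percolation check follows: label connected clusters of the high-permeability facies on a 3D indicator grid and test whether any cluster spans the domain. The grid here is purely random, so at a gravel proportion of 0.25 (below the random site-percolation threshold of about 0.31) it typically will not percolate; the abstract's point is precisely that hierarchically structured deposits can percolate below that random threshold, which this random toy does not reproduce.

    ```python
    import numpy as np
    from scipy import ndimage

    # Toy 3D indicator grid: 1 = open-framework gravel, 0 = other facies.
    rng = np.random.default_rng(3)
    grid = (rng.random((40, 40, 40)) < 0.25).astype(int)

    # Label face-connected clusters, then test whether any single cluster
    # touches both faces along the x axis (a spanning flow pathway).
    labels, n_clusters = ndimage.label(grid)
    spanning = (set(labels[0].ravel()) & set(labels[-1].ravel())) - {0}
    print("percolates in x:", bool(spanning))
    ```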

  1. Psychological, neuropsychological, and electrocortical effects of mixed mold exposure.

    PubMed

    Crago, B Robert; Gray, Michael R; Nelson, Lonnie A; Davis, Marilyn; Arnold, Linda; Thrasher, Jack D

    2003-08-01

    The authors assessed the psychological, neuropsychological, and electrocortical effects of human exposure to mixed colonies of toxigenic molds. Patients (N = 182) with confirmed mold-exposure history completed clinical interviews, a symptom checklist (SCL-90-R), limited neuropsychological testing, quantitative electroencephalogram (QEEG) with neurometric analysis, and measures of mold exposure. Patients reported high levels of physical, cognitive, and emotional symptoms. Ratings on the SCL-90-R were "moderate" to "severe," with a factor reflecting situational depression accounting for most of the variance. Most of the patients were found to suffer from acute stress, adjustment disorder, or post-traumatic stress. Differential diagnosis indicated a combined etiology: external stressors together with organic, metabolically based dysregulation of emotions and decreased cognitive functioning resulting from toxic or metabolic encephalopathy. Measures of toxic mold exposure predicted QEEG measures and neuropsychological test performance. QEEG results included narrowed frequency bands and increased power in the alpha and theta bands in the frontal areas of the cortex. These findings indicated a hypoactivation of the frontal cortex, possibly due to brainstem involvement and insufficient excitatory input from the reticular activating system. Neuropsychological testing revealed impairments similar to mild traumatic brain injury. In comparison with premorbid estimates of intelligence, findings of impaired functioning on multiple cognitive tasks predominated. A dose-response relationship between measures of mold exposure and abnormal neuropsychological test results and QEEG measures suggested that toxic mold causes significant problems in exposed individuals. Study limitations included lack of a comparison group, patient selection bias, and incomplete data sets that did not allow for comparisons among variables.

  2. A comparison of the analytical performance of five commercially available assays for neutrophil gelatinase-associated lipocalin using urine.

    PubMed

    Kift, Rebecca L; Messenger, Michael P; Wind, Tobias C; Hepburn, Sophie; Wilson, Michelle; Thompson, Douglas; Smith, Matthew Welberry; Sturgeon, Catharine; Lewington, Andrew J; Selby, Peter J; Banks, Rosamonde E

    2013-05-01

    Neutrophil gelatinase-associated lipocalin (NGAL) is a promising biomarker for acute kidney injury that is beginning to be used in clinical practice in addition to research studies. The current study describes an independent validation and comparison of five commercially available NGAL assays, focusing on urine samples. This is an essential step in the translation of this marker to clinical use in terms of allowing valid inter-study comparison and generation of robust results. Two CE (Conformité Européenne)-marked assays, the NGAL Test (BioPorto) on Siemens ADVIA(®) 1800 and the ARCHITECT Urine NGAL assay on i2000SR (Abbott Laboratories), and three research-use-only (RUO) ELISAs (R&D Systems, Hycult and BioPorto) were evaluated. Imprecision, parallelism, recovery, selectivity, limit of quantitation (LOQ), vulnerability to interference and hook effect were assessed, and inter-assay agreement was determined using 68 urine samples from patients with various renal diseases and healthy controls. The Abbott and R&D Systems assays demonstrated satisfactory performance for all parameters tested. However, for the other three assays evaluated, problems were identified with LOQ (BioPorto/ADVIA(®)), parallelism (BioPorto ELISA) or several parameters (Hycult). Between-method agreement varied, with the Hycult assay in particular being markedly different, highlighting issues with standardization and the form of NGAL measured. Variability exists between the five NGAL assays in terms of performance, and this should be taken into account when interpreting results from the various clinical or research studies measuring urinary NGAL.

  3. Comparison of methodologic quality and study/report characteristics between quantitative clinical nursing and nursing education research articles.

    PubMed

    Schneider, Barbara St Pierre; Nicholas, Jennifer; Kurrus, Jeffrey E

    2013-01-01

    To compare the methodologic quality and study/report characteristics between quantitative clinical nursing and nursing education research articles. The methodologic quality of quantitative nursing education research needs to advance to a higher level. Clinical research can provide guidance for nursing education to reach this level. One hundred quantitative clinical research articles from high-impact journals published in 2007 and 37 education research articles from high-impact journals published in 2006-2007 were chosen for analysis. Clinical articles had significantly higher quality scores than education articles in three domains: number of institutions studied, type of data, and outcomes. The findings indicate three ways in which nursing education researchers can strengthen the methodologic quality of their quantitative research. With this approach, greater funding may be secured for advancing the science of nursing education.

  4. The rapid quantitation of the filamentous blue-green alga Plectonema boryanum by the luciferase assay for ATP

    NASA Technical Reports Server (NTRS)

    Bush, V. N.

    1974-01-01

    Plectonema boryanum is a filamentous blue-green alga. Blue-green algae have a prokaryotic cellular organization similar to bacteria, but are usually obligate photoautotrophs, obtaining their carbon and energy from photosynthetic mechanisms similar to those of higher plants. This research deals with a comparison of three methods of quantitating filamentous populations: microscopic cell counts, the luciferase assay for ATP, and optical density measurements.

  5. Econophysical visualization of Adam Smith’s invisible hand

    NASA Astrophysics Data System (ADS)

    Cohen, Morrel H.; Eliazar, Iddo I.

    2013-02-01

    Consider a complex system whose macrostate is statistically observable, but whose operating mechanism is an unknown black-box. In this paper we address the problem of inferring, from the system’s macrostate statistics, the system’s intrinsic force yielding the observed statistics. The inference is established via two diametrically opposite approaches which result in the very same intrinsic force: a top-down approach based on the notion of entropy, and a bottom-up approach based on the notion of Langevin dynamics. The general results established are applied to the problem of visualizing the intrinsic socioeconomic force, Adam Smith’s invisible hand, shaping the distribution of wealth in human societies. Our analysis yields quantitative econophysical representations of figurative socioeconomic forces, quantitative definitions of “poor” and “rich”, and a quantitative characterization of the “poor-get-poorer” and the “rich-get-richer” phenomena.
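
    For the bottom-up approach, a standard fact about one-dimensional Langevin dynamics dx = F(x)dt + sqrt(2D)dW is that the stationary density p(x) satisfies F(x) = D d/dx ln p(x), so an intrinsic force consistent with the observed statistics can be read off from a density estimate. A small numeric illustration (the lognormal wealth sample and the diffusion scale D are assumptions for illustration, not the paper's data or exact formulas):

        import numpy as np

        rng = np.random.default_rng(1)
        wealth = rng.lognormal(mean=1.0, sigma=0.8, size=100_000)  # toy data

        # Histogram estimate of the macrostate density p(x)
        counts, edges = np.histogram(wealth, bins=200, density=True)
        x = 0.5 * (edges[:-1] + edges[1:])
        good = counts > 0

        # Stationary Langevin dynamics: F(x) = D * d/dx ln p(x)
        D = 1.0  # diffusion scale; assumed
        force = D * np.gradient(np.log(counts[good]), x[good])

        # Positive force at small x, negative at large x: a numeric
        # picture of a restoring "invisible hand" acting on wealth.
        print(force[:5], force[-5:])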

  6. Quantitative imaging of aggregated emulsions.

    PubMed

    Penfold, Robert; Watson, Andrew D; Mackie, Alan R; Hibberd, David J

    2006-02-28

    Noise reduction, restoration, and segmentation methods are developed for the quantitative structural analysis in three dimensions of aggregated oil-in-water emulsion systems imaged by fluorescence confocal laser scanning microscopy. With typical industrial formulations in mind, the methods are demonstrated for concentrated (30% volume fraction) and polydisperse emulsions. Following a regularized deconvolution step using an analytic optical transfer function and appropriate binary thresholding, novel application of the Euclidean distance map provides effective discrimination of closely clustered emulsion droplets with size variation over at least 1 order of magnitude. The a priori assumption of spherical nonintersecting objects provides crucial information to combat the ill-posed inverse problem presented by locating individual particles. Position coordinates and size estimates are recovered with sufficient precision to permit quantitative study of static geometrical features. In particular, aggregate morphology is characterized by a novel void distribution measure based on the generalized Apollonius problem. This is also compared with conventional Voronoi/Delaunay analysis.
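
    The droplet-splitting step described here, binary thresholding followed by a Euclidean distance map, is the core of the standard marker-based watershed recipe, sketched below for a 2-D image (the authors work in 3-D with a deconvolution step not reproduced here; function names and parameters are illustrative):

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed

        def split_droplets(binary):
            """Separate touching near-spherical droplets in a binary image
            using the Euclidean distance map as the watershed landscape."""
            distance = ndi.distance_transform_edt(binary)
            # Peaks of the distance map approximate droplet centres
            coords = peak_local_max(distance, min_distance=5,
                                    labels=binary.astype(int))
            markers = np.zeros(binary.shape, dtype=int)
            markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
            return watershed(-distance, markers, mask=binary)

        # labels = split_droplets(thresholded_image)  # one label per droplet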

  7. Comparison of two laboratory-based systems for evaluation of halos in intraocular lenses

    PubMed Central

    Alexander, Elsinore; Wei, Xin; Lee, Shinwook

    2018-01-01

    Purpose Multifocal intraocular lenses (IOLs) can be associated with unwanted visual phenomena, including halos. Predicting potential for halos is desirable when designing new multifocal IOLs. Halo images from six IOL models were compared using the Optikos modulation transfer function bench system and a new high dynamic range (HDR) system. Materials and methods One monofocal, one extended-depth-of-focus, and four multifocal IOLs were evaluated. An off-the-shelf optical bench was used to simulate a distant (>50 m) car headlight and record images. A custom HDR system was constructed using an imaging photometer to simulate headlight images and to measure quantitative halo luminance data. A metric was developed to characterize halo luminance properties. Clinical relevance was investigated by correlating halo measurements to visual outcomes questionnaire data. Results The Optikos system produced halo images useful for visual comparisons; however, measurements were relative and not quantitative. The HDR halo system provided objective and quantitative measurements used to create a metric from the area under the curve (AUC) of the logarithmic normalized halo profile. This proposed metric differentiated between IOL models, and linear regression analysis found strong correlations between AUC and subjective clinical ratings of halos. Conclusion The HDR system produced quantitative, preclinical metrics that correlated to patients’ subjective perception of halos. PMID:29503526

  8. Direct comparison of low- and mid-frequency Raman spectroscopy for quantitative solid-state pharmaceutical analysis.

    PubMed

    Lipiäinen, Tiina; Fraser-Miller, Sara J; Gordon, Keith C; Strachan, Clare J

    2018-02-05

    This study considers the potential of low-frequency (terahertz) Raman spectroscopy in the quantitative analysis of ternary mixtures of solid-state forms. Direct comparison between low-frequency and mid-frequency spectral regions for quantitative analysis of crystal form mixtures, without confounding sampling and instrumental variations, is reported for the first time. Piroxicam was used as a model drug, and the low-frequency spectra of piroxicam forms β, α2 and monohydrate are presented for the first time. These forms show clear spectral differences in both the low- and mid-frequency regions. Both spectral regions provided quantitative models suitable for predicting the mixture compositions using partial least squares regression (PLSR), but the low-frequency data gave better models, based on lower errors of prediction (2.7, 3.1 and 3.2% root-mean-square errors of prediction [RMSEP] values for the β, α2 and monohydrate forms, respectively) than the mid-frequency data (6.3, 5.4 and 4.8%, for the β, α2 and monohydrate forms, respectively). The better performance of low-frequency Raman analysis was attributed to larger spectral differences between the solid-state forms, combined with a higher signal-to-noise ratio. Copyright © 2017 Elsevier B.V. All rights reserved.
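
    The quantification step reported here, PLSR on spectra with RMSEP as the figure of merit, can be sketched as follows (the spectra, mixture fractions, and component count are placeholders, not the study's data or settings):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        # X: spectra (n_samples x n_wavenumbers); y: mass fractions of the
        # three solid-state forms (n_samples x 3). Placeholders here.
        rng = np.random.default_rng(0)
        X = rng.random((60, 500))
        y = rng.dirichlet(np.ones(3), size=60)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        pls = PLSRegression(n_components=4).fit(X_tr, y_tr)
        pred = pls.predict(X_te)

        # Root-mean-square error of prediction, one value per form (in %)
        rmsep = 100 * np.sqrt(((pred - y_te) ** 2).mean(axis=0))
        print(rmsep)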

  9. The Integral Method, a new approach to quantify bactericidal activity.

    PubMed

    Gottardi, Waldemar; Pfleiderer, Jörg; Nagl, Markus

    2015-08-01

    The bactericidal activity (BA) of antimicrobial agents is generally derived from the results of killing assays. A reliable quantitative characterization, and particularly a comparison of these substances, however, is impossible with this information alone. We here propose a new method that takes into account the course of the complete killing curve for assaying BA and that allows a clear-cut quantitative comparison of antimicrobial agents with a single number. The new Integral Method, based on the reciprocal area below the killing curve, reliably calculates an average BA [log10 CFU/min] and, by incorporating the agent's concentration C, the average specific bactericidal activity SBA = BA/C [log10 CFU/min/mM]. Based on experimental killing data, the pertaining BA and SBA values of exemplary active halogen compounds were established, allowing quantitative assertions. N-chlorotaurine (NCT), chloramine T (CAT), monochloramine (NH2Cl), and iodine (I2) showed extremely diverging SBA values of 0.0020±0.0005, 1.11±0.15, 3.49±0.22, and 291±137 log10 CFU/min/mM, respectively, against Staphylococcus aureus. This immediately demonstrates an approximately 550-fold stronger activity of CAT, 1730-fold of NH2Cl, and 150,000-fold of I2 compared to NCT. The inferred quantitative assertions and conclusions prove the new method suitable for characterizing bactericidal activity. Its application comprises the effect of defined agents on various bacteria, the consequence of temperature shifts, the influence of varying drug structure, dose-effect relationships, ranking of isosteric agents, comparison of competing commercial antimicrobial formulations, and the effect of additives. Copyright © 2015 Elsevier B.V. All rights reserved.
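
    The arithmetic the abstract describes can be sketched as follows. This is a hedged, literal reading ("reciprocal area below the killing curve" for BA, and the stated SBA = BA/C); the published method may apply a normalization not reproduced here, and the killing-curve data are invented for illustration:

        import numpy as np

        def integral_method(t_min, log_cfu, conc_mM):
            """Average bactericidal activity from a killing curve.
            t_min:   sampling times in minutes
            log_cfu: log10 CFU/ml measured at each time
            conc_mM: agent concentration in mM
            Literal reading of 'reciprocal area below the killing curve';
            the published method may normalize differently."""
            # Trapezoidal area below the killing curve
            area = float(np.sum(np.diff(t_min) *
                                (log_cfu[:-1] + log_cfu[1:]) / 2.0))
            ba = 1.0 / area       # average BA, log10 CFU/min
            sba = ba / conc_mM    # SBA = BA/C, as stated in the abstract
            return ba, sba

        t = np.array([0.0, 5, 10, 20, 40, 60])           # minutes
        kill = np.array([6.0, 5.1, 4.3, 2.9, 1.2, 0.0])  # log10 CFU/ml
        print(integral_method(t, kill, conc_mM=1.0))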

  10. Talking Physics: Two Case Studies on Short Answers and Self-explanation in Learning Physics

    NASA Astrophysics Data System (ADS)

    Badeau, Ryan C.

    This thesis explores two case studies into the use of short answers and self-explanation to improve student learning in physics. The first set of experiments focuses on the role of short answer questions in the context of computer-based instruction. Through a series of six experiments, we compare and evaluate the performance of computer-assessed short answer questions versus multiple choice for training conceptual topics in physics, controlling for feedback between the two formats. In addition to finding overall similar improvements on subsequent student performance and retention, we identify unique differences in how students interact with the treatments in terms of time spent on feedback and performance on follow-up short answer assessment. In addition, we identify interactions between the level of interactivity of the training, question format, and student attitudinal ratings of each respective training. The second case study focuses on the use of worked examples in the context of multi-concept physics problems - which we call "synthesis problems." For this part of the thesis, four experiments were designed to evaluate the effectiveness of two instructional methods employing worked examples on student performance with synthesis problems; these instructional techniques, analogical comparison and self-explanation, have previously been studied primarily in the context of single-concept problems. As such, the work presented here represents a novel focus on extending these two techniques to this class of more complicated physics problem. Across the four experiments, both self-explanation and certain kinds of analogical comparison of worked examples significantly improved student performance on a target synthesis problem, with distinct improvements in recognition of the relevant concepts. More specifically, analogical comparison significantly improved student performance when the comparisons were invoked between worked synthesis examples. In contrast, similar comparisons between corresponding pairs of worked single-concept examples did not significantly improve performance. On a more complicated synthesis problem, self-explanation was significantly more effective than analogical comparison, potentially due to differences in how successfully students encoded the full structure of the worked examples. Finally, we find that the two techniques can be combined for additional benefit, with the trade-off of slightly more time-on-task.

  11. 76 FR 5719 - Pattern of Violations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-02

    ... safety and health record of each mine rather than on a strictly quantitative comparison of mines to... several reservations, given the methodological difficulties involved in estimating the compensating wage...

  12. The Development of Mathematical Knowledge for Teaching for Quantitative Reasoning Using Video-Based Instruction

    NASA Astrophysics Data System (ADS)

    Walters, Charles David

    Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008) related to quantitative reasoning. However, this may prove challenging, as prior to entering the classroom, PSTs often have few opportunities to develop MKT by examining and reflecting on students' thinking. Videos offer one avenue through which such opportunities are possible. In this study, I report on the design of a mini-course for PSTs that featured a series of videos created as part of a proof-of-concept NSF-funded project. These MathTalk videos highlight the ways in which the quantitative reasoning of two high school students developed over time. Using a mixed approach to grounded theory, I analyzed pre- and postinterviews using an extant coding scheme based on the Silverman and Thompson (2008) framework for the development of MKT. This analysis revealed a shift in participants' affect as well as three distinct shifts in their MKT around quantitative reasoning with distances, including shifts in: (a) quantitative reasoning; (b) point of view (decentering); and (c) orientation toward problem solving. Using the four-part focusing framework (Lobato, Hohensee, & Rhodehamel, 2013), I analyzed classroom data to account for how participants' noticing was linked with the shifts in MKT. Notably, their increased noticing of aspects of MKT around quantitative reasoning with distances, which features prominently in the MathTalk videos, seemed to contribute to the emergence of the shifts in MKT. Results from this study link elements of the learning environment to the development of specific facets of MKT around quantitative reasoning with distances. These connections suggest that vicarious experiences with two students' quantitative reasoning over time was critical for participants' development of MKT.

  13. Problem Solving, Patterns, Probability, Pascal, and Palindromes.

    ERIC Educational Resources Information Center

    Hylton-Lindsay, Althea Antoinette

    2003-01-01

    Presents a problem-solving activity, the birth order problem, and several solution-seeking strategies. Includes responses of current and prospective teachers and a comparison of various strategies. (YDS)

  14. A software package to improve image quality and isolation of objects of interest for quantitative stereology studies of rat hepatocarcinogenesis.

    PubMed

    Xu, Yihua; Pitot, Henry C

    2006-03-01

    In studies of quantitative stereology of rat hepatocarcinogenesis, we have used image analysis technology (automatic particle analysis) to obtain data such as liver tissue area, size and location of altered hepatic focal lesions (AHF), and nuclei counts. These data are then used for three-dimensional estimation of AHF occurrence and nuclear labeling index analysis. These are important parameters for quantitative studies of carcinogenesis, for screening and classifying carcinogens, and for risk estimation. To take such measurements, structures or cells of interest must be separated from the other components based on differences in color and density. Common background problems in captured sample images, such as uneven illumination or color shading, can severely compromise the measurements. Two application programs (BK_Correction and Pixel_Separator) have been developed to solve these problems. With BK_Correction, common background problems such as incorrect color temperature setting, color shading, and uneven illumination can be corrected. With Pixel_Separator, different types of objects can be separated from each other according to their color, as seen with differently colored immunohistochemically stained slides. The resultant images of such objects separated from other components are then ready for particle analysis. Objects that have the same darkness but different colors can be accurately differentiated in a grayscale image analysis system after application of these programs.
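
    Background problems of the kind described (uneven illumination, shading) are commonly corrected by estimating the background with a heavy blur and dividing it out. A minimal flat-field sketch of that generic recipe, not the BK_Correction implementation itself:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def flatten_background(image, sigma=100):
            """Approximate flat-field correction: estimate the slowly
            varying illumination with a large Gaussian blur, divide it
            out, and rescale to the original mean intensity."""
            img = image.astype(float)
            background = gaussian_filter(img, sigma=sigma)
            corrected = img / np.maximum(background, 1e-6)
            return corrected * img.mean()

        # corrected = flatten_background(raw_grayscale_image)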

  15. Theoretical foundations for a quantitative approach to paleogenetics. I, II.

    NASA Technical Reports Server (NTRS)

    Holmquist, R.

    1972-01-01

    It is shown that by neglecting the phenomena of multiple hits, back mutation, and chance coincidence, errors larger than 100% can be introduced into the calculated value of the average number of nucleotide base differences to be expected between two homologous polynucleotides. Mathematical formulas are derived to correct quantitatively for these effects. It is pointed out that the effects materially change the quantitative aspects of phylogenies, such as the lengths of the legs of the trees. A number of problems are solved without approximation.
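
    Holmquist's exact formulas are given in the paper; the classic correction of this type, which illustrates the same idea, is the Jukes-Cantor formula, recovering the expected number of substitutions per site K from the observed fraction of differing bases p while accounting for multiple hits and back mutation: K = -(3/4) ln(1 - 4p/3). A sketch:

        import numpy as np

        def jukes_cantor(p):
            """Corrected substitutions per site from the observed
            proportion of differing bases p (valid for p < 0.75)."""
            return -0.75 * np.log(1.0 - 4.0 * p / 3.0)

        p = np.array([0.05, 0.20, 0.40, 0.60])
        print(jukes_cantor(p))  # correction grows rapidly as p -> 0.75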

  16. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371
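
    The coarse-to-fine scheme described, solving the fixed-point problem at low resolution and interpolating the result to initialize the next finer grid, can be sketched generically as below (the photoacoustic fluence model is replaced by a placeholder update; all names are illustrative, not the authors' code):

        import numpy as np
        from scipy.ndimage import zoom

        def fixed_point_solve(mu0, update, n_iter=50):
            """Generic fixed-point iteration mu <- update(mu)."""
            mu = mu0
            for _ in range(n_iter):
                mu = update(mu)
            return mu

        def multigrid_invert(shape, update, levels=3, n_iter=50):
            """Solve on a coarse grid first, then upsample the estimate
            to initialize the next finer grid; far fewer fine-grid
            iterations are then needed."""
            coarse = tuple(s // 2 ** (levels - 1) for s in shape)
            mu = np.zeros(coarse)
            for lev in range(levels):
                mu = fixed_point_solve(mu, update, n_iter)
                if lev < levels - 1:
                    mu = zoom(mu, 2.0, order=1)  # bilinear upsampling
            return mu

        # Placeholder update standing in for the fluence-model iteration:
        # mu = multigrid_invert((128, 128), update=lambda m: 0.5 * (m + 1.0))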

  17. A transformative model for undergraduate quantitative biology education.

    PubMed

    Usher, David C; Driscoll, Tobin A; Dhurjati, Prasad; Pelesko, John A; Rossi, Louis F; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematical approaches. To develop the bio-calculus sections, the Department of Mathematical Sciences revised its three-semester calculus sequence to include differential equations in the first semester and, rather than using examples traditionally drawn from application domains that are most relevant to engineers, drew models and examples heavily from the life sciences. The curriculum of the B.S. degree in Quantitative Biology was designed to provide students with a solid foundation in biology, chemistry, and mathematics, with an emphasis on preparation for research careers in life sciences. Students in the program take core courses from biology, chemistry, and physics, though mathematics, as the cornerstone of all quantitative sciences, is given particular prominence. Seminars and a capstone course stress how the interplay of mathematics and biology can be used to explain complex biological systems. To initiate these academic changes required the identification of barriers and the implementation of solutions.

  18. A Transformative Model for Undergraduate Quantitative Biology Education

    PubMed Central

    Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematical approaches. To develop the bio-calculus sections, the Department of Mathematical Sciences revised its three-semester calculus sequence to include differential equations in the first semester and, rather than using examples traditionally drawn from application domains that are most relevant to engineers, drew models and examples heavily from the life sciences. The curriculum of the B.S. degree in Quantitative Biology was designed to provide students with a solid foundation in biology, chemistry, and mathematics, with an emphasis on preparation for research careers in life sciences. Students in the program take core courses from biology, chemistry, and physics, though mathematics, as the cornerstone of all quantitative sciences, is given particular prominence. Seminars and a capstone course stress how the interplay of mathematics and biology can be used to explain complex biological systems. To initiate these academic changes required the identification of barriers and the implementation of solutions. PMID:20810949

  19. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
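
    The design principles summarized here translate directly into the analysis model: a blocked class comparison is an ANOVA with group and block terms. A minimal sketch with statsmodels (column names and data are placeholders):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        rng = np.random.default_rng(0)
        df = pd.DataFrame({
            "abundance": rng.normal(10, 1, 24),  # log protein abundance
            "group": np.repeat(["disease", "control"], 12),
            "block": np.tile([f"run{i}" for i in range(4)], 6),  # MS runs
        })

        # Blocking enters the model as an additive term, so run-to-run
        # systematic bias does not inflate the group comparison.
        fit = smf.ols("abundance ~ C(group) + C(block)", data=df).fit()
        print(anova_lm(fit))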

  20. Children's Environmental Concerns: Expressing Ecophobia

    ERIC Educational Resources Information Center

    Strife, Susan Jean

    2012-01-01

    While numerous quantitative studies across disciplines have investigated children's knowledge and attitudes about environmental problems, few studies examine children's feelings about environmental problems--and even fewer have focused on the child's point of view. Through 50 in-depth interviews with urban children (ages 10-12) this research aimed…

  1. Treatment preparation in the context of system coordination serves inmates well.

    PubMed

    Windell, Phillip A; Barron, Nancy

    2002-01-01

    A large percentage of jail inmates suffer from substance abuse problems; however, providing treatment in jail is difficult. Multnomah County's In Jail Intervention Program (IJIP) demonstrated an effective alternative. Finigan, Barron, and Carey (in press) and Barron and Finigan (1999) demonstrated that inmates with substance use problems, especially women, participating in IJIP experienced fewer rearrests and reincarcerations. To address the question of what led to these outcomes, quantitative data were abstracted from program, jail, and state administrative databases and were supplemented by face-to-face interviews with key informants, including program participants and former participants. In addition to their substance abuse problems, IJIP participants were chronic offenders who were more likely to be diagnosed with mental health problems. Results suggest that treatment preparation, together with coordination of jail release and entry to treatment, increased the number enrolling in treatment and helped former inmates engage in treatment more quickly. Quantitative data suggest that the longer inmates stayed in IJIP, the more likely they were to complete community treatment.

  2. Assessment of the Microscreen phage-induction assay for screening hazardous wastes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houk, V.S.; DeMarini, D.M.

    1987-09-01

    The Microscreen phage-induction assay, which quantitatively measures the induction of prophage lambda in Escherichia coli WP2s(lambda), was used to test 14 crude (unfractionated) hazardous industrial waste samples for genotoxic activity in the presence and absence of metabolic activation. Eleven of the 14 wastes induced prophage, and induction was observed at concentrations as low as 0.4 picograms per ml. Comparisons between the mutagenicity of these waste samples in Salmonella and their ability to induce prophage lambda indicate that the Microscreen phage-induction assay detected genotoxic activity in all but one of the wastes that were mutagenic in Salmonella. Moreover, the Microscreen assay detected as genotoxic 5 additional wastes that were not detected in the Salmonella assay. The applicability of the Microscreen phage-induction assay for screening hazardous wastes for genotoxic activity is discussed along with some of the problems associated with screening highly toxic wastes containing toxic volatile compounds.

  3. Use of the microscreen phage-induction assay to assess the genotoxicity of 14 hazardous industrial wastes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houk, V.S.; DeMarini, D.M.

    1988-01-01

    The Microscreen phage-induction assay, which quantitatively measures the induction of prophage lambda in Escherichia coli WP2s(lambda), was used to test 14 crude (unfractionated) hazardous industrial waste samples for genotoxic activity in the presence and absence of metabolic activation. Eleven of the 14 wastes induced prophage, and induction was observed at concentrations as low as 0.4 pg per ml. Comparisons between the ability of these waste samples to induce prophage and their mutagenicity in the Salmonella reverse mutation assay indicate that the phage-induction assay detected genotoxic activity in all but one of the wastes that were mutagenic in Salmonella. Moreover, the Microscreen assay detected as genotoxic five additional wastes that were not detected in the Salmonella assay. The applicability of the Microscreen phage-induction assay for screening hazardous wastes for genotoxic activity is discussed, as are some of the problems associated with screening highly toxic wastes containing toxic volatile compounds.

  4. Use of the Microscreen phage-induction assay to assess the genotoxicity of 14 hazardous industrial wastes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houk, V.S.; DeMarini, D.M.

    1988-01-01

    The Microscreen phage-induction assay, which quantitatively measures the induction of prophage lambda in Escherichia coli WP2s lambda, was used to test 14 crude (unfractionated) hazardous industrial-waste samples for genotoxic activity in the presence and absence of metabolic activation. Eleven of the 14 wastes induced prophage, and induction was observed at concentrations as low as 0.4 picograms per ml. Comparisons between the mutagenicity of these waste samples in Salmonella and their ability to induce prophage lambda indicate that the Microscreen phage-induction assay detected genotoxic activity in all but one of the wastes that were mutagenic in Salmonella. Moreover, the Microscreen assay detected as genotoxic 5 additional wastes that were not detected in the Salmonella assay. The applicability of the Microscreen phage-induction assay for screening hazardous wastes for genotoxic activity is discussed along with some of the problems associated with screening highly toxic wastes containing toxic volatile compounds.

  5. Distribution and assessment of marine debris in the deep Tyrrhenian Sea (NW Mediterranean Sea, Italy).

    PubMed

    Angiolillo, Michela; Lorenzo, Bianca di; Farcomeni, Alessio; Bo, Marzia; Bavestrello, Giorgio; Santangelo, Giovanni; Cau, Angelo; Mastascusa, Vincenza; Cau, Alessandro; Sacco, Flavio; Canese, Simonepietro

    2015-03-15

    Marine debris is a recognized global ecological concern. Little is known about the extent of the problem in the Mediterranean Sea regarding litter distribution and its influence on deep rocky habitats. A quantitative assessment of debris present on the deep seafloor (30-300 m depth) was carried out in 26 areas off the coast of three Italian regions in the Tyrrhenian Sea, using a Remotely Operated Vehicle (ROV). The dominant type of debris (89%) was fishing gear, mainly lines, while plastic objects were recorded only occasionally. Abundant quantities of gear were found on rocky banks in Sicily and Campania (0.09-0.12 debris items m(-2)), indicating intense fishing activity. Fifty-four percent of the recorded debris directly impacted benthic organisms, primarily gorgonians, followed by black corals and sponges. This work provides a first insight into the impact of marine debris on Mediterranean deep ecosystems and a valuable baseline for future comparisons. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Multicenter evaluation of the new Abbott RealTime assays for quantitative detection of human immunodeficiency virus type 1 and hepatitis C virus RNA.

    PubMed

    Schutten, M; Peters, D; Back, N K T; Beld, M; Beuselinck, K; Foulongne, V; Geretti, A-M; Pandiani, L; Tiemann, C; Niesters, H G M

    2007-06-01

    The analytical performances of the new Abbott RealTime hepatitis C virus (HCV) and human immunodeficiency virus type 1 viral load assays were compared at nine laboratories with different competitor assays. These included the Abbott LcX, Bayer Versant bDNA, Roche COBAS Amplicor, and Roche COBAS TaqMan assays. Two different protocols used during the testing period, with and without a pre-m1000 RNA isolation spin, were compared; the difference proved to be nonsignificant. A uracil-N-glycosylase (UNG) contamination control option in the HCV test for previous Roche COBAS Amplicor users was evaluated. It decreased amplicon carryover by 100-fold, independent of the amplicon input concentration, and the protocol including UNG overcame problems with false-positive negative controls. Comparison with other assays revealed only minor differences; the largest difference was observed between the Abbott HCV RealTime assay and the Roche COBAS Amplicor HCV Monitor version 2.0 assay.

  7. [Access to primary healthcare services: still a way to go].

    PubMed

    Mendes, Antônio da Cruz Gouveia; Miranda, Gabriella Morais Duarte; Figueiredo, Karla Erika Gouveia; Duarte, Petra Oliveira; Furtado, Betise Mery Alencar Sousa Macau

    2012-11-01

    This study seeks to evaluate accessibility to the Basic Units of the Family Health Strategy (ESF-UB) and Traditional Basic Units (UB-T) in the city of Recife in 2009. Data were collected through three instruments: a roadmap for systematic observation of the units and questionnaires for users and unit professionals. This is a descriptive cross-sectional study using a quantitative approach; 1180 users, 61 doctors and 56 nurses were interviewed. The results showed strong ties with and recognition by users, whereby primary healthcare is seen as the access portal to the health system. In the comparison between ESF-UB and UB-T, evaluations were always more favorable to the Family Health Strategy, though with relatively small differences. The overall result revealed widespread dissatisfaction with the difficulty of obtaining drugs and taking tests, and also with the waiting times and access to specialized care. This points to organizational problems that may constitute barriers limiting the accessibility of basic healthcare services for users.

  8. Algorithms for constructing optimal paths and statistical analysis of passenger traffic

    NASA Astrophysics Data System (ADS)

    Trofimov, S. P.; Druzhinina, N. G.; Trofimova, O. G.

    2018-01-01

    Several existing information systems of urban passenger transport (UPT) are considered. The authors' UPT network model is presented. A new service is offered to passengers: the best path from one stop to another at a specified time. The algorithm and a software implementation for finding the optimal path are presented. The algorithm uses the current UPT schedule. The article also describes an algorithm for statistical analysis of trip payments made with electronic E-cards. The algorithm yields the density of passenger traffic during the day; this density is independent of the network topology and UPT schedules. The resulting traffic-flow density can be used to solve a number of practical problems, in particular forecasting the overcrowding of passenger transport during rush hours, quantitatively comparing different transport network topologies, and constructing the best UPT timetable. The efficiency of the proposed integrated approach is demonstrated on the example of a model town of arbitrary dimensions.
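
    The best-path service described can be sketched as a time-dependent Dijkstra search: the state is a stop, the cost is arrival time, and edge relaxation consults the timetable for the next feasible departure. A minimal sketch (the timetable format and all names are assumptions, not the authors' system):

        import heapq

        def earliest_arrival(timetable, start, goal, t0):
            """timetable: dict mapping stop -> list of
            (departure_time, neighbor_stop, arrival_time) entries.
            Returns the earliest arrival time at goal when leaving
            start at time t0, or None if goal is unreachable."""
            best = {start: t0}
            heap = [(t0, start)]
            while heap:
                t, stop = heapq.heappop(heap)
                if stop == goal:
                    return t
                if t > best.get(stop, float("inf")):
                    continue  # stale heap entry
                for dep, nxt, arr in timetable.get(stop, []):
                    if dep >= t and arr < best.get(nxt, float("inf")):
                        best[nxt] = arr
                        heapq.heappush(heap, (arr, nxt))
            return None

        table = {"A": [(5, "B", 12), (8, "B", 11)], "B": [(13, "C", 20)]}
        print(earliest_arrival(table, "A", "C", t0=6))  # -> 20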

  9. Investigating Factorial Invariance of Latent Variables Across Populations When Manifest Variables Are Missing Completely

    PubMed Central

    Widaman, Keith F.; Grimm, Kevin J.; Early, Dawnté R.; Robins, Richard W.; Conger, Rand D.

    2013-01-01

    Difficulties arise in multiple-group evaluations of factorial invariance if particular manifest variables are missing completely in certain groups. Ad hoc analytic alternatives can be used in such situations (e.g., deleting manifest variables), but some common approaches, such as multiple imputation, are not viable. At least 3 solutions to this problem are viable: analyzing differing sets of variables across groups, using pattern mixture approaches, and a new method using random number generation. The latter solution, proposed in this article, is to generate pseudo-random normal deviates for all observations for manifest variables that are missing completely in a given sample and then to specify multiple-group models in a way that respects the random nature of these values. An empirical example is presented in detail comparing the 3 approaches. The proposed solution can enable quantitative comparisons at the latent variable level between groups using programs that require the same number of manifest variables in each group. PMID:24019738
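
    The proposed solution is straightforward to state in code: for a group in which a manifest variable was never measured, substitute independent standard-normal pseudo-random deviates so every group has the same columns, then constrain the model so these columns carry no information. A minimal sketch of the data-preparation step only (the modeling constraints live in the SEM software and are not shown; names are illustrative):

        import numpy as np
        import pandas as pd

        def fill_missing_manifest(df, missing_vars, seed=0):
            """Replace manifest variables that are missing completely in
            this group with pseudo-random N(0, 1) deviates; the SEM model
            must then fix the loadings and intercepts of these columns so
            they contribute nothing to the latent structure."""
            rng = np.random.default_rng(seed)
            out = df.copy()
            for var in missing_vars:
                out[var] = rng.standard_normal(len(df))
            return out

        # group_b = fill_missing_manifest(group_b, ["item7", "item9"])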

  10. Computational analysis of drop formation before and after the first singularity: the fate of free and satellite drops during simple dripping and DOD drop formation

    NASA Astrophysics Data System (ADS)

    Chen, Alvin U.; Basaran, Osman A.

    2000-11-01

    Drop formation from a capillary --- dripping mode --- or an ink jet nozzle --- drop-on-demand (DOD) mode --- falls into a class of scientifically challenging yet practically useful free surface flows that exhibit a finite time singularity, i.e. the breakup of an initially single liquid mass into two or more fragments. While computational tools to model such problems have been developed recently, they lack the accuracy needed to quantitatively predict all the dynamics observed in experiments. Here we present a new finite element method (FEM) based on a robust algorithm for elliptic mesh generation and remeshing to handle extremely large interface deformations. The new algorithm allows continuation of computations beyond the first singularity to track fates of both primary and any satellite drops. The accuracy of the computations is demonstrated by comparison of simulations with experimental measurements made possible with an ultra high-speed digital imager capable of recording 100 million frames per second.

  11. To frame is to explain: a deductive frame-analysis of Dutch and French climate change coverage during the annual UN Conferences of the Parties.

    PubMed

    Dirikx, Astrid; Gelders, Dave

    2010-11-01

    This study examines the way Dutch and French newspapers frame climate change during the annual United Nations Conferences of the Parties. The methods used in previous studies on the framing of climate change do not allow for general cross-national comparisons. We conduct a quantitative deductive framing analysis on 257 quality Dutch and French newspaper articles between 2001 and 2007. Both countries' newspapers seem to frame climate change through mainly the same lens. The majority of the articles make reference to the consequences of the (non-)pursuit of a certain course of action and of possible losses and gains (consequences frame). Additionally, many articles mention the need for urgent actions, refer to possible solutions and suggest that governments are responsible for and/or capable of alleviating climate change problems (responsibility frame). Finally, the conflict frame was found to be used less often than the aforementioned frames, but more regularly than the human interest frame.

  12. Development of soft scaffolding strategy to improve student’s creative thinking ability in physics

    NASA Astrophysics Data System (ADS)

    Nurulsari, Novinta; Abdurrahman; Suyatna, Agus

    2017-11-01

    Students' creative thinking ability in physics learning can be developed through a learning experience. However, many students fail to gain a learning experience because of the lack of teacher support in assisting students when they face learning difficulties. In this study, a soft scaffolding strategy was developed to improve students' creative thinking ability in physics, especially with optical instruments. The methods used were qualitative and quantitative. The soft scaffolding strategy developed was called the 6E Soft Scaffolding Strategy, where 6E stands for Explore real-life problems, Engage students with web technology, Enable experiment using analogies, Elaborate data through multiple representations, Encourage questioning, and Ensure the feedback. The strategy was applied to 60 students in secondary school through cooperative learning. As a comparison, conventional strategies were also applied to 60 students in the same school and grade. The results of the study showed that the soft scaffolding strategy was effective in improving students' creative thinking ability.

  13. A comparison of the immune responses of dogs exposed to canine distemper virus (CDV) - Differences between vaccinated and wild-type virus exposed dogs.

    PubMed

    Perrone, Danielle; Bender, Scott; Niewiesk, Stefan

    2010-07-01

    Canine distemper virus (CDV)-specific immune response was measured in different dog populations. Three groups of vaccinated or wild-type virus exposed dogs were tested: dogs with a known vaccination history, dogs without a known vaccination history (shelter dogs), and dogs with potential exposure to wild-type CDV. The use of a T-cell proliferation assay demonstrated a detectable CDV-specific T-cell response from both spleen and blood lymphocytes of dogs. Qualitatively, antibody assays [enzyme-linked immunosorbent assay (ELISA) and neutralization assay] predicted the presence of a T-cell response well, although quantitatively neither antibody assays nor the T-cell assay correlated well with each other. An interesting finding from our study was that half of the dogs in shelters were not vaccinated (potentially posing a public veterinary health problem) and that antibody levels in dogs living in an environment with endemic CDV were lower than in vaccinated animals.

  14. A comparison of the immune responses of dogs exposed to canine distemper virus (CDV) — Differences between vaccinated and wild-type virus exposed dogs

    PubMed Central

    Perrone, Danielle; Bender, Scott; Niewiesk, Stefan

    2010-01-01

    Canine distemper virus (CDV)-specific immune response was measured in different dog populations. Three groups of vaccinated or wild-type virus exposed dogs were tested: dogs with a known vaccination history, dogs without a known vaccination history (shelter dogs), and dogs with potential exposure to wild-type CDV. The use of a T-cell proliferation assay demonstrated a detectable CDV-specific T-cell response from both spleen and blood lymphocytes of dogs. Qualitatively, antibody assays [enzyme-linked immunosorbent assay (ELISA) and neutralization assay] predicted the presence of a T-cell response well, although quantitatively neither antibody assays nor the T-cell assay correlated well with each other. An interesting finding from our study was that half of the dogs in shelters were not vaccinated (potentially posing a public veterinary health problem) and that antibody levels in dogs living in an environment with endemic CDV were lower than in vaccinated animals. PMID:20885846

  15. Neurite density from magnetic resonance diffusion measurements at ultrahigh field: Comparison with light microscopy and electron microscopy

    PubMed Central

    Jespersen, Sune N.; Bjarkam, Carsten R.; Nyengaard, Jens R.; Chakravarty, M. Mallar; Hansen, Brian; Vosegaard, Thomas; Østergaard, Leif; Yablonskiy, Dmitriy; Nielsen, Niels Chr.; Vestergaard-Poulsen, Peter

    2010-01-01

    Due to its unique sensitivity to tissue microstructure, diffusion-weighted magnetic resonance imaging (MRI) has found many applications in clinical and fundamental science. With few exceptions, however, a more precise correspondence between physiological or biophysical properties and the obtained diffusion parameters remains uncertain due to lack of specificity. In this work, we address this problem by comparing diffusion parameters of a recently introduced model for water diffusion in brain matter to light microscopy and quantitative electron microscopy. Specifically, we compare diffusion model predictions of neurite density in rats to optical myelin staining intensity and stereological estimation of neurite volume fraction using electron microscopy. We find that the diffusion model describes the data better and that its parameters show stronger correlation with optical and electron microscopy, and thus reflect myelinated neurite density better, than the more frequently used diffusion tensor imaging (DTI) and cumulant expansion methods. Furthermore, the estimated neurite orientations capture dendritic architecture more faithfully than DTI diffusion ellipsoids. PMID:19732836

  16. Recommendations for Benchmarking Preclinical Studies of Nanomedicines.

    PubMed

    Dawidczyk, Charlene M; Russell, Luisa M; Searson, Peter C

    2015-10-01

    Nanoparticle-based delivery systems provide new opportunities to overcome the limitations associated with traditional small-molecule drug therapy for cancer and to achieve both therapeutic and diagnostic functions in the same platform. Preclinical trials are generally designed to assess therapeutic potential and not to optimize the design of the delivery platform. Consequently, progress in developing design rules for cancer nanomedicines has been slow, hindering progress in the field. Despite the large number of preclinical trials, several factors restrict comparison and benchmarking of different platforms, including variability in experimental design, reporting of results, and the lack of quantitative data. To solve this problem, we review the variables involved in the design of preclinical trials and propose a protocol for benchmarking that we recommend be included in in vivo preclinical studies of drug-delivery platforms for cancer therapy. This strategy will contribute to building the scientific knowledge base that enables development of design rules and accelerates the translation of new technologies. ©2015 American Association for Cancer Research.

  17. Perspective: Recommendations for benchmarking pre-clinical studies of nanomedicines

    PubMed Central

    Dawidczyk, Charlene M.; Russell, Luisa M.; Searson, Peter C.

    2015-01-01

    Nanoparticle-based delivery systems provide new opportunities to overcome the limitations associated with traditional small molecule drug therapy for cancer, and to achieve both therapeutic and diagnostic functions in the same platform. Pre-clinical trials are generally designed to assess therapeutic potential and not to optimize the design of the delivery platform. Consequently, progress in developing design rules for cancer nanomedicines has been slow, hindering progress in the field. Despite the large number of pre-clinical trials, several factors restrict comparison and benchmarking of different platforms, including variability in experimental design, reporting of results, and the lack of quantitative data. To solve this problem, we review the variables involved in the design of pre-clinical trials and propose a protocol for benchmarking that we recommend be included in in vivo pre-clinical studies of drug delivery platforms for cancer therapy. This strategy will contribute to building the scientific knowledge base that enables development of design rules and accelerates the translation of new technologies. PMID:26249177

  18. An Investigation on Micro-Raman Spectra and Wavelet Data Analysis for Pemphigus Vulgaris Follow-up Monitoring.

    PubMed

    Camerlingo, Carlo; Zenone, Flora; Perna, Giuseppe; Capozzi, Vito; Cirillo, Nicola; Gaeta, Giovanni Maria; Lepore, Maria

    2008-06-01

    A wavelet multi-component decomposition algorithm has been used for data analysis of micro-Raman spectra of blood serum samples from patients affected by pemphigus vulgaris at different stages. Pemphigus is a chronic, autoimmune, blistering disease of the skin and mucous membranes with a potentially fatal outcome. Spectra were measured by means of a Raman confocal microspectrometer apparatus using the 632.8 nm line of a He-Ne laser source. A discrete wavelet transform decomposition method has been applied to the recorded Raman spectra in order to overcome problems related to low-level signals and the presence of noise and background components due to light scattering and fluorescence. This numerical data treatment can automatically extract quantitative information from the Raman spectra and makes data comparison more reliable. Although an exhaustive investigation has not been performed in this work, the feasibility of follow-up monitoring of pemphigus vulgaris has been clearly demonstrated, with useful implications for clinical applications.

  19. An Investigation on Micro-Raman Spectra and Wavelet Data Analysis for Pemphigus Vulgaris Follow-up Monitoring

    PubMed Central

    Camerlingo, Carlo; Zenone, Flora; Perna, Giuseppe; Capozzi, Vito; Cirillo, Nicola; Gaeta, Giovanni Maria; Lepore, Maria

    2008-01-01

    A wavelet multi-component decomposition algorithm has been used for data analysis of micro-Raman spectra of blood serum samples from patients affected by pemphigus vulgaris at different stages. Pemphigus is a chronic, autoimmune, blistering disease of the skin and mucous membranes with a potentially fatal outcome. Spectra were measured by means of a Raman confocal microspectrometer apparatus using the 632.8 nm line of a He-Ne laser source. A discrete wavelet transform decomposition method has been applied to the recorded Raman spectra in order to overcome problems related to low-level signals and the presence of noise and background components due to light scattering and fluorescence. This numerical data treatment can automatically extract quantitative information from the Raman spectra and makes data comparison more reliable. Although an exhaustive investigation has not been performed in this work, the feasibility of follow-up monitoring of pemphigus vulgaris has been clearly demonstrated, with useful implications for clinical applications. PMID:27879899
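
    The wavelet step described in these two records is a standard discrete wavelet denoising pass: decompose, threshold the detail coefficients, reconstruct. A minimal sketch with PyWavelets (the wavelet choice and threshold rule are illustrative, not the authors' exact settings):

        import numpy as np
        import pywt

        def wavelet_denoise(spectrum, wavelet="db4", level=5):
            """Soft-threshold the detail coefficients of a discrete wavelet
            decomposition; the coarsest approximation retains the slowly
            varying background, which can be subtracted separately."""
            coeffs = pywt.wavedec(spectrum, wavelet, level=level)
            # Universal threshold estimated from the finest detail level
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            thr = sigma * np.sqrt(2 * np.log(len(spectrum)))
            coeffs[1:] = [pywt.threshold(c, thr, mode="soft")
                          for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[: len(spectrum)]

        # clean = wavelet_denoise(raw_raman_counts)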

  20. Can plastic mulching replace irrigation in dryland agriculture?

    NASA Astrophysics Data System (ADS)

    Wang, L.; Daryanto, S.; Jacinthe, P. A.

    2017-12-01

    Increasing water use efficiency (WUE) is a key strategy for maintaining crop yields without over-exploiting scarce water resources. Plastic mulching of wheat and maize has been commonly used in China, but its effects on yield, soil moisture, evapotranspiration (ET), and WUE have not been compared with those of traditional irrigation. Using a meta-analysis approach, we quantitatively examined the efficacy of plastic mulching in comparison with traditional irrigation in dryland agriculture. Our results showed that plastic mulching resulted in yield increases comparable to those of irrigated crops while using 24% less water. By covering the ridges with plastic and channeling rainwater into a very narrow planting zone (furrow), plastic mulching increased WUE and available soil moisture. The higher WUE in plastic-mulched croplands was likely a result of a greater proportion of available water being used for transpiration rather than evaporation. If problems related to production costs and residual plastic pollution can be managed, plastic mulching technology could become a promising strategy for dryland farming in other regions.

  1. Quantitative Features of Liver Lesions, Lung Nodules, and Renal Stones at Multi-Detector Row CT Examinations: Dependency on Radiation Dose and Reconstruction Algorithm.

    PubMed

    Solomon, Justin; Mileto, Achille; Nelson, Rendon C; Roy Choudhury, Kingshuk; Samei, Ehsan

    2016-04-01

    To determine if radiation dose and reconstruction algorithm affect the computer-based extraction and analysis of quantitative imaging features in lung nodules, liver lesions, and renal stones at multi-detector row computed tomography (CT). Retrospective analysis of data from a prospective, multicenter, HIPAA-compliant, institutional review board-approved clinical trial was performed by extracting 23 quantitative imaging features (size, shape, attenuation, edge sharpness, pixel value distribution, and texture) of lesions on multi-detector row CT images of 20 adult patients (14 men, six women; mean age, 63 years; range, 38-72 years) referred for known or suspected focal liver lesions, lung nodules, or kidney stones. Data were acquired between September 2011 and April 2012. All multi-detector row CT scans were performed at two different radiation dose levels; images were reconstructed with filtered back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction (MBIR) algorithms. A linear mixed-effects model was used to assess the effect of radiation dose and reconstruction algorithm on the extracted features. Among the 23 imaging features assessed, radiation dose had a significant effect on five, three, and four of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Adaptive statistical iterative reconstruction had a significant effect on three, one, and one of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). MBIR reconstruction had a significant effect on nine, 11, and 15 of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Of note, the measured sizes of lung nodules and renal stones with MBIR were significantly different from those obtained with the other two algorithms (P < .002 for all comparisons). Although lesion texture was significantly affected by the reconstruction algorithm used (an average of 3.33 features affected by MBIR across lesion types; P < .002 for all comparisons), no significant effect of the radiation dose setting was observed for all but one of the texture features (P = .002-.998). Radiation dose settings and reconstruction algorithms affect the extraction and analysis of quantitative imaging features in lesions at multi-detector row CT.
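
    The analysis reported here, a linear mixed-effects model with patient as the grouping factor and dose and algorithm as fixed effects, can be sketched with statsmodels (column names and data are placeholders, not the trial's data):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 120
        df = pd.DataFrame({
            "feature": rng.normal(0, 1, n),  # e.g. a nodule size metric
            "dose": np.tile(["standard", "reduced"], n // 2),
            "algorithm": np.tile(["FBP", "ASIR", "MBIR"], n // 3),
            "patient": np.repeat([f"p{i}" for i in range(20)], n // 20),
        })

        # Random intercept per patient; dose and algorithm as fixed effects
        model = smf.mixedlm("feature ~ C(dose) + C(algorithm)", df,
                            groups=df["patient"])
        print(model.fit().summary())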

  2. International Comparisons of Behavioral and Emotional Problems in Preschool Children: Parents' Reports from 24 Societies

    ERIC Educational Resources Information Center

    Rescorla, Leslie A.; Achenbach, Thomas M.; Ivanova, Masha Y.; Harder, Valerie S.; Otten, Laura; Bilenberg, Niels; Bjarnadottir, Gudrun; Capron, Christiane; De Pauw, Sarah S. W.; Dias, Pedro; Dobrean, Anca; Dopfner, Manfred; Duyme, Michel; Eapen, Valsamma; Erol, Nese; Esmaeili, Elaheh Mohammad; Ezpeleta, Lourdes; Frigerio, Alessandra; Fung, Daniel S. S.; Goncalves, Miguel; Gudmundsson, Halldor; Jeng, Suh-Fang; Jusiene, Roma; Kim, Young Ah; Kristensen, Solvejg; Liu, Jianghong; Lecannelier, Felipe; Leung, Patrick W. L.; Machado, Barbara Cesar; Montirosso, Rosario; Oh, Kyung Ja; Ooi, Yoon Phaik; Pluck, Julia; Pomalima, Rolando; Pranvera, Jetishi; Schmeck, Klaus; Shahini, Mimoza; Silva, Jaime R.; Simsek, Zeynep; Sourander, Andre; Valverde, Jose; van der Ende, Jan; Van Leeuwen, Karla G.; Wu, Yen-Tzu; Yurdusen, Sema; Zubrick, Stephen R.; Verhulst, Frank C.

    2011-01-01

    International comparisons were conducted of preschool children's behavioral and emotional problems as reported on the Child Behavior Checklist for Ages 1 1/2-5 by parents in 24 societies (N = 19,850). Item ratings were aggregated into scores on syndromes; "Diagnostic and Statistical Manual of Mental Disorders"-oriented scales; a Stress…

  3. An Analysis of Comparison Questions in the Context of Auditing.

    ERIC Educational Resources Information Center

    Lauer, Thomas W.; Peacock, Eileen

    1990-01-01

    Provides a definition of comparison questions and shows how they relate to the semantic categories of two taxonomies for classifying questions, both of which omit comparison questions. Examines the comparison questions that auditors generate when they diagnose problems in a company. (SR)

  4. Explaining quantum correlations through evolution of causal models

    NASA Astrophysics Data System (ADS)

    Harper, Robin; Chapman, Robert J.; Ferrie, Christopher; Granade, Christopher; Kueng, Richard; Naoumenko, Daniel; Flammia, Steven T.; Peruzzo, Alberto

    2017-04-01

    We propose a framework for the systematic and quantitative generalization of Bell's theorem using causal networks. We first consider the multiobjective optimization problem of matching observed data while minimizing the causal effect of nonlocal variables and prove an inequality for the optimal region that both strengthens and generalizes Bell's theorem. To solve the optimization problem (rather than simply bound it), we develop a genetic algorithm that treats causal networks as individuals. By applying our algorithm to a photonic Bell experiment, we demonstrate the trade-off between the quantitative relaxation of one or more local causality assumptions and the ability of data to match quantum correlations.
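
    The abstract does not detail the algorithm's internals, so the sketch below shows only the generic genetic-algorithm loop such an approach builds on (tournament selection, one-point crossover, Gaussian mutation), with a toy fitness function standing in for the authors' objective of matching observed correlations while minimizing nonlocal causal influence.

    ```python
    import random

    def fitness(genome):
        """Placeholder objective: in the paper's setting this would score how
        well a candidate causal network reproduces the observed correlations
        while penalizing the causal effect of nonlocal variables."""
        return -sum(g * g for g in genome)  # toy: maximized when all genes are 0

    def evolve(pop_size=50, genome_len=8, generations=200,
               mutation_rate=0.1, tournament=3):
        pop = [[random.uniform(-1, 1) for _ in range(genome_len)]
               for _ in range(pop_size)]
        for _ in range(generations):
            new_pop = []
            for _ in range(pop_size):
                # tournament selection of two parents
                p1 = max(random.sample(pop, tournament), key=fitness)
                p2 = max(random.sample(pop, tournament), key=fitness)
                # one-point crossover
                cut = random.randrange(1, genome_len)
                child = p1[:cut] + p2[cut:]
                # Gaussian mutation
                child = [g + random.gauss(0, 0.1)
                         if random.random() < mutation_rate else g
                         for g in child]
                new_pop.append(child)
            pop = new_pop
        return max(pop, key=fitness)

    best = evolve()
    print(fitness(best))
    ```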

  5. Quality of life in overweight and obese young Chinese children: a mixed-method study

    PubMed Central

    2013-01-01

    Background Obesity among young children in Hong Kong has become a public health problem. This study explored associations between Chinese parents’ reports of their children’s quality of life (QoL), socio-demographics, and young children’s weight status across 27 preschool settings. Methods A mixed-method approach, including quantitative and qualitative tools, was employed for this cross-sectional study. Quantitative data were collected from 336 Chinese parents of children aged 2–7 years. The Paediatric Quality of Life Inventory 4.0 (PedsQL, v 4.0) and a questionnaire about parents’ socio-demographics were used. In-depth interviews with mothers, teachers and children from a larger sample were the basis of 10 case studies. Quantitative data were analysed using chi-square analysis, one-way ANOVA and logistic regression. Qualitative data were analysed according to a multi-level framework that established linkages with quantitative data. Results The children’s Body Mass Index (BMI) ranged from 11.3 to 28.0 kg/m2 and was classified into four weight groups. ANOVAs showed that the normal-weight children had significantly higher PedsQL scores in Physical Functioning than obese children (mean difference = 14.19, p < .0083) and significantly higher scores in School Functioning than overweight children (mean difference = 10.15, p < .0083). Results of logistic regression showed that relative to normal-weight children, obese children had 2–5 times higher odds of showing problems in Physical Functioning, Social Functioning and School Performance. Overweight children had 2 times higher odds of problems in Social Functioning, and underweight children had 2 times higher odds of problems in Physical Functioning. Children’s age (χ2 = 21.71, df = 3, p < 0.01) and housing (χ2 = 33.00, df = 9, p < 0.01) were associated with their weight. The case studies supplemented the quantitative data, indicating that children showed emotional problems across the different abnormal weight statuses, and that the association between children’s weight status and well-being might be affected by multiple childcare arrangements and familial immigration status. Conclusions This study is one of only a few that have examined parents’, teachers’ and young children’s own perceptions of the children’s quality of life across different weight statuses. The results are discussed in terms of their implications for intervention. PMID:23496917
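
    As a hedged illustration of the logistic-regression step, the sketch below estimates odds ratios for showing problems in a PedsQL domain by weight group, with normal weight as the reference category; the column names and synthetic problem rates are hypothetical, chosen only to mimic the effect sizes reported above.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 336
    df = pd.DataFrame({"weight_group": rng.choice(
        ["normal", "underweight", "overweight", "obese"], n)})
    # Synthetic binary outcome: 1 if the child shows problems in a domain,
    # with higher problem rates in the abnormal-weight groups.
    p = df["weight_group"].map(
        {"normal": 0.2, "underweight": 0.33, "overweight": 0.33, "obese": 0.5})
    df["social_problem"] = rng.binomial(1, p)

    # Normal weight as the reference level, matching the abstract's framing.
    res = smf.logit(
        "social_problem ~ C(weight_group, Treatment(reference='normal'))",
        data=df).fit(disp=False)
    print(np.exp(res.params))  # odds ratios relative to normal weight
    ```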

  6. Self-directed learning readiness of Asian students: students’ perspective on a hybrid problem-based learning curriculum

    PubMed Central

    Susilo, Astrid P.; van Berkel, Henk

    2016-01-01

    Objectives To identify students’ readiness to perform self-directed learning and the underlying factors influencing it in a hybrid problem-based learning curriculum. Methods A combination of quantitative and qualitative studies was conducted in five medical schools in Indonesia. In the quantitative study, the Self-Directed Learning Readiness Scale was distributed to all students in all batches who had experience with the hybrid problem-based curriculum. They were categorized into low and high levels based on the score of the questionnaire. Three focus group discussions (low-, high-, and mixed-level) were conducted in the qualitative study, with six to twelve students chosen randomly from each group, to find the factors influencing their self-directed learning readiness. Two researchers analysed the qualitative data as a measure of triangulation. Results The quantitative study showed that only half of the students had a high level of self-directed learning readiness, and a similar trend also occurred in each batch. The proportion of students with a high level of self-directed learning readiness was lower in the senior students than in the more junior students. The qualitative study showed that problem-based learning processes, assessments, the learning environment, students’ lifestyles, students’ perceptions of the topics, and mood were factors influencing their self-directed learning. Conclusion A hybrid problem-based curriculum may not fully develop students’ self-directed learning. The curriculum system, teachers’ experiences, students’ backgrounds, and cultural factors might contribute to students’ difficulties in conducting self-directed learning. PMID:27915308

  7. A novel quantitative real-time polymerase chain reaction method for detecting toxigenic Pasteurella multocida in nasal swabs from swine.

    PubMed

    Scherrer, Simone; Frei, Daniel; Wittenbrink, Max Michael

    2016-12-01

    Progressive atrophic rhinitis (PAR) in pigs is caused by toxigenic Pasteurella multocida. In Switzerland, PAR is monitored by selective culture of nasal swabs and subsequent polymerase chain reaction (PCR) screening of bacterial colonies for the P. multocida toxA gene. A panel of 203 nasal swabs from a recent PAR outbreak was used to evaluate a novel quantitative real-time PCR for toxigenic P. multocida in porcine nasal swabs. Whereas the conventional PCR had a limit of detection of 100 genome equivalents per reaction, the real-time PCR had a limit of detection of 10 genome equivalents. The real-time PCR detected toxA-positive P. multocida in 101 samples (49.8%), whereas the conventional PCR was less sensitive, with 90 toxA-positive samples (44.3%). Moreover, 5.4% of the toxA-positive samples yielded unevaluable results by conventional PCR. The approach of culture-coupled toxA PCR for the monitoring of PAR in pigs is substantially improved by the novel quantitative real-time PCR.
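
    The abstract reports limits of detection but not the calibration behind them; the snippet below is a generic sketch of how a quantitative real-time PCR converts a threshold cycle (Ct) into genome equivalents via a standard curve, using made-up slope and intercept values rather than the assay's actual parameters.

    ```python
    # Hypothetical standard-curve parameters fitted from serial dilutions:
    # Ct = slope * log10(copies) + intercept
    slope, intercept = -3.32, 38.0  # slope near -3.32 implies ~100% efficiency

    def genome_equivalents(ct):
        """Invert the standard curve: estimated copies per reaction."""
        return 10 ** ((ct - intercept) / slope)

    # Amplification efficiency implied by the slope: E = 10^(-1/slope) - 1
    efficiency = 10 ** (-1 / slope) - 1

    print(genome_equivalents(34.7))        # ~10 copies, near the reported LoD
    print(f"efficiency = {efficiency:.1%}")  # ~100% for this assumed slope
    ```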

  8. Mammographic features and subsequent risk of breast cancer: a comparison of qualitative and quantitative evaluations in the Guernsey prospective studies.

    PubMed

    Torres-Mejía, Gabriela; De Stavola, Bianca; Allen, Diane S; Pérez-Gavilán, Juan J; Ferreira, Jorge M; Fentiman, Ian S; Dos Santos Silva, Isabel

    2005-05-01

    Mammographic features are known to be associated with breast cancer, but the magnitude of the effect differs markedly from study to study. Methods to assess mammographic features range from subjective qualitative classifications to computer-automated quantitative measures. We used data from the UK Guernsey prospective studies to examine the relative value of these methods in predicting breast cancer risk. In all, 3,211 women aged ≥35 years who had a mammogram taken in 1986 to 1989 were followed up to the end of October 2003, with 111 developing breast cancer during this period. Mammograms were classified using the subjective qualitative Wolfe classification and several quantitative mammographic features measured using computer-based techniques. Breast cancer risk was positively associated with high-grade Wolfe classification, percent breast density, and area of dense tissue, and negatively associated with area of lucent tissue, fractal dimension, and lacunarity. Inclusion of the quantitative measures in the same model identified area of dense tissue and lacunarity as the best predictors of breast cancer, with risk increasing by 59% [95% confidence interval (95% CI), 29-94%] per SD increase in total area of dense tissue but declining by 39% (95% CI, 53-22%) per SD increase in lacunarity, after adjusting for each other and for other confounders. Comparison of models that included both the qualitative Wolfe classification and these two quantitative measures with models that included either the qualitative or the two quantitative variables showed that they all made significant contributions to the prediction of breast cancer risk. These findings indicate that breast cancer risk is affected not only by the amount of mammographic density but also by the degree of heterogeneity of the parenchymal pattern and, presumably, by other features captured by the Wolfe classification.
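
    The per-SD risk figures quoted above follow directly from exponentiating a coefficient fitted on the log relative-risk scale; here is a small worked example, with the reported effect sizes plugged in as assumed coefficients (not the study's actual fit).

    ```python
    import math

    # Assumed per-SD coefficients on the log relative-risk scale, chosen to
    # reproduce the effect sizes reported in the abstract.
    beta_dense_area = math.log(1.59)  # +59% risk per SD of dense-tissue area
    beta_lacunarity = math.log(0.61)  # -39% risk per SD of lacunarity

    def pct_change_per_sd(beta):
        """Convert a log relative-risk coefficient to a percent change per SD."""
        return (math.exp(beta) - 1) * 100

    print(pct_change_per_sd(beta_dense_area))  # ~ +59
    print(pct_change_per_sd(beta_lacunarity))  # ~ -39

    # A 2-SD increase compounds multiplicatively, not additively:
    print((math.exp(2 * beta_dense_area) - 1) * 100)  # ~ +153%, not +118%
    ```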

  9. CONRAD—A software framework for cone-beam imaging in radiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maier, Andreas; Choi, Jang-Hwan; Riess, Christian

    2013-11-15

    Purpose: In the community of x-ray imaging, there is a multitude of tools and applications that are used in scientific practice. Many of these tools are proprietary and can only be used within a certain lab. Often the same algorithm is implemented multiple times by different groups in order to enable comparison. In an effort to tackle this problem, the authors created CONRAD, a software framework that provides many of the tools that are required to simulate basic processes in x-ray imaging and perform image reconstruction with consideration of nonlinear physical effects. Methods: CONRAD is a Java-based state-of-the-art software platform with extensive documentation. It is based on platform-independent technologies. Special libraries offer access to hardware acceleration such as OpenCL. There is an easy-to-use interface for parallel processing. The software package includes different simulation tools that are able to generate up to 4D projection and volume data and respective vector motion fields. Well-known reconstruction algorithms such as FBP, DBP, and ART are included. All algorithms in the package are referenced to a scientific source. Results: A total of 13 different phantoms and 30 processing steps have already been integrated into the platform at the time of writing. The platform comprises 74,000 nonblank lines of code, of which 19% are used for documentation. The software package is available for download at http://conrad.stanford.edu. To demonstrate the use of the package, the authors reconstructed images from two different scanners, a table-top system and a clinical C-arm system. Runtimes were evaluated using the RabbitCT platform and demonstrate state-of-the-art runtimes with 2.5 s for the 256 problem size and 12.4 s for the 512 problem size. Conclusions: As a common software framework, CONRAD enables the medical physics community to share algorithms and develop new ideas. In particular, this offers new opportunities for scientific collaboration and quantitative performance comparison between the methods of different groups.
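
    CONRAD itself is Java-based and its API is not shown in the abstract, so rather than guess at its classes, the following Python sketch illustrates the filtered back projection (FBP) step that frameworks like CONRAD implement, using scikit-image's radon/iradon on a standard test phantom.

    ```python
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    # Simulate parallel-beam projections of the Shepp-Logan test phantom...
    phantom = rescale(shepp_logan_phantom(), 0.5)
    theta = np.linspace(0.0, 180.0, max(phantom.shape), endpoint=False)
    sinogram = radon(phantom, theta=theta)

    # ...and reconstruct with filtered back projection (ramp filter).
    reconstruction = iradon(sinogram, theta=theta, filter_name="ramp")

    rms_error = np.sqrt(np.mean((reconstruction - phantom) ** 2))
    print(f"FBP RMS error: {rms_error:.4f}")
    ```

    Cone-beam FBP (e.g., FDK) adds weighting and a 3D backprojection geometry, but the filter-then-backproject structure is the same.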

  10. Quantitative Evaluation of Third Year Medical Students' Perception and Satisfaction from Problem Based Learning in Anatomy: A Pilot Study of the Introduction of Problem Based Learning into the Traditional Didactic Medical Curriculum in Nigeria

    ERIC Educational Resources Information Center

    Saalu, L. C.; Abraham A. A.; Aina, W. O.

    2010-01-01

    Problem-based learning (PBL) is a method of teaching that uses hypothetical clinical cases, individual investigation, and group process. In recent years, PBL has increasingly been adopted as the preferred pedagogy in medical education in many countries around the world. Controversy, however, still exists as the potential…

  11. Improving genomic prediction for pre-harvest sprouting tolerance in wheat by weighting large-effect quantitative trait loci

    USDA-ARS?s Scientific Manuscript database

    Pre-harvest sprouting (PHS) is a major problem in wheat (Triticum aestivum L.) that occurs when grains in a mature spike germinate prior to harvest, resulting in reduced yield, quality, and grain sale price. Improving PHS tolerance (PHST) is a challenge to wheat breeders because it is quantitatively...

  12. Preschool Temperament Assessment: A Quantitative Assessment of the Validity of Behavioral Style Questionnaire Data

    ERIC Educational Resources Information Center

    Huelsman, Timothy J.; Gagnon, Sandra Glover; Kidder-Ashley, Pamela; Griggs, Marissa Swaim

    2014-01-01

    Research Findings: Child temperament is an important construct, but its measurement has been marked by a number of weaknesses that have diminished the frequency with which it is assessed in practice. We address this problem by presenting the results of a quantitative construct validation study. We calculated validity indices by hypothesizing the…

  13. TPACK Levels of Physics and Science Teacher Candidates: Problems and Possible Solutions

    ERIC Educational Resources Information Center

    Bozkurt, Ersin

    2014-01-01

    This research examined whether the technological pedagogical content knowledge (TPACK) of physics and science teachers is at a sufficient level and whether the TPACK level affected the academic achievements of the students. In the research, a mixed method was used quantitatively and qualitatively. In the quantitative part of the research, Provus'…

  14. Teaching Children How to Include the Inversion Principle in Their Reasoning about Quantitative Relations

    ERIC Educational Resources Information Center

    Nunes, Terezinha; Bryant, Peter; Evans, Deborah; Bell, Daniel; Barros, Rossana

    2012-01-01

    The basis of this intervention study is a distinction between numerical calculus and relational calculus. The former refers to numerical calculations and the latter to the analysis of the quantitative relations in mathematical problems. The inverse relation between addition and subtraction is relevant to both kinds of calculus, but so far research…

  15. Solar Occultation Satellite Data and Derived Meteorological Products: Sampling Issues and Comparisons with Aura MLS

    NASA Technical Reports Server (NTRS)

    Manney, Gloria; Daffer, William H.; Zawodny, Joseph M.; Bernath, Peter F.; Hoppel, Karl W.; Walker, Kaley A.; Knosp, Brian W.; Boone, Chris; Remsberg, Ellis E.; Santee, Michelle L.

    2007-01-01

    Derived Meteorological Products (DMPs, including potential temperature (theta), potential vorticity, equivalent latitude (EqL), horizontal winds and tropopause locations) have been produced for the locations and times of measurements by several solar occultation (SO) instruments and the Aura Microwave Limb Sounder (MLS). DMPs are calculated from several meteorological analyses for the Atmospheric Chemistry Experiment-Fourier Transform Spectrometer, Stratospheric Aerosol and Gas Experiment II and III, Halogen Occultation Experiment, and Polar Ozone and Aerosol Measurement II and III SO instruments and MLS. Time-series comparisons of MLS version 1.5 and SO data using DMPs show good qualitative agreement in the time evolution of O3, N2O, H2O, CO, HNO3, HCl and temperature; quantitative agreement is good in most cases. EqL-coordinate comparisons of MLS version 2.2 and SO data show good quantitative agreement throughout the stratosphere for most of these species, with significant biases for a few species in localized regions. Comparisons in EqL coordinates of MLS and SO data, and of SO data with geographically coincident MLS data, provide insight into where and how sampling effects are important in the interpretation of the sparse SO data, thus assisting in fully utilizing the SO data in scientific studies and comparisons with other sparse datasets. The DMPs are valuable for scientific studies and to facilitate validation of non-coincident measurements.
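
    Equivalent latitude, one of the DMPs listed above, is the latitude of the polar cap enclosing the same area as a given potential-vorticity (PV) contour. Below is a hedged numpy sketch of the standard area-ranking construction, assuming a regular latitude-longitude grid and a Northern Hemisphere (highest-PV-poleward) orientation; variable names and the toy PV field are illustrative only.

    ```python
    import numpy as np

    R = 6.371e6  # Earth radius [m]

    def equivalent_latitude(pv, lat, lon):
        """Map each PV value to the latitude of the polar cap whose area
        equals the area enclosed by that PV contour."""
        lat_r = np.deg2rad(lat)
        dlat = np.deg2rad(abs(lat[1] - lat[0]))
        dlon = np.deg2rad(abs(lon[1] - lon[0]))
        # grid-cell areas on the sphere, broadcast to the full grid
        area = (R**2 * dlat * dlon * np.cos(lat_r))[:, None] * np.ones((1, lon.size))
        flat_pv, flat_area = pv.ravel(), area.ravel()
        order = np.argsort(flat_pv)[::-1]        # highest PV first
        cum_area = np.cumsum(flat_area[order])   # area enclosed by each contour
        # polar-cap area A = 2*pi*R^2*(1 - sin(phi)) inverted for phi
        eql = np.arcsin(np.clip(1.0 - cum_area / (2 * np.pi * R**2), -1, 1))
        out = np.empty(flat_pv.size)
        out[order] = np.rad2deg(eql)
        return out.reshape(pv.shape)

    # toy usage on a coarse grid
    lat = np.linspace(-89, 89, 90)
    lon = np.linspace(0, 358, 180)
    pv = np.sin(np.deg2rad(lat))[:, None] + 0.1 * np.random.rand(90, 180)
    print(equivalent_latitude(pv, lat, lon).round(1))
    ```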

  16. Big Fish in a Big Pond: a study of academic self concept in first year medical students

    PubMed Central

    2011-01-01

    Background Big-fish-little-pond effect (BFLPE) research has demonstrated that students in high-ability environments have lower academic self-concepts than equally able students in low-ability settings. Research has shown low academic self-concepts to be associated with negative educational outcomes. Social comparison processes have been implicated as fundamental to the BFLPE. Methods Twenty first-year students in an Australian medical school completed a survey that included academic self-concept and social comparison measures, before and after their first written assessments. Focus groups were also conducted with a separate group of students to explore students' perceptions of competence, the medical school environment, and social comparison processes. Results The quantitative study did not reveal any changes in academic self-concept or self-evaluation. The qualitative study suggested that the attributions that students used when discussing performance were those that have been demonstrated to negatively affect self-concept. Students reported that the environment was slightly competitive and that they used social comparison to evaluate their performance. Conclusions Although the BFLPE was not evident in the quantitative study, results from the qualitative study suggest that the BFLPE might be operating, in that students were using attributions that are associated with lower self-concepts, the environment was slightly competitive, and social comparisons were used for evaluation. PMID:21794166

  17. Ancient Paradoxes Can Extend Mathematical Thinking

    ERIC Educational Resources Information Center

    Czocher, Jennifer A.; Moss, Diana L.

    2017-01-01

    This article presents the Snail problem, a relatively simple challenge about motion that offers engaging extensions involving the notion of infinity. It encourages students in grades 5-9 to connect mathematics learning to logic, history, and philosophy through analyzing the problem, making sense of quantitative relationships, and modeling with…

  18. Freud, Problem Solving, Ethnicity, and Race: Integrating Psychology into the Interdisciplinary Core Curriculum.

    ERIC Educational Resources Information Center

    Dunn, Dana S.

    The new core curriculum at Moravian College, in Pennsylvania, utilizes an interdisciplinary approach, integrating topics of psychology into three of the seven core courses: "Microcosm/Macrocosm"; "Quantitative Problem Solving"; and the seminar "Gender, Ethnicity, and Race." The course "Microcosm/Macrocosm"…

  19. Wildland Fire Prevention: Today, Intuition--Tomorrow, Management

    Treesearch

    Albert J. Simard; Linda R. Donoghue

    1987-01-01

    Describes, from a historical perspective, methods used to characterize fire prevention problems and evaluate prevention programs and discusses past research efforts to bolster these analytical and management efforts. Highlights research on the sociological perspectives of the wildfire problem and on quantitative fire occurrence prediction and program evaluation systems...

  20. Validity of the Schizophrenia Diagnosis of the Psychopathology Instrument for Mentally Retarded Adults (PIMRA): A Comparison of Schizophrenic Patients with and without Mental Retardation.

    ERIC Educational Resources Information Center

    Linaker, Olav M.; Helle, Jon

    1994-01-01

    This study found that the schizophrenia subscale of the Psychopathology Instrument for Mentally Retarded Adults was a valid quantitative measure of schizophrenia if one item was removed from the scale. Comparison with a nonretarded population indicated that mentally retarded patients had fewer delusions and more incoherence and flat affect. They…
