Management Approaches to Stomal and Peristomal Complications: A Narrative Descriptive Study.
Beitz, Janice M; Colwell, Janice C
2016-01-01
The purpose of this study was to identify optimal interventions for selected complications based on WOC nurse experts' judgment/expertise. A cross-sectional quantitative descriptive design with qualitative, narrative-type components was used for this study. Following validation rating of appropriateness of interventions and quantitative rankings of first-, second-, and third-line approaches, participants provided substantive handwritten narrative comments about listed interventions. Comments were organized and prioritized using frequency count. Narrative comments reflected the quantitative rankings of efficacy of approaches. Clinicians offered further specific suggestions regarding product use and progression of care for selected complications. Narrative analysis using descriptive quantitative frequency count supported the rankings of most preferred treatments of selected stomal and peristomal complications. Findings add to the previous research on prioritized approaches and evidence-based practice in ostomy care.
ERIC Educational Resources Information Center
Chen, Alvin Cheng-Hsien
2014-01-01
The present study aims to investigate how conceptual symmetry plays a role in the use of spatial particles in English and to further examine its pedagogical implications via a corpus-based evaluation of the course books in senior high schools in Taiwan. More specifically, we adopt a quantitative corpus-based approach to investigate whether bipolar…
Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li
2013-01-21
A very simple approach to quantitative analysis is proposed based on digital image processing of three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As region-based shape features of a grayscale image, Zernike moments, which are inherently invariant, were employed to establish linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained. The correlation coefficients (R2) for the training and test sets were greater than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new approach to quantitative analysis using 3D spectra, which can be extended to other 3D spectra obtained by different methods or instruments.
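A minimal sketch of how such an image-moment workflow might look in code, assuming each 3D HPLC-DAD spectrum has already been rendered as a grayscale intensity matrix; the Zernike moments are computed here with the mahotas library and plain least-squares regression stands in for the stepwise-selected linear models described above, so all names and settings are illustrative rather than taken from the paper:

    import numpy as np
    import mahotas
    from sklearn.linear_model import LinearRegression

    def zernike_features(image, radius=64, degree=8):
        # Region-based shape features of a grayscale image (2D array)
        return mahotas.features.zernike_moments(image, radius, degree=degree)

    def fit_quantitative_model(images, concentrations):
        # images: 2D arrays rendered from 3D HPLC-DAD spectra (illustrative data)
        # concentrations: known analyte levels for the calibration mixtures
        X = np.array([zernike_features(img) for img in images])
        return LinearRegression().fit(X, concentrations)

    def predict_concentration(model, image):
        return model.predict(zernike_features(image).reshape(1, -1))[0]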
ERIC Educational Resources Information Center
Houser, Bonnie L.
2017-01-01
There are relatively few empirical studies that examine whether using a competency-based education (CBE) approach results in increased student learning or achievement when compared to traditional education approaches. This study uses a quantitative research methodology, a nonexperimental comparative descriptive research design, and a two-group…
Comprehensive Quantitative Analysis on Privacy Leak Behavior
Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan
2013-01-01
Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046
ERIC Educational Resources Information Center
Jenkins, Craig
2015-01-01
This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…
Standardization approaches in absolute quantitative proteomics with mass spectrometry.
Calderón-Celis, Francisco; Encinar, Jorge Ruiz; Sanz-Medel, Alfredo
2017-07-31
Mass spectrometry-based approaches have enabled important breakthroughs in quantitative proteomics in the last decades. This development is reflected in better quantitative assessment of protein levels as well as in improved understanding of post-translational modifications and of protein complexes and networks. Nowadays, the focus of quantitative proteomics has shifted from the relative determination of proteins (i.e., differential expression between two or more cellular states) to absolute quantity determination, required for a more thorough characterization of biological models and comprehension of proteome dynamics, as well as for the search and validation of novel protein biomarkers. However, the physico-chemical environment of the analyte species strongly affects the ionization efficiency in most mass spectrometry (MS) types, which therefore require specially designed standardization approaches to provide absolute quantification. The most common such approaches nowadays include (i) the use of stable isotope-labeled peptide standards, isotopologues of the target proteotypic peptides expected after tryptic digestion of the target protein; (ii) the use of stable isotope-labeled protein standards to compensate for sample preparation, sample loss, and proteolysis steps; (iii) isobaric reagents, which after fragmentation in the MS/MS analysis provide a final detectable mass shift and can be used to tag both analyte and standard samples; (iv) label-free approaches, in which the absolute quantitative data are not obtained through any kind of labeling but from computational normalization of the raw data and adequate standards; (v) elemental mass spectrometry-based workflows able to directly provide absolute quantification of peptides/proteins that contain an ICP-detectable element. A critical assessment, from the analytical chemistry perspective, of the different standardization approaches and their combinations used so far for absolute quantitative MS-based (molecular and elemental) proteomics is provided in this review. © 2017 Wiley Periodicals, Inc.
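As an illustration of approach (i), the arithmetic behind quantitation with a stable isotope-labeled (heavy) peptide standard is sketched below: a known amount of the heavy isotopologue is spiked into the digest, and the analyte amount follows from the light/heavy peak-area ratio. The function and variable names are illustrative, and the sketch assumes equal ionization efficiency of the two isotopologues:

    def absolute_amount(light_area, heavy_area, spiked_heavy_fmol):
        # Absolute amount (fmol) of the endogenous, light proteotypic peptide
        return spiked_heavy_fmol * light_area / heavy_area

    # Example: 100 fmol heavy standard spiked in, light/heavy area ratio of 0.42
    print(absolute_amount(light_area=4.2e6, heavy_area=1.0e7, spiked_heavy_fmol=100.0))  # 42.0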
Automated classification of cell morphology by coherence-controlled holographic microscopy
NASA Astrophysics Data System (ADS)
Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim
2017-08-01
In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy, which enables quantitative phase imaging, to the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could provide valuable help in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all the preconditions for accurate automated analysis of live cell behavior while enabling noninvasive label-free imaging with sufficient contrast and high spatiotemporal phase sensitivity.
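A schematic sketch, with hypothetical feature matrices, of how a comparison between morphometric (MO) and quantitative phase (QPI) feature sets could be run with off-the-shelf supervised classifiers; the classifiers and cross-validation settings are illustrative choices, not those reported in the study:

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC
    from sklearn.ensemble import RandomForestClassifier

    def compare_feature_sets(X_mo, X_qpi, y, cv=5):
        # Cross-validated accuracy for MO features vs. quantitative phase features
        results = {}
        for name, clf in [("SVM", SVC()), ("RandomForest", RandomForestClassifier())]:
            results[(name, "MO")] = cross_val_score(clf, X_mo, y, cv=cv).mean()
            results[(name, "QPI")] = cross_val_score(clf, X_qpi, y, cv=cv).mean()
        return results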
A qualitative and quantitative assessment for a bone marrow harvest simulator.
Machado, Liliane S; Moraes, Ronei M
2009-01-01
Several approaches to performing assessment in virtual reality-based training simulators have been proposed. There are two kinds of assessment methods: offline and online. The main requirements for online training assessment methodologies applied to virtual reality systems are low computational complexity and high accuracy. Several approaches for the general case that satisfy these requirements can be found in the literature. A drawback of those approaches is that they provide unsatisfactory solutions for specific cases, as in some medical procedures, where both quantitative and qualitative information are available to perform the assessment. In this paper, we present an approach to online training assessment based on a Modified Naive Bayes classifier that can handle qualitative and quantitative variables simultaneously. A specific medical case was simulated in a bone marrow harvest simulator. The results obtained were satisfactory and demonstrated the applicability of the method.
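The authors' Modified Naive Bayes is not reproduced here, but the core idea of a naive Bayes that scores qualitative and quantitative variables simultaneously can be sketched as follows: categorical features get Laplace-smoothed frequency likelihoods, continuous features get Gaussian likelihoods, and the two are combined under the usual conditional-independence assumption. This is a generic illustration, not the authors' exact formulation:

    import numpy as np
    from collections import defaultdict

    class MixedNaiveBayes:
        # Naive Bayes over a mix of categorical (X_cat) and continuous (X_num) features,
        # all supplied as 2D numpy arrays with one row per training example.
        def fit(self, X_cat, X_num, y):
            y = np.asarray(y)
            self.classes = np.unique(y)
            self.n_values = [len(set(X_cat[:, j])) for j in range(X_cat.shape[1])]
            self.priors, self.counts, self.n_c, self.gauss = {}, {}, {}, {}
            for c in self.classes:
                idx = (y == c)
                self.priors[c] = idx.mean()
                self.n_c[c] = idx.sum()
                self.counts[c] = [defaultdict(int) for _ in range(X_cat.shape[1])]
                for row in X_cat[idx]:
                    for j, v in enumerate(row):
                        self.counts[c][j][v] += 1
                self.gauss[c] = (X_num[idx].mean(axis=0), X_num[idx].std(axis=0) + 1e-9)
            return self

        def predict(self, x_cat, x_num):
            def log_score(c):
                mu, sd = self.gauss[c]
                s = np.log(self.priors[c])
                for j, v in enumerate(x_cat):
                    s += np.log((self.counts[c][j][v] + 1) / (self.n_c[c] + self.n_values[j]))
                s += np.sum(-0.5 * ((np.asarray(x_num) - mu) / sd) ** 2 - np.log(sd))
                return s
            return max(self.classes, key=log_score)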
ERIC Educational Resources Information Center
Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa
2016-01-01
This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…
Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study
ERIC Educational Resources Information Center
Lavonier, Nicole
2014-01-01
The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…
Tu, Chengjian; Li, Jun; Sheng, Quanhu; Zhang, Ming; Qu, Jun
2014-04-04
Although survey-scan-based label-free methods have shown no compelling benefit over fragment-ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used, the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey-scan-based (ion current, IC) and MS2-based abundance features, including spectral count (SpC) and MS2 total ion current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) a study with seven different biological data sets revealed that only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with five replicate analyses of a yeast sample showed that IC provided much higher quantitative precision and less missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R2 > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC, but not SpC, showed a good linear response to various protein loading amounts; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and lower false-positive/false-negative rates than both SpC and MS2-TIC. Therefore, IC achieved overall superior performance compared with the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery.
Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul
2016-12-01
Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to detect the isotopically labeled peptides simultaneously in a single instrument run, and thus guarantee relative quantitation for a large number of peptides without introducing variation caused by separate experiments. However, only a few approaches are available for assessing protein ratios, and none of the existing algorithms pays particular attention to proteins with only one peptide hit. We introduce new quantitative approaches to SILAC protein-level summaries using classification-based methodologies, such as Gaussian mixture models fit with the EM algorithm or a Bayesian approach, as well as K-means clustering. In addition, a new approach is developed using a Gaussian mixture model and a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid premature convergence or getting stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs best among the alternatives in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers in real SILAC experimental data. The developed approach is applicable regardless of how many peptide hits a protein has, rescuing many proteins that would otherwise be removed. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
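A minimal sketch of the classification-based idea, assuming the SILAC data have already been reduced to per-protein log2 heavy/light ratios: a Gaussian mixture model is fit to the ratio distribution and every protein, including those with a single peptide hit, is assigned to a down-regulated, unchanged, or up-regulated component. The PSO-refined variant described above is not reproduced here, and the three-component choice is an illustrative assumption:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def classify_silac_ratios(log2_ratios, n_components=3, seed=0):
        # Assign each protein-level log2(H/L) ratio to a mixture component
        x = np.asarray(log2_ratios).reshape(-1, 1)
        gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(x)
        labels = gmm.predict(x)
        # Reorder components by mean so labels read 0 = down, 1 = unchanged, 2 = up
        order = np.argsort(gmm.means_.ravel())
        remap = {old: new for new, old in enumerate(order)}
        return np.array([remap[l] for l in labels]), gmm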
Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper
2018-03-01
Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, they are more challenging, as they involve almost no sample preparation and are more susceptible to ion suppression/enhancement. Herein, based on our air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilized a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Tissue sections (heart, kidney, and brain) from rats administered CAT were then analyzed. The quantitative MSI results were cross-validated against LC-MS/MS data from the same tissues. The consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in situ environment of the tissue sample, and has the potential to provide a high-throughput, economical, and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.
Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G
2017-12-01
Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation in quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed in the 12-25 kDa mass range, with quantitation data presented. Linearity, bias and other metrics are presented, along with recommendations on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.
Applications of Microfluidics in Quantitative Biology.
Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang
2018-05-01
Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.
Napolitano, Assunta; Akay, Seref; Mari, Angela; Bedir, Erdal; Pizza, Cosimo; Piacente, Sonia
2013-11-01
Astragalus species are widely used as health foods and dietary supplements, as well as drugs in traditional medicine. To rapidly evaluate metabolite similarities and differences among the EtOH extracts of the roots of eight commercial Astragalus spp., an approach based on direct analysis by ESI-MS followed by PCA of the ESI-MS data was carried out. Subsequently, quali-quantitative analyses of cycloartane derivatives in the eight Astragalus spp. by LC-ESI-MS(n) and PCA of the LC-ESI-MS data were performed. This approach made it possible to promptly highlight metabolite similarities and differences among the various Astragalus spp. The PCA results from the LC-ESI-MS data of the Astragalus samples were in reasonable agreement with both the PCA results from the ESI-MS data and the quantitative results. This study affords an analytical method for the quali-quantitative determination of cycloartane derivatives in herbal preparations used as health and food supplements. Copyright © 2013 Elsevier B.V. All rights reserved.
IT: An Effective Pedagogic Tool in the Teaching of Quantitative Methods in Management.
ERIC Educational Resources Information Center
Nadkami, Sanjay M.
1998-01-01
Examines the possibility of supplementing conventional pedagogic methods with information technology-based teaching aids in the instruction of quantitative methods to undergraduate students. Considers the case for a problem-based learning approach, and discusses the role of information technology. (Author/LRW)
Pereira, Rui P A; Peplies, Jörg; Brettar, Ingrid; Höfle, Manfred G
2017-03-31
Next Generation Sequencing (NGS) has revolutionized the analysis of natural and man-made microbial communities by using universal primers for bacteria in a PCR-based approach targeting the 16S rRNA gene. In our study we narrowed primer specificity to a single, monophyletic genus, because for many questions in microbiology only a specific part of the whole microbiome is of interest. We chose the genus Legionella, comprising more than 20 pathogenic species, due to its high relevance for water-based respiratory infections. A new NGS-based approach was designed by sequencing 16S rRNA gene amplicons specific for the genus Legionella using the Illumina MiSeq technology. This approach was validated and applied to a set of representative freshwater samples. Our results revealed that the generated libraries presented a low average raw error rate per base (<0.5%) and substantiated the use of high-fidelity enzymes, such as KAPA HiFi, for increased sequence accuracy and quality. The approach also showed high in situ specificity (>95%) and very good repeatability. Only in samples in which the gammaproteobacterial clade SAR86 was present were more than 1% non-Legionella sequences observed. Next-generation sequencing read counts did not reveal considerable amplification/sequencing biases and showed sensitive as well as precise quantification of L. pneumophila along a dilution range using a spiked-in, certified genome standard. The genome standard and a mock community consisting of six different Legionella species demonstrated that the developed NGS approach was quantitative and specific at the level of individual species, including L. pneumophila. The sensitivity of our genus-specific approach was at least one order of magnitude higher than that of the universal NGS approach. Comparison with quantification by real-time PCR showed consistency with the NGS data. Overall, our NGS approach can determine the quantitative abundances of Legionella species, i.e., the complete Legionella microbiome, without the need for species-specific primers. The developed NGS approach provides a new molecular surveillance tool to monitor all Legionella species in qualitative and quantitative terms if a spiked-in genome standard is used to calibrate the method. Overall, the genus-specific NGS approach opens up a new avenue to massively parallel diagnostics in a quantitative, specific and sensitive way.
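A minimal sketch of the spike-in calibration arithmetic implied here: read counts for each Legionella species are converted to absolute genome equivalents by scaling against the reads recovered for the certified genome standard added at a known copy number. The function and example values are illustrative, not taken from the study:

    def absolute_abundance(species_reads, standard_reads, standard_copies_spiked, sample_volume_ml=1.0):
        # Genome copies per mL for each species, calibrated against the spiked-in standard
        copies_per_read = standard_copies_spiked / standard_reads
        return {sp: reads * copies_per_read / sample_volume_ml
                for sp, reads in species_reads.items()}

    # Example: 1e4 standard copies spiked in, 2000 standard reads recovered
    print(absolute_abundance({"L. pneumophila": 500, "L. anisa": 50},
                             standard_reads=2000, standard_copies_spiked=1e4))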
ERIC Educational Resources Information Center
Haque, Mohammad Mahfujul; Little, David C.; Barman, Benoy K.; Wahab, Md. Abdul
2010-01-01
Purpose: The purpose of the study was to understand the adoption process of ricefield based fish seed production (RBFSP) that has been developed, promoted and established in Northwest Bangladesh. Design/Methodology/Approach: Quantitative investigation based on regression analysis and qualitative investigation using semi-structured interview were…
ERIC Educational Resources Information Center
Lang'at, Edwin K.
2014-01-01
Purpose and Method of Study: The purpose of this study was to investigate teachers' self-perceived readiness to teach school-based HIV/AIDS Awareness and Prevention education in Kenyan primary schools based on their knowledge, attitudes and instructional confidence. This research utilized a non-experimental quantitative approach with a…
ERIC Educational Resources Information Center
Davis, Eric J.; Pauls, Steve; Dick, Jonathan
2017-01-01
Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…
Human systems immunology: hypothesis-based modeling and unbiased data-driven approaches.
Arazi, Arnon; Pendergraft, William F; Ribeiro, Ruy M; Perelson, Alan S; Hacohen, Nir
2013-10-31
Systems immunology is an emerging paradigm that aims at a more systematic and quantitative understanding of the immune system. Two major approaches have been utilized to date in this field: unbiased data-driven modeling to comprehensively identify molecular and cellular components of a system and their interactions; and hypothesis-based quantitative modeling to understand the operating principles of a system by extracting a minimal set of variables and rules underlying them. In this review, we describe applications of the two approaches to the study of viral infections and autoimmune diseases in humans, and discuss possible ways by which these two approaches can synergize when applied to human immunology. Copyright © 2012 Elsevier Ltd. All rights reserved.
Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.
Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman
2016-10-28
Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge here is to be able to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are similarities in the challenges as well as differences that are specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).
Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2013-01-01
The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
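For context, the arithmetic behind a typical qHNMR quantitation of one constituent against an internal calibrant is sketched below; this is the standard relationship between integrals, proton counts, and molar masses rather than a formula taken from the paper, and the variable names are illustrative:

    def qhnmr_mass(int_analyte, int_calibrant, n_h_analyte, n_h_calibrant,
                   mw_analyte, mw_calibrant, mass_calibrant_mg, purity_calibrant=1.0):
        # Mass (mg) of an analyte from relative 1H integrals against an internal calibrant
        return (int_analyte / int_calibrant) * (n_h_calibrant / n_h_analyte) \
               * (mw_analyte / mw_calibrant) * mass_calibrant_mg * purity_calibrant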
Workplace-Based Assessment: Effects of Rater Expertise
ERIC Educational Resources Information Center
Govaerts, M. J. B.; Schuwirth, L. W. T.; Van der Vleuten, C. P. M.; Muijtjens, A. M. M.
2011-01-01
Traditional psychometric approaches towards assessment tend to focus exclusively on quantitative properties of assessment outcomes. This may limit more meaningful educational approaches towards workplace-based assessment (WBA). Cognition-based models of WBA argue that assessment outcomes are determined by cognitive processes by raters which are…
NASA Astrophysics Data System (ADS)
Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung
2015-11-01
Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.
This report summarizes the discussion, conclusions, and points of consensus of the IWGT Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (QWG) based on a meeting in Foz do Iguaçu, Brazil October 31–November 2, 2013. Topics addressed incl...
Tang, L.; Sun, Y.; Buelow, D.; Gu, Z.; Caliendo, A. M.; Pounds, S.
2016-01-01
Given recent advances in the development of quantitative standards, particularly WHO international standards, efforts to better understand the commutability of reference materials have been made. Existing approaches for evaluating commutability include prediction intervals and correspondence analysis; however, the results obtained from existing approaches may be ambiguous. We have developed a “deviation-from-ideal” (DFI) approach to evaluate the commutability of standards and applied it to the assessment of Epstein-Barr virus (EBV) load testing in four quantitative PCR assays, treating digital PCR as a reference assay. We then discuss the advantages and limitations of the DFI approach as well as experimental design to best evaluate the commutability of an assay in practice. PMID:27076654
Reproducibility and quantitation of amplicon sequencing-based detection
Zhou, Jizhong; Wu, Liyou; Deng, Ye; Zhi, Xiaoyang; Jiang, Yi-Huei; Tu, Qichao; Xie, Jianping; Van Nostrand, Joy D; He, Zhili; Yang, Yunfeng
2011-01-01
To determine the reproducibility and quantitation of the amplicon sequencing-based detection approach for analyzing microbial community structure, a total of 24 microbial communities from a long-term global change experimental site were examined. Genomic DNA obtained from each community was used to amplify 16S rRNA genes with two or three barcode tags as technical replicates in the presence of a small quantity (0.1% wt/wt) of genomic DNA from Shewanella oneidensis MR-1 as the control. The technical reproducibility of the amplicon sequencing-based detection approach is quite low, with an average operational taxonomic unit (OTU) overlap of 17.2%±2.3% between two technical replicates, and 8.2%±2.3% among three technical replicates, which is most likely due to problems associated with random sampling processes. Such variations in technical replicates could have substantial effects on estimating β-diversity but less on α-diversity. A high variation was also observed in the control across different samples (for example, 66.7-fold for the forward primer), suggesting that the amplicon sequencing-based detection approach could not be quantitative. In addition, various strategies were examined to improve the comparability of amplicon sequencing data, such as increasing biological replicates, and removing singleton sequences and less-representative OTUs across biological replicates. Finally, as expected, various statistical analyses with preprocessed experimental data revealed clear differences in the composition and structure of microbial communities between warming and non-warming, or between clipping and non-clipping. Taken together, these results suggest that amplicon sequencing-based detection is useful in analyzing microbial community structure even though it is not reproducible and quantitative. However, great caution should be taken in experimental design and data interpretation when the amplicon sequencing-based detection approach is used for quantitative analysis of the β-diversity of microbial communities. PMID:21346791
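As a rough illustration of the reproducibility metric discussed above, the OTU overlap between two technical replicates can be computed from their OTU sets; the shared-over-union form below is one plausible convention and is not necessarily the exact definition used by the authors:

    def otu_overlap(rep_a, rep_b):
        # Percentage of OTUs shared between two technical replicates
        a, b = set(rep_a), set(rep_b)
        return 100.0 * len(a & b) / len(a | b)

    print(otu_overlap({"OTU1", "OTU2", "OTU3"}, {"OTU2", "OTU3", "OTU4"}))  # 50.0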
ERIC Educational Resources Information Center
Lee, Young-Jin
2017-01-01
Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…
Reference condition approach to restoration planning
Nestler, J.M.; Theiling, C.H.; Lubinski, S.J.; Smith, D.L.
2010-01-01
Ecosystem restoration planning requires quantitative rigor to evaluate alternatives, define end states, report progress and perform environmental benefits analysis (EBA). Unfortunately, existing planning frameworks are, at best, semi-quantitative. In this paper, we: (1) describe a quantitative restoration planning approach based on a comprehensive but simple mathematical framework that can be used to effectively apply knowledge and evaluate alternatives, (2) use the approach to derive a simple but precisely defined lexicon based on the reference condition concept and allied terms and (3) illustrate the approach with an example from the Upper Mississippi River System (UMRS) using hydrologic indicators. The approach supports the development of a scalable restoration strategy that, in theory, can be expanded to ecosystem characteristics such as hydraulics, geomorphology, habitat and biodiversity. We identify three reference condition types, best achievable condition (ABAC), measured magnitude (MMi, which can be determined at one or many times and places) and desired future condition (ADFC), that, when used with the mathematical framework, provide a complete system of accounts useful for goal-oriented system-level management and restoration. Published in 2010 by John Wiley & Sons, Ltd.
Rohrbaugh, Michael J
2014-09-01
Social cybernetic (systemic) ideas from the early Family Process era, though emanating from qualitative clinical observation, have underappreciated heuristic potential for guiding quantitative empirical research on problem maintenance and change. The old conceptual wines we have attempted to repackage in new, science-friendly bottles include ironic processes (when "solutions" maintain problems), symptom-system fit (when problems stabilize relationships), and communal coping (when we-ness helps people change). Both self-report and observational quantitative methods have been useful in tracking these phenomena, and together the three constructs inform a team-based family consultation approach to working with difficult health and behavior problems. In addition, a large-scale, quantitatively focused effectiveness trial of family therapy for adolescent drug abuse highlights the importance of treatment fidelity and qualitative approaches to examining it. In this sense, echoing the history of family therapy research, our experience with juxtaposing quantitative and qualitative methods has gone full circle, from qualitative to quantitative observation and back again. © 2014 FPI, Inc.
Econophysical visualization of Adam Smith’s invisible hand
NASA Astrophysics Data System (ADS)
Cohen, Morrel H.; Eliazar, Iddo I.
2013-02-01
Consider a complex system whose macrostate is statistically observable, but whose operating mechanism is an unknown black box. In this paper we address the problem of inferring, from the system’s macrostate statistics, the intrinsic force that yields the observed statistics. The inference is established via two diametrically opposite approaches which result in the very same intrinsic force: a top-down approach based on the notion of entropy, and a bottom-up approach based on the notion of Langevin dynamics. The general results established are applied to the problem of visualizing the intrinsic socioeconomic force, Adam Smith’s invisible hand, that shapes the distribution of wealth in human societies. Our analysis yields quantitative econophysical representations of figurative socioeconomic forces, quantitative definitions of “poor” and “rich”, and a quantitative characterization of the “poor-get-poorer” and “rich-get-richer” phenomena.
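A small numerical sketch of the bottom-up (Langevin) direction of this inference: for an overdamped Langevin process with constant noise amplitude D, the stationary density p(x) satisfies F(x) = D d/dx ln p(x), so an estimate of the intrinsic force can be read off a smoothed empirical density of the observed macrostate (e.g., wealth) data. The kernel-density implementation below is an illustrative assumption, not the authors' procedure:

    import numpy as np
    from scipy.stats import gaussian_kde

    def intrinsic_force(samples, grid, D=1.0):
        # Estimate F(x) = D * d/dx ln p(x) from observed samples of the macrostate
        log_p = np.log(gaussian_kde(samples)(grid) + 1e-300)
        return D * np.gradient(log_p, grid)

    # Sanity check: for exponentially distributed data, ln p(x) = -x, so F(x) should be near -1
    x = np.random.default_rng(0).exponential(size=10_000)
    grid = np.linspace(0.5, 3.0, 50)
    print(intrinsic_force(x, grid).mean())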
2016-01-01
Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978
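A minimal sketch of one such multivariate regression procedure, assuming each sample's DIMS fingerprint is a vector of intensities over a common m/z grid; partial least squares is used here purely as a representative choice, and the component count and cross-validation settings are illustrative:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    def calibrate_dims(fingerprints, concentrations, n_components=5):
        # Fit and cross-validate a PLS model mapping MS fingerprints to analyte levels
        X, y = np.asarray(fingerprints), np.asarray(concentrations)
        pls = PLSRegression(n_components=n_components)
        y_cv = cross_val_predict(pls, X, y, cv=5)
        rmsecv = np.sqrt(np.mean((y - y_cv.ravel()) ** 2))
        return pls.fit(X, y), rmsecv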
Hu, Zhe-Yi; Parker, Robert B.; Herring, Vanessa L.; Laizure, S. Casey
2012-01-01
Dabigatran etexilate (DABE) is an oral prodrug that is rapidly converted by esterases to dabigatran (DAB), a direct inhibitor of thrombin. To elucidate the esterase-mediated metabolic pathway of DABE, a high-performance liquid chromatography-tandem mass spectrometry (LC-MS/MS)-based metabolite identification and semi-quantitative estimation approach was developed. To overcome the poor full-scan sensitivity of conventional triple quadrupole mass spectrometry, precursor-product ion pairs were predicted in order to search for the potential in vitro metabolites. The detected metabolites were confirmed by product ion scans. A dilution method was introduced to evaluate the matrix effects of tentatively identified metabolites without chemical standards. Quantitative information on detected metabolites was obtained using ‘metabolite standards’ generated from incubation samples that contained a high concentration of metabolite, in combination with a correction factor for mass spectrometry response. Two in vitro metabolites of DABE (M1 and M2) were identified and quantified by the semi-quantitative estimation approach. It is noteworthy that CES1 converts DABE to M1 while CES2 mediates the conversion of DABE to M2. M1 (or M2) was further metabolized to DAB by CES2 (or CES1). The approach presented here provides a solution to a bioanalytical need for fast identification and semi-quantitative estimation of CES metabolites in preclinical samples. PMID:23239178
Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part II.
Tavakol, Mohsen; Sandars, John
2014-10-01
Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.
Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part I.
Tavakol, Mohsen; Sandars, John
2014-09-01
Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.
ERIC Educational Resources Information Center
Kazeni, Monde; Onwu, Gilbert
2013-01-01
The study aimed to determine the comparative effectiveness of context-based and traditional teaching approaches in enhancing student achievement in genetics, problem-solving, science inquiry and decision-making skills, and attitude towards the study of life sciences. A mixed method but essentially quantitative research approach involving a…
Distance-based microfluidic quantitative detection methods for point-of-care testing.
Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James
2016-04-07
Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.
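A toy sketch of how a distance-based readout is typically converted into a quantitative result: signal lengths measured for calibration standards are fit against concentration (often on a log scale), and an unknown is read back from the inverted fit. The linear-in-log form and the numbers are illustrative assumptions, not a description of any specific device in the review:

    import numpy as np

    def fit_distance_calibration(concentrations, distances_mm):
        # Least-squares fit of signal length versus log10(concentration)
        slope, intercept = np.polyfit(np.log10(concentrations), distances_mm, 1)
        return slope, intercept

    def concentration_from_distance(distance_mm, slope, intercept):
        return 10 ** ((distance_mm - intercept) / slope)

    slope, intercept = fit_distance_calibration([1, 10, 100, 1000], [5.0, 12.0, 19.5, 26.5])
    print(concentration_from_distance(16.0, slope, intercept))  # concentration of an unknown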
Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Herrick, Richard C; Sanna, Pietro; Gutstein, Howard
2011-01-01
Image data are increasingly encountered and are of growing importance in many areas of science. Much of these data are quantitative image data, which are characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain and the goal of the research is to combine the quantitative information across images to make inference about populations or interventions. In this paper, we present a unified analysis framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. In particular, we find that the significant regions of the image identified by the proposed method frequently correspond to subregions of visible spots that may represent post-translational modifications or co-migrating proteins that cannot be visually resolved from adjacent, more abundant proteins on the gel image. Thus, it is possible that this image-based approach may actually improve the realized resolution of the gel, revealing differentially expressed proteins that would not have even been detected as spots by modern spot-based analyses.
Students' Misconceptions about the Ozone Layer and the Effect of Internet-Based Media on It
ERIC Educational Resources Information Center
Gungordu, Nahide; Yalcin-Celik, Ayse; Kilic, Ziya
2017-01-01
In this study, students' misconceptions about the ozone layer were investigated, looking specifically at the effect internet-based media has on the formation of these misconceptions. Quantitative and qualitative research approaches were used to perform the research. As part of the quantitative portion of the research, the descriptive survey…
Analytical approaches to determination of total choline in foods and dietary supplements.
Phillips, Melissa M
2012-06-01
Choline is a quaternary amine that is synthesized in the body or consumed through the diet. Choline is critical for cell membrane structure and function and for synthesis of the neurotransmitter acetylcholine. Although the human body produces this micronutrient, dietary supplementation of choline is necessary for good health. The major challenge in the analysis of choline in foods and dietary supplements lies in the extraction and/or hydrolysis approach. In many products, choline is present as choline esters, which can be quantitated individually or treated with acid, base, or enzymes to release choline ions for analysis. A critical review of approaches based on extraction and quantitation of each choline ester, as well as hydrolysis-based methods for determination of total choline in foods and dietary supplements, is presented.
Rink, Cameron L; Wernke, Matthew M; Powell, Heather M; Tornero, Mark; Gnyawali, Surya C; Schroeder, Ryan M; Kim, Jayne Y; Denune, Jeffrey A; Albury, Alexander W; Gordillo, Gayle M; Colvin, James M; Sen, Chandan K
2017-07-01
Objective: (1) Develop a standardized approach to quantitatively measure residual limb skin health. (2) Report reference residual limb skin health values in people with transtibial and transfemoral amputation. Approach: Residual limb health outcomes in individuals with transtibial (n = 5) and transfemoral (n = 5) amputation were compared to able-limb controls (n = 4) using noninvasive imaging (hyperspectral imaging and laser speckle flowmetry) and probe-based approaches (laser doppler flowmetry, transcutaneous oxygen, transepidermal water loss, surface electrical capacitance). Results: A standardized methodology that employs noninvasive imaging and probe-based approaches to measure residual limb skin health is described. Compared to able-limb controls, individuals with transtibial and transfemoral amputation have significantly lower transcutaneous oxygen tension, higher transepidermal water loss, and higher surface electrical capacitance in the residual limb. Innovation: Residual limb health, as a critical component of prosthesis rehabilitation for individuals with lower limb amputation, is understudied in part due to a lack of clinical measures. Here, we present a standardized approach to measure residual limb health in people with transtibial and transfemoral amputation. Conclusion: Technology advances in noninvasive imaging and probe-based measures are leveraged to develop a standardized approach to quantitatively measure residual limb health in individuals with lower limb loss. Compared to able-limb controls, resting residual limb physiology in people who have had transfemoral or transtibial amputation is characterized by lower transcutaneous oxygen tension and poorer skin barrier function.
Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...
Lavallée-Adam, Mathieu
2017-01-01
PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. PMID:27010334
MilQuant: a free, generic software tool for isobaric tagging-based quantitation.
Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo
2012-09-18
Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and especially useful for samples that demand in vitro labeling. Due to diversity in choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator, a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging-based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.
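To make the two core steps of isobaric-tag quantitation concrete, the following minimal Python sketch shows channel-wise normalization of reporter-ion intensities followed by protein-level ratio deduction. It is a generic illustration under stated assumptions (median normalization, median-of-PSM ratios, synthetic intensities), not MilQuant's actual algorithm or file format.

```python
import numpy as np

def normalize_channels(reporter):
    """reporter: (n_psms x n_channels) reporter-ion intensity matrix."""
    col_medians = np.median(reporter, axis=0)
    factors = col_medians / col_medians.mean()      # equalize channel loadings
    return reporter / factors

def protein_ratios(reporter, psm_protein, reference_channel=0):
    """Median of PSM-level ratios (channel / reference channel) per protein."""
    ratios = reporter / reporter[:, [reference_channel]]
    out = {}
    for protein in set(psm_protein):
        idx = [i for i, p in enumerate(psm_protein) if p == protein]
        out[protein] = np.median(ratios[idx], axis=0)
    return out

rng = np.random.default_rng(0)
reporter = rng.lognormal(mean=10, sigma=0.4, size=(6, 4))   # 6 PSMs, 4-plex labeling
psm_protein = ["P1", "P1", "P1", "P2", "P2", "P2"]          # hypothetical assignments
print(protein_ratios(normalize_channels(reporter), psm_protein))
```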
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa
2016-06-22
Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol-dosed mouse brain (10-μm-thick), kidney (10-μm-thick) and liver (8-, 10-, 16- and 24-μm-thick) were obtained. Absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: Extraction efficiency of propranolol using 10-μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45-63%. Extraction efficiency decreased from ~65% to ~36% with liver thickness increasing from 8 μm to 24 μm. Randomly selecting half of the samples as standards, precision and accuracy of propranolol concentrations obtained for the other half of samples as quality control metrics were determined. Resulting precision (±15%) and accuracy (±3%) values, respectively, were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method, and that it depended on the organ and tissue thickness. However, once extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations. Furthermore, this means that once the extraction efficiency was calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor-intensive and high-throughput means to acquire spatially resolved quantitative analysis of multiple samples of the same type.
ERIC Educational Resources Information Center
Arendasy, Martin; Sommer, Markus
2007-01-01
This article deals with the investigation of the psychometric quality and construct validity of algebra word problems generated by means of a schema-based version of the automatic min-max approach. Based on a review of the research literature in algebra word problem solving and automatic item generation, this new approach is introduced as a…
NASA Astrophysics Data System (ADS)
Li, Xuesong; Northrop, William F.
2016-04-01
This paper describes a quantitative approach to approximating multiple scattering through an isotropic turbid slab based on Markov chain theory. There is an increasing need to utilize multiple scattering for optical diagnostic purposes; however, existing methods are either inaccurate or computationally expensive. Here, we develop a novel Markov chain approximation approach to solve for the multiple scattering angular distribution (AD) that can accurately calculate the AD while significantly reducing computational cost compared to Monte Carlo simulation. We expect this work to stimulate ongoing multiple scattering research and deterministic reconstruction algorithm development with AD measurements.
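A minimal sketch of the Markov-chain idea follows: angular states form a discrete chain whose transition matrix is built from a single-scattering phase function, and the total AD is a scattering-order-weighted sum of chain states. The Henyey-Greenstein phase function, the angular discretization, and the Poisson weighting over scattering order are illustrative assumptions, not the authors' exact formulation.

```python
import math
import numpy as np

def henyey_greenstein(cos_theta, g=0.9):
    return (1 - g**2) / (4 * np.pi * (1 + g**2 - 2 * g * cos_theta) ** 1.5)

n_states = 181                                  # polar-angle bins over 0..180 degrees
theta = np.linspace(0.0, np.pi, n_states)

# Transition matrix: probability of moving from angular state i to state j,
# governed by the phase function of the deflection angle; rows are normalized.
P = henyey_greenstein(np.cos(theta[None, :] - theta[:, None]))
P /= P.sum(axis=1, keepdims=True)

def multiple_scattering_ad(optical_depth=2.0, max_order=30):
    """Total AD as a scattering-order-weighted sum of Markov-chain states."""
    state = np.zeros(n_states)
    state[0] = 1.0                               # collimated incidence (theta = 0)
    weights = [math.exp(-optical_depth) * optical_depth**n / math.factorial(n)
               for n in range(max_order + 1)]
    ad = weights[0] * state
    for n in range(1, max_order + 1):
        state = state @ P                        # one more scattering event
        ad += weights[n] * state
    return ad / ad.sum()

print(multiple_scattering_ad()[:5])
```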
A Dimensionally Reduced Clustering Methodology for Heterogeneous Occupational Medicine Data Mining.
Saâdaoui, Foued; Bertrand, Pierre R; Boudet, Gil; Rouffiac, Karine; Dutheil, Frédéric; Chamoux, Alain
2015-10-01
Clustering is a set of statistical learning techniques aimed at finding the structure of heterogeneous data by partitioning it into homogeneous groups called clusters. Clustering has been successfully applied in several fields, such as medicine, biology, finance, and economics. In this paper, we introduce the notion of clustering in multifactorial data analysis problems. A case study is conducted for an occupational medicine problem with the purpose of analyzing patterns in a population of 813 individuals. To reduce the dimensionality of the data set, we base our approach on Principal Component Analysis (PCA), which is the statistical tool most commonly used in factorial analysis. However, problems in nature, especially in medicine, are often based on heterogeneous qualitative-quantitative measurements, whereas PCA only processes quantitative ones. Besides, qualitative data are originally unobservable quantitative responses that are usually binary-coded. Hence, we propose a new set of strategies allowing quantitative and qualitative data to be handled simultaneously. The principle of this approach is to project the qualitative variables onto the subspaces spanned by the quantitative ones. Subsequently, an optimal model is allocated to the resulting PCA-regressed subspaces.
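The following short Python sketch illustrates the mixed-data idea described above: PCA on the quantitative block, projection (regression) of the binary-coded qualitative variables onto the principal-component subspace, and clustering on the combined representation. Variable names, dimensions, and the k-means step are illustrative assumptions, not taken from the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X_quant = rng.normal(size=(813, 6))            # e.g., age, BMI, blood pressure, ...
X_qual = rng.integers(0, 2, size=(813, 3))     # binary-coded qualitative variables

pca = PCA(n_components=3).fit(X_quant)
scores = pca.transform(X_quant)                # coordinates in the PCA subspace

# Project each qualitative variable onto the PCA-regressed subspace.
proj = LinearRegression().fit(scores, X_qual)
X_qual_hat = proj.predict(scores)              # qualitative block expressed in the subspace

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    np.hstack([scores, X_qual_hat]))
print(np.bincount(labels))                     # cluster sizes
```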
USDA-ARS's Scientific Manuscript database
Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...
[Reconstituting evaluation methods based on both qualitative and quantitative paradigms].
Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro
2011-01-01
Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to regard evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.
Deep machine learning provides state-of-the-art performance in image-based plant phenotyping.
Pound, Michael P; Atkinson, Jonathan A; Townsend, Alexandra J; Wilson, Michael H; Griffiths, Marcus; Jackson, Aaron S; Bulat, Adrian; Tzimiropoulos, Georgios; Wells, Darren M; Murchie, Erik H; Pridmore, Tony P; French, Andrew P
2017-10-01
In plant phenotyping, it has become important to be able to measure many features on large image sets in order to aid genetic discovery. The size of the datasets, which are now often captured robotically, precludes manual inspection, hence the motivation for finding a fully automated approach. Deep learning is an emerging field that promises unparalleled results on many data analysis problems. Building on artificial neural networks, deep approaches have many more hidden layers in the network, and hence have greater discriminative and predictive power. We demonstrate the use of such approaches as part of a plant phenotyping pipeline. We show the success offered by such techniques when applied to the challenging problem of image-based plant phenotyping and demonstrate state-of-the-art results (>97% accuracy) for root and shoot feature identification and localization. We use fully automated trait identification using deep learning to identify quantitative trait loci in root architecture datasets. The majority (12 out of 14) of manually identified quantitative trait loci were also discovered using our automated approach based on deep learning detection to locate plant features. We have shown deep learning-based phenotyping to have very good detection and localization accuracy in validation and testing image sets. We have shown that such features can be used to derive meaningful biological traits, which in turn can be used in quantitative trait loci discovery pipelines. This process can be completely automated. We predict a paradigm shift in image-based phenotyping brought about by such deep learning approaches, given sufficient training sets. © The Authors 2017. Published by Oxford University Press.
ERIC Educational Resources Information Center
Davids, Mogamat Razeen; Chikte, Usuf M. E.; Halperin, Mitchell L.
2011-01-01
This article reports on the development and evaluation of a Web-based application that provides instruction and hands-on practice in managing electrolyte and acid-base disorders. Our teaching approach, which focuses on concepts rather than details, encourages quantitative analysis and a logical problem-solving approach. Identifying any dangers to…
Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.
Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L
2017-10-01
The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies however primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: Lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome (p<0.005) and correlated with independently identified visual EEG patterns such as generalized periodic discharges (p<0.02). Receiver operating characteristic (ROC) analysis confirmed the predictive value of lower state space velocity for poor clinical outcome after cardiac arrest (AUC 80.8, 70% sensitivity, 15% false positive rate). Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
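To illustrate what a "state space velocity" style metric of spectral EEG variability can look like, the following Python sketch computes log band powers per epoch (a point in state space) and the mean distance between consecutive points. Band definitions, epoch length, and the velocity definition are illustrative assumptions, not the published state space model.

```python
import numpy as np
from scipy.signal import welch

def state_space_velocity(eeg, fs=250, epoch_s=10,
                         bands=((1, 4), (4, 8), (8, 12), (12, 25))):
    """Mean step length between consecutive epochs in log band-power space."""
    n = int(epoch_s * fs)
    epochs = [eeg[i:i + n] for i in range(0, len(eeg) - n + 1, n)]
    points = []
    for ep in epochs:
        f, pxx = welch(ep, fs=fs, nperseg=min(n, 1024))
        powers = [np.trapz(pxx[(f >= lo) & (f < hi)], f[(f >= lo) & (f < hi)])
                  for lo, hi in bands]
        points.append(np.log10(np.asarray(powers) + 1e-12))
    points = np.asarray(points)
    steps = np.linalg.norm(np.diff(points, axis=0), axis=1)   # movement per epoch
    return steps.mean()

rng = np.random.default_rng(1)
print(state_space_velocity(rng.normal(size=250 * 600)))       # 10 min of surrogate EEG
```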
Automated quantitative cytological analysis using portable microfluidic microscopy.
Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva
2016-06-01
In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Quantitative methods to direct exploration based on hydrogeologic information
Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.
2006-01-01
Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
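A small Python sketch of the FOSM propagation and the variance-contribution sampling rule follows. It propagates an input-parameter covariance through a sensitivity matrix to obtain the output covariance, then ranks parameters by their contribution to total output variance. The toy sensitivities and covariance stand in for MODFLOW-derived quantities; the selection rule is a simplified stand-in for the best-performing QDE approach described above.

```python
import numpy as np

def fosm_output_covariance(J, cov_in):
    """J: (n_outputs x n_params) sensitivity matrix; cov_in: input covariance."""
    return J @ cov_in @ J.T

rng = np.random.default_rng(0)
n_heads, n_params = 8, 5
J = rng.normal(size=(n_heads, n_params))             # d(head)/d(parameter)
cov_in = np.diag(rng.uniform(0.1, 1.0, n_params))    # parameter uncertainty

cov_out = fosm_output_covariance(J, cov_in)
head_variance = np.diag(cov_out)

# Simple QDE-style rule: sample next where the parameter contributes most
# to the overall head variance (assuming uncorrelated inputs here).
contrib = (J ** 2).sum(axis=0) * np.diag(cov_in)
next_sample = int(np.argmax(contrib))
print("total head variance:", head_variance.sum(), "sample parameter index:", next_sample)
```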
Jin, Cheng; Feng, Jianjiang; Wang, Lei; Yu, Heng; Liu, Jiang; Lu, Jiwen; Zhou, Jie
2018-05-01
In this paper, we present an approach for left atrial appendage (LAA) multi-phase fast segmentation and quantitative assisted diagnosis of atrial fibrillation (AF) based on 4D-CT data. We take full advantage of the temporal dimension information to segment the living, flailed LAA based on a parametric max-flow method and graph-cut approach to build a 3-D model of each phase. To assist the diagnosis of AF, we calculate the volumes of the 3-D models, and then generate a "volume-phase" curve to calculate the important dynamic metrics: ejection fraction, filling flux, and emptying flux of the LAA's blood by volume. This approach demonstrates more precise results than the conventional approaches that calculate metrics by area, and allows for the quick analysis of LAA-volume pattern changes in a cardiac cycle. It may also provide insight into the individual differences in the lesions of the LAA. Furthermore, we apply support vector machines (SVMs) to achieve a quantitative auto-diagnosis of AF by exploiting seven features from volume change ratios of the LAA, and perform multivariate logistic regression analysis for the risk of LAA thrombosis. The 100 cases utilized in this research were taken from the Philips 256-iCT. The experimental results demonstrate that our approach can construct the 3-D LAA geometries robustly compared to manual annotations, and reasonably infer that the LAA undergoes filling, emptying and re-filling, re-emptying in a cardiac cycle. This research provides a potential for exploring various physiological functions of the LAA and quantitatively estimating the risk of stroke in patients with AF. Copyright © 2018 Elsevier Ltd. All rights reserved.
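As a concrete illustration of the volume-based dynamic metrics, the following Python sketch derives ejection fraction, filling flux, and emptying flux from a "volume-phase" curve. The formulas follow the usual volume-based conventions and are illustrative; the phase volumes below are made-up values, not patient data.

```python
import numpy as np

def laa_metrics(volumes_ml):
    """Dynamic metrics from LAA volumes sampled over one cardiac cycle."""
    v = np.asarray(volumes_ml, dtype=float)
    v_max, v_min = v.max(), v.min()
    ejection_fraction = (v_max - v_min) / v_max          # fraction of blood expelled
    dv = np.diff(v)
    filling_flux = dv[dv > 0].sum()                      # total volume gained (mL/cycle)
    emptying_flux = -dv[dv < 0].sum()                    # total volume lost (mL/cycle)
    return ejection_fraction, filling_flux, emptying_flux

phases = [7.1, 8.4, 9.6, 9.9, 8.8, 6.5, 5.2, 5.9, 6.8, 7.0]   # mL at 10 phases of 4D-CT
ef, fill, empty = laa_metrics(phases)
print(f"EF={ef:.2f}, filling={fill:.1f} mL, emptying={empty:.1f} mL")
```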
Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.
2014-01-01
We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195
NASA Astrophysics Data System (ADS)
Wörz, Stefan; Hoegen, Philipp; Liao, Wei; Müller-Eschner, Matthias; Kauczor, Hans-Ulrich; von Tengg-Kobligk, Hendrik; Rohr, Karl
2016-03-01
We introduce a framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms. Phantoms are designed using a CAD system and created with a 3D printer, and comprise realistic shapes including branches and pathologies such as abdominal aortic aneurysms (AAA). To transfer ground truth information to the 3D image coordinate system, we use a landmark-based registration scheme utilizing fiducial markers integrated in the phantom design. For accurate 3D localization of the markers we developed a novel 3D parametric intensity model that is directly fitted to the markers in the images. We also performed a quantitative evaluation of different vessel segmentation approaches for a phantom of an AAA.
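A generic sketch of landmark-based registration of fiducial markers follows: the Kabsch algorithm estimates the rigid transform (rotation R, translation t) mapping marker positions from the phantom/CAD frame to the image frame. This is a standard formulation under stated assumptions, not the paper's exact scheme.

```python
import numpy as np

def rigid_register(src, dst):
    """Kabsch: rigid transform mapping src (N,3) landmarks onto dst (N,3)."""
    src_c, dst_c = src - src.mean(axis=0), dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic check: recover a known rotation/translation from 6 fiducials.
rng = np.random.default_rng(3)
src = rng.uniform(0, 50, size=(6, 3))
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([5.0, -2.0, 10.0])
R, t = rigid_register(src, dst)
fre = np.linalg.norm(src @ R.T + t - dst, axis=1).mean()   # fiducial registration error
print("FRE:", fre)
```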
Decoding brain cancer dynamics: a quantitative histogram-based approach using temporal MRI
NASA Astrophysics Data System (ADS)
Zhou, Mu; Hall, Lawrence O.; Goldgof, Dmitry B.; Russo, Robin; Gillies, Robert J.; Gatenby, Robert A.
2015-03-01
Brain tumor heterogeneity remains a challenge for probing brain cancer evolutionary dynamics. In light of evolution, it is a priority to inspect the cancer system from a time-domain perspective since it explicitly tracks the dynamics of cancer variations. In this paper, we study the problem of exploring brain tumor heterogeneity from temporal clinical magnetic resonance imaging (MRI) data. Our goal is to discover evidence-based knowledge from such temporal imaging data, where multiple clinical MRI scans from Glioblastoma multiforme (GBM) patients are generated during therapy. In particular, we propose a quantitative histogram-based approach that builds a prediction model to measure the difference in histograms obtained from pre- and post-treatment. The study could significantly assist radiologists by providing a metric to identify distinctive patterns within each tumor, which is crucial for the goal of providing patient-specific treatments. We examine the proposed approach for a practical application - clinical survival group prediction. Experimental results show that our approach achieved 90.91% accuracy.
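The following Python sketch conveys the general idea of a histogram-based temporal feature: per-bin differences between pre- and post-treatment intensity histograms of a tumor region feed a classifier predicting survival group. The bin count, the difference feature, the SVM choice, and the synthetic cohort are illustrative assumptions only, not the authors' pipeline.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def histogram_difference(pre_voxels, post_voxels, bins=32, value_range=(0, 1)):
    """Change in the normalized intensity distribution between two time points."""
    h_pre, _ = np.histogram(pre_voxels, bins=bins, range=value_range, density=True)
    h_post, _ = np.histogram(post_voxels, bins=bins, range=value_range, density=True)
    return h_post - h_pre

rng = np.random.default_rng(7)
X, y = [], []
for patient in range(40):                      # synthetic cohort
    shift = 0.15 if patient % 2 else 0.0       # "responders" shift intensities
    pre = rng.beta(2, 5, size=5000)
    post = np.clip(rng.beta(2, 5, size=5000) + shift, 0, 1)
    X.append(histogram_difference(pre, post))
    y.append(patient % 2)

print(cross_val_score(SVC(kernel="linear"), np.array(X), np.array(y), cv=5).mean())
```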
MR Imaging-based Semi-quantitative Methods for Knee Osteoarthritis
JARRAYA, Mohamed; HAYASHI, Daichi; ROEMER, Frank Wolfgang; GUERMAZI, Ali
2016-01-01
Magnetic resonance imaging (MRI)-based semi-quantitative (SQ) methods applied to knee osteoarthritis (OA) have been introduced during the last decade and have fundamentally changed our understanding of knee OA pathology since then. Several epidemiological studies and clinical trials have used MRI-based SQ methods to evaluate different outcome measures. Interest in MRI-based SQ scoring systems has led to continuous updating and refinement. This article reviews the different SQ approaches for MRI-based whole-organ assessment of knee OA and also discusses practical aspects of whole-joint assessment. PMID:26632537
Santos, Radleigh G; Appel, Jon R; Giulianotti, Marc A; Edwards, Bruce S; Sklar, Larry A; Houghten, Richard A; Pinilla, Clemencia
2013-05-30
In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays.
Shaw, Calvin B; Prakash, Jaya; Pramanik, Manojit; Yalavarthy, Phaneendra K
2013-08-01
A computationally efficient approach that computes the optimal regularization parameter for the Tikhonov-minimization scheme is developed for photoacoustic imaging. This approach is based on the least squares QR (LSQR) decomposition, which is a well-known dimensionality reduction technique for large systems of equations. It is shown that the proposed framework is effective in terms of quantitative and qualitative reconstructions of the initial pressure distribution enabled via finding an optimal regularization parameter. The computational efficiency and performance of the proposed method are shown using a test case of a numerical blood vessel phantom, where the initial pressure is exactly known for quantitative comparison.
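A minimal Python sketch of LSQR-based Tikhonov (damped least squares) reconstruction with an automatically chosen regularization parameter follows. The selection rule used here (minimizing the product of residual and solution norms over candidate lambdas) is a simple L-curve-style stand-in heuristic, not the specific criterion developed in the paper, and the toy forward model is synthetic.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 100))                # toy forward (system) matrix
x_true = np.zeros(100); x_true[40:60] = 1.0    # toy initial pressure distribution
b = A @ x_true + 0.05 * rng.normal(size=200)   # noisy measurements

lambdas = np.logspace(-3, 1, 20)
scores, solutions = [], []
for lam in lambdas:
    sol = lsqr(A, b, damp=lam)[0]              # min ||Ax - b||^2 + lam^2 ||x||^2
    res_norm = np.linalg.norm(A @ sol - b)
    sol_norm = np.linalg.norm(sol)
    scores.append(res_norm * sol_norm)         # L-curve-style trade-off score
    solutions.append(sol)

best = int(np.argmin(scores))
print("chosen lambda:", lambdas[best],
      "reconstruction error:", np.linalg.norm(solutions[best] - x_true))
```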
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Lindgren, Eric A.
2018-04-01
This paper expands on the objective and motivation for NDE-based characterization and includes a discussion of the current approach using model-assisted inversion being pursued within the Air Force Research Laboratory (AFRL). This includes a discussion of the multiple model-based methods that can be used, including physics-based models, deep machine learning, and heuristic approaches. The benefits and drawbacks of each method are reviewed and the potential to integrate multiple methods is discussed. Initial successes are included to highlight the ability to obtain quantitative values of damage. Additional steps remaining to realize this capability with statistical metrics of accuracy are discussed, and how these results can be used to enable probabilistic life management is addressed. The outcome of this initiative will realize the long-term desired capability of NDE methods to provide quantitative characterization to accelerate certification of new materials and enhance life management of engineered systems.
Han, Xianlin; Yang, Kui; Gross, Richard W.
2011-01-01
Since our last comprehensive review on multi-dimensional mass spectrometry-based shotgun lipidomics (Mass Spectrom. Rev. 24 (2005), 367), many new developments in the field of lipidomics have occurred. These developments include new strategies and refinements for shotgun lipidomic approaches that use direct infusion, including novel fragmentation strategies, identification of multiple new informative dimensions for mass spectrometric interrogation, and the development of new bioinformatic approaches for enhanced identification and quantitation of the individual molecular constituents that comprise each cell’s lipidome. Concurrently, advances in liquid chromatography-based platforms and novel strategies for quantitative matrix-assisted laser desorption/ionization mass spectrometry for lipidomic analyses have been developed. Through the synergistic use of this repertoire of new mass spectrometric approaches, the power and scope of lipidomics has been greatly expanded to accelerate progress toward the comprehensive understanding of the pleiotropic roles of lipids in biological systems. PMID:21755525
USDA-ARS's Scientific Manuscript database
The Homo sapiens and Arabidopsis thaliana genomes are believed to encode >500 and >1,000 protein kinases, respectively. Despite this abundance, few bona fide kinase-client relationships have been described in detail. Mass spectrometry (MS)-based approaches have been integral to the large-scale mapp...
DRIFTSEL: an R package for detecting signals of natural selection in quantitative traits.
Karhunen, M; Merilä, J; Leinonen, T; Cano, J M; Ovaskainen, O
2013-07-01
Approaches and tools to differentiate between natural selection and genetic drift as causes of population differentiation are in frequent demand in evolutionary biology. Based on the approach of Ovaskainen et al. (2011), we have developed an R package (DRIFTSEL) that can be used to differentiate between stabilizing selection, diversifying selection and random genetic drift as causes of population differentiation in quantitative traits when neutral marker and quantitative genetic data are available. Apart from illustrating the use of this method and the interpretation of results using simulated data, we apply the package to data from three-spined sticklebacks (Gasterosteus aculeatus) to highlight its virtues. DRIFTSEL can also be used to perform standard quantitative genetic analyses in common-garden study designs. © 2013 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whipple, C
Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.
Behavioral Changes Based on a Course in Agroecology: A Mixed Methods Study
ERIC Educational Resources Information Center
Harms, Kristyn; King, James; Francis, Charles
2009-01-01
This study evaluated and described student perceptions of a course in agroecology to determine if participants experienced changed perceptions and behaviors resulting from the Agroecosystems Analysis course. A triangulation validating quantitative data mixed methods approach included a written survey comprised of both quantitative and open-ended…
New Statistical Techniques for Evaluating Longitudinal Models.
ERIC Educational Resources Information Center
Murray, James R.; Wiley, David E.
A basic methodological approach in developmental studies is the collection of longitudinal data. Behavioral data can take at least two forms, qualitative (or discrete) and quantitative. Both types are fallible. Measurement errors can occur in quantitative data, and measures of these are based on error variance. Qualitative or discrete data can…
Virtualising the Quantitative Research Methods Course: An Island-Based Approach
ERIC Educational Resources Information Center
Baglin, James; Reece, John; Baker, Jenalle
2015-01-01
Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…
The Vinyl Acetate Content of Packaging Film: A Quantitative Infrared Experiment.
ERIC Educational Resources Information Center
Allpress, K. N.; And Others
1981-01-01
Presents an experiment used in laboratory technician training courses to illustrate the quantitative use of infrared spectroscopy which is based on industrial and laboratory procedures for the determination of vinyl acetate levels in ethylene vinyl acetate packaging films. Includes three approaches to allow for varying path lengths (film…
Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru
2009-04-27
Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. have proved the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be an indispensable part of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. The key motivation of this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the use of a model checking approach. A novel method of modeling and simulating biological systems with the use of a model checking approach is proposed, based on hybrid functional Petri net with extension (HFPNe) as a framework that deals with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components by using HFPNe. Secondly, we employ two major biological fate determination rules - Rule I and Rule II - in the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully done by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. However, hybrid lineages are hard to interpret with a discrete model, because a hybrid lineage occurs when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986. Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II owing to its higher coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants). Further insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and a better understanding of biological systems and observation data that may be hard to capture with the qualitative approach.
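The following toy Python sketch conveys the flavor of quantitative, simulation-based model checking: many stochastic runs of a threshold model are executed, and the coverage of observed outcome patterns (the fraction of in vivo patterns reproduced in silico) is estimated. The toy dynamics, thresholds, and patterns are placeholders standing in for the HFPNe model and the experimental fate patterns.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_fate(signal_level):
    """Toy rule: fate is decided by a noisy signal crossing two thresholds."""
    s = signal_level + rng.normal(0, 0.15)
    if s > 0.66:
        return "primary"
    if s > 0.33:
        return "secondary"
    return "tertiary"

def coverage(signal_level, observed_patterns, n_runs=10_000):
    """Fraction of observed fate patterns that also appear across simulations."""
    simulated = {simulate_fate(signal_level) for _ in range(n_runs)}
    return len(simulated & observed_patterns) / len(observed_patterns)

observed = {"primary", "secondary"}            # patterns reported for a hypothetical genotype
print("pattern coverage:", coverage(0.6, observed))
```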
Zhang, Kai; Tang, Chaohua; Liang, Xiaowei; Zhao, Qingyu; Zhang, Junmin
2018-01-10
Salbutamol, a selective β2-agonist, endangers the safety of animal products as a result of illegal use in food animals. In this study, an iTRAQ-based untargeted quantitative proteomic approach was applied to screen potential protein biomarkers in plasma of cattle before and after treatment with salbutamol for 21 days. A total of 62 plasma proteins were significantly affected by salbutamol treatment, which can be used as potential biomarkers to screen for the illegal use of salbutamol in beef cattle. Enzyme-linked immunosorbent assay measurements of five selected proteins demonstrated the reliability of iTRAQ-based proteomics in screening of candidate biomarkers among the plasma proteins. The plasma samples collected before and after salbutamol treatment were well-separated by principal component analysis (PCA) using the differentially expressed proteins. These results suggested that an iTRAQ-based untargeted quantitative proteomic strategy combined with PCA pattern recognition methods can discriminate differences in plasma protein profiles collected before and after salbutamol treatment.
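The PCA pattern-recognition step can be illustrated with a short Python sketch: log-transformed abundances of the differentially expressed proteins are projected onto two principal components, and pre- and post-treatment samples separate along them. Sample counts and the synthetic abundance matrix are illustrative assumptions; real input would be the iTRAQ abundances of the 62 candidate proteins.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
n_animals, n_proteins = 12, 62
pre = rng.lognormal(mean=0.0, sigma=0.3, size=(n_animals, n_proteins))
post = pre * rng.lognormal(mean=0.4, sigma=0.3, size=(n_animals, n_proteins))  # treatment shift

X = np.log2(np.vstack([pre, post]))
labels = np.array(["pre"] * n_animals + ["post"] * n_animals)

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for group in ("pre", "post"):
    centroid = scores[labels == group].mean(axis=0)
    print(group, "centroid on PC1/PC2:", np.round(centroid, 2))
```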
Nolan, John P.; Mandy, Francis
2008-01-01
While the term flow cytometry refers to the measurement of cells, the approach of making sensitive multiparameter optical measurements in a flowing sample stream is a very general analytical approach. The past few years have seen an explosion in the application of flow cytometry technology for molecular analysis and measurements using micro-particles as solid supports. While microsphere-based molecular analyses using flow cytometry date back three decades, the need for highly parallel quantitative molecular measurements that has arisen from various genomic and proteomic advances has driven the development of particle encoding technology to enable highly multiplexed assays. Multiplexed particle-based immunoassays are now commonplace, and new assays are being developed to study genes, protein function, and molecular assembly. Numerous efforts are underway to extend the multiplexing capabilities of microparticle-based assays through new approaches to particle encoding and analyte reporting. The impact of these developments will be seen in the basic research and clinical laboratories, as well as in drug development. PMID:16604537
A traits-based approach for prioritizing species for monitoring and surrogacy selection
Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.; ...
2016-11-28
The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.
NASA Astrophysics Data System (ADS)
Setiani, C.; Waluya, S. B.; Wardono
2018-03-01
The purposes of this research are: (1) to identify learning quality in Model Eliciting Activities (MEAs) using a Metaphorical Thinking (MT) approach, both qualitatively and quantitatively; and (2) to analyze the mathematical literacy of students based on Self-Efficacy (SE). This research is a mixed-methods concurrent embedded design with qualitative research as the primary method. The quantitative component used a quasi-experimental, non-equivalent control group design. The population is VIII grade students of SMP Negeri 3 Semarang, Indonesia. Quantitative data were examined by conducting a completeness mean test, standard completeness test, mean differentiation test and proportional differentiation test. Qualitative data were analyzed descriptively. The results of this research show that MEAs learning using the MT approach meets good criteria both quantitatively and qualitatively. Students with low self-efficacy can identify problems, but they lack the ability to devise problem-solving strategies for mathematical literacy questions. Students with medium self-efficacy can identify the information provided in problems, but they have difficulty using mathematical symbols to construct representations. Students with high self-efficacy are adept at representing problems as mathematical models and figures using appropriate symbols and tools, so they can readily devise strategies to solve mathematical literacy questions.
Chen, Xin; Qin, Shanshan; Chen, Shuai; Li, Jinlong; Li, Lixin; Wang, Zhongling; Wang, Quan; Lin, Jianping; Yang, Cheng; Shui, Wenqing
2015-01-01
In fragment-based lead discovery (FBLD), a cascade combining multiple orthogonal technologies is required for reliable detection and characterization of fragment binding to the target. Given the limitations of the mainstream screening techniques, we presented a ligand-observed mass spectrometry approach to expand the toolkits and increase the flexibility of building a FBLD pipeline especially for tough targets. In this study, this approach was integrated into a FBLD program targeting the HCV RNA polymerase NS5B. Our ligand-observed mass spectrometry analysis resulted in the discovery of 10 hits from a 384-member fragment library through two independent screens of complex cocktails and a follow-up validation assay. Moreover, this MS-based approach enabled quantitative measurement of weak binding affinities of fragments which was in general consistent with SPR analysis. Five out of the ten hits were then successfully translated to X-ray structures of fragment-bound complexes to lay a foundation for structure-based inhibitor design. With distinctive strengths in terms of high capacity and speed, minimal method development, easy sample preparation, low material consumption and quantitative capability, this MS-based assay is anticipated to be a valuable addition to the repertoire of current fragment screening techniques. PMID:25666181
Gunawardena, Harsha P; O'Brien, Jonathon; Wrobel, John A; Xie, Ling; Davies, Sherri R; Li, Shunqiang; Ellis, Matthew J; Qaqish, Bahjat F; Chen, Xian
2016-02-01
Single quantitative platforms such as label-based or label-free quantitation (LFQ) present compromises in accuracy, precision, protein sequence coverage, and speed of quantifiable proteomic measurements. To maximize the quantitative precision and the number of quantifiable proteins or the quantifiable coverage of tissue proteomes, we have developed a unified approach, termed QuantFusion, that combines the quantitative ratios of all peptides measured by both LFQ and label-based methodologies. Here, we demonstrate the use of QuantFusion in determining the proteins differentially expressed in a pair of patient-derived tumor xenografts (PDXs) representing two major breast cancer (BC) subtypes, basal and luminal. Label-based in-spectra quantitative peptides derived from amino acid-coded tagging (AACT, also known as SILAC) of a non-malignant mammary cell line were uniformly added to each xenograft with a constant predefined ratio, from which Ratio-of-Ratio estimates were obtained for the label-free peptides paired with AACT peptides in each PDX tumor. A mixed model statistical analysis was used to determine global differential protein expression by combining complementary quantifiable peptide ratios measured by LFQ and Ratio-of-Ratios, respectively. With the minimum number of replicates required to obtain statistically significant ratios, QuantFusion uses distinct mechanisms to "rescue" the missing data inherent to both LFQ and label-based quantitation. Combined quantifiable peptide data from both quantitative schemes increased the overall number of peptide-level measurements and protein-level estimates. In our analysis of the PDX tumor proteomes, QuantFusion increased the number of distinct peptide ratios by 65%, representing differentially expressed proteins between the BC subtypes. This improvement in quantifiable coverage, in turn, not only increased the number of measurable protein fold-changes by 8% but also increased the average precision of quantitative estimates by 181%, so that some BC subtype-specific proteins were rescued by QuantFusion. Thus, incorporating data from multiple quantitative approaches while accounting for measurement variability at both the peptide and global protein levels makes QuantFusion unique for obtaining increased coverage and quantitative precision for tissue proteomes. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
Lavallée-Adam, Mathieu; Yates, John R
2016-03-24
PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the Web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. © 2016 by John Wiley & Sons, Inc. Copyright © 2016 John Wiley & Sons, Inc.
ERIC Educational Resources Information Center
Vogel, Severine P.; Engelhard, George, Jr.
2011-01-01
The authors describe a quantitative approach based on Rasch measurement theory for evaluating classroom assessments within the context of foreign language classes. A secondary purpose was to examine the effects of two instructional approaches to teach grammar, a guided inductive and a deductive approach, through the lens of Rasch measurement…
2015-12-01
FINAL REPORT: Development and Validation of a Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation ... project ER-201129 was to develop and validate a framework used to make bioremediation decisions based on site-specific physical and biogeochemical ...
ERIC Educational Resources Information Center
Nelson, Regina K.
2013-01-01
A mixed-methods research study was designed to test whether undergraduate engineering students were better prepared to learn advanced topics in biomedical engineering if they learned physiology via a quantitative, concept-based approach rather than a qualitative, system-based approach. Experiments were conducted with undergraduate engineering…
Santos, Radleigh G.; Appel, Jon R.; Giulianotti, Marc A.; Edwards, Bruce S.; Sklar, Larry A.; Houghten, Richard A.; Pinilla, Clemencia
2014-01-01
In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays. PMID:23722730
A Multiplexed, Probe-Based Quantitative PCR Assay for DNA of Phytophthora sojae
USDA-ARS's Scientific Manuscript database
Phytophthora sojae (Kaufm. & Gerd.) causes seed rot, pre- and post-emergence damping off, and sometimes foliar blight in soybean (Glycine max). Crop loss may approach 100% with susceptible cultivars. We report here the development of a unique quantitative PCR assay specific to DNA of P. sojae, and a...
Quantitative magnetic resonance (QMR) measurement of changes in body composition of neonatal pigs
USDA-ARS's Scientific Manuscript database
The survival of low birth weight pigs in particular may depend on energy stores in the body. QMR (quantitative magnetic resonance) is a new approach to measuring total body fat, lean and water. These measurements are based on quantifying protons associated with lipid and water molecules in the body...
Understanding quantitative research: part 1.
Hoe, Juanita; Hoare, Zoë
This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.
Chakraborty, Pritam; Zhang, Yongfeng; Tonks, Michael R.
2015-12-07
The fracture behavior of brittle materials is strongly influenced by their underlying microstructure, which needs explicit consideration for accurate prediction of fracture properties and the associated scatter. In this work, a hierarchical multi-scale approach is pursued to model microstructure-sensitive brittle fracture. A quantitative phase-field based fracture model is utilized to capture the complex crack growth behavior in the microstructure, and the related parameters are calibrated from lower-length-scale atomistic simulations instead of engineering-scale experimental data. The workability of this approach is demonstrated by performing porosity-dependent intergranular fracture simulations in UO2 and comparing the predictions with experiments.
Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.
Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan
2017-01-01
Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.
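The relative quantitation step can be summarized in a short Python sketch: intensities are summed over the extracted isotope cluster of the "light"- and "heavy"-tagged forms of the same glycan, and their ratio gives the relative abundance between the two samples. The intensity values below are illustrative placeholders, not instrument data.

```python
import numpy as np

def relative_quantity(light_cluster, heavy_cluster):
    """Return the light/heavy abundance ratio and its log2 value."""
    light = float(np.sum(light_cluster))
    heavy = float(np.sum(heavy_cluster))
    ratio = light / heavy
    return ratio, np.log2(ratio)

# Extracted ion intensities for one N-glycan (light- vs heavy-labeled tag).
light_cluster = [4.1e5, 2.9e5, 1.2e5]      # isotopologue peaks, sample A
heavy_cluster = [2.0e5, 1.5e5, 0.6e5]      # isotopologue peaks, sample B
ratio, log2_ratio = relative_quantity(light_cluster, heavy_cluster)
print(f"A/B = {ratio:.2f} (log2 = {log2_ratio:.2f})")
```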
Agent-based modeling as a tool for program design and evaluation.
Lawlor, Jennifer A; McGirr, Sara
2017-12-01
Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
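A tiny agent-based model of the kind an evaluator might use to explore plausible performance targets is sketched below in Python: agents adopt a program behavior with a small base probability plus peer influence from already-adopting neighbors, and the adoption trajectory suggests what uptake levels are realistic by a given time point. All parameters and the network structure are illustrative assumptions, not an empirical model.

```python
import random

random.seed(0)

N_AGENTS, N_STEPS = 200, 20
BASE_P, PEER_WEIGHT, N_PEERS = 0.02, 0.25, 5

adopted = [False] * N_AGENTS
peers = [random.sample(range(N_AGENTS), N_PEERS) for _ in range(N_AGENTS)]

trajectory = []
for step in range(N_STEPS):
    for i in range(N_AGENTS):
        if not adopted[i]:
            peer_share = sum(adopted[j] for j in peers[i]) / N_PEERS
            if random.random() < BASE_P + PEER_WEIGHT * peer_share:
                adopted[i] = True          # agent adopts the program behavior
    trajectory.append(sum(adopted) / N_AGENTS)

print("adoption by step 10:", round(trajectory[9], 2),
      "and by step 20:", round(trajectory[-1], 2))
```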
Deep Learning for Brain MRI Segmentation: State of the Art and Future Directions.
Akkus, Zeynettin; Galimzianova, Alfiia; Hoogi, Assaf; Rubin, Daniel L; Erickson, Bradley J
2017-08-01
Quantitative analysis of brain MRI is routine for many neurological diseases and conditions and relies on accurate segmentation of structures of interest. Deep learning-based segmentation approaches for brain MRI are gaining interest due to their self-learning and generalization ability over large amounts of data. As the deep learning architectures are becoming more mature, they gradually outperform previous state-of-the-art classical machine learning algorithms. This review aims to provide an overview of current deep learning-based segmentation approaches for quantitative brain MRI. First we review the current deep learning architectures used for segmentation of anatomical brain structures and brain lesions. Next, the performance, speed, and properties of deep learning approaches are summarized and discussed. Finally, we provide a critical assessment of the current state and identify likely future developments and trends.
Clinical review: Reunification of acid–base physiology
Kellum, John A
2005-01-01
Recent advances in acid–base physiology and in the epidemiology of acid–base disorders have refined our understanding of the basic control mechanisms that determine blood pH in health and disease. These refinements have also brought parity between the newer, quantitative and older, descriptive approaches to acid–base physiology. This review explores how the new and older approaches to acid–base physiology can be reconciled and combined to result in a powerful bedside tool. A case-based tutorial is also provided. PMID:16277739
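As a small illustration of the quantitative (Stewart) approach mentioned above, the sketch below computes an apparent strong ion difference from plasma electrolytes; the electrolyte values are invented and the variable names are ours, not the review's.

```python
def apparent_sid(na, k, ca, mg, cl, lactate):
    """Apparent strong ion difference (mEq/L): strong cations minus strong anions."""
    return (na + k + ca + mg) - (cl + lactate)

# Hypothetical plasma values in mEq/L; a normal SID is roughly 40-42 mEq/L.
print(apparent_sid(na=140, k=4.0, ca=2.5, mg=1.0, cl=105, lactate=2.0))  # 40.5
```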
Reverse engineering systems models of regulation: discovery, prediction and mechanisms.
Ashworth, Justin; Wurtmann, Elisabeth J; Baliga, Nitin S
2012-08-01
Biological systems can now be understood in comprehensive and quantitative detail using systems biology approaches. Putative genome-scale models can be built rapidly based upon biological inventories and strategic system-wide molecular measurements. Current models combine statistical associations, causative abstractions, and known molecular mechanisms to explain and predict quantitative and complex phenotypes. This top-down 'reverse engineering' approach generates useful organism-scale models despite noise and incompleteness in data and knowledge. Here we review and discuss the reverse engineering of biological systems using top-down data-driven approaches, in order to improve discovery, hypothesis generation, and the inference of biological properties. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Debnath, Ashim Kumar; Chin, Hoong Chor
Navigational safety analysis relying on collision statistics is often hampered because of the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to the collision-based analysis for fast, reliable and effective safety assessment, thus possessing great potential for managing collision risks in port waters.
Biomarkers and Surrogate Endpoints in Uveitis: The Impact of Quantitative Imaging.
Denniston, Alastair K; Keane, Pearse A; Srivastava, Sunil K
2017-05-01
Uveitis is a major cause of sight loss across the world. The reliable assessment of intraocular inflammation in uveitis ('disease activity') is essential in order to score disease severity and response to treatment. In this review, we describe how 'quantitative imaging', the approach of using automated analysis and measurement algorithms across both standard and emerging imaging modalities, can develop objective instrument-based measures of disease activity. This is a narrative review based on searches of the current world literature using terms related to quantitative imaging techniques in uveitis, supplemented by clinical trial registry data, and expert knowledge of surrogate endpoints and outcome measures in ophthalmology. Current measures of disease activity are largely based on subjective clinical estimation, and are relatively insensitive, with poor discrimination and reliability. The development of quantitative imaging in uveitis is most established in the use of optical coherence tomographic (OCT) measurement of central macular thickness (CMT) to measure severity of macular edema (ME). The transformative effect of CMT in clinical assessment of patients with ME provides a paradigm for the development and impact of other forms of quantitative imaging. Quantitative imaging approaches are now being developed and validated for other key inflammatory parameters such as anterior chamber cells, vitreous haze, retinovascular leakage, and chorioretinal infiltrates. As new forms of quantitative imaging in uveitis are proposed, the uveitis community will need to evaluate these tools against the current subjective clinical estimates and reach a new consensus for how disease activity in uveitis should be measured. The development, validation, and adoption of sensitive and discriminatory measures of disease activity is an unmet need that has the potential to transform both drug development and routine clinical care for the patient with uveitis.
Elayavilli, Ravikumar Komandur; Liu, Hongfang
2016-01-01
Computational modeling of biological cascades is of great interest to quantitative biologists. Biomedical text has been a rich source for quantitative information. Gathering quantitative parameters and values from biomedical text is one significant challenge in the early steps of computational modeling as it involves huge manual effort. While automatically extracting such quantitative information from biomedical text may offer some relief, the lack of an ontological representation for a subdomain impedes the normalization of textual extractions to a standard representation. This may render textual extractions less meaningful to the domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining to a formal representation that may help in constructing ontology for ion channel events using a rule-based approach. We have developed Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), and the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting quantitative data assertions on an independently annotated blind data set. We further made an initial attempt in formalizing the quantitative data assertions extracted from the biomedical text into a formal representation that has the potential to facilitate the integration of text mining into the ontological workflow, a novel aspect of this study. This work is a case study where we created a platform that provides formal interaction between ontology development and text mining. We have achieved partial success in extracting quantitative assertions from the biomedical text and formalizing them in an ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
Tau-U: A Quantitative Approach for Analysis of Single-Case Experimental Data in Aphasia.
Lee, Jaime B; Cherney, Leora R
2018-03-01
Tau-U is a quantitative approach for analyzing single-case experimental design (SCED) data. It combines nonoverlap between phases with intervention phase trend and can correct for a baseline trend (Parker, Vannest, & Davis, 2011). We demonstrate the utility of Tau-U by comparing it with the standardized mean difference approach (Busk & Serlin, 1992) that is widely reported within the aphasia SCED literature. Repeated writing measures from 3 participants with chronic aphasia who received computer-based writing treatment are analyzed visually and quantitatively using both Tau-U and the standardized mean difference approach. Visual analysis alone was insufficient for determining an effect between the intervention and writing improvement. The standardized mean difference yielded effect sizes ranging from 4.18 to 26.72 for trained items and 1.25 to 3.20 for untrained items. Tau-U yielded significant (p < .05) effect sizes for 2 of 3 participants for trained probes and 1 of 3 participants for untrained probes. A baseline trend correction was applied to data from 2 of 3 participants. Tau-U has the unique advantage of allowing for the correction of an undesirable baseline trend. Although further study is needed, Tau-U shows promise as a quantitative approach to augment visual analysis of SCED data in aphasia.
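A minimal sketch of the nonoverlap component underlying Tau-U, assuming invented writing-probe scores; it computes only the basic A-versus-B Tau from all baseline-intervention pairs, whereas full Tau-U adds phase-trend and baseline-trend-correction terms.

```python
def tau_nonoverlap(baseline, intervention):
    """Basic A-vs-B Tau: (improving pairs - deteriorating pairs) / (nA * nB)."""
    pos = neg = 0
    for a in baseline:
        for b in intervention:
            if b > a:
                pos += 1
            elif b < a:
                neg += 1
    return (pos - neg) / (len(baseline) * len(intervention))

# Hypothetical scores for one participant.
baseline_scores = [2, 3, 2, 3, 2]
treatment_scores = [4, 5, 6, 6, 7, 8]
print(f"Tau = {tau_nonoverlap(baseline_scores, treatment_scores):.2f}")
```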
Loutfy, Mona; Greene, Saara; Kennedy, V Logan; Lewis, Johanna; Thomas-Pavanel, Jamie; Conway, Tracey; de Pokomandy, Alexandra; O'Brien, Nadia; Carter, Allison; Tharao, Wangari; Nicholson, Valerie; Beaver, Kerrigan; Dubuc, Danièle; Gahagan, Jacqueline; Proulx-Boucher, Karène; Hogg, Robert S; Kaida, Angela
2016-08-19
Community-based research has gained increasing recognition in health research over the last two decades. Such participatory research approaches are lauded for their ability to anchor research in lived experiences, ensuring cultural appropriateness, accessing local knowledge, reaching marginalized communities, building capacity, and facilitating research-to-action. While having these positive attributes, the community-based health research literature is predominantly composed of small projects, using qualitative methods, and set within geographically limited communities. Its use in larger health studies, including clinical trials and cohorts, is limited. We present the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS), a large-scale, multi-site, national, longitudinal quantitative study that has operationalized community-based research in all steps of the research process. Successes, challenges and further considerations are offered. Through the integration of community-based research principles, we have been successful in: facilitating a two-year long formative phase for this study; developing a novel survey instrument with national involvement; training 39 Peer Research Associates (PRAs); offering ongoing comprehensive support to PRAs; and engaging in an ongoing iterative community-based research process. Our community-based research approach within CHIWOS demanded that we be cognizant of challenges managing a large national team, inherent power imbalances and challenges with communication, compensation and volunteering considerations, and extensive delays in institutional processes. It is important to consider the iterative nature of community-based research and to work through tensions that emerge given the diverse perspectives of numerous team members. Community-based research, as an approach to large-scale quantitative health research projects, is an increasingly viable methodological option. Community-based research has several advantages that go hand-in-hand with its obstacles. We offer guidance on implementing this approach, such that the process can be better planned and result in success.
A disciplined approach to capital: today's healthcare imperative.
Dupuis, Patrick J; Kaufman, Kenneth
2007-07-01
BJC HealthCare's experience exemplifies several basic principles of a finance-based approach to capital. Organizations that adopt this approach look to improve processes first, remove costs second, and spend capital last. Multiyear planning is required to quantitatively identify the profitability and liquidity requirements of strategic initiatives and address essential funding and financing issues.
Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura
2018-06-01
There are several techniques used to analyze microplastics. These are often based on a combination of visual and spectroscopic techniques. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low and high density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastics mass based on DSC signal. We describe the potential and limitations of these techniques to increase reliability for microplastic analysis. Particle size was shown to have a particular influence on the qualitative and quantitative performance of DSC signals. Both identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) were affected by particle size. As a result, proper sample treatment, including sieving of suspended particles, is particularly important for this analytical approach.
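A rough sketch of the heat-flow-based mass quantitation idea, with an entirely synthetic DSC melting peak and an assumed calibration constant (the specific melting enthalpy in J/g would in practice be determined from polymer standards of the same type).

```python
import numpy as np

# Synthetic baseline-corrected DSC endotherm: temperature (°C) vs. heat flow (mW),
# recorded at a constant heating rate of 10 °C/min.
temp = np.linspace(100, 140, 401)
heat_flow_mw = 5.0 * np.exp(-((temp - 128.0) / 3.0) ** 2)

heating_rate_c_per_s = 10.0 / 60.0
dt = (temp[1] - temp[0]) / heating_rate_c_per_s        # seconds per sample
# Trapezoidal integration of the peak gives the absorbed enthalpy in joules.
enthalpy_j = float(np.sum((heat_flow_mw[:-1] + heat_flow_mw[1:]) / 2.0) * dt / 1000.0)

specific_enthalpy_j_per_g = 180.0   # assumed calibration value for this polymer
mass_mg = enthalpy_j / specific_enthalpy_j_per_g * 1000.0
print(f"peak enthalpy = {enthalpy_j:.3f} J -> estimated polymer mass = {mass_mg:.2f} mg")
```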
Quantitative filter forensics for indoor particle sampling.
Haaland, D; Siegel, J A
2017-03-01
Filter forensics is a promising indoor air investigation technique involving the analysis of dust which has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the considerable potential of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
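The quantitative link described above can be pictured as a simple mass balance; the sketch below is our own schematic reading (not the authors' model) and omits terms such as dust extraction efficiency and the analyzed fraction of the filter, with all numbers invented.

```python
def airborne_concentration(mass_on_filter_ug, flow_m3_per_h, runtime_h, capture_efficiency):
    """Time-averaged airborne concentration (µg/m3) implied by the contaminant
    mass recovered from HVAC filter dust: C = M / (Q * t * eta)."""
    sampled_air_m3 = flow_m3_per_h * runtime_h * capture_efficiency
    return mass_on_filter_ug / sampled_air_m3

# Hypothetical example: 250 µg recovered, 1700 m3/h flow, 400 h runtime, 60% capture.
c = airborne_concentration(250.0, 1700.0, 400.0, 0.60)
print(f"time-averaged concentration ≈ {c * 1000:.2f} ng/m3")
```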
Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero
2011-03-24
High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
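The scoring frameworks compared above build on single- and double-mutant fitness values; as background, here is the common multiplicative-model interaction score (not the paper's matrix approximation procedure), with invented fitness numbers.

```python
def interaction_score(w_a, w_b, w_ab):
    """Multiplicative-model genetic interaction score: epsilon = W_ab - W_a * W_b.
    Negative values suggest aggravating (synthetic sick/lethal) interactions,
    positive values suggest alleviating ones."""
    return w_ab - w_a * w_b

w_a, w_b = 0.85, 0.90
print(interaction_score(w_a, w_b, w_ab=0.55))   # clearly negative interaction
print(interaction_score(w_a, w_b, w_ab=0.80))   # mildly positive interaction
```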
Understanding Pre-Quantitative Risk in Projects
NASA Technical Reports Server (NTRS)
Cooper, Lynne P.
2011-01-01
Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.
Mesh Denoising based on Normal Voting Tensor and Binary Optimization.
Yadav, Sunil Kumar; Reitebuch, Ulrich; Polthier, Konrad
2017-08-17
This paper presents a two-stage mesh denoising algorithm. Unlike other traditional averaging approaches, our approach uses an element-based normal voting tensor to compute smooth surfaces. By introducing a binary optimization on the proposed tensor together with a local binary neighborhood concept, our algorithm better retains sharp features and produces smoother umbilical regions than previous approaches. On top of that, we provide a stochastic analysis on the different kinds of noise based on the average edge length. The quantitative results demonstrate that the performance of our method is better compared to state-of-the-art smoothing approaches.
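A minimal reading of the element-based normal voting tensor idea, assuming unit face weights and an invented neighborhood of face normals; the eigenvalue pattern of the tensor distinguishes flat regions, sharp edges and corners, which is what the feature-preserving denoising step exploits.

```python
import numpy as np

def voting_tensor(normals, weights):
    """Weighted sum of outer products of neighboring face normals."""
    T = np.zeros((3, 3))
    for n, w in zip(normals, weights):
        n = n / np.linalg.norm(n)
        T += w * np.outer(n, n)
    return T

# Two groups of face normals meeting at a sharp edge (invented example).
normals = np.array([[0, 0, 1], [0, 0.05, 1], [1, 0, 0], [1, 0.05, 0]], dtype=float)
T = voting_tensor(normals, weights=np.ones(len(normals)))
eigenvalues = np.sort(np.linalg.eigvalsh(T))[::-1]
print("eigenvalues (descending):", np.round(eigenvalues, 3))  # two dominant -> edge
```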
ERIC Educational Resources Information Center
de la Fuente, Maria J.
2006-01-01
Framed under a cognitive approach to task-based L2 learning, this study used a pedagogical approach to investigate the effects of three vocabulary lessons (one traditional and two task-based) on acquisition of basic meanings, forms and morphological aspects of Spanish words. Quantitative analysis performed on the data suggests that the type of…
The Effects of Argumentation Based Science Learning Approach on Creative Thinking Skills of Students
ERIC Educational Resources Information Center
Küçük Demir, Betül; Isleyen, Tevfik
2015-01-01
The aim of this study is to explore the effects of argumentation-based science learning (ABSL) approach on 9th Grade of Secondary Education students' creative thinking skills. The sample of the study included 22 9th grade of Secondary Education students in Bayburt in 2012-2013 academic year. In this study quantitative research method and…
2010-01-01
High-throughput genotype data can be used to identify genes important for local adaptation in wild populations, phenotypes in lab stocks, or disease-related traits in human medicine. Here we advance microarray-based genotyping for population genomics with Restriction Site Tiling Analysis. The approach simultaneously discovers polymorphisms and provides quantitative genotype data at 10,000s of loci. It is highly accurate and free from ascertainment bias. We apply the approach to uncover genomic differentiation in the purple sea urchin. PMID:20403197
Lackey, Daniel P; Carruth, Eric D; Lasher, Richard A; Boenisch, Jan; Sachse, Frank B; Hitchcock, Robert W
2011-11-01
Gap junctions play a fundamental role in intercellular communication in cardiac tissue. Various types of heart disease including hypertrophy and ischemia are associated with alterations of the spatial arrangement of gap junctions. Previous studies applied two-dimensional optical and electron-microscopy to visualize gap junction arrangements. In normal cardiomyocytes, gap junctions were primarily found at cell ends, but can be found also in more central regions. In this study, we extended these approaches toward three-dimensional reconstruction of gap junction distributions based on high-resolution scanning confocal microscopy and image processing. We developed methods for quantitative characterization of gap junction distributions based on analysis of intensity profiles along the principal axes of myocytes. The analyses characterized gap junction polarization at cell ends and higher-order statistical image moments of intensity profiles. The methodology was tested in rat ventricular myocardium. Our analysis yielded novel quantitative data on gap junction distributions. In particular, the analysis demonstrated that the distributions exhibit significant variability with respect to polarization, skewness, and kurtosis. We suggest that this methodology provides a quantitative alternative to current approaches based on visual inspection, with applications in particular in characterization of engineered and diseased myocardium. Furthermore, we propose that these data provide improved input for computational modeling of cardiac conduction.
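A small sketch of the kind of profile statistics described above (polarization at the cell ends plus skewness and kurtosis of the longitudinal intensity profile); the profile is synthetic and the 10% end window is an assumption of ours, not the authors' parameter.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 200)   # normalized position along the principal axis
rng = np.random.default_rng(0)
profile = (np.exp(-((x - 0.05) / 0.04) ** 2)
           + np.exp(-((x - 0.95) / 0.04) ** 2)
           + 0.1 * rng.random(x.size))          # synthetic gap junction signal

ends = (x < 0.1) | (x > 0.9)
polarization = profile[ends].sum() / profile.sum()

w = profile / profile.sum()                     # treat the profile as a distribution
mean = np.sum(w * x)
var = np.sum(w * (x - mean) ** 2)
skewness = np.sum(w * (x - mean) ** 3) / var ** 1.5
kurtosis = np.sum(w * (x - mean) ** 4) / var ** 2 - 3.0

print(f"polarization={polarization:.2f}  skewness={skewness:.2f}  excess kurtosis={kurtosis:.2f}")
```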
Luchins, Daniel
2012-01-01
The quality improvement model currently used in medicine and mental health was adopted from industry, where it developed out of early 20th-century efforts to apply a positivist/quantitative agenda to improving manufacturing. This article questions the application of this model to mental health care. It argues that (1) developing "operational definitions" for something as value-laden as "quality" risks conflating two realms, what we measure with what we value; (2) when measurements that are tied to individuals are aggregated to establish benchmarks and goals, unwarranted mathematical assumptions are made; (3) choosing clinical outcomes is problematic; (4) there is little relationship between process measures and clinical outcomes; and (5) since changes in quality indices do not relate to improved clinical care, management's reliance on such indices provides an illusory sense of control. An alternative model is the older, skill-based/qualitative approach to knowing, which relies on "implicit/expert" knowledge. These two approaches offer a series of contrasts: quality versus excellence, competence versus expertise, management versus leadership, extrinsic versus intrinsic rewards. The article concludes that we need not totally dispense with the current quality improvement model, but rather should balance quantitative efforts with the older qualitative approach in a mixed methods model.
van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie
2017-09-01
This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
Reese, Debbie Denise; Tabachnick, Barbara G.
2010-01-01
In this paper, the authors summarize a quantitative analysis demonstrating that the CyGaMEs toolset for embedded assessment of learning within instructional games measures growth in conceptual knowledge by quantifying player behavior. CyGaMEs stands for Cyberlearning through GaME-based, Metaphor Enhanced Learning Objects. Some scientists of…
NASA Astrophysics Data System (ADS)
Fettahlıoğlu, Pınar; Aydoğdu, Mustafa
2018-04-01
The purpose of this research is to investigate the effect of using argumentation and problem-based learning approaches on the development of environmentally responsible behaviours among pre-service science teachers. Experimental activities were implemented for 14 weeks for 52 class hours in an environmental education class within a science teaching department. A mixed method was used as a research design; particularly, a special type of Concurrent Nested Strategy was applied. The quantitative portion was based on the one-group pre-test and post-test models, and the qualitative portion was based on the holistic multiple-case study method. The quantitative portion of the research was conducted with 34 third-year pre-service science teachers studying at a state university. The qualitative portion of the study was conducted with six pre-service science teachers selected among the 34 pre-service science teachers based on the pre-test results obtained from an environmentally responsible behaviour scale. t tests for dependent groups were used to analyse quantitative data. Both descriptive and content analyses of the qualitative data were performed. The results of the study showed that the use of the argumentation and problem-based learning approaches significantly contributed to the development of environmentally responsible behaviours among pre-service science teachers.
Yang, Guang; Zhao, Xin; Wen, Jun; Zhou, Tingting; Fan, Guorong
2017-04-01
An analytical approach including fingerprint, quantitative analysis and rapid screening of anti-oxidative components was established and successfully applied for the comprehensive quality control of Rhizoma Smilacis Glabrae (RSG), a well-known Traditional Chinese Medicine with the homology of medicine and food. Thirteen components were tentatively identified based on their retention behavior, UV absorption and MS fragmentation patterns. Chemometric analysis based on coulometric array data was performed to evaluate the similarity and variation between fifteen batches. Eight discriminating components were quantified using single-compound calibration. The unit responses of those components in coulometric array detection were calculated and compared with those of several compounds reported to possess antioxidant activity, and four of them were tentatively identified as main contributors to the total anti-oxidative activity. The main advantage of the proposed approach was that it realized simultaneous fingerprint, quantitative analysis and screening of anti-oxidative components, providing comprehensive information for quality assessment of RSG. Copyright © 2017 Elsevier B.V. All rights reserved.
Industrial ecology: Quantitative methods for exploring a lower carbon future
NASA Astrophysics Data System (ADS)
Thomas, Valerie M.
2015-03-01
Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
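Two of the engineering-economics quantities named above, net present value and levelized cost of energy, reduce to short formulas; the sketch below uses invented cash flows for a small generation asset.

```python
def npv(rate, cashflows):
    """Net present value of a cash-flow series; cashflows[0] occurs in year 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def lcoe(capex, annual_opex, annual_energy_kwh, rate, lifetime_years):
    """Levelized cost of energy: discounted lifetime costs per discounted kWh."""
    years = range(1, lifetime_years + 1)
    costs = capex + sum(annual_opex / (1.0 + rate) ** t for t in years)
    energy = sum(annual_energy_kwh / (1.0 + rate) ** t for t in years)
    return costs / energy

# Hypothetical installation: 10,000 upfront, 1,400/year net revenue, 8,000 kWh/year.
print(f"NPV  = {npv(0.05, [-10000] + [1400] * 15):,.0f}")
print(f"LCOE = {lcoe(10000, 100, 8000, 0.05, 15):.3f} per kWh")
```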
NASA Astrophysics Data System (ADS)
Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn
2016-03-01
Digital holographic microscopy (DHM) enables high resolution non-destructive inspection of technical surfaces and minimally-invasive label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by quantification of drug induced cell morphology changes and it is shown that the method is capable of reliably quantifying global morphology changes of confluent cell layers.
Lange, Vinzenz; Malmström, Johan A; Didion, John; King, Nichole L; Johansson, Björn P; Schäfer, Juliane; Rameseder, Jonathan; Wong, Chee-Hong; Deutsch, Eric W; Brusniak, Mi-Youn; Bühlmann, Peter; Björck, Lars; Domon, Bruno; Aebersold, Ruedi
2008-08-01
In many studies, particularly in the field of systems biology, it is essential that identical protein sets are precisely quantified in multiple samples such as those representing differentially perturbed cell states. The high degree of reproducibility required for such experiments has not been achieved by classical mass spectrometry-based proteomics methods. In this study we describe the implementation of a targeted quantitative approach by which predetermined protein sets are first identified and subsequently quantified at high sensitivity reliably in multiple samples. This approach consists of three steps. First, the proteome is extensively mapped out by multidimensional fractionation and tandem mass spectrometry, and the data generated are assembled in the PeptideAtlas database. Second, based on this proteome map, peptides uniquely identifying the proteins of interest, proteotypic peptides, are selected, and multiple reaction monitoring (MRM) transitions are established and validated by MS2 spectrum acquisition. This process of peptide selection, transition selection, and validation is supported by a suite of software tools, TIQAM (Targeted Identification for Quantitative Analysis by MRM), described in this study. Third, the selected target protein set is quantified in multiple samples by MRM. Applying this approach we were able to reliably quantify low abundance virulence factors from cultures of the human pathogen Streptococcus pyogenes exposed to increasing amounts of plasma. The resulting quantitative protein patterns enabled us to clearly define the subset of virulence proteins that is regulated upon plasma exposure.
Fu, Guifang; Dai, Xiaotian; Symanzik, Jürgen; Bushman, Shaun
2017-01-01
Leaf shape traits have long been a focus of many disciplines, but the complex genetic and environmental interactive mechanisms regulating leaf shape variation have not yet been investigated in detail. The question of the respective roles of genes and environment and how they interact to modulate leaf shape is a thorny evolutionary problem, and sophisticated methodology is needed to address it. In this study, we investigated a framework-level approach that inputs shape image photographs and genetic and environmental data, and then outputs the relative importance ranks of all variables after integrating shape feature extraction, dimension reduction, and tree-based statistical models. The power of the proposed framework was confirmed by simulation and a Populus szechuanica var. tibetica data set. This new methodology resulted in the detection of novel shape characteristics, and also confirmed some previous findings. The quantitative modeling of a combination of polygenetic, plastic, epistatic, and gene-environment interactive effects, as investigated in this study, will improve the discernment of quantitative leaf shape characteristics, and the methods are ready to be applied to other leaf morphology data sets. Unlike the majority of approaches in the quantitative leaf shape literature, this framework-level approach is data-driven, without assuming any pre-known shape attributes, landmarks, or model structures. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.
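The paper's pipeline (shape feature extraction, dimension reduction, tree-based models) outputs relative importance ranks of predictors; as a stand-in, the sketch below ranks synthetic genetic and environmental predictors of a single shape feature with a random forest, an assumed choice of tree-based model using scikit-learn rather than the authors' exact implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 300

# Synthetic predictors: two genetic markers, one environmental variable, one noise column.
X = rng.normal(size=(n, 4))
# Synthetic shape feature driven by marker 1, environment, and their interaction.
y = 1.5 * X[:, 0] + 0.8 * X[:, 2] + 1.0 * X[:, 0] * X[:, 2] + rng.normal(scale=0.5, size=n)

model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)
names = ["marker_1", "marker_2", "environment", "noise"]
for name, importance in sorted(zip(names, model.feature_importances_), key=lambda p: -p[1]):
    print(f"{name:12s} importance = {importance:.3f}")
```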
Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?
Gizak, Agnieszka; Rakus, Dariusz
2016-01-11
Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems' complexity, this requirement is rarely, if ever, accomplished. Typically, it is limited to a quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of expression/post-translational modifications changes within selected proteins. A quantitative proteomics approach gives a possibility of quantitative characterization of the entire proteome of a biological system, in the context of the titer of proteins as well as their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.
Ishikawa, Akira
2017-11-27
Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.
Quantitative genetic models of sexual conflict based on interacting phenotypes.
Moore, Allen J; Pizzari, Tommaso
2005-05-01
Evolutionary conflict arises between reproductive partners when alternative reproductive opportunities are available. Sexual conflict can generate sexually antagonistic selection, which mediates sexual selection and intersexual coevolution. However, despite intense interest, the evolutionary implications of sexual conflict remain unresolved. We propose a novel theoretical approach to study the evolution of sexually antagonistic phenotypes based on quantitative genetics and the measure of social selection arising from male-female interactions. We consider the phenotype of one sex as both a genetically influenced evolving trait as well as the (evolving) social environment in which the phenotype of the opposite sex evolves. Several important points emerge from our analysis, including the relationship between direct selection on one sex and indirect effects through selection on the opposite sex. We suggest that the proposed approach may be a valuable tool to complement other theoretical approaches currently used to study sexual conflict. Most importantly, our approach highlights areas where additional empirical data can help clarify the role of sexual conflict in the evolutionary process.
Bandara, Gayan C; Heist, Christopher A; Remcho, Vincent T
2018-02-20
Copper is widely applied in industrial and technological applications and is an essential micronutrient for humans and animals. However, exposure to high environmental levels of copper, especially through drinking water, can lead to copper toxicity, resulting in severe acute and chronic health effects. Therefore, regular monitoring of aqueous copper ions has become necessary as recent anthropogenic activities have led to elevated environmental concentrations of copper. On-site monitoring processes require an inexpensive, simple, and portable analytical approach capable of generating reliable qualitative and quantitative data efficiently. Membrane-based lateral flow microfluidic devices are ideal candidates as they facilitate rapid, inexpensive, and portable measurements. Here we present a simple, chromatographic separation approach in combination with a visual detection method for Cu2+ quantitation, performed in a lateral flow microfluidic channel. This method appreciably minimizes interferences by incorporating a nonspecific polymer inclusion membrane (PIM) based assay with a "dot-counting" approach to quantification. In this study, hydrophobic polycaprolactone (PCL)-filled glass microfiber (GMF) membranes were used as the base substrate onto which the PIM was evenly dispensed as an array of dots. The devices thus prepared were then selectively exposed to oxygen radicals through a mask to generate a hydrophilic surface path along which the sample was wicked. Using this approach, copper concentrations from 1 to 20 ppm were quantified from 5 μL samples using only visual observation of the assay device.
Theoretical foundations for a quantitative approach to paleogenetics. I, II.
NASA Technical Reports Server (NTRS)
Holmquist, R.
1972-01-01
It is shown that by neglecting the phenomena of multiple hits, back mutation, and chance coincidence, errors larger than 100% can be introduced in the calculated value of the average number of nucleotide base differences to be expected between two homologous polynucleotides. Mathematical formulas are derived to correct quantitatively for these effects. It is pointed out that the effects materially change the quantitative aspects of phylogenies, such as the length of the legs of the trees. A number of problems are solved without approximation.
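Holmquist derives his own correction formulas, which are not reproduced here; purely to illustrate the size of the effect of ignoring multiple hits and back mutation, the sketch below applies the simpler, standard Jukes-Cantor correction to observed base-difference fractions.

```python
import math

def jukes_cantor_distance(p):
    """Expected substitutions per site for an observed fraction p of differing sites,
    correcting for multiple hits and back mutation (Jukes-Cantor model)."""
    if p >= 0.75:
        raise ValueError("observed difference too large for this correction")
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

for p in (0.05, 0.20, 0.40, 0.60):
    d = jukes_cantor_distance(p)
    print(f"observed {p:.2f} -> corrected {d:.2f} "
          f"(uncorrected value understates divergence by {100 * (d - p) / d:.0f}%)")
```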
NASA Astrophysics Data System (ADS)
Costanzi, Stefano; Tikhonova, Irina G.; Harden, T. Kendall; Jacobson, Kenneth A.
2009-11-01
Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, the direct study of the receptor-ligand interactions, or a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationships (3D-QSAR) techniques, not requiring knowledge of the receptor structure, have been historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent from training sets and have a sufficient level of accuracy to allow an effective discrimination between binders and nonbinders, thus qualifying as viable lead discovery tools. The combination of ligand and structure-based methodologies in the form of receptor-based 3D-QSAR and ligand and structure-based consensus models results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.
ERIC Educational Resources Information Center
Pekbay, Canay; Yilmaz, Serkan
2015-01-01
This study aims to explore the influence of nature of science (NOS) activities based on explicit-reflective and historical approach on preservice elementary teachers' views of NOS aspects. Mixed-method approach including both qualitative and quantitative methods was used. The sample consisted of 83 preservice elementary teachers of a public…
Boersema, Paul J.; Foong, Leong Yan; Ding, Vanessa M. Y.; Lemeer, Simone; van Breukelen, Bas; Philp, Robin; Boekhorst, Jos; Snel, Berend; den Hertog, Jeroen; Choo, Andre B. H.; Heck, Albert J. R.
2010-01-01
Several mass spectrometry-based assays have emerged for the quantitative profiling of cellular tyrosine phosphorylation. Ideally, these methods should reveal the exact sites of tyrosine phosphorylation, be quantitative, and not be cost-prohibitive. The latter is often an issue as typically several milligrams of (stable isotope-labeled) starting protein material are required to enable the detection of low abundance phosphotyrosine peptides. Here, we adopted and refined a peptide-centric immunoaffinity purification approach for the quantitative analysis of tyrosine phosphorylation by combining it with a cost-effective stable isotope dimethyl labeling method. We were able to identify by mass spectrometry, using just two LC-MS/MS runs, more than 1100 unique non-redundant phosphopeptides in HeLa cells from about 4 mg of starting material without requiring any further affinity enrichment as close to 80% of the identified peptides were tyrosine phosphorylated peptides. Stable isotope dimethyl labeling could be incorporated prior to the immunoaffinity purification, even for the large quantities (mg) of peptide material used, enabling the quantification of differences in tyrosine phosphorylation upon pervanadate treatment or epidermal growth factor stimulation. Analysis of the epidermal growth factor-stimulated HeLa cells, a frequently used model system for tyrosine phosphorylation, resulted in the quantification of 73 regulated unique phosphotyrosine peptides. The quantitative data were found to be exceptionally consistent with the literature, evidencing that such a targeted quantitative phosphoproteomics approach can provide reproducible results. In general, the combination of immunoaffinity purification of tyrosine phosphorylated peptides with large scale stable isotope dimethyl labeling provides a cost-effective approach that can alleviate variation in sample preparation and analysis as samples can be combined early on. Using this approach, a rather complete qualitative and quantitative picture of tyrosine phosphorylation signaling events can be generated. PMID:19770167
Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H
2016-01-01
Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a 6-order-of-magnitude range, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
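A minimal sketch of quantifying one target protein from a standard curve, as described above; the concentrations and light/heavy MRM peak-area ratios are invented, and real assays typically use weighted regression rather than the plain least-squares fit shown here.

```python
import numpy as np

# Hypothetical calibration: known concentrations of the light peptide spiked over a
# fixed amount of SIS (heavy) internal standard, with measured light/heavy area ratios.
conc_ng_ml = np.array([10, 50, 100, 500, 1000, 5000], dtype=float)
area_ratio = np.array([0.012, 0.055, 0.11, 0.54, 1.08, 5.40])

slope, intercept = np.polyfit(conc_ng_ml, area_ratio, deg=1)

def back_calculate(ratio):
    """Concentration implied by a measured light/heavy peak-area ratio."""
    return (ratio - intercept) / slope

print(f"slope = {slope:.4e}, intercept = {intercept:.4e}")
print(f"sample with ratio 0.80 -> {back_calculate(0.80):.0f} ng/mL")
```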
Towards Measurement of Confidence in Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh J.; Habli, Ibrahim
2011-01-01
Arguments in safety cases are predominantly qualitative. This is partly attributed to the lack of sufficient design and operational data necessary to measure the achievement of high-dependability targets, particularly for safety-critical functions implemented in software. The subjective nature of many forms of evidence, such as expert judgment and process maturity, also contributes to the overwhelming dependence on qualitative arguments. However, where data for quantitative measurements is systematically collected, quantitative arguments provide far greater benefits than qualitative arguments in assessing confidence in the safety case. In this paper, we propose a basis for developing and evaluating integrated qualitative and quantitative safety arguments based on the Goal Structuring Notation (GSN) and Bayesian Networks (BN). The approach we propose identifies structures within GSN-based arguments where uncertainties can be quantified. BN are then used to provide a means to reason about confidence in a probabilistic way. We illustrate our approach using a fragment of a safety case for an unmanned aerial system and conclude with some preliminary observations.
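To show the flavor of quantifying confidence with a Bayesian network, here is a deliberately tiny two-node example computed by hand; the structure and probabilities are ours for illustration and are not taken from the paper's safety-case fragment.

```python
# E = "supporting evidence is trustworthy", C = "safety claim holds".
p_e = 0.85                 # prior belief in the evidence
p_c_given_e = 0.95         # claim confidence if the evidence is trustworthy
p_c_given_not_e = 0.30     # claim confidence if it is not

# Marginal confidence in the claim.
p_c = p_c_given_e * p_e + p_c_given_not_e * (1.0 - p_e)
print(f"P(claim holds) = {p_c:.3f}")

# Diagnostic reasoning: if the claim failed, how likely was untrustworthy evidence?
p_not_e_given_not_c = (1.0 - p_c_given_not_e) * (1.0 - p_e) / (1.0 - p_c)
print(f"P(evidence untrustworthy | claim fails) = {p_not_e_given_not_c:.3f}")
```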
Lin, Hongjun; Zhang, Meijia; Mei, Rongwu; Chen, Jianrong; Hong, Huachang
2014-11-01
This study proposed a novel approach for quantitative evaluation of the physicochemical interactions between a particle and rough surface. The approach adopts the composite Simpson's rule to numerically calculate the double integrals in the surface element integration of these physicochemical interactions. The calculation could be achieved by a MATLAB program based on this approach. This approach was then applied to assess the physicochemical interactions between rough membrane surface and sludge foulants in a submerged membrane bioreactor (MBR). The results showed that, as compared with smooth membrane surface, rough membrane surface had a much lower strength of interactions with sludge foulants. Meanwhile, membrane surface morphology significantly affected the strength and properties of the interactions. This study showed that the newly developed approach was feasible, and could serve as a primary tool for investigating membrane fouling in MBRs. Copyright © 2014 Elsevier Ltd. All rights reserved.
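The numerical core of the approach, a composite Simpson's rule for a double integral, is easy to sketch; the version below is in Python rather than the authors' MATLAB program, and the integrand is a toy stand-in for the interaction-energy-per-area term integrated over a surface patch.

```python
import numpy as np

def simpson_2d(f, ax, bx, ay, by, nx=50, ny=50):
    """Composite Simpson's rule for the double integral of f over [ax,bx] x [ay,by].
    nx and ny are the (even) numbers of subintervals in x and y."""
    if nx % 2 or ny % 2:
        raise ValueError("nx and ny must be even")
    x = np.linspace(ax, bx, nx + 1)
    y = np.linspace(ay, by, ny + 1)
    wx = np.ones(nx + 1); wx[1:-1:2] = 4; wx[2:-1:2] = 2   # 1,4,2,...,2,4,1 weights
    wy = np.ones(ny + 1); wy[1:-1:2] = 4; wy[2:-1:2] = 2
    X, Y = np.meshgrid(x, y, indexing="ij")
    hx, hy = (bx - ax) / nx, (by - ay) / ny
    return (hx * hy / 9.0) * np.sum(np.outer(wx, wy) * f(X, Y))

# Toy integrand standing in for an interaction-energy density over a rough patch.
f = lambda X, Y: np.exp(-(X ** 2 + Y ** 2))
print(simpson_2d(f, -2, 2, -2, 2))   # ~3.11 (pi * erf(2)^2)
```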
Standardised Library Instruction Assessment: An Institution-Specific Approach
ERIC Educational Resources Information Center
Staley, Shannon M.; Branch, Nicole A.; Hewitt, Tom L.
2010-01-01
Introduction: We explore the use of a psychometric model for locally-relevant, information literacy assessment, using an online tool for standardised assessment of student learning during discipline-based library instruction sessions. Method: A quantitative approach to data collection and analysis was used, employing standardised multiple-choice…
ERIC Educational Resources Information Center
Wolusky, G. Anthony
2016-01-01
This quantitative study used a web-based questionnaire to assess the attitudes and perceptions of online and hybrid faculty towards student-centered asynchronous virtual teamwork (AVT) using the technology acceptance model (TAM) of Davis (1989). AVT is online student participation in a team approach to problem-solving culminating in a written…
Mapping Quantitative Traits in Unselected Families: Algorithms and Examples
Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David
2009-01-01
Linkage analysis has been widely used to identify from family data genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departure from normality assumptions. Regression-based approaches are more robust but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic which in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016
Making predictions of mangrove deforestation: a comparison of two methods in Kenya.
Rideout, Alasdair J R; Joshi, Neha P; Viergever, Karin M; Huxham, Mark; Briers, Robert A
2013-11-01
Deforestation of mangroves is of global concern given their importance for carbon storage, biogeochemical cycling and the provision of other ecosystem services, but the links between rates of loss and potential drivers or risk factors are rarely evaluated. Here, we identified key drivers of mangrove loss in Kenya and compared two different approaches to predicting risk. Risk factors tested included various possible predictors of anthropogenic deforestation, related to population, suitability for land use change and accessibility. Two approaches were taken to modelling risk; a quantitative statistical approach and a qualitative categorical ranking approach. A quantitative model linking rates of loss to risk factors was constructed based on generalized least squares regression and using mangrove loss data from 1992 to 2000. Population density, soil type and proximity to roads were the most important predictors. In order to validate this model it was used to generate a map of losses of Kenyan mangroves predicted to have occurred between 2000 and 2010. The qualitative categorical model was constructed using data from the same selection of variables, with the coincidence of different risk factors in particular mangrove areas used in an additive manner to create a relative risk index which was then mapped. Quantitative predictions of loss were significantly correlated with the actual loss of mangroves between 2000 and 2010 and the categorical risk index values were also highly correlated with the quantitative predictions. Hence, in this case the relatively simple categorical modelling approach was of similar predictive value to the more complex quantitative model of mangrove deforestation. The advantages and disadvantages of each approach are discussed, and the implications for mangroves are outlined. © 2013 Blackwell Publishing Ltd.
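As a concrete picture of the qualitative categorical ranking approach (the quantitative model in the paper is a generalized least squares regression and is not reproduced here), the sketch below scores three invented risk factors per management unit and sums them into a relative risk index; all thresholds and values are assumptions.

```python
import numpy as np

# Hypothetical risk factors for five mangrove management units.
pop_density = np.array([120, 15, 300, 40, 80])        # people per km2
road_dist_km = np.array([1.5, 12.0, 0.5, 6.0, 3.0])   # distance to nearest road
soil_suitable = np.array([1, 0, 1, 1, 0])             # 1 = suitable for conversion

# Score each factor 0 (low risk) to 2 (high risk) and sum into a relative index.
pop_score = np.digitize(pop_density, [50, 150])
road_score = 2 - np.digitize(road_dist_km, [2.0, 8.0])   # closer roads = higher risk
risk_index = pop_score + road_score + 2 * soil_suitable

for unit, score in enumerate(risk_index, start=1):
    print(f"unit {unit}: relative risk index = {score}")
```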
Endobiogeny: a global approach to systems biology (part 1 of 2).
Lapraz, Jean-Claude; Hedayat, Kamyar M
2013-01-01
Endobiogeny is a global systems approach to human biology that may offer an advancement in clinical medicine based in scientific principles of rigor and experimentation and the humanistic principles of individualization of care and alleviation of suffering with minimization of harm. Endobiogeny is neither a movement away from modern science nor an uncritical embracing of pre-rational methods of inquiry but a synthesis of quantitative and qualitative relationships reflected in a systems-approach to life and based on new mathematical paradigms of pattern recognition.
Partovi, Sasan; Yuh, Roger; Pirozzi, Sara; Lu, Ziang; Couturier, Spencer; Grosse, Ulrich; Schluchter, Mark D; Nelson, Aaron; Jones, Robert; O’Donnell, James K; Faulhaber, Peter
2017-01-01
The objective of this study was to assess the ability of a quantitative software-aided approach to improve the diagnostic accuracy of 18F FDG PET for Alzheimer’s dementia over visual analysis alone. Twenty normal subjects (M:F-12:8; mean age 80.6 years) and twenty mild AD subjects (M:F-12:8; mean age 70.6 years) with 18F FDG PET scans were obtained from the ADNI database. Three blinded readers interpreted these PET images first using a visual qualitative approach and then using a quantitative software-aided approach. Images were classified on two five-point scales based on normal/abnormal (1-definitely normal; 5-definitely abnormal) and presence of AD (1-definitely not AD; 5-definitely AD). Diagnostic sensitivity, specificity, and accuracy for both approaches were compared based on the aforementioned scales. The sensitivity, specificity, and accuracy for the normal vs. abnormal readings of all readers combined were higher when comparing the software-aided vs. visual approach (sensitivity 0.93 vs. 0.83 P = 0.0466; specificity 0.85 vs. 0.60 P = 0.0005; accuracy 0.89 vs. 0.72 P<0.0001). The specificity and accuracy for absence vs. presence of AD of all readers combined were higher when comparing the software-aided vs. visual approach (specificity 0.90 vs. 0.70 P = 0.0008; accuracy 0.81 vs. 0.72 P = 0.0356). Sensitivities of the software-aided and visual approaches did not differ significantly (0.72 vs. 0.73 P = 0.74). The quantitative software-aided approach appears to improve the performance of 18F FDG PET for the diagnosis of mild AD. It may be helpful for experienced 18F FDG PET readers analyzing challenging cases. PMID:28123864
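The performance figures above reduce to a confusion-matrix calculation once the five-point readings are dichotomized; the sketch below uses invented ratings and assumes a cut-off of 4 or higher counting as an abnormal/AD read.

```python
def diagnostic_performance(ratings, truth, threshold=4):
    """Sensitivity, specificity and accuracy of 5-point ratings dichotomized
    at >= threshold, against ground-truth labels (True = disease present)."""
    tp = sum(r >= threshold and t for r, t in zip(ratings, truth))
    tn = sum(r < threshold and not t for r, t in zip(ratings, truth))
    fp = sum(r >= threshold and not t for r, t in zip(ratings, truth))
    fn = sum(r < threshold and t for r, t in zip(ratings, truth))
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / len(truth)

# Hypothetical reads for ten scans (first five are AD cases).
truth = [True, True, True, True, True, False, False, False, False, False]
ratings = [5, 4, 3, 5, 4, 2, 1, 4, 2, 1]
sens, spec, acc = diagnostic_performance(ratings, truth)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} accuracy={acc:.2f}")
```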
Kalbar, Pradip P; Karmakar, Subhankar; Asolekar, Shyam R
2013-10-15
The application of multiple-attribute decision-making (MADM) to real life decision problems suggests that avoiding the loss of information through scenario-based approaches and including expert opinions in the decision-making process are two major challenges that require more research efforts. Recently, a wastewater treatment technology selection effort has been made with a 'scenario-based' method of MADM. This paper focuses on a novel approach to incorporate expert opinions into the scenario-based decision-making process, as expert opinions play a major role in the selection of treatment technologies. The sets of criteria and the indicators that are used consist of both qualitative and quantitative criteria. The group decision-making (GDM) approach that is implemented for aggregating expert opinions is based on an analytical hierarchy process (AHP), which is the most widely used MADM method. The pairwise comparison matrices (PCMs) for qualitative criteria are formed based on expert opinions, whereas, a novel approach is proposed for generating PCMs for quantitative criteria. It has been determined that the experts largely prefer natural treatment systems because they are more sustainable in any scenario. However, PCMs based on expert opinions suggest that advanced technologies such as the sequencing batch reactor (SBR) can also be appropriate for a given decision scenario. The proposed GDM approach is a rationalized process that will be more appropriate in realistic scenarios where multiple stakeholders with local and regional societal priorities are involved in the selection of treatment technology. Copyright © 2013 Elsevier Ltd. All rights reserved.
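A minimal sketch of the AHP step at the heart of the group decision-making approach: deriving priority weights and a consistency ratio from a single aggregated pairwise comparison matrix. The matrix is invented; in the group setting the expert PCMs would first be aggregated (commonly by element-wise geometric mean), and the random index of 0.90 for a 4 x 4 matrix follows Saaty's table.

```python
import numpy as np

# Hypothetical aggregated pairwise comparison matrix for four treatment technologies
# on one criterion (Saaty 1-9 scale, reciprocal matrix).
pcm = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])

eigenvalues, eigenvectors = np.linalg.eig(pcm)
principal = np.argmax(eigenvalues.real)
weights = np.abs(eigenvectors[:, principal].real)
weights /= weights.sum()                       # priority weights, sum to 1

lam_max = eigenvalues.real[principal]
ci = (lam_max - len(pcm)) / (len(pcm) - 1)     # consistency index
cr = ci / 0.90                                 # consistency ratio (RI = 0.90 for n=4)

print("priority weights:", np.round(weights, 3))
print(f"lambda_max = {lam_max:.3f}, CI = {ci:.3f}, CR = {cr:.3f}")
```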
Barbosa, Jocelyn; Lee, Kyubum; Lee, Sunwon; Lodhi, Bilal; Cho, Jae-Gu; Seo, Woo-Keun; Kang, Jaewoo
2016-03-12
Facial palsy or paralysis (FP) is a symptom in which voluntary muscle movement is lost on one side of the face, which can be devastating for patients. Traditional assessment methods depend solely on the clinician's judgment and are therefore time-consuming and subjective. Hence, a quantitative assessment system is invaluable for physicians beginning the rehabilitation process, and producing a reliable and robust method remains a challenge. We introduce a novel approach for a quantitative assessment of facial paralysis that tackles the classification problem of FP type and degree of severity. Specifically, a novel method of quantitative assessment is presented: an algorithm that extracts the human iris and detects facial landmarks; and a hybrid approach combining rule-based and machine learning algorithms to analyze and prognosticate facial paralysis using the captured images. A method combining the optimized Daugman's algorithm and Localized Active Contour (LAC) model is proposed to efficiently extract the iris and facial landmark or key points. To improve the performance of LAC, appropriate parameters of the initial evolving curve for facial features' segmentation are automatically selected. The symmetry score is measured by the ratio between features extracted from the two sides of the face. Hybrid classifiers (i.e. rule-based with regularized logistic regression) were employed for discriminating healthy and unhealthy subjects, FP type classification, and for facial paralysis grading based on the House-Brackmann (H-B) scale. Quantitative analysis was performed to evaluate the performance of the proposed approach. Experiments show that the proposed method demonstrates its efficiency. Facial movement feature extraction on facial images based on iris segmentation and LAC-based key point detection, along with a hybrid classifier, provides a more efficient way of addressing the classification problem of facial palsy type and degree of severity. Combining iris segmentation and the key point-based method has several merits that are essential for our real application. Aside from the facial key points, iris segmentation makes a significant contribution as it describes the changes in iris exposure while performing facial expressions. It reveals the significant difference between the healthy side and the severe palsy side when raising eyebrows with both eyes directed upward, and can model the typical changes in the iris region.
Quantitative consensus of supervised learners for diffuse lung parenchymal HRCT patterns
NASA Astrophysics Data System (ADS)
Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Bartholmai, Brian J.; Robb, Richard A.
2013-03-01
Automated lung parenchymal classification usually relies on supervised learning of expert-chosen regions representative of the visually differentiable HRCT patterns specific to different pathologies (e.g., emphysema, ground glass, honeycombing, reticular and normal). Considering the elusiveness of a single most discriminating similarity measure, a plurality of weak learners can be combined to improve machine learnability. Though a number of quantitative combination strategies exist, their efficacy is data and domain dependent. In this paper, we investigate multiple (N=12) quantitative consensus approaches to combine the clusters obtained with multiple (n=33) probability density-based similarity measures. Our study shows that hypergraph-based meta-clustering and probabilistic clustering provide optimal expert-metric agreement.
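As an illustration of combining several weak clusterings into one consensus labeling, the sketch below uses a simple co-association (evidence-accumulation) consensus rather than the hypergraph-based meta-clustering or probabilistic clustering that the study found optimal; the toy labelings and cluster count are hypothetical.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def consensus_labels(labelings, n_clusters):
    """Evidence-accumulation consensus: average co-membership over all base
    clusterings, then cut a hierarchical clustering of 1 - co-association."""
    labelings = np.asarray(labelings)            # shape: (n_clusterings, n_samples)
    n = labelings.shape[1]
    coassoc = np.zeros((n, n))
    for labels in labelings:
        coassoc += (labels[:, None] == labels[None, :]).astype(float)
    coassoc /= len(labelings)
    dist = 1.0 - coassoc
    np.fill_diagonal(dist, 0.0)
    Z = linkage(squareform(dist, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# Toy example: three base clusterings of six ROIs (integer labels are arbitrary).
base = [[0, 0, 1, 1, 2, 2],
        [0, 0, 1, 1, 1, 2],
        [1, 1, 0, 0, 2, 2]]
print(consensus_labels(base, n_clusters=3))
```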
Palaeoclimate records 60-8 ka in the Austrian and Swiss Alps and their forelands
NASA Astrophysics Data System (ADS)
Heiri, Oliver; Koinig, Karin A.; Spötl, Christoph; Barrett, Sam; Brauer, Achim; Drescher-Schneider, Ruth; Gaar, Dorian; Ivy-Ochs, Susan; Kerschner, Hanns; Luetscher, Marc; Moran, Andrew; Nicolussi, Kurt; Preusser, Frank; Schmidt, Roland; Schoeneich, Philippe; Schwörer, Christoph; Sprafke, Tobias; Terhorst, Birgit; Tinner, Willy
2014-12-01
The European Alps and their forelands provide a range of different archives and climate proxies for developing climate records in the time interval 60-8 thousand years (ka) ago. We review quantitative and semi-quantitative approaches for reconstructing climatic variables in the Austrian and Swiss sector of the Alpine region within this time interval. Available quantitative to semi-quantitative climate records in this region are mainly based on fossil assemblages of biota such as chironomids, cladocerans, coleopterans, diatoms and pollen preserved in lake sediments and peat, the analysis of oxygen isotopes in speleothems and lake sediment records, the reconstruction of past variations in treeline altitude, the reconstruction of past equilibrium line altitude and extent of glaciers based on geomorphological evidence, and the interpretation of past soil formation processes, dust deposition and permafrost as apparent in loess-palaeosol sequences. Palaeoclimate reconstructions in the Alpine region are affected by dating uncertainties increasing with age, the fragmentary nature of most of the available records, which typically only incorporate a fraction of the time interval of interest, and the limited replication of records within and between regions. Furthermore, there have been few attempts to cross-validate different approaches across this time interval to confirm reconstructed patterns of climatic change by several independent lines of evidence. Based on our review we identify a number of developments that would provide major advances for palaeoclimate reconstruction for the period 60-8 ka in the Alps and their forelands. These include (1) the compilation of individual, fragmentary records to longer and continuous reconstructions, (2) replication of climate records and the development of regional reconstructions for different parts of the Alps, (3) the cross-validation of different proxy-types and approaches, and (4) the reconstruction of past variations in climate gradients across the Alps and their forelands. Furthermore, the development of downscaled climate model runs for the Alpine region 60-8 ka, and of forward modelling approaches for climate proxies would expand the opportunities for quantitative assessments of climatic conditions in Europe within this time-interval.
Risk analysis for veterinary biologicals released into the environment.
Silva, S V; Samagh, B S; Morley, R S
1995-12-01
All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses 'scenario tree analysis' to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.
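The scenario-tree idea, as described, weights outcome impacts by the likelihood of each branch. A minimal sketch under assumed branch probabilities and impact scores (all values purely hypothetical) follows.

```python
# Minimal scenario-tree sketch: each root-to-leaf path is a release scenario;
# the expected impact is the probability-weighted sum over leaf outcomes.
scenario_tree = {
    "p": 1.0,
    "children": {
        "organism survives in environment": {
            "p": 0.10,
            "children": {
                "transmission to wildlife": {"p": 0.20, "impact": 8.0},
                "no onward transmission":  {"p": 0.80, "impact": 2.0},
            },
        },
        "organism does not survive": {"p": 0.90, "impact": 0.1},
    },
}

def expected_impact(node, prob=1.0):
    prob *= node["p"]
    if "impact" in node:                       # leaf outcome
        return prob * node["impact"]
    return sum(expected_impact(child, prob) for child in node["children"].values())

print(round(expected_impact(scenario_tree), 3))
```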
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco
Environmental auditing is a main issue for any production plant and assessing environmental performance is crucial to identify risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. Auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent research on safety in socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work through the application of a recent systemic method, the Functional Resonance Analysis Method (FRAM), in order to define the system structure dynamically. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. Highlights: • We discuss the relevance of a systemic risk-based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.
Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui
2016-12-09
Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and providing early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles and/or dye pairs have already been developed for pHi sensing. To date, biological autofluorescence background upon UV-Vis excitation and severe photobleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+,Tm3+ UCNPs were used as the pHi response and self-ratiometric reference signals, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which can lead to relatively large uncertainty in the results. Owing to efficient FRET and a fluorescence background-free readout, highly sensitive and accurate sensing has been achieved, featuring a sensitivity of 3.56 per unit change in pHi over the range 3.0-7.0, with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
A review of state-of-the-art stereology for better quantitative 3D morphology in cardiac research.
Mühlfeld, Christian; Nyengaard, Jens Randel; Mayhew, Terry M
2010-01-01
The aim of stereological methods in biomedical research is to obtain quantitative information about three-dimensional (3D) features of tissues, cells, or organelles from two-dimensional physical or optical sections. With immunogold labeling, stereology can even be used for the quantitative analysis of the distribution of molecules within tissues and cells. Nowadays, a large number of design-based stereological methods offer an efficient quantitative approach to intriguing questions in cardiac research, such as "Is there a significant loss of cardiomyocytes during progression from ventricular hypertrophy to heart failure?" or "Does a specific treatment reduce the degree of fibrosis in the heart?" Nevertheless, the use of stereological methods in cardiac research is rare. The present review article demonstrates how some of the potential pitfalls in quantitative microscopy may be avoided. To this end, we outline the concepts of design-based stereology and illustrate their practical applications to a wide range of biological questions in cardiac research. We hope that the present article will stimulate researchers in cardiac research to incorporate design-based stereology into their study designs, thus promoting an unbiased quantitative 3D microscopy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yi; Hu, Dehong; Markillie, Lye Meng
Quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple FISH sub-probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific binding of sub-probes and tissue autofluorescence, limiting its accuracy. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on blinking frequencies of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses blinking frequency patterns, emitted from a transcript bound to multiple sub-probes, which are distinct from blinking patterns emitted from partial or nonspecifically bound sub-probes and autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct blinking frequency patterns, laying the foundation for reliable single-cell transcriptomics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sale, K
Several approaches to ASP performance can be contemplated. Perhaps the ideal would be a full cost/benefit analysis (which is probably utterly infeasible). Another approach would be a test-based figure-of-merit (FOM); this approach has the virtue of being quantitative, but the challenge that each customer and application would be characterized by a different FOM. The alternative proposed here is an approach that uses information about the limits of detection of real instruments to support informed judgments.
Shao, Chaofeng; Tian, Xiaogang; Guan, Yang; Ju, Meiting; Xie, Qiang
2013-05-21
Selecting indicators based on the characteristics and development trends of a given study area is essential for building a framework for assessing urban ecological security. However, few studies have focused on how to select representative indicators systematically, and quantitative research is lacking. We developed an innovative quantitative modeling approach called the grey dynamic hierarchy analytic system (GDHAS) for both indicator selection and the quantitative assessment of urban ecological security. Next, a systematic methodology based on the GDHAS is developed to assess urban ecological security comprehensively and dynamically. This assessment includes indicator selection, driving force-pressure-state-impact-response (DPSIR) framework building, and quantitative evaluation. We applied this systematic methodology to assess the urban ecological security of Tianjin, a typical coastal megalopolis and industrial base in China. This case study highlights the key features of our approach. First, 39 representative indicators are selected for the evaluation index system from the 62 alternatives available, using the GDHAS. Second, the DPSIR framework is established based on the indicators selected, and the quantitative assessment of the eco-security of Tianjin is conducted. The results illustrate the following: the urban ecological security of Tianjin in 2008 was at the alert level but not very stable; the driving force and pressure subsystems were in good condition, but the eco-security levels of the remaining subsystems were relatively low; the pressure subsystem was the key to urban ecological security; and 10 indicators are defined as the key indicators for the five subsystems. These results can be used as the basis for urban eco-environmental management.
Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun
2017-01-30
Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently identified proteins were recovered, and the peptide-spectrum-match FDR was recalculated and controlled at a confident level of FDR≤1%, while protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. A >60% increase in total quantified spectra/peptides was achieved for both a spike-in sample set and a public dataset from CPTAC. Incorporating the peptide retrieval strategy significantly improved the quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity to confidently discover significantly altered proteins was also substantially enhanced, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising the confidence of protein identification, and the strategy can be readily implemented in a broad range of quantitative proteomics techniques, including label-free and labeling approaches. We hypothesize that more quantifiable spectra and peptides per protein, even including less confident peptides, could help reduce variation and improve protein quantification. Hence the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches. The list of confidently identified proteins obtained with the standard target-decoy search strategy was fixed, and additional, less confident spectra/peptides matched to these confident proteins were retrieved. The total peptide-spectrum-match false discovery rate (PSM FDR) after retrieval analysis was still controlled at a confident level of FDR≤1%. As expected, the penalty for occasionally incorporating incorrect peptide identifications is negligible compared with the improvements in quantitative performance. This simple strategy yielded more quantifiable peptides, a lower missing-value rate, and better quantitative accuracy and precision for the same protein identifications. The strategy is theoretically applicable to any quantitative proteomics approach and thereby provides more quantitative information, especially on low-abundance proteins. Published by Elsevier B.V.
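The retrieval strategy hinges on the standard target-decoy estimate, FDR ≈ number of decoy PSMs / number of target PSMs above a score cutoff. A minimal sketch of that filtering step follows; the scores, the toy threshold and the tuple layout are assumptions for illustration, not the authors' pipeline.

```python
def psm_fdr_filter(psms, fdr_threshold=0.01):
    """Keep the largest high-scoring prefix whose estimated FDR
    (#decoys / #targets) is at or below the threshold.
    psms: list of (score, is_decoy); higher score = better match."""
    ranked = sorted(psms, key=lambda p: p[0], reverse=True)
    targets, decoys, cutoff = 0, 0, 0
    for i, (score, is_decoy) in enumerate(ranked, start=1):
        decoys += is_decoy
        targets += not is_decoy
        if targets and decoys / targets <= fdr_threshold:
            cutoff = i
    return [p for p in ranked[:cutoff] if not p[1]]   # accepted target PSMs

# Hypothetical PSMs: (search-engine score, is_decoy). The loose toy threshold is
# only so the tiny example keeps something; real analyses use 0.01 on large lists.
psms = [(95, False), (92, False), (88, False), (60, True), (55, False), (40, True)]
print(psm_fdr_filter(psms, fdr_threshold=0.34))
```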
Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J
2017-11-01
Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
Krishnamurthy, Krish
2013-12-01
The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore, extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion - thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis. First, the FID is digitally filtered and downsampled to several sub FIDs, and secondly, these sub FIDs are then modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of modulation of chemical quantity in a biological study (metabolomics) or process study (reaction monitoring) or quality assurance/control. The basic principles behind this approach as well as results to evaluate the effectiveness of this approach in mixture analysis are presented. Copyright © 2013 John Wiley & Sons, Ltd.
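CRAFT models each sub-FID as a sum of decaying sinusoids. The sketch below simulates and refits a single synthetic component with a generic least-squares fit rather than the Bayesian analysis used by CRAFT; all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# Time-domain model underlying the sub-FID analysis: a sum of exponentially
# decaying sinusoids, s(t) = sum_k A_k * exp(-t/T2_k) * cos(2*pi*f_k*t + phi_k).
def decaying_sinusoid(t, A, T2, f, phi):
    return A * np.exp(-t / T2) * np.cos(2 * np.pi * f * t + phi)

t = np.linspace(0.0, 1.0, 4096)                      # acquisition time axis (s)
fid = decaying_sinusoid(t, 1.0, 0.30, 50.0, 0.2)     # one synthetic component
fid += np.random.default_rng(0).normal(scale=0.01, size=t.size)

popt, _ = curve_fit(decaying_sinusoid, t, fid, p0=[0.8, 0.2, 49.0, 0.0])
print("A = %.2f, T2 = %.2f s, f = %.1f Hz, phi = %.2f rad" % tuple(popt))
```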
To label or not to label: applications of quantitative proteomics in neuroscience research.
Filiou, Michaela D; Martins-de-Souza, Daniel; Guest, Paul C; Bahn, Sabine; Turck, Christoph W
2012-02-01
Proteomics has provided researchers with a sophisticated toolbox of labeling-based and label-free quantitative methods. These are now being applied in neuroscience research where they have already contributed to the elucidation of fundamental mechanisms and the discovery of candidate biomarkers. In this review, we evaluate and compare labeling-based and label-free quantitative proteomic techniques for applications in neuroscience research. We discuss the considerations required for the analysis of brain and central nervous system specimens, the experimental design of quantitative proteomic workflows as well as the feasibility, advantages, and disadvantages of the available techniques for neuroscience-oriented questions. Furthermore, we assess the use of labeled standards as internal controls for comparative studies in humans and review applications of labeling-based and label-free mass spectrometry approaches in relevant model organisms and human subjects. Providing a comprehensive guide of feasible and meaningful quantitative proteomic methodologies for neuroscience research is crucial not only for overcoming current limitations but also for gaining useful insights into brain function and translating proteomics from bench to bedside. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher
2018-03-07
Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided for estimating maximum chain coverage and, importantly, for examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. The reliability of these determination methods is estimated by carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for the determination of grafting density, in particular single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
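For the dry-thickness route listed above, grafting density is commonly estimated from the relation sigma = h · rho · N_A / M_n. A minimal sketch, assuming that standard relation and illustrative polystyrene-like numbers (not values from the review), follows.

```python
AVOGADRO = 6.02214076e23  # 1/mol

def grafting_density(dry_thickness_nm, density_g_cm3, Mn_g_mol):
    """Grafting density in chains per nm^2 from the ellipsometric dry thickness,
    using the commonly quoted relation sigma = h * rho * N_A / Mn
    (the 1e21 factor converts cm^3 to nm^3)."""
    return dry_thickness_nm * density_g_cm3 * AVOGADRO / (Mn_g_mol * 1e21)

# Illustrative numbers: a 10 nm dry polystyrene-like brush, rho ~ 1.05 g/cm^3, Mn = 50 kg/mol.
print(round(grafting_density(10.0, 1.05, 50_000), 3), "chains/nm^2")
```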
Lu, Shao Hua; Li, Bao Qiong; Zhai, Hong Lin; Zhang, Xin; Zhang, Zhuo Yong
2018-04-25
Terahertz time-domain spectroscopy (THz-TDS) has been applied in many fields; however, it still encounters drawbacks in multicomponent mixture analysis due to severe spectral overlap. Here, an effective approach to quantitative analysis is proposed and applied to the determination of a ternary amino acid mixture in a foxtail millet substrate. Utilizing three parameters derived from the THz-TDS data, images were constructed and Tchebichef image moments were used to extract the information of the target components. Quantitative models were then obtained by stepwise regression. The correlation coefficients of leave-one-out cross-validation (R(2)loo-cv) were greater than 0.9595. For the external test set, the predictive correlation coefficients (R(2)p) were greater than 0.8026 and the root mean square errors of prediction (RMSEp) were less than 1.2601. Compared with traditional methods (PLS and N-PLS), our approach is more accurate, robust and reliable, and is a promising approach for quantifying multicomponent mixtures with THz-TDS spectroscopy. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Safi, A.; Campanella, B.; Grifoni, E.; Legnaioli, S.; Lorenzetti, G.; Pagnotta, S.; Poggialini, F.; Ripoll-Seguer, L.; Hidalgo, M.; Palleschi, V.
2018-06-01
The introduction of the multivariate calibration curve approach in Laser-Induced Breakdown Spectroscopy (LIBS) quantitative analysis has led to a general improvement of LIBS analytical performance, since a multivariate approach makes it possible to exploit the redundant elemental information that is typically present in a LIBS spectrum. Software packages implementing multivariate methods are available in the most widespread commercial and open-source analytical programs; in most cases, the multivariate algorithms are robust against noise and operate in unsupervised mode. The flip side of the availability and ease of use of such packages is the (perceived) difficulty in assessing the reliability of the results obtained, which often leads to multivariate algorithms being regarded as 'black boxes' whose inner mechanism remains hidden from the user. In this paper, we discuss the dangers of a 'black box' approach in LIBS multivariate analysis, and how to overcome them using the chemical-physical knowledge that underlies any LIBS quantitative analysis.
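As an illustration of a multivariate calibration that need not stay a 'black box', the sketch below builds a PLS model on synthetic spectra and inspects which channels drive the prediction; the data, channel positions and use of scikit-learn are assumptions, not the paper's software or dataset.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Synthetic "spectra": 40 calibration samples x 500 channels, where two emission
# lines (channels 120 and 300) scale with the analyte concentration.
conc = rng.uniform(0.0, 10.0, size=40)
spectra = rng.normal(scale=0.05, size=(40, 500))
spectra[:, 120] += 0.8 * conc
spectra[:, 300] += 0.3 * conc

pls = PLSRegression(n_components=3)
pls.fit(spectra, conc)

# Inspecting the regression coefficients shows which channels drive the prediction,
# one simple way of opening up the multivariate model instead of treating it as a black box.
top_channels = np.argsort(np.abs(pls.coef_.ravel()))[-3:]
print("most influential channels:", sorted(top_channels.tolist()))
print("predicted concentration for the first spectrum:", float(pls.predict(spectra[:1]).ravel()[0]))
```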
A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework
NASA Astrophysics Data System (ADS)
Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo
An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Database Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of framework requirements, and the corresponding solutions for the reference MDO frameworks (a general one and an aircraft-oriented one), were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
Wang, Shunhai; Bobst, Cedric E.; Kaltashov, Igor A.
2018-01-01
Transferrin (Tf) is an 80 kDa iron-binding protein which is viewed as a promising drug carrier to target the central nervous system due to its ability to penetrate the blood-brain barrier (BBB). Among the many challenges during the development of Tf-based therapeutics, sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult due to the presence of abundant endogenous Tf. Herein, we describe the development of a new LC-MS-based method for sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous hTf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed 18O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation. PMID:26307718
Cao, Mingyan; Mo, Wenjun David; Shannon, Anthony; Wei, Ziping; Washabaugh, Michael; Cash, Patricia
Aspartate (Asp) isomerization is a common post-translational modification of recombinant therapeutic proteins that can occur during manufacturing, storage, or administration. Asp isomerization in the complementarity-determining regions of a monoclonal antibody may affect the target binding and thus a sufficiently robust quality control method for routine monitoring is desirable. In this work, we utilized a liquid chromatography-mass spectrometry (LC/MS)-based approach to identify the Asp isomerization in the complementarity-determining regions of a therapeutic monoclonal antibody. To quantitate the site-specific Asp isomerization of the monoclonal antibody, a UV detection-based quantitation assay utilizing the same LC platform was developed. The assay was qualified and implemented for routine monitoring of this product-specific modification. Compared with existing methods, this analytical paradigm is applicable to identify Asp isomerization (or other modifications) and subsequently develop a rapid, sufficiently robust quality control method for routine site-specific monitoring and quantitation to ensure product quality. This approach first identifies and locates a product-related impurity (a critical quality attribute) caused by isomerization, deamidation, oxidation, or other post-translational modifications, and then utilizes synthetic peptides and MS to assist the development of a LC-UV-based chromatographic method that separates and quantifies the product-related impurities by UV peaks. The established LC-UV method has acceptable peak specificity, precision, linearity, and accuracy; it can be validated and used in a good manufacturing practice environment for lot release and stability testing. Aspartate isomerization is a common post-translational modification of recombinant proteins during manufacture process and storage. Isomerization in the complementarity-determining regions (CDRs) of a monoclonal antibody A (mAb-A) has been detected and has been shown to have impact on the binding affinity to the antigen. In this work, we utilized a mass spectrometry-based peptide mapping approach to detect and quantitate the Asp isomerization in the CDRs of mAb-A. To routinely monitor the CDR isomerization of mAb-A, a focused peptide mapping method utilizing reversed phase chromatographic separation and UV detection has been developed and qualified. This approach is generally applicable to monitor isomerization and other post-translational modifications of proteins in a specific and high-throughput mode to ensure product quality. © PDA, Inc. 2016.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luria, Paolo; Aspinall, Peter A
2003-08-01
The aim of this paper is to describe a new approach to major industrial hazard assessment, which has been recently studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, quantitative risk analysis (QRA), provided only a list of individual quantitative risk values related to single locations. The experimental model is based on a multi-criteria approach, the Analytic Hierarchy Process, which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with traditional quantitative analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments, on the present situation and on three other future scenarios, and the use of this information as indirect quantitative measures, which could be aggregated to obtain the global risk rate. This approach is in line with the main concepts proposed by the latest European directive on Major Hazard Accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
Mahan, Ellen D.; Morrow, Kathleen M.; Hayes, John E.
2015-01-01
Background Increasing prevalence of HIV infection among women worldwide has motivated the development of female-initiated prevention methods, including gel-based microbicides. User acceptability is vital for microbicide success; however, varying cultural vaginal practices indicate multiple formulations must be developed to appeal to different populations. Perceptual attributes of microbicides have been identified as primary drivers of acceptability; however, previous studies do not allow for direct comparison of these qualities between multiple formulations. Study Design Six vaginal products were analyzed ex vivo using descriptive analysis. Perceptual attributes of samples were identified by trained participants (n=10) and rated quantitatively using scales based on a panel-developed lexicon. Data were analyzed using two-way ANOVAs for each attribute; product differences were assessed via Tukey’s honestly significant difference test. Results Significant differences were found between products for multiple attributes. Patterns were also seen for attributes across intended product usage (i.e., contraceptive, moisturizer or lubricant). For example, Options© Gynol II® (Caldwell Consumer Health, LLC) was significantly stickier and grainier than other products. Conclusions Descriptive analysis, a quantitative approach that is based on consensus lexicon usage among participants, successfully quantified perceptual differences among vaginal products. Since perceptual attributes of products can be directly compared quantitatively, this study represents a novel approach that could be used to inform rational design of microbicides. PMID:21757061
Insights from quantitative metaproteomics and protein-stable isotope probing into microbial ecology.
von Bergen, Martin; Jehmlich, Nico; Taubert, Martin; Vogt, Carsten; Bastida, Felipe; Herbst, Florian-Alexander; Schmidt, Frank; Richnow, Hans-Hermann; Seifert, Jana
2013-10-01
The recent development of metaproteomics has enabled the direct identification and quantification of expressed proteins from microbial communities in situ, without the need for microbial enrichment. This became possible by (1) significant increases in quality and quantity of metagenome data and by improvements of (2) accuracy and (3) sensitivity of modern mass spectrometers (MS). The identification of physiologically relevant enzymes can help to understand the role of specific species within a community or an ecological niche. Beside identification, relative and absolute quantitation is also crucial. We will review label-free and label-based methods of quantitation in MS-based proteome analysis and the contribution of quantitative proteome data to microbial ecology. Additionally, approaches of protein-based stable isotope probing (protein-SIP) for deciphering community structures are reviewed. Information on the species-specific metabolic activity can be obtained when substrates or nutrients are labeled with stable isotopes in a protein-SIP approach. The stable isotopes ((13)C, (15)N, (36)S) are incorporated into proteins and the rate of incorporation can be used for assessing the metabolic activity of the corresponding species. We will focus on the relevance of the metabolic and phylogenetic information retrieved with protein-SIP studies and for detecting and quantifying the carbon flux within microbial consortia. Furthermore, the combination of protein-SIP with established tools in microbial ecology such as other stable isotope probing techniques are discussed.
Schulte, P A; Kuempel, E D; Drew, N M
2018-06-01
The commercialization of engineered nanomaterials (ENMs) began in the early 2000s. Since then the number of commercial products and the number of workers potentially exposed to ENMs have been growing, as has the need to evaluate and manage the potential health risks. Occupational exposure limits (OELs) have been developed for some of the first generation of ENMs. These OELs have been based on risk assessments that progressed from qualitative to quantitative as nanotoxicology data became available. In this paper, that progression is characterized. It traces OEL development from qualitative approaches for general groups of ENMs, based primarily on read-across with other materials, to quantitative risk assessments for nanoscale particles including titanium dioxide, carbon nanotubes and nanofibers, silver nanoparticles, and cellulose nanocrystals. These represent prototypic approaches to risk assessment and OEL development for ENMs. Such substance-by-substance efforts are not practical given the insufficient data for many ENMs that are currently being used or potentially entering commerce. Consequently, categorical approaches are emerging to group and rank ENMs by hazard and potential health risk. The strengths and limitations of these approaches are described, and future derivations and research needs are discussed. Critical needs in moving forward with understanding the health effects of the numerous ENMs include more standardized and accessible quantitative data on the toxicity and physicochemical properties of ENMs. Published by Elsevier Inc.
Kasprowicz, Richard; Rand, Emma; O'Toole, Peter J; Signoret, Nathalie
2018-05-22
Cell-to-cell communication engages signaling and spatiotemporal reorganization events driven by highly context-dependent and dynamic intercellular interactions, which are difficult to capture within heterogeneous primary cell cultures. Here, we present a straightforward correlative imaging approach utilizing commonly available instrumentation to sample large numbers of cell-cell interaction events, allowing qualitative and quantitative characterization of rare functioning cell-conjugates based on calcium signals. We applied this approach to examine a previously uncharacterized immunological synapse, investigating autologous human blood CD4+ T cells and monocyte-derived macrophages (MDMs) forming functional conjugates in vitro. Populations of signaling conjugates were visualized, tracked and analyzed by combining live imaging, calcium recording and multivariate statistical analysis. Correlative immunofluorescence was added to quantify endogenous molecular recruitments at the cell-cell junction. By analyzing a large number of rare conjugates, we were able to define calcium signatures associated with different states of CD4+ T cell-MDM interactions. Quantitative image analysis of immunostained conjugates detected the propensity of endogenous T cell surface markers and intracellular organelles to polarize towards cell-cell junctions with high and sustained calcium signaling profiles, hence defining immunological synapses. Overall, we developed a broadly applicable approach enabling detailed single cell- and population-based investigations of rare cell-cell communication events with primary cells.
A hybrid agent-based approach for modeling microbiological systems.
Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing
2008-11-21
Models for systems biology commonly adopt Differential Equations or Agent-Based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the Multi-Agent approach often use directly translated, and quantitatively less precise, if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2 × 10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
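A minimal sketch of the hybrid idea, cells as rule-based agents and molecules as a continuous quantity field updated by a finite-difference diffusion step, is shown below; the grid size, rates and movement rule are toy assumptions, not the authors' chemotaxis model.

```python
import numpy as np

rng = np.random.default_rng(0)
GRID = 50
field = np.zeros((GRID, GRID))              # chemoattractant amount per grid cell (a quantity, not agents)
field[GRID // 2, GRID // 2] = 1000.0        # point source at the center
cells = rng.integers(0, GRID, size=(30, 2)) # 30 cell agents at integer grid positions

def diffuse(f, D=0.2):
    """One explicit diffusion step (4-neighbour finite difference, periodic boundary)."""
    lap = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
           np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f)
    return f + D * lap

for step in range(100):
    field = diffuse(field)                  # molecular quantities follow a difference equation
    for i, (x, y) in enumerate(cells):
        # Agent rule: step toward the neighbouring grid cell with the highest attractant level.
        candidates = [((x + dx) % GRID, (y + dy) % GRID)
                      for dx, dy in [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]]
        cells[i] = max(candidates, key=lambda p: field[p])
        field[tuple(cells[i])] *= 0.99      # agents consume a little attractant locally

print("mean distance to source:",
      round(float(np.mean(np.linalg.norm(cells - GRID // 2, axis=1))), 2))
```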
NASA Astrophysics Data System (ADS)
Tavakoli, Vahid; Stoddard, Marcus F.; Amini, Amir A.
2013-03-01
Quantitative motion analysis of echocardiographic images helps clinicians with the diagnosis and therapy of patients suffering from cardiac disease. Quantitative analysis is usually based on TDI (Tissue Doppler Imaging) or speckle tracking. These methods are based on two independent techniques - the Doppler Effect and image registration, respectively. In order to increase the accuracy of the speckle tracking technique and cope with the angle dependency of TDI, herein, a combined approach dubbed TDIOF (Tissue Doppler Imaging Optical Flow) is proposed. TDIOF is formulated based on the combination of B-mode and Doppler energy terms in an optical flow framework and minimized using algebraic equations. In this paper, we report on validations with simulated, physical cardiac phantom, and in-vivo patient data. It is shown that the additional Doppler term is able to increase the accuracy of speckle tracking, the basis for several commercially available echocardiography analysis techniques.
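A generic form of such a combined energy, hedged here as an assumed formulation rather than the paper's exact functional, could read:

```latex
E(u,v) \;=\; \int_\Omega \big(I_x u + I_y v + I_t\big)^2 \, d\Omega
 \;+\; \lambda_D \int_\Omega \big((u,v)\!\cdot\!\mathbf{b} - v_{\mathrm{Dopp}}\big)^2 \, d\Omega
 \;+\; \lambda_S \int_\Omega \big(\lVert\nabla u\rVert^2 + \lVert\nabla v\rVert^2\big) \, d\Omega
```

where the first integral is the B-mode optical-flow (brightness constancy) term, the second penalizes disagreement between the motion component along the ultrasound beam direction b and the tissue Doppler velocity v_Dopp, the third enforces smoothness, and lambda_D, lambda_S are weighting parameters.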
Obesity prevention: Comparison of techniques and potential solution
NASA Astrophysics Data System (ADS)
Zulkepli, Jafri; Abidin, Norhaslinda Zainal; Zaibidi, Nerda Zura
2014-12-01
Over the years, obesity prevention has been broadly studied by both academics and practitioners. It is one of the most serious public health issues, as it can cause numerous chronic health and psychosocial problems. Research is needed to suggest a population-based strategy for obesity prevention. In the academic environment, the importance of obesity prevention has triggered various problem-solving approaches. A good obesity prevention model should capture and cater for all of the complex and dynamic issues involved. Hence, the main purpose of this paper is to discuss qualitative and quantitative approaches to the study of obesity prevention and to provide an extensive literature review of various recent modelling techniques for obesity prevention. Based on this literature, a comparison of quantitative and qualitative approaches is presented, and the use of the system dynamics technique for modelling obesity in a population is justified. Lastly, a potential framework solution based on system dynamics modelling is proposed.
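A minimal stock-and-flow sketch of the system dynamics technique advocated above: two population stocks linked by incidence and remission flows, integrated with Euler steps; all rates are hypothetical, not values from the paper.

```python
# Minimal system-dynamics sketch: two population stocks with flows between them.
non_obese, obese = 0.75, 0.25        # stocks, as population fractions
incidence, remission = 0.030, 0.010  # yearly flow rates (fraction of stock per year, illustrative)
dt, years = 0.25, 20                 # Euler time step (years) and horizon

history = []
for step in range(int(years / dt)):
    becoming_obese = incidence * non_obese * dt
    recovering = remission * obese * dt
    non_obese += recovering - becoming_obese
    obese += becoming_obese - recovering
    history.append(obese)

print("obesity prevalence after %d years: %.3f" % (years, history[-1]))
```

A prevention policy would be represented by lowering the incidence rate (or adding further stocks and feedback loops) and rerunning the model.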
Quantitative DIC microscopy using an off-axis self-interference approach.
Fu, Dan; Oh, Seungeun; Choi, Wonshik; Yamauchi, Toyohiko; Dorn, August; Yaqoob, Zahid; Dasari, Ramachandra R; Feld, Michael S
2010-07-15
Traditional Nomarski differential interference contrast (DIC) microscopy is a very powerful method for imaging unstained biological samples. However, one of its major limitations is the nonquantitative nature of the imaging. To overcome this problem, we developed a quantitative DIC microscopy method based on off-axis sample self-interference. The digital holography algorithm is applied to obtain quantitative phase gradients in orthogonal directions, which leads to a quantitative phase image through a spiral integration of the phase gradients. This method is practically simple to implement on any standard microscope without stringent requirements on polarization optics. Optical sectioning can be obtained through an enlarged illumination NA.
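One common way to integrate two orthogonal gradient maps into a single phase image is least-squares integration in the Fourier domain (Frankot-Chellappa style). This is offered as an assumed illustration of gradient integration in general, not necessarily the 'spiral integration' used by the authors.

```python
import numpy as np

def integrate_gradients(gx, gy):
    """Least-squares integration of orthogonal gradient maps (per-pixel units)
    into a single phase map via the Fourier domain (Frankot-Chellappa-style)."""
    ny, nx = gx.shape
    fx = np.fft.fftfreq(nx)[None, :]
    fy = np.fft.fftfreq(ny)[:, None]
    denom = fx**2 + fy**2
    denom[0, 0] = 1.0                      # avoid division by zero at DC
    Fphi = (-1j) * (fx * np.fft.fft2(gx) + fy * np.fft.fft2(gy)) / (2 * np.pi * denom)
    Fphi[0, 0] = 0.0                       # mean phase is unrecoverable from gradients
    return np.real(np.fft.ifft2(Fphi))

# Self-test on a smooth synthetic phase map and its numerical gradients.
y, x = np.mgrid[0:128, 0:128] / 128.0
phi = np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)
gy, gx = np.gradient(phi)
rec = integrate_gradients(gx, gy)
print("max reconstruction error:",
      float(np.abs((rec - rec.mean()) - (phi - phi.mean())).max()))
```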
Predicting ESI/MS Signal Change for Anions in Different Solvents.
Kruve, Anneli; Kaupmees, Karl
2017-05-02
LC/ESI/MS is a technique widely used for qualitative and quantitative analysis in various fields. However, quantification is currently possible only for compounds for which standard substances are available, as the ionization efficiency of different compounds in the ESI source differs by orders of magnitude. In this paper we present an approach for quantitative LC/ESI/MS analysis without standard substances. This approach relies on accurately predicting ionization efficiencies in the ESI source with a model that uses physicochemical parameters of the analytes. Furthermore, the model has been made transferable between different mobile phases and instrument setups by using a suitable set of calibration compounds. This approach has been validated in both flow-injection and chromatographic modes with gradient elution.
One registration multi-atlas-based pseudo-CT generation for attenuation correction in PET/MRI.
Arabi, Hossein; Zaidi, Habib
2016-10-01
The outcome of a detailed assessment of various strategies for atlas-based whole-body bone segmentation from magnetic resonance imaging (MRI) was exploited to select the optimal parameters and setting, with the aim of proposing a novel one-registration multi-atlas (ORMA) pseudo-CT generation approach. The proposed approach consists of only one online registration between the target and reference images, regardless of the number of atlas images (N), while for the remaining atlas images, the pre-computed transformation matrices to the reference image are used to align them to the target image. The performance characteristics of the proposed method were evaluated and compared with conventional atlas-based attenuation map generation strategies (direct registration of the entire atlas images followed by voxel-wise weighting (VWW) and arithmetic averaging atlas fusion). To this end, four different positron emission tomography (PET) attenuation maps were generated via arithmetic averaging and VWW scheme using both direct registration and ORMA approaches as well as the 3-class attenuation map obtained from the Philips Ingenuity TF PET/MRI scanner commonly used in the clinical setting. The evaluation was performed based on the accuracy of extracted whole-body bones by the different attenuation maps and by quantitative analysis of resulting PET images compared to CT-based attenuation-corrected PET images serving as reference. The comparison of validation metrics regarding the accuracy of extracted bone using the different techniques demonstrated the superiority of the VWW atlas fusion algorithm achieving a Dice similarity measure of 0.82 ± 0.04 compared to arithmetic averaging atlas fusion (0.60 ± 0.02), which uses conventional direct registration. Application of the ORMA approach modestly compromised the accuracy, yielding a Dice similarity measure of 0.76 ± 0.05 for ORMA-VWW and 0.55 ± 0.03 for ORMA-averaging. The results of quantitative PET analysis followed the same trend with less significant differences in terms of SUV bias, whereas massive improvements were observed compared to PET images corrected for attenuation using the 3-class attenuation map. The maximum absolute bias achieved by VWW and VWW-ORMA methods was 6.4 ± 5.5 in the lung and 7.9 ± 4.8 in the bone, respectively. The proposed algorithm is capable of generating decent attenuation maps. The quantitative analysis revealed a good correlation between PET images corrected for attenuation using the proposed pseudo-CT generation approach and the corresponding CT images. The computational time is reduced by a factor of 1/N at the expense of a modest decrease in quantitative accuracy, thus allowing us to achieve a reasonable compromise between computing time and quantitative performance.
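In the notation below (ours, not the paper's), the ORMA idea is that each atlas-to-target transform is obtained by composing a pre-computed atlas-to-reference transform with the single online reference-to-target registration:

```latex
T_{\,a_i \rightarrow \mathrm{target}} \;=\; T_{\,\mathrm{ref} \rightarrow \mathrm{target}} \,\circ\, T_{\,a_i \rightarrow \mathrm{ref}}, \qquad i = 1, \dots, N
```

Only the first factor depends on the target and must be computed online; the second factors are computed once offline, which is what reduces the online registration count from N to 1.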
Quantitative traits and diversification.
FitzJohn, Richard G
2010-12-01
Quantitative traits have long been hypothesized to affect speciation and extinction rates. For example, smaller body size or increased specialization may be associated with increased rates of diversification. Here, I present a phylogenetic likelihood-based method (quantitative state speciation and extinction [QuaSSE]) that can be used to test such hypotheses using extant character distributions. This approach assumes that diversification follows a birth-death process where speciation and extinction rates may vary with one or more traits that evolve under a diffusion model. Speciation and extinction rates may be arbitrary functions of the character state, allowing much flexibility in testing models of trait-dependent diversification. I test the approach using simulated phylogenies and show that a known relationship between speciation and a quantitative character could be recovered in up to 80% of the cases on large trees (500 species). Consistent with other approaches, detecting shifts in diversification due to differences in extinction rates was harder than when due to differences in speciation rates. Finally, I demonstrate the application of QuaSSE to investigate the correlation between body size and diversification in primates, concluding that clade-specific differences in diversification may be more important than size-dependent diversification in shaping the patterns of diversity within this group.
Mohr, Johannes A; Jain, Brijnesh J; Obermayer, Klaus
2008-09-01
Quantitative structure activity relationship (QSAR) analysis is traditionally based on extracting a set of molecular descriptors and using them to build a predictive model. In this work, we propose a QSAR approach based directly on the similarity between the 3D structures of a set of molecules measured by a so-called molecule kernel, which is independent of the spatial prealignment of the compounds. Predictors can be built using the molecule kernel in conjunction with the potential support vector machine (P-SVM), a recently proposed machine learning method for dyadic data. The resulting models make direct use of the structural similarities between the compounds in the test set and a subset of the training set and do not require an explicit descriptor construction. We evaluated the predictive performance of the proposed method on one classification and four regression QSAR datasets and compared its results to the results reported in the literature for several state-of-the-art descriptor-based and 3D QSAR approaches. In this comparison, the proposed molecule kernel method performed better than the other QSAR methods.
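As an illustration of learning directly from pairwise similarities, the sketch below uses a generic SVM with a precomputed kernel (scikit-learn's SVC) in place of the P-SVM, with a toy positive semi-definite matrix standing in for the molecule kernel; it does not reproduce the paper's actual kernel or learner.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in for a "molecule kernel": any symmetric positive semi-definite similarity
# matrix between compounds can be plugged in as a precomputed kernel.
n_train, n_test = 20, 5
X = rng.normal(size=(n_train + n_test, 8))      # placeholder descriptors, only used to build the toy kernel
K = X @ X.T                                     # toy PSD kernel (a linear kernel here)
y = (X[:, 0] + 0.1 * rng.normal(size=n_train + n_test) > 0).astype(int)  # toy activity classes

clf = SVC(kernel="precomputed")
clf.fit(K[:n_train, :n_train], y[:n_train])     # train on train-vs-train similarities
pred = clf.predict(K[n_train:, :n_train])       # predict from test-vs-train similarities
print("test accuracy:", float(np.mean(pred == y[n_train:])))
```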
Norinder, U; Högberg, T
1992-04-01
The advantageous approach of using an experimentally designed training set as the basis for establishing a quantitative structure-activity relationship with good predictive capability is described. The training set was selected from a fractional factorial design scheme based on a principal component description of physico-chemical parameters of aromatic substituents. The derived model successfully predicts the activities of additional substituted benzamides of 6-methoxy-N-(4-piperidyl)salicylamide type. The major influence on activity of the 3-substituent is demonstrated.
Srikrishna, S; Robinson, D; Cardozo, L; Cartwright, R
2008-10-01
To explore the expectations and goals of women undergoing surgery for urogenital prolapse using both a quantitative quality of life approach exploring symptom bother and a qualitative interview-based approach exploring patient goals and expectations. Prospective observational study. Tertiary referral centre for urogynaecology. Forty-three women with symptomatic pelvic organ prolapse were recruited from the waiting list for pelvic floor reconstructive surgery. All women were assessed with a structured clinical interview on an individual basis. The data obtained were transcribed verbatim and then analysed thematically based on grounded theory. Individual codes and subcodes were identified to develop a coding framework. The prolapse quality-of-life (pQoL) questionnaire was used to determine the impact of pelvic organ prolapse on the woman's daily life. We arbitrarily classified 'bother' as minimal, mild, moderate and marked if scores ranged from 0 to 25, 25-50, 50-75 and 75-100, respectively. The degree of prolapse was objectively quantified using the pelvic organ prolapse quantification (POP-Q) system. Quantitative data were analysed using SPSS. Ethical approval was obtained from the King's College Hospital Ethics Committee. Quantitative data from POP-Q, subjective data from pQoL, qualitative data based on the structured clinical interview. Forty-three women were recruited over the first year of the study. Their mean age was 56 years (range 36-78) and mean parity was 2 (range 0-6). The mean ordinal stage of the prolapse was 2 (range stages 1-4). Quantitative analysis of the pQoL data suggested that the main domains affected were prolapse impact on life (mean score 74.71) and personal relationships (mean score 46.66). Qualitative analysis based on the clinical interview suggested that these women were most affected by the actual physical symptoms of prolapse (bulge, pain and bowel problems) as well as by the impact prolapse has on their sexual function. While disease-specific QoL questionnaires allow broad comparisons to be made assessing patient bother, they may lack the sensitivity to assess individual symptoms. A qualitative approach may individualize patient care and ultimately improve patient satisfaction and overall outcome when treating women complaining of urogenital prolapse.
Cantow, Kathleen; Arakelyan, Karen; Seeliger, Erdmann; Niendorf, Thoralf; Pohlmann, Andreas
2016-01-01
In vivo assessment of renal perfusion and oxygenation under (patho)physiological conditions by means of noninvasive diagnostic imaging is conceptually appealing. Blood oxygen level-dependent (BOLD) magnetic resonance imaging (MRI) and quantitative parametric mapping of the magnetic resonance (MR) relaxation times T2* and T2 are thought to provide surrogates of renal tissue oxygenation. The validity and efficacy of this technique for quantitative characterization of local tissue oxygenation and its changes under different functional conditions have not been systematically examined yet and remain to be established. For this purpose, the development of integrative multimodality approaches is essential. Here we describe an integrated hybrid approach (MR-PHYSIOL) that combines established quantitative physiological measurements with T2* (T2) mapping and MR-based kidney size measurements. Standardized reversible (patho)physiologically relevant interventions, such as brief periods of aortic occlusion, hypoxia, and hyperoxia, are used for detailing the relation between the MR-PHYSIOL parameters, in particular between renal T2* and tissue oxygenation.
Low rank magnetic resonance fingerprinting.
Mazor, Gal; Weizman, Lior; Tal, Assaf; Eldar, Yonina C
2016-08-01
Magnetic Resonance Fingerprinting (MRF) is a relatively new approach that provides quantitative MRI using randomized acquisition. Extraction of physical quantitative tissue values is performed off-line, based on acquisition with varying parameters and a dictionary generated according to the Bloch equations. MRF uses hundreds of radio frequency (RF) excitation pulses for acquisition, and therefore a high under-sampling ratio in the sampling domain (k-space) is required. This under-sampling causes spatial artifacts that hamper the ability to accurately estimate the quantitative tissue values. In this work, we introduce a new approach for quantitative MRI using MRF, called Low Rank MRF. We exploit the low rank property of the temporal domain, on top of the well-known sparsity of the MRF signal in the generated dictionary domain. We present an iterative scheme that consists of a gradient step followed by a low rank projection using the singular value decomposition. Experiments on real MRI data demonstrate superior results compared to a conventional implementation of compressed sensing for MRF at 15% sampling ratio.
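A minimal sketch of the iterate described above, assuming a synthetic low-rank image series and a random k-space sampling mask: each iteration takes a gradient step on the data-fidelity term and then projects the space-by-time matrix onto a fixed rank via the SVD. The rank, step size and sampling pattern are placeholders, not the acquisition used in the paper.

```python
# Gradient step toward k-space data consistency, then rank-r projection via SVD.
import numpy as np

rng = np.random.default_rng(0)
nx, ny, T = 32, 32, 60        # image size and number of time frames
r = 5                         # assumed temporal rank

# Synthetic ground truth with low temporal rank, plus undersampled k-space data.
X_true = rng.standard_normal((nx * ny, r)) @ rng.standard_normal((r, T))
mask = rng.random((nx, ny, T)) < 0.15               # ~15% sampling ratio

def A(X):   # image series -> sampled k-space (unnormalized 2D DFT per frame)
    return np.fft.fft2(X.reshape(nx, ny, T), axes=(0, 1)) * mask

def At(Y):  # adjoint of A
    return np.fft.ifft2(Y * mask, axes=(0, 1)).real.reshape(nx * ny, T) * nx * ny

Y = A(X_true)
X = np.zeros((nx * ny, T))
for _ in range(100):
    # Step size 1/(nx*ny) is safe since ||A^T A|| <= nx*ny for this operator.
    X = X - At(A(X) - Y) / (nx * ny)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[r:] = 0.0                                     # low-rank projection
    X = (U * s) @ Vt

print("relative error:", np.linalg.norm(X - X_true) / np.linalg.norm(X_true))
```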
A GIS-based Quantitative Approach for the Search of Clandestine Graves, Italy.
Somma, Roberta; Cascio, Maria; Silvestro, Massimiliano; Torre, Eliana
2018-05-01
Previous research on the RAG color-coded prioritization systems for the discovery of clandestine graves has not considered all the factors influencing the burial site choice within a GIS project. The goal of this technical note was to discuss a GIS-based quantitative approach for the search of clandestine graves. The method is based on cross-referenced RAG maps with cumulative suitability factors to host a burial, leading to the editing of different search scenarios for ground searches showing high-(Red), medium-(Amber), and low-(Green) priority areas. The application of this procedure allowed several outcomes to be determined: If the concealment occurs at night, then the "search scenario without the visibility" will be the most effective one; if the concealment occurs in daylight, then the "search scenario with the DSM-based visibility" will be most appropriate; the different search scenarios may be cross-referenced with offender's confessions and eyewitnesses' testimonies to verify the veracity of their statements. © 2017 American Academy of Forensic Sciences.
Aarons, Gregory A; Fettes, Danielle L; Sommerfeld, David H; Palinkas, Lawrence A
2012-02-01
Many public sector service systems and provider organizations are in some phase of learning about or implementing evidence-based interventions. Child welfare service systems represent a context where implementation spans system, management, and organizational concerns. Research utilizing mixed methods that combine qualitative and quantitative design, data collection, and analytic approaches are particularly well suited to understanding both the process and outcomes of dissemination and implementation efforts in child welfare systems. This article describes the process of using mixed methods in implementation research and provides an applied example of an examination of factors impacting staff retention during an evidence-based intervention implementation in a statewide child welfare system. The authors integrate qualitative data with previously published quantitative analyses of job autonomy and staff turnover during this statewide implementation project in order to illustrate the utility of mixed method approaches in providing a more comprehensive understanding of opportunities and challenges in implementation research.
ERIC Educational Resources Information Center
Castanho, Miguel A. R. B.
2002-01-01
The main distinction between the overlapping fields of molecular biophysics and biochemistry resides in their different approaches to the same problems. Molecular biophysics makes more use of physical techniques and focuses on quantitative data. This difference encounters two difficult pedagogical challenges when teaching molecular biophysics to…
Strategies for Using Peer-Assisted Learning Effectively in an Undergraduate Bioinformatics Course
ERIC Educational Resources Information Center
Shapiro, Casey; Ayon, Carlos; Moberg-Parker, Jordan; Levis-Fitzgerald, Marc; Sanders, Erin R.
2013-01-01
This study used a mixed methods approach to evaluate hybrid peer-assisted learning approaches incorporated into a bioinformatics tutorial for a genome annotation research project. Quantitative and qualitative data were collected from undergraduates who enrolled in a research-based laboratory course during two different academic terms at UCLA.…
Subject-Specific Sparse Dictionary Learning for Atlas-Based Brain MRI Segmentation.
Roy, Snehashis; He, Qing; Sweeney, Elizabeth; Carass, Aaron; Reich, Daniel S; Prince, Jerry L; Pham, Dzung L
2015-09-01
Quantitative measurements from segmentations of human brain magnetic resonance (MR) images provide important biomarkers for normal aging and disease progression. In this paper, we propose a patch-based tissue classification method from MR images that uses a sparse dictionary learning approach and atlas priors. Training data for the method consists of an atlas MR image, prior information maps depicting where different tissues are expected to be located, and a hard segmentation. Unlike most atlas-based classification methods that require deformable registration of the atlas priors to the subject, only affine registration is required between the subject and training atlas. A subject-specific patch dictionary is created by learning relevant patches from the atlas. Then the subject patches are modeled as sparse combinations of learned atlas patches leading to tissue memberships at each voxel. The combination of prior information in an example-based framework enables us to distinguish tissues having similar intensities but different spatial locations. We demonstrate the efficacy of the approach on the application of whole-brain tissue segmentation in subjects with healthy anatomy and normal pressure hydrocephalus, as well as lesion segmentation in multiple sclerosis patients. For each application, quantitative comparisons are made against publicly available state-of-the-art approaches.
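A minimal sketch of the sparse-coding step, assuming synthetic patches: subject patches are expressed as sparse combinations of labeled atlas patches (here via scikit-learn's sparse_encode with OMP), and coefficients are pooled per tissue label to give memberships. Patch sizes, labels and the pooling rule are illustrative; the published method additionally uses affine-registered prior maps and a learned subject-specific dictionary.

```python
# Patch-based tissue memberships from sparse coding against an atlas dictionary.
import numpy as np
from sklearn.decomposition import sparse_encode

rng = np.random.default_rng(0)
patch_dim = 27          # e.g. flattened 3x3x3 patches
n_atlas_patches, n_subject_patches, n_tissues = 500, 100, 3

# Atlas dictionary: patches (rows) with a known hard tissue label each.
D = rng.standard_normal((n_atlas_patches, patch_dim))
D /= np.linalg.norm(D, axis=1, keepdims=True)
atlas_labels = rng.integers(0, n_tissues, size=n_atlas_patches)

# Subject patches to classify (placeholders).
X = rng.standard_normal((n_subject_patches, patch_dim))

# Sparse-code each subject patch as a combination of a few atlas patches.
codes = sparse_encode(X, D, algorithm="omp", n_nonzero_coefs=5)

# Tissue memberships: pool absolute coefficients per tissue label, then normalize.
memberships = np.zeros((n_subject_patches, n_tissues))
for t in range(n_tissues):
    memberships[:, t] = np.abs(codes[:, atlas_labels == t]).sum(axis=1)
memberships /= memberships.sum(axis=1, keepdims=True) + 1e-12

print(memberships.argmax(axis=1)[:10])   # hard segmentation of first 10 patches
```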
Masè, Michela; Cristoforetti, Alessandro; Avogaro, Laura; Tessarolo, Francesco; Piccoli, Federico; Caola, Iole; Pederzolli, Carlo; Graffigna, Angelo; Ravelli, Flavia
2015-01-01
The assessment of collagen structure in cardiac pathology, such as atrial fibrillation (AF), is essential for a complete understanding of the disease. This paper introduces a novel methodology for the quantitative description of collagen network properties, based on the combination of nonlinear optical microscopy with a spectral approach of image processing and analysis. Second-harmonic generation (SHG) microscopy was applied to atrial tissue samples from cardiac surgery patients, providing label-free, selective visualization of the collagen structure. The spectral analysis framework, based on 2D-FFT, was applied to the SHG images, yielding a multiparametric description of collagen fiber orientation (angle and anisotropy indexes) and texture scale (dominant wavelength and peak dispersion indexes). The proof-of-concept application of the methodology showed the capability of our approach to detect and quantify differences in the structural properties of the collagen network in AF versus sinus rhythm patients. These results suggest the potential of our approach in the assessment of collagen properties in cardiac pathologies related to a fibrotic structural component.
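A minimal sketch of extracting orientation and texture-scale descriptors from the 2D FFT power spectrum of an image, using a synthetic fibre pattern. The specific anisotropy and dominant-wavelength definitions below are illustrative assumptions, not necessarily the indexes defined in the paper.

```python
# Fibre orientation, anisotropy and texture scale from a 2D power spectrum.
import numpy as np

rng = np.random.default_rng(0)
n = 256
y, x = np.mgrid[0:n, 0:n]
img = np.sin(2 * np.pi * (x * np.cos(0.3) + y * np.sin(0.3)) / 16.0)  # synthetic fibres
img += 0.2 * rng.standard_normal((n, n))

P = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean()))) ** 2
ky, kx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
radius = np.hypot(kx, ky)
theta = np.mod(np.arctan2(ky, kx), np.pi)          # orientation is pi-periodic
valid = (radius > 2) & (radius < n // 2)

# Angular energy distribution -> dominant wave-vector angle and anisotropy index.
bins = np.linspace(0, np.pi, 37)
hist, _ = np.histogram(theta[valid], bins=bins, weights=P[valid])
peak = hist.argmax()
dominant_angle = 0.5 * (bins[peak] + bins[peak + 1])
fibre_angle = (dominant_angle + np.pi / 2) % np.pi  # spectral peak is perpendicular to fibres
anisotropy = (hist.max() - hist.mean()) / (hist.max() + hist.mean())

# Radial energy distribution -> dominant spatial wavelength (texture scale).
r_bins = np.arange(3, n // 2)
r_hist, _ = np.histogram(radius[valid], bins=r_bins, weights=P[valid])
dominant_wavelength = n / r_bins[r_hist.argmax()]

print(f"fibre orientation ~ {np.degrees(fibre_angle):.1f} deg, "
      f"anisotropy {anisotropy:.2f}, wavelength {dominant_wavelength:.1f} px")
```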
Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods.
Pommier, Jeanine; Guével, Marie-Renée; Jourdan, Didier
2010-01-28
Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community.
Engineering as a new frontier for translational medicine
Chien, Shu; Bashir, Rashid; Nerem, Robert M.; Pettigrew, Roderic
2015-01-01
The inclusion of engineering ideas and approaches makes medicine a quantitative and systems-based discipline that facilitates precision diagnostics and therapeutics to improve health care delivery for all. PMID:25834106
Addison, Prue F E; Flander, Louisa B; Cook, Carly N
2017-08-01
Protected area management effectiveness (PAME) evaluation is increasingly undertaken to evaluate governance, assess conservation outcomes and inform evidence-based management of protected areas (PAs). Within PAME, quantitative approaches to assess biodiversity outcomes are now emerging, where biological monitoring data are directly assessed against quantitative (numerically defined) condition categories (termed quantitative condition assessments). However, more commonly qualitative condition assessments are employed in PAME, which use descriptive condition categories and are evaluated largely with expert judgement that can be subject to a range of biases, such as linguistic uncertainty and overconfidence. Despite the benefits of increased transparency and repeatability of evaluations, quantitative condition assessments are rarely used in PAME. To understand why, we interviewed practitioners from all Australian marine protected area (MPA) networks, which have access to long-term biological monitoring data and are developing or conducting PAME evaluations. Our research revealed that there is a desire within management agencies to implement quantitative condition assessment of biodiversity outcomes in Australian MPAs. However, practitioners report many challenges in transitioning from undertaking qualitative to quantitative condition assessments of biodiversity outcomes, which are hampering progress. Challenges include a lack of agency capacity (staff numbers and money), knowledge gaps, and diminishing public and political support for PAs. We point to opportunities to target strategies that will assist agencies overcome these challenges, including new decision support tools, approaches to better finance conservation efforts, and to promote more management relevant science. While a single solution is unlikely to achieve full evidence-based conservation, we suggest ways for agencies to target strategies and advance PAME evaluations toward best practice. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Eck, Brendan L.; Fahmi, Rachid; Levi, Jacob; Fares, Anas; Wu, Hao; Li, Yuemeng; Vembar, Mani; Dhanantwari, Amar; Bezerra, Hiram G.; Wilson, David L.
2016-03-01
Myocardial perfusion imaging using CT (MPI-CT) has the potential to provide quantitative measures of myocardial blood flow (MBF) which can aid the diagnosis of coronary artery disease. We evaluated the quantitative accuracy of MPI-CT in a porcine model of balloon-induced LAD coronary artery ischemia guided by fractional flow reserve (FFR). We quantified MBF at baseline (FFR=1.0) and under moderate ischemia (FFR=0.7) using MPI-CT and compared to fluorescent microsphere-based MBF from high-resolution cryo-images. Dynamic, contrast-enhanced CT images were obtained using a spectral detector CT (Philips Healthcare). Projection-based mono-energetic images were reconstructed and processed to obtain MBF. Three MBF quantification approaches were evaluated: singular value decomposition (SVD) with fixed Tikhonov regularization (ThSVD), SVD with regularization determined by the L-Curve criterion (LSVD), and Johnson-Wilson parameter estimation (JW). The three approaches over-estimated MBF compared to cryo-images. JW produced the most accurate MBF, with average error 33.3 ± 19.2 mL/min/100g, whereas LSVD and ThSVD had greater over-estimation, 59.5 ± 28.3 mL/min/100g and 78.3 ± 25.6 mL/min/100g, respectively. Relative blood flow as assessed by a flow ratio of LAD-to-remote myocardium was strongly correlated between JW and cryo-imaging, with R2=0.97, compared to R2=0.88 and 0.78 for LSVD and ThSVD, respectively. We assessed tissue impulse response functions (IRFs) from each approach for sources of error. While JW was constrained to physiologic solutions, both LSVD and ThSVD produced IRFs with non-physiologic properties due to noise. The L-curve provided noise-adaptive regularization but did not eliminate non-physiologic IRF properties or optimize for MBF accuracy. These findings suggest that model-based MPI-CT approaches may be more appropriate for quantitative MBF estimation and that cryo-imaging can support the development of MPI-CT by providing spatial distributions of MBF.
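A minimal sketch of the fixed-Tikhonov SVD deconvolution idea (the ThSVD variant mentioned above) on synthetic curves: the tissue enhancement curve is modeled as the arterial input function convolved with a flow-scaled impulse response, and the flow estimate is the peak of the regularized solution. The gamma-variate input, exponential residue and regularization weight are illustrative assumptions.

```python
# Tikhonov-regularized SVD deconvolution for a flow estimate from dynamic curves.
import numpy as np

dt = 1.0                                  # s between dynamic frames
t = np.arange(0, 60, dt)

aif = (t / 6.0) ** 3 * np.exp(-t / 6.0)   # synthetic arterial input function
mbf_true = 1.2                            # illustrative flow value
residue = np.exp(-t / 8.0)                # true tissue impulse response shape
tissue = mbf_true * dt * np.convolve(aif, residue)[: len(t)]

# Lower-triangular convolution matrix built from the AIF.
A = dt * np.tril(np.array([[aif[i - j] if i >= j else 0.0
                            for j in range(len(t))] for i in range(len(t))]))

# Tikhonov-regularized SVD solution: filter factors s / (s^2 + lam^2).
U, s, Vt = np.linalg.svd(A)
lam = 0.1 * s.max()                        # fixed regularization weight (assumed)
filt = s / (s ** 2 + lam ** 2)
k = Vt.T @ (filt * (U.T @ tissue))         # flow-scaled impulse response estimate

print(f"true flow {mbf_true:.2f}, Tikhonov-SVD estimate {k.max():.2f}")
```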
Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H
2015-12-01
An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
Optical properties of acute kidney injury measured by quantitative phase imaging
Ban, Sungbea; Min, Eunjung; Baek, Songyee; Kwon, Hyug Moo; Popescu, Gabriel
2018-01-01
The diagnosis of acute kidney injury (AKI) has been examined mainly by histology, immunohistochemistry and western blot. Though these approaches are widely accepted in the field, they have an inherent limitation due to the lack of high-throughput, quantitative information. For a better understanding of prognosis in AKI, we present a new approach using quantitative phase imaging combined with a wide-field scanning platform. Through the phase-delay information from the tissue, we were able to predict a stage of AKI based on various optical properties such as light scattering coefficient and anisotropy. These optical parameters quantify the deterioration process of the AKI tissue model. Our device would be a very useful tool when fast feedback on tissue pathology is required or when diseases are related to mechanical properties such as fibrosis. PMID:29541494
Choi, Seo Yeon; Yang, Nuri; Jeon, Soo Kyung; Yoon, Tae Hyun
2014-09-01
In this study, we have demonstrated the feasibility of a semi-quantitative approach for the estimation of cellular SiO2 nanoparticles (NPs), which is based on flow cytometry measurements of their normalized side scattering intensity. In order to improve our understanding of the quantitative aspects of cell-nanoparticle interactions, flow cytometry, transmission electron microscopy, and X-ray fluorescence experiments were carefully performed for HeLa cells exposed to SiO2 NPs with different core diameters, hydrodynamic sizes, and surface charges. Based on the observed relationships among the experimental data, a semi-quantitative method for estimating cellular SiO2 NPs from their normalized side scattering and core diameters was proposed, which can be applied for the determination of cellular SiO2 NPs within their size-dependent linear ranges. © 2014 International Society for Advancement of Cytometry.
A Modeling Approach for Burn Scar Assessment Using Natural Features and Elastic Property
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsap, L V; Zhang, Y; Goldgof, D B
2004-04-02
A modeling approach is presented for quantitative burn scar assessment. Emphases are given to: (1) constructing a finite element model from natural image features with an adaptive mesh, and (2) quantifying the Young's modulus of scars using the finite element model and the regularization method. A set of natural point features is extracted from the images of burn patients. A Delaunay triangle mesh is then generated that adapts to the point features. A 3D finite element model is built on top of the mesh with the aid of range images providing the depth information. The Young's modulus of scars is quantified with a simplified regularization functional, assuming that the knowledge of the scar's geometry is available. The consistency between the Relative Elasticity Index and the physician's rating based on the Vancouver Scale (a relative scale used to rate burn scars) indicates that the proposed modeling approach has high potentials for image-based quantitative burn scar assessment.
Allelic-based gene-gene interaction associated with quantitative traits.
Jung, Jeesun; Sun, Bin; Kwon, Deukwoo; Koller, Daniel L; Foroud, Tatiana M
2009-05-01
Recent studies have shown that quantitative phenotypes may be influenced not only by multiple single nucleotide polymorphisms (SNPs) within a gene but also by the interaction between SNPs at unlinked genes. We propose a new statistical approach that can detect gene-gene interactions at the allelic level which contribute to the phenotypic variation in a quantitative trait. By testing for the association of allelic combinations at multiple unlinked loci with a quantitative trait, we can detect the SNP allelic interaction whether or not it can be detected as a main effect. Our proposed method assigns a score to unrelated subjects according to their allelic combination inferred from observed genotypes at two or more unlinked SNPs, and then tests for the association of the allelic score with a quantitative trait. To investigate the statistical properties of the proposed method, we performed a simulation study to estimate type I error rates and power and demonstrated that this allelic approach achieves greater power than the more commonly used genotypic approach to test for gene-gene interaction. As an example, the proposed method was applied to data obtained as part of a candidate gene study of sodium retention by the kidney. We found that this method detects an interaction between the calcium-sensing receptor gene (CaSR), the chloride channel gene (CLCNKB) and the Na, K, 2Cl cotransporter gene (CLC12A1) that contributes to variation in diastolic blood pressure.
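A minimal sketch of the allelic-score idea, assuming simulated genotypes and a simple product-of-minor-allele-counts score: each subject's two-locus allelic combination is collapsed into a score, which is then tested for association with the quantitative trait by linear regression. The scoring rule here is illustrative, not the exact score defined in the paper.

```python
# Score two-locus allelic combinations and test association with a quantitative trait.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1000
maf1, maf2 = 0.3, 0.4

# Minor-allele counts (0/1/2) at two unlinked SNPs for unrelated subjects.
g1 = rng.binomial(2, maf1, size=n)
g2 = rng.binomial(2, maf2, size=n)

# Simulated trait with a purely interactive (no main) allelic effect plus noise.
trait = 0.4 * (g1 * g2) + rng.normal(0, 1, size=n)

# Allelic-combination score (one simple choice), then the association test.
score = g1 * g2
res = stats.linregress(score, trait)
print(f"slope={res.slope:.3f}, p-value={res.pvalue:.2e}")
```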
Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Applications in Quantitative Proteomics.
Chahrour, Osama; Malone, John
2017-01-01
Recent advances in inductively coupled plasma mass spectrometry (ICP-MS) hyphenated to different separation techniques have promoted it as a valuable tool in protein/peptide quantification. These emerging ICP-MS applications allow absolute quantification by measuring specific elemental responses. One approach quantifies elements already present in the structure of the target peptide (e.g. phosphorus and sulphur) as natural tags. Quantification of these natural tags allows the elucidation of the degree of protein phosphorylation in addition to absolute protein quantification. A separate approach is based on utilising bi-functional labelling substances (those containing ICP-MS detectable elements), that form a covalent chemical bond with the protein thus creating analogs which are detectable by ICP-MS. Based on the previously established stoichiometries of the labelling reagents, quantification can be achieved. This technique is very useful for the design of precise multiplexed quantitation schemes to address the challenges of biomarker screening and discovery. This review discusses the capabilities and different strategies to implement ICP-MS in the field of quantitative proteomics. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S
2014-01-01
The problem of a correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of the available approaches allowing a quantitative analysis of the human movement control system's performance, the clinical assessment and diagnostic approach to fall risk assessment still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing an automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits characterizing these scales, i.e., limited granularity and limited inter-/intra-examiner reliability, in order to obtain objective scores and more detailed information allowing us to predict fall risk. We used Microsoft Kinect to record subjects' movements while performing challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained a good accuracy (~82%) and especially a high sensitivity (~83%).
Tucker, George; Loh, Po-Ru; Berger, Bonnie
2013-10-04
Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely possible. Fruitful future directions may include investigating more sophisticated schemes for converting spectral counts to probabilities and applying the framework to direct protein complex prediction methods.
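A minimal sketch of the sampling framework, with assumed ingredients: spectral counts are mapped to observation probabilities, an ensemble of binary "alternative experiments" is drawn, a stand-in binary scorer is applied to each draw, and scores are averaged over the ensemble. The count-to-probability mapping and the toy scorer are placeholders for whichever binary PPI method is being wrapped.

```python
# Ensemble sampling of binary co-purification outcomes from spectral counts.
import numpy as np

rng = np.random.default_rng(0)
n_bait, n_prey = 4, 6
counts = rng.poisson(2.0, size=(n_bait, n_prey))   # spectral counts per bait-prey pair

def detection_prob(c, c50=2.0):
    """Map a spectral count to an observation probability (saturating, assumed form)."""
    return c / (c + c50)

def toy_binary_scorer(B):
    """Stand-in for an existing binary PPI inference method: down-weight promiscuous preys."""
    prey_freq = B.mean(axis=0, keepdims=True)
    return B * (1.0 - prey_freq)

p = detection_prob(counts)
n_samples = 200
scores = np.zeros_like(p, dtype=float)
for _ in range(n_samples):
    B = (rng.random(p.shape) < p).astype(float)     # one alternative experimental outcome
    scores += toy_binary_scorer(B)
scores /= n_samples                                 # aggregate over the ensemble

print(np.round(scores, 2))
```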
75 FR 67092 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-01
... (CM) using a quantitative, preference-based approach. The US Department of Health and Human Services... substantial long-term physical and psychological consequences. Despite considerable research on the... these decrements, based on respondent preferences over a series of comparisons that will be shown to...
A framework for organizing and selecting quantitative approaches for benefit-harm assessment.
Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M
2012-11-19
Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.
eSIP: A Novel Solution-Based Sectioned Image Property Approach for Microscope Calibration
Butzlaff, Malte; Weigel, Arwed; Ponimaskin, Evgeni; Zeug, Andre
2015-01-01
Fluorescence confocal microscopy represents one of the central tools in modern sciences. Correspondingly, a growing amount of research relies on the development of novel microscopic methods. During the last decade numerous microscopic approaches were developed for the investigation of various scientific questions. In the process, the former qualitative imaging methods have been replaced by advanced quantitative methods to gain more and more information from a given sample. However, modern microscope systems, being as complex as they are, require very precise and appropriate calibration routines, in particular when quantitative measurements should be compared over longer time scales or between different setups. Multispectral beads with sub-resolution size are often used to describe the point spread function and thus the optical properties of the microscope. More recently, a fluorescent layer was utilized to describe the axial profile for each pixel, which allows a spatially resolved characterization. However, the fabrication of a thin fluorescent layer with a matching refractive index has not yet been solved technically. Therefore, we propose a novel type of calibration concept for sectioned image property (SIP) measurements, which is based on a fluorescent solution and makes the calibration concept available to a broader range of users. Compared to the previous approach, additional information can be obtained by application of this extended SIP chart approach, including penetration depth, detected number of photons, and illumination profile shape. Furthermore, due to the fit of the complete profile, our method is less susceptible to noise. Generally, the extended SIP approach represents a simple and highly reproducible method, allowing setup-independent calibration and alignment procedures, which is mandatory for advanced quantitative microscopy. PMID:26244982
Cui, Yi; Hu, Dehong; Markillie, Lye Meng; ...
2017-10-04
Here, quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple oligonucleotide probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific probe binding and tissue autofluorescence, especially when only a small number of probes can be fitted to the target transcript. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on on-off duty cycles of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses on-time fractions (measured over a series of exposures) collected from transcripts bound to as low as 8 probes, which are distinct from on-time fractions collected from nonspecifically bound probes or autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct on-time fractions, laying the foundation for reliable single-cell transcriptomics.
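A minimal sketch of the on-time-fraction criterion: each candidate spot's fraction of "on" frames across a series of exposures is computed and thresholded to separate true transcripts from sporadic background. The blinking statistics and the cut-off are illustrative assumptions.

```python
# Separate true and false spots by their on-time fraction over many exposures.
import numpy as np

rng = np.random.default_rng(0)
n_exposures = 200

# Simulated traces: true spots blink with a characteristic duty cycle,
# nonspecific/background spots are "on" only sporadically.
true_spots = rng.random((50, n_exposures)) < 0.35     # ~35% duty cycle
false_spots = rng.random((50, n_exposures)) < 0.05    # sporadic background
traces = np.vstack([true_spots, false_spots])

on_time_fraction = traces.mean(axis=1)                # fraction of frames each spot is on
threshold = 0.2                                       # assumed cut between populations
accepted = on_time_fraction > threshold

print(f"accepted {accepted[:50].sum()}/50 true spots, {accepted[50:].sum()}/50 false spots")
```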
Berlin, Kathryn; Kruger, Tina; Klenosky, David B
2018-01-01
This mixed-methods study compares active older women in different physically based leisure activities and explores the difference in subjective ratings of successful aging and quantifiable predictors of success. A survey was administered to 256 women, 60-92 years of age, engaged in a sports- or exercise-based activity. Quantitative data were analyzed through ANOVA and multiple regression. Qualitative data (n = 79) was analyzed using the approach associated with means-end theory. While participants quantitatively appeared similar in terms of successful aging, qualitative interviews revealed differences in activity motivation. Women involved in sports highlighted social/psychological benefits, while those involved in exercise-based activities stressed fitness outcomes.
Wang, Shunhai; Bobst, Cedric E; Kaltashov, Igor A
2015-01-01
Transferrin (Tf) is an 80 kDa iron-binding protein that is viewed as a promising drug carrier to target the central nervous system as a result of its ability to penetrate the blood-brain barrier. Among the many challenges during the development of Tf-based therapeutics, the sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult because of the presence of abundant endogenous Tf. Herein, we describe the development of a new liquid chromatography-mass spectrometry-based method for the sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous human serum Tf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed (18)O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision, and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation.
Marchand, Jérémy; Martineau, Estelle; Guitton, Yann; Dervilly-Pinel, Gaud; Giraudeau, Patrick
2017-02-01
Multi-dimensional NMR is an appealing approach for dealing with the challenging complexity of biological samples in metabolomics. This article describes how spectroscopists have recently challenged their imagination in order to make 2D NMR a powerful tool for quantitative metabolomics, based on innovative pulse sequences combined with meticulous analytical chemistry approaches. Clever time-saving strategies have also been explored to make 2D NMR a high-throughput tool for metabolomics, relying on alternative data acquisition schemes such as ultrafast NMR. Currently, much work is aimed at drastically boosting the NMR sensitivity thanks to hyperpolarisation techniques, which have been used in combination with fast acquisition methods and could greatly expand the application potential of NMR metabolomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
A clustering approach to segmenting users of internet-based risk calculators.
Harle, C A; Downs, J S; Padman, R
2011-01-01
Risk calculators are widely available Internet applications that deliver quantitative health risk estimates to consumers. Although these tools are known to have varying effects on risk perceptions, little is known about who will be more likely to accept objective risk estimates. To identify clusters of online health consumers that help explain variation in individual improvement in risk perceptions from web-based quantitative disease risk information. A secondary analysis was performed on data collected in a field experiment that measured people's pre-diabetes risk perceptions before and after visiting a realistic health promotion website that provided quantitative risk information. K-means clustering was performed on numerous candidate variable sets, and the different segmentations were evaluated based on between-cluster variation in risk perception improvement. Variation in responses to risk information was best explained by clustering on pre-intervention absolute pre-diabetes risk perceptions and an objective estimate of personal risk. Members of a high-risk overestimator cluster showed large improvements in their risk perceptions, but clusters of both moderate-risk and high-risk underestimators were much more muted in improving their optimistically biased perceptions. Cluster analysis provided a unique approach for segmenting health consumers and predicting their acceptance of quantitative disease risk information. These clusters suggest that health consumers were very responsive to good news, but tended not to incorporate bad news into their self-perceptions much. These findings help to quantify variation among online health consumers and may inform the targeted marketing of and improvements to risk communication tools on the Internet.
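A minimal sketch of the segmentation idea, with simulated data: users are clustered (k-means) on pre-intervention risk perception and an objective risk estimate, and mean improvement in risk perception is compared across clusters. The data-generating assumptions below are purely illustrative.

```python
# Cluster users on perceived vs objective risk, then compare improvement by cluster.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n = 300
objective_risk = rng.uniform(0, 100, n)                         # objective risk estimate (%)
perceived_before = np.clip(objective_risk + rng.normal(0, 25, n), 0, 100)
# Post-intervention perception moves a random fraction of the way toward objective risk.
perceived_after = perceived_before + 0.5 * (objective_risk - perceived_before) * rng.random(n)
# Improvement = reduction in absolute misperception.
improvement = np.abs(perceived_before - objective_risk) - np.abs(perceived_after - objective_risk)

X = np.column_stack([perceived_before, objective_risk])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for k in range(3):
    m = labels == k
    print(f"cluster {k}: n={m.sum():3d}, mean improvement {improvement[m].mean():5.1f}")
```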
A Flow Cytometry-Based Assay for Quantifying Non-Plaque Forming Strains of Yellow Fever Virus
Hammarlund, Erika; Amanna, Ian J.; Dubois, Melissa E.; Barron, Alex; Engelmann, Flora; Messaoudi, Ilhem; Slifka, Mark K.
2012-01-01
Primary clinical isolates of yellow fever virus can be difficult to quantitate by standard in vitro methods because they may not form discernable plaques or induce a measurable cytopathic effect (CPE) on cell monolayers. In our hands, the Dakar strain of yellow fever virus (YFV-Dakar) could not be measured by plaque assay (PA), focus-forming assay (FFA), or by measurement of CPE. For these reasons, we developed a YFV-specific monoclonal antibody (3A8.B6) and used it to optimize a highly sensitive flow cytometry-based tissue culture limiting dilution assay (TC-LDA) to measure levels of infectious virus. The TC-LDA was performed by incubating serial dilutions of virus in replicate wells of C6/36 cells and staining intracellularly for virus with MAb 3A8.B6. Using this approach, we could reproducibly quantitate YFV-Dakar in tissue culture supernatants as well as from the serum of viremic rhesus macaques experimentally infected with YFV-Dakar. Moreover, the TC-LDA approach was >10-fold more sensitive than the standard plaque assay for quantitating typical plaque-forming strains of YFV, including YFV-17D and YFV-FNV (French neurotropic vaccine). Together, these results indicate that the TC-LDA technique is effective for quantitating both plaque-forming and non-plaque-forming strains of yellow fever virus, and this methodology may be readily adapted for the study and quantitation of other non-plaque-forming viruses. PMID:23028428
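A minimal sketch of how a limiting-dilution readout of this kind can be converted to a titer under a single-hit Poisson model: the fraction of negative wells at a dilution is exp(-m), with m the mean infectious units per well. The well counts are simulated, and the estimator (averaging back-calculated titers from informative dilutions) is a simple illustrative choice, not the analysis used in the study.

```python
# Poisson estimate of an infectious titer from limiting-dilution well counts.
import numpy as np

rng = np.random.default_rng(0)
true_titer = 4.0e4                 # infectious units per mL (assumed)
volume_ml = 0.1                    # inoculum volume per well
dilutions = 10.0 ** np.arange(2, 7)            # 1e2 ... 1e6 fold dilutions
wells_per_dilution = 24

m_true = true_titer * volume_ml / dilutions    # expected infectious units per well
negatives = rng.binomial(wells_per_dilution, np.exp(-m_true))
frac_neg = negatives / wells_per_dilution

# Keep dilutions in the informative range, estimate m from the Poisson zero term,
# and average the back-calculated titers.
usable = (frac_neg >= 0.1) & (frac_neg <= 0.9)
m_hat = -np.log(frac_neg[usable])
titer_hat = np.mean(m_hat * dilutions[usable] / volume_ml)

print(f"true titer {true_titer:.2e} IU/mL, estimate {titer_hat:.2e} IU/mL")
```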
An approach to computing direction relations between separated object groups
NASA Astrophysics Data System (ADS)
Yan, H.; Wang, Z.; Li, J.
2013-06-01
Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on Gestalt principles and the idea of multi-directions. The approach firstly triangulates the two object groups, and then it constructs the Voronoi diagram between the two groups using the triangular network; after this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed; finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and the results are correct from the point of view of spatial cognition.
Li, Weina; Li, Xuesong; Zhu, Wei; Li, Changxu; Xu, Dan; Ju, Yong; Li, Guangtao
2011-07-21
Based on a topochemical approach, a strategy for efficiently producing main-chain poly(bile acid)s in the solid state was developed. This strategy allows for facile and scalable synthesis of main-chain poly(bile acid)s not only with high molecular weights, but also with quantitative conversions and yields.
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
ERIC Educational Resources Information Center
Fidalgo, Reyes
The focus of this highly quantitative investigation is the analysis of the differences observed, both in methodological approach and in language choice in the classroom, during five years of classroom visitations. The study reflects information collected from two very different institutions with radically distinct populations: a rural setting with…
Professional Learning: A Fuzzy Logic-Based Modelling Approach
ERIC Educational Resources Information Center
Gravani, M. N.; Hadjileontiadou, S. J.; Nikolaidou, G. N.; Hadjileontiadis, L. J.
2007-01-01
Studies have suggested that professional learning is influenced by two key parameters, i.e., climate and planning, and their associated variables (mutual respect, collaboration, mutual trust, supportiveness, openness). In this paper, we applied analysis of the relationships between the proposed quantitative, fuzzy logic-based model and a series of…
The Potential of Multivariate Analysis in Assessing Students' Attitude to Curriculum Subjects
ERIC Educational Resources Information Center
Gaotlhobogwe, Michael; Laugharne, Janet; Durance, Isabelle
2011-01-01
Background: Understanding student attitudes to curriculum subjects is central to providing evidence-based options to policy makers in education. Purpose: We illustrate how quantitative approaches used in the social sciences and based on multivariate analysis (categorical Principal Components Analysis, Clustering Analysis and General Linear…
Implementing Inquiry-Based Learning in Teaching Serial Dilutions
ERIC Educational Resources Information Center
Walker, Candace L.; McGill, Michael T.; Buikema, Arthur L., Jr.; Stevens, Ann M.
2008-01-01
The 5E model of inquiry-based learning was incorporated into a sophomore-level microbiology laboratory to increase student understanding of serial dilutions, a concept that is often difficult for most students to comprehend. Quantitative and qualitative assessments were conducted during the semester to determine the value of this approach for…
Discussion Based Fish Bowl Strategy in Learning Psychology
ERIC Educational Resources Information Center
Singaravelu, G.
2007-01-01
The present study investigates the learning problems in psychology at the Master of Education (M.Ed.) level in Bharathiar University and finds the effectiveness of the Discussion Based Fish Bowl Strategy in learning psychology. A single-group experimental method was adopted for the study. Both qualitative and quantitative approaches were adopted for this study.…
[Sex differentiation in plants. Terms and notions].
Godin, V N
2007-01-01
There are two methodological approaches to the study of sex in plants: the descriptive and morphological approach and the quantitative approach. The former is based exclusively on external morphological peculiarities of the generative organs of the flower; the latter is based on the functioning of individuals as parents of the coming generation. It has been suggested to recognize three flower types: staminate, pistillate, and complete. Depending on the distribution pattern of the flowers of different sex type, there are monomorphic populations (all individuals form flowers of the same type) and heteromorphic populations (individuals have flowers of different types). Monomorphic populations include monoclinous, monoecious, gynomonoecious, andromonoecious, and polygamomonoecious ones. Among heteromorphic populations, dioecious, polygamodioecious, subdioecious, paradioecious, and trioecious ones are recognized. It is desirable to give up the usage of such terms as "bisexual", "polygamous", "functionally female", and "functionally male" flowers, "temporary dioecy" and some others. The notion "gender" has been established in English-language works for describing the sex quantitatively; two additional terms have been proposed: "phenotypic gender" and "functional gender". The recently developed quantitative approach is currently at the stage of accumulating material and needs further elaboration of its methodological basis. Analysis of the principal notions shows the necessity to form their integrated structure and to correct the usage of the existing and new terms.
Wu, Zujian; Pang, Wei; Coghill, George M
2015-01-01
Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework is able to learn the relationships between biochemical reactants qualitatively and to make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can then perform further experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
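As an illustration of the quantitative stage described above, the following Python sketch uses simulated annealing to fit the kinetic rate of a hypothetical one-step mass-action reaction (A -> B) so that its simulated time course matches a target trajectory. The reaction, the Euler integrator, and the cooling schedule are illustrative assumptions only and are not taken from the authors' framework.

import math, random

def simulate(k, a0=1.0, t_end=10.0, steps=100):
    """Euler integration of the mass-action reaction A -> B with rate k."""
    dt = t_end / steps
    a = a0
    traj = []
    for _ in range(steps):
        a += -k * a * dt
        traj.append(a)
    return traj

def cost(k, target):
    """Sum of squared differences between simulated and target trajectories."""
    sim = simulate(k)
    return sum((s - t) ** 2 for s, t in zip(sim, target))

def anneal(target, k0=1.0, temp=1.0, cooling=0.95, iters=500):
    """Simulated annealing over the kinetic rate k (log-scale random walk)."""
    k = best_k = k0
    c = best_c = cost(k0, target)
    for _ in range(iters):
        cand = k * math.exp(random.gauss(0.0, 0.1))   # propose a nearby rate
        cc = cost(cand, target)
        if cc < c or random.random() < math.exp((c - cc) / temp):
            k, c = cand, cc                            # accept the move
            if c < best_c:
                best_k, best_c = k, c
        temp *= cooling                                # cool down
    return best_k

target = simulate(0.35)          # synthetic "target system" behaviour
print(anneal(target))            # should recover a rate close to 0.35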
Less label, more free: approaches in label-free quantitative mass spectrometry.
Neilson, Karlie A; Ali, Naveid A; Muralidharan, Sridevi; Mirzaei, Mehdi; Mariani, Michael; Assadourian, Gariné; Lee, Albert; van Sluyter, Steven C; Haynes, Paul A
2011-02-01
In this review we examine techniques, software, and statistical analyses used in label-free quantitative proteomics studies for area under the curve and spectral counting approaches. Recent advances in the field are discussed in an order that reflects a logical workflow design. Examples of studies that follow this design are presented to highlight the requirement for statistical assessment and further experiments to validate results from label-free quantitation. Limitations of label-free approaches are considered, label-free approaches are compared with labelling techniques, and forward-looking applications for label-free quantitative data are presented. We conclude that label-free quantitative proteomics is a reliable, versatile, and cost-effective alternative to labelled quantitation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Gianazza, Erica; Tremoli, Elena; Banfi, Cristina
2014-12-01
Selected reaction monitoring, also known as multiple reaction monitoring, is a powerful targeted mass spectrometry approach for confident quantitation of proteins/peptides in complex biological samples. In recent years, its optimization and application have become pivotal and of great interest in clinical research to derive useful outcomes for patient care. Thus, selected reaction monitoring/multiple reaction monitoring is now used as a highly sensitive and selective method for the evaluation of protein abundances and biomarker verification with potential applications in medical screening. This review describes technical aspects for the development of a robust multiplex assay and discusses its recent applications in cardiovascular proteomics: verification of promising disease candidates to select only the highest quality peptides/proteins for a preclinical validation, as well as quantitation of protein isoforms and post-translational modifications.
Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng
2014-01-01
Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, such as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach is capable of dramatically improving the spatial resolution, revealing finer details within a region of interest of a sample larger than the field of view than conventional techniques can. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters are pre-analyzed. The quantitative results demonstrated that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and understanding the transformation of the minerals during coal processing. The method is generic and can be applied for three-dimensional compositional characterization of other materials. PMID:24763649
NASA Astrophysics Data System (ADS)
Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi
2009-10-01
As the production of engineered nanomaterials quantitatively expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system is needed for workplaces in the nanomaterial industry based on the precautionary principle. One of the problems in the risk management system is difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed with a focus on distinguishing engineered nanomaterial particles from background nanoparticles in workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measure exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for quantitatively assessing exposure to nanomaterials. Chemical analysis is suitable for quantitative exposure measurement especially at facilities with high levels of background NPs.
NASA Astrophysics Data System (ADS)
Bhave, Ajay; Dessai, Suraje; Conway, Declan; Stainforth, David
2016-04-01
Deep uncertainty in future climate change and socio-economic conditions necessitates the use of assess-risk-of-policy approaches over predict-then-act approaches for adaptation decision making. Robust Decision Making (RDM) approaches embody this principle and help evaluate the ability of adaptation options to satisfy stakeholder preferences under wide-ranging future conditions. This study involves the simultaneous application of two RDM approaches, qualitative and quantitative, in the Cauvery River Basin in Karnataka (population ~23 million), India. The study aims to (a) determine robust water resources adaptation options for the 2030s and 2050s and (b) compare the usefulness of a qualitative stakeholder-driven approach with a quantitative modelling approach. To develop a large set of future scenarios, a combination of climatic and socio-economic narratives was used. Climatic narratives were developed using structured expert elicitation with a group of experts on the Indian Summer Monsoon. Socio-economic narratives were developed to reflect potential future urban and agricultural water demand. In the qualitative RDM approach, a stakeholder workshop helped elicit key vulnerabilities, water resources adaptation options and performance criteria for evaluating options. During a second workshop, stakeholders discussed and evaluated adaptation options against the performance criteria for a large number of scenarios of climatic and socio-economic change in the basin. In the quantitative RDM approach, a Water Evaluation And Planning (WEAP) model was forced by precipitation and evapotranspiration data, coherent with the climatic narratives, together with water demand data based on the socio-economic narratives. We find that, compared to business-as-usual conditions, options addressing urban water demand satisfy performance criteria across scenarios and provide co-benefits like energy savings and reduction in groundwater depletion, while options reducing agricultural water demand significantly affect downstream water availability. Water demand options demonstrate potential to improve environmental flow conditions and satisfy legal water supply requirements for downstream riparian states. On the other hand, currently planned large-scale infrastructural projects demonstrate reduced value in certain scenarios, illustrating the lock-in effects of large-scale infrastructure. From a methodological perspective, we find that while the stakeholder-driven approach revealed robust options in a resource-light manner and helped initiate much needed interaction amongst stakeholders, the modelling approach provides complementary quantitative information. The study reveals robust adaptation options for this important basin and provides a strong methodological basis for carrying out future studies that support adaptation decision making.
Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool
NASA Astrophysics Data System (ADS)
Chakraborty, Monisha; Ghosh, Dipak
2017-12-01
An accurate prognostic tool to identify the severity of Arrhythmia is yet to be established, owing to the complexity of the ECG signal. In this paper, we have shown that quantitative assessment of Arrhythmia is possible using a non-linear technique based on "Hurst Rescaled Range Analysis". Although the concept of applying "non-linearity" to the study of various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficiency of different medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, Arrhythmia ECG time series are collected from the MIT-BIH database. Normal ECG time series are acquired using the POLYPARA system. Both time series are analyzed in the light of the non-linear approach following the method of "Rescaled Range Analysis". The quantitative parameter, "Fractal Dimension" (D), is obtained from both types of time series. The major finding is that Arrhythmia ECG shows lower values of D compared to normal ECG. Further, this information can be used to assess the severity of Arrhythmia quantitatively, which is a new direction for prognosis; suitable software may also be developed for use in medical practice.
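The following Python sketch illustrates the kind of rescaled range (R/S) computation described above: the Hurst exponent H is estimated as the log-log slope of the mean R/S statistic versus window size, and a fractal dimension is derived as D = 2 - H for a self-affine trace. The random-walk input and window sizes are placeholders, not the MIT-BIH or POLYPARA data used by the authors.

import numpy as np

def rescaled_range(x):
    """R/S statistic of one window: range of cumulative deviations over the std."""
    y = np.cumsum(x - x.mean())
    r = y.max() - y.min()
    s = x.std(ddof=0)
    return r / s if s > 0 else np.nan

def hurst_exponent(series, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate H as the log-log slope of mean R/S versus window size."""
    series = np.asarray(series, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        rs = np.nanmean([rescaled_range(c) for c in chunks])
        log_n.append(np.log(n))
        log_rs.append(np.log(rs))
    return np.polyfit(log_n, log_rs, 1)[0]

signal = np.cumsum(np.random.randn(4096))   # placeholder for an ECG time series
H = hurst_exponent(signal)
D = 2.0 - H                                  # fractal dimension of a self-affine trace
print(f"H = {H:.3f}, D = {D:.3f}")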
A quantitative approach to evolution of music and philosophy
NASA Astrophysics Data System (ADS)
Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano
2012-08-01
The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
An overview of quantitative approaches in Gestalt perception.
Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H
2016-09-01
Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.
Ayaz, Shirazi Muhammad; Kim, Min Young
2018-01-01
In this article, a multi-view registration approach for the 3D handheld profiling system based on the multiple-shot structured light technique is proposed. The multi-view registration approach is categorized into coarse registration and point cloud refinement using the iterative closest point (ICP) algorithm. Coarse registration of multiple point clouds was performed using relative orientation and translation parameters estimated via homography-based visual navigation. The proposed system was evaluated using an artificial human skull and a paper box object. For the quantitative evaluation of the accuracy of a single 3D scan, a paper box was reconstructed, and the mean errors in its height and breadth were found to be 9.4 μm and 23 μm, respectively. A comprehensive quantitative evaluation and comparison of the proposed algorithm was performed against other variants of ICP. The root mean square error for the ICP algorithm to register a pair of point clouds of the skull object was also found to be less than 1 mm. PMID:29642552
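The abstract does not give implementation details of the refinement stage, so the following Python sketch shows only a generic point-to-point ICP iteration (nearest-neighbour correspondences followed by an SVD-based rigid alignment); the synthetic point clouds and iteration count are assumptions.

import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(source, target, iters=30):
    """Refine a coarse alignment by iterating correspondence search and alignment."""
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iters):
        _, idx = tree.query(src)                  # closest-point correspondences
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t
    rmse = np.sqrt(np.mean(np.sum((src - target[tree.query(src)[1]]) ** 2, axis=1)))
    return src, rmse

# Synthetic demo: a slightly rotated and shifted copy of a random point cloud.
target = np.random.rand(500, 3)
th = 0.05
Rz = np.array([[np.cos(th), -np.sin(th), 0], [np.sin(th), np.cos(th), 0], [0, 0, 1]])
source = target @ Rz.T + 0.01
aligned, rmse = icp(source, target)
print("RMSE after refinement:", rmse)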
NASA Astrophysics Data System (ADS)
Trusiak, Maciej; Micó, Vicente; Patorski, Krzysztof; García-Monreal, Javier; Sluzewski, Lukasz; Ferreira, Carlos
2016-08-01
In this contribution we propose two Hilbert-Huang Transform based algorithms for fast and accurate single-shot and two-shot quantitative phase imaging applicable in both on-axis and off-axis configurations. In the first scheme a single fringe pattern containing information about the biological phase-sample under study is adaptively pre-filtered using an empirical mode decomposition based approach. It is then phase demodulated by the Hilbert Spiral Transform aided by Principal Component Analysis for local fringe orientation estimation. Orientation calculation enables efficient analysis of closed fringes and can be avoided using an arbitrary phase-shifted two-shot Gram-Schmidt Orthonormalization scheme aided by Hilbert-Huang Transform pre-filtering. This two-shot approach is a trade-off between single-frame and temporal phase shifting demodulation. Robustness of the proposed techniques is corroborated using experimental digital holographic microscopy studies of polystyrene micro-beads and red blood cells. Both algorithms compare favorably with the temporal phase shifting scheme which is used as a reference method.
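The two-shot idea can be sketched as follows in Python: the two fringe patterns are treated as vectors, the second is orthogonalized against the first, and the wrapped phase is taken from their arctangent. Background removal here is a simple mean subtraction rather than the Hilbert-Huang pre-filtering used by the authors, and the synthetic fringes are assumptions.

import numpy as np

def gram_schmidt_phase(i1, i2):
    """Demodulate the wrapped phase from two arbitrarily phase-shifted fringe patterns."""
    u1 = i1 - i1.mean()                      # crude background removal
    u2 = i2 - i2.mean()
    u1 = u1 / np.linalg.norm(u1)             # normalize the first frame
    proj = np.sum(u1 * u2)                   # projection coefficient onto u1
    u2 = u2 - proj * u1                      # orthogonalize the second frame
    u2 = u2 / np.linalg.norm(u2)
    return np.arctan2(u2, u1)                # wrapped phase map

# Synthetic test: a smooth phase object sampled with two phase-shifted fringe patterns.
x, y = np.meshgrid(np.linspace(-1, 1, 256), np.linspace(-1, 1, 256))
phi = 4 * np.pi * (x ** 2 + y ** 2)          # placeholder phase object
carrier = 40 * np.pi * x
i1 = 1 + 0.8 * np.cos(carrier + phi)
i2 = 1 + 0.8 * np.cos(carrier + phi + 1.2)   # arbitrary (unknown) phase shift
phase = gram_schmidt_phase(i1, i2)
print(phase.shape, phase.min(), phase.max())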
Megger, Dominik A; Pott, Leona L; Rosowski, Kristin; Zülch, Birgit; Tautges, Stephanie; Bracht, Thilo; Sitek, Barbara
2017-01-01
Tandem mass tags (TMT) are usually introduced at the levels of isolated proteins or peptides. Here, for the first time, we report the labeling of whole cells and a critical evaluation of its performance in comparison to conventional labeling approaches. The obtained results indicated that TMT protein labeling using intact cells is generally possible, if it is coupled to a subsequent enrichment using anti-TMT antibody. The quantitative results were similar to those obtained after labeling of isolated proteins and both were found to be slightly complementary to peptide labeling. Furthermore, when using NHS-based TMT, no specificity towards cell surface proteins was observed in the case of cell labeling. In summary, the conducted study revealed first evidence for the general possibility of TMT cell labeling and highlighted limitations of NHS-based labeling reagents. Future studies should therefore focus on the synthesis and investigation of membrane impermeable TMTs to increase specificity towards cell surface proteins.
Keyser, Donna J; Pincus, Harold Alan
2010-01-01
A community-based collaborative conducted a 2-year pilot study to inform efforts for improving maternal and child health care practice and policy in Allegheny County, Pennsylvania. (1) To test whether three small-scale versions of an evidence-based, systems improvement approach would be workable in local community settings and (2) to identify specific policy/infrastructure reforms for sustaining improvements. A mixed methods approach was used, including quantitative performance measurement supplemented with qualitative data about factors related to outcomes of interest, as well as key stakeholder interviews and a literature review/Internet search. Quantitative performance results varied; qualitative data revealed critical factors for the success and failure of the practices tested. Policy/infrastructure recommendations were developed to address specific practice barriers. This information was important for designing a region-wide quality improvement initiative focused on maternal depression. The processes and outcomes provide valuable insights for other communities interested in conducting similar quality improvement initiatives.
Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods
2010-01-01
Background: Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. Methods/Design: The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. Discussion: This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community. PMID:20109202
Time series regression-based pairs trading in the Korean equities market
NASA Astrophysics Data System (ADS)
Kim, Saejoon; Heo, Jun
2017-07-01
Pairs trading is an instance of statistical arbitrage that relies on heavy quantitative data analysis to profit by capitalising on low-risk trading opportunities provided by anomalies of related assets. A key element in pairs trading is the rule by which open and close trading triggers are defined. This paper investigates the use of time series regression to define this rule, which has previously been defined using fixed threshold-based approaches. Empirical results indicate that our approach may yield significantly increased excess returns compared to those obtained by previous approaches on large capitalisation stocks in the Korean equities market.
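For context, a common baseline for such trigger rules is a rolling-regression hedge ratio with z-score entry and exit thresholds, sketched below in Python; the window length, thresholds, and synthetic price series are placeholders, and the sketch does not reproduce the paper's specific time series regression rule.

import numpy as np
import pandas as pd

def pairs_signals(price_a, price_b, window=60, entry=2.0, exit=0.5):
    """Hedge-ratio spread via rolling OLS, then z-score based open/close triggers."""
    a, b = np.log(price_a), np.log(price_b)
    beta = a.rolling(window).cov(b) / b.rolling(window).var()   # rolling hedge ratio
    spread = a - beta * b
    z = (spread - spread.rolling(window).mean()) / spread.rolling(window).std()
    signal = pd.Series(np.nan, index=z.index)
    signal[z > entry] = -1        # spread rich: short A, long B
    signal[z < -entry] = 1        # spread cheap: long A, short B
    signal[z.abs() < exit] = 0    # close the position
    return signal.ffill().fillna(0)

# Usage with synthetic co-moving prices (placeholders for two Korean equities).
rng = np.random.default_rng(0)
common = np.cumsum(rng.normal(0, 0.01, 1000))
pa = pd.Series(np.exp(common + rng.normal(0, 0.005, 1000)))
pb = pd.Series(np.exp(common + rng.normal(0, 0.005, 1000)))
print(pairs_signals(pa, pb).value_counts())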
Dang, Xibei; Singh, Amar; Spetman, Brian D; Nolan, Krystal D; Isaacs, Jennifer S; Dennis, Jonathan H; Dalton, Stephen; Marshall, Alan G; Young, Nicolas L
2016-09-02
Histone variants are known to play a central role in genome regulation and maintenance. However, many variants are inaccessible by antibody-based methods or bottom-up tandem mass spectrometry due to their highly similar sequences. For many, the only tractable approach is with intact protein top-down tandem mass spectrometry. Here, ultra-high-resolution FT-ICR MS and MS/MS yield quantitative relative abundances of all detected HeLa H2A and H2B isobaric and isomeric variants with a label-free approach. We extend the analysis to identify and relatively quantitate 16 proteoforms from 12 sequence variants of histone H2A and 10 proteoforms of histone H2B from three other cell lines: human embryonic stem cells (WA09), U937, and a prostate cancer cell line LaZ. The top-down MS/MS approach provides a path forward for more extensive elucidation of the biological role of many previously unstudied histone variants and post-translational modifications.
Makrakis, Vassilios; Kostoulas-Makrakis, Nelly
2016-02-01
Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.
CCTV Coverage Index Based on Surveillance Resolution and Its Evaluation Using 3D Spatial Analysis
Choi, Kyoungah; Lee, Impyeong
2015-01-01
We propose a novel approach to evaluating how effectively a closed circuit television (CCTV) system can monitor a targeted area. With 3D models of the target area and the camera parameters of the CCTV system, the approach produces a surveillance coverage index, which is newly defined in this study as a quantitative measure of surveillance performance. This index indicates the proportion of the space being monitored with sufficient resolution relative to the entire space of the target area. It is determined by computing the surveillance resolution at every position and orientation, which indicates how closely a specific object can be monitored with the CCTV system. We present a full mathematical derivation for the resolution, which depends on the location and orientation of the object as well as the geometric model of a camera. With the proposed approach, we quantitatively evaluated the surveillance coverage of a CCTV system in an underground parking area. Our evaluation process provided various quantitative-analysis results, enabling us to examine the design of a CCTV system prior to its installation and to understand the surveillance capability of an existing CCTV system. PMID:26389909
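Since the paper's full derivation is not reproduced in the abstract, the following Python sketch uses a simplified pinhole-camera notion of surveillance resolution (pixels per metre, reduced by distance and surface foreshortening) and computes a coverage index as the fraction of sampled positions seen above a required resolution. The camera parameters, sample points, and threshold are illustrative assumptions.

import numpy as np

def surveillance_resolution(cam_pos, cam_dir, focal_px, obj_pos, obj_normal):
    """Simplified pixels-per-metre resolution of an object seen by a pinhole camera."""
    view = obj_pos - cam_pos
    d = np.linalg.norm(view)
    view = view / d
    if np.dot(view, cam_dir) <= 0:                    # object behind the camera
        return 0.0
    foreshortening = abs(np.dot(view, obj_normal))    # oblique surfaces appear smaller
    return focal_px * foreshortening / d

def coverage_index(cameras, sample_points, normals, min_resolution):
    """Fraction of sampled positions monitored above the required resolution."""
    covered = 0
    for p, n in zip(sample_points, normals):
        best = max(surveillance_resolution(c["pos"], c["dir"], c["focal_px"], p, n)
                   for c in cameras)
        covered += best >= min_resolution
    return covered / len(sample_points)

cams = [{"pos": np.array([0.0, 0.0, 3.0]),
         "dir": np.array([0.0, 1.0, -0.3]) / np.linalg.norm([0.0, 1.0, -0.3]),
         "focal_px": 1200.0}]
pts = [np.array([0.0, y, 1.5]) for y in np.linspace(1, 20, 50)]
nrm = [np.array([0.0, -1.0, 0.0])] * len(pts)
print(coverage_index(cams, pts, nrm, min_resolution=100.0))  # e.g. a 100 px/m threshold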
Percy, Andrew J; Simon, Romain; Chambers, Andrew G; Borchers, Christoph H
2014-06-25
Mass spectrometry (MS)-based protein quantitation is increasingly being employed to verify candidate protein biomarkers. Multiple or selected reaction monitoring-mass spectrometry (MRM-MS or SRM-MS) with isotopically labeled internal standards has proven to be a successful approach in that regard, but has yet to reach its full potential in terms of multiplexing and sensitivity. Here, we report the development of a new MRM method for the quantitation of 253 disease-associated proteins (represented by 625 interference-free peptides) in 13 LC fractions. This 2D RPLC/MRM-MS approach extends the depth and breadth of the assay by 2 orders of magnitude over pre-fractionation-free assays, with 31 proteins below 10 ng/mL and 41 proteins above 10 ng/mL now quantifiable. Standard flow rates are used in both chromatographic dimensions, and up-front depletion or antibody-based enrichment is not required. The LC separations utilize high and low pH conditions, with the former employing an ammonium hydroxide-based eluent, instead of the conventional ammonium formate, resulting in improved LC column lifetime and performance. The high sensitivity (determined concentration range: 15 mg/mL to 452 pg/mL) and robustness afforded by this method makes the full MRM panel, or subsets thereof, useful for the verification of disease-associated plasma protein biomarkers in patient samples. The described research extends the breadth and depth of protein quantitation in undepleted and non-enriched human plasma by employing standard-flow 2D RPLC/MRM-MS in conjunction with a complex mixture of isotopically labeled peptide standards. The proteins quantified are mainly putative biomarkers of non-communicable (i.e., non-infectious) disease (e.g., cardiovascular or cancer), which require pre-clinical verification and validation before clinical implementation. Based on the enhanced sensitivity and multiplexing, this quantitative plasma proteomic method should prove useful in future candidate biomarker verification studies. Copyright © 2014 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Rahayu, Sri; Chandrasegaran, A. L.; Treagust, David F.; Kita, Masakazu; Ibnu, Suhadi
2011-01-01
This study was a mixed quantitative-qualitative research to evaluate the efficacy of a designed student-centred instructional (DSCI) program for teaching about acids and bases. The teaching innovation was designed based on constructivist, hands-on inquiry and context-based approaches and implemented in seven 45-min lessons with a class of 36 grade…
Quantitative Detection of Cracks in Steel Using Eddy Current Pulsed Thermography.
Shi, Zhanqun; Xu, Xiaoyu; Ma, Jiaojiao; Zhen, Dong; Zhang, Hao
2018-04-02
Small cracks are common defects in steel and often lead to catastrophic accidents in industrial applications. Various nondestructive testing methods have been investigated for crack detection; however, most current methods focus on qualitative crack identification and image processing. In this study, eddy current pulsed thermography (ECPT) was applied for quantitative crack detection based on derivative analysis of temperature variation. The effects of the excitation parameters on the temperature variation were analyzed in the simulation study. The crack profile and position are identified in the thermal image based on the Canny edge detection algorithm. Then, one or more trajectories are drawn across the crack profile in order to determine the crack boundary from its temperature distribution. The slope curve along each trajectory is obtained. Finally, quantitative analysis of the crack sizes was performed by analyzing the features of the slope curves. The experimental verification showed that the crack sizes could be quantitatively detected with errors of less than 1%. Therefore, the proposed ECPT method was demonstrated to be a feasible and effective nondestructive approach for quantitative crack detection.
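A minimal Python sketch of the image-processing steps named above (Canny edge detection to locate the crack profile, then the slope of the temperature profile along a trajectory crossing it) is given below; the synthetic thermal frame, Canny thresholds, and width estimate are illustrative assumptions rather than the authors' calibrated procedure.

import numpy as np
import cv2

def crack_slope_profile(thermal_frame, p0, p1, samples=200):
    """Temperature profile and its derivative along a trajectory crossing the crack."""
    img = thermal_frame.astype(np.float32)
    norm = cv2.normalize(img, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    edges = cv2.Canny(norm, 50, 150)                 # crack profile candidates
    xs = np.linspace(p0[0], p1[0], samples)
    ys = np.linspace(p0[1], p1[1], samples)
    profile = img[ys.astype(int), xs.astype(int)]    # nearest-neighbour sampling
    slope = np.gradient(profile)                     # slope curve along the trajectory
    return edges, profile, slope

# Synthetic frame: uniform heating with a cooler 3-pixel-wide vertical crack.
frame = np.full((128, 128), 40.0)
frame[:, 60:63] -= 5.0
edges, profile, slope = crack_slope_profile(frame, (10, 64), (117, 64))
left = np.argmin(slope)                              # steepest drop entering the crack
right = np.argmax(slope)                             # steepest rise leaving the crack
step = np.hypot(117 - 10, 64 - 64) / (200 - 1)       # pixels per sample along the line
print("estimated crack width (pixels):", abs(right - left) * step)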
Codony, Francesc; Pérez, Leonardo Martín; Adrados, Bárbara; Agustí, Gemma; Fittipaldi, Mariana; Morató, Jordi
2012-01-01
Culture-based methods for fecal indicator microorganisms are the standard protocol to assess potential health risk from drinking water systems. However, these traditional fecal indicators are inappropriate surrogates for disinfection-resistant fecal pathogens and the indigenous pathogens that grow in drinking water systems. There is now a range of molecular-based methods, such as quantitative PCR, which allow detection of a variety of pathogens and alternative indicators. Hence, in addition to targeting total Escherichia coli (i.e., dead and alive) for the detection of fecal pollution, various amoebae may be suitable to indicate the potential presence of pathogenic amoeba-resisting microorganisms, such as Legionellae. Therefore, monitoring amoeba levels by quantitative PCR could be a useful tool for directly and indirectly evaluating health risk and could also be a complementary approach to current microbial quality control strategies for drinking water systems.
A novel environmental DNA approach to quantify the cryptic invasion of non-native genotypes.
Uchii, Kimiko; Doi, Hideyuki; Minamoto, Toshifumi
2016-03-01
The invasion of non-native species that are closely related to native species can lead to competitive elimination of the native species and/or genomic extinction through hybridization. Such invasions often become serious before they are detected, posing unprecedented threats to biodiversity. A Japanese native strain of common carp (Cyprinus carpio) has become endangered owing to the invasion of non-native strains introduced from the Eurasian continent. Here, we propose a rapid environmental DNA-based approach to quantitatively monitor the invasion of non-native genotypes. Using this system, we developed a method to quantify the relative proportion of native and non-native DNA based on a single-nucleotide polymorphism using cycling probe technology in real-time PCR. The efficiency of this method was confirmed in aquarium experiments, where the quantified proportion of native and non-native DNA in the water was well correlated to the biomass ratio of native and non-native genotypes. This method provided quantitative estimates for the proportion of native and non-native DNA in natural rivers and reservoirs, which allowed us to estimate the degree of invasion of non-native genotypes without catching and analysing individual fish. Our approach would dramatically facilitate the process of quantitatively monitoring the invasion of non-native conspecifics in aquatic ecosystems, thus revealing a promising method for risk assessment and management in biodiversity conservation. © 2015 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Eyisi, Daniel
2016-01-01
Research in science education is to discover the truth which involves the combination of reasoning and experiences. In order to find out appropriate teaching methods that are necessary for teaching science students problem-solving skills, different research approaches are used by educational researchers based on the data collection and analysis…
Advantages and limitations of quantitative PCR (Q-PCR)-based approaches in microbial ecology.
Smith, Cindy J; Osborn, A Mark
2009-01-01
Quantitative PCR (Q-PCR or real-time PCR) approaches are now widely applied in microbial ecology to quantify the abundance and expression of taxonomic and functional gene markers within the environment. Q-PCR-based analyses combine 'traditional' end-point detection PCR with fluorescent detection technologies to record the accumulation of amplicons in 'real time' during each cycle of the PCR amplification. By detection of amplicons during the early exponential phase of the PCR, this enables the quantification of gene (or transcript) numbers when these are proportional to the starting template concentration. When Q-PCR is coupled with a preceding reverse transcription reaction, it can be used to quantify gene expression (RT-Q-PCR). This review firstly addresses the theoretical and practical implementation of Q-PCR and RT-Q-PCR protocols in microbial ecology, highlighting key experimental considerations. Secondly, we review the applications of (RT)-Q-PCR analyses in environmental microbiology and evaluate the contribution and advances gained from such approaches. Finally, we conclude by offering future perspectives on the application of (RT)-Q-PCR in furthering understanding in microbial ecology, in particular, when coupled with other molecular approaches and more traditional investigations of environmental systems.
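As an illustration of absolute quantification by Q-PCR, the Python sketch below fits a standard curve (Cq versus log10 template copies) from a dilution series, derives the amplification efficiency from the slope, and back-calculates copy numbers for an unknown sample; the dilution series and Cq values are hypothetical.

import numpy as np

def fit_standard_curve(copies, cq):
    """Linear fit of Cq versus log10(template copies) from a dilution series."""
    slope, intercept = np.polyfit(np.log10(copies), cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0        # 1.0 corresponds to 100 % efficiency
    return slope, intercept, efficiency

def quantify(cq_unknown, slope, intercept):
    """Back-calculate the gene copy number of an unknown sample from its Cq value."""
    return 10 ** ((cq_unknown - intercept) / slope)

standards = np.array([1e7, 1e6, 1e5, 1e4, 1e3])        # copies per reaction
cq_values = np.array([14.1, 17.5, 20.9, 24.3, 27.8])   # hypothetical measurements
slope, intercept, eff = fit_standard_curve(standards, cq_values)
print(f"efficiency = {eff:.2f}, unknown = {quantify(22.0, slope, intercept):.0f} copies")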
Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials
Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.
2015-01-01
Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering. PMID:26581347
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timchalk, Chuck; Poet, Torka S.; Kousba, Ahmed A.
2004-04-01
There is a need to develop approaches for assessing risk associated with acute exposures to a broad range of chemical agents and to rapidly determine the potential implications to human health. Non-invasive biomonitoring approaches are being developed using reliable portable analytical systems to quantitate dosimetry utilizing readily obtainable body fluids, such as saliva. Saliva has been used to evaluate a broad range of biomarkers, drugs, and environmental contaminants including heavy metals and pesticides. To advance the application of non-invasive biomonitoring, a microfluidic/electrochemical device has also been developed for the analysis of lead (Pb), using square wave anodic stripping voltammetry. The system demonstrates a linear response over a broad concentration range (1-2000 ppb) and is capable of quantitating saliva Pb in rats orally administered acute doses of Pb-acetate. Appropriate pharmacokinetic analyses have been used to quantitate systemic dosimetry based on determination of saliva Pb concentrations. In addition, saliva has recently been used to quantitate dosimetry following exposure to the organophosphate insecticide chlorpyrifos in a rodent model system by measuring the major metabolite, trichloropyridinol, and saliva cholinesterase inhibition following acute exposures. These results suggest that technology developed for non-invasive biomonitoring can provide a sensitive and portable analytical tool capable of assessing exposure and risk in real time. By coupling these non-invasive technologies with pharmacokinetic modeling it is feasible to rapidly quantitate acute exposure to a broad range of chemical agents. In summary, it is envisioned that once fully developed, these monitoring and modeling approaches will be useful for assessing acute exposure and health risk.
Facial asymmetry quantitative evaluation in oculoauriculovertebral spectrum.
Manara, Renzo; Schifano, Giovanni; Brotto, Davide; Mardari, Rodica; Ghiselli, Sara; Gerunda, Antonio; Ghirotto, Cristina; Fusetti, Stefano; Piacentile, Katherine; Scienza, Renato; Ermani, Mario; Martini, Alessandro
2016-03-01
Facial asymmetries in oculoauriculovertebral spectrum (OAVS) patients might require surgical corrections that are mostly based on qualitative approach and surgeon's experience. The present study aimed to develop a quantitative 3D CT imaging-based procedure suitable for maxillo-facial surgery planning in OAVS patients. Thirteen OAVS patients (mean age 3.5 ± 4.0 years; range 0.2-14.2, 6 females) and 13 controls (mean age 7.1 ± 5.3 years; range 0.6-15.7, 5 females) who underwent head CT examination were retrospectively enrolled. Eight bilateral anatomical facial landmarks were defined on 3D CT images (porion, orbitale, most anterior point of frontozygomatic suture, most superior point of temporozygomatic suture, most posterior-lateral point of the maxilla, gonion, condylion, mental foramen) and distance from orthogonal planes (in millimeters) was used to evaluate the asymmetry on each axis and to calculate a global asymmetry index of each anatomical landmark. Mean asymmetry values and relative confidence intervals were obtained from the control group. OAVS patients showed 2.5 ± 1.8 landmarks above the confidence interval while considering the global asymmetry values; 12 patients (92%) showed at least one pathologically asymmetric landmark. Considering each axis, the mean number of pathologically asymmetric landmarks increased to 5.5 ± 2.6 (p = 0.002) and all patients presented at least one significant landmark asymmetry. Modern CT-based 3D reconstructions allow accurate assessment of facial bone asymmetries in patients affected by OAVS. The evaluation as a global score and in different orthogonal axes provides precise quantitative data suitable for maxillo-facial surgical planning. CT-based 3D reconstruction might allow a quantitative approach for planning and following-up maxillo-facial surgery in OAVS patients.
Temporal Data Set Reduction Based on D-Optimality for Quantitative FLIM-FRET Imaging.
Omer, Travis; Intes, Xavier; Hahn, Juergen
2015-01-01
Fluorescence lifetime imaging (FLIM) when paired with Förster resonance energy transfer (FLIM-FRET) enables the monitoring of nanoscale interactions in living biological samples. FLIM-FRET model-based estimation methods allow the quantitative retrieval of parameters such as the quenched (interacting) and unquenched (non-interacting) fractional populations of the donor fluorophore and/or the distance of the interactions. The quantitative accuracy of such model-based approaches is dependent on multiple factors such as signal-to-noise ratio and number of temporal points acquired when sampling the fluorescence decays. For high-throughput or in vivo applications of FLIM-FRET, it is desirable to acquire a limited number of temporal points for fast acquisition times. Yet, it is critical to acquire temporal data sets with sufficient information content to allow for accurate FLIM-FRET parameter estimation. Herein, an optimal experimental design approach based upon sensitivity analysis is presented in order to identify the time points that provide the best quantitative estimates of the parameters for a determined number of temporal sampling points. More specifically, the D-optimality criterion is employed to identify, within a sparse temporal data set, the set of time points leading to optimal estimations of the quenched fractional population of the donor fluorophore. Overall, a reduced set of 10 time points (compared to a typical complete set of 90 time points) was identified to have minimal impact on parameter estimation accuracy (≈5%), with in silico and in vivo experiment validations. This reduction of the number of needed time points by almost an order of magnitude allows the use of FLIM-FRET for certain high-throughput applications which would be infeasible if the entire number of time sampling points were used.
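A minimal Python sketch of D-optimal selection of time points is shown below: finite-difference sensitivities of a bi-exponential decay model are computed for every candidate gate, and a greedy search keeps the subset of rows that maximizes det(S^T S). The decay model, parameter values, and 90-point grid are placeholders, not the authors' exact FLIM-FRET model.

import numpy as np

def sensitivities(times, params, model, delta=1e-4):
    """Finite-difference sensitivity matrix (one row per time point, one column per parameter)."""
    base = model(times, params)
    cols = []
    for i, p in enumerate(params):
        pert = params.copy()
        pert[i] = p * (1 + delta)
        cols.append((model(times, pert) - base) / (p * delta))
    return np.column_stack(cols)

def d_optimal_subset(S, k, ridge=1e-9):
    """Greedy forward selection of k rows maximizing det(S_sel^T S_sel)."""
    chosen = []
    for _ in range(k):
        best, best_det = None, -np.inf
        for i in range(S.shape[0]):
            if i in chosen:
                continue
            trial = S[chosen + [i]]
            d = np.linalg.det(trial.T @ trial + ridge * np.eye(S.shape[1]))
            if d > best_det:
                best, best_det = i, d
        chosen.append(best)
    return sorted(chosen)

def decay(t, p):                       # p = [quenched fraction, tau_quenched, tau_free]
    return p[0] * np.exp(-t / p[1]) + (1 - p[0]) * np.exp(-t / p[2])

times = np.linspace(0.05, 9.0, 90)     # a "complete" set of 90 gate positions (ns)
S = sensitivities(times, np.array([0.4, 0.6, 2.5]), decay)
print("10 most informative time points (ns):", times[d_optimal_subset(S, 10)])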
Inverse methods for 3D quantitative optical coherence elasticity imaging (Conference Presentation)
NASA Astrophysics Data System (ADS)
Dong, Li; Wijesinghe, Philip; Hugenberg, Nicholas; Sampson, David D.; Munro, Peter R. T.; Kennedy, Brendan F.; Oberai, Assad A.
2017-02-01
In elastography, quantitative elastograms are desirable as they are system and operator independent. Such quantification also facilitates more accurate diagnosis, longitudinal studies and studies performed across multiple sites. In optical elastography (compression, surface-wave or shear-wave), quantitative elastograms are typically obtained by assuming some form of homogeneity. This simplifies data processing at the expense of smearing sharp transitions in elastic properties, and/or introducing artifacts in these regions. Recently, we proposed an inverse problem-based approach to compression OCE that does not assume homogeneity, and overcomes the drawbacks described above. In this approach, the difference between the measured and predicted displacement field is minimized by seeking the optimal distribution of elastic parameters. The predicted displacements and recovered elastic parameters together satisfy the constraint of the equations of equilibrium. This approach, which has been applied in two spatial dimensions assuming plane strain, has yielded accurate material property distributions. Here, we describe the extension of the inverse problem approach to three dimensions. In addition to the advantage of visualizing elastic properties in three dimensions, this extension eliminates the plane strain assumption and is therefore closer to the true physical state. It does, however, incur greater computational costs. We address this challenge through a modified adjoint problem, spatially adaptive grid resolution, and three-dimensional decomposition techniques. Through these techniques the inverse problem is solved on a typical desktop machine within a wall clock time of 20 hours. We present the details of the method and quantitative elasticity images of phantoms and tissue samples.
Spatially resolved hazard and exposure assessments: an example of lead in soil at Lavrion, Greece.
Tristán, E; Demetriades, A; Ramsey, M H; Rosenbaum, M S; Stavrakis, P; Thornton, I; Vassiliades, E; Vergou, K
2000-01-01
Spatially resolved hazard assessment (SRHA) and spatially resolved exposure assessment (SREA) are methodologies that have been devised for assessing child exposure to soil containing environmental pollutants. These are based on either a quantitative or a semiquantitative approach. The feasibility of the methodologies has been demonstrated in a study assessing child exposure to Pb accessible in soil at the town of Lavrion in Greece. Using a quantitative approach, both measured and kriged concentrations of Pb in soil are compared with an "established" statutory threshold value. The probabilistic approach gives a refined classification of the contaminated land, since it takes into consideration the uncertainty in both the actual measurement and estimated kriged values. Two exposure assessment models (i.e., IEUBK and HESP) are used as the basis of the quantitative SREA methodologies. The significant correlation between the blood-Pb predictions, using the IEUBK model, and measured concentrations provides a partial validation of the method, because it allows for the uncertainty in the measurements and the lack of some site-specific measurements. The semiquantitative applications of SRHA and SREA incorporate both qualitative information (e.g., land use and dustiness of waste) and quantitative information (e.g., distance from wastes and distance from industry). The significant correlation between the results of these assessments and the measured blood-Pb levels confirms the robust nature of this approach. Successful application of these methodologies could reduce the cost of the assessment and allow areas to be prioritized for further investigation, remediation, or risk management.
Chiu, Grace S; Wu, Margaret A; Lu, Lin
2013-01-01
The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content-all regarded a priori as qualitatively important abiotic drivers-towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general.
Lai, Ying-Hui; Tsao, Yu; Lu, Xugang; Chen, Fei; Su, Yu-Ting; Chen, Kuang-Chao; Chen, Yu-Hsuan; Chen, Li-Ching; Po-Hung Li, Lieber; Lee, Chin-Hui
2018-01-20
We investigate the clinical effectiveness of a novel deep learning-based noise reduction (NR) approach under noisy conditions with challenging noise types at low signal to noise ratio (SNR) levels for Mandarin-speaking cochlear implant (CI) recipients. The deep learning-based NR approach used in this study consists of two modules: noise classifier (NC) and deep denoising autoencoder (DDAE), thus termed (NC + DDAE). In a series of comprehensive experiments, we conduct qualitative and quantitative analyses on the NC module and the overall NC + DDAE approach. Moreover, we evaluate the speech recognition performance of the NC + DDAE NR and classical single-microphone NR approaches for Mandarin-speaking CI recipients under different noisy conditions. The testing set contains Mandarin sentences corrupted by two types of maskers, two-talker babble noise, and a construction jackhammer noise, at 0 and 5 dB SNR levels. Two conventional NR techniques and the proposed deep learning-based approach are used to process the noisy utterances. We qualitatively compare the NR approaches by the amplitude envelope and spectrogram plots of the processed utterances. Quantitative objective measures include (1) normalized covariance measure to test the intelligibility of the utterances processed by each of the NR approaches; and (2) speech recognition tests conducted by nine Mandarin-speaking CI recipients. These nine CI recipients use their own clinical speech processors during testing. The experimental results of objective evaluation and listening test indicate that under challenging listening conditions, the proposed NC + DDAE NR approach yields higher intelligibility scores than the two compared classical NR techniques, under both matched and mismatched training-testing conditions. When compared to the two well-known conventional NR techniques under challenging listening condition, the proposed NC + DDAE NR approach has superior noise suppression capabilities and gives less distortion for the key speech envelope information, thus, improving speech recognition more effectively for Mandarin CI recipients. The results suggest that the proposed deep learning-based NR approach can potentially be integrated into existing CI signal processors to overcome the degradation of speech perception caused by noise.
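For orientation, a toy deep denoising autoencoder of the kind referred to above can be sketched in a few lines of Python/PyTorch, mapping noisy log-magnitude spectral frames to clean targets; the layer sizes, 257-bin frames, and random training pairs are assumptions, and the sketch omits the noise-classifier stage of the NC + DDAE system.

import torch
import torch.nn as nn

class DDAE(nn.Module):
    """Toy deep denoising autoencoder for single-frame spectral mapping."""
    def __init__(self, n_bins=257, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_bins, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_bins),
        )

    def forward(self, x):
        return self.net(x)

model = DDAE()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Placeholder training pairs: noisy input frames and their clean targets.
noisy = torch.randn(1024, 257)
clean = noisy - 0.5 * torch.randn(1024, 257)

for epoch in range(5):                       # a real system trains far longer
    optimizer.zero_grad()
    loss = loss_fn(model(noisy), clean)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")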
The Effect of Project Based Learning on Seventh Grade Students' Academic Achievement
ERIC Educational Resources Information Center
Kizkapan, Oktay; Bektas, Oktay
2017-01-01
The purpose of this study is to investigate whether there is a significant effect of project based learning approach on seventh grade students' academic achievement in the structure and properties of matter. In the study, according to the characteristics of quantitative research methods, pretest-posttest control group quasi-experimental design was…
School-Based Management: Arab Education System in Israel
ERIC Educational Resources Information Center
Arar, Khalid; Abu-Romi, Amal
2016-01-01
Purpose: The purpose of this paper is to investigate the issue of school-based management (SBM) in elementary schools in the Arab education system in Israel, comparing schools experienced in SBM, schools beginning to use SBM and schools that do not use SBM. Design/methodology/approach: A quantitative research used a structured questionnaire to…
Effectiveness of Inquiry-Based Learning in an Undergraduate Exercise Physiology Course
ERIC Educational Resources Information Center
Nybo, Lars; May, Michael
2015-01-01
The present study was conducted to investigate the effects of changing a laboratory physiology course for undergraduate students from a traditional step-by-step guided structure to an inquiry-based approach. With this aim in mind, quantitative and qualitative evaluations of learning outcomes (individual subject-specific tests and group interviews)…
Monitoring Urban Quality of Life: The Porto Experience
ERIC Educational Resources Information Center
Santos, Luis Delfim; Martins, Isabel
2007-01-01
This paper describes the monitoring system of the urban quality of life developed by the Porto City Council, a new tool being used to support urban planning and management. The two components of this system--a quantitative approach based on statistical indicators and a qualitative analysis based on the citizens' perceptions of the conditions of…
Testing process predictions of models of risky choice: a quantitative model comparison approach
Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard
2013-01-01
This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
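A simplified Python reading of the priority heuristic for two-outcome gain gambles is sketched below (minimum outcomes first, then probabilities of the minimum outcomes, then maximum outcomes, with aspiration levels of one tenth of the maximum outcome and 0.1 on the probability scale); treat it as an illustration rather than the authors' exact experimental implementation.

def priority_heuristic(gamble_a, gamble_b):
    """Choose between two gain gambles, each a list of (outcome, probability) pairs.
    Reasons are examined in a fixed order with aspiration-level stopping rules."""
    def features(g):
        outcomes = [o for o, _ in g]
        min_o = min(outcomes)
        p_min = next(p for o, p in g if o == min_o)
        return min_o, p_min, max(outcomes)

    min_a, pmin_a, max_a = features(gamble_a)
    min_b, pmin_b, max_b = features(gamble_b)
    aspiration = 0.1 * max(max_a, max_b)          # one tenth of the maximum outcome

    if abs(min_a - min_b) >= aspiration:          # reason 1: minimum outcomes
        return "A" if min_a > min_b else "B"
    if abs(pmin_a - pmin_b) >= 0.1:               # reason 2: probabilities of the minima
        return "A" if pmin_a < pmin_b else "B"
    return "A" if max_a > max_b else "B"          # reason 3: maximum outcomes

# Example: A = (4000 with p = .8, else 0) versus B = (3000 for sure).
print(priority_heuristic([(4000, 0.8), (0, 0.2)], [(3000, 1.0)]))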
NASA Astrophysics Data System (ADS)
Kalchenko, Vyacheslav; Molodij, Guillaume; Kuznetsov, Yuri; Smolyakov, Yuri; Israeli, David; Meglinski, Igor; Harmelin, Alon
2016-03-01
Fluorescence imaging of vascular permeability has become a gold standard for assessing the inflammation process during an experimental immune response in vivo. Optical fluorescence imaging provides a very useful and simple tool for this purpose. The motivation comes from the need for robust and simple quantification and data presentation of inflammation based on vascular permeability. The change of fluorescent intensity as a function of time is a widely accepted method to assess vascular permeability during inflammation related to the immune response. In the present study we propose to bring a new dimension by applying a more sophisticated quantitative analysis of the vascular reaction based on methods derived from astronomical observations, in particular a space-time Fourier filtering analysis followed by a polynomial orthogonal mode decomposition. We demonstrate that the temporal evolution of the fluorescent intensity observed at certain pixels correlates quantitatively with the blood flow circulation under normal conditions. The approach allows us to determine the regions of permeability and to monitor both the fast kinetics related to the distribution of the contrast material in the circulatory system and the slow kinetics associated with extravasation of the contrast material. Thus, we introduce a simple and convenient method for fast quantitative visualization of the leakage related to the inflammatory (immune) reaction in vivo.
Monakhova, Yulia B; Mushtakova, Svetlana P
2017-05-01
A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
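One possible reading of the calibration workflow, sketched in Python with scikit-learn, is shown below: FastICA resolves the mixture spectra into per-sample component contributions, and a linear regression maps those contributions to the known calibration concentrations, after which unknown spectra are quantified without reference solutions. The synthetic Gaussian bands, noise level, and two-component system are assumptions.

import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Synthetic calibration set: spectra of mixtures of two overlapping components.
wavelengths = np.linspace(0, 1, 300)
pure = np.vstack([np.exp(-((wavelengths - 0.45) / 0.08) ** 2),   # component 1
                  np.exp(-((wavelengths - 0.55) / 0.08) ** 2)])  # component 2 (overlapping)
conc = rng.uniform(0.1, 1.0, size=(20, 2))                        # known concentrations
spectra = conc @ pure + rng.normal(0, 0.002, (20, 300))           # Beer-Lambert mixing

# Step 1: ICA resolves the mixed spectra into independent component contributions.
ica = FastICA(n_components=2, random_state=0)
scores = ica.fit_transform(spectra)          # per-sample contribution of each component

# Step 2: regress known concentrations on the ICA scores (the calibration model).
model = LinearRegression().fit(scores, conc)

# Step 3: an "unknown" mixture is resolved with the same ICA model and quantified.
unknown = np.array([[0.3, 0.7]]) @ pure
print(model.predict(ica.transform(unknown)))  # should be close to [0.3, 0.7]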
Infrared thermography quantitative image processing
NASA Astrophysics Data System (ADS)
Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB
2017-11-01
Infrared thermography is an imaging technique that has the ability to provide a map of temperature distribution of an object’s surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body’s surface can contribute to the diagnosis and follow-up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first order statistical parameters describing temperature distribution. Analysis and comparison of these indices allow the temperature distribution pattern of the back trunk to be evaluated against that expected in subjects who are healthy with respect to spinal problems.
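The snippet below sketches, under assumed ROI coordinates and index definitions, how first-order statistical indices might be computed for two symmetric regions of a thermogram and combined into a simple left-right asymmetry measure; the synthetic temperature map and variable names are placeholders.

```python
# Illustrative sketch: first-order statistics for symmetric thermogram ROIs.
import numpy as np
from scipy import stats

thermogram = np.random.normal(31.0, 0.6, size=(240, 320))   # surface temperatures in deg C (synthetic)

left_roi = thermogram[100:180, 40:140]
right_roi = thermogram[100:180, 180:280]

def first_order_indices(roi):
    return {"mean": roi.mean(), "std": roi.std(),
            "skewness": stats.skew(roi.ravel()), "kurtosis": stats.kurtosis(roi.ravel())}

left, right = first_order_indices(left_roi), first_order_indices(right_roi)
asymmetry = {k: abs(left[k] - right[k]) for k in left}       # near zero for a symmetric pattern
print(asymmetry)
```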
NASA Astrophysics Data System (ADS)
Radun, Jenni E.; Virtanen, Toni; Olives, Jean-Luc; Vaahteranoksa, Mikko; Vuori, Tero; Nyman, Göte
2007-01-01
We present an effective method for comparing subjective audiovisual quality and the features related to the quality changes of different video cameras. Both quantitative estimation of overall quality and qualitative description of critical quality features are achieved by the method. The aim was to combine two image quality evaluation methods, the quantitative Absolute Category Rating (ACR) method with hidden reference removal and the qualitative Interpretation-Based Quality (IBQ) method, in order to see how they complement each other in audiovisual quality estimation tasks. Twenty-six observers estimated the audiovisual quality of six different cameras, mainly mobile phone video cameras. In order to achieve an efficient subjective estimation of audiovisual quality, only two contents with different quality requirements were recorded with each camera. The results show that the subjectively important quality features were more related to the overall estimations of cameras' visual video quality than to the features related to sound. The data demonstrated two significant quality dimensions related to visual quality: darkness and sharpness. We conclude that the qualitative methodology can complement quantitative quality estimations also with audiovisual material. The IBQ approach is especially valuable when the induced quality changes are multidimensional.
Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.
2016-01-01
Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933
Quantitative biomarkers of colonic dysplasia based on intrinsic second-harmonic generation signal
NASA Astrophysics Data System (ADS)
Zhuo, Shuangmu; Zhu, Xiaoqin; Wu, Guizhu; Chen, Jianxin; Xie, Shusen
2011-12-01
Most colorectal cancers arise from dysplastic lesions, such as adenomatous polyps, and these lesions are difficult to detect with current endoscopic screening approaches. Here, we present the use of an intrinsic second-harmonic generation (SHG) signal as a novel means to differentiate between normal and dysplastic human colonic tissues. We find that the SHG signal can quantitatively identify collagen changes associated with colonic dysplasia that are indiscernible by conventional pathologic techniques. Comparison of normal with dysplastic mucosa revealed significant differences in collagen density and collagen fiber direction, which have substantial potential to serve as quantitative intrinsic biomarkers for in vivo clinical diagnosis of colonic dysplasia.
Sadybekov, Arman; Krylov, Anna I.
2017-07-07
A theoretical approach for calculating core-level states in condensed phase is presented. The approach is based on equation-of-motion coupled-cluster theory (EOMCC) and effective fragment potential (EFP) method. By introducing an approximate treatment of double excitations in the EOM-CCSD (EOM-CC with single and double substitutions) ansatz, we address poor convergence issues that are encountered for the core-level states and significantly reduce computational costs. While the approximations introduce relatively large errors in the absolute values of transition energies, the errors are systematic. Consequently, chemical shifts, changes in ionization energies relative to reference systems, are reproduced reasonably well. By using different protonation forms of solvated glycine as a benchmark system, we show that our protocol is capable of reproducing the experimental chemical shifts with a quantitative accuracy. The results demonstrate that chemical shifts are very sensitive to the solvent interactions and that explicit treatment of solvent, such as EFP, is essential for achieving quantitative accuracy.
Czochralski crystal growth: Modeling study
NASA Technical Reports Server (NTRS)
Dudukovic, M. P.; Ramachandran, P. A.; Srivastava, R. K.; Dorsey, D.
1986-01-01
The modeling study of Czochralski (Cz) crystal growth is reported. The approach was to relate crystal quality to operating conditions and geometric variables in a quantitative manner, using models based on first principles. The finite element method is used for all calculations.
NMR-based Metabolomics for Cancer Research
Metabolomics is considered a complementary tool to other omics platforms to provide a snapshot of the cellular biochemistry and physiology taking place at any instant. Metabolomics approaches have been widely used to provide comprehensive and quantitative analyses of the metabo...
Genetic approaches in comparative and evolutionary physiology.
Storz, Jay F; Bridgham, Jamie T; Kelly, Scott A; Garland, Theodore
2015-08-01
Whole animal physiological performance is highly polygenic and highly plastic, and the same is generally true for the many subordinate traits that underlie performance capacities. Quantitative genetics, therefore, provides an appropriate framework for the analysis of physiological phenotypes and can be used to infer the microevolutionary processes that have shaped patterns of trait variation within and among species. In cases where specific genes are known to contribute to variation in physiological traits, analyses of intraspecific polymorphism and interspecific divergence can reveal molecular mechanisms of functional evolution and can provide insights into the possible adaptive significance of observed sequence changes. In this review, we explain how the tools and theory of quantitative genetics, population genetics, and molecular evolution can inform our understanding of mechanism and process in physiological evolution. For example, lab-based studies of polygenic inheritance can be integrated with field-based studies of trait variation and survivorship to measure selection in the wild, thereby providing direct insights into the adaptive significance of physiological variation. Analyses of quantitative genetic variation in selection experiments can be used to probe interrelationships among traits and the genetic basis of physiological trade-offs and constraints. We review approaches for characterizing the genetic architecture of physiological traits, including linkage mapping and association mapping, and systems approaches for dissecting intermediary steps in the chain of causation between genotype and phenotype. We also discuss the promise and limitations of population genomic approaches for inferring adaptation at specific loci. We end by highlighting the role of organismal physiology in the functional synthesis of evolutionary biology. Copyright © 2015 the American Physiological Society.
Towards quantitative classification of folded proteins in terms of elementary functions.
Hu, Shuangwei; Krokhotin, Andrei; Niemi, Antti J; Peng, Xubiao
2011-04-01
A comparative classification scheme provides a good basis for several approaches to understand proteins, including prediction of relations between their structure and biological function. But it remains a challenge to combine a classification scheme that describes a protein starting from its well-organized secondary structures and often involves direct human involvement, with an atomary-level physics-based approach where a protein is fundamentally nothing more than an ensemble of mutually interacting carbon, hydrogen, oxygen, and nitrogen atoms. In order to bridge these two complementary approaches to proteins, conceptually novel tools need to be introduced. Here we explain how an approach toward geometric characterization of entire folded proteins can be based on a single explicit elementary function that is familiar from nonlinear physical systems where it is known as the kink soliton. Our approach enables the conversion of hierarchical structural information into a quantitative form that allows for a folded protein to be characterized in terms of a small number of global parameters that are in principle computable from atomary-level considerations. As an example we describe in detail how the native fold of the myoglobin 1M6C emerges from a combination of kink solitons with a very high atomary-level accuracy. We also verify that our approach describes longer loops and loops connecting α helices with β strands, with the same overall accuracy. ©2011 American Physical Society
Abdulhay, Enas; Khnouf, Ruba; Haddad, Shireen; Al-Bashir, Areen
2017-08-04
Improvement of medical content in Biomedical Engineering curricula based on a qualitative assessment process or on a comparison with another high-standard program has been approached by a number of studies. However, the quantitative assessment tools have not been emphasized. The quantitative assessment tools can be more accurate and robust in cases of challenging multidisciplinary fields like that of Biomedical Engineering which includes biomedicine elements mixed with technology aspects. The major limitations of the previous research are the high dependence on surveys or pure qualitative approaches as well as the absence of strong focus on medical outcomes without implicit confusion with the technical ones. The proposed work presents the development and evaluation of an accurate/robust quantitative approach to the improvement of the medical content in the challenging multidisciplinary BME curriculum. The work presents quantitative assessment tools and subsequent improvement of curriculum medical content applied, as example for explanation, to the ABET (Accreditation Board for Engineering and Technology, USA) accredited biomedical engineering BME department at Jordan University of Science and Technology. The quantitative results of assessment of curriculum/course, capstone, exit exam, course assessment by student (CAS) as well as of surveys filled by alumni, seniors, employers and training supervisors were, first, mapped to the expected students' outcomes related to the medical field (SOsM). The collected data were then analyzed and discussed to find curriculum weakness points by tracking shortcomings in every outcome degree of achievement. Finally, actions were taken to fill in the gaps of the curriculum. Actions were also mapped to the students' medical outcomes (SOsM). Weighted averages of obtained quantitative values, mapped to SOsM, indicated accurately the achievement levels of all outcomes as well as the necessary improvements to be performed in curriculum. Mapping the improvements to SOsM also helps in the assessment of the following cycle. The suggested assessment tools can be generalized and extended to any other BME department. Robust improvement of medical content in BME curriculum can subsequently be achieved.
Chen, Ran; Zhang, Yuntao; Sahneh, Faryad Darabi; Scoglio, Caterina M; Wohlleben, Wendel; Haase, Andrea; Monteiro-Riviere, Nancy A; Riviere, Jim E
2014-09-23
Quantitative characterization of nanoparticle interactions with their surrounding environment is vital for safe nanotechnological development and standardization. A recent quantitative measure, the biological surface adsorption index (BSAI), has demonstrated promising applications in nanomaterial surface characterization and biological/environmental prediction. This paper further advances the approach beyond the application of five descriptors in the original BSAI to address the concentration dependence of the descriptors, enabling better prediction of the adsorption profile and more accurate categorization of nanomaterials based on their surface properties. Statistical analysis on the obtained adsorption data was performed based on three different models: the original BSAI, a concentration-dependent polynomial model, and an infinite dilution model. These advancements in BSAI modeling showed a promising development in the application of quantitative predictive modeling in biological applications, nanomedicine, and environmental safety assessment of nanomaterials.
Beidas, Rinad S; Wolk, Courtney L Benjamin; Walsh, Lucia M; Evans, Arthur C; Hurford, Matthew O; Barg, Frances K
2014-11-23
Organizational factors impact the delivery of mental health services in community settings. Mixed-methods analytic approaches have been recommended, though little research within implementation science has explicitly compared inductive and deductive perspectives to understand their relative value in understanding the same constructs. The purpose of our study is to use two different paradigmatic approaches to deepen our understanding of organizational social context. We accomplish this by using a mixed-methods approach in an investigation of organizational social context in community mental health clinics. Nineteen agencies, representing 23 sites, participated. Enrolled participants included 130 therapists, 36 supervisors, and 22 executive administrators. Quantitative data was obtained via the Organizational Social Context (OSC) measure. Qualitative data, comprised of direct observation with spot sampling generated from agency visits, was coded using content analysis and grounded theory. The present study examined elements of organizational social context that would have been missed if only quantitative data had been obtained and utilized mixed methods to investigate if stratifying observations based on quantitative ratings from the OSC resulted in the emergence of differential themes. Four of the six OSC constructs were commonly observed in field observations (i.e., proficiency, rigidity, functionality, stress), while the remaining two constructs were not frequently observed (i.e., resistance, engagement). Constructs emerged related to organizational social context that may have been missed if only quantitative measurement was employed, including those around the physical environment, commentary about evidence-based practice initiatives, leadership, cultural diversity, distrust, and affect. Stratifying agencies by "best," "average," and "worst" organizational social context impacted interpretation for three constructs (affect, stress, and leadership). Results support the additive value of integrating inductive and deductive perspectives in implementation science research. This synthesis of approaches facilitated a more comprehensive understanding and interpretation of the findings than would have been possible if either methodology had been employed in isolation.
Burns, Malcolm; Wiseman, Gordon; Knight, Angus; Bramley, Peter; Foster, Lucy; Rollinson, Sophie; Damant, Andrew; Primrose, Sandy
2016-01-07
Following a report on a significant amount of horse DNA being detected in a beef burger product on sale to the public at a UK supermarket in early 2013, the Elliott report was published in 2014 and contained a list of recommendations for helping ensure food integrity. One of the recommendations included improving laboratory testing capacity and capability to ensure a harmonised approach for testing for food authenticity. Molecular biologists have developed exquisitely sensitive methods based on the polymerase chain reaction (PCR) or mass spectrometry for detecting the presence of particular nucleic acid or peptide/protein sequences. These methods have been shown to be specific and sensitive in terms of lower limits of applicability, but they are largely qualitative in nature. Historically, the conversion of these qualitative techniques into reliable quantitative methods has been beset with problems even when used on relatively simple sample matrices. When the methods are applied to complex sample matrices, as found in many foods, the problems are magnified resulting in a high measurement uncertainty associated with the result which may mean that the assay is not fit for purpose. However, recent advances in the technology and the understanding of molecular biology approaches have further given rise to the re-assessment of these methods for their quantitative potential. This review focuses on important issues for consideration when validating a molecular biology assay and the various factors that can impact on the measurement uncertainty of a result associated with molecular biology approaches used in detection of food fraud, with a particular focus on quantitative PCR-based and proteomics assays.
Yuan, Yinyin; Failmezger, Henrik; Rueda, Oscar M; Ali, H Raza; Gräf, Stefan; Chin, Suet-Feung; Schwarz, Roland F; Curtis, Christina; Dunning, Mark J; Bardwell, Helen; Johnson, Nicola; Doyle, Sarah; Turashvili, Gulisa; Provenzano, Elena; Aparicio, Sam; Caldas, Carlos; Markowetz, Florian
2012-10-24
Solid tumors are heterogeneous tissues composed of a mixture of cancer and normal cells, which complicates the interpretation of their molecular profiles. Furthermore, tissue architecture is generally not reflected in molecular assays, rendering this rich information underused. To address these challenges, we developed a computational approach based on standard hematoxylin and eosin-stained tissue sections and demonstrated its power in a discovery and validation cohort of 323 and 241 breast tumors, respectively. To deconvolute cellular heterogeneity and detect subtle genomic aberrations, we introduced an algorithm based on tumor cellularity to increase the comparability of copy number profiles between samples. We next devised a predictor for survival in estrogen receptor-negative breast cancer that integrated both image-based and gene expression analyses and significantly outperformed classifiers that use single data types, such as microarray expression signatures. Image processing also allowed us to describe and validate an independent prognostic factor based on quantitative analysis of spatial patterns between stromal cells, which are not detectable by molecular assays. Our quantitative, image-based method could benefit any large-scale cancer study by refining and complementing molecular assays of tumor samples.
Dolled-Filhart, Marisa P; Gustavson, Mark D
2012-11-01
Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.
Approaches to developing alternative and predictive toxicology based on PBPK/PD and QSAR modeling.
Yang, R S; Thomas, R S; Gustafson, D L; Campain, J; Benjamin, S A; Verhaar, H J; Mumtaz, M M
1998-01-01
Systematic toxicity testing, using conventional toxicology methodologies, of single chemicals and chemical mixtures is highly impractical because of the immense numbers of chemicals and chemical mixtures involved and the limited scientific resources. Therefore, the development of unconventional, efficient, and predictive toxicology methods is imperative. Using carcinogenicity as an end point, we present approaches for developing predictive tools for toxicologic evaluation of chemicals and chemical mixtures relevant to environmental contamination. Central to the approaches presented is the integration of physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) and quantitative structure-activity relationship (QSAR) modeling with focused mechanistically based experimental toxicology. In this development, molecular and cellular biomarkers critical to the carcinogenesis process are evaluated quantitatively between different chemicals and/or chemical mixtures. Examples presented include the integration of PBPK/PD and QSAR modeling with a time-course medium-term liver foci assay, molecular biology and cell proliferation studies, Fourier transform infrared spectroscopic analyses of DNA changes, and cancer modeling to assess and attempt to predict the carcinogenicity of the series of 12 chlorobenzene isomers. Also presented is an ongoing effort to develop and apply a similar approach to chemical mixtures using in vitro cell culture (Syrian hamster embryo cell transformation assay and human keratinocytes) methodologies and in vivo studies. The promise and pitfalls of these developments are elaborated. When successfully applied, these approaches may greatly reduce animal usage, personnel, resources, and time required to evaluate the carcinogenicity of chemicals and chemical mixtures. PMID:9860897
NASA Astrophysics Data System (ADS)
Shaltout, Abdallah A.; Moharram, Mohammed A.; Mostafa, Nasser Y.
2012-01-01
This work is the first attempt to quantify trace elements in the Catha edulis plant (Khat) with a fundamental parameter approach. C. edulis is a famous drug plant in East Africa and the Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint and green tea were included in the present investigation, and trifolium leaves were included as an unrelated plant. The elemental analyses of the plants were done by Wavelength Dispersive X-Ray Fluorescence (WDXRF) spectroscopy. Standard-less quantitative WDXRF analysis was carried out based on the fundamental parameter approach. According to the standard-less analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on differential thermal analysis, was successfully used to determine the organic material. The results obtained with this approach were in good agreement with the commonly used methods. Using the developed method, quantitative analysis results for eighteen elements, including Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn, were obtained for each plant. The results for the certified reference material of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method.
A New Approach to Image Fusion Based on Cokriging
NASA Technical Reports Server (NTRS)
Memarsadeghi, Nargess; LeMoigne, Jacqueline; Mount, David M.; Morisette, Jeffrey T.
2005-01-01
We consider the image fusion problem involving remotely sensed data. We introduce cokriging as a method to perform fusion. We investigate the advantages of fusing Hyperion with ALI. The evaluation is performed by comparing the classification of the fused data with that of the input images and by calculating well-chosen quantitative fusion quality metrics. We consider the Invasive Species Forecasting System (ISFS) project as our fusion application. The fusion of ALI with Hyperion data is studied using PCA and wavelet-based fusion. We then propose utilizing a geostatistics-based interpolation method called cokriging as a new approach for image fusion.
Quantitative photoacoustic imaging in the acoustic regime using SPIM
NASA Astrophysics Data System (ADS)
Beigl, Alexander; Elbau, Peter; Sadiq, Kamran; Scherzer, Otmar
2018-05-01
While in standard photoacoustic imaging the propagation of sound waves is modeled by the standard wave equation, our approach is based on a generalized wave equation with variable sound speed and material density. In this paper we present an approach for photoacoustic imaging which, in addition to recovering the absorption density parameter, the imaging parameter of standard photoacoustics, also allows us to reconstruct the spatially varying sound speed and density of the medium. We provide analytical reconstruction formulas for all three parameters, based on a linearized model and on single plane illumination microscopy (SPIM) techniques.
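For reference, a common form of the variable-coefficient acoustic wave equation with spatially varying sound speed c(x) and density ρ(x) is shown below; this is an assumed standard form from acoustics, and the paper's exact generalization may differ.

```latex
\nabla \cdot \left( \frac{1}{\rho(x)} \nabla p(x,t) \right)
  \;-\; \frac{1}{\rho(x)\, c(x)^{2}} \, \frac{\partial^{2} p(x,t)}{\partial t^{2}} \;=\; 0
```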
Quantitative metrics for evaluating the phased roll-out of clinical information systems.
Wong, David; Wu, Nicolas; Watkinson, Peter
2017-09-01
We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We proposed a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we showed how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting order was evaluated empirically and found to be acceptable. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
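A toy sketch of the underlying idea, under strong simplifying assumptions: given a matrix of patient transfer counts between clinical areas, score every candidate roll-out order by the transfers that would cross from an area already on the new system into an area still on the old one, and pick the order with the smallest total. Area names and counts are invented for illustration, and a brute-force search over permutations is only feasible for a small number of areas.

```python
# Sketch: choose a phased roll-out order minimising new -> old patient flow.
from itertools import permutations

areas = ["ED", "AMU", "Ward A", "Ward B"]
transfers = [[0, 40, 10, 5],      # transfers[i][j] = patient moves from area i to area j (invented)
             [8, 0, 25, 12],
             [2, 3, 0, 20],
             [1, 2, 15, 0]]

def new_to_old_flow(order):
    """Total transfers flowing from already-live areas to not-yet-live areas, over all phases."""
    total = 0
    for phase in range(1, len(order)):
        live, pending = set(order[:phase]), set(order[phase:])
        total += sum(transfers[i][j] for i in live for j in pending)
    return total

best = min(permutations(range(len(areas))), key=new_to_old_flow)
print([areas[i] for i in best], new_to_old_flow(best))
```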
Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.
2016-01-01
The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451
NASA Astrophysics Data System (ADS)
Gou, Yabin; Ma, Yingzhao; Chen, Haonan; Wen, Yixin
2018-05-01
Quantitative precipitation estimation (QPE) is one of the important applications of weather radars. However, in complex terrain such as Tibetan Plateau, it is a challenging task to obtain an optimal Z-R relation due to the complex spatial and temporal variability in precipitation microphysics. This paper develops two radar QPE schemes respectively based on Reflectivity Threshold (RT) and Storm Cell Identification and Tracking (SCIT) algorithms using observations from 11 Doppler weather radars and 3264 rain gauges over the Eastern Tibetan Plateau (ETP). These two QPE methodologies are evaluated extensively using four precipitation events that are characterized by different meteorological features. Precipitation characteristics of independent storm cells associated with these four events, as well as the storm-scale differences, are investigated using short-term vertical profile of reflectivity (VPR) clusters. Evaluation results show that the SCIT-based rainfall approach performs better than the simple RT-based method for all precipitation events in terms of score comparison using validation gauge measurements as references. It is also found that the SCIT-based approach can effectively mitigate the local error of radar QPE and represent the precipitation spatiotemporal variability better than the RT-based scheme.
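For readers unfamiliar with Z-R conversion, the snippet below shows the generic power-law step that any such QPE scheme ultimately applies; the Marshall-Palmer coefficients used here (a = 200, b = 1.6) are a textbook default, not the relations fitted in this study.

```python
# Illustrative Z-R conversion: reflectivity (dBZ) to rain rate (mm/h) via Z = a * R**b.
import numpy as np

def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Convert reflectivity in dBZ to rain rate R in mm/h using Z = a * R**b."""
    z_linear = 10.0 ** (dbz / 10.0)          # Z in mm^6 m^-3
    return (z_linear / a) ** (1.0 / b)

print(rain_rate_from_dbz(np.array([20.0, 35.0, 50.0])))
```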
Nygate, Yoav N; Singh, Gyanendra; Barnea, Itay; Shaked, Natan T
2018-06-01
We present a new technique for obtaining simultaneous multimodal quantitative phase and fluorescence microscopy of biological cells, providing both quantitative phase imaging and molecular specificity using a single camera. Our system is based on an interferometric multiplexing module, externally positioned at the exit of an optical microscope. In contrast to previous approaches, the presented technique allows conventional fluorescence imaging, rather than interferometric off-axis fluorescence imaging. We demonstrate the presented technique for imaging fluorescent beads and live biological cells.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... quantitative approaches to determine the levels of taking that would result in a negligible impact to affected species or stocks of marine mammals. The quantitative approach is more appropriate for serious injury and... required a more quantitative approach for assessing what level of removals from a population stock of...
Parameters for Pesticide QSAR and PBPK/PD Models to inform Human Risk Assessments
Physiologically-based pharmacokinetic and pharmacodynamic (PBPK/PD) modeling has emerged as an important computational approach supporting quantitative risk assessment of agrochemicals. However, before complete regulatory acceptance of this tool, an assessment of assets and liabi...
Source-to-Outcome Microbial Exposure and Risk Modeling Framework
A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...
Manuscript 116 Mechanisms: DNA Reactive Aagents
ABSTRACT The U.S. Environmental Protection Agency’s Guidelines for Carcinogen Risk Assessment (2005) uses an analytical framework for conducting a quantitative cancer risk assessment that is based on mode of action/key events and human relevance. The approach stresses the enh...
New High Throughput Methods to Estimate Chemical Exposure
EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing an...
Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A
2014-12-01
Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
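One plausible way to compute such a doubling-time metric from a dilution series is sketched below; this formulation (the slope of time-to-positivity against log2 of input copies) is an assumption for illustration and may differ from the published IDT definition, and the numbers are invented.

```python
# Assumed formulation: estimate a doubling time from a standard dilution series
# by regressing time-to-positivity on log2(template copies).
import numpy as np

copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])                    # standard dilution series (invented)
time_to_positivity = np.array([8.1, 10.9, 13.6, 16.5, 19.2])    # minutes (invented)

slope, intercept = np.polyfit(np.log2(copies), time_to_positivity, 1)
idt_minutes = -slope                                            # extra time needed per halving of input
print(f"estimated doubling time: {idt_minutes:.2f} min per doubling")
```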
Constructivist-Informed Pedagogy in Teacher Education: An Overview of a Year-Long Study in Fiji.
ERIC Educational Resources Information Center
Taylor, Neil; Coll, Richard
2002-01-01
This year-long study exposed preservice elementary teachers in Fiji to pedagogy based on a constructivist view of learning in order to improve their content knowledge and provide them with greater confidence to teach science. Qualitative and quantitative analysis indicated that the constructivist-based teaching approach led to improved learning,…
Problem-Based Learning in Tertiary Education: Teaching Old "Dogs" New Tricks?
ERIC Educational Resources Information Center
Yeo, Roland K.
2005-01-01
Purpose--The paper sets out to explore the challenges of problem-based learning (PBL) in tertiary education and to propose a framework with implications for practice and learning. Design/Methodology/Approach--A total of 18 tertiary students divided into three groups participated in the focus group discussions. A quantitative instrument was used as…
The Effect of a Constructivist-Based Approach on Fifth Grade Reading Achievement
ERIC Educational Resources Information Center
Harkness, Lori M.
2016-01-01
The problem investigated in this quantitative study was that schools in a small, rural East Texas town were falling below acceptable ratings in reading on the Texas Assessment of Knowledge and Skills (TAKS) and the State of Texas Assessment of Academic Readiness (STAAR). Researchers have found that constructive-based learning environments (CBLEs)…
Culturally Based Math Education as a Way to Improve Alaska Native Students' Math Performance.
ERIC Educational Resources Information Center
Lipka, Jerry; Adams, Barbara
2004-01-01
Culturally based instruction has long been touted as a preferred approach to improving American Indian and Alaska Native (AI/AN) students' academic performance. However, there has been scant research to support this conjecture, particularly when quantitative data and quasi-experimental designs are included. The results of this…
Do Junior High School Students Perceive Their Learning Environment as Constructivist?
ERIC Educational Resources Information Center
Moustafa, Asely; Ben-Zvi-Assaraf, Orit; Eshach, Haim
2013-01-01
The purpose of this study is to examine the manner in which the features of a constructivist learning environment, and the mechanisms at its base, are expressed in junior high school students' conceptions. Our research is based on an integration of quantitative and qualitative approaches, designed to provide a wider ranging and deeper…
Examination of Test and Item Statistics from Visual and Verbal Mathematics Questions
ERIC Educational Resources Information Center
Alpayar, Cagla; Gulleroglu, H. Deniz
2017-01-01
The aim of this research is to determine whether students' test performance and approaches to test questions change based on the type of mathematics questions (visual or verbal) administered to them. This research is based on a mixed-design model. The quantitative data are gathered from 297 seventh grade students, attending seven different middle…
Müller, Norbert; Vonlaufen, Nathalie; Gianinazzi, Christian; Leib, Stephen L.; Hemphill, Andrew
2002-01-01
The previously described Nc5-specific PCR test for the diagnosis of Neospora caninum infections was used to develop a quantitative PCR assay which allows the determination of infection intensities within different experimental and diagnostic sample groups. The quantitative PCR was performed by using a dual fluorescent hybridization probe system and the LightCycler Instrument for online detection of amplified DNA. This assay was successfully applied for demonstrating the parasite proliferation kinetics in organotypic slice cultures of rat brain which were infected in vitro with N. caninum tachyzoites. This PCR-based method of parasite quantitation with organotypic brain tissue samples can be regarded as a novel ex vivo approach for exploring different aspects of cerebral N. caninum infection. PMID:11773124
Excimer-monomer switch: a reaction-based approach for selective detection of fluoride.
Song, Qiao; Bamesberger, Angela; Yang, Lingyun; Houtwed, Haley; Cao, Haishi
2014-07-21
An N-aryl-1,8-naphthalimide-based sensor (ES-1) bearing a trimethylsilyl ether has been synthesized by a two-step reaction for quantitative detection of fluoride (F(-)). ES-1 exhibited monomer/excimer emissions at 410 and 524 nm, respectively, in CH2Cl2. In the presence of F(-), desilylation of the trimethylsilyl ether caused decay of the excimer emission as well as enhancement of the monomer emission, giving a ratiometric signal. The fluoride-triggered desilylation showed a high reaction rate and high affinity for F(-) over nine other interfering anions. ES-1 provided a novel fluorescence assay based on the excimer-monomer switch of N-aryl-1,8-naphthalimide to quantitatively measure F(-) with a detection limit of 0.133 ppm.
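As an illustration of how a ratiometric read-out translates into a detection limit, the sketch below fits a linear calibration of the monomer/excimer intensity ratio against fluoride concentration and applies the common 3σ/slope criterion; the calibration values and blank noise are invented, not the paper's data.

```python
# Sketch of a ratiometric calibration and 3*sigma/slope detection limit (synthetic data).
import numpy as np

fluoride_ppm = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
ratio = np.array([0.78, 1.05, 1.31, 1.86, 2.92])      # I_410 / I_524, illustrative values

slope, intercept = np.polyfit(fluoride_ppm, ratio, 1)
sigma_blank = 0.012                                    # std. dev. of repeated blank ratios (assumed)
detection_limit = 3 * sigma_blank / slope
print(f"LOD = {detection_limit:.3f} ppm")
```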
NASA Astrophysics Data System (ADS)
Olson, Jonathan D.; Kanick, Stephen C.; Bravo, Jaime J.; Roberts, David W.; Paulsen, Keith D.
2016-03-01
Aminolevulinic acid-induced protoporphyrin IX (ALA-PpIX) is being investigated as a biomarker to guide neurosurgical resection of brain tumors. ALA-PpIX fluorescence can be observed visually in the surgical field; however, raw fluorescence emissions can be distorted by factors other than the fluorophore concentration. Specifically, fluorescence emissions are mixed with autofluorescence and attenuated by background absorption and scattering properties of the tissue. Recent work at Dartmouth has developed advanced fluorescence detection approaches that return quantitative assessments of PpIX concentration, which are independent of background optical properties. The quantitative fluorescence imaging (qFI) approach has increased sensitivity to residual disease within the resection cavity at the end of surgery that was not visible to the naked eye through the operating microscope. This presentation outlines clinical observations made during an ongoing investigation of ALA-PpIX-based guidance of tumor resection. PpIX fluorescence measurements made in a wide-field hyperspectral imaging approach are co-registered with point assessment using a fiber optic probe. Data show variations in the measured PpIX accumulation among different clinical tumor grades (i.e. high-grade glioma, low-grade glioma), types (i.e. primary tumors, metastases) and normal structures of interest (e.g. normal cortex, hippocampus). These results highlight the contrast enhancement and underscore the potential clinical benefit offered by quantitative measurements of PpIX concentration during resection of intracranial tumors.
Confidence estimation for quantitative photoacoustic imaging
NASA Astrophysics Data System (ADS)
Gröhl, Janek; Kirchner, Thomas; Maier-Hein, Lena
2018-02-01
Quantification of photoacoustic (PA) images is one of the major challenges currently being addressed in PA research. Tissue properties can be quantified by correcting the recorded PA signal with an estimation of the corresponding fluence. Fluence estimation itself, however, is an ill-posed inverse problem which usually needs simplifying assumptions to be solved with state-of-the-art methods. These simplifications, as well as noise and artifacts in PA images reduce the accuracy of quantitative PA imaging (PAI). This reduction in accuracy is often localized to image regions where the assumptions do not hold true. This impedes the reconstruction of functional parameters when averaging over entire regions of interest (ROI). Averaging over a subset of voxels with a high accuracy would lead to an improved estimation of such parameters. To achieve this, we propose a novel approach to the local estimation of confidence in quantitative reconstructions of PA images. It makes use of conditional probability densities to estimate confidence intervals alongside the actual quantification. It encapsulates an estimation of the errors introduced by fluence estimation as well as signal noise. We validate the approach using Monte Carlo generated data in combination with a recently introduced machine learning-based approach to quantitative PAI. Our experiments show at least a two-fold improvement in quantification accuracy when evaluating on voxels with high confidence instead of thresholding signal intensity.
The use of mixed methods in studying a chronic illness.
Jason, Leonard A; Reed, Jordan
This article explores mixed methods approaches with an illness called Myalgic Encephalomyelitis (ME) and Chronic Fatigue Syndrome (CFS). Qualitative and quantitative data were used to investigate the epidemiology of this illness, as well as to explore attributions based on the name of the illness, and finally treatment approaches. In each of the domains within the ME and CFS research reviewed, our analyses were richer and our findings ultimately more impactful when we integrated qualitative and quantitative research methods. The use of a multiphase mixed methods research program provided our team unique vantage points for better understanding social and community issues involving this controversial chronic illness. Further, this approach allowed us to implement the insights gained through an advocacy lens to change policy, recommend and evaluate treatments, and amplify voices within the patient population. In this way, we believe that the practice of methodological pluralism is especially applicable and effective to the study of chronic illness, and believe other investigators will benefit from the use of these approaches with similar disenfranchised and unfairly treated populations.
Kernel-based whole-genome prediction of complex traits: a review.
Morota, Gota; Gianola, Daniel
2014-01-01
Prediction of genetic values has been a focus of applied quantitative genetics since the beginning of the 20th century, with renewed interest following the advent of the era of whole genome-enabled prediction. Opportunities offered by the emergence of high-dimensional genomic data fueled by post-Sanger sequencing technologies, especially molecular markers, have driven researchers to extend Ronald Fisher and Sewall Wright's models to confront new challenges. In particular, kernel methods are gaining consideration as a regression method of choice for genome-enabled prediction. Complex traits are presumably influenced by many genomic regions working in concert with others (clearly so when considering pathways), thus generating interactions. Motivated by this view, a growing number of statistical approaches based on kernels attempt to capture non-additive effects, either parametrically or non-parametrically. This review centers on whole-genome regression using kernel methods applied to a wide range of quantitative traits of agricultural importance in animals and plants. We discuss various kernel-based approaches tailored to capturing total genetic variation, with the aim of arriving at an enhanced predictive performance in the light of available genome annotation information. Connections between prediction machines born in animal breeding, statistics, and machine learning are revisited, and their empirical prediction performance is discussed. Overall, while some encouraging results have been obtained with non-parametric kernels, recovering non-additive genetic variation in a validation dataset remains a challenge in quantitative genetics.
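A minimal sketch of one such kernel method, Gaussian-kernel ridge regression on marker genotypes (an RKHS-style regression), is given below; the simulated genotypes, bandwidth, and regularisation strength are placeholders rather than values from the review.

```python
# Sketch of kernel-based whole-genome prediction with Gaussian-kernel ridge regression.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
genotypes = rng.integers(0, 3, size=(500, 2000)).astype(float)     # 0/1/2 marker codes (simulated)
phenotypes = genotypes[:, :50] @ rng.normal(size=50) + rng.normal(size=500)

X_train, X_test, y_train, y_test = train_test_split(genotypes, phenotypes, random_state=0)

model = KernelRidge(kernel="rbf", alpha=1.0, gamma=1.0 / genotypes.shape[1])
model.fit(X_train, y_train)

predicted = model.predict(X_test)
accuracy = np.corrcoef(predicted, y_test)[0, 1]     # predictive correlation in the validation set
print(f"predictive correlation: {accuracy:.2f}")
```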
NASA Astrophysics Data System (ADS)
Ragno, Rino; Ballante, Flavio; Pirolli, Adele; Wickersham, Richard B.; Patsilinakos, Alexandros; Hesse, Stéphanie; Perspicace, Enrico; Kirsch, Gilbert
2015-08-01
Vascular endothelial growth factor receptor-2 (VEGFR-2) is a key element in angiogenesis, the process by which new blood vessels are formed, and is thus an important pharmaceutical target. Here, 3-D quantitative structure-activity relationship (3-D QSAR) models were used to build a quantitative screening and pharmacophore model of the VEGFR-2 receptor for the design of inhibitors with improved activities. Most of the available experimental data were used as a training set to derive eight optimized and fully cross-validated mono-probe models and one multi-probe quantitative model. Notable is the use of 262 molecules, aligned following both structure-based and ligand-based protocols, as an external test set, confirming the 3-D QSAR models' predictive capability and their usefulness in designing new VEGFR-2 inhibitors. From a survey of the literature, this is the first wide-ranging computational medicinal chemistry application on VEGFR-2 inhibitors.
Safavi-Abbasi, Sam; de Oliveira, Jean G; Deshmukh, Pushpa; Reis, Cassius V; Brasiliense, Leonardo B C; Crawford, Neil R; Feiz-Erfan, Iman; Spetzler, Robert F; Preul, Mark C
2010-03-01
The aim of this study was to describe quantitatively the properties of the posterolateral approaches and their combination. Six silicone-injected cadaveric heads were dissected bilaterally. Quantitative data were generated with the Optotrak 3020 system (Northern Digital, Waterloo, Canada) and Surgiscope (Elekta Instruments, Inc., Atlanta, GA), including key anatomic points on the skull base and brainstem. All parameters were measured after the basic retrosigmoid craniectomy and then after combination with a basic far-lateral extension. The clinical results of 20 patients who underwent a combined retrosigmoid and far-lateral approach were reviewed. The change in accessibility to the lower clivus was greatest after the far-lateral extension (mean change, 43.62 +/- 10.98 mm2; P = .001). Accessibility to the constant landmarks, Meckel's cave, internal auditory meatus, and jugular foramen did not change significantly between the 2 approaches (P > .05). The greatest change in accessibility to soft tissue between the 2 approaches was to the lower brainstem (mean change, 33.88 +/- 5.25 mm2; P = .0001). Total removal was achieved in 75% of the cases. The average postoperative Glasgow Outcome Scale score of patients who underwent the combined retrosigmoid and far-lateral approach improved significantly, compared with the preoperative scores. The combination of the far-lateral and simple retrosigmoid approaches significantly increases the petroclival working area and access to the cranial nerves. However, risk of injury to neurovascular structures and time needed to extend the craniotomy must be weighed against the increased working area and angles of attack.
A novel approach for evaluating the risk of health care failure modes.
Chang, Dong Shang; Chung, Jenq Hann; Sun, Kuo Lung; Yang, Fu Chiang
2012-12-01
Failure mode and effects analysis (FMEA) can be employed to reduce medical errors by ranking the risk of health care failure modes and prioritizing actions for safety improvement. The purpose of this paper is to propose a novel approach to data analysis that integrates FMEA with a mathematical tool, data envelopment analysis (DEA) with the "slack-based measure" (SBM). The risk indexes of FMEA (severity, occurrence, and detection) are viewed as multiple inputs of DEA. The practicality and usefulness of the proposed approach are illustrated with one health care case. As a systematic approach for improving the service quality of health care, the method offers quantitative corrective information on the risk indexes that can reduce the possibility of failure; for safety improvement, these new targets for the risk indexes could be used for management by objectives. FMEA alone cannot provide such quantitative corrective information, and the proposed approach overcomes this chief shortcoming. By combining the DEA SBM model with FMEA, the two goals of increased patient safety and reduced medical cost can be achieved together.
Filipce, Venko; Ammirati, Mario
2015-01-01
Objective: Basilar aneurysms are one of the most complex and challenging pathologies for neurosurgeons to treat. Endoscopy is a recently rediscovered neurosurgical technique that could lend itself well to overcoming some of the vascular visualization challenges associated with this pathology. The purpose of this study was to quantify and compare the basilar artery (BA) bifurcation (tip of the basilar) working area afforded by the microscope and the endoscope using different approaches and image guidance. Materials and Methods: We performed a total of 9 dissections, including pterional (PT) and orbitozygomatic (OZ) approaches bilaterally, in five whole, fresh cadaver heads. We used computed tomography-based image guidance for intraoperative navigation as well as for quantitative measurements. We estimated the working area of the tip of the basilar using both a rigid endoscope and an operating microscope. Operability was qualitatively assessed by the senior authors. Results: In microscopic exposure, the OZ approach provided a greater working area (160 ± 34.3 mm2) than the PT approach (129.8 ± 37.6 mm2) (P > 0.05). The working area in both PT and OZ approaches using 0° and 30° endoscopes was larger than that available using the microscope alone (P < 0.05). In the PT approach, both 0° and 30° endoscopes provided a working area greater than that of a microscopic OZ approach (P < 0.05) and an area comparable to the OZ endoscopic approach (P > 0.05). Conclusion: Integration of the endoscope and microscope in both PT and OZ approaches can provide significantly greater surgical exposure of the BA bifurcation compared to that afforded by the conventional approaches alone. PMID:25972933
Dulin-Keita, Akilah; Clay, Olivio; Whittaker, Shannon; Hannon, Lonnie; Adams, Ingrid K; Rogers, Michelle; Gans, Kim
2015-08-01
This study uses a mixed methods approach to 1) identify surrounding residents' perceived expectations for Housing Opportunities for People Everywhere (HOPE VI) policy on physical activity outcomes and 2) quantitatively examine the odds of neighborhood-based physical activity pre-/post-HOPE VI in a low socioeconomic status, predominantly African American community in Birmingham, Alabama. To address aim one, we used group concept mapping, a structured approach to data collection and analysis that produces pictures/maps of ideas. Fifty-eight residents developed statements about potential influences of HOPE VI on neighborhood-based physical activity. In the quantitative study, we examined whether these potential influences increased the odds of neighborhood walking/jogging. We computed block entry logistic regression models with a larger cohort of residents at baseline (n = 184) and six months (n = 142, 77% retention; n = 120 for all informative variables). We examined perceived neighborhood disorder (perceived neighborhood disorder scale), walkability and aesthetics (Neighborhood Environment Walkability Scale), and HOPE VI-related community safety and safety for physical activity as predictors. During concept mapping, residents generated statements that clustered into three distinct concepts: "Increased Leisure Physical Activity," "Safe Play Areas," and "Generating Health Promoting Resources." The quantitative analyses indicated that changes in neighborhood walkability increased the odds of neighborhood-based physical activity (p = 0.04). When HOPE VI-related safety for physical activity was entered into the model, it was associated with increased odds of physical activity (p = 0.04), and walkability was no longer statistically significant. These results suggest that housing policies that create walkable neighborhoods and that improve perceptions of safety for physical activity may increase neighborhood-based physical activity. However, the longer-term impacts of neighborhood-level policies on physical activity require more longitudinal evidence to determine whether increased participation in physical activity is sustained. Copyright © 2015 Elsevier Ltd. All rights reserved.
Liao, Wei; Rohr, Karl; Kang, Chang-Ki; Cho, Zang-Hee; Worz, Stefan
2016-01-01
We propose a novel hybrid approach for automatic 3D segmentation and quantification of high-resolution 7 Tesla magnetic resonance angiography (MRA) images of the human cerebral vasculature. Our approach consists of two main steps. First, a 3D model-based approach is used to segment and quantify thick vessels and most parts of thin vessels. Second, vessel gaps remaining after the first step in low-contrast and noisy regions are completed using a 3D minimal path approach that exploits directional information. We present two novel minimal path approaches. The first is an explicit approach based on energy minimization using probabilistic sampling, and the second is an implicit approach based on fast marching with an anisotropic directional prior. We conducted an extensive evaluation with over 2300 3D synthetic images and 40 real 3D 7 Tesla MRA images. Quantitative and qualitative evaluation shows that our approach achieves superior results compared with a previous minimal path approach. Furthermore, our approach was successfully used in two clinical studies on stroke and vascular dementia.
Linking agent-based models and stochastic models of financial markets
Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene
2012-01-01
It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086
Isotropic differential phase contrast microscopy for quantitative phase bio-imaging.
Chen, Hsi-Hsun; Lin, Yu-Zi; Luo, Yuan
2018-05-16
Quantitative phase imaging (QPI) has been investigated to retrieve optical phase information of an object and applied to biological microscopy and related medical studies. In recent examples, differential phase contrast (DPC) microscopy can recover the phase image of a thin sample from multi-axis intensity measurements in a wide-field scheme. Unlike conventional DPC, we propose a new method, based on a theoretical treatment of the partially coherent condition, to achieve isotropic differential phase contrast (iDPC) with high accuracy and stability for phase recovery in a simple and high-speed fashion. The iDPC is simply implemented with a partially coherent microscope and a programmable thin-film transistor (TFT) shield to digitally modulate structured illumination patterns for QPI. In this article, simulation results show the consistency of our theoretical approach for iDPC under partial coherence. In addition, we further demonstrate experimental quantitative phase images of a standard micro-lens array, as well as of label-free live human cell samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
De Medeiros, R C G; Soares, J D; De Sousa, F B
2012-05-01
Lesion area measurement of enamel caries using polarized light microscopy (PLM) is currently performed in a large number of studies, but measurements are based mainly on a misleading qualitative interpretation of enamel birefringence in a single immersion medium. Here, five natural enamel caries lesions are analysed by microradiography and PLM, and the differences in their histopathological features derived from a qualitative versus a quantitative interpretation of enamel birefringence are described. Enamel birefringence in different immersion media (air, water and quinoline) is interpreted by both qualitative and quantitative approaches, the former leading to an underestimation of the depth of enamel caries, mainly when sound enamel is validated as a negatively birefringent area under immersion in water (a common current practice in dental research). Procedures to avoid the shortcomings of a qualitative interpretation of enamel birefringence are presented and discussed. © 2012 The Authors Journal of Microscopy © 2012 Royal Microscopical Society.
Medina-Franco, José L.; Edwards, Bruce S.; Pinilla, Clemencia; Appel, Jon R.; Giulianotti, Marc A.; Santos, Radleigh G.; Yongye, Austin B.; Sklar, Larry A.; Houghten, Richard A.
2013-01-01
We present a general approach to describe the structure-activity relationships (SAR) of combinatorial data sets with activity for two biological endpoints with emphasis on the rapid identification of substitutions that have a large impact on activity and selectivity. The approach uses Dual-Activity Difference (DAD) maps that represent a visual and quantitative analysis of all pairwise comparisons of one, two, or more substitutions around a molecular template. Scanning the SAR of data sets using DAD maps allows the visual and quantitative identification of activity switches defined as specific substitutions that have an opposite effect on the activity of the compounds against two targets. The approach also rapidly identifies single- and double-target R-cliffs, i.e., compounds where a single or double substitution around the central scaffold dramatically modifies the activity for one or two targets, respectively. The approach introduced in this report can be applied to any analogue series with two biological activity endpoints. To illustrate the approach, we discuss the SAR of 106 pyrrolidine bis-diketopiperazines tested against two formylpeptide receptors obtained from positional scanning deconvolution methods of mixture-based libraries. PMID:23705689
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-05-01
Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives, as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate the viability of the approach as an effective engineering tool for analysis and optimization.
Quantitative Precipitation Nowcasting: A Lagrangian Pixel-Based Approach
2012-01-01
Comparison of EPA Method 1615 RT-qPCR Assays in Standard and Kit Format
EPA Method 1615 contains protocols for measuring enterovirus and norovirus by reverse transcription quantitative polymerase chain reaction. A commercial kit based upon these protocols was designed and compared to the method's standard approach. Reagent grade, secondary effluent, ...
Diagnosing Prion Diseases: Mass Spectrometry-Based Approaches
USDA-ARS?s Scientific Manuscript database
Mass spectrometry is an established means of quantitating the prions present in infected hamsters. Calibration curves relating the area ratios of the selected analyte peptides and their oxidized analogs to stable isotope labeled internal standards were prepared. The limit of detection (LOD) and limi...
Clinical Pedodontics: An Approach Based on Comprehensive Care.
ERIC Educational Resources Information Center
Bennett, Carroll G.; And Others
1981-01-01
The University of Florida uses a comprehensive care system to teach clinical pedodontics. Several block clinics permit further experience with children. Details of the program are described, and quantitative results of patient treatment are compared with those of other clinical pedodontics programs. (MSE)
Monte Carlo simulations within avalanche rescue
NASA Astrophysics Data System (ADS)
Reiweger, Ingrid; Genswein, Manuel; Schweizer, Jürg
2016-04-01
Refining concepts for avalanche rescue involves calculating suitable settings for rescue strategies such as an adequate probing depth for probe line searches or an optimal time for performing resuscitation for a recovered avalanche victim in case of additional burials. In the latter case, treatment decisions have to be made in the context of triage. However, given the low number of incidents it is rarely possible to derive quantitative criteria based on historical statistics in the context of evidence-based medicine. For these rare, but complex rescue scenarios, most of the associated concepts, theories, and processes involve a number of unknown "random" parameters which have to be estimated in order to calculate anything quantitatively. An obvious approach for incorporating a number of random variables and their distributions into a calculation is to perform a Monte Carlo (MC) simulation. We here present Monte Carlo simulations for calculating the most suitable probing depth for probe line searches depending on search area and an optimal resuscitation time in case of multiple avalanche burials. The MC approach reveals, e.g., new optimized values for the duration of resuscitation that differ from previous, mainly case-based assumptions.
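A minimal Monte Carlo sketch of the probing-depth question is given below, assuming a hypothetical lognormal burial-depth distribution (the study's actual input distributions and cost functions are not reproduced); it simply estimates, for several candidate probe depths, the fraction of simulated victims that a probe of that length could reach.

```python
# Hedged sketch: hypothetical burial-depth distribution; illustrates the Monte Carlo idea only.
import numpy as np

rng = np.random.default_rng(0)
burial_depth_m = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=100_000)  # assumed distribution

for probe_depth_m in (1.0, 1.5, 2.0, 2.5, 3.0):
    p_reachable = np.mean(burial_depth_m <= probe_depth_m)
    print(f"probe depth {probe_depth_m:.1f} m -> reachable fraction of victims: {p_reachable:.2f}")
```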
Quantitative elemental imaging of heterogeneous catalysts using laser-induced breakdown spectroscopy
NASA Astrophysics Data System (ADS)
Trichard, F.; Sorbier, L.; Moncayo, S.; Blouët, Y.; Lienemann, C.-P.; Motto-Ros, V.
2017-07-01
Currently, the use of catalysis is widespread in almost all industrial processes; it improves productivity, synthesis yields and waste treatment, and decreases energy costs. Increasingly stringent requirements, in terms of reaction selectivity and environmental standards, demand progressively greater accuracy and control of operations. Meanwhile, the development of characterization techniques has been challenging, and the techniques often require highly complex equipment. In this paper, we demonstrate a novel elemental approach for performing quantitative space-resolved analysis with ppm-scale quantification limits and μm-scale resolution. This approach, based on laser-induced breakdown spectroscopy (LIBS), is distinguished by its simplicity, all-optical design, and speed of operation. This work analyzes palladium-based porous alumina catalysts, which are commonly used in the selective hydrogenation process, using the LIBS method. We report an exhaustive study of the quantification capability of LIBS and its ability to perform imaging measurements over a large dynamic range, typically from a few ppm to wt%. These results offer new insight into the use of LIBS-based imaging in industry and pave the way for innumerable applications.
Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Borchers, Christoph H
2013-07-01
An emerging approach for multiplexed targeted proteomics involves bottom-up LC-MRM-MS, with stable isotope-labeled internal standard peptides, to accurately quantitate panels of putative disease biomarkers in biofluids. In this paper, we used this approach to quantitate 27 candidate cancer-biomarker proteins in human plasma that had not been treated by immunoaffinity depletion or enrichment techniques. These proteins have been reported as biomarkers for a variety of human cancers, from laryngeal to ovarian, with breast cancer having the highest correlation. We implemented measures to minimize the analytical variability, improve the quantitative accuracy, and increase the feasibility and applicability of this MRM-based method. We have demonstrated excellent retention time reproducibility (median interday CV: 0.08%) and signal stability (median interday CV: 4.5% for the analytical platform and 6.1% for the bottom-up workflow) for the 27 biomarker proteins (represented by 57 interference-free peptides). The linear dynamic range for the MRM assays spanned four orders of magnitude, with 25 assays covering a 10^3 to 10^4 range in protein concentration. The lowest abundance quantifiable protein in our biomarker panel was insulin-like growth factor 1 (calculated concentration: 127 ng/mL). Overall, the analytical performance of this assay demonstrates high robustness and sensitivity, and provides the necessary throughput and multiplexing capabilities required to verify and validate cancer-associated protein biomarker panels in human plasma, prior to clinical use. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
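For readers who want to reproduce the reported summary statistics on their own data, the snippet below shows the interday coefficient-of-variation calculation for a single peptide; the replicate peak areas are hypothetical stand-ins for one peptide's daily measurements.

```python
# Hedged sketch: hypothetical replicate peak areas for a single peptide across five days.
import numpy as np

interday_areas = np.array([1.02e6, 0.97e6, 1.05e6, 0.99e6, 1.01e6])
cv_percent = 100 * interday_areas.std(ddof=1) / interday_areas.mean()
print(f"interday CV = {cv_percent:.1f}%")
```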
Knight, Jo; North, Bernard V; Sham, Pak C; Curtis, David
2003-01-01
This paper presents a method of performing model-free LOD-score based linkage analysis on quantitative traits. It is implemented in the QMFLINK program. The method is used to perform a genome screen on the Framingham Heart Study data. A number of markers that show some support for linkage in our study coincide substantially with those implicated in other linkage studies of hypertension. Although the new method needs further testing on additional real and simulated data sets we can already say that it is straightforward to apply and may offer a useful complementary approach to previously available methods for the linkage analysis of quantitative traits. PMID:14975142
Leaf epidermis images for robust identification of plants
da Silva, Núbia Rosa; Oliveira, Marcos William da Silva; Filho, Humberto Antunes de Almeida; Pinheiro, Luiz Felipe Souza; Rossatto, Davi Rodrigo; Kolb, Rosana Marta; Bruno, Odemir Martinez
2016-01-01
This paper proposes a methodology for plant analysis and identification based on extracting texture features from microscopic images of the leaf epidermis. All the experiments were carried out using 32 plant species with 309 epidermal samples captured by an optical microscope coupled to a digital camera. The results of the computational methods using texture features were compared to the conventional approach, in which quantitative measurements of stomatal traits (density, length and width) were obtained manually. Epidermis image classification using texture achieved a success rate of over 96%, while the success rate was around 60% for quantitative measurements taken manually. Furthermore, we verified the robustness of our method with respect to the natural phenotypic plasticity of stomata by analysing samples from the same species grown in different environments. Texture methods remained robust even when phenotypic plasticity of stomatal traits was considered, with a decrease of 20% in the success rate, whereas the manual quantitative measurements proved to be fully sensitive to it, with a decrease of 77%. Results from the comparison between the computational approach and the conventional quantitative measurements show how computational systems are advantageous and promising in terms of solving problems related to Botany, such as species identification. PMID:27217018
Rule, Geoffrey S; Clark, Zlatuse D; Yue, Bingfang; Rockwood, Alan L
2013-04-16
Stable isotope-labeled internal standards are of great utility in providing accurate quantitation in mass spectrometry (MS). An implicit assumption has been that there is no "cross talk" between signals of the internal standard and the target analyte. In some cases, however, naturally occurring isotopes of the analyte do contribute to the signal of the internal standard. This phenomenon becomes more pronounced for isotopically rich compounds, such as those containing sulfur, chlorine, or bromine, higher molecular weight compounds, and those at high analyte/internal standard concentration ratio. This can create nonlinear calibration behavior that may bias quantitative results. Here, we propose the use of a nonlinear but more accurate fitting of data for these situations that incorporates one or two constants determined experimentally for each analyte/internal standard combination and an adjustable calibration parameter. This fitting provides more accurate quantitation in MS-based assays where contributions from analyte to stable labeled internal standard signal exist. It can also correct for the reverse situation where an analyte is present in the internal standard as an impurity. The practical utility of this approach is described, and by using experimental data, the approach is compared to alternative fits.
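One plausible way to realize such a fit is sketched below: a hyperbolic response model in which a fraction c of the analyte signal spills into the internal-standard channel, fitted by nonlinear least squares and inverted to back-calculate an unknown. The functional form, constants, and data are assumptions for illustration and are not the authors' exact calibration equation.

```python
# Hedged sketch: assumed cross-contribution model ratio = a*x / (1 + c*x); synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def response_ratio(x, a, c):
    # analyte/IS area ratio when the analyte contributes c*x to the IS channel
    return a * x / (1.0 + c * x)

rng = np.random.default_rng(1)
conc = np.array([0.5, 1, 2, 5, 10, 20, 50, 100.0])                   # hypothetical units
ratio = response_ratio(conc, a=0.08, c=0.004) * rng.normal(1, 0.02, conc.size)

(a_hat, c_hat), _ = curve_fit(response_ratio, conc, ratio, p0=(0.1, 0.001))

# Invert the fitted model to estimate an unknown sample from its measured ratio.
r_unknown = 0.9
x_unknown = r_unknown / (a_hat - c_hat * r_unknown)
print(f"a = {a_hat:.4f}, c = {c_hat:.5f}, estimated concentration = {x_unknown:.1f}")
```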
Quantification of EEG reactivity in comatose patients
Hermans, Mathilde C.; Westover, M. Brandon; van Putten, Michel J.A.M.; Hirsch, Lawrence J.; Gaspard, Nicolas
2016-01-01
Objective: EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. Methods: In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. Results: The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed the best accuracy (median AUC: 0.95) and was accompanied by substantial agreement with the individual opinion of experts (Gwet's AC1: 65–70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the experts' agreement regarding reactivity for each individual case. Conclusion: Automated quantitative EEG approaches based on a probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Significance: Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. PMID:26183757
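The core pre/post spectral comparison can be sketched in a few lines; the example below uses synthetic one-channel signals and a simple symmetric change index per band, whereas the study combined 13 parameters into probability values, so this is only an illustration of the band-power step.

```python
# Hedged sketch: synthetic 1-minute pre/post EEG segments at an assumed 250 Hz sampling rate.
import numpy as np
from scipy.signal import welch

fs = 250
rng = np.random.default_rng(2)
pre = rng.normal(size=60 * fs)           # segment before stimulation (noise stand-in)
post = 0.7 * rng.normal(size=60 * fs)    # segment after stimulation, reduced amplitude

def band_power(x, lo, hi):
    f, p = welch(x, fs=fs, nperseg=4 * fs)
    return p[(f >= lo) & (f < hi)].sum()

for name, (lo, hi) in {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}.items():
    bp_pre, bp_post = band_power(pre, lo, hi), band_power(post, lo, hi)
    change = (bp_post - bp_pre) / (bp_post + bp_pre)   # symmetric index in [-1, 1]
    print(f"{name:5s}: pre/post change index = {change:+.2f}")
```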
Levi Mortera, Stefano; Dioni, Ilaria; Greco, Viviana; Neri, Cristina; Rovero, Paolo; Urbani, Andrea
2014-05-01
Among the most common stable-isotope labeling strategies, the reaction of formaldehyde with peptides in the presence of NaCNBH₃ features many attractive aspects that are conducive to its employment in quantitation experiments in proteomics. Reductive amination with formaldehyde and d2-formaldehyde is reported to be a fast, easy, and specific reaction, and is undoubtedly inexpensive compared with commercially available kits for differential isotope coding. Acetaldehyde and d4-acetaldehyde could be employed as well without a substantial increase in cost, and should provide a wider spacing between the differentially tagged peptides in the mass spectrum. Nevertheless, only a single paper reports on a diethylation approach for quantitation. We undertook a systematic analytical investigation of the reductive amination of some standard peptides, pointing out the occasional occurrence of side reactions depending on pH or the order of addition of reagents, and particularly observing the formation of cyclic adducts ascribable to rearrangements involving the generated Schiff base and the nucleophilic sites of its chemical environment. We also evaluated the extent to which these side products may impair isotope-coded relative quantitation.
Measuring information transfer in a soft robotic arm.
Nakajima, K; Schmidt, N; Pfeifer, R
2015-05-13
Soft robots can exhibit diverse behaviors with simple types of actuation by partially outsourcing control to the morphological and material properties of their soft bodies, which is made possible by the tight coupling between control, body, and environment. In this paper, we present a method that will quantitatively characterize these diverse spatiotemporal dynamics of a soft body based on the information-theoretic approach. In particular, soft bodies have the ability to propagate the effect of actuation through the entire body, with a certain time delay, due to their elasticity. Our goal is to capture this delayed interaction in a quantitative manner based on a measure called momentary information transfer. We extend this measure to soft robotic applications and demonstrate its power using a physical soft robotic platform inspired by the octopus. Our approach is illustrated in two ways. First, we statistically characterize the delayed actuation propagation through the body as a strength of information transfer. Second, we capture this information propagation directly as local information dynamics. As a result, we show that our approach can successfully characterize the spatiotemporal dynamics of the soft robotic platform, explicitly visualizing how information transfers through the entire body with delays. Further extension scenarios of our approach are discussed for soft robotic applications in general.
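To convey the flavor of the delay-resolved analysis, the sketch below estimates delayed mutual information between a toy actuation signal and a delayed, noisy sensor echo using a coarse histogram estimator; the momentary information transfer measure used in the paper is related but not identical, and the signals and delay here are invented for illustration.

```python
# Hedged sketch: toy delayed-coupling data; histogram-based delayed mutual information.
import numpy as np

def mutual_information(x, y, bins=16):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

rng = np.random.default_rng(3)
n, true_delay = 5000, 7
actuation = rng.normal(size=n)
sensor = np.roll(actuation, true_delay) + 0.3 * rng.normal(size=n)   # delayed, noisy echo

mi = [mutual_information(actuation[:n - d], sensor[d:]) for d in range(15)]
print(f"estimated delay = {int(np.argmax(mi))} steps (true = {true_delay})")
```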
Quantitative Measures of Immersion in Cloud and the Biogeography of Cloud Forests
NASA Technical Reports Server (NTRS)
Lawton, R. O.; Nair, U. S.; Ray, D.; Regmi, A.; Pounds, J. A.; Welch, R. M.
2010-01-01
Sites described as tropical montane cloud forests differ greatly, in part because observers tend to differ in their opinion as to what constitutes frequent and prolonged immersion in cloud. This definitional difficulty interferes with hydrologic analyses, assessments of environmental impacts on ecosystems, and biogeographical analyses of cloud forest communities and species. Quantitative measurements of cloud immersion can be obtained on site, but the observations are necessarily spatially limited, although well-placed observers can examine 10-50 km of a mountain range under rainless conditions. Regional analyses, however, require observations at a broader scale. This chapter discusses remote sensing and modeling approaches that can provide quantitative measures of the spatiotemporal patterns of cloud cover and cloud immersion in tropical mountain ranges. These approaches integrate remote sensing tools of various spatial resolutions and frequencies of observation, digital elevation models, regional atmospheric models, and ground-based observations to provide measures of cloud cover, cloud base height, and the intersection of cloud and terrain. This combined approach was applied to the Monteverde region of northern Costa Rica to illustrate how the proportion of time the forest is immersed in cloud may vary spatially and temporally. The observed spatial variation was largely due to patterns of airflow over the mountains. The temporal variation reflected the diurnal rise and fall of the orographic cloud base, which was influenced in turn by synoptic weather conditions, the seasonal movement of the Intertropical Convergence Zone and the north-easterly trade winds. Knowledge of the proportion of the time that sites are immersed in clouds should facilitate ecological comparisons and biogeographical analyses, as well as land use planning and hydrologic assessments in areas where intensive on-site work is not feasible.
Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris
2011-09-01
Partial volume effects (PVEs) are consequences of the limited spatial resolution in emission tomography leading to underestimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multiresolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low-resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model, which may introduce artifacts in regions where no significant correlation exists between anatomical and functional details. A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artifacts and significantly improving quality of the corrected images and their quantitative accuracy. A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multiresolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information.
Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E.; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris
2011-01-01
Purpose Partial volume effects (PVE) are consequences of the limited spatial resolution in emission tomography leading to under-estimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multi-resolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model which may introduce artefacts in regions where no significant correlation exists between anatomical and functional details. Methods A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Results Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present the new model outperformed the 2D global approach, avoiding artefacts and significantly improving quality of the corrected images and their quantitative accuracy. Conclusions A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multi-resolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information. PMID:21978037
Genomic and pedigree-based prediction for leaf, stem, and stripe rust resistance in wheat.
Juliana, Philomin; Singh, Ravi P; Singh, Pawan K; Crossa, Jose; Huerta-Espino, Julio; Lan, Caixia; Bhavani, Sridhar; Rutkoski, Jessica E; Poland, Jesse A; Bergstrom, Gary C; Sorrells, Mark E
2017-07-01
Genomic prediction for seedling and adult plant resistance to wheat rusts was compared to prediction using a few markers as fixed effects in a least-squares approach and to pedigree-based prediction. The unceasing plant-pathogen arms race and the ephemeral nature of some rust resistance genes have been challenging for wheat (Triticum aestivum L.) breeding programs and farmers. Hence, it is important to devise strategies for effective evaluation and exploitation of quantitative rust resistance. One promising approach that could accelerate gain from selection for rust resistance is 'genomic selection', which utilizes dense genome-wide markers to estimate the breeding values (BVs) for quantitative traits. Our objective was to compare three genomic prediction models, namely genomic best linear unbiased prediction (GBLUP), GBLUP A (GBLUP with selected loci as fixed effects), and reproducing kernel Hilbert spaces-markers (RKHS-M), with a least-squares (LS) approach, RKHS-pedigree (RKHS-P), and RKHS markers and pedigree (RKHS-MP), to determine the BVs for seedling and/or adult plant resistance (APR) to leaf rust (LR), stem rust (SR), and stripe rust (YR). The 333 lines in the 45th IBWSN and the 313 lines in the 46th IBWSN were genotyped using genotyping-by-sequencing and phenotyped in replicated trials. The mean prediction accuracies ranged from 0.31 to 0.74 for LR seedling, 0.12 to 0.56 for LR APR, 0.31 to 0.65 for SR APR, 0.70 to 0.78 for YR seedling, and 0.34 to 0.71 for YR APR. For most datasets, the RKHS-MP model gave the highest accuracies, while LS gave the lowest. The GBLUP, GBLUP A, RKHS-M, and RKHS-P models gave similar accuracies. Using genome-wide marker-based models resulted in an average increase in accuracy of 42% over LS. We conclude that GS is a promising approach for the improvement of quantitative rust resistance and can be implemented in the breeding pipeline.
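A minimal GBLUP sketch is given below for orientation: simulated genotypes, a genomic relationship matrix, and kernel-ridge prediction of test-line breeding values with an assumed variance ratio in place of REML estimation. The marker counts, effect sizes, and variance ratio are invented, and the RKHS, pedigree, and fixed-marker variants compared in the study are not reproduced.

```python
# Hedged sketch: simulated markers/phenotypes; GBLUP with a fixed variance ratio (no REML).
import numpy as np

rng = np.random.default_rng(4)
n_lines, n_markers = 200, 1000
M = rng.binomial(2, 0.3, size=(n_lines, n_markers)).astype(float)   # 0/1/2 genotype calls
Z = M - M.mean(axis=0)                                              # centred marker matrix
G = Z @ Z.T / n_markers                                             # genomic relationship matrix

true_effects = rng.normal(0, 0.05, n_markers)
y = Z @ true_effects + rng.normal(0, 1.0, n_lines)                  # phenotype = genetic + noise

train, test = np.arange(150), np.arange(150, 200)
lam = 1.0                                                           # assumed sigma_e^2 / sigma_g^2
alpha = np.linalg.solve(G[np.ix_(train, train)] + lam * np.eye(train.size),
                        y[train] - y[train].mean())
gebv_test = G[np.ix_(test, train)] @ alpha                          # genomic estimated breeding values

accuracy = np.corrcoef(gebv_test, (Z @ true_effects)[test])[0, 1]
print(f"prediction accuracy in the test set: {accuracy:.2f}")
```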
Applying Mixed Methods Research at the Synthesis Level: An Overview
ERIC Educational Resources Information Center
Heyvaert, Mieke; Maes, Bea; Onghena, Patrick
2011-01-01
Historically, qualitative and quantitative approaches have been applied relatively separately in synthesizing qualitative and quantitative evidence, respectively, in several research domains. However, mixed methods approaches are becoming increasingly popular nowadays, and practices of combining qualitative and quantitative research components at…
Towards quantitative PET/MRI: a review of MR-based attenuation correction techniques.
Hofmann, Matthias; Pichler, Bernd; Schölkopf, Bernhard; Beyer, Thomas
2009-03-01
Positron emission tomography (PET) is a fully quantitative technology for imaging metabolic pathways and dynamic processes in vivo. Attenuation correction of raw PET data is a prerequisite for quantification and is typically based on separate transmission measurements. In PET/CT, however, attenuation correction is performed routinely based on the available CT transmission data. Recently, combined PET/magnetic resonance (MR) has been proposed as a viable alternative to PET/CT. Current concepts of PET/MRI do not include CT-like transmission sources and, therefore, alternative methods of PET attenuation correction must be found. This article reviews existing approaches to MR-based attenuation correction (MR-AC). Most groups have proposed MR-AC algorithms for brain PET studies and, more recently, also for torso PET/MR imaging. Most MR-AC strategies require the use of complementary MR and transmission images, or morphology templates generated from transmission images. We review and discuss these algorithms and point out challenges for using MR-AC in clinical routine. MR-AC is work in progress, with potentially promising results from a template-based approach applicable to both brain and torso imaging. While efforts are ongoing to make clinically viable MR-AC fully automatic, further studies are required to realize the potential benefits of MR-based motion compensation and partial volume correction of the PET data.
Quantitative Susceptibility Mapping of the Midbrain in Parkinson’s Disease
Du, Guangwei; Liu, Tian; Lewis, Mechelle M.; Kong, Lan; Wang, Yi; Connor, James; Mailman, Richard B.; Huang, Xuemei
2017-01-01
Background: Parkinson's disease (PD) is marked pathologically by dopamine neuron loss and iron overload in the substantia nigra pars compacta. Midbrain iron content is reported to be increased in PD based on magnetic resonance imaging (MRI) R2* changes. Because quantitative susceptibility mapping is a novel MRI approach to measure iron content, we compared it with R2* for assessing midbrain changes in PD. Methods: Quantitative susceptibility mapping and R2* maps were obtained from 47 PD patients and 47 healthy controls. Midbrain susceptibility and R2* values were analyzed by using both voxel-based and region-of-interest approaches in normalized space, and analyzed along with clinical data, including disease duration, Unified Parkinson's Disease Rating Scale (UPDRS) I, II, and III sub-scores, and levodopa-equivalent daily dosage. All studies were done while PD patients were "on drug." Results: Compared with controls, PD patients showed significantly increased susceptibility values in both right (cluster size = 106 mm3) and left (164 mm3) midbrain, located ventrolateral to the red nucleus that corresponded to the substantia nigra pars compacta. Susceptibility values in this region were correlated significantly with disease duration, UPDRS II, and levodopa-equivalent daily dosage. Conversely, R2* was increased significantly only in a much smaller region (62 mm3) of the left lateral substantia nigra pars compacta and was not significantly correlated with clinical parameters. Conclusion: The use of quantitative susceptibility mapping demonstrated marked nigral changes that correlated with clinical PD status more sensitively than R2*. These data suggest that quantitative susceptibility mapping may be a superior imaging biomarker to R2* for estimating brain iron levels in PD. PMID:26362242
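As a simple numerical counterpart to the region-of-interest comparison, the sketch below contrasts mean ROI susceptibility between two simulated groups with a two-sample t-test; the group means, spreads, and sample sizes are hypothetical, and the voxel-based analysis and clinical correlations are not reproduced.

```python
# Hedged sketch: simulated mean ROI susceptibility values (ppm) for patients vs controls.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(5)
roi_controls = rng.normal(0.10, 0.03, 47)
roi_patients = rng.normal(0.14, 0.04, 47)

t_stat, p_value = ttest_ind(roi_patients, roi_controls)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```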
Chiu, Weihsueh A; Euling, Susan Y; Scott, Cheryl Siegel; Subramaniam, Ravi P
2013-09-15
The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA)--i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on "augmentation" of weight of evidence--using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards "integration" of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for "expansion" of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual "reorientation" of QRA towards approaches that more directly link environmental exposures to human outcomes. Published by Elsevier Inc.
2004-09-01
The second edition of Creswell's book has been significantly revised and updated. The author clearly sets out three approaches to research: quantitative, qualitative and mixed methods. As someone who has used mixed methods in my research, it is refreshing to read a textbook that addresses this. The differences between the approaches are clearly identified and a rationale for using each methodological stance provided.
ERIC Educational Resources Information Center
Bredderman, Ted
A quantitative synthesis of research findings on the effects of three major activity-based elementary science programs developed with National Science Foundation support was conducted. Controlled evaluation studies of the Elementary Science Study (ESS), Science-A Process Approach (SAPA), or The Science Curriculum Improvement Study (SCIS) were used…
ERIC Educational Resources Information Center
Pilten, Pusat; Pilten, Gulhiz; Sahinkaya, Nihan
2017-01-01
The purpose of the present research is studying the effects of information and communication technologies (ICT) assisted project based learning practices on ICT integration skills of pre-service classroom teachers. The research adopted a mixed method. The quantitative dimension of the research was designed with pre-test-post-test control groups.…
Analysis of Open Education Service Quality with the Descriptive-Quantitative Approach
ERIC Educational Resources Information Center
Priyogi, Bilih; Santoso, Harry B.; Berliyanto; Hasibuan, Zainal A.
2017-01-01
The concept of Open Education (OE) is based on the philosophy of e-Learning which aims to provide learning environment anywhere, anytime, and for anyone. One of the main issue in the development of OE services is the availability of the quality assurance mechanism. This study proposes a metric for measuring the quality of OE service. Based on…
ERIC Educational Resources Information Center
El Seoud, Omar A.; Loffredo, Carina; Galgano, Paula D.; Sato, Bruno M.; Reichardt, Christian
2011-01-01
The substitution of petroleum-based fuels with those from renewable sources has gained momentum worldwide. A UV-vis experiment for the quantitative analysis of biofuels (bioethanol or biodiesel) in (petroleum-based) diesel oil has been developed. Before the experiment, students were given a quiz on biofuels, and then they were asked to suggest a…
ERIC Educational Resources Information Center
Fontaine, Jocelyn; Debus-Sherrill, Sara; Downey, P. Mitchell; Lowry, Samantha S.
2010-01-01
This report is based on research conducted by the Urban Institute's Justice Policy Center on the violence prevention activities taking place at the Thurgood Marshall Academy Public Charter High School during the 2008-2009 school year. Based on an assessment of the school's violence prevention approach using qualitative and quantitative data from…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costa, Pedro M.; Fadeel, Bengt, E-mail: Bengt.Fade
Engineered nanomaterials are being developed for a variety of technological applications. However, the increasing use of nanomaterials in society has led to concerns about their potential adverse effects on human health and the environment. During the first decade of nanotoxicological research, the realization has emerged that effective risk assessment of the multitudes of new nanomaterials would benefit from a comprehensive understanding of their toxicological mechanisms, which is difficult to achieve with traditional, low-throughput, single end-point oriented approaches. Therefore, systems biology approaches are being progressively applied within the nano(eco)toxicological sciences. This novel paradigm implies that the study of biological systems should be integrative, resulting in quantitative and predictive models of nanomaterial behaviour in a biological system. To this end, global ‘omics’ approaches with which to assess changes in genes, proteins, metabolites, etc. are deployed allowing for computational modelling of the biological effects of nanomaterials. Here, we highlight omics and systems biology studies in nanotoxicology, aiming towards the implementation of a systems nanotoxicology and mechanism-based risk assessment of nanomaterials. - Highlights: • Systems nanotoxicology is a multi-disciplinary approach to quantitative modelling. • Transcriptomics, proteomics and metabolomics remain the most common methods. • Global “omics” techniques should be coupled to computational modelling approaches. • The discovery of nano-specific toxicity pathways and biomarkers is a prioritized goal. • Overall, experimental nanosafety research must endeavour reproducibility and relevance.
Phase-based motion magnification video for monitoring of vital signals using the Hermite transform
NASA Astrophysics Data System (ADS)
Brieva, Jorge; Moya-Albor, Ernesto
2017-11-01
In this paper we present a new Eulerian phase-based motion magnification technique using the Hermite Transform (HT) decomposition, inspired by the Human Vision System (HVS). We test our method on a sequence of the breathing of a newborn baby and on a video sequence that shows the heartbeat at the wrist. We detect and magnify the heart pulse by applying our technique. Our motion magnification approach is compared to the Laplacian phase-based approach by means of quantitative metrics (based on the RMS error and the Fourier transform) that measure the quality of both reconstruction and magnification. In addition, a noise robustness analysis is performed for the two methods.
Observation of the immune response of cells and tissue through multimodal label-free microscopy
NASA Astrophysics Data System (ADS)
Pavillon, Nicolas; Smith, Nicholas I.
2017-02-01
We present applications of a label-free approach to assess the immune response based on the combination of interferometric microscopy and Raman spectroscopy, which makes it possible to simultaneously acquire morphological and molecular information of live cells. We employ this approach to derive statistical models for predicting the activation state of macrophage cells based both on morphological parameters extracted from the high-throughput full-field quantitative phase imaging, and on the molecular content information acquired through Raman spectroscopy. We also employ a system for 3D imaging based on coherence gating, enabling specific targeting of the Raman channel to structures of interest within tissue.
Medical privacy protection based on granular computing.
Wang, Da-Wei; Liau, Churn-Jung; Hsu, Tsan-Sheng
2004-10-01
Based on granular computing methodology, we propose two criteria to quantitatively measure privacy invasion. The total cost criterion measures the effort needed for a data recipient to find private information. The average benefit criterion measures the benefit a data recipient obtains when he received the released data. These two criteria remedy the inadequacy of the deterministic privacy formulation proposed in Proceedings of Asia Pacific Medical Informatics Conference, 2000; Int J Med Inform 2003;71:17-23. Granular computing methodology provides a unified framework for these quantitative measurements and previous bin size and logical approaches. These two new criteria are implemented in a prototype system Cellsecu 2.0. Preliminary system performance evaluation is conducted and reviewed.
Deployment of e-health services - a business model engineering strategy.
Kijl, Björn; Nieuwenhuis, Lambert J M; Huis in 't Veld, Rianne M H A; Hermens, Hermie J; Vollenbroek-Hutten, Miriam M R
2010-01-01
We designed a business model for deploying a myofeedback-based teletreatment service. An iterative and combined qualitative and quantitative action design approach was used for developing the business model and the related value network. Insights from surveys, desk research, expert interviews, workshops and quantitative modelling were combined to produce the first business model and then to refine it in three design cycles. The business model engineering strategy provided important insights which led to an improved, more viable and feasible business model and related value network design. Based on this experience, we conclude that the process of early stage business model engineering reduces risk and produces substantial savings in costs and resources related to service deployment.
Pargett, Michael; Umulis, David M
2013-07-15
Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available or even acquirable are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based studies and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
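One of the simplest scaling strategies alluded to above can be written in two lines: when data are known only up to an arbitrary scale, an optimal scale factor is solved analytically before the residual is computed. The model output and relative data below are invented for illustration.

```python
# Hedged sketch: least-squares optimal scale factor for comparing relative data with a model.
import numpy as np

model_prediction = np.array([0.2, 0.9, 2.1, 3.8, 4.9])     # model output in absolute units
relative_data = np.array([4.0, 19.0, 41.0, 75.0, 99.0])    # data in arbitrary (relative) units

scale = (relative_data @ model_prediction) / (model_prediction @ model_prediction)
residual = relative_data - scale * model_prediction
print(f"optimal scale = {scale:.2f}, SSR = {residual @ residual:.2f}")
```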
Leung, Janet T Y; Shek, Daniel T L
2011-01-01
This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. Given the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach offers a possible way of integrating quantitative and qualitative approaches and an alternative strategy for studying the impact of economic disadvantage on family processes and adolescent development.
NASA Astrophysics Data System (ADS)
Yamaguchi, Hideshi; Soeda, Takeshi
2015-03-01
A practical framework for an electron beam induced current (EBIC) technique has been established for conductive materials based on a numerical optimization approach. Although the conventional EBIC technique is useful for evaluating the distributions of dopants or crystal defects in semiconductor transistors, issues related to the reproducibility and quantitative capability of measurements using this technique persist. For instance, it is difficult to acquire high-quality EBIC images throughout continuous tests due to variation in operator skill or test environment. Recently, due to the evaluation of EBIC equipment performance and the numerical optimization of equipment items, the consistent acquisition of high-contrast images has become possible, improving reproducibility as well as yield regardless of operator skill or test environment. The technique proposed herein is even more sensitive and quantitative than scanning probe microscopy, an imaging technique that can possibly damage the sample. The new technique is expected to benefit the electrical evaluation of fragile or soft materials along with LSI materials.
LeBlanc, André; Michaud, Sarah A; Percy, Andrew J; Hardie, Darryl B; Yang, Juncong; Sinclair, Nicholas J; Proudfoot, Jillaine I; Pistawka, Adam; Smith, Derek S; Borchers, Christoph H
2017-07-07
When quantifying endogenous plasma proteins for fundamental and biomedical research - as well as for clinical applications - precise, reproducible, and robust assays are required. Targeted detection of peptides in a bottom-up strategy is the most common and precise mass spectrometry-based quantitation approach when combined with the use of stable isotope-labeled peptides. However, when measuring protein in plasma, the unknown endogenous levels prevent the implementation of the best calibration strategies, since no blank matrix is available. Consequently, several alternative calibration strategies are employed by different laboratories. In this study, these methods were compared to a new approach using two different stable isotope-labeled standard (SIS) peptide isotopologues for each endogenous peptide to be quantified, enabling an external calibration curve as well as the quality control samples to be prepared in pooled human plasma without interference from endogenous peptides. This strategy improves the analytical performance of the assay and enables the accuracy of the assay to be monitored, which can also facilitate method development and validation.
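A minimal sketch of the two-isotopologue idea follows: one labelled isotopologue is spiked at known levels into pooled plasma to build the external curve, a second isotopologue at a constant level serves as the internal standard, and the endogenous peptide is then read off the fitted line. All peak areas and concentrations below are hypothetical.

```python
# Hedged sketch: hypothetical peak areas for a two-isotopologue calibration in pooled plasma.
import numpy as np

spiked_fmol = np.array([5, 10, 25, 50, 100, 250.0])                  # SIS-A spike levels
area_sis_a = np.array([1.1e4, 2.2e4, 5.4e4, 1.08e5, 2.2e5, 5.5e5])   # SIS-A peak areas
area_sis_b = np.full(spiked_fmol.size, 9.8e4)                        # constant SIS-B internal standard

slope, intercept = np.polyfit(spiked_fmol, area_sis_a / area_sis_b, 1)

# Endogenous peptide measured against the same constant internal standard:
area_endogenous, area_is = 7.6e4, 9.9e4
endogenous_fmol = (area_endogenous / area_is - intercept) / slope
print(f"endogenous peptide ~ {endogenous_fmol:.1f} fmol on column")
```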
COMPUTER-AIDED DRUG DISCOVERY AND DEVELOPMENT (CADDD): in silico-chemico-biological approach
Kapetanovic, I.M.
2008-01-01
It is generally recognized that drug discovery and development are very time- and resource-consuming processes. There is an ever-growing effort to apply computational power to the combined chemical and biological space in order to streamline drug discovery, design, development and optimization. In the biomedical arena, computer-aided or in silico design is being utilized to expedite and facilitate hit identification and hit-to-lead selection, to optimize the absorption, distribution, metabolism, excretion and toxicity profile, and to avoid safety issues. Commonly used computational approaches include ligand-based drug design (pharmacophore, a 3-D spatial arrangement of chemical features essential for biological activity), structure-based drug design (drug-target docking), and quantitative structure-activity and quantitative structure-property relationships. Regulatory agencies as well as the pharmaceutical industry are actively involved in the development of computational tools that will improve the effectiveness and efficiency of the drug discovery and development process, decrease the use of animals, and increase predictability. It is expected that the power of CADDD will grow as the technology continues to evolve. PMID:17229415
Xu, Rosalind J; Blasiak, Bartosz; Cho, Minhaeng; Layfield, Joshua P; Londergan, Casey H
2018-05-17
A quantitative connection between molecular dynamics simulations and vibrational spectroscopy of probe-labeled systems would enable direct translation of experimental data into structural and dynamical information. To establish this connection, all-atom molecular dynamics (MD) simulations were performed for two SCN probe sites (solvent-exposed and buried) in a calmodulin-target peptide complex. Two frequency calculation approaches with substantial nonelectrostatic components, a quantum mechanics/molecular mechanics (QM/MM)-based technique and a solvatochromic fragment potential (SolEFP) approach, were used to simulate the infrared probe line shapes. While the QM/MM results disagreed with experiment, the SolEFP results matched the experimental frequencies and line shapes and revealed the physical and dynamic bases for the observed spectroscopic behavior. The main determinant of the CN probe frequency is the exchange repulsion between the probe and its local structural neighbors, and there is a clear dynamic explanation for the relatively broad probe line shape observed at the "buried" probe site. This methodology should be widely applicable to vibrational probes in many environments.
Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L
2012-10-01
Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
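The standard-addition calculation described above amounts to fitting the instrument response against the amount of analyte spiked into identical aliquots and reading the endogenous concentration off the x-intercept. A minimal sketch with hypothetical numbers (not data from the study) is given below; comparing the in-matrix slope with a surrogate-matrix calibration slope is one simple way to probe parallelism.

```python
import numpy as np

# Hypothetical standard-addition series for one amino acid in plasma:
# known amounts spiked into identical aliquots of the same sample.
added = np.array([0.0, 10.0, 20.0, 40.0])    # ug/mL added
response = np.array([1.9, 2.9, 3.8, 5.8])    # instrument response (area ratio)

slope, intercept = np.polyfit(added, response, 1)

# The endogenous concentration is the magnitude of the x-intercept, i.e. the
# spiked amount that would extrapolate the fitted line back to zero response.
endogenous = intercept / slope
print(f"estimated endogenous concentration: {endogenous:.1f} ug/mL")

# A simple parallelism check: compare this in-matrix slope with the slope of a
# calibration line prepared in surrogate matrix; a large relative difference
# signals a matrix effect that would bias surrogate-matrix calibration.
```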
A quantitative index for classification of plantar thermal changes in the diabetic foot
NASA Astrophysics Data System (ADS)
Hernandez-Contreras, D.; Peregrina-Barreto, H.; Rangel-Magdaleno, J.; Gonzalez-Bernal, J. A.; Altamirano-Robles, L.
2017-03-01
One of the main complications caused by diabetes mellitus is the development of the diabetic foot, which in turn can lead to ulcerations. Because ulceration risk is linked to an increase in plantar temperatures, recent approaches analyze thermal changes. These approaches try to identify spatial patterns of temperature that could be characteristic of a diabetic group. However, this is a difficult task since thermal patterns show wide variation, resulting in a complex classification. Moreover, the measurement of contralateral plantar temperatures is important to determine whether there is an abnormal difference, but it only provides information when thermal changes are asymmetric and in the absence of ulceration or amputation. Therefore, this work proposes a quantitative index for measuring the thermal change in the plantar region of participants diagnosed with diabetes mellitus with respect to a reliable reference (control) or to the contralateral foot (as is usual). A classification of the thermal changes based on the quantitative index is also proposed. Such a classification demonstrates the wide diversity of spatial distributions in the diabetic foot, but also shows that it is possible to identify common characteristics. An automatic process, based on the analysis of plantar angiosomes and image processing, is presented to quantify these thermal changes and to provide valuable information to the medical expert.
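The abstract does not give the exact formula of the proposed index, so the sketch below uses a simple, hypothetical stand-in: the mean absolute deviation of the plantar temperature map from a reference temperature (taken from a control group or from the contralateral foot). The temperature values are invented and the index definition is an assumption for illustration only.

```python
import numpy as np

def thermal_change_index(plantar_temps, reference_temp):
    """Hypothetical quantitative index: mean absolute deviation (deg C) of the
    plantar temperature map from a reference value. The reference may come from
    a control population or from the contralateral foot; this is an assumed
    definition, not the exact index derived in the paper."""
    temps = np.asarray(plantar_temps, dtype=float)
    mask = ~np.isnan(temps)               # ignore background pixels marked as NaN
    return float(np.mean(np.abs(temps[mask] - reference_temp)))

# Toy thermogram: a 3x4 patch of plantar temperatures with one background pixel.
patch = np.array([[30.1, 30.4, 31.0, np.nan],
                  [30.8, 31.5, 32.2, 31.9],
                  [30.2, 30.9, 31.7, 31.3]])
print(f"index vs. 30.0 deg C control reference: {thermal_change_index(patch, 30.0):.2f}")
```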
NASA Astrophysics Data System (ADS)
D'Andrea, W. J.; Balascio, N. L.; Bradley, R. S.; Bakke, J.; Gjerde, M.; Kaufman, D. S.; Briner, J. P.; von Gunten, L.
2014-12-01
Generating continuous, accurate and quantitative Holocene temperature estimates from the Arctic is an ongoing challenge. In many Arctic regions, tree ring-based approaches cannot be used and lake sediments provide the most valuable repositories for extracting paleotemperature information. Advances in lacustrine alkenone paleothermometry now allow for quantitative reconstruction of lake-water temperature based on the UK37 values of sedimentary alkenones. In addition, a recent study demonstrated the efficacy of non-destructive scanning reflectance spectroscopy in the visible range (VIS-RS) for high-resolution quantitative temperature reconstruction from arctic lake sediments [1]. In this presentation, I will report a new UK37-based temperature reconstruction and a scanning VIS-RS record (using the RABD660;670 index as a measure of sedimentary chlorin content) from Kulusuk Lake in southeastern Greenland (65.6°N, 37.1°W). The UK37 record reveals a ~3°C increase in summer lake water temperatures between ~10ka and ~7ka followed by sustained warmth until ~4ka and a gradual (~3°C) cooling until ~400 yr BP. The strong correlation between UK37 and RABD660;670 measured in the same sediment core provides further evidence that in arctic lakes where temperature regulates primary productivity, and thereby sedimentary chlorin content, these proxies can be combined to develop high-resolution quantitative temperature records. The Holocene temperature history of Kulusuk Lake determined using this approach corresponds to changes in the size of the glaciers adjacent to the lake, as inferred from sediment minerogenic properties measured with scanning XRF. Glaciers retreated during early Holocene warming, likely disappeared during the period of mid-Holocene warmth, and advanced after 4ka. I will also discuss new UK37 and RABD660;670 reconstructions from northwestern Svalbard and the central Brooks Range of Alaska within the framework of published regional temperature reconstructions and model simulations of Holocene temperature around the Arctic. 1. von Gunten, L., D'Andrea, W.J., Bradley, R.S. and Huang, Y., 2012, Proxy-to-proxy calibration: Increasing the temporal resolution of quantitative climate reconstructions. Scientific Reports, v. 2, 609. doi: 10.1038/srep00609.
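For readers unfamiliar with alkenone paleothermometry, the sketch below shows the generic calculation chain: the UK37 unsaturation index computed from the relative abundances of the di-, tri- and tetra-unsaturated C37 alkenones, followed by inversion of a linear temperature calibration. The calibration slope and intercept here are placeholders, not the lake-specific calibration used for Kulusuk Lake, and the alkenone abundances are invented.

```python
def uk37(c37_2, c37_3, c37_4):
    """Alkenone unsaturation index UK37 = (C37:2 - C37:4) / (C37:2 + C37:3 + C37:4),
    computed from the relative abundances of the C37 alkenones."""
    return (c37_2 - c37_4) / (c37_2 + c37_3 + c37_4)

def temperature(uk37_value, slope=0.028, intercept=-0.1):
    """Invert a linear UK37-temperature calibration, UK37 = slope*T + intercept.
    The coefficients here are placeholders; a site-appropriate (lacustrine)
    calibration would be required in practice."""
    return (uk37_value - intercept) / slope

idx = uk37(c37_2=40.0, c37_3=45.0, c37_4=15.0)
print(f"UK37 = {idx:.3f}, implied lake-water temperature = {temperature(idx):.1f} deg C")
```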
Pang, Bo; Ding, Xiong; Wang, Guoping; Zhao, Chao; Xu, Yanan; Fu, Kaiyue; Sun, Jingjing; Song, Xiuling; Wu, Wenshuai; Liu, Yushen; Song, Qi; Hu, Jiumei; Li, Juan; Mu, Ying
2017-12-27
Vibrio parahaemolyticus (VP), mostly isolated from aquatic products, is one of the major causes of bacterial food-poisoning events worldwide, which could be reduced using a promising on-site detection method. Herein, a rapid and quantitative method for VP detection was developed by applying a mixed-dye-loaded loop-mediated isothermal amplification (LAMP) assay on a self-priming compartmentalization (SPC) microfluidic chip, termed on-chip mixed-dye-based LAMP (CMD-LAMP). In comparison to conventional approaches, CMD-LAMP was advantageous in terms of the limit of detection, which reached down to 1 × 10^3 CFU/mL in food-contaminated samples without pre-enrichment of the bacteria. Additionally, as a result of the use of the mixed dye and the SPC chip, the quantitative result could be easily acquired, avoiding the requirement for sophisticated instruments and tedious operation. CMD-LAMP was also rapid and cost-effective. In conclusion, CMD-LAMP has great potential for realizing on-site quantitative analysis of VP for food safety.
Receptor-based 3D-QSAR in Drug Design: Methods and Applications in Kinase Studies.
Fang, Cheng; Xiao, Zhiyan
2016-01-01
Receptor-based 3D-QSAR strategy represents a superior integration of structure-based drug design (SBDD) and three-dimensional quantitative structure-activity relationship (3D-QSAR) analysis. It combines the accurate prediction of ligand poses by the SBDD approach with the good predictability and interpretability of statistical models derived from the 3D-QSAR approach. Extensive efforts have been devoted to the development of receptor-based 3D-QSAR methods, and two alternative approaches have been exploited. One involves computing the binding interactions between a receptor and a ligand to generate structure-based descriptors for QSAR analyses. The other concerns the application of various docking protocols to generate optimal ligand poses so as to provide reliable molecular alignments for the conventional 3D-QSAR operations. This review highlights new concepts and methodologies recently developed in the field of receptor-based 3D-QSAR and, in particular, covers its application in kinase studies.
A Quantitative Approach to the Formal Verification of Real-Time Systems.
1996-09-01
Computer Science technical report CMU-CS-96-199: A Quantitative Approach to the Formal Verification of Real-Time Systems, Sergio Vale Aguiar Campos, September 1996. Keywords: real-time systems, formal verification, symbolic ...
2008-09-01
Footnote excerpts citing Neuman, Social Research Methods: Qualitative and Quantitative Approaches (Boston: Pearson, 2006), pp. 149 and 152, and www.socialresearchmethods.net/kb/strucres.php (accessed July 12, 2008).
2011-07-01
Excerpt on the validation of the IOP-based BRDF correction scheme (composed of the reflectance model along with the IOP retrieval): applied to both oceanic and coastal waters, the results were very consistent qualitatively and quantitatively, thus validating the IOP-based BRDF correction system.
Un currículo interdisciplinario de base teórica para enseñar inglés como segunda lengua [A theory-based interdisciplinary curriculum for teaching English as a second language]
Fuentes, Brenda O.; Mas, Francisco Soto; Mein, Erika; Jacobson, Holly E.
2013-01-01
Among Hispanic immigrants in the United States (US), learning English is considered necessary for economic and social achievement. As a consequence, there is a high demand for English as a Second Language (ESL) classes. Despite the recognized benefits of ESL programs, both at the individual and social levels, more research is needed to identify education strategies that effectively promote all aspects of learning English as a second language. This article describes an ESL curriculum that incorporates a theory-based pedagogical approach specifically designed for immigrant Hispanic adults on the US-Mexico border region. The article also describes the implementation of the curriculum as well as the results of the evaluation, which was conducted using both quantitative and qualitative methods. Quantitative results indicate that the participants significantly improved their English proficiency (L2). Qualitative results suggest that participants were positively impacted by both the content and pedagogical approaches used by the curriculum. Their experience with the ESL class was positive in general. It can be concluded that the curriculum achieved its objective. This approach could serve as a model for second language teaching for adults. PMID:25284915
Comparison of two trajectory based models for locating particle sources for two rural New York sites
NASA Astrophysics Data System (ADS)
Zhou, Liming; Hopke, Philip K.; Liu, Wei
Two back-trajectory-based statistical models, simplified quantitative transport bias analysis (QTBA) and residence-time weighted concentrations (RTWC), have been compared for their capabilities of identifying likely locations of source emissions contributing to observed particle concentrations at Potsdam and Stockton, New York. QTBA attempts to take into account the distribution of concentrations around the directions of the back trajectories. In the full QTBA approach, deposition processes (wet and dry) are also considered; simplified QTBA omits deposition and is best used with multiple-site data. Similarly, the RTWC approach uses concentrations measured at different sites along with the back trajectories to distribute the concentration contributions across the spatial domain of the trajectories. In this study, these models are used in combination with the source contribution values obtained by a previous positive matrix factorization analysis of particle composition data from Potsdam and Stockton. The six common sources for the two sites, sulfate, soil, zinc smelter, nitrate, wood smoke and copper smelter, were analyzed. The results of the two methods are consistent and locate large, clearly defined sources well. The RTWC approach can find more minor sources but may also give unrealistic estimates of the source locations.
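The core bookkeeping behind such trajectory-weighted source fields can be sketched very simply: each grid cell accumulates the receptor concentrations of the trajectories that passed through it, weighted by residence time (here approximated by hourly endpoint counts), and is then normalized. The sketch below is a simplified concentration-weighted field in the spirit of RTWC, not the iterative redistribution or deposition treatment of the full methods, and the trajectories and concentrations are invented.

```python
import numpy as np
from collections import defaultdict

def weighted_field(trajectories, concentrations, cell=1.0):
    """Simplified residence-time-weighted concentration field. Each back
    trajectory is a list of (lat, lon) hourly endpoints; the receptor
    concentration measured on arrival is spread over the grid cells the
    trajectory visited, weighted by residence time (endpoint counts)."""
    num = defaultdict(float)   # sum of concentration * residence time per cell
    den = defaultdict(float)   # total residence time per cell
    for endpoints, conc in zip(trajectories, concentrations):
        for lat, lon in endpoints:
            key = (int(np.floor(lat / cell)), int(np.floor(lon / cell)))
            num[key] += conc
            den[key] += 1.0
    return {k: num[k] / den[k] for k in num}

# Two toy 3-hour back trajectories arriving at a receptor with different
# sulfate contributions (arbitrary units).
trajs = [[(44.7, -75.0), (44.2, -75.9), (43.8, -76.7)],
         [(44.7, -75.0), (45.1, -74.2), (45.6, -73.5)]]
field = weighted_field(trajs, concentrations=[8.0, 2.0])
for cell_key, value in sorted(field.items()):
    print(cell_key, round(value, 2))
```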
1, 2, 3, 4: infusing quantitative literacy into introductory biology.
Speth, Elena Bray; Momsen, Jennifer L; Moyerbrailean, Gregory A; Ebert-May, Diane; Long, Tammy M; Wyse, Sara; Linton, Debra
2010-01-01
Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills.
Embryoids, organoids and gastruloids: new approaches to understanding embryogenesis
2017-01-01
ABSTRACT Cells have an intrinsic ability to self-assemble and self-organize into complex and functional tissues and organs. By taking advantage of this ability, embryoids, organoids and gastruloids have recently been generated in vitro, providing a unique opportunity to explore complex embryological events in a detailed and highly quantitative manner. Here, we examine how such approaches are being used to answer fundamental questions in embryology, such as how cells self-organize and assemble, how the embryo breaks symmetry, and what controls timing and size in development. We also highlight how further improvements to these exciting technologies, based on the development of quantitative platforms to precisely follow and measure subcellular and molecular events, are paving the way for a more complete understanding of the complex events that help build the human embryo. PMID:28292844
A Dual-Color Reporter Assay of Cohesin-Mediated Gene Regulation in Budding Yeast Meiosis.
Fan, Jinbo; Jin, Hui; Yu, Hong-Guo
2017-01-01
In this chapter, we describe a quantitative fluorescence-based assay of gene expression using the ratio of the reporter green fluorescence protein (GFP) to the internal red fluorescence protein (RFP) control. With this dual-color heterologous reporter assay, we have revealed cohesin-regulated genes and discovered a cis-acting DNA element, the Ty1-LTR, which interacts with cohesin and regulates gene expression during yeast meiosis. The method described here provides an effective cytological approach for quantitative analysis of global gene expression in budding yeast meiosis.
Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P
2014-09-01
Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
Sanchon-Lopez, Beatriz; Everett, Jeremy R
2016-09-02
A new, simple-to-implement and quantitative approach to assessing the confidence in NMR-based identification of known metabolites is introduced. The approach is based on a topological analysis of metabolite identification information available from NMR spectroscopy studies and is a development of the metabolite identification carbon efficiency (MICE) method. New topological metabolite identification indices are introduced, analyzed, and proposed for general use, including topological metabolite identification carbon efficiency (tMICE). Because known metabolite identification is one of the key bottlenecks in either NMR-spectroscopy- or mass spectrometry-based metabonomics/metabolomics studies, and given the fact that there is no current consensus on how to assess metabolite identification confidence, it is hoped that these new approaches and the topological indices will find utility.
A primer on thermodynamic-based models for deciphering transcriptional regulatory logic.
Dresch, Jacqueline M; Richards, Megan; Ay, Ahmet
2013-09-01
A rigorous analysis of transcriptional regulation at the DNA level is crucial to the understanding of many biological systems. Mathematical modeling has offered researchers a new approach to understanding this central process. In particular, thermodynamic-based modeling represents the most biophysically informed approach aimed at connecting DNA level regulatory sequences to the expression of specific genes. The goal of this review is to give biologists a thorough description of the steps involved in building, analyzing, and implementing a thermodynamic-based model of transcriptional regulation. The data requirements for this modeling approach are described, the derivation for a specific regulatory region is shown, and the challenges and future directions for the quantitative modeling of gene regulation are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
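To make the thermodynamic-modeling idea concrete, the sketch below implements the textbook Shea-Ackers-style formulation for a single activator binding site and a promoter: each enhancer/promoter state gets a Boltzmann statistical weight, and predicted expression is proportional to the probability that polymerase is bound. This generic single-site example is for orientation only and is not the specific model derived in the review; all parameter values are illustrative.

```python
def expression(activator, polymerase, K_a, K_p, omega):
    """Shea-Ackers-style thermodynamic model for one activator site and the
    promoter. Statistical weights of the four states: empty = 1, activator
    only = a, polymerase only = p, both = a*p*omega, with a = [A]/K_a,
    p = [P]/K_p, and omega > 1 encoding recruitment of polymerase by the
    bound activator. Expression is proportional to P(polymerase bound)."""
    a = activator / K_a
    p = polymerase / K_p
    z = 1.0 + a + p + a * p * omega            # partition function over states
    return (p + a * p * omega) / z

# Predicted expression rises sigmoidally with activator concentration.
for conc in (0.0, 0.5, 1.0, 2.0, 5.0):
    print(f"[A] = {conc:4.1f}  ->  relative expression "
          f"{expression(conc, polymerase=0.1, K_a=1.0, K_p=1.0, omega=20.0):.3f}")
```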
Quinone-based stable isotope probing for assessment of 13C substrate-utilizing bacteria
NASA Astrophysics Data System (ADS)
Kunihiro, Tadao; Katayama, Arata; Demachi, Toyoko; Veuger, Bart; Boschker, Henricus T. S.; van Oevelen, Dick
2015-04-01
In this study, we attempted to establish a quinone-based stable-isotope probing (SIP) technique to link substrate-utilizing bacterial groups to chemotaxonomic groups within the bacterial community. SIP techniques combined with biomarkers have been widely utilized as an attractive method for identifying metabolically active bacterial groups in various environments. Quantitative SIP approaches have the unique advantage of assessing substrate incorporation into bacteria. As the most common quantitative approach, SIP based on phospholipid-derived fatty acids (PLFA) has been applied to simultaneously assess substrate-incorporation rates into bacteria and microbial community structure. This approach is powerful for estimating incorporation rates because of its high sensitivity, owing to detection by gas chromatography-combustion-isotope ratio mass spectrometry (GC-c-IRMS). However, its phylogenetic resolution is limited by the specificity of the compound-specific markers. We focused on respiratory quinones as biomarkers. Our previous study found a good correlation between concentrations of bacteria-specific PLFAs and quinones over several orders of magnitude in various marine sediments, and the quinone method resolves differences in bacterial community composition at a higher level (bacterial phylum level) than bacterial PLFA. Therefore, respiratory quinones are potentially good biomarkers for quantitative SIP approaches. The LC-APCI-MS method, a molecular-mass-based detection method for quinones, was developed and provides useful structural information for identifying quinone molecular species in environmental samples. LC-MS/MS on a hybrid triple quadrupole/linear ion trap, which enables compounds to be identified and quantified simultaneously in a single analysis, can detect high-molecular-mass compounds together with their isotope ions. The use of LC-MS/MS allows us to develop quinone-SIP based on molecular-mass differences arising from 13C abundance in the quinone. In this study, we compared the carbon stable isotope composition of quinones with the bulk carbon stable isotope composition of bacterial cultures. The results indicated a good correlation between the two. However, our measurement conditions for the detection of quinone isotope ions led to an underestimation of 13C abundance in the quinone. The quinone-SIP technique therefore requires further optimization of the LC-MS/MS measurement conditions.
Location Estimation of Urban Images Based on Geographical Neighborhoods
NASA Astrophysics Data System (ADS)
Huang, Jie; Lo, Sio-Long
2018-04-01
Estimating the location of an image is a challenging computer vision problem, and the recent decade has witnessed increasing research efforts towards the solution of this problem. In this paper, we propose a new approach to the location estimation of images taken in urban environments. Experiments are conducted to quantitatively compare the estimation accuracy of our approach, against three representative approaches in the existing literature, using a recently published dataset of over 150 thousand Google Street View images and 259 user uploaded images as queries. According to the experimental results, our approach outperforms three baseline approaches and shows its robustness across different distance thresholds.
Bergeest, Jan-Philip; Rohr, Karl
2012-10-01
In high-throughput applications, accurate and efficient segmentation of cells in fluorescence microscopy images is of central importance for the quantification of protein expression and the understanding of cell function. We propose an approach for segmenting cell nuclei which is based on active contours using level sets and convex energy functionals. Compared to previous work, our approach determines the global solution. Thus, the approach does not suffer from local minima and the segmentation result does not depend on the initialization. We consider three different well-known energy functionals for active contour-based segmentation and introduce convex formulations of these functionals. We also suggest a numeric approach for efficiently computing the solution. The performance of our approach has been evaluated using fluorescence microscopy images from different experiments comprising different cell types. We have also performed a quantitative comparison with previous segmentation approaches. Copyright © 2012 Elsevier B.V. All rights reserved.
Kisely, Stephen; Kendall, Elizabeth
2011-08-01
Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.
Motivation to Speak English: A Self-Determination Theory Perspective
ERIC Educational Resources Information Center
Dincer, Ali; Yesilyurt, Savas
2017-01-01
Based on a modern motivation theory of learning, self-determination theory (SDT), this study aimed to investigate the relationships between English as a foreign language (EFL) learners' motivation to speak, autonomous regulation, autonomy support from teachers, and classroom engagement, with both quantitative and qualitative approaches. The…
The Role of Recurrence Plots in Characterizing the Output-Unemployment Relationship: An Analysis
Caraiani, Petre; Haven, Emmanuel
2013-01-01
We analyse the output-unemployment relationship using an approach based on cross-recurrence plots and quantitative recurrence analysis. We use post-war period quarterly U.S. data. The results obtained show the emergence of a complex and interesting relationship. PMID:23460814
The Analysis of Seawater: A Laboratory-Centered Learning Project in General Chemistry.
ERIC Educational Resources Information Center
Selco, Jodye I.; Roberts, Julian L., Jr.; Wacks, Daniel B.
2003-01-01
Describes a sea-water analysis project that introduces qualitative and quantitative analysis methods and laboratory methods such as gravimetric analysis, potentiometric titration, ion-selective electrodes, and the use of calibration curves. Uses a problem-based cooperative teaching approach. (Contains 24 references.) (YDS)
Philosophizing about Teacher Dissatisfaction: A Multidisciplinary Hermeneutic Approach
ERIC Educational Resources Information Center
Santoro, Doris A.
2015-01-01
In this methodological reflection, I describe the multidisciplinary hermeneutic process of philosophizing about teacher dissatisfaction. I discuss how philosophy serves as a starting point for interpretive work based on interviews with former teachers and readings of qualitative and quantitative research on teacher attrition and dissatisfaction.…
NASA Astrophysics Data System (ADS)
Gou, Y.
2017-12-01
Quantitative Precipitation Estimation (QPE) is one of the important applications of weather radars. However, in complex terrain such as the Tibetan Plateau, it is a challenging task to obtain an optimal Z-R relation because of the complex space-time variability of precipitation microphysics. This paper develops two radar QPE schemes, based respectively on a Reflectivity Threshold (RT) algorithm and on the Storm Cell Identification and Tracking (SCIT) algorithm, using observations from 11 Doppler weather radars and 3294 rain gauges over the Eastern Tibetan Plateau (ETP). These two QPE methodologies are evaluated extensively using four precipitation events that are characterized by different meteorological features. Precipitation characteristics of independent storm cells associated with these four events, as well as the storm-scale differences, are investigated using short-term vertical profiles of reflectivity clusters. Evaluation results show that the SCIT-based rainfall approach performs better than the simple RT-based method in all precipitation events in terms of score comparison using validation gauge measurements as references, with higher correlation (than 75.74%), lower mean absolute error (than 82.38%), and lower root-mean-square error (than 89.04%) over all the comparative frames. It is also found that the SCIT-based approach can effectively mitigate the local error of radar QPE and represents the spatiotemporal variability of precipitation better than the RT-based scheme.
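The Z-R conversion at the heart of any such QPE scheme is a power law, Z = a*R^b, inverted to obtain rain rate from reflectivity. The sketch below uses the classic Marshall-Palmer coefficients (a = 200, b = 1.6) purely for illustration; the paper instead fits relations adapted to the identified storm cells, which is precisely why the SCIT-based scheme performs better.

```python
import numpy as np

def rain_rate(dbz, a=200.0, b=1.6):
    """Convert radar reflectivity (dBZ) to rain rate R (mm/h) via the power law
    Z = a * R**b. a=200, b=1.6 are the classic Marshall-Palmer coefficients,
    used here only as placeholders for a storm-cell-specific Z-R relation."""
    z_linear = 10.0 ** (np.asarray(dbz, dtype=float) / 10.0)   # Z in mm^6 m^-3
    return (z_linear / a) ** (1.0 / b)

print(np.round(rain_rate([20.0, 35.0, 50.0]), 2))   # light, moderate, heavy rain
```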
Bednarek, Piotr T; Orłowska, Renata; Niedziela, Agnieszka
2017-04-21
We present a new methylation-sensitive amplified polymorphism (MSAP) approach for the evaluation of relative quantitative characteristics such as demethylation, de novo methylation, and preservation of the methylation status of CCGG sequences, which are recognized by the isoschizomers HpaII and MspI. We applied the technique to analyze aluminum (Al)-tolerant and non-tolerant control and Al-stressed inbred triticale lines. The approach is based on a detailed analysis of events affecting HpaII and MspI restriction sites in control and stressed samples, and takes advantage of molecular marker profiles generated by EcoRI/HpaII and EcoRI/MspI MSAP platforms. Five Al-tolerant and five non-tolerant triticale lines were exposed to aluminum stress using the physiological test. Total genomic DNA was isolated from root tips of all tolerant and non-tolerant lines before and after Al stress following the metAFLP and MSAP approaches. Based on codes reflecting events affecting cytosines within a given restriction site recognized by HpaII and MspI in control and stressed samples, demethylation (DM), de novo methylation (DNM), preservation of methylated sites (MSP), and preservation of non-methylated sites (NMSP) were evaluated. MSAP profiles were used for agglomerative hierarchical clustering (AHC) based on squared Euclidean distance and Ward's agglomeration method, whereas the MSAP characteristics were used for ANOVA. Relative quantitative MSAP analysis revealed that both Al-tolerant and non-tolerant triticale lines subjected to Al stress underwent demethylation, with demethylation of CG predominating over CHG. The rate of de novo methylation in the CG context was ~3-fold lower than demethylation, whereas de novo methylation of CHG was observed only in Al-tolerant lines. Our relative quantitative MSAP approach, based on methylation events affecting cytosines within HpaII-MspI recognition sequences, was capable of quantifying de novo methylation, demethylation, methylation, and non-methylated status in control and stressed Al-tolerant and non-tolerant triticale inbred lines. The method could also be used to analyze methylation events affecting the CG and CHG contexts, which were differentially methylated under Al stress. We cannot exclude that the methylation changes revealed among lines, as well as between the Al-tolerant and non-tolerant groups of lines, were due to some experimental error, or that the number of lines was too small for ANOVA to prove the influence of Al stress. Nevertheless, we suspect that Al tolerance in triticale could be partly regulated by epigenetic factors acting at the level of DNA methylation. This method provides a valuable tool for studies of abiotic stresses in plants.
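Once each CCGG site has been assigned a methylation call in the control and stressed samples (made upstream from the EcoRI/HpaII versus EcoRI/MspI band patterns), the event coding described in the abstract reduces to a simple comparison. The sketch below implements only that final classification step with toy calls; the mapping from raw band patterns to methylation status is deliberately left out, since it depends on scoring conventions not given here.

```python
def classify_site(control_methylated, stressed_methylated):
    """Classify a CCGG site by comparing its methylation status before and after
    Al stress: demethylation (DM), de novo methylation (DNM), preservation of a
    methylated site (MSP), or preservation of a non-methylated site (NMSP).
    The methylation calls are assumed to have been made upstream from the
    EcoRI/HpaII vs. EcoRI/MspI band patterns."""
    if control_methylated and not stressed_methylated:
        return "DM"
    if not control_methylated and stressed_methylated:
        return "DNM"
    return "MSP" if control_methylated else "NMSP"

# Toy profile of five CCGG sites (True = methylated) for one triticale line.
control  = [True, True, False, False, True]
stressed = [False, True, True, False, True]
events = [classify_site(c, s) for c, s in zip(control, stressed)]
print(events)                                                   # per-site events
print({e: events.count(e) / len(events) for e in set(events)})  # relative frequencies
```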
Analyzing 7000 texts on deep brain stimulation: what do they tell us?
Ineichen, Christian; Christen, Markus
2015-01-01
The enormous increase in the number of scientific publications in recent decades requires quantitative methods for obtaining a better understanding of topics and developments in various fields. In this exploratory study, we investigate the emergence, trends, and connections of topics within the whole text corpus of the deep brain stimulation (DBS) literature, based on more than 7000 papers (titles and abstracts) published between 1991 and 2014, using a network approach. Taking the co-occurrence of basic terms that represent important topics within DBS as a starting point, we outline the statistics of interconnections between DBS indications, anatomical targets, positive and negative effects, as well as methodological, technological, and economic issues. This quantitative approach confirms known trends within the literature (e.g., regarding the emergence of psychiatric indications). The data also reflect an increased discussion of complex issues such as personality, connected tightly to the ethical context, as well as an apparent focus on depression as an important DBS indication, where the co-occurrence of terms related to negative effects is low both for the indication and for the related anatomical targets. We also discuss consequences of the analysis from a bioethical perspective, i.e., how such a quantitative analysis could uncover hidden subject matters that have ethical relevance. For example, we find that hardware-related issues in DBS are far more robustly connected to an ethical context than impulsivity, concrete side effects, or death/suicide. Our contribution also outlines the methodology of quantitative text analysis, which combines statistical approaches with expert knowledge. It thus serves as an example of how innovative quantitative tools can be made useful for gaining a better understanding in the field of DBS. PMID:26578908
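The kind of co-occurrence counting that underlies such a term network can be sketched in a few lines: for each title/abstract, record which basic terms appear, and increment an edge weight for every co-occurring pair. The toy abstracts and term list below are invented and stand in for the curated DBS term set and the ~7000-document corpus.

```python
from itertools import combinations
from collections import Counter

# A few toy abstracts standing in for the ~7000 DBS titles/abstracts.
abstracts = [
    "deep brain stimulation of the subthalamic nucleus in parkinson disease",
    "subthalamic stimulation and impulsivity a case report",
    "deep brain stimulation for treatment resistant depression",
]
terms = ["subthalamic", "parkinson", "depression", "impulsivity"]

cooccur = Counter()
for text in abstracts:
    present = sorted({t for t in terms if t in text})
    for pair in combinations(present, 2):
        cooccur[pair] += 1

# Weighted edge list of the term co-occurrence network (term_a, term_b, weight).
for (a, b), w in cooccur.most_common():
    print(a, b, w)
```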
NASA Astrophysics Data System (ADS)
Dimou, Kaotar; Emond, Claude
2017-06-01
In recent decades, the control banding (CB) approach has been recognised as a hazard assessment methodology because of its increased importance in the occupational safety, health and hygiene (OSHH) industry. According to the American Industrial Hygiene Association, this approach originates from the pharmaceutical industry in the United Kingdom. The aim of the CB approach is to protect more than 90% (or approximately 2.7 billion) of the world's workers who do not have access to OSHH professionals and traditional quantitative risk assessment methods. In other words, CB is a qualitative or semi-quantitative tool designed to prevent occupational accidents by controlling worker exposures to potentially hazardous chemicals in the absence of comprehensive toxicological and exposure data. These criteria correspond very precisely to the development and production of engineered nanomaterials (ENMs). Considering the significant lack of scientific knowledge about the work-related health risks posed by ENMs, CB is, in general, appropriate for these issues. Currently, CB can be adapted to the specificities of ENMs; hundreds of nanotechnology products containing ENMs are already on the market. In this context, this qualitative or semi-quantitative approach appears to be relevant for characterising and quantifying the degree of physico-chemical and biological reactivities of ENMs, leading towards better control of human health effects and the safe handling of ENMs in workplaces. A better understanding of the CB approach is important for further managing the risks related to handling hazardous substances, such as ENMs, for which no occupational exposure limits have been established. In recent years, this topic has garnered much interest, including discussions in many technical papers. Several CB models have been developed, and many countries have created their own nano-specific CB instruments. The aims of this research were to perform a literature review about CB, to classify the main approaches that have been developed worldwide, and then to suggest an original methodology based on the characterisation of the hazard. For this research, our team conducted a systematic literature review over the past 20 years. This approach is important in understanding the conceptual basis for CB and the model's overall effectiveness. These considerations will lead to the proposal of an original hazard assessment method based on physico-chemical and biological characteristics. Such a method should help the entire industry better understand the ability of the CB approach to limit workers' exposure, while identifying the strengths and weaknesses of the approach. Developing this practical method will help to provide relevant recommendations to workers who handle hazardous chemicals such as ENMs and to the general population.
Method and platform standardization in MRM-based quantitative plasma proteomics.
Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H
2013-12-16
There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. © 2013.
Chiu, Grace S.; Wu, Margaret A.; Lu, Lin
2013-01-01
The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content (all regarded a priori as qualitatively important abiotic drivers) towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general. PMID:23785443
Brenner, Stephan; Muula, Adamson S; Robyn, Paul Jacob; Bärnighausen, Till; Sarker, Malabika; Mathanga, Don P; Bossert, Thomas; De Allegri, Manuela
2014-04-22
In this article we present a study design to evaluate the causal impact of providing supply-side performance-based financing incentives in combination with a demand-side cash transfer component on equitable access to and quality of maternal and neonatal healthcare services. This intervention is introduced to selected emergency obstetric care facilities and catchment area populations in four districts in Malawi. We here describe and discuss our study protocol with regard to the research aims, the local implementation context, and our rationale for selecting a mixed methods explanatory design with a quasi-experimental quantitative component. The quantitative research component consists of a controlled pre- and post-test design with multiple post-test measurements. This allows us to quantitatively measure 'equitable access to healthcare services' at the community level and 'healthcare quality' at the health facility level. Guided by a theoretical framework of causal relationships, we determined a number of input, process, and output indicators to evaluate both intended and unintended effects of the intervention. Overall causal impact estimates will result from a difference-in-difference analysis comparing selected indicators across intervention and control facilities/catchment populations over time. To further explain heterogeneity of quantitatively observed effects and to understand the experiential dimensions of financial incentives on clients and providers, we designed a qualitative component in line with the overall explanatory mixed methods approach. This component consists of in-depth interviews and focus group discussions with providers, service users, non-users, and policy stakeholders. In this explanatory design, comprehensive understanding of expected and unexpected effects of the intervention on both access and quality will emerge through careful triangulation at two levels: across multiple quantitative elements and across quantitative and qualitative elements. Combining a traditional quasi-experimental controlled pre- and post-test design with an explanatory mixed methods model permits an additional assessment of organizational and behavioral changes affecting complex processes. Through this impact evaluation approach, our design will not only create robust evidence measures for the outcome of interest, but also generate insights on how and why the investigated interventions produce certain intended and unintended effects, and it allows for a more in-depth evaluation approach.
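The difference-in-difference estimate mentioned above is conventionally obtained from a regression with an interaction between treatment status and the post-intervention period; under the parallel-trends assumption, the interaction coefficient is the causal effect. The sketch below uses entirely hypothetical facility-level quality scores, not data from the Malawi study.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical facility-level quality scores before/after the intervention.
df = pd.DataFrame({
    "quality": [62, 64, 61, 63, 60, 72, 59, 61, 58, 60],
    "treated": [1, 1, 1, 1, 1, 1, 0, 0, 0, 0],   # performance-based financing facility?
    "post":    [0, 0, 0, 1, 1, 1, 0, 0, 1, 1],   # measured after roll-out?
})

# Difference-in-differences: the coefficient on treated:post is the impact
# estimate under the parallel-trends assumption.
model = smf.ols("quality ~ treated * post", data=df).fit()
print(model.params["treated:post"])
```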
Paillet, Frederick L.; Crowder, R.E.
1996-01-01
Quantitative analysis of geophysical logs in ground-water studies often involves at least as broad a range of applications and variation in lithology as is typically encountered in petroleum exploration, making such logs difficult to calibrate and complicating inversion problem formulation. At the same time, data inversion and analysis depend on inversion model formulation and refinement, so that log interpretation cannot be deferred to a geophysical log specialist unless active involvement with interpretation can be maintained by such an expert over the lifetime of the project. We propose a generalized log-interpretation procedure designed to guide hydrogeologists in the interpretation of geophysical logs, and in the integration of log data into ground-water models that may be systematically refined and improved in an iterative way. The procedure is designed to maximize the effective use of three primary contributions from geophysical logs: (1) The continuous depth scale of the measurements along the well bore; (2) The in situ measurement of lithologic properties and the correlation with hydraulic properties of the formations over a finite sample volume; and (3) Multiple independent measurements that can potentially be inverted for multiple physical or hydraulic properties of interest. The approach is formulated in the context of geophysical inversion theory, and is designed to be interfaced with surface geophysical soundings and conventional hydraulic testing. The step-by-step procedures given in our generalized interpretation and inversion technique are based on both qualitative analysis designed to assist formulation of the interpretation model, and quantitative analysis used to assign numerical values to model parameters. The approach bases the decision as to whether quantitative inversion is statistically warranted on the formulation of an over-determined inversion. If no such inversion is consistent with the inversion model, quantitative inversion is judged not possible with the given data set. Additional statistical criteria, such as the statistical significance of regressions, are used to guide the subsequent calibration of geophysical data in terms of hydraulic variables in those situations where quantitative data inversion is considered appropriate.
NASA Astrophysics Data System (ADS)
Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.
2018-02-01
Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assay. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high throughput label-free cell cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including diseases pathogenesis.
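The dimensionality-reduction step described above (tracking cell-cycle progression with t-SNE on a matrix of single-cell biophysical phenotypes) can be sketched generically with scikit-learn. The feature matrix below is synthetic and merely stands in for the 24 amplitude/phase phenotypes; this is not the authors' processing pipeline or classifier.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix: 500 cells x 24 biophysical phenotypes
# (e.g. area, dry mass, phase entropy, ...), standing in for features
# extracted from time-stretch quantitative-phase images.
rng = np.random.default_rng(1)
features = np.vstack([rng.normal(0.0, 1.0, (250, 24)),    # e.g. G1-like cells
                      rng.normal(1.5, 1.0, (250, 24))])   # e.g. G2-like cells

# Standardize, then embed into two dimensions for visualizing cell-cycle structure.
embedding = TSNE(n_components=2, perplexity=30, random_state=1).fit_transform(
    StandardScaler().fit_transform(features))
print(embedding.shape)   # (500, 2) coordinates, one point per cell
```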
Fast globally optimal segmentation of cells in fluorescence microscopy images.
Bergeest, Jan-Philip; Rohr, Karl
2011-01-01
Accurate and efficient segmentation of cells in fluorescence microscopy images is of central importance for the quantification of protein expression in high-throughput screening applications. We propose a new approach for segmenting cell nuclei which is based on active contours and convex energy functionals. Compared to previous work, our approach determines the global solution. Thus, the approach does not suffer from local minima and the segmentation result does not depend on the initialization. We also suggest a numeric approach for efficiently computing the solution. The performance of our approach has been evaluated using fluorescence microscopy images of different cell types. We have also performed a quantitative comparison with previous segmentation approaches.
Sachse, F. B.
2015-01-01
Microstructural characterization of cardiac tissue and its remodeling in disease is a crucial step in many basic research projects. We present a comprehensive approach for three-dimensional characterization of cardiac tissue at the submicrometer scale. We developed a compression-free mounting method as well as labeling and imaging protocols that facilitate acquisition of three-dimensional image stacks with scanning confocal microscopy. We evaluated the approach with normal and infarcted ventricular tissue. We used the acquired image stacks for segmentation, quantitative analysis and visualization of important tissue components. In contrast to conventional mounting, compression-free mounting preserved cell shapes, capillary lumens and extracellular laminas. Furthermore, the new approach and imaging protocols resulted in high signal-to-noise ratios at depths up to 60 μm. This allowed extensive analyses revealing major differences in volume fractions and distribution of cardiomyocytes, blood vessels, fibroblasts, myofibroblasts and extracellular space in control versus infarct border zone. Our results show that the developed approach yields comprehensive data on microstructure of cardiac tissue and its remodeling in disease. In contrast to other approaches, it allows quantitative assessment of all major tissue components. Furthermore, we suggest that the approach will provide important data for physiological models of cardiac tissue at the submicrometer scale. PMID:26399990
Fielding, J E; Lamirault, I; Nolan, B; Bobrowsky, J
2000-07-01
In 1998, Los Angeles County's Department of Health Services (DHS) embarked on a planning process to expand ambulatory care services for the county's 2.7 million uninsured and otherwise medically indigent residents. This planning process was novel in two ways. First, it used a quantitative, needs-based approach for resource allocation to ensure an equitable distribution of safety-net ambulatory care services across the county. Second, it used a new community-based planning paradigm that took into consideration the specific needs of each of the county's eight geographic service planning areas. Together, the evidence-based approach to planning and the community-based decision-making will ensure that DHS can more equitably provide for the needs of Los Angeles County's medically indigent residents.
Spatiotemporal Characterization of a Fibrin Clot Using Quantitative Phase Imaging
Gannavarpu, Rajshekhar; Bhaduri, Basanta; Tangella, Krishnarao; Popescu, Gabriel
2014-01-01
Studying the dynamics of fibrin clot formation and its morphology is an important problem in biology and has significant impact for several scientific and clinical applications. We present a label-free technique based on quantitative phase imaging to address this problem. Using quantitative phase information, we characterized fibrin polymerization in real-time and present a mathematical model describing the transition from liquid to gel state. By exploiting the inherent optical sectioning capability of our instrument, we measured the three-dimensional structure of the fibrin clot. From this data, we evaluated the fractal nature of the fibrin network and extracted the fractal dimension. Our non-invasive and speckle-free approach analyzes the clotting process without the need for external contrast agents. PMID:25386701
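Fractal dimension estimates of the kind reported here are commonly obtained by box counting on a binarized network mask: count occupied boxes at several box sizes and fit the log-log slope. The sketch below is a generic 2-D box-counting estimator applied to a random toy mask, not the exact estimator or data used in the study.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal (box-counting) dimension of a binary 2-D mask:
    count occupied boxes N(s) at several box sizes s and fit
    log N(s) = -D * log s + c."""
    mask = np.asarray(mask, dtype=bool)
    counts = []
    for s in sizes:
        ny, nx = (np.array(mask.shape) // s) * s          # crop to a multiple of s
        blocks = mask[:ny, :nx].reshape(ny // s, s, nx // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Toy binary image standing in for a segmented fibrin-network mask.
rng = np.random.default_rng(2)
toy = rng.random((128, 128)) < 0.1
print(f"box-counting dimension ~ {box_counting_dimension(toy):.2f}")
```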
Analysis of airborne MAIS imaging spectrometric data for mineral exploration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Jinnian; Zheng Lanfen; Tong Qingxi
1996-11-01
The high spectral resolution imaging spectrometric system made quantitative analysis and mapping of surface composition possible. The key issue is the quantitative approach for analysis of surface parameters from imaging spectrometer data. This paper describes the methods and the stages of quantitative analysis. (1) Extracting surface reflectance from the imaging spectrometer image. Laboratory and in-flight field measurements are conducted for calibration of the imaging spectrometer data, and atmospheric correction has also been used to obtain ground reflectance by using the empirical line method and radiative transfer modeling. (2) Determining the quantitative relationship between absorption band parameters from the imaging spectrometer data and the chemical composition of minerals. (3) Spectral comparison between the spectra of a spectral library and the spectra derived from the imagery. A wavelet analysis-based spectrum-matching technique for quantitative analysis of imaging spectrometer data has been developed. Airborne MAIS imaging spectrometer data were used for analysis, and the analysis results have been applied to mineral and petroleum exploration in the Tarim Basin area, China. 8 refs., 8 figs.
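The empirical line method mentioned in step (1) is, per band, a linear regression between at-sensor radiance and field-measured reflectance of calibration targets, applied afterwards to every pixel. The sketch below illustrates that step with invented target values for a single band; it is not the MAIS calibration used in the study.

```python
import numpy as np

def empirical_line(radiance_band, target_radiance, target_reflectance):
    """Empirical line method for one spectral band: fit reflectance =
    gain*radiance + offset using field-measured calibration targets, then
    apply the fit to every pixel of the band to retrieve surface reflectance."""
    gain, offset = np.polyfit(target_radiance, target_reflectance, 1)
    return gain * np.asarray(radiance_band, dtype=float) + offset

# Hypothetical calibration targets (e.g. dark water and a bright playa) for one band.
target_rad = np.array([12.0, 85.0])   # at-sensor radiance (arbitrary units)
target_ref = np.array([0.03, 0.55])   # field-measured reflectance
band = np.array([[20.0, 40.0], [60.0, 80.0]])
print(np.round(empirical_line(band, target_rad, target_ref), 3))
```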
Hintersteiner, Ingrid; Himmelsbach, Markus; Buchberger, Wolfgang W
2015-02-01
In recent years, the development of reliable methods for the quantitation of microplastics in different samples, including evaluating the particles' adverse effects in the marine environment, has become a great concern. Because polyolefins are the most prevalent type of polymer in personal-care products containing microplastics, this study presents a novel approach for their quantitation. The method is suitable for aqueous and hydrocarbon-based products, and includes a rapid sample clean-up involving twofold density separation and a subsequent quantitation with high-temperature gel-permeation chromatography. In contrast with previous procedures, both errors caused by weighing after insufficient separation of plastics and matrix and time-consuming visual sorting are avoided. In addition to reliable quantitative results, in this investigation a comprehensive characterization of the polymer particles isolated from the product matrix, covering size, shape, molecular weight distribution and stabilization, is provided. Results for seven different personal-care products are presented. Recoveries of this method were in the range of 92-96 %.
NASA Astrophysics Data System (ADS)
Huang, Su-Hua; Yang, Tsuey-Ching; Tsai, Ming-Hong; Tsai, I.-Shou; Lu, Huang-Chih; Chuang, Pei-Hsin; Wan, Lei; Lin, Ying-Ju; Lai, Chih-Ho; Lin, Cheng-Wen
2008-10-01
Virus isolation and antibody detection are routinely used for diagnosis of Japanese encephalitis virus (JEV) infection, but the low level of transient viremia in some JE patients makes JEV isolation from clinical and surveillance samples very difficult. We describe the use of gold nanoparticle-based RT-PCR and real-time quantitative RT-PCR assays for detection of JEV from its RNA genome. We tested the effect of gold nanoparticles on four different PCR systems, including conventional PCR, reverse-transcription PCR (RT-PCR), and SYBR green real-time PCR and RT-PCR assays for diagnosis in the acute phase of JEV infection. Gold nanoparticles increased the amplification yield of the PCR product and shortened the PCR time compared to the conventional reaction. In addition, nanogold-based real-time RT-PCR showed a linear relationship between Ct and template amount using ten-fold dilutions of JEV. The nanogold-based RT-PCR and real-time quantitative RT-PCR assays were able to detect low levels (1-10 000 copies) of the JEV RNA genomes extracted from culture medium or whole blood, providing early diagnostic tools for the detection of low-level viremia in the acute-phase infection. The assays described here were simple, sensitive, and rapid approaches for detection and quantitation of JEV in tissue cultured samples as well as clinical samples.
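The reported linear relationship between Ct and template amount is the basis of standard-curve quantitation: fit Ct against log10(copies) for the dilution series, then invert the fit for unknowns. A minimal sketch with illustrative Ct values, not the authors' data:

```python
import numpy as np

# Standard curve: Ct measured for ten-fold dilutions of known copy number
copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5])
ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])   # illustrative values

# Fit Ct = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1            # PCR amplification efficiency

# Invert the curve to estimate the copy number of an unknown sample
ct_unknown = 27.5
copies_unknown = 10 ** ((ct_unknown - intercept) / slope)
print(f"efficiency = {efficiency:.1%}, estimated copies = {copies_unknown:.0f}")
```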
New Approach for Investigating Reaction Dynamics and Rates with Ab Initio Calculations.
Fleming, Kelly L; Tiwary, Pratyush; Pfaendtner, Jim
2016-01-21
Herein, we demonstrate a convenient approach to systematically investigate chemical reaction dynamics using the metadynamics (MetaD) family of enhanced sampling methods. Using a symmetric SN2 reaction as a model system, we applied infrequent metadynamics, a theoretical framework based on acceleration factors, to quantitatively estimate the rate of reaction from biased and unbiased simulations. A systematic study of the algorithm and its application to chemical reactions was performed by sampling over 5000 independent reaction events. Additionally, we quantitatively reweighted exhaustive free-energy calculations to obtain the reaction potential-energy surface and showed that infrequent metadynamics works to effectively determine Arrhenius-like activation energies. Exact agreement with unbiased high-temperature kinetics is also shown. The feasibility of using the approach on actual ab initio molecular dynamics calculations is then presented by using Car-Parrinello MD+MetaD to sample the same reaction using only 10-20 calculations of the rare event. Owing to the ease of use and comparatively low cost of computation, the approach has extensive potential applications for catalysis, combustion, pyrolysis, and enzymology.
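In infrequent metadynamics, the physical transition time is recovered by multiplying the biased first-passage time by the acceleration factor, the running average of exp(V_bias/kT) along the biased trajectory. A sketch of that bookkeeping, using a synthetic bias trace and assumed units rather than output from the SN2 simulations:

```python
import numpy as np

kT = 2.494   # kJ/mol at 300 K (assumed)
dt = 0.002   # ps per MD step (assumed)

# Synthetic bias potential felt along a biased trajectory (kJ/mol)
rng = np.random.default_rng(0)
v_bias = rng.uniform(0.0, 30.0, size=50_000)

# Acceleration factor: time average of exp(V(s,t)/kT) over the biased run
alpha = np.mean(np.exp(v_bias / kT))

# Unbiased (physical) transition time from the biased first-passage time
t_biased = len(v_bias) * dt        # ps
t_unbiased = alpha * t_biased
print(f"acceleration factor = {alpha:.1f}, rescaled transition time = {t_unbiased:.1f} ps")
```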
The use of mixed methods in studying a chronic illness
Jason, Leonard A.; Reed, Jordan
2016-01-01
This article explores mixed methods approaches with an illness called Myalgic Encephalomyelitis (ME) and Chronic Fatigue Syndrome (CFS). Qualitative and quantitative data were used to investigate the epidemiology of this illness, to explore attributions based on the name of the illness, and to examine treatment approaches. In each of the domains within the ME and CFS research reviewed, our analyses were richer and our findings ultimately more impactful when we integrated qualitative and quantitative research methods. The use of a multiphase mixed methods research program provided our team unique vantage points for better understanding social and community issues involving this controversial chronic illness. Further, this approach allowed us to implement the insights gained through an advocacy lens to change policy, recommend and evaluate treatments, and amplify voices within the patient population. In this way, we believe that the practice of methodological pluralism is especially applicable and effective for the study of chronic illness, and we believe other investigators will benefit from the use of these approaches with similarly disenfranchised and unfairly treated populations. PMID:27088060
Mueller, Jenna L.; Fu, Henry L.; Mito, Jeffrey K.; Whitley, Melodi J.; Chitalia, Rhea; Erkanli, Alaattin; Dodd, Leslie; Cardona, Diana M.; Geradts, Joseph; Willett, Rebecca M.; Kirsch, David G.; Ramanujam, Nimmi
2015-01-01
The goal of resection of soft tissue sarcomas located in the extremity is to preserve limb function while completely excising the tumor with a margin of normal tissue. With surgery alone, one-third of patients with soft tissue sarcoma of the extremity will have local recurrence due to microscopic residual disease in the tumor bed. Currently, a limited number of intraoperative pathology-based techniques are used to assess margin status; however, few have been widely adopted due to sampling error and time constraints. To aid in intraoperative diagnosis, we developed a quantitative optical microscopy toolbox, which includes acriflavine staining, fluorescence microscopy, and analytic techniques called sparse component analysis and circle transform, to yield quantitative diagnosis of tumor margins. A series of variables were quantified from images of resected primary sarcomas and used to optimize a multivariate model. The sensitivity and specificity for differentiating positive from negative ex vivo resected tumor margins were 82% and 75%, respectively. The utility of this approach was tested by imaging the in vivo tumor cavities of 34 mice after sarcoma resection, with local recurrence as a benchmark. When applied prospectively to images from the tumor cavity, the sensitivity and specificity for predicting local recurrence were 78% and 82%, respectively. For comparison, if pathology were used to predict local recurrence in this data set, it would achieve a sensitivity of 29% and a specificity of 71%. These results indicate a robust approach for detecting microscopic residual disease, which is an effective predictor of local recurrence. PMID:25994353
Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte
2013-01-01
Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC–electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments. PMID:23788530
A novel logic-based approach for quantitative toxicology prediction.
Amini, Ata; Muggleton, Stephen H; Lodhi, Huma; Sternberg, Michael J E
2007-01-01
There is a pressing need for accurate in silico methods to predict the toxicity of molecules that are being introduced into the environment or are being developed into new pharmaceuticals. Predictive toxicology is in the realm of structure-activity relationships (SAR), and many approaches have been used to derive such SAR. Previous work has shown that inductive logic programming (ILP) is a powerful approach that circumvents several major difficulties, such as molecular superposition, faced by some other SAR methods. The ILP approach reasons with chemical substructures within a relational framework and yields chemically understandable rules. Here, we report a general new approach, support vector inductive logic programming (SVILP), which extends the essentially qualitative ILP-based SAR to quantitative modeling. First, ILP is used to learn rules, the predictions of which are then used within a novel kernel to derive a support-vector generalization model. For a highly heterogeneous dataset of 576 molecules with known fathead minnow fish toxicity, the cross-validated correlation coefficients (R²cv) from a chemical descriptor method (CHEM) and SVILP are 0.52 and 0.66, respectively. The ILP, CHEM, and SVILP approaches correctly predict 55, 58, and 73%, respectively, of toxic molecules. In a set of 165 unseen molecules, the R² values from the commercial software TOPKAT and SVILP are 0.26 and 0.57, respectively. In all calculations, SVILP showed significant improvements in comparison with the other methods. The SVILP approach has a major advantage in that it uses ILP automatically and consistently to derive rules, mostly novel, describing fragments that are toxicity alerts. The SVILP is a general machine-learning approach and has the potential of tackling many problems relevant to chemoinformatics including in silico drug design.
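The SVILP idea of building a kernel from ILP rule predictions can be illustrated, in much simplified form, by encoding each molecule as a binary vector of rule coverages and training a support vector regressor on those vectors; a linear kernel then simply counts the rules two molecules share. The data below are random placeholders, not the fathead minnow toxicity set:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# Binary matrix: molecule i is covered (1) or not (0) by ILP-derived rule j (placeholder)
rule_hits = rng.integers(0, 2, size=(300, 40)).astype(float)
# Placeholder toxicity depending on a handful of "alert" rules plus noise
toxicity = rule_hits[:, :5].sum(axis=1) + rng.normal(scale=0.3, size=300)

# Linear kernel on rule-coverage vectors counts shared rules, a simplified
# stand-in for the SVILP kernel over ILP rules
model = SVR(kernel="linear", C=1.0)
model.fit(rule_hits[:200], toxicity[:200])
print(f"R2 on held-out molecules: {model.score(rule_hits[200:], toxicity[200:]):.2f}")
```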
Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A
2018-05-01
Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop quantitative QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
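The cross-validation statistic Q² quoted above is one minus the predictive residual sum of squares (PRESS) over the total sum of squares of the observed values. A minimal sketch of that calculation with scikit-learn, on synthetic descriptor data standing in for the regulatory toxicity database:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                         # placeholder chemical descriptors
y = 0.8 * X[:, 0] + rng.normal(scale=0.5, size=200)    # placeholder toxicity values

press = 0.0
ss_tot = np.sum((y - y.mean()) ** 2)
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X[train], y[train])
    press += np.sum((y[test] - model.predict(X[test])) ** 2)

q2 = 1.0 - press / ss_tot
print(f"cross-validated Q2 = {q2:.2f}")
```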
Caries Detection Methods Based on Changes in Optical Properties between Healthy and Carious Tissue
Karlsson, Lena
2010-01-01
A conservative, noninvasive or minimally invasive approach to clinical management of dental caries requires diagnostic techniques capable of detecting and quantifying lesions at an early stage, when progression can be arrested or reversed. Objective evidence of initiation of the disease can be detected in the form of distinct changes in the optical properties of the affected tooth structure. Caries detection methods based on changes in a specific optical property are collectively referred to as optically based methods. This paper presents a simple overview of the feasibility of three such technologies for quantitative or semiquantitative assessment of caries lesions. Two of the techniques are well-established: quantitative light-induced fluorescence, which is used primarily in caries research, and laser-induced fluorescence, a commercially available method used in clinical dental practice. The third technique, based on near-infrared transillumination of dental enamel, is in the developmental stages. PMID:20454579
Data Literacy: Real-World Learning through Problem-Solving with Data Sets
ERIC Educational Resources Information Center
Erwin, Robin W., Jr.
2015-01-01
The achievement of deep learning by secondary students requires teaching approaches that draw students into task commitment, integrated curricula, and analytical thinking. By using real-world data sets in project-based instructional units, teachers can guide students in analyzing, interpreting, and reporting quantitative data. Working with…
Design of Learning Model of Logic and Algorithms Based on APOS Theory
ERIC Educational Resources Information Center
Hartati, Sulis Janu
2014-01-01
The research questions were "what are the characteristics of a learning model of logic and algorithms according to APOS theory" and "can this learning model improve students' learning outcomes". The research was conducted through exploration and a quantitative approach. Exploration was used in constructing theory about the…
The Intercultural Sensitivity of Chilean Teachers Serving an Immigrant Population in Schools
ERIC Educational Resources Information Center
Morales Mendoza, Karla; Sanhueza Henríquez, Susan; Friz Carrillo, Miguel; Riquelme Bravo, Paula
2017-01-01
The objective of this article is to evaluate the intercultural sensitivity of teachers working in culturally diverse classrooms, and to analyse differences in intercultural sensitivity based on the gender, age, training (advanced training courses), and intercultural experience of the teachers. A quantitative approach with a comparative descriptive…
Effects of Peer Tutoring on Reading Self-Concept
ERIC Educational Resources Information Center
Flores, Marta; Duran, David
2013-01-01
This study investigates the development of the Reading Self-Concept and of the mechanisms underlying it, within a framework of a reading programme based on peer tutoring. The multiple methodological design adopted allowed for a quantitative approach which showed statistically significant changes in the Reading Self-Concept of those students who…
Organisational Learning for School Quality and Health
ERIC Educational Resources Information Center
Lagrosen, Yvonne; Lagrosen, Stefan
2012-01-01
Purpose: The purpose of this paper is to shed light upon the connections between quality management, employee health and organisational learning in a school setting. Design/methodology/approach: The study is based on a quantitative survey. Items measuring health status and values of quality management were included in a questionnaire addressed to…
Quantitative Investigations in Hungarian Phonotactics and Syllable Structure
ERIC Educational Resources Information Center
Grimes, Stephen M.
2010-01-01
This dissertation investigates statistical properties of segment collocation and syllable geometry of the Hungarian language. A corpus and dictionary based approach to studying language phonologies is outlined. In order to conduct research on Hungarian, a phonological lexicon was created by compiling existing dictionaries and corpora and using a…
Teachers Implementing Entrepreneurship Education: Classroom Practices
ERIC Educational Resources Information Center
Ruskovaara, Elena; Pihkala, Timo
2013-01-01
Purpose: This study aims to highlight the entrepreneurship education practices teachers use in their work. Another target is to analyze how these practices differ based on a number of background factors. Design/methodology/approach: This article presents a quantitative analysis of 521 teachers and other entrepreneurship education actors. The paper…
Developing International Managers: The Contribution of Cultural Experience to Learning
ERIC Educational Resources Information Center
Townsend, Peter; Regan, Padraic; Li, Liang Liang
2015-01-01
Purpose: The purpose of this paper is to evaluate cultural experience as a learning strategy for developing international managers. Design/methodology/approach: Using an integrated framework, two quantitative studies, based on empirical methodology, are conducted. Study 1, with an undergraduate sample situated in the Asia Pacific, aimed to examine…
Espousing Democratic Leadership Practices: A Study of Values in Action
ERIC Educational Resources Information Center
Devereaux, Lorraine
2003-01-01
This article examines principals' espoused values and their values in action. It provides a reanalysis of previously collected data through a values lens. The original research study was an international quantitative and qualitative investigation of principals' leadership approaches that was based in 15 schools. This particular excerpt of the…
A system for dosage-based functional genomics in poplar
Isabelle M. Henry; Matthew S. Zinkgraf; Andrew T. Groover; Luca Comai
2015-01-01
Altering gene dosage through variation in gene copy number is a powerful approach to addressing questions regarding gene regulation, quantitative trait loci, and heterosis, but one that is not easily applied to sexually transmitted species. Elite poplar (Populus spp) varieties are created through interspecific hybridization, followed by...
USDA-ARS's Scientific Manuscript database
The peels of different pomegranate cultivars (Molla Nepes, Parfianka, Purple Heart, Wonderful and Vkunsyi) were compared in terms of phenolic composition and total phenolics. Analyses were performed on two silica hydride-based stationary phases: phenyl and undecenoic acid columns. Quantitation was ...
Non-Invasive Visualization and Quantitation of Cardiovascular Structure and Function.
ERIC Educational Resources Information Center
Ritman, E. L.; And Others
1979-01-01
Described is a new approach to investigative physiology based on computerized transaxial tomography, in which visualization and measurement of the internal structure of the cardiopulmonary system is possible without postmortem, biopsy, or vivisection procedures. Examples are given for application of the Dynamic Spatial Reconstructor (DSR). (CS)
ERIC Educational Resources Information Center
Makarevitch, Irina; Frechette, Cameo; Wiatros, Natalia
2015-01-01
Integration of inquiry-based approaches into curriculum is transforming the way science is taught and studied in undergraduate classrooms. Incorporating quantitative reasoning and mathematical skills into authentic biology undergraduate research projects has been shown to benefit students in developing various skills necessary for future…
Teachers' Perceptions of Female Student Aggression at an All-Girls School
ERIC Educational Resources Information Center
Winter, Clare Rebecca; McKenzie, Karen
2017-01-01
There has been limited research into how teachers view and respond to relational aggression in girls. The existing research is largely quantitative and questionnaire based and has indicated that gender stereotypes may influence teachers' perceptions of female aggression. The present study adopted a qualitative approach, using semi-structured…
Technology-Based Cognitive Apprenticeship for Empowering Children with Disabilities
ERIC Educational Resources Information Center
Kasinath, H. M.
2009-01-01
The cognitive apprenticeship approach has been applied in a good deal of conceptual, quantitative and qualitative studies in various settings including technology integration. It has proved successful in promoting student's higher order thinking skills as well as in shaping the social interactions between teachers and students to goal-oriented…
Job-Embedded Professional Development: Its Impact on Teacher Self-Efficacy and Student Performance
ERIC Educational Resources Information Center
Althauser, Krista
2015-01-01
A quantitative approach was used to investigate the impact of a district-wide, job-embedded mathematics professional development program on elementary teachers' general and personal efficacy. This investigation was based on the principles of mathematics professional development, efficacy theory, and student achievement. It was designed to…
Using the Blended Learning Approach in a Quantitative Literacy Course
ERIC Educational Resources Information Center
Botts, Ryan T.; Carter, Lori; Crockett, Catherine
2018-01-01
The efforts to improve the quantitative reasoning (quantitative literacy) skills of college students in the United States have been gaining momentum in recent years. At the same time, the blended learning approach to course delivery has gained in popularity, promising better learning with flexible modalities and pace. This paper presents the…
ERIC Educational Resources Information Center
Smith, John K.; Heshusius, Lous
Educational researchers have claimed that the quantitative and qualitative approaches to educational inquiry are, indeed, compatible. However, it would be unfortunate to discontinue this debate. The quantitative-qualitative debate began with the interpretive approach to social inquiry. Dilthey argued that since cultural/moral sciences differ from…
Martinez-Pinna, Roxana; Gonzalez de Peredo, Anne; Monsarrat, Bernard; Burlet-Schiltz, Odile; Martin-Ventura, Jose Luis
2014-08-01
To find potential biomarkers of abdominal aortic aneurysms (AAA), we performed a differential proteomic study based on human plasma-derived microvesicles. Exosomes and microparticles isolated from plasma of AAA patients and control subjects (n = 10 each group) were analyzed by a label-free quantitative MS-based strategy. Homemade and publicly available software packages have been used for MS data analysis. The application of two kinds of bioinformatic tools allowed us to find differential protein profiles from AAA patients. Some of these proteins found by the two analysis methods belong to main pathological mechanisms of AAA such as oxidative stress, immune-inflammation, and thrombosis. Data analysis from label-free MS-based experiments requires the use of sophisticated bioinformatic approaches to perform quantitative studies from complex protein mixtures. The application of two of these bioinformatic tools provided us with a preliminary list of differential proteins found in plasma-derived microvesicles not previously associated with AAA, which could help us to understand the pathological mechanisms related to this disease. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Pan, Sheng; Rush, John; Peskind, Elaine R; Galasko, Douglas; Chung, Kathryn; Quinn, Joseph; Jankovic, Joseph; Leverenz, James B; Zabetian, Cyrus; Pan, Catherine; Wang, Yan; Oh, Jung Hun; Gao, Jean; Zhang, Jianpeng; Montine, Thomas; Zhang, Jing
2008-02-01
Targeted quantitative proteomics by mass spectrometry aims to selectively detect one or a panel of peptides/proteins in a complex sample and is particularly appealing for novel biomarker verification/validation because it does not require specific antibodies. Here, we demonstrated the application of targeted quantitative proteomics in searching, identifying, and quantifying selected peptides in human cerebrospinal fluid (CSF) using a matrix-assisted laser desorption/ionization time-of-flight tandem mass spectrometer (MALDI TOF/TOF)-based platform. The approach involved two major components: the use of isotopic-labeled synthetic peptides as references for targeted identification and quantification and a highly selective mass spectrometric analysis based on the unique characteristics of the MALDI instrument. The platform provides high confidence for targeted peptide detection in a complex system and can potentially be developed into a high-throughput system. Using the liquid chromatography (LC) MALDI TOF/TOF platform and the complementary identification strategy, we were able to selectively identify and quantify a panel of targeted peptides in the whole proteome of CSF without prior depletion of abundant proteins. The effectiveness and robustness of the approach associated with different sample complexity, sample preparation strategies, as well as mass spectrometric quantification were evaluated. Other issues related to chromatography separation and the feasibility for high-throughput analysis were also discussed. Finally, we applied targeted quantitative proteomics to analyze a subset of previously identified candidate markers in CSF samples of patients with Parkinson's disease (PD) at different stages and Alzheimer's disease (AD) along with normal controls.
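Quantitation against an isotope-labeled synthetic peptide comes down to the ratio of the endogenous ("light") signal to the spiked "heavy" reference of known amount. A minimal sketch with illustrative peak areas, not values from the CSF study:

```python
# Illustrative integrated peak areas for one targeted peptide (arbitrary units)
area_light = 4.2e5         # endogenous peptide in the CSF digest
area_heavy = 6.0e5         # spiked isotope-labeled reference peptide

spiked_amount_fmol = 50.0  # known amount of heavy standard added to the sample

# Endogenous amount follows from the light/heavy ratio
endogenous_fmol = (area_light / area_heavy) * spiked_amount_fmol
print(f"estimated endogenous peptide: {endogenous_fmol:.1f} fmol")
```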
Welch, Leslie; Dong, Xiao; Hewitt, Daniel; Irwin, Michelle; McCarty, Luke; Tsai, Christina; Baginski, Tomasz
2018-06-02
Free thiol content, and its consistency, is one of the product quality attributes of interest during technical development of manufactured recombinant monoclonal antibodies (mAbs). We describe a new, mid/high-throughput reversed-phase high-performance liquid chromatography (RP-HPLC) method, coupled with derivatization of free thiols, for the determination of total free thiol content in an E. coli-expressed therapeutic monovalent monoclonal antibody, mAb1. Initial selection of the derivatization reagent used a hydrophobicity-tailored approach. Maleimide-based thiol-reactive reagents with varying degrees of hydrophobicity were assessed to identify and select one that provided adequate chromatographic resolution and robust quantitation of free thiol-containing mAb1 forms. The method relies on covalent derivatization of free thiols in denatured mAb1 with N-tert-butylmaleimide (NtBM) label, followed by RP-HPLC separation with UV-based quantitation of native (disulfide containing) and labeled (free thiol containing) forms. The method demonstrated good specificity, precision, linearity, accuracy and robustness. Accuracy of the method, for samples with a wide range of free thiol content, was demonstrated using admixtures as well as by comparison to an orthogonal LC-MS peptide mapping method with isotope tagging of free thiols. The developed method has a facile workflow which fits well into both R&D characterization and quality control (QC) testing environments. The hydrophobicity-tailored approach to the selection of free thiol derivatization reagent is easily applied to the rapid development of free thiol quantitation methods for full-length recombinant antibodies. Copyright © 2018 Elsevier B.V. All rights reserved.
Chen, Jin-Qiu; Wakefield, Lalage M; Goldstein, David J
2015-06-06
There is an emerging demand for the use of molecular profiling to facilitate biomarker identification and development, and to stratify patients for more efficient treatment decisions with reduced adverse effects. In the past decade, great strides have been made to advance genomic, transcriptomic and proteomic approaches to address these demands. While there has been much progress with these large scale approaches, profiling at the protein level still faces challenges due to limitations in clinical sample size, poor reproducibility, unreliable quantitation, and lack of assay robustness. A novel automated capillary nano-immunoassay (CNIA) technology has been developed. This technology offers precise and accurate measurement of proteins and their post-translational modifications using either charge-based or size-based separation formats. The system not only uses ultralow nanogram levels of protein but also allows multi-analyte analysis using a parallel single-analyte format for increased sensitivity and specificity. The high sensitivity and excellent reproducibility of this technology make it particularly powerful for analysis of clinical samples. Furthermore, the system can distinguish and detect specific protein post-translational modifications that conventional Western blot and other immunoassays cannot easily capture. This review will summarize and evaluate the latest progress to optimize the CNIA system for comprehensive, quantitative protein and signaling event characterization. It will also discuss how the technology has been successfully applied in both discovery research and clinical studies, for signaling pathway dissection, proteomic biomarker assessment, targeted treatment evaluation and quantitative proteomic analysis. Lastly, a comparison of this novel system with other conventional immuno-assay platforms is performed.
NASA Astrophysics Data System (ADS)
Xu, Xiaoqing; Wang, Yawei; Ji, Ying; Xu, Yuanyuan; Xie, Ming; Han, Hao
2018-05-01
A new approach to quantitative phase imaging using four interferograms with special phase shifts in dual-wavelength in-line phase-shifting interferometry is presented. In this method, positive and negative 2π phase shifts are employed to easily separate the incoherent addition of two single-wavelength interferograms by combining the phase-shifting technique with a subtraction procedure; the quantitative phase at one of the two wavelengths can then be obtained from two intensities without the corresponding dc terms by using the properties of the trigonometric functions. The quantitative phase at the other wavelength can be retrieved from two dc-term-suppressed intensities obtained by employing the two-step phase-shifting technique or a filtering technique in the frequency domain. The proposed method is illustrated with theory, and its effectiveness is demonstrated by simulation experiments on a spherical cap and a HeLa cell, respectively.
Knowles, David W; Biggin, Mark D
2013-01-01
Animals comprise dynamic three-dimensional arrays of cells that express gene products in intricate spatial and temporal patterns that determine cellular differentiation and morphogenesis. A rigorous understanding of these developmental processes requires automated methods that quantitatively record and analyze complex morphologies and their associated patterns of gene expression at cellular resolution. Here we summarize light microscopy-based approaches to establish permanent, quantitative datasets, termed atlases, that record this information. We focus on experiments that capture data for whole embryos or large areas of tissue in three dimensions, often at multiple time points. We compare and contrast the advantages and limitations of different methods and highlight some of the discoveries made. We emphasize the need for interdisciplinary collaborations and integrated experimental pipelines that link sample preparation, image acquisition, image analysis, database design, visualization, and quantitative analysis. Copyright © 2013 Wiley Periodicals, Inc.
Race and Older Mothers’ Differentiation: A Sequential Quantitative and Qualitative Analysis
Sechrist, Jori; Suitor, J. Jill; Riffin, Catherine; Taylor-Watson, Kadari; Pillemer, Karl
2011-01-01
The goal of this paper is to demonstrate a process by which qualitative and quantitative approaches are combined to reveal patterns in the data that are unlikely to be detected and confirmed by either method alone. Specifically, we take a sequential approach to combining qualitative and quantitative data to explore race differences in how mothers differentiate among their adult children. We began with a standard multivariate analysis examining race differences in mothers’ differentiation among their adult children regarding emotional closeness and confiding. Finding no race differences in this analysis, we conducted an in-depth comparison of the Black and White mothers’ narratives to determine whether there were underlying patterns that we had been unable to detect in our first analysis. Using this method, we found that Black mothers were substantially more likely than White mothers to emphasize interpersonal relationships within the family when describing differences among their children. In our final step, we developed a measure of familism based on the qualitative data and conducted a multivariate analysis to confirm the patterns revealed by the in-depth comparison of the mother’s narratives. We conclude that using such a sequential mixed methods approach to data analysis has the potential to shed new light on complex family relations. PMID:21967639
RGB color calibration for quantitative image analysis: the "3D thin-plate spline" warping approach.
Menesatti, Paolo; Angelini, Claudio; Pallottino, Federico; Antonucci, Francesca; Aguzzi, Jacopo; Costa, Corrado
2012-01-01
In recent years, the need to define color numerically by its coordinates in n-dimensional space has increased strongly. Colorimetric calibration is fundamental in food processing and other biological disciplines for quantitatively comparing samples' color during workflow with many devices. Several software programmes are available to perform standardized colorimetric procedures, but they are often too imprecise for scientific purposes. In this study, we applied the Thin-Plate Spline interpolation algorithm to calibrate colours in sRGB space (the corresponding Matlab code is reported in the Appendix). This was compared with two other approaches: the first based on a commercial calibration system (ProfileMaker) and the second on a Partial Least Squares analysis. Moreover, to explore device variability and resolution, two different cameras were adopted, and for each sensor three consecutive pictures were acquired under four different light conditions. According to our results, the Thin-Plate Spline approach achieved very high calibration accuracy, opening the way to practical in-field colour quantification not only in food science but also in other biological disciplines. These results are of great importance for scientific color evaluation when lighting conditions are not controlled. Moreover, the approach allows the use of low-cost instruments while still returning scientifically sound quantitative data.
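The authors provide Matlab code for the thin-plate-spline calibration; an equivalent sketch in Python maps device RGB values onto reference patch values with a thin-plate-spline radial basis interpolator. The patch values below are placeholders, not the ProfileMaker targets used in the study:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)
# Measured RGB of calibration patches (device space) and their reference values
measured_rgb = rng.uniform(0, 255, size=(24, 3))
reference_rgb = measured_rgb * 0.9 + 10 + rng.normal(scale=2.0, size=(24, 3))

# Thin-plate-spline mapping from device RGB to calibrated RGB
tps = RBFInterpolator(measured_rgb, reference_rgb, kernel="thin_plate_spline")

# Calibrate an image: flatten pixels to (N, 3), map, reshape back
image = rng.uniform(0, 255, size=(64, 64, 3))
calibrated = tps(image.reshape(-1, 3)).reshape(image.shape)
print(calibrated.shape)
```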
A Quantitative Approach to Scar Analysis
Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia
2011-01-01
Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794
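Box counting is one common way to estimate the fractal dimension of a binarized collagen image: count the occupied boxes at a series of box sizes and take the slope of log(count) against log(1/size). A minimal sketch on a synthetic mask, not confocal scar data (the study's exact estimator may differ):

```python
import numpy as np

def box_counting_dimension(binary_img, sizes=(2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        h, w = binary_img.shape
        # Trim so the image tiles exactly into s x s boxes
        trimmed = binary_img[: h - h % s, : w - w % s]
        boxes = trimmed.reshape(trimmed.shape[0] // s, s, trimmed.shape[1] // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    # Slope of log(count) vs log(1/size) estimates the fractal dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(0)
mask = rng.random((256, 256)) > 0.7    # synthetic binary "fiber" mask
print(f"estimated fractal dimension: {box_counting_dimension(mask):.2f}")
```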
Carroll, Linda J; Rothe, J Peter
2010-09-01
As in other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies. Using mixed methods therefore has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.
Dynamic calibration approach for determining catechins and gallic acid in green tea using LC-ESI/MS.
Bedner, Mary; Duewer, David L
2011-08-15
Catechins and gallic acid are antioxidant constituents of Camellia sinensis, or green tea. Liquid chromatography with both ultraviolet (UV) absorbance and electrospray ionization mass spectrometric (ESI/MS) detection was used to determine catechins and gallic acid in three green tea matrix materials that are commonly used as dietary supplements. The results from both detection modes were evaluated with 14 quantitation models, all of which were based on the analyte response relative to an internal standard. Half of the models were static, where quantitation was achieved with calibration factors that were constant over an analysis set. The other half were dynamic, with calibration factors calculated from interpolated response factor data at each time a sample was injected to correct for potential variations in analyte response over time. For all analytes, the relatively nonselective UV responses were found to be very stable over time and independent of the calibrant concentration; comparable results with low variability were obtained regardless of the quantitation model used. Conversely, the highly selective MS responses were found to vary both with time and as a function of the calibrant concentration. A dynamic quantitation model based on polynomial data-fitting was used to reduce the variability in the quantitative results using the MS data.
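The dynamic models amount to interpolating the internal-standard-relative response factor of the calibrants to each sample's injection time and quantifying with that local factor. A minimal sketch with made-up injection times and response factors, not the green tea data:

```python
import numpy as np

# Calibrant injections: time into the sequence (min) and measured response factor
# (analyte area / internal standard area, per unit concentration)
cal_times = np.array([0.0, 60.0, 120.0, 180.0, 240.0])
cal_rf = np.array([1.02, 0.97, 0.91, 0.88, 0.85])   # illustrative ESI/MS drift

def quantify(sample_time, area_ratio):
    """Concentration from a response factor interpolated to the sample's injection time."""
    rf_at_sample = np.interp(sample_time, cal_times, cal_rf)
    return area_ratio / rf_at_sample

print(f"{quantify(sample_time=95.0, area_ratio=0.47):.3f} relative concentration units")
```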
Jiang, Lingxi; Yang, Litao; Rao, Jun; Guo, Jinchao; Wang, Shu; Liu, Jia; Lee, Seonghun; Zhang, Dabing
2010-02-01
To implement genetically modified organism (GMO) labeling regulations, an event-specific analysis method based on the junction sequence between the exogenous integration and host genomic DNA has become the preferred approach for GMO identification and quantification. In this study, specific primers and TaqMan probes based on the revealed 5'-end junction sequence of GM cotton MON15985 were designed, and qualitative and quantitative polymerase chain reaction (PCR) assays were established employing the designed primers and probes. In the qualitative PCR assay, the limit of detection (LOD) was 0.5 g kg(-1) in 100 ng total cotton genomic DNA, corresponding to about 17 copies of haploid cotton genomic DNA, and the LOD and limit of quantification (LOQ) for the quantitative PCR assay were 10 and 17 copies of haploid cotton genomic DNA, respectively. Furthermore, the developed quantitative PCR assays were validated in-house by five different researchers. Also, five practical samples with known GM contents were quantified using the developed PCR assay in in-house validation, and the bias between the true and quantified values ranged from 2.06% to 12.59%. This study shows that the developed qualitative and quantitative PCR methods are applicable for the identification and quantification of GM cotton MON15985 and its derivatives.
Object-oriented Persistent Homology
Wang, Bao; Wei, Guo-Wei
2015-01-01
Persistent homology provides a new approach for the topological simplification of big data via measuring the lifetime of intrinsic topological features in a filtration process and has found success in scientific and engineering applications. However, such success is essentially limited to qualitative data classification and analysis. Indeed, persistent homology has rarely been employed for quantitative modeling and prediction. Additionally, the present persistent homology is a passive tool, rather than a proactive technique, for classification and analysis. In this work, we outline a general protocol to construct object-oriented persistent homology methods. By means of the differential geometry theory of surfaces, we construct an objective functional, namely, a surface free energy defined on the data of interest. The minimization of the objective functional leads to a Laplace-Beltrami operator which generates a multiscale representation of the initial data and offers an objective-oriented filtration process. The resulting differential geometry-based object-oriented persistent homology is able to preserve desirable geometric features in the evolutionary filtration and enhances the corresponding topological persistence. The cubical complex-based homology algorithm is employed in the present work to be compatible with the Cartesian representation of the Laplace-Beltrami flow. The proposed Laplace-Beltrami flow-based persistent homology method is extensively validated. The consistency between Laplace-Beltrami flow-based filtration and Euclidean distance-based filtration is confirmed on the Vietoris-Rips complex in a large number of numerical tests. The convergence and reliability of the present Laplace-Beltrami flow-based cubical complex filtration approach are analyzed over various spatial and temporal mesh sizes. The Laplace-Beltrami flow-based persistent homology approach is utilized to study the intrinsic topology of proteins and fullerene molecules. Based on a quantitative model which correlates the topological persistence of the fullerene central cavity with the total curvature energy of the fullerene structure, the proposed method is used for the prediction of fullerene isomer stability. The efficiency and robustness of the present method are verified on more than 500 fullerene molecules. It is shown that the proposed persistent homology-based quantitative model offers good predictions of total curvature energies for ten types of fullerene isomers. The present work offers the first example of designing object-oriented persistent homology to enhance or preserve desirable features in the original data during the filtration process and then automatically detect or extract the corresponding topological traits from the data. PMID:26705370
Bergander, Tryggve; Nilsson-Välimaa, Kristina; Oberg, Katarina; Lacki, Karol M
2008-01-01
Steadily increasing demand for more efficient and more affordable biomolecule-based therapies puts a significant burden on biopharma companies to reduce the cost of R&D activities associated with introducing a new drug to the market. Reducing the time required to develop a purification process is one option for addressing the high-cost issue. This reduction in time can be accomplished if more efficient methods and tools, including high-throughput techniques, are available for process development work. This paper addresses the transition from traditional column-based process development to a modern high-throughput approach utilizing microtiter filter plates filled with a well-defined volume of chromatography resin. The approach is based on implementing the well-known batch uptake principle in microtiter plate geometry. Two variants of the proposed approach, allowing for either qualitative or quantitative estimation of dynamic binding capacity as a function of residence time, are described. Examples are given of quantitative estimation of the dynamic binding capacity of human polyclonal IgG on MabSelect SuRe and of qualitative estimation of the dynamic binding capacity of amyloglucosidase on a prototype of Capto DEAE weak ion exchanger. The proposed high-throughput method for determination of dynamic binding capacity significantly reduces time and sample consumption compared with a traditional method utilizing packed chromatography columns, without sacrificing the accuracy of the data obtained.
NASA Astrophysics Data System (ADS)
Chen, Qingcai; Wang, Mamin; Wang, Yuqin; Zhang, Lixin; Xue, Jian; Sun, Haoyao; Mu, Zhen
2018-07-01
Environmentally persistent free radicals (EPFRs) are present within atmospheric fine particles, and they are assumed to be a potential factor responsible for human pneumonia and lung cancer. This study presents a new method for the rapid quantification of EPFRs in atmospheric particles with a quartz sheet-based approach using electron paramagnetic resonance (EPR) spectroscopy. The three-dimensional distributions of the relative response factors in a cavity resonator were simulated and utilized for an accurate quantitative determination of EPFRs in samples. Comparisons between the proposed method and conventional quantitative methods were also performed to illustrate the advantages of the proposed method. The results suggest that the reproducibility and accuracy of the proposed method are superior to those of the quartz tube-based method. Although the solvent extraction method is capable of extracting specific EPFR species, the developed method can be used to determine the total EPFR content; moreover, the analysis process of the proposed approach is substantially quicker than that of the solvent extraction method. The proposed method has been applied in this study to determine the EPFRs in ambient PM2.5 samples collected over Xi'an, the results of which will be useful for extensive research on the sources, concentrations, and physical-chemical characteristics of EPFRs in the atmosphere.
Lanshoeft, Christian; Wolf, Thierry; Walles, Markus; Barteau, Samuel; Picard, Franck; Kretz, Olivier; Cianférani, Sarah; Heudi, Olivier
2016-11-30
An increasing demand for new analytical methods is associated with the growing number of biotherapeutic programs being prosecuted in the pharmaceutical industry. Whilst immunoassay has been the standard method for decades, great interest in assays based on liquid chromatography tandem mass spectrometry (LC-MS/MS) is evolving. In this work, the development of a generic method for the quantitative analysis of therapeutic proteins based on human immunoglobulin G (hIgG) in rat serum is reported. The method is based on four generic peptides, GPSVFPLAPSSK (GPS), TTPPVLDSDGSFFLYSK (TTP), VVSVLTVLHQDWLNGK (VVS) and FNWYVDGVEVHNAK (FNW), originating from different parts of the fraction crystallizable (Fc) region of a reference hIgG1 (hIgG1A). A tryptic pellet digestion of rat serum spiked with hIgG1A and a stable isotope labeled protein (hIgG1B) used as internal standard (ISTD) was applied prior to LC-MS/MS analysis. The upper limit of quantification was 1000 μg/mL. The lower limit of quantitation was 1.00 μg/mL for GPS, TTP and VVS, and 5.00 μg/mL for FNW. Accuracy and precision data met acceptance criteria over three days. The presented method was further successfully applied to the quantitative analysis of other hIgG1s (hIgG1C and hIgG1D) and hIgG4-based therapeutic proteins on spiked quality control (QC) samples in monkey and rat serum using calibration standards (Cs) prepared with hIgG1A in rat serum. In order to extend the applicability of our generic approach, a bispecific-bivalent hIgG1 (bb-hIgG1) and two lysine-conjugated antibody-drug conjugates (ADC1 and ADC2) were incorporated as well. The observed values on spiked QC samples in monkey serum were satisfactory with GPS for the determination of bb-hIgG1, whereas the FNW and TTP peptides were suitable for the ADCs. Moreover, comparable mean concentration-time profiles were obtained from monkeys previously dosed intravenously with ADC2 measured against Cs samples prepared either with hIgG1A in rat serum (presented approach) or with the actual ADC2 in monkey serum (conventional approach). The results of this study highlight the great flexibility of our newly developed generic approach and show that the choice of the surrogate peptide remains critical when dealing with different matrix types or modalities. Copyright © 2016 Elsevier B.V. All rights reserved.
Understanding Evidence-Based Public Health Policy
Chriqui, Jamie F.; Stamatakis, Katherine A.
2009-01-01
Public health policy has a profound impact on health status. Missing from the literature is a clear articulation of the definition of evidence-based policy and approaches to move the field forward. Policy-relevant evidence includes both quantitative (e.g., epidemiological) and qualitative information (e.g., narrative accounts). We describe 3 key domains of evidence-based policy: (1) process, to understand approaches to enhance the likelihood of policy adoption; (2) content, to identify specific policy elements that are likely to be effective; and (3) outcomes, to document the potential impact of policy. Actions to further evidence-based policy include preparing and communicating data more effectively, using existing analytic tools more effectively, conducting policy surveillance, and tracking outcomes with different types of evidence. PMID:19608941
Speckle dynamics under ergodicity breaking
NASA Astrophysics Data System (ADS)
Sdobnov, Anton; Bykov, Alexander; Molodij, Guillaume; Kalchenko, Vyacheslav; Jarvinen, Topias; Popov, Alexey; Kordas, Krisztian; Meglinski, Igor
2018-04-01
Laser speckle contrast imaging (LSCI) is a well-known and versatile approach for the non-invasive visualization of flows and microcirculation localized in turbid scattering media, including biological tissues. In most conventional implementations of LSCI the ergodic regime is typically assumed valid. However, most composite turbid scattering media, especially biological tissues, are non-ergodic, containing a mixture of dynamic and static centers of light scattering. In the current study, we examined the speckle contrast in different dynamic conditions with the aim of assessing limitations in the quantitative interpretation of speckle contrast images. Based on a simple phenomenological approach, we introduced a coefficient of speckle dynamics to quantitatively assess the ratio of the dynamic part of a scattering medium to the static one. The introduced coefficient allows one to distinguish real changes in motion from the mere appearance of static components in the field of view. As examples of systems with static/dynamic transitions, thawing and heating of Intralipid samples were studied by the LSCI approach.
Kurata, Shinya; Kanagawa, Takahiro; Yamada, Kazutaka; Torimura, Masaki; Yokomaku, Toyokazu; Kamagata, Yoichi; Kurane, Ryuichiro
2001-01-01
We have developed a simple method for the quantitative detection of specific DNA or RNA molecules based on the finding that BODIPY® FL fluorescence was quenched by its interaction with a uniquely positioned guanine. This approach makes use of an oligonucleotide probe or primer containing a BODIPY® FL-modified cytosine at its 5′-end. When such a probe was hybridized with a target DNA, its fluorescence was quenched by the guanine in the target, complementary to the modified cytosine, and the quench rate was proportional to the amount of target DNA. This widely applicable technique will be used directly with larger samples or in conjunction with the polymerase chain reaction to quantify small DNA samples. PMID:11239011
Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S
2015-01-16
Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques are compared: a traditional method and a mixed-model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed-model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed-model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
Quantitative Peptidomics with Five-plex Reductive Methylation labels
NASA Astrophysics Data System (ADS)
Tashima, Alexandre K.; Fricker, Lloyd D.
2017-12-01
Quantitative peptidomics and proteomics often use chemical tags to covalently modify peptides with reagents that differ in the number of stable isotopes, allowing for quantitation of the relative peptide levels in the original sample based on the peak height of each isotopic form. Different chemical reagents have been used as tags for quantitative peptidomics and proteomics, and all have strengths and weaknesses. One of the simplest approaches uses formaldehyde and sodium cyanoborohydride to methylate amines, converting primary and secondary amines into tertiary amines. Up to five different isotopic forms can be generated, depending on the isotopic forms of formaldehyde and cyanoborohydride reagents, allowing for five-plex quantitation. However, the mass difference between each of these forms is only 1 Da per methyl group incorporated into the peptide, and for many peptides there is substantial overlap from the natural abundance of 13C and other isotopes. In this study, we calculated the contribution from the natural isotopes for 26 native peptides and derived equations to correct the peak intensities. These equations were applied to data from a study using human embryonic kidney HEK293T cells in which five replicates were treated with 100 nM vinblastine for 3 h and compared with five replicates of cells treated with control medium. The correction equations brought the replicates to the expected 1:1 ratios and revealed significant decreases in levels of 21 peptides upon vinblastine treatment. These equations enable accurate quantitation of small changes in peptide levels using the reductive methylation labeling approach.
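Correcting the five 1-Da-spaced label channels for natural-isotope spillover can be cast as a small linear system: the observed peak heights are the true channel intensities multiplied by an overlap matrix built from each peptide's isotope distribution. A sketch with an illustrative overlap matrix (the actual coefficients depend on the peptide's elemental composition, as derived in the paper's correction equations):

```python
import numpy as np

# Illustrative overlap matrix: column j holds the isotope pattern of channel j,
# normalized to its monoisotopic peak (rows are the five observed peak positions)
overlap = np.array([
    [1.00, 0.00, 0.00, 0.00, 0.00],
    [0.45, 1.00, 0.00, 0.00, 0.00],
    [0.12, 0.45, 1.00, 0.00, 0.00],
    [0.02, 0.12, 0.45, 1.00, 0.00],
    [0.00, 0.02, 0.12, 0.45, 1.00],
])

observed = np.array([1.00, 1.43, 1.55, 1.52, 1.47])   # measured peak heights

# Recover the true per-channel intensities and express them relative to channel 1
corrected = np.linalg.solve(overlap, observed)
print(np.round(corrected / corrected[0], 2))
```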
Designing automation for human use: empirical studies and quantitative models.
Parasuraman, R
2000-07-01
An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.
Quantification of EEG reactivity in comatose patients.
Hermans, Mathilde C; Westover, M Brandon; van Putten, Michel J A M; Hirsch, Lawrence J; Gaspard, Nicolas
2016-01-01
EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed best accuracy (Median AUC: 0.95) and was accompanied by substantial agreement with the individual opinion of experts (Gwet's AC1: 65-70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-experts' agreement regarding reactivity for each individual case. Automated quantitative EEG approaches based on probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
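A rough Python sketch of the kind of spectral temporal symmetry comparison described, computing band power in the four bands for 1-minute segments before and after stimulation onset; the sampling rate, band edges, Welch settings, and the simple log-ratio summary are assumptions, not the authors' exact parameterization or probabilistic classifier.

import numpy as np
from scipy.signal import welch

fs = 256   # Hz, assumed sampling rate
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(segment):
    f, pxx = welch(segment, fs=fs, nperseg=fs * 2)
    return {name: np.trapz(pxx[(f >= lo) & (f < hi)], f[(f >= lo) & (f < hi)])
            for name, (lo, hi) in bands.items()}

def spectral_symmetry(pre, post):
    """Per-band log power ratio; values near 0 indicate little change (no reactivity)."""
    p_pre, p_post = band_powers(pre), band_powers(post)
    return {b: np.log(p_post[b] / p_pre[b]) for b in bands}

rng = np.random.default_rng(0)
pre, post = rng.standard_normal(fs * 60), rng.standard_normal(fs * 60)   # synthetic 1-min segments
print(spectral_symmetry(pre, post))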
An experimental approach to identify dynamical models of transcriptional regulation in living cells
NASA Astrophysics Data System (ADS)
Fiore, G.; Menolascina, F.; di Bernardo, M.; di Bernardo, D.
2013-06-01
We describe an innovative experimental approach, and a proof-of-principle investigation, for the application of System Identification techniques to derive quantitative dynamical models of transcriptional regulation in living cells. Specifically, we constructed an experimental platform for System Identification based on a microfluidic device, a time-lapse microscope, and a set of automated syringes all controlled by a computer. The platform allows delivering a time-varying concentration of any molecule of interest to the cells trapped in the microfluidics device (input) and real-time monitoring of a fluorescent reporter protein (output) at a high sampling rate. We tested this platform on the GAL1 promoter in the yeast Saccharomyces cerevisiae driving expression of a green fluorescent protein (Gfp) fused to the GAL1 gene. We demonstrated that the System Identification platform enables accurate measurements of the input (sugar concentrations in the medium) and output (Gfp fluorescence intensity) signals, thus making it possible to apply System Identification techniques to obtain a quantitative dynamical model of the promoter. We explored and compared linear and nonlinear model structures in order to select the most appropriate one to derive a quantitative model of the promoter dynamics. Our platform can be used to quickly obtain quantitative models of eukaryotic promoters, currently a complex and time-consuming process.
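A minimal Python sketch of the linear model-structure fitting mentioned, using least squares to estimate a discrete-time ARX model from an input trace (sugar concentration) and an output trace (Gfp fluorescence); the synthetic signals and model orders are placeholders, and the study's actual identification procedure may differ.

import numpy as np

def fit_arx(u, y, na=2, nb=2):
    """Least-squares fit of a discrete-time ARX model
    y[k] = sum_i a_i*y[k-i] + sum_j b_j*u[k-j]."""
    n = max(na, nb)
    rows = []
    for k in range(n, len(y)):
        rows.append(np.concatenate([y[k - na:k][::-1], u[k - nb:k][::-1]]))
    Phi = np.asarray(rows)
    theta, *_ = np.linalg.lstsq(Phi, y[n:], rcond=None)
    return theta[:na], theta[na:]

# Hypothetical input (sugar switching) and output (fluorescence) traces
u = np.tile(np.r_[np.ones(30), np.zeros(30)], 4).astype(float)
y = np.convolve(u, np.r_[0.0, 0.2, 0.15, 0.1], mode="full")[:len(u)]
a, b = fit_arx(u, y)
print(a, b)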
Analysis of atomic force microscopy data for surface characterization using fuzzy logic
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.
2011-07-15
In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses our approach to quantitatively identify particle sizes of two specimens each with a unique gold nanoparticle size distribution. Research Highlights: A fuzzy logic analysis technique capable of characterizing AFM images of thin films. The technique is applicable to different surfaces regardless of their densities. The fuzzy logic technique does not require manual adjustment of the algorithm parameters. The technique can quantitatively capture differences between surfaces. This technique yields more realistic structure boundaries compared to other methods.
Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...
A novel approach to molecular similarity
NASA Astrophysics Data System (ADS)
Cooper, David L.; Allan, Neil L.
1989-09-01
We review briefly the general problem of assessing the similarity between one molecule and another. We propose a novel approach to the quantitative estimation of the similarity of two electron distributions. The procedure is based on momentum space concepts, and avoids many of the difficulties associated with the usual position space definitions. Results are presented for the model systems CH3CH2CH3, CH3OCH3, CH3SCH3, H2O and H2S.
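For orientation, momentum-space similarity measures of this kind are often written as a normalized overlap of the two electron momentum densities (a Carbó-type index); the exact functional used by the authors may differ from this generic form:

$$ R_{AB} \;=\; \frac{\int \rho_A(\mathbf{p})\,\rho_B(\mathbf{p})\,d\mathbf{p}}{\left(\int \rho_A^2(\mathbf{p})\,d\mathbf{p}\right)^{1/2}\left(\int \rho_B^2(\mathbf{p})\,d\mathbf{p}\right)^{1/2}} $$

where $\rho_A(\mathbf{p})$ and $\rho_B(\mathbf{p})$ are the momentum-space electron densities of the two molecules and $R_{AB}$ ranges from 0 (dissimilar) to 1 (identical distributions).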
A Direct Approach to In-Plane Stress Separation using Photoelastic Ptychography
NASA Astrophysics Data System (ADS)
Anthony, Nicholas; Cadenazzi, Guido; Kirkwood, Henry; Huwald, Eric; Nugent, Keith; Abbey, Brian
2016-08-01
The elastic properties of materials, either under external load or in a relaxed state, influence their mechanical behaviour. Conventional optical approaches based on techniques such as photoelasticity or thermoelasticity can be used for full-field analysis of the stress distribution within a specimen. The circular polariscope in combination with holographic photoelasticity allows the sum and difference of principal stress components to be determined by exploiting the temporary birefringent properties of materials under load. Phase stepping and interferometric techniques have been proposed as a method for separating the in-plane stress components in two-dimensional photoelasticity experiments. In this paper we describe and demonstrate an alternative approach based on photoelastic ptychography which is able to obtain quantitative stress information from far fewer measurements than is required for interferometric based approaches. The complex light intensity equations based on Jones calculus for this setup are derived. We then apply this approach to the problem of a disc under diametrical compression. The experimental results are validated against the analytical solution derived by Hertz for the theoretical displacement fields for an elastic disc subject to point loading.
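As background for the photoelastic quantities involved, the standard stress-optic law relates the in-plane principal stress difference to the observed isochromatic fringe order; the notation here is generic and not taken from the paper:

$$ \sigma_1 - \sigma_2 \;=\; \frac{N\, f_\sigma}{h} $$

where $N$ is the fringe order, $f_\sigma$ the material fringe value, and $h$ the specimen thickness. Interferometric or ptychographic phase data provide the additional information (related to $\sigma_1 + \sigma_2$) needed to separate the individual components.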
A Clustering-Based Approach for Evaluation of EO Image Indexing
NASA Astrophysics Data System (ADS)
Bahmanyar, R.; Rigoll, G.; Datcu, M.
2013-09-01
The volume of Earth Observation data is increasing immensely, on the order of several terabytes a day. Therefore, to explore and investigate the content of this huge amount of data, more sophisticated Content-Based Information Retrieval (CBIR) systems are in high demand. These systems should be able not only to discover unknown structures behind the data, but also to provide relevant results to users' queries. Since in any retrieval system the images are processed based on a discrete set of their features (i.e., feature descriptors), studying and assessing the structure of the feature space built by different feature descriptors is of high importance. In this paper, we introduce a clustering-based approach to study the content of image collections. In our approach, we claim that using both internal and external evaluation of clusters for different feature descriptors helps to understand the structure of the feature space. Moreover, the semantic understanding users have of the images can also be assessed. To validate the performance of our approach, we used an annotated Synthetic Aperture Radar (SAR) image collection. Quantitative results, together with visualization of the feature space, demonstrate the applicability of our approach.
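A small Python sketch of pairing an internal and an external cluster-validity measure, in the spirit of the approach described; the synthetic features, the choice of k-means, and the silhouette/adjusted-Rand pairing are illustrative assumptions rather than the authors' exact protocol.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score, adjusted_rand_score

# Hypothetical feature vectors for annotated image patches (two classes)
rng = np.random.default_rng(1)
features = np.vstack([rng.normal(0, 1, (50, 16)), rng.normal(3, 1, (50, 16))])
annotations = np.r_[np.zeros(50, dtype=int), np.ones(50, dtype=int)]

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print("internal (silhouette):", silhouette_score(features, labels))
print("external (ARI vs annotations):", adjusted_rand_score(annotations, labels))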
Quality control for quantitative PCR based on amplification compatibility test.
Tichopad, Ales; Bar, Tzachi; Pecen, Ladislav; Kitchen, Robert R; Kubista, Mikael; Pfaffl, Michael W
2010-04-01
Quantitative PCR (qPCR) is a routinely used method for the accurate quantification of nucleic acids. Yet it may generate erroneous results if the amplification process is obscured by inhibition or by generation of aberrant side-products such as primer dimers. Several methods that rely on the introduction of a co-amplified reference sequence have been established to control for pre-processing performance; however, there is currently no method that allows reliable control of the amplification process without directly modifying the sample mix. Herein we present a statistical approach based on multivariate analysis of the amplification response data generated in real time. The amplification trajectory in its most resolved and dynamic phase is fitted with a suitable model. Two parameters of this model, related to amplification efficiency, are then used for calculation of the Z-score statistics. Each studied sample is compared to a predefined reference set of reactions, typically calibration reactions. A probabilistic decision for each individual Z-score is then used to identify the majority of inhibited reactions in our experiments. We compare this approach to univariate methods using only the sample-specific amplification efficiency as a reporter of compatibility. We demonstrate improved identification performance using the multivariate approach compared to the univariate approach. Finally, we stress that the performance of the amplification compatibility test as a quality control procedure depends on the quality of the reference set. Copyright 2010 Elsevier Inc. All rights reserved.
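A minimal Python sketch of the Z-score comparison against a reference set of calibration reactions; the two fitted parameters and their values are placeholders, and the published method uses a specific model of the resolved amplification phase and a joint (multivariate) treatment of the parameters rather than the simple per-parameter scores shown here.

import numpy as np

def efficiency_zscore(sample_params, reference_params):
    """Z-scores of a reaction's fitted amplification parameters relative to a
    reference (calibration) set; a multivariate version would combine the
    parameters jointly, e.g. via a Mahalanobis distance."""
    ref = np.asarray(reference_params, dtype=float)
    mu, sd = ref.mean(axis=0), ref.std(axis=0, ddof=1)
    return (np.asarray(sample_params, dtype=float) - mu) / sd

# Hypothetical efficiency-related parameters fitted to calibration reactions
reference = [(1.92, 0.45), (1.95, 0.44), (1.90, 0.47), (1.93, 0.46)]
print(efficiency_zscore((1.70, 0.60), reference))   # flagged as incompatible if |z| is large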
The linearized multistage model and the future of quantitative risk assessment.
Crump, K S
1996-10-01
The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. However, under these guidelines quantitative estimates of low-dose risks would not be developed for carcinogens having a non-linear mode of action; instead dose-response modelling would be used in the experimental range to calculate an LED10* (a statistical lower bound on the dose corresponding to a 10% increase in risk), and safety factors would be applied to the LED10* to determine acceptable exposure levels for humans. This approach is very similar to the one presently used by USEPA for non-carcinogens. Rather than using one approach for carcinogens believed to have a linear mode of action and a different approach for all other health effects, it is suggested herein that it would be more appropriate to use an approach conceptually similar to the 'LED10*-safety factor' approach for all health effects, and not to routinely develop quantitative risk estimates from animal data.
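For reference, the multistage dose-response form underlying the LMS procedure is commonly written as below, with the linearized procedure reporting an upper confidence bound $q_1^{*}$ on the linear coefficient as the low-dose slope; the notation is generic rather than taken from the article:

$$ P(d) \;=\; 1 - \exp\!\left[-\left(q_0 + q_1 d + q_2 d^2 + \cdots + q_k d^k\right)\right], \qquad \text{extra risk at low dose} \;\approx\; q_1^{*}\, d , $$

where $P(d)$ is the lifetime probability of tumor response at dose $d$ and the $q_i \ge 0$ are fitted to the animal bioassay data.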
[An object-oriented intelligent engineering design approach for lake pollution control].
Zou, Rui; Zhou, Jing; Liu, Yong; Zhu, Xiang; Zhao, Lei; Yang, Ping-Jian; Guo, Huai-Cheng
2013-03-01
To address the shortcomings of traditional lake pollution control engineering techniques, a new lake pollution control engineering approach based on object-oriented intelligent design (OOID) was proposed in this study. It can provide a new methodology and framework for effectively controlling lake pollution and improving water quality. The differences between traditional engineering techniques and the OOID approach were compared. The key points of OOID were described as the object perspective, a cause-and-effect foundation, the extension of points into surfaces, and temporal and spatial optimization. Control of blue algae in a lake was taken as an example in this study. The effects of algae control and water quality improvement were analyzed in detail from the perspective of object-oriented intelligent design for two engineering techniques (a vertical hydrodynamic mixer and pumped algaecide recharge). The modeling results showed that the traditional engineering design paradigm cannot provide scientific and effective guidance for engineering design and decision-making regarding lake pollution. The intelligent design approach, in contrast, is based on the object perspective and quantitative causal analysis. This approach identified that mixers were much more efficient than pumps in achieving low to moderate water quality improvement. However, when the water quality objective exceeded a certain value (such as a control objective for peak Chla concentration above 100 microg x L(-1) in this experimental water), the mixer could not achieve the goal. The pump technique could achieve it, but at higher cost. Combining the two techniques was more efficient than using either technique alone. Moreover, the quantitative scaling of the two engineering techniques has a significant impact on actual project benefits and costs.
NASA Astrophysics Data System (ADS)
McLeod, Euan
2016-03-01
The sizing of individual nanoparticles and the recovery of the distributions of sizes from populations of nanoparticles provide valuable information in virology, exosome analysis, air and water quality monitoring, and nanomaterials synthesis. Conventional approaches for nanoparticle sizing include those based on costly or low-throughput laboratory-scale equipment such as transmission electron microscopy or nanoparticle tracking analysis, as well as those approaches that only provide population-averaged quantities, such as dynamic light scattering. Some of these limitations can be overcome using a new family of alternative approaches based on quantitative phase imaging that combines lensfree holographic on-chip microscopy with self-assembled liquid nanolenses. In these approaches, the particles of interest are deposited onto a glass coverslip and the sample is coated with either pure liquid polyethylene glycol (PEG) or aqueous solutions of PEG. Due to surface tension, the PEG self-assembles into nano-scale lenses around the particles of interest. These nanolenses enhance the scattering signatures of the embedded particles such that individual nanoparticles as small as 40 nm are clearly visible in phase images reconstructed from captured holograms. The magnitude of the phase quantitatively corresponds to particle size with an accuracy of +/-11 nm. This family of approaches can individually size more than 10^5 particles in parallel, can handle a large dynamic range of particle sizes (40 nm - 100s of microns), and can accurately size multi-modal distributions of particles. Furthermore, the entire approach has been implemented in a compact and cost-effective device suitable for use in the field or in low-resource settings.
León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.
2013-01-01
The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921
Toward a model-based cognitive neuroscience of mind wandering.
Hawkins, G E; Mittner, M; Boekel, W; Heathcote, A; Forstmann, B U
2015-12-03
People often "mind wander" during everyday tasks, temporarily losing track of time, place, or current task goals. In laboratory-based tasks, mind wandering is often associated with performance decrements in behavioral variables and changes in neural recordings. Such empirical associations provide descriptive accounts of mind wandering - how it affects ongoing task performance - but fail to provide true explanatory accounts - why it affects task performance. In this perspectives paper, we consider mind wandering as a neural state or process that affects the parameters of quantitative cognitive process models, which in turn affect observed behavioral performance. Our approach thus uses cognitive process models to bridge the explanatory divide between neural and behavioral data. We provide an overview of two general frameworks for developing a model-based cognitive neuroscience of mind wandering. The first approach uses neural data to segment observed performance into a discrete mixture of latent task-related and task-unrelated states, and the second regresses single-trial measures of neural activity onto structured trial-by-trial variation in the parameters of cognitive process models. We discuss the relative merits of the two approaches, and the research questions they can answer, and highlight that both approaches allow neural data to provide additional constraint on the parameters of cognitive models, which will lead to a more precise account of the effect of mind wandering on brain and behavior. We conclude by summarizing prospects for mind wandering as conceived within a model-based cognitive neuroscience framework, highlighting the opportunities for its continued study and the benefits that arise from using well-developed quantitative techniques to study abstract theoretical constructs. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
Li, Dalin; Lewinger, Juan Pablo; Gauderman, William J; Murcray, Cassandra Elizabeth; Conti, David
2011-12-01
Variants identified in recent genome-wide association studies based on the common-disease common-variant hypothesis are far from fully explaining the heritability of complex traits. Rare variants may, in part, explain some of the missing heritability. Here, we explored the advantage of extreme phenotype sampling in rare-variant analysis and refined this design framework for future large-scale association studies on quantitative traits. We first proposed a power calculation approach for a likelihood-based analysis method. We then used this approach to demonstrate the potential advantages of extreme phenotype sampling for rare variants. Next, we discussed how this design can influence future sequencing-based association studies from a cost-efficiency (with the phenotyping cost included) perspective. Moreover, we discussed the potential of a two-stage design with the extreme sample as the first stage and the remaining nonextreme subjects as the second stage. We demonstrated that this two-stage design is a cost-efficient alternative to the one-stage cross-sectional design or traditional two-stage design. We then discussed the analysis strategies for this extreme two-stage design and proposed a corresponding design optimization procedure. To address many practical concerns, for example measurement error or phenotypic heterogeneity at the very extremes, we examined an approach in which individuals with very extreme phenotypes are discarded. We demonstrated that even with a substantial proportion of these extreme individuals discarded, an extreme-based sampling can still be more efficient. Finally, we expanded the current analysis and design framework to accommodate the CMC approach where multiple rare variants in the same gene region are analyzed jointly. © 2011 Wiley Periodicals, Inc.
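A toy Python simulation of the extreme-phenotype idea, assuming a single rare variant with an additive shift on a normally distributed trait and comparing carrier frequencies in the phenotypic tails with the full cohort; the effect size, allele frequency, and tail fraction are illustrative, and this is not the power-calculation method developed in the paper.

import numpy as np

rng = np.random.default_rng(42)
n, maf, beta = 20000, 0.01, 0.5            # cohort size, variant frequency, effect size (SD units)
carrier = rng.random(n) < 2 * maf          # dominant coding of a rare variant (illustrative)
trait = rng.standard_normal(n) + beta * carrier

k = int(0.05 * n)                          # sample the 5% tails of the trait distribution
order = np.argsort(trait)
print("carrier freq, lower tail:", carrier[order[:k]].mean())
print("carrier freq, upper tail:", carrier[order[-k:]].mean())
print("carrier freq, full cohort:", carrier.mean())

The enrichment of carriers in one tail relative to the other is what gives the extreme-sampling design its efficiency advantage per genotyped subject.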
Parallel labeling experiments for pathway elucidation and (13)C metabolic flux analysis.
Antoniewicz, Maciek R
2015-12-01
Metabolic pathway models provide the foundation for quantitative studies of cellular physiology through the measurement of intracellular metabolic fluxes. For model organisms metabolic models are well established, with many manually curated genome-scale model reconstructions, gene knockout studies and stable-isotope tracing studies. However, for non-model organisms a similar level of knowledge is often lacking. Compartmentation of cellular metabolism in eukaryotic systems also presents significant challenges for quantitative (13)C-metabolic flux analysis ((13)C-MFA). Recently, innovative (13)C-MFA approaches have been developed based on parallel labeling experiments, the use of multiple isotopic tracers and integrated data analysis, that allow more rigorous validation of pathway models and improved quantification of metabolic fluxes. Applications of these approaches open new research directions in metabolic engineering, biotechnology and medicine. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Filatov, I. E.; Uvarin, V. V.; Kuznetsov, D. L.
2018-05-01
The efficiency of removal of volatile organic impurities in air by a pulsed corona discharge is investigated using model mixtures. Based on the method of competing reactions, an approach to estimating the qualitative and quantitative parameters of the employed electrophysical technique is proposed. The concept of the "toluene coefficient", characterizing the reactivity of a component relative to toluene, is introduced. It is proposed that the energy efficiency of the electrophysical method be estimated using the concept of diversified yield of the removal process. Such an approach makes it possible to streamline substantially the determination of the energy parameters of impurity removal and can also serve as a criterion for estimating the effectiveness of various methods in which a nonequilibrium plasma is used for cleaning air of volatile impurities.
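The method of competing reactions typically rests on the relation below: when two impurities are removed by the same reactive species, the ratio of their rate constants follows from their relative depletion, which is how a reactivity coefficient relative to toluene can be defined; the notation is generic rather than the authors':

$$ \frac{k_X}{k_{\mathrm{toluene}}} \;=\; \frac{\ln\!\left([X]_0/[X]\right)}{\ln\!\left([\mathrm{toluene}]_0/[\mathrm{toluene}]\right)} , $$

where $[X]_0$ and $[X]$ are the concentrations of component $X$ before and after treatment, and likewise for toluene measured in the same mixture.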
Chilisa, Bagele; Mohiemang, Irene; Mpeta, Kolentino Nyamadzapasi; Malinga, Tumane; Ntshwarang, Poloko; Koyabe, Bramwell Walela; Heeren, G Anita
2016-01-01
Culture and tradition influence behaviour. Multiple-partner and concurrent relationships are held responsible for the increase of HIV infection in Sub-Saharan Africa. A contextualized "Theory of Planned Behaviour" was used to identify predictors of the intention to practice monogamy. A mixed-methods design was used: qualitative data came from focus groups and stories, and a survey was analyzed for quantitative data. The qualitative data added a socio-cultural belief domain to the behavioural beliefs; attitudes, subjective norms, and perceived behavioural control predicted the intention to practice monogamy. The adolescents showed a tendency towards having more than one sexual partner. The normative beliefs and the socio-cultural beliefs also predicted intentions, while hedonistic beliefs and partner reaction did not. In contextualizing theory-based interventions, it is important to draw from the stories and the language that circulate in a community about a given behaviour. More studies are needed on ways to combine qualitative approaches with quantitative approaches to inform the development of theory-based, culturally appropriate and context-specific intervention strategies to reduce the risk of HIV.
NASA Astrophysics Data System (ADS)
Kehlenbeck, Matthias; Breitner, Michael H.
Business users define calculated facts based on the dimensions and facts contained in a data warehouse. These business calculation definitions contain knowledge regarding quantitative relations that is necessary for deep analyses and for the production of meaningful reports. The business calculation definitions are implementation-independent and largely organization-independent, but no automated procedures exist to facilitate their exchange across organization and implementation boundaries. Each organization currently has to map its own business calculations to analysis and reporting tools separately. This paper presents an innovative approach based on standard Semantic Web technologies. This approach facilitates the exchange of business calculation definitions and allows them to be linked automatically to specific data warehouses through semantic reasoning. A novel standard proxy server which enables the immediate application of exchanged definitions is introduced. Benefits of the approach are shown in a comprehensive case study.
Beregovykh, V V; Spitskiy, O R
2014-01-01
A risk-based approach is used to examine the impact of different factors on the quality of medicinal products in technology transfer. A general diagram is offered for performing risk analysis in technology transfer from pharmaceutical development to production. When transferring technology to full-scale commercial production, it is necessary to investigate and simulate the application of the production process beforehand under the new, real conditions. The manufacturing process is the core factor for risk analysis, having the most impact on the quality attributes of a medicinal product. Further important factors are linked to the materials and products to be handled and to manufacturing environmental conditions such as premises, equipment and personnel. The use of the risk-based approach in designing a multipurpose production facility for medicinal products is shown, where the quantitative risk analysis tool RAMM (Risk Analysis and Mitigation Matrix) was applied.
Metz, Thomas O.; Zhang, Qibin; Page, Jason S.; Shen, Yufeng; Callister, Stephen J.; Jacobs, Jon M.; Smith, Richard D.
2008-01-01
The future utility of liquid chromatography-mass spectrometry (LC-MS) in metabolic profiling and metabolomic studies for biomarker discovery will be discussed, beginning with a brief description of the evolution of metabolomics and the utilization of the three most popular analytical platforms in such studies: NMR, GC-MS, and LC-MS. Emphasis is placed on recent developments in high-efficiency LC separations, sensitive electrospray ionization approaches, and the benefits of incorporating both in LC-MS-based approaches. The advantages and disadvantages of various quantitative approaches are reviewed, followed by the current LC-MS-based tools available for candidate biomarker characterization and identification. Finally, a brief prediction on the future path of LC-MS-based methods in metabolic profiling and metabolomic studies is given. PMID:19177179
Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol
2011-02-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.
NASA Technical Reports Server (NTRS)
Asner, Gregory P.; Keller, Michael M.; Silva, Jose Natalino; Zweede, Johan C.; Pereira, Rodrigo, Jr.
2002-01-01
Major uncertainties exist regarding the rate and intensity of logging in tropical forests worldwide: these uncertainties severely limit economic, ecological, and biogeochemical analyses of these regions. Recent sawmill surveys in the Amazon region of Brazil show that the area logged is nearly equal to total area deforested annually, but conversion of survey data to forest area, forest structural damage, and biomass estimates requires multiple assumptions about logging practices. Remote sensing could provide an independent means to monitor logging activity and to estimate the biophysical consequences of this land use. Previous studies have demonstrated that the detection of logging in Amazon forests is difficult and no studies have developed either the quantitative physical basis or remote sensing approaches needed to estimate the effects of various logging regimes on forest structure. A major reason for these limitations has been a lack of sufficient, well-calibrated optical satellite data, which, in turn, has impeded the development and use of physically-based, quantitative approaches for detection and structural characterization of forest logging regimes. We propose to use data from the EO-1 Hyperion imaging spectrometer to greatly increase our ability to estimate the presence and structural attributes of selective logging in the Amazon Basin. Our approach is based on four "biogeophysical indicators" not yet derived simultaneously from any satellite sensor: 1) green canopy leaf area index; 2) degree of shadowing; 3) presence of exposed soil; and 4) non-photosynthetic vegetation material. Airborne, field and modeling studies have shown that the optical reflectance continuum (400-2500 nm) contains sufficient information to derive estimates of each of these indicators. Our ongoing studies in the eastern Amazon basin also suggest that these four indicators are sensitive to logging intensity. Satellite-based estimates of these indicators should provide a means to quantify both the presence and degree of structural disturbance caused by various logging regimes. Our quantitative assessment of Hyperion hyperspectral and ALI multi-spectral data for the detection and structural characterization of selective logging in Amazonia will benefit from data collected through an ongoing project run by the Tropical Forest Foundation, within which we have developed a study of the canopy and landscape biophysics of conventional and reduced-impact logging. We will add to our base of forest structural information in concert with an EO-1 overpass. Using a photon transport model inversion technique that accounts for non-linear mixing of the four biogeophysical indicators, we will estimate these parameters across a gradient of selective logging intensity provided by conventional and reduced impact logging sites. We will also compare our physically-based approach to both conventional (e.g., NDVI) and novel (e.g., SWIR-channel) vegetation indices as well as to linear mixture modeling methods. We will cross-compare these approaches using Hyperion and ALI imagers to determine the strengths and limitations of these two sensors for applications of forest biophysics. This effort will yield the first physically-based, quantitative analysis of the detection and intensity of selective logging in Amazonia, comparing hyperspectral and improved multi-spectral approaches as well as inverse modeling, linear mixture modeling, and vegetation index techniques.
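Alongside the photon-transport inversion, the linear mixture modeling mentioned can be sketched as constrained least squares: each pixel spectrum is modeled as a nonnegative combination of endmember spectra (green vegetation, shade, soil, non-photosynthetic vegetation). The Python sketch below uses placeholder endmember values with only four bands, not Hyperion data.

import numpy as np
from scipy.optimize import nnls

# Placeholder endmember spectra (columns): green vegetation, shade, soil, NPV; rows are bands
E = np.array([
    [0.05, 0.02, 0.15, 0.10],
    [0.45, 0.03, 0.25, 0.30],
    [0.30, 0.02, 0.35, 0.40],
    [0.10, 0.01, 0.30, 0.35],
])

pixel = 0.6 * E[:, 0] + 0.2 * E[:, 1] + 0.2 * E[:, 3]   # synthetic mixed pixel spectrum
fractions, residual = nnls(E, pixel)                      # nonnegative least-squares unmixing
fractions /= fractions.sum()                              # normalize fractions to sum to one
print(dict(zip(["GV", "shade", "soil", "NPV"], np.round(fractions, 3))))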
Quantitative measures of healthy aging and biological age
Kim, Sangkyu; Jazwinski, S. Michal
2015-01-01
Numerous genetic and non-genetic factors contribute to aging. To facilitate the study of these factors, various descriptors of biological aging, including ‘successful aging’ and ‘frailty’, have been put forth as integrative functional measures of aging. A separate but related quantitative approach is the ‘frailty index’, which has been operationalized and frequently used. Various frailty indices have been constructed. Although based on different numbers and types of health variables, frailty indices possess several common properties that make them useful across different studies. We have been using a frailty index termed FI34 based on 34 health variables. Like other frailty indices, FI34 increases non-linearly with advancing age and is a better indicator of biological aging than chronological age. FI34 has a substantial genetic basis. Using FI34, we found elevated levels of resting metabolic rate linked to declining health in nonagenarians. Using FI34 as a quantitative phenotype, we have also found a genomic region on chromosome 12 that is associated with healthy aging and longevity. PMID:26005669
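A frailty index of this general type is usually computed as the proportion of recorded health deficits that are present; a minimal Python sketch follows (the item names, coding, and weights are illustrative, not the specific FI34 variable set or its coding rules).

def frailty_index(deficits):
    """Proportion of recorded health-deficit items that are present.
    `deficits` maps item name -> value in [0, 1] (1 = deficit fully present)."""
    return sum(deficits.values()) / len(deficits)

person = {"slow_gait": 1, "poor_grip": 0.5, "hypertension": 1,
          "low_mood": 0, "weight_loss": 0, "diabetes": 0}
print(round(frailty_index(person), 3))   # 2.5 / 6 = 0.417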
Virus replication as a phenotypic version of polynucleotide evolution.
Antoneli, Fernando; Bosco, Francisco; Castro, Diogo; Janini, Luiz Mario
2013-04-01
In this paper, we revisit and adapt to viral evolution an approach based on the theory of branching processes advanced by Demetrius et al. (Bull. Math. Biol. 46:239-262, 1985), in their study of polynucleotide evolution. By taking into account beneficial effects, we obtain a non-trivial multivariate generalization of their single-type branching process model. Perturbative techniques allow us to obtain analytical asymptotic expressions for the main global parameters of the model, which lead to the following rigorous results: (i) a new criterion for "no sure extinction", (ii) a generalization and proof, for this particular class of models, of the lethal mutagenesis criterion proposed by Bull et al. (J. Virol. 18:2930-2939, 2007), (iii) a new proposal for the notion of relaxation time with a quantitative prescription for its evaluation, (iv) the quantitative description of the evolution of the expected values in four distinct "stages": extinction threshold, lethal mutagenesis, stationary "equilibrium", and transient. Finally, based on these quantitative results, we are able to draw some qualitative conclusions.
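For context, in multitype branching processes the "sure extinction" question is governed by the mean reproduction matrix: with $M_{ij}$ the expected number of type-$j$ offspring produced by a type-$i$ individual, the classical result (for the non-degenerate, positively regular case) is

$$ \rho(M) \le 1 \;\Rightarrow\; \text{extinction with probability } 1, \qquad \rho(M) > 1 \;\Rightarrow\; \text{survival with positive probability}, $$

where $\rho(M)$ is the spectral radius of $M$. The criterion derived in the paper refines this general statement for their specific model of viral replication with deleterious and beneficial mutations.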
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Zhou; Adams, Rachel M; Chourey, Karuna
2012-01-01
A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially the quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification. Isobaric chemical labeling surpasses metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.
Green, Carla A; Duan, Naihua; Gibbons, Robert D; Hoagwood, Kimberly E; Palinkas, Lawrence A; Wisdom, Jennifer P
2015-09-01
Limited translation of research into practice has prompted study of diffusion and implementation, and development of effective methods of encouraging adoption, dissemination and implementation. Mixed methods techniques offer approaches for assessing and addressing processes affecting implementation of evidence-based interventions. We describe common mixed methods approaches used in dissemination and implementation research, discuss strengths and limitations of mixed methods approaches to data collection, and suggest promising methods not yet widely used in implementation research. We review qualitative, quantitative, and hybrid approaches to mixed methods dissemination and implementation studies, and describe methods for integrating multiple methods to increase depth of understanding while improving reliability and validity of findings.
Extending Theory-Based Quantitative Predictions to New Health Behaviors.
Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O
2016-04-01
Traditional null hypothesis significance testing suffers many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking predictions and 5 of 19 using expert panel predictions. Expert panel predictions and smoking-based predictions poorly predicted effect sizes for diet and sun protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, such as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports necessity in strengthening and revising theory with empirical data.
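The core of the approach can be sketched as checking whether a theory-predicted effect size falls inside the confidence interval estimated from data. The Python sketch below uses a standardized mean difference with a bootstrap interval, which is one reasonable reading of the procedure rather than the authors' exact estimator; the stage labels, sample sizes, and predicted value are illustrative.

import numpy as np

def cohens_d(x, y):
    nx, ny = len(x), len(y)
    pooled = np.sqrt(((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2))
    return (x.mean() - y.mean()) / pooled

def bootstrap_ci(x, y, n_boot=2000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    ds = [cohens_d(rng.choice(x, len(x)), rng.choice(y, len(y))) for _ in range(n_boot)]
    return np.quantile(ds, [alpha / 2, 1 - alpha / 2])

rng = np.random.default_rng(1)
precontemplation = rng.normal(0.0, 1.0, 300)   # hypothetical construct scores by stage
preparation = rng.normal(0.6, 1.0, 300)

predicted = 0.5                                 # theory-based effect size prediction (illustrative)
lo, hi = bootstrap_ci(preparation, precontemplation)
d = cohens_d(preparation, precontemplation)
print(f"observed d = {d:.2f}, 95% CI = [{lo:.2f}, {hi:.2f}], prediction confirmed: {lo <= predicted <= hi}")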
Estimation of hydrolysis rate constants for carbamates ...
Cheminformatics based tools, such as the Chemical Transformation Simulator under development in EPA’s Office of Research and Development, are being increasingly used to evaluate chemicals for their potential to degrade in the environment or be transformed through metabolism. Hydrolysis represents a major environmental degradation pathway; unfortunately, only a small fraction of hydrolysis rates for about 85,000 chemicals on the Toxic Substances Control Act (TSCA) inventory are in public domain, making it critical to develop in silico approaches to estimate hydrolysis rate constants. In this presentation, we compare three complementary approaches to estimate hydrolysis rates for carbamates, an important chemical class widely used in agriculture as pesticides, herbicides and fungicides. Fragment-based Quantitative Structure Activity Relationships (QSARs) using Hammett-Taft sigma constants are widely published and implemented for relatively simple functional groups such as carboxylic acid esters, phthalate esters, and organophosphate esters, and we extend these to carbamates. We also develop a pKa based model and a quantitative structure property relationship (QSPR) model, and evaluate them against measured rate constants using R square and root mean square (RMS) error. Our work shows that for our relatively small sample size of carbamates, a Hammett-Taft based fragment model performs best, followed by a pKa and a QSPR model. This presentation compares three comp
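Fragment-based hydrolysis QSARs of the kind referred to are typically built on Hammett/Taft linear free-energy relationships, in which the log rate constant of a reference structure is adjusted by substituent constants; a generic form (not the specific carbamate model developed in this work) is:

$$ \log k \;=\; \log k_0 \;+\; \rho \sum_i \sigma_i , $$

where $k_0$ is the hydrolysis rate constant of the reference compound, the $\sigma_i$ are Hammett or Taft substituent constants for the fragments attached to the reacting center, and $\rho$ is the reaction-specific susceptibility fitted from measured rate constants.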
1, 2, 3, 4: Infusing Quantitative Literacy into Introductory Biology
Momsen, Jennifer L.; Moyerbrailean, Gregory A.; Ebert-May, Diane; Long, Tammy M.; Wyse, Sara; Linton, Debra
2010-01-01
Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills. PMID:20810965
NASA Technical Reports Server (NTRS)
Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.
1995-01-01
Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, what environmental variable(s), in the absence of bedrock, is/are responsible for influencing the spectral properties of the desert alluvial surface.
SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph
2015-01-01
This paper introduces new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
Pucher, Katharina K; Candel, Math J J M; Krumeich, Anja; Boot, Nicole M W M; De Vries, Nanne K
2015-07-05
We report on the longitudinal quantitative and qualitative data resulting from a two-year trajectory (2008-2011) based on the DIagnosis of Sustainable Collaboration (DISC) model. This trajectory aimed to support regional coordinators of comprehensive school health promotion (CSHP) in systematically developing change management and project management to establish intersectoral collaboration. Multilevel analyses of quantitative data on the determinants of collaborations according to the DISC model were done, with 90 respondents (response 57 %) at pretest and 69 respondents (52 %) at posttest. Nvivo analyses of the qualitative data collected during the trajectory included minutes of monthly/bimonthly personal/telephone interviews (N = 65) with regional coordinators, and documents they produced about their activities. Quantitative data showed major improvements in change management and project management. There were also improvements in consensus development, commitment formation, formalization of the CSHP, and alignment of policies, although organizational problems within the collaboration increased. Content analyses of qualitative data identified five main management styles, including (1) facilitating active involvement of relevant parties; (2) informing collaborating parties; (3) controlling and (4) supporting their task accomplishment; and (5) coordinating the collaborative processes. We have contributed to the fundamental understanding of the development of intersectoral collaboration by combining qualitative and quantitative data. Our results support a systematic approach to intersectoral collaboration using the DISC model. They also suggest five main management styles to improve intersectoral collaboration in the initial stage. The outcomes are useful for health professionals involved in similar ventures.
NASA Astrophysics Data System (ADS)
Huang, Pin-Chieh; Pande, Paritosh; Shelton, Ryan L.; Joa, Frank; Moore, Dave; Gillman, Elisa; Kidd, Kimberly; Nolan, Ryan M.; Odio, Mauricio; Carr, Andrew; Boppart, Stephen A.
2017-03-01
Influenced by both the intrinsic viscoelasticity of the tissue constituents and the time-evolved redistribution of fluid within the tissue, the biomechanical response of skin can reflect not only localized pathology but also systemic physiology of an individual. While clinical diagnosis of skin pathologies typically relies on visual inspection and manual palpation, a more objective and quantitative approach for tissue characterization is highly desirable. Optical coherence tomography (OCT) is an interferometry-based imaging modality that enables in vivo assessment of cross-sectional tissue morphology with micron-scale resolution, which surpasses those of most standard clinical imaging tools, such as ultrasound imaging and magnetic resonance imaging. This pilot study investigates the feasibility of characterizing the biomechanical response of in vivo human skin using OCT. OCT-based quantitative metrics were developed and demonstrated on the human subject data, where a significant difference between deformed and nondeformed skin was revealed. Additionally, the quantified postindentation recovery results revealed differences between aged (adult) and young (infant) skin. These suggest that OCT has the potential to quantitatively assess the mechanically perturbed skin as well as distinguish different physiological conditions of the skin, such as changes with age or disease.
Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol
2011-01-01
A mixed-methods approach, combining quantitative and qualitative data collection and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and the generation of novel hypotheses for testing. This mixed-methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032
Refractive index variance of cells and tissues measured by quantitative phase imaging.
Shan, Mingguang; Kandel, Mikhail E; Popescu, Gabriel
2017-01-23
The refractive index distribution of cells and tissues governs their interaction with light and can report on morphological modifications associated with disease. Through intensity-based measurements, refractive index information can be extracted only via scattering models that approximate light propagation. As a result, current knowledge of refractive index distributions across various tissues and cell types remains limited. Here we use quantitative phase imaging and the statistical dispersion relation (SDR) to extract information about the refractive index variance in a variety of specimens. Because the measurement is phase-resolved in three dimensions, our approach yields refractive index results without prior knowledge of the tissue thickness. With the recent progress in quantitative phase imaging systems, we anticipate that using SDR will become routine in assessing tissue optical properties.
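For context, the phase measured in quantitative phase imaging is, under the usual thin-specimen approximation, an integral of the refractive index contrast along the optical axis; the relation below is a standard textbook form written here for orientation, not the paper's statistical dispersion relation itself:

\phi(x,y) = \frac{2\pi}{\lambda} \int_0^{h(x,y)} \left[ n(x,y,z) - n_0 \right] dz \approx \frac{2\pi\, h(x,y)}{\lambda} \left[ \bar{n}(x,y) - n_0 \right]

A single phase image therefore constrains only the product of the thickness h and the mean index contrast, which is why a relation such as the SDR is needed to recover refractive index statistics without independent knowledge of the thickness.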
NASA Astrophysics Data System (ADS)
Prather, Edward E.; Wallace, Colin Scott
2018-06-01
We present an instructional framework that allowed a first-time physics instructor to improve students’ quantitative problem-solving abilities by more than a letter grade over what was achieved by students in an experienced instructor’s course. This instructional framework uses a Think-Pair-Share (TPS) approach to foster collaborative quantitative problem solving during the lecture portion of a large-enrollment introductory calculus-based mechanics course. Through the development of carefully crafted and sequenced TPS questions, we engage students in rich discussions of key problem-solving issues that we typically hear about only when a student comes for help during office hours. Current work in the sophomore E&M course illustrates that this framework is generalizable to classes beyond the introductory level and to topics beyond mechanics.
Gagnon, Marie-Pierre; Desmartis, Marie; Poder, Thomas; Witteman, William
2014-10-28
Health technology assessment (HTA) is increasingly performed at the local or hospital level where the costs, impacts, and benefits of health technologies can be directly assessed. Although local/hospital-based HTA has been implemented for more than two decades in some jurisdictions, little is known about its effects and impact on hospital budget, clinical practices, and patient outcomes. We conducted a mixed-methods systematic review that aimed to synthesize current evidence regarding the effects and impact of local/hospital-based HTA. We identified articles through PubMed and Embase and by citation tracking of included studies. We selected qualitative, quantitative, or mixed-methods studies with empirical data about the effects or impact of local/hospital-based HTA on decision-making, budget, or perceptions of stakeholders. We extracted the following information from included studies: country, methodological approach, and use of conceptual framework; local/hospital HTA approach and activities described; reported effects and impacts of local/hospital-based HTA; factors facilitating/hampering the use of hospital-based HTA recommendations; and perceptions of stakeholders concerning local/hospital HTA. Due to the great heterogeneity among studies, we conducted a narrative synthesis of their results. A total of 18 studies met the inclusion criteria. We reported the results according to the four approaches for performing HTA proposed by the Hospital Based HTA Interest Sub-Group: ambassador model, mini-HTA, internal committee, and HTA unit. Results showed that each of these approaches for performing HTA corresponds to specific needs and structures and has its strengths and limitations. Overall, studies showed positive impacts related to local/hospital-based HTA on hospital decisions and budgets, as well as positive perceptions from managers and clinicians. Local/hospital-based HTA could influence decision-making in several respects. It is difficult to evaluate the real impacts of local HTA at the different levels of health care given the relatively small number of evaluations with quantitative data and the lack of clear comparators. Further research is necessary to explore the conditions under which local/hospital-based HTA results and recommendations can impact hospital policies, clinical decisions, and quality of care and optimize the use of scarce resources. PMID:25352182
Two Decades of Literature on Self-Directed Learning: A Content Analysis.
ERIC Educational Resources Information Center
Brockett, Ralph G.; Stockdale, Susan L.; Fogerson, Dewey L.; Cox, Barry F.; Canipe, James B.; Chuprina, Larissa A.; Donaghy, Robert C.; Chadwell, Nancy E.
Using a quantitative content analysis approach, a study examined the literature on self-direction, or self-directed learning (SDL), that appeared in 14 mainstream adult education journals between 1980 and 1998. The procedure involved classifying, entering, and tallying information on each article through use of an Internet-based program. Results…
Validity of the Learning Portfolio: Analysis of a Portfolio Proposal for the University
ERIC Educational Resources Information Center
Gregori-Giralt, Eva; Menéndez-Varela, José Luis
2015-01-01
Validity is a central issue in portfolio-based assessment. This empirical study used a quantitative approach to analyse the validity of the inferences drawn from a disciplinary course work portfolio assessment comprising profession-specific and learning competencies. The study also examined the problems involved in the development of the…
A Survey of Current and Projected Ethical Dilemmas of Rehabilitation Counselors
ERIC Educational Resources Information Center
Hartley, Michael T.; Cartwright, Brenda Y.
2016-01-01
Purpose: This study surveyed current and projected ethical dilemmas of rehabilitation counselors. Method: As a mixed-methods approach, the study used both quantitative and qualitative analyses. Results: Of the 211 participants who completed the survey, 116 (55.0%) reported an ethical dilemma. Based on the descriptions, common themes involved roles…
Student Management and Human Intelligence Skills Enhanced through Community Service
ERIC Educational Resources Information Center
Wilson, Stefanie Denise
2012-01-01
Drawing on years of human intelligence research conducted by Howard Gardner, and on results from a quantitative research study supporting Gardner's research collected from a sample of 205 faculty within the United States, the researcher examined students who are actively engaged in community service experiences and their approaches to reframing their…
ERIC Educational Resources Information Center
Moreno-López, Isabel; Ramos-Sellman, Aida; Miranda-Aldaco, Citlali; Gomis Quinto, Maria Teresa
2017-01-01
The researchers used qualitative and quantitative instruments to measure students' linguistic gains and their opinions and attitudes toward intercultural awareness while studying Spanish as a foreign language under four different pedagogical models: a traditional face-to-face classroom, face-to-face classes with a community-based learning…
USDA-ARS?s Scientific Manuscript database
American ginseng (Panax quinquefolius) is one of the most commonly used herbal medicines in the world. Discriminating between P. quinquefolius grown in different countries is difficult using traditional quantitation methods. In this study, a liquid chromatography-mass spectrometry (LC-MS) fing...